Flexible and inexpensive: Improving learning transfer and program evaluation through participant action plans

  • Published on
    06-Jul-2016



Performance Improvement, vol. 49, no. 5, May/June 2010
© 2010 International Society for Performance Improvement
Published online in Wiley InterScience (www.interscience.wiley.com) • DOI: 10.1002/pfi.20147

FLEXIBLE AND INEXPENSIVE: IMPROVING LEARNING TRANSFER AND PROGRAM EVALUATION THROUGH PARTICIPANT ACTION PLANS

Chris A. Cowan, MDiv; Ellen F. Goldman, EdD; Melissa Hook, BA

Action plans have been shown to improve transfer of learning and have proven an effective tool in training evaluation. This study describes how action planning was simply and successfully adapted to a preexisting curriculum with few additional resources. The decision to use participant action planning, its administration, and the participants' and sponsor's responses are discussed, along with suggestions for future human performance technology research.

PARTICIPANT ACTION PLANS can be a flexible and inexpensive performance improvement tool. They are often dichotomized as either participants' to-do lists with little impact or a complex and resource-intensive means for evaluating participants' performance improvement. However, this understanding limits an appreciation for how useful the process can be while still remaining flexible and cost-effective.
Previous studies have suggested that participant action planning can improve transfer of knowledge (Foxon, 1987; Hollenbeck & Ingols, 1990; Swinney, 1989) and that action plans can provide a practical and effective means for evaluating training solution outcomes (American Society of Training and Development [ASTD], 2008; Basarab & Root, 1992; Phillips, 1994; Youker, 1985). However, these studies are often perceived as limiting because (a) action planning definitions are too vague for performance improvement professionals to apply the process, or (b) action planning is presented in such a way that it seems it can achieve measurable benefits only if significant resources are dedicated to the project.

According to these sources, action planning is defined as a particular approach that helps participants apply what they have learned during training to their jobs. Participant action plans have occasionally been mentioned in the performance improvement literature (Phillips, 2003; Swinney, 1989), in which they have been called performance improvement plans. Further, the learning and development industry has highlighted them as a valuable evaluation tool (ASTD, 2008; Phillips, 1994, 1997, 2003). Youker (1985) went further by elaborating 10 specific benefits of the action planning approach to workplace learning and training: (a) transfer of learning, (b) verbalization and commitment from trainees, (c) practice for implementation during training discussions, (d) contingency planning, (e) commitment to action, (f) expectation of a follow-up check on the progress of trainees' action plans, (g) motivation generated by follow-ups, (h) a supportive post-training atmosphere, (i) evaluation of behavior changes, and (j) a system for organizational change.

This study builds on this previous work by describing the design and application of a participant action planning process used during a critical training program for professionals working with crime victims in Washington, DC.
The process was adapted to fit within the evaluation design of the training solution, but quickly proved to be a valuable tool for improving learning transfer and promoting on-the-job performance.

BACKGROUND

Action planning can be traced back to the 1950s, when it was used by Mosel (1957) to describe a strategy for overcoming negative influences of the work environment on the participant's learning. The idea was that participants would think through their learning by taking notes and develop a plan for applying their knowledge when they returned to work. The exercise would include a discussion about what participants learned during the training, ways in which they could use that learning to improve their job performance, potential barriers and resources needed to implement the proposed actions, and, depending on the context, a supervisory review of the plan for further development (ASTD, 2008; Basarab & Root, 1992; Hollenbeck & Ingols, 1990; Phillips, 2003; U.S. Office of Personnel Management Productivity Research and Evaluation Division, 1980; Youker, 1985). By the mid-1970s, action planning had become a popular approach for improving "training transfer" or "transfer of training" (Foxon, 1987, 1994), the terms used to connote the application of a trainee's learning back on the job. More recently, action planning was used by the coaching and personnel development fields to deal more intensively with an individual's long-term performance improvement (Dransfield, 2000; Hall, Otazo, & Hollenbeck, 1999). Within the discipline of training, however, action planning became an explicit evaluation tool that was used to help professionals make better decisions about the effectiveness of training solutions (ASTD, 2008; Basarab & Root, 1992; Phillips, 1994, 1997, 2003).
Throughout this evolution, empirical research on the effects of action planning has been relatively scarce because practitioners and sponsors do not typically invest the time and money needed to conduct rigorous empirical research on the process (Campbell & Cheek, 1989). Despite this, participant action planning has remained a popular choice with learning and development professionals as an evaluation strategy (ASTD, 2008) and with coaches and personnel directors as a means for long-term employee development (Dransfield, 2000; Hall, Otazo, & Hollenbeck, 1999).

This study provides the view that action plans can be effective as part of a comprehensive, resource-intensive evaluation strategy, as well as a simple, low-cost addition to the instructional design process. The study builds on the previous work with participant action planning with specific distinctions. For example, Phillips (1994) describes a management training program at Coca-Cola, where the evaluation strategy included the use of action plans to promote learning transfer and capture improvement measures across multiple levels of impact, including on-the-job performance, business impact, and return on investment. In this example, the action planning process was systematically integrated with multiple performance improvement designs and, as such, required significant planning, design, and implementation resources. In contrast, this study suggests that a simplified, condensed action planning process can succeed with little alteration to a preexisting curriculum and few additional resources dedicated to evaluation efforts. For instance, McGrath (1996) presents an action planning process for evaluating an instructor's performance in a particular course. Swinney (1989) describes the usefulness of action plans for improving performance with first-line supervisors. Although his case is strong, it does not seem replicable in other contexts.
Similarly, Hollenbeck and Ingols (1990) mention the successful use of an action planning process in Harvard's Advanced Management Program, though the specifics of that program are not provided. This study enhances these efforts by highlighting the versatility and simplicity of the participant action planning process while, at the same time, maintaining its effectiveness.

THEORETICAL BACKGROUND

The literature underpinning this study comes from two complementary disciplines: transfer of training and training evaluation. Broadly defined, transfer of training is the process through which skills or knowledge learned in one task or context helps problem solving or performance in another task or context (Holding, 1991). For the purposes of this study, transfer of learning is defined as a line of inquiry in human resources that uses psychological and sociological theories to explain how training participants transfer what they have learned during training back to their jobs (Baldwin & Ford, 1988; Broad, 2001, 2005; Yamnill & McLean, 2001). According to Brown and Seidner (1998), four conditions must be present for transfer of training to take place: (a) the person must have a desire to change, (b) the person must know what to do and how to do it, (c) the person must work in the right climate for change, and (d) the person must be rewarded for changing. This highlights the contextual factors needed to ensure that an effective training intervention is converted into improved employee performance. Additionally, Foxon (1987, 1994) makes a distinction between two primary types of learning transfer: specific point transfer and the process model of transfer. In specific point transfer, evaluators attempt to measure the application of learning at a specific point in time after the training event. The focus in these evaluations is whether participants are using what they learned from the training event back on the job.
The process model, based upon Lewin's (1951) theory of force field analysis, conceptualizes transfer of learning as a diffuse, viral, and ongoing process (Foxon, 1994). The evaluation focus is not just on whether participants are using the training on the job, but also on which skills have been used, how often, and why they are not being used more often (Foxon, 1994). If Foxon's second conceptualization is correct, and learning transfer outside of a laboratory and within real-world human resource contexts is a diffuse and protracted process, then it may not be surprising that empirical work on action planning and learning transfer is often neglected in favor of less complicated, less time-intensive research.

In summary, the literature has shown that the conceptual connections among participant action planning, learning transfer, and program evaluation have evolved over time (Foxon, 1994). Participant action planning has become known as both a learning transfer and an evaluation tool, particularly when conducting a Level 3 behavioral evaluation using Kirkpatrick's four-level evaluation framework (ASTD, 2008; Basarab & Root, 1992). The literature discussed above indicates that participant action planning is a viable approach for improving transfer and a highly effective and highly resource-intensive approach for evaluating training as a performance improvement solution.

PROGRAM DESCRIPTION

In this example, action planning was used to support a training academy program for crime victims' assistance professionals, called the District of Columbia Victims' Assistance Academy (DCVAA; hereafter referred to as the academy). See Exhibit 1 for a modified announcement of the inaugural academy. The academy trains police officers, attorneys, social workers, counselors, nurses, and related advocates on a variety of victims' assistance skills they need to perform their jobs.
These skills include handling various types of trauma, helping victims navigate the judicial system, and using effective self-care techniques. The academy curriculum was developed based on various needs assessment studies. Now in its third year, which is the focus of this study, the academy brings together diverse trainers to present material during a 3-day residential program held at a local conference center.

The training takes place during the day, with breaks for lunch and nightly group meetings to process the day's activities. The 23 attendees are first nominated by their respective organizations to attend the academy based on years of service, which ranges from 3 to 8 years. The academy staff then selects participants from among those nominated to ensure that diverse job functions, agencies, and demographics are represented. Next, selected participants are divided into four groups, called mentor groups, with each group having five to seven members to one mentor. The mentors are senior professionals in their fields and are there to help participant groups navigate the logistics of the 3 days of training, assist with training activities, conduct nightly mentor group meetings, and serve in an advisory capacity on the academy's steering committee.

EXHIBIT 1. ACADEMY PROVIDES FREE TRAINING FOR VICTIM SERVICE PROVIDERS

The Office of Victim Services (OVS) Crime Victims' Assistance Academy has developed and implemented a comprehensive, academic, interdisciplinary training for service providers and allied professionals who work with crime victims. The residential academies, which are supported by U.S. Department of Justice grants, have been immensely successful over the last 10 years in educating individuals on cutting-edge topics that will enhance their ability to effectively serve and assist crime victims. The academy will be an excellent venue for victim-serving practitioners to network with colleagues in other agencies and organizations. The academy is designed for intermediate-level service providers and allied professionals with 3 to 8 years of experience. Specific focus at the academy will be given to:

  • Victims' rights
  • Interacting with victims at various stages of trauma
  • Listening and communicating with victims, respecting their individual unique experiences, needs, and concerns
  • Assisting victims in the navigation of the criminal, juvenile, domestic violence, and abuse and neglect justice systems
  • Helping victims deal with the aftermath of victimization and maximize their healing process
  • Recognizing ethical challenges and negotiating ethical dilemmas in victim services
  • Self-care

The Department of Justice developed the academy grants to assist states and territories in raising the level of effective assistance to crime victims. OVS has partnered to develop an exciting academic program based on adult learning techniques. The curriculum will be delivered by experts in the fields of victim services, law enforcement, criminal and juvenile prosecution, clinical psychology, health sciences, education, and self-care.

The training program is divided into thematic days such as the justice system and self-care. Participants sit at round tables in their mentor groups and with the various trainers, who are largely responsible for conducting their own training sessions. The trainers use a variety of teaching approaches, including lecture, panel discussion, case studies, and simulation. The mentor groups provide both physical and psychological structure: the mentors and their groups are all seated together at one table and become the centerpiece for all individual and group activities. Mentors are responsible for answering participants' questions and facilitating the group assignments, including group meetings where participants work on their action plans.
They also observe the academy proceedings, suggest adjustments to the curriculum or environment, and help administer the evaluation instruments.

In this case, one of the authors (CC) served as the program evaluator and was charged with designing an evaluation strategy that was both valuable and practical, given the structure of the academy and the diversity of the training presented. The action planning process was selected as one component of the overall evaluation design, based on the evaluator's recommendation that the process would (a) provide participants with something concrete to show their supervisors, thus reaffirming the value of the training; (b) help the trainers reinforce the training; (c) provide opportunities for the participants and mentors to network with each other in a meaningful way; (d) facilitate the collection of important evaluative data; (e) provide a reflective component to the learning process; and (f) collect data from the participants about how the training can be improved. In selecting this design, the concern was diversity, both in the experience level of the participants and in the content of the material presented. The flexibility of the participant action planning process accommodated that diversity by having the participants create their own learning. Further, the process was also amenable to the existing curriculum in that it worked well with the mentor groups and the mission of the academy, which is to improve the trainees' job performance.

PARTICIPANT ACTION PLANS

The program evaluator was familiar with action planning and wrote a description of a customized process for how it could be implemented as part of the academy (see the Appendix). The mentors and the academy steering committee, which included the academy director, agreed to the evaluator's proposal. The format of the action plans was presented verbally to the participants, with individual packets provided to each participant the first night of the training.
The packet of information for the participants contained a cover sheet, a page describing basic ground rules for interpersonal communication within the groups, an example of action plan notes, three blank pages for notes to be written each day, and an action plan template (see Exhibit 2). The participants were instructed to make notes for their action plans during the days and were told that these notes would be discussed each evening under the supervision of the mentors.

The prompts used in the action plan template were selected after the evaluator reviewed a number of action plan templates available online. The training evaluator and the academy steering committee decided to include the action planning process in the mentor meetings because (a) the process was unfamiliar to the participants and they would likely require help from the mentors, (b) mentor meetings were already designed into the training, (c) mentor supervision and peer support encouraged the participants to complete their plans, and (d) past evaluations indicated that networking opportunities, like the mentor group meetings, were a primary benefit of the academy.

The action planning process relied heavily on the individual mentors' ability to facilitate meaningful and productive conversations among the participants. The evaluator coached the mentors on how to facilitate the initial meeting. The evaluator further encouraged participants to not only follow the mentors' instructions but also make adaptations as needed. The only mandate was that each individual describe measurable actions that he or she would be willing to take back to the workplace after completing the training.

The final versions of the participant action plans were written the last day, collected by the evaluator, photocopied, and returned to the participants before they left the academy.
One month later, the evaluator emailed participants the specific action steps mentioned in their individualized plans and a reminder that he would be following up with them in another month's time. The 2-month follow-up evaluation comprised conversations with participants about progress and success with the planned actions detailed in their respective action plans.

RESULTS

Data about the participant action planning process were collected from three different sources: (a) mentors, (b) steering committee observers, including the academy evaluator, and (c) the participants themselves. The observable response of the participants to the action plans was positive. A steering committee meeting was held immediately after the academy to discuss observations, because feedback data from the participants had yet to be analyzed. All commented that the participants appeared to understand what was expected and believed that the process was helpful for their learning. Mentors made the following comments:

"The process was an extremely important approach . . . one that should be more fully incorporated into the curriculum."

"The process was new to me, but it seemed to work well in the group setting."

"The participant action planning process helped the participants focus on what is important to them."

"It provided a framework for the mentoring groups."

"The approach encouraged trainers to make the information practical."

One mentor concluded: "We should definitely incorporate [action planning] into future academies; it really focused all of the content."
Further, mentors and observers who had attended last year's academy commented how the participants seemed more engaged in the nightly mentor meetings, which they attributed to the structure of the action planning process.

Immediately following the completion of the academy, participants were asked to rate the action planning component during the overall academy evaluation on a scale from 1 (strongly disagree) to 5 (strongly agree). See Table 1 for the results. Upon program completion, the Level 1 data indicated that two thirds of the participants found the action planning process valuable and appropriate to the academy, and that they planned to follow through with at least one action step (4.36/5.00). These results were described as achievable, although only half of the participants had used a similar process in the past (2.50/5.00).

When participants were asked to describe how the action plan process could be improved, they responded with the following suggestions:

"This process was very helpful for me; however, I think that if you are not in a management position, it would be difficult to make real progress on the action plan."

"The action planning process made me think a lot about my job. It helped me focus on what was most relevant to me and showed me where there were things I didn't know."

"It was hard to use this process at times because some of the information wasn't really relevant to my job, but when things did come up, I used the action plan to think of ways to make a change."

"I found the process extremely helpful and would have liked more time to work on it."

The evaluator conducted 2-month follow-up interviews with a sample of seven participants to discover what elements of the action planning process proved to be the most helpful and what specific steps were taken because of their plans. Five participants indicated that they shared items from their plans with supervisors or other associates in their home organization.
Six participants stated that they followed through on at least one action step. And all stated that the process would be extremely valuable in future academies with little or no modification. One participant gave this report:

"I thought that the whole academy was a little chaotic at first, but the action planning process helped me put it all together and it was fairly easy to keep my notes. The conversations in the groups were really helpful. I made some new friends and got a great idea for our teen program. I haven't been able to get approval for it yet, but it's definitely something that we should do and it is something I can directly attribute to the action planning."

EXHIBIT 2. SAMPLE ACTION PLAN TEMPLATE
The template is a table with the following column headings:
  • What did I learn today?
  • How can this help me do my job better?
  • What action steps, if any, can I take?
  • Start date / Evaluation date
  • What resources will I need? What barriers might I encounter?

Given that the academy participants came from a variety of different organizations, it was beyond the scope of the academy evaluation to follow up with the participants' supervisors, or otherwise measure the degree to which planned actions were actually performed on the job and the degree to which those actions may have contributed to participants' improved work performance. Despite this, the preliminary results from this study suggest that the sponsor, mentors, and participants were impressed with the action planning process because of its flexibility and focus on workplace skill application. In addition, it gave the academy evaluator an effective tool for reinforcing key content and promoting learning transfer.

DISCUSSION

The data suggest that the action planning process can be useful as a simple and inexpensive way to improve learning transfer and program evaluation. The participants highlighted how the action plans helped them think through the learning received and its specific application to their work roles.
Observers and mentors noted increased participant engagement and academy organization as a result of the process. Moreover, the sponsor was extremely pleased with the process, as it used the experience of the mentors and the networking opportunities of the mentor groups in a practical and meaningful way.

This study has also shown that participant action planning can be an effective way to improve program evaluation for those with little previous experience and little preparation. Published work by Jack Phillips (1994, 1997, 2003), an expert in return-on-investment evaluation for human resource development and performance improvement programs, has further claimed that action plans are "the key to high transfer" (1997, p. 160). This study strengthens the evidence that the participant action planning process is viable for increasing learning transfer and improving evaluation practices for a large range of applications.

TABLE 1. INITIAL ACTION PLAN EVALUATION DATA
  • The action plan process was clearly explained to me. (4.00/5.00)
  • The action plan process enhanced my learning. (3.32/5.00)
  • I have done a similar process to the DCVAA action plan in the past. (2.50/5.00)
  • I would like to use a similar process in future trainings. (3.18/5.00)
  • I feel confident that I will follow through on at least one action step included in my plan. (4.36/5.00)
  • I found the mentor meetings helpful in understanding the action plan process. (3.77/5.00)
  • When I return to work, I will show my supervisor and team some of the action steps. (2.91/5.00)
  • I will feel comfortable discussing the progress of my action steps as part of the follow-up evaluation. (3.45/5.00)
  • The action plan process was appropriate for this type of academy. (3.57/5.00)
  • Overall, I found the action plan process valuable. (3.23/5.00)

The limitations of this study should also be considered. First, the participants who made themselves available for the follow-up discussions were a small and slightly skewed sample: those most likely to follow through on their action plans were also the most likely to make themselves available to the evaluator following the academy. Second, with a simplified evaluation process, it is hard to know how much direct responsibility the action plans can take for the participants' follow-up actions and on-the-job performance. It is possible that certain action steps would have been taken whether or not the participants had gone through the action planning process. Third, the respective work environments were not considered or evaluated as an influence on learning transfer, even though work environment is a key factor cited in the existing and growing literature on learning transfer. Ideally, transfer of learning theory suggests a careful evaluation of supervisor support and job autonomy, but given that the academy participants were from different organizations, the academy did not dedicate resources to this type of investigation. Given these limitations, it is difficult to directly connect the action planning process to specific performance improvements. However, this study does show that participants, the training evaluator, and the program sponsor can still find value in the action planning process even when resources are not available to fund higher levels of evaluation.

Future human performance technology (HPT) studies could address these issues by exploring, replicating, or enhancing existing case studies where participants' on-the-job performance was followed after implementing an action plan. Empirical studies of learning transfer have proven to be demanding, but they are certainly worth the effort. Second, over the last 10 years, participant action planning has been implemented in a cost-effective, practical manner with a variety of different applications, so a more thorough and critical literature review of these case studies would also help advance understanding and practice.
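The item-level results reported in Table 1 are simple means of 1-to-5 Likert ratings. A minimal sketch of how such Level 1 data can be summarized follows; the response lists here are hypothetical, since the article reports only item means, not the raw per-participant responses.

```python
# Summarize 1-5 Likert ratings into an item mean and an "agreement" share,
# the kind of figures reported in Table 1. Response lists are HYPOTHETICAL:
# the article does not publish raw per-participant data.

def likert_summary(responses):
    """Return (mean rating, share of ratings that are 4 or 5) for 1-5 ratings."""
    mean = sum(responses) / len(responses)
    agree = sum(1 for r in responses if r >= 4) / len(responses)
    return round(mean, 2), round(agree, 2)

# Hypothetical responses for two illustrative items.
items = {
    "Process was clearly explained": [4, 4, 5, 3, 4],
    "Will follow through on one step": [5, 4, 5, 4, 4],
}

for item, ratings in items.items():
    mean, agree = likert_summary(ratings)
    print(f"{item}: {mean}/5.00 ({agree:.0%} agree)")
```

On this view, a statement like "two thirds of the participants found the process valuable" corresponds to the agreement share, while the "4.36/5.00" figures correspond to the item means.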
Third, the literature review should consider specific case studies showing how Foxon's (1994) process approach affected the usefulness of action planning as a discrete, holistic performance tool. Despite the limitations of this study, however, the action planning process is recommended to HPT practitioners as an effective, flexible process with wide application to unique situations. The process, as applied in this example, challenges the notion that action planning is either too simple to be effective or too resource-intensive to be practical.

References

American Society of Training and Development. (2008). Measurement & evaluation: Essentials for measuring training success. Alexandria, VA: Author.

Baldwin, T., & Ford, J. (1988). Transfer of training: A review and directions for future research. Personnel Psychology, 41, 63-105.

Basarab, D., & Root, D. (1992). The training evaluation process: A practical approach to evaluating training. New York: Springer.

Broad, M. (2001). Transfer of training: Action-packed strategies to ensure high payoff from training investments. New York: Da Capo Press.

Broad, M. (2005). Beyond transfer of training: Engaging systems to improve performance. Pfeiffer essential resources for training and HR professionals. New York: Wiley.

Brown, S., & Seidner, C. (1998). Evaluating corporate training: Models and issues. Boston: Academic.

Campbell, C., & Cheek, G. (1989). Putting training to work. Journal of European Industrial Training, 13(4), 32-36.

Dransfield, R. (2000). Human resources management: Studies in economics & business. New York: Heinemann Educational Publishers.

Foxon, M. (1987). Transfer of training: A practical application. Journal of European Industrial Training, 11(3), 17-20.

Foxon, M. (1994). A process approach to the transfer of training. Australian Journal of Educational Technology, 10(1), 1-18.

Hall, D., Otazo, K., & Hollenbeck, G. (1999). Behind closed doors: What really happens in executive coaching. Organizational Dynamics, 27(3), 39-53.

Holding, H. (1991). Transfer of training. In J.E. Morrison (Ed.), Training for performance: Principles of applied human learning (pp. 93-125). New York: Wiley.

Hollenbeck, G., & Ingols, C. (1990). What's the takeaway? Training and Development Journal, 44(7), 83-84.

Lewin, K. (1951). Field theory in social science: Selected theoretical papers (D. Cartwright, Ed.). New York: Harper & Row.

McGrath, I. (1996). Participant action plans and the evaluation of teachers' courses. Edinburgh Working Papers in Applied Linguistics, 7, 3-17.

Mosel, J. (1957). Why training programs fail to carry over. Personnel, 34(3), 56-64.

Phillips, J. (1994). Measuring return on investment, Vol. 1: Eighteen case studies from the real world of training. Alexandria, VA: American Society for Training and Development.

Phillips, J. (1997). Handbook of training evaluation and measurement methods. New York: Gulf Professional.

Phillips, J. (2003). Return on investment in training and performance improvement programs (2nd ed.). New York: Butterworth-Heinemann.

Swinney, J. (1989). Who's gonna turn the crank? Or implementing training or performance improvement projects. Performance & Instruction, 28(1), 33-37.

U.S. Office of Personnel Management Productivity Research and Evaluation Division. (1980). Assessing changes in job behavior due to training: A guide to the participant action plan approach. Washington, DC: Superintendent of Documents, U.S. Government Printing Office.

Yamnill, S., & McLean, G. (2001). Theories supporting transfer of training. Human Resource Development Quarterly, 12(2), 195-208.

Youker, R. (1985). Ten benefits of participant action planning. Training, 22(6), 52, 54-56.

APPENDIX: Mentor Brief: The Action Plan Process
Prepared by CC, District of Columbia Victims' Assistance Academy (DCVAA) Evaluator

What Is an Action Plan?

Having the participants write an action plan will be a great way for them to integrate the content of the academy, the evaluation, and on-the-job application.
It is both a process and a document. The process will involve the following three steps: (1) action plan orientation, (2) action plan note-taking, and (3) the final action plan itself.

The participants will be given a packet in which to write the action plan. The action plan will be written throughout the 4 days, with the final version being written on the last day. Participants will take the action plans they have used over the previous days and select the one to three action steps that seem the most significant for the final version. In addition to this final version, participants will be asked to complete an action plan questionnaire (included in the packet). This action plan packet is in addition to the final academy evaluation, which will include other data like overall satisfaction, trainer feedback, areas for improvement, and so forth.

The action plans are something that the mentor groups will work on each day with the mentors. The process will allow them a chance to reflect on their learning individually and as a group, and help them strategize ways to translate the content of the academy into new actions in their workplace. Not everything learned will have an obvious action step; that's all part of the learning process.

Why Use the Action Plan Process?

The action plan process should be used for the following reasons:

  • It provides the participants with something concrete to show their supervisors, thus reaffirming the value of the academy.
  • It helps the trainers reinforce the learning they are trying to communicate.
  • It provides a chance for the participants and mentors to network with each other in a meaningful way.
  • It facilitates the collection of evaluative data.
  • It provides a reflective component to the learning process.
  • It collects data from the participants about how the DCVAA can add future value.

CHRIS A. COWAN, MDiv, is a doctoral student at George Washington University in human and organizational learning, where he is also the project director on a federal grant focused on training rehabilitation counselors. He has coauthored 12 articles in the field of rehabilitation counseling and is also a certified action learning coach and training evaluator. He may be reached at chrcowan@gmail.com.

ELLEN F. GOLDMAN, EdD, is assistant professor of human and organizational learning at George Washington University. Her research concerns learning processes across different professions and, in particular, how the ability to think strategically develops. Before joining academia, she was a full-time management consultant specializing in organizational strategy. She may be reached at egoldman@gwu.edu.

MELISSA HOOK, BA, is the director of the DC Office of Victim Services and the former executive director of the Victims Assistance Legal Organization. She is the author of Ethics in Victim Services (Sidran Institute Press, 2005) and coauthor of Spiritually Sensitive Caregiving: A Multi-Faith Handbook (Compassion Books, 2008). She launched the Filmmakers Forum for Sensitivity to Crime Victims (2002), a campaign to address the insensitive treatment of crime victims in film, reality television, and documentary film. She may be reached at Melissa.hook@dc.gov.
