
    RMAF Special Study

    Indian and Northern Affairs Canada

    Audit & Evaluation Sector

    August 13, 2008

James Banting, Ben Eisen, Jacqueline Greenblatt, James Kim, Jee Lim, Frédéric Paradis, Bernita Rebeiro, David Suk

    NCR-#1473984-v5-RMAF_SPECIAL_STUDY _FINAL_REPORT Page 1 of 33


    Table of Contents

Executive Summary
Chapter One: Introduction & Methodology
Chapter Two: RMAF Quality and Completeness
Chapter Three: Status of RMAF Implementation
Chapter Four: Conclusion
Recommendations
Areas for Further Investigation

Appendix A: RMAF Special Study Terms of Reference
Appendix B: Treasury Board Secretariat RMAF Guidelines (www.tbs.gc.ca)
Appendix C: RMAF Review Template
Appendix D: RMAF Special Study Survey to Program Managers on Data Collection
Appendix E: Treasury Board Submission Inventory (Excel table)
Appendix F: RMAF Inventory by Strategic Outcome (Excel table)
Appendix G: Survey Responses (Excel table)
Appendix H: Complete List of Indicators (from the surveys) (Excel table)


    Executive Summary

Following the introduction of Results for Canadians in 2000, programs seeking approval or renewal through Treasury Board have been required to submit a Results-Based Management Accountability Framework (RMAF). These RMAFs are intended as a tool, allowing managers to think through their programs from a results-based perspective and to clearly set out plans (and responsibilities) for performance measurement, evaluations, and reporting.

Until now, no formal tracking of the RMAFs developed for programs at Indian and Northern Affairs Canada (INAC) has been undertaken. The objective of this study is to present a portrait of the current state of results-based management at INAC by examining the quality, coverage, and level of implementation of RMAFs in the department. This project was carried out in four steps. First, an inventory of RMAFs in the department was done, with a total of 59 RMAFs found. Second, each RMAF was assessed and scored using a template developed from Treasury Board criteria. Third, they were mapped out against INAC's Program Activity Architecture (PAA) to determine RMAF coverage across each departmental Strategic Outcome area. Finally, a survey was developed and distributed to program managers to determine the extent to which RMAFs, particularly the data collection and reporting plans they contain, are being implemented.

This study found that RMAFs developed at INAC are generally of high quality. However, several key areas for improvement were also identified. In general, each assessed RMAF clearly laid out objectives, expected results, and a logic model in accordance with Treasury Board criteria. While also generally acceptable, the evaluation plans of many RMAFs lacked solid data collection information and other key details, including evaluation issues and proposed methodologies. In many cases, an overwhelming number of performance indicators, often output-focused (a burden for both program managers and First Nations to collect), sometimes took the place of a few key indicators that clearly linked activities to outputs and outcomes. Finally, responsibility for collecting and reporting also needs to be more clearly identified. Improvement in these areas would help RMAFs become a more useful tool for managers.

It was found that data collection and RMAF implementation are comparatively more problematic than RMAF quality. Results from the surveys to managers indicate that data is being collected for approximately 42 per cent of the performance indicators listed in RMAFs. Issues with data collection ranged from indicators being reassessed for their usefulness during the course of the program to more serious problems, including a lack of capacity (in both program recipients and program managers) to collect, report, and analyze data. Other issues in data collection may be linked to a problem identified earlier: too many indicators are listed for measurement, many of which are output-focused. This study found no areas of significant overlap in data collection, and as such identified no areas where duplication could be eliminated.


    Chapter One: Introduction and Methodology

Over the past several years, the Treasury Board has introduced the requirement that a Results-Based Management Accountability Framework (RMAF) be submitted for most programs seeking Treasury Board approval or renewal. Although the programs of Indian and Northern Affairs Canada (INAC) have drafted RMAFs to meet this requirement as their funding authorities have come up for Treasury Board renewal, no monitoring mechanism has ever been established to track the development, approval, and implementation of RMAFs department-wide. As such, the Audit and Evaluation Sector conceived of this RMAF Special Study project in the spring of 2008 to develop a more complete picture of RMAF quality and implementation. The project was assigned to our group of eight interns.

The RMAF Special Study Terms of Reference (Appendix A), developed by our team in collaboration with the management of the Audit and Evaluation Sector and approved by the Audit and Evaluation Committee on June 27th, 2008, listed four steps to the RMAF Special Study. Briefly, these steps were:

1. Construct an inventory of all INAC RMAFs approved by the Treasury Board since the year 2000.

2. Determine the position of each RMAF within INAC's Program Activity Architecture (PAA).

    3. Assess the quality and completeness of each RMAF.

    4. Determine to what extent each RMAF has been implemented, with particular focus on

    Performance Measurement Plans (PMPs).

For our first step, our group reviewed every one of the approximately 350 INAC Treasury Board submissions approved from January 2000 through April 2008. Through this process, we initially identified 62 RMAFs and similar documents. A record of this phase of the project is included in Appendix E. We then narrowed the focus to 49 that are directly and primarily applicable to INAC programs. Some of the initial 62 were excluded, for instance, because they were primarily written by and/or the responsibility of another department.

We handled steps two and three simultaneously. Each of the 49 RMAFs was reviewed individually by two members of our team. Reviewers identified the Strategic Outcome, Program Activity, and Sub-Activity for each RMAF according to INAC's 2009-2010 Program Activity Architecture, and evaluated the quality and completeness of each RMAF.

For the evaluation process, our group developed nine assessment categories based on the Treasury Board Secretariat RMAF guidelines. Each RMAF received a score between one and three in each category depending on the extent to which it conformed to the guidelines. The Treasury Board Secretariat RMAF guidelines are attached as Appendix B and a blank RMAF assessment template is attached as Appendix C.


Our team defined a three-point scale for each category as follows:

1/3: The component is either missing entirely or is missing several key elements.

2/3: The component generally meets most of the minimum requirements but is in need of improvement (e.g. specific required details are absent, or the component exists but is unclear).

3/3: The component is clear, logical, and meets all requirements.1

The nine scores were then added to give each RMAF a total score out of a maximum possible 27 points. Once all reviewers had finished their individual evaluations, each pair of reviewers compared their results, discussed differences, and agreed on a final single set of PAA information, scores, and comments. The "Merged RMAF Assessment CIDM#" column in the table included in Appendix F contains CIDM numbers for the completed evaluation templates containing each RMAF's scores. The final PAA information for each RMAF can be found in the same table.
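The scoring arithmetic described above is simple to express as a short sketch. The category names below are illustrative placeholders only, not the nine actual assessment categories used by the study team:

```python
# Sketch of the RMAF scoring scheme: nine categories, each scored 1-3,
# summed to a total out of 27. Category names are hypothetical.
def total_score(category_scores: dict) -> int:
    assert len(category_scores) == 9, "an RMAF is scored on nine categories"
    assert all(1 <= s <= 3 for s in category_scores.values()), "each score is 1-3"
    return sum(category_scores.values())

example = {
    "objectives": 3, "outcomes_logical": 3, "outcomes_realistic": 2,
    "logic_model": 3, "indicators": 2, "data_collection": 2,
    "evaluation_plan": 3, "reporting_plan": 3, "governance": 2,
}
print(total_score(example))  # prints 23, between the minimum of 9 and maximum of 27
```

Under this scheme the minimum possible total is 9 (all categories scored 1/3) and the maximum is 27 (all scored 3/3), matching the range reported in Chapter Two.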

Finally, for the fourth step, our team developed a Program Manager Survey to determine the extent to which RMAF PMPs are being implemented at INAC. The most important aspect of the survey was a table with room for program managers to indicate which data indicators listed in the RMAF are still being collected and how often, as well as how collected indicators are being used. Additional survey questions asked whether or not the data being collected was sufficient, why data was not being collected, what key barriers to data collection exist, etc. A blank survey is attached as Appendix D. Additionally, the "Survey CIDM#" column in the table included in Appendix G contains the relevant CIDM numbers for all of the surveys that were returned completed.

Surveys were distributed to all INAC Assistant Deputy Ministers along with a letter asking that they ensure the proper program administrators fill out one survey for each RMAF. As our group received completed surveys back, we analyzed the information programs provided on the data they were currently collecting in comparison to the data collection commitments listed in the PMP of the relevant RMAFs.

Chapters Two and Three of this report list and explain the conclusions our group developed with regard to steps three and four: Chapter Two describes the results of our analysis of the RMAFs we collected, and Chapter Three describes the results of the Program Manager Surveys that were returned.

1 For the sake of readability, this report often refers to a score of 3/3 as "strong", a score of 2/3 as "acceptable", and a score of 1/3 as "opportunity for improvement".


    Chapter Two: RMAF Quality and Completeness

    2.1 Findings Based on Overall RMAF Scoring

Nine components of each RMAF were evaluated on a scale of one to three. As such, the lowest possible overall score that an RMAF could be given was nine, while the highest was 27. The chart below shows the number of RMAFs that received each overall score.

[Chart: Overall RMAF Scores. Number of RMAFs by total RMAF score (out of 27).]

    Our key findings based on the overall scoring are listed below.

    The average overall score was 22 (out of 27).

    The majority of RMAFs were clustered with overall scores between 21 and 26.

    Only one RMAF received the minimum score of nine.

Those RMAFs that received a low score (less than 20) tended to exhibit the following common characteristics: missing sections, missing components within sections, underdevelopment of sections / lack of detail, and lack of clarity.

These findings seem to indicate that, overall, the quality of RMAFs produced within the department tends to be acceptable. The clustering of most RMAFs within the 21-26 range implies that RMAFs generally meet the minimum Treasury Board Secretariat guidelines, while some improvements in specific areas could be made.


    2.2 Objectives / Planned Results / Logic Model Overall Findings

The first three sections of the RMAF evaluations looked at the stated program Objectives, the program Planned Results, and the RMAF Logic Model. The program Objectives section is a description of what the program is trying to achieve. The Planned Results section outlines the expected immediate, intermediate, and ultimate outcomes that the program expects to have influenced. The Logic Model shows how the program links its activities, outputs, outcomes, and objectives together.

The evaluation of the Objectives was based on the usual 3-point scale, and the review criterion was that the objectives be stated clearly.

The evaluation of the Planned Results section was broken into two sub-categories that were each reviewed on the 3-point scale. The criterion for the first sub-category was that Expected Outcomes be logical and connected to each other and to the objectives. The review criterion for the second sub-category was that the Expected Outcomes be realistic and achievable.

The evaluation of the Logic Model was also on the 3-point scale, and the criterion was that the model be clear and well articulated (logical flow).

As these sections are all intrinsically linked, they are aggregated in this findings section. The findings are described in the chart below on a 12-point scale (four evaluated sections on a 3-point scale). A total score of 10-12 is considered "strong", a score of 7-9 is considered "acceptable", and 4-6 is considered "opportunity for improvement".
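The banding just described can be expressed as a simple lookup. This is only a sketch of the report's stated thresholds, not code used by the study team:

```python
# Bands for the aggregated Objectives / Planned Results / Logic Model score:
# four components, each scored 1-3, so totals range from 4 to 12.
def band(score: int) -> str:
    if not 4 <= score <= 12:
        raise ValueError("aggregate score must be between 4 and 12")
    if score >= 10:
        return "strong"
    if score >= 7:
        return "acceptable"
    return "opportunity for improvement"

print(band(11))  # prints "strong"
```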

The results of our evaluation of the aggregate Objectives / Planned Results / Logic Model sections for the 49 RMAFs are depicted in the chart below:

[Chart: Scores of Objectives / Results / Logic Model. Strong (10-12): 79.6%; Acceptable (7-9): 16.3%; Opportunity for Improvement (4-6): 4.1%.]

Almost 80% of RMAFs were within the "strong" range for the Objectives / Planned Results / Logic Model sections.


Only two RMAFs received "opportunity for improvement" scores within this aggregate category.

The most commonly identified problems with RMAFs that received less than "strong" scores were the following: the use of directional outcomes, lack of clarity in the Logic Model, unclear presentation of outcomes (no defined section), outcomes not directly linking to a Strategic Outcome, and listed outcomes actually being outputs.

Our findings indicate that these components of INAC's RMAFs are being consistently well done across the Department. This is illustrated by the fact that almost 80% of RMAFs received "strong" scores in this aggregate category. This confirms what one would expect: programs usually have a good grasp of what they are trying to achieve (Objectives) and how they plan to do it (Planned Results). These should then be easily integrated into a clear and logically flowing program model (Logic Model). An inability to clearly articulate its Objectives and Planned Results may point to a larger systemic problem within a particular program.

    2.3 Performance Measurement Plan Findings

One of the central purposes of an RMAF is to provide a Performance Measurement Plan for each of INAC's programs. These PMPs are required to include a series of performance indicators which can be used to gauge program success. Furthermore, the PMPs are required to provide a solid data collection plan to ensure that the appropriate information is being collected to measure program success.

To assess the quality of the PMPs in existing RMAFs, our team assessed each PMP on a scale of 1-9. This score combined three of the aforementioned review categories, each evaluated on a scale of 1-3. The results of our evaluation of the PMPs found in the Department's RMAFs are illustrated in the graph below.

[Chart: Performance Measurement Plan Scoring. Number of RMAFs by PMP score (out of 9).]


As the above graph suggests, the majority of the PMPs we assessed contained a significant amount of useful information. However, there were important ways in which the plans were consistently weak.

In order to clearly explain our team's findings concerning the areas of weakness in the existing PMPs across INAC, this section of our report will present our analysis of the plans in two sub-sections. The first sub-section will discuss the quality of the performance indicators found in the RMAFs and the second will discuss the quality of the proposed data collection plans.

    2.3.1 Performance Indicators

The development of useful performance indicators is one of the most important ways for programs to help promote successful results-based management. The RMAF should be viewed as a valuable tool that can be used to help clarify program goals and develop useful, precise, and measurable performance indicators that can be used to gauge program success. While the majority of the RMAFs included some useful performance indicators, we identified several common problems that undermined the usefulness of the PMPs as management tools. The most important problems that we identified in the 49 RMAFs that we studied were:

    Many RMAFs included far too many indicators

Several of the RMAFs contained more than 40 performance indicators. In some instances, the list of performance indicators went on for several pages. Although there were usually some useful indicators to be found in these RMAFs, they were buried underneath so many less-useful indicators that it was often difficult for us to identify them. Program managers attempting to use them would probably encounter the same problem. Out of the 49 RMAFs studied, we found that 19 (39%) of them included too many performance indicators for a manageable performance measurement strategy.

The problem of too many indicators was particularly prevalent for programs with many different outcomes

Programs with a large number of discrete activity areas and a large number of unrelated outcomes were more likely to include an unmanageably large number of performance indicators in their RMAFs. Some RMAFs, such as the National Child Benefit Reinvestment RMAF, the Labrador-Innu Comprehensive Healing Strategy RMAF, and the Renewal of the Urban Aboriginal Strategy RMAF, have several different activity areas and therefore require indicators to measure success towards several different and sometimes unrelated outcomes. The result is that these sorts of programs, most of which are relatively small (financially), tended to have several pages of performance indicators. This created PMPs that were complicated and generally unmanageable for these programs. This finding highlights the need for programs with a large number of outcomes and discrete activity areas to be selective in the development of performance indicators, choosing only those which will be most helpful in gauging performance in a particular activity area.


    Insufficient attention to outcomes rather than outputs

Our analysis of the department's existing RMAFs showed that a large number of program PMPs fail to properly emphasize a commitment to measuring progress towards program outcomes, as opposed to monitoring outputs. One common problem was that in many instances there were too many listed performance indicators that were output-focused. The large number of output-related performance indicators, such as the number of meetings held or the number of documents produced, often made the PMPs unmanageably large without adding to the program's ability to gauge success. Of the 49 RMAFs that were analyzed, our team found that 20 (41%) of the Performance Measurement Plans were overly focused on outputs and would have benefited from a clearer focus on measuring progress towards program outcomes.

    Many performance indicators were vague and difficult to measure

While the majority of the RMAFs that we analyzed contained some useful, measurable performance indicators, several of them also included indicators that were very vague and extremely difficult to accurately measure. Some RMAFs, for example the Initiatives Under the Youth Employment Strategy, contained many indicators that rely on the perceptions of participants concerning whether or not particular outcomes have been achieved. While this sort of information may be useful, it should not be the basis of a performance measurement strategy. Our team found that 18 (37%) of the Performance Measurement Plans that we studied placed too much emphasis on vague performance indicators and would benefit from a greater focus on precise, measurable progress. It is important for the developers of RMAFs to ensure that the PMP is solidly rooted in a series of performance indicators that are precise, measurable, and relatively easy to collect and interpret data for.

The PMP component of the RMAF holds the potential to be an extremely useful management tool. Currently, although most programs have relatively solid PMPs, there are several areas we have identified in which improvement is still needed. By focusing on the development of measurement strategies which are manageable in size, focused as much as possible on measuring progress towards outcomes, and rooted in precise, measurable performance indicators, INAC's programs can help ensure that their RMAFs function as useful management tools.

    2.3.2 Proposed Data Collection Plan

As part of the evaluation of the PMP, the Data Collection Plan was also evaluated. The key review criterion was that the data sources and frequency of data collection be clearly identified. The RMAF evaluations identified a number of issues pertaining to the RMAFs' Data Collection Plans.

    The chart below depicts the most common issues raised:


[Chart: Data Collection Issues. Number of occurrences: Frequency missing/unclear (15); Responsibility unclear / reporting burden (10); Data source unclear (6).]

    Frequency missing/unclear

15 of the 49 RMAF evaluations mentioned that the Data Collection Plans were either unclear in their description of the frequency of data collection, or the frequency was not indicated.

    Responsibility unclear / reporting burden

In ten instances, RMAF evaluations mentioned that the responsibility for data collection was not clearly expressed. This failure to clearly identify responsibility for data collection is a serious issue for INAC to address, as it can often lead to one of two undesirable scenarios. If it is unclear who is responsible for data collection, it is possible that no party will collect the necessary data, making successful performance management difficult. A second possibility is that if responsibilities are unclear, more than one organization will spend time and resources collecting identical data. Considering the heavy reporting burden that already faces First Nations on reserve, who are responsible for a significant portion of data collection, it is important to eliminate redundancy and inefficiency in data collection wherever possible. By clearly identifying responsibilities for data collection in RMAFs, programs can help promote effective, efficient program management while helping to lessen the reporting burden facing First Nations.

    Data source unclear

In six instances, RMAF evaluations indicated that the data sources to be used in various elements of the data collection strategy were not clearly identified. It is an obvious requirement that the source of important data should be clearly stated.


    2.4 Evaluation Plan Findings

All RMAFs are required to include an Evaluation Plan which describes when the program is to be evaluated and provides information designed to assist evaluators in determining how best to assess program performance. Evaluation Plans are expected to include: key evaluation issues, data sources, a data collection strategy, a preliminary methodology, and an estimated cost for the evaluation. As part of our effort to provide a comprehensive assessment of the quality of the department's existing RMAFs, our team examined the Evaluation Plan in each RMAF and assigned each a score according to the three-point scale described in the methodology section of this report.

    The results of our analysis of the quality of existing RMAF Evaluation Plans were:

45% of the RMAFs had strong, detailed Evaluation Plans that included all of the required components.

37% of the RMAFs had acceptable Evaluation Plans that provided significant amounts of useful information but failed to include one or more required components.

18% of the RMAFs either failed to provide an Evaluation Plan at all or provided a very minimal and/or vague Evaluation Plan with few specific details.

[Chart: Scoring of Evaluation Plans. Strong: 45%; Acceptable: 37%; Opportunity for Improvement: 18%.]

Having sorted the Evaluation Plans into these three categories, our team next analyzed the Evaluation Plans which did not receive a perfect score in order to identify the most common omissions. Our key findings were:

The RMAFs that received a grade of "opportunity for improvement" either contained no Evaluation Plan at all or provided an extremely minimal plan which would be of little use to evaluators.


The most common omission found in the RMAFs that we scored as "acceptable" was that they failed to include a detailed methodology that could be used to inform future evaluations. Of the 18 RMAFs that received a grade of "acceptable", 9 either did not include a methodology or provided insufficient detail to help inform future evaluations.

Six of the 18 RMAFs that received a score of "acceptable" failed to provide adequate preliminary evaluation issues.

Evaluation Plans that received a grade of "acceptable" generally omitted one or more required components which are intended to provide specific information about the planned evaluation. Of these 18 RMAFs, 4 did not include the data sources to be used, 4 did not include a data collection strategy, and 6 did not provide an estimate for the cost of the proposed evaluations.

While most of the Evaluation Plans that we studied included significant useful information that will be helpful for evaluators, our analysis points to several possible areas of improvement that will help RMAFs to function as an effective performance management tool. Of particular importance is the aforementioned failure of a large number of RMAF Evaluation Plans to include useful methodologies and evaluation questions. In order for RMAFs to serve as useful tools for facilitating the measurement of program performance, it is important that RMAFs be more consistent in their inclusion of proposed evaluation issues and thorough preliminary evaluation methodologies.

    2.5 Reporting Plan findings

The final category of the evaluations focused on the programs' strategies for reporting their performance data to the department at large. Key characteristics of an acceptable reporting plan include: clear descriptions of the nature and scope of the plan, responsibility for the preparation of the plan, the reporting frequency, and the intended user(s) or readers of the reports.

The Reporting Plans were evaluated on the 3-point scale described in the Methodology section above. The results of our evaluation of the Reporting Plans for the 49 RMAFs are depicted in the chart below:


[Chart: Scores of Reporting Plan. Strong: 38.8%; Acceptable: 44.9%; Opportunity for Improvement: 16.3%.]

    Our overall findings for the evaluation of the Reporting Plans are listed below:

    Almost 84% of RMAFs received a score of acceptable or strong on their Reporting Plan.

Few RMAFs received an "opportunity for improvement" rating for their Reporting Plan.

For those RMAFs that received a score of 1 (opportunity for improvement) or 2 (acceptable), the most common problem indicated was missing components. For example, the plan did not outline who the intended users are, or the plan did not clearly indicate the reporting frequency.

The findings indicate that the Reporting Plans are generally of acceptable quality. While some improvement is needed to ensure that all necessary components are included, very few of the plans did not meet minimum requirements.

    2.6 Conclusions

The RMAF is, potentially, a very useful tool for facilitating effective, results-based program management. Our analysis concluded that, while most existing RMAFs are of acceptable quality and contain significant useful information, there are several areas which are generally in need of improvement. In order for future RMAFs to serve their purpose as effective management tools, we identified the following key issues that the department should seek to address:

Many RMAFs (38%) contain too many performance indicators. Reducing the number of performance indicators will allow for the development of manageable PMPs.

Performance indicators are still too focused on outputs rather than outcomes. By ensuring that indicators are aligned with outcomes, programs can ensure that they are appropriately measuring progress towards the program's stated objectives.


There is a need to ensure that program data collection strategies are clear and comprehensive. To ensure that all necessary data is being collected and to prevent duplication in the collection of data, RMAFs must clearly identify responsibilities for data collection. RMAFs must also clearly state the data sources to be used and the frequency of collection to ensure effective, efficient data collection.

The Evaluation Plans included in RMAFs should consistently include preliminary evaluation issues and proposed evaluation methodologies.

The Reporting Plans contained in existing RMAFs are generally well done. In order to further improve the reporting plans it is necessary to ensure that all necessary components are consistently included.


    Chapter Three: Status of RMAF Implementation

    3.1 Rationale

A component of this study was to investigate the status of RMAF implementation for programs across INAC. Data collection commitments for programs, as stated in their respective RMAFs, were assessed by comparing responses from program surveys to the PMPs found in the program RMAFs. In evaluating the status of RMAF implementation, we also aimed to identify gaps and duplications in data collection.

    3.2 Approach

In June 2008, we developed a survey (see Appendix D) to identify the extent to which RMAF PMPs were being implemented. Surveys were distributed to Assistant Deputy Ministers who, in turn, requested all program administrators to fill out the survey. Upon completion, survey responses were analyzed and major findings were reported by our team. In reporting on the general trends, we found it useful to aggregate survey responses in the following manner:

    "Few" refers to the same response by 1-3 surveys.

    "Some" refers to the same response by 4-6 surveys.

    "Many" refers to the same response by 7 or more surveys.
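As a hypothetical illustration (the study itself did not use code, and the function name is our own), the aggregation rule above can be expressed as a small helper:

```python
def aggregate_label(count: int) -> str:
    """Map the number of identical survey responses to the report's label."""
    if 1 <= count <= 3:
        return "few"
    elif 4 <= count <= 6:
        return "some"
    elif count >= 7:
        return "many"
    return "none"
```

For example, seven or more surveys giving the same answer would be reported as "many".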

    3.3 Profile of the Survey Responses

    As of the first week of August 2008, we had received 44 responses to our survey from program administrators. Of these, 26 were completed surveys adequate for analysis (see Appendix G for the survey results). In one case, multiple surveys were received for a single RMAF, so the 26 surveys corresponded to 22 RMAFs. The remaining 18 responses did not include a completed survey, instead indicating that a survey could not be completed for one of the following reasons:

    Many of the RMAFs in the initial inventory were outdated. Approved several years ago, the RMAFs in this group either had been replaced by an updated RMAF or corresponded to programs that were no longer active.

    Some RMAFs had been completed so recently that a survey could not be filled out for them. For some of the programs in this group, implementation had not yet begun. Also included in this category were programs that were updating, or awaiting final approvals for, their RMAFs before beginning implementation of their performance measurement strategies.

    No response was received for many RMAFs. One possible explanation for this is that program administrators are not always aware that their programs are covered by an RMAF. It is also possible that some of the RMAFs for which we received no response


    fall into the outdated or too recent categories as well. Finally, surveys may not have reached the proper parties; some program managers may have been on vacation.

    We also received a few surveys for RMAFs that were not included in our original inventory. These RMAFs were approved by the Treasury Board after the April 2008 cut-off date of the Treasury Board Submissions that we reviewed. Because they were received well after our RMAF quality and completeness analysis (Chapter Two) had been completed, they were not included in that portion of the project. However, their surveys were included in the data collection analysis that follows.

    A graphical description of this breakdown is provided below.


    [Figure 3.1: RMAF and Responses Breakdown. A flowchart showing 59 RMAFs in total: 49 initially found from the TB Submission search (9 outdated, 40 active) plus 10 newly found during the survey. For the initially found RMAFs there were 32 responses (16 full surveys, 16 verbal responses), with no responses for 8 RMAFs. For the newly found RMAFs there were 14 responses (10 full surveys, 5 of them for 1 RMAF, and 4 verbal responses), for 26 full surveys in total.]


    Responses were received from all sectors, although at slightly different rates. The breakdown of the 44 program responses received, by sector and by Strategic Outcome (SO), can be represented graphically as follows:

    [Figure: Sector Response Rate, showing the number of responses by sector: TAG, Land and Trust Services, SEPRO, Northern Affairs, Corporate Services, Office of the Federal Interlocutor, Policy & Strategic Direction, Indian Residential School, and other sectors.]

    [Figure: SO Response Rates, showing the number of responses by Strategic Outcome: The Government, The People, The Land, The Economy, The North, Office of the Federal Interlocutor, Internal Services, and other.]

    3.4 Specific Findings

    Our key findings derived from our analysis of the program surveys are as follows:

    Data is being collected for 42.9% of all performance indicators listed in the 22 RMAFs that were examined in this part of the project.


    The 22 RMAFs for which we received surveys included a total of 492 performance indicators. Of these 492 performance indicators, the corresponding surveys indicated that data was being collected for 211 (42.9%). Notably, the surveys revealed that data was being collected for an additional 27 indicators not listed in the respective program RMAFs.² A list of all indicators included on returned Program Manager Surveys is included in Appendix H.

    Of all the indicators for which data is being collected, 83.6% are said to be used (199 out of 238 collected indicators).

    The most commonly reported reason for non-use was that data for the indicators was still being collected. Nevertheless, it appears that most of the collected data is being used. Furthermore, respondents indicated that data was most commonly used for annual and/or quarterly reporting, and for departmental and other performance reports.
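The collection and usage rates cited above can be reproduced directly from the counts given in the text (a simple arithmetic sketch, not part of the original study; all counts come from the report):

```python
# Counts taken from the report text.
listed = 492             # indicators listed across the 22 RMAFs surveyed
collected_listed = 211   # listed indicators for which data is being collected
collected_extra = 27     # collected indicators not listed in any RMAF

collection_rate = collected_listed / listed
print(f"collection rate: {collection_rate:.1%}")   # 42.9%

collected_total = collected_listed + collected_extra   # 238 collected indicators
used = 199
usage_rate = used / collected_total
print(f"usage rate: {usage_rate:.1%}")             # 83.6%
```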

    When surveys and their corresponding RMAFs are grouped according to strategic outcomes, data is being collected for performance indicators in the following manner:

    [Figure: % of SO performance indicators in RMAFs actually being collected, by Strategic Outcome (The Government, The People, The Land, The Economy, The North, Internal Services); reported rates range from 0% to 62.7%.]

    When surveys and their corresponding RMAFs are grouped according to sectors, data is being collected for performance indicators in the following manner:

    ² These figures should be considered approximations. At times, it was difficult to determine whether a given indicator should be counted as a single indicator or as multiple indicators. In addition, the way in which performance indicators were phrased in the surveys was not identical to the way they were originally stated in the respective RMAFs. As such, judgement was occasionally required when attempting to match indicators from surveys with their corresponding RMAFs.


    [Figure: % of sector performance indicators in RMAFs actually being collected, by sector (Treaty and Aboriginal Government, Land and Trust Services, SEPRO, Northern Affairs, Corporate Services, Policy & Strategic Direction, Indian Residential School); reported rates range from 0% to 92.6%.]

    Many respondents perceive that data requirements are currently being met.

    When asked "To what extent is the data identified in the RMAF's Performance Measurement Framework (PMF) currently being collected?" (Question #1), 18 out of 26 respondents indicated that it is between 75% and 100%. This suggests that program managers may be overestimating the implementation of their RMAFs' PMFs.

    [Figure: Programs' perceptions of the proportion of data currently being collected, showing the number of respondents by response: 0-25%: 0; 25-50%: 2; 50-75%: 3; 75-100%: 18; do not know: 3.]
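As a quick tabulation of the Question #1 responses shown above (a sketch, not part of the original study; the derived percentage is ours, not stated in the report):

```python
# Question #1 responses: perceived extent of PMF data currently being collected.
responses = {"0-25%": 0, "25-50%": 2, "50-75%": 3, "75-100%": 18, "do not know": 3}

total = sum(responses.values())           # 26 completed surveys
share_high = responses["75-100%"] / total
print(total, f"{share_high:.0%}")         # 26 69%
```

Roughly 69% of respondents placed their data collection in the 75-100% band, against the 42.9% collection rate computed from the surveys themselves.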

    Many of the respondents perceive the data being collected as clearly demonstrating the effectiveness of their programs.

    When asked "Is the data being collected sufficient to provide an accurate picture of the effectiveness of the program?" (Question #3), 82% of the respondents indicated that it is.


    [Figure: Does the data collection give a clear picture of success? Yes: 82%; no: 9%; other: 9%.]

    3.5 General Findings

    Trends in our analysis point to three general findings which contribute to a better understanding of performance measurement and reporting at INAC.

    Finding #1: Overall, data collection based on RMAF PMPs is far from complete.

    According to the surveys we received, data is actually being collected for only 42.9% of the performance indicators listed in the RMAFs. This is a troubling statistic. If the required indicators are not being collected, there may be insufficient information to determine whether or not a program is successful. Paradoxically, in light of the low data collection rate, only a few of the survey respondents indicated that they believe the data being collected is insufficient to provide an accurate picture of the effectiveness of the program. One possible explanation is that programs are collecting only the important performance indicators. As noted in Chapter Two, many RMAFs include an unnecessary number of output-based indicators that will not be particularly helpful in determining the effectiveness of a program. Interestingly, for those RMAFs that break down their performance indicators by output/activity, immediate outcome, intermediate outcome, and final outcome, a smaller percentage of indicators aligned with intermediate and final outcomes was being collected than of indicators aligned with outputs/activities and immediate outcomes.

    It is important to note that these values and percentages may be inaccurate. Many programs are only able to collect intermediate and final outcome data in the later stages of program implementation. As such, it is possible that the proportion of these indicators reported as being collected in Program Manager Surveys was skewed downwards. While the data does not clearly indicate that program managers are simply declining to collect less useful output-oriented data and instead focusing on more valuable outcome-oriented data, this may still be the case.

    The low data collection rate and several written comments indicate that original data collection plans are often modified by program administrators over time to better represent


    program activities. Indeed, several survey respondents indicated that the initial PMPs laid out in their program RMAFs had been, or were in the process of being, revised.

    On one hand, it is encouraging to see that programs are flexible enough to make the RMAFs work for them. Adapting the PMP may demonstrate a program's ability to recognise new circumstances and evolve accordingly over time to better represent the program's outcomes. One program wrote that "we are working to improve the quality of data so it can provide us with better evidence of the characteristics of our clients, the degree to which our programs meet their needs and a more accurate picture of program effectiveness." Considering how new RMAFs are to line departments, it is not surprising that many programs may have written their PMPs without full knowledge of the consequences of these plans and are therefore revising them as they implement them over time.

    On the other hand, this gap between commitments and actual data collection may demonstrate that programs are revising the criteria they are held accountable for without approval from either the Audit and Evaluation Sector or the Treasury Board Secretariat. This would render performance measurements of these programs null.

    Finding #2: There is a lack of capacity within INAC and among First Nations to collect data.

    One of the most prevalent comments in the returned surveys was that there is insufficient capacity to collect all of the performance indicators listed in the RMAF. This is likely the most important reason that data collection coverage is so low. Limited capacity of both INAC and stakeholders was reported to be a problem. This lack of capacity includes, but is not limited to, insufficient resources, limited ability among some First Nations to fill out the data collection forms, lack of internet access for data reporting, lack of INAC personnel to perform data analysis, and a lack of information systems to assess the data. For instance, one response explained,

    "There are three barriers [to effective data collection]:

    a) We don't have the most relevant indicators and the data collection systems associated with them;

    b) First Nations don't have the capacity to collect data;

    c) We don't have the capacity at HQ and Regions to collect and analyze the data being provided by First Nations."

    Another wrote, "The barriers to more effective data collection are: not wanting to add to the reporting burden of First Nations, the availability of data the program wants to collect, as well as no information system to collect data, store and analyze data."

    Even where data is being collected, these factors can lead to an inability to utilise the data and monitor performance, in addition to lags and gaps in the reporting of these performance measurements.


    Finding #3: No obvious data collection overlap.

    One of the hopes for this project was that overlapping data collection commitments and collection activities would be identified. If we had determined that many different programs were collecting the same data, there would be room for streamlining to save resources and reduce the reporting burden on First Nations. Unfortunately, our team was unable to locate any significant areas of overlapping data collection. The most important reason for this is that most performance indicators are program specific; for example, RMAFs dealing with climate change will measure the reduction in greenhouse gas emissions that occurred as a result of upgrades to power plants funded by the program, rather than the total change in greenhouse gas emissions. A second possible reason is that so few indicators are being collected: if 100% rather than 42.9% of indicators were being collected, it is possible that overlap would become a problem. Finally, one very interesting possible explanation for the lack of overlapping data collection is that program managers are already refraining from collecting some possibly overlapping data in order to reduce the reporting burden on First Nations. Indeed, some survey respondents indicated the reporting burden to be a reason why data for all indicators was not being collected.


    Chapter Four: Conclusion

    This report has listed and explained the main findings of the RMAF Special Study project. When we compared INAC's RMAFs to the Treasury Board Secretariat RMAF guidelines, we found that, overall, their quality was generally high. Nevertheless, substantial room for improvement remains. Most importantly, too many RMAFs include far too many indicators, focus too much on outputs and too little on outcomes, include unclear or incomplete data collection strategies, and provide little information on proposed evaluation issues and methodologies.

    With respect to data collection implementation, it appears that less than half of the performance indicators listed in RMAFs are actually being collected. Insufficient stakeholder and INAC capacity is likely the most important factor causing the low level of data collection. Additional possible sources of low data collection are the avoidance of what are perceived as unhelpful indicators and attempts to reduce the First Nations reporting burden. There do not seem to be significant opportunities to further reduce that reporting burden by eliminating overlapping data collection efforts, as no such overlap was identified.

    The comparatively small problems with RMAF quality seem much less serious than the extremely poor data collection coverage. However, it should be noted that the problems may be related. Among the most troubling aspects of RMAF quality were the very large number of indicators in many RMAFs and the very large number of output- rather than outcome-oriented performance indicators. If indicators were fewer and more narrowly focused on what matters, it could become easier to collect a larger proportion of performance indicators.

    In conclusion, one thing is absolutely clear: an RMAF's potential is only realized when it is used as the management tool it was designed to be. Implementation is only the first step in this. Just as importantly, the department must take seriously the insight that RMAFs and their Performance Measurement Strategies offer. To date, this has been difficult because of the scarcity of information available on RMAF quality and coverage within the department. As such, the most valuable aspect of this study is not in its conclusions, but rather in the very large quantity of raw information that is appended to this report. This information should be kept up to date and extended so that ongoing performance monitoring and evaluation can increasingly benefit from the wealth of information that well-designed and well-implemented RMAFs provide.


    Recommendations

    The RMAF Project Final Report submitted on August 13th drew a variety of conclusions regarding RMAF quality and implementation at INAC. This document is intended to supplement the final report by providing a few concrete recommendations for how INAC's use of RMAFs could be improved.

    1. Performance Measurement Plans (PMPs) should contain short lists of performance indicators that are direct, outcome-oriented, precisely defined, and collectable.

    There are five parts to this recommendation:

    PMPs should contain a short list of performance indicators. Long PMPs suffer from lack of focus and are difficult to implement. Reducing the total number of indicators will force program administrators to select the most effective indicators and give a less diluted picture of overall program efficacy.

    Indicators should be direct. Proxy measures (for example, measuring perceptions of outcomes rather than outcomes) are substantially less valuable than indicators designed to directly measure whether or not the objectives of an initiative have been met.

    Indicators should be outcome-oriented. Indicators that focus on outputs and activities do little to show how effective a program has been, and only dilute the overall usefulness of a PMP.

    Indicators should be precisely defined. It should be absolutely clear what an indicator will measure and how the necessary data will be collected. Poorly defined indicators only frustrate those responsible for data collection and necessitate revisions to the PMP.

    Indicators should be collectable. The value of the information indicators provide should be weighed against the cost and feasibility of allocating resources to data collection efforts. Any indicator that cannot or will not be collected should be excluded from the PMP.

    It appears that, currently, only a small proportion of the performance indicators that RMAFs list are being collected. Improving the focus and quality of PMPs while decreasing their size will help give a less diluted assessment of overall program performance, increase the overall rate of data collection, ensure that the most essential indicators are always collected, and reduce the strain on the data collection resources of INAC and First Nations.


    2. INAC should track the status of RMAF development and implementation.

    RMAFs are intended as tools to be used in ongoing program management. In spite of this, until now, it has been impossible to keep track of which programs do and do not have RMAFs, whether RMAFs are still active, whether they have been revised, and whether they are being implemented. A near-complete lack of information on the current status of RMAFs makes it all but impossible for them to be used in an ongoing manner.

    While responsibility for implementing and keeping RMAFs current ultimately rests with the implementing sector, it is not surprising that some RMAFs get lost in the shuffle when many managers are liable to encounter interest in their RMAF only twice: once when the program's authorities come up for approval or renewal, and once when the program is evaluated. Mechanisms to monitor, and when necessary correct, the status of RMAF implementation must be developed. The first step towards doing this successfully is simply keeping an up-to-date record of the status of every INAC RMAF.

    3. Engaging Program Managers

    Our survey results indicate that Program Managers are very concerned with the quality of their programs and the experience of the clients they serve. However, the limitations they face (including time and resource constraints as well as lack of capacity) negatively impact their ability to plan carefully for performance measurement and evaluation, as well as to monitor programs and collect data. Improving their understanding of how RMAFs are a useful program management tool, and building their capacity for strategic planning, could improve RMAF quality and facilitate implementation in the future.


    Areas for Further Investigation

    The project participants have identified the following areas where additional research would be informative.

    1. Ongoing Project Maintenance

    The most useful aspect of the project may be the extensive information gathered on INAC's RMAFs and their implementation. This information should be kept up to date.

    RMAFs should be added to the RMAF inventory as they are approved by the Treasury Board.

    Inquiries should be made as to implementation, renewal, and expiry arrangements as RMAFs in the inventory age.

    RMAFs for which the current status remains unknown should be investigated.

    2. The Role of Consultants in RMAF Development

    During the study, it became clear that many of the RMAFs examined had been developed by consultancy firms rather than by the sectors that would eventually be responsible for their implementation. This may mean that RMAF implementers are not as familiar with their RMAFs as they should be, and/or that the RMAF authors were not as familiar with the realities of program implementation as they should have been. It may be possible to identify those RMAFs that were developed outside of the department, and to cross-reference this information with RMAF quality and degree of implementation.

    3. Mapping RMAFs by Program Authority

    It remains unclear what percentage and which portions of INAC's overall spending are covered by RMAFs. An investigation into RMAF coverage by program authority would be necessary to make progress on this front.

    Additionally, linking up with other sectors of INAC (for example, Finance) that also have an interest in this area would be useful in carrying out the next steps of the project.


    Appendix A: RMAF Special Study Terms of Reference

    RMAF Special Study: Terms of Reference

    Context

    The purpose of this study is to contribute to the Evaluation, Performance Measurement, and Review Branch's annual commitment to review performance monitoring and results reporting within the Department of Indian and Northern Affairs. This will be accomplished by assessing the quality and implementation of Results-Based Management Accountability Frameworks (RMAFs) within the Department. Furthermore, this study will contribute to preparations for the Strategic Review of 2009-10 by reporting on the state of performance measurement in the department through the lens of RMAFs. The findings from the study will be presented as a final report to the Evaluation team.

    Approach

    Four steps are required to complete this final report. First, the study requires collecting all existing RMAFs by reviewing all Treasury Board Submissions by the Department since 2000.

    The second task is to map out (correlate) the collected RMAFs against the 2009-2010 INAC Program Activity Architecture (PAA). RMAFs currently under development will also be considered if possible. By examining which activities and sub-activities on the PAA are covered by an RMAF, the department can better understand the state of accountability and performance measurement within the department. To complement this assessment, some study participants are also mapping audits and evaluations conducted to date by the A&E sector.

    The third task is to assess the quality of the collected RMAFs and the effectiveness of the performance measurement tools they describe. RMAFs will be evaluated against the requirements set out by the Treasury Board Secretariat. The logic and usefulness of performance indicators will also be assessed.

    In addition, this study will verify whether the commitments stated in RMAFs are currently being implemented. Hence, the fourth task is to determine whether data is actually being collected by the programs based on their RMAF commitments. This will be done via surveys to the programs. The collected information will be matched against the information from the relevant RMAFs. With this knowledge, the department can identify where gaps or duplications in the data exist and continue to work towards decreasing the reporting burden imposed on First Nations.

    Methodology

    This study will perform the following:

    1. An RMAF inventory (completed June 3rd; the team collected approximately 50 RMAFs):
       o Identify all existing RMAFs and create an inventory list by:
         - examining all INAC Treasury Board Submissions since 2000;
         - exploring the Audit and Evaluation Sector's master authority table;
         - meeting with members of the evaluation team.


    2. An RMAF Map (to be completed by end of June):
       o Use the PAA to determine where RMAFs fit into the department's PAA.

    3. A review (to be completed by end of June):
       o Assess the quality of the RMAFs using the template based on Treasury Board criteria.
       o Take note of data collection commitments listed in each RMAF.

    4. Determine the State of RMAF Implementation (July):
       o Verify that data collection commitments are being fulfilled. This is to be done using a survey written by the study team, but distributed to programs by senior members of the evaluation team.
       o Determine where there are gaps or duplications in the data collected across the department.

    5. Final Report (to be completed by August 7th):
       o Draft a final report and presentation materials describing findings.
       o Brief the evaluation team on August 7th.

    Key Deliverables

    A written report (to be completed by August 7th) consisting of:
       o executive summary;
       o background (description of the project, RMAF requirements & purpose, etc.);
       o RMAF and program evaluation coverage at INAC;
       o data collection, gaps and duplications;
       o notable observations; and
       o future directions.

    Useful raw data and working documents (accompanied by brief explanations to facilitate interpretation if necessary) such as:
       o Treasury Board submissions inventory table;
       o RMAF inventory table;
       o RMAF evaluation forms;
       o data indicators tables; and
       o survey to programs results.

    Project Management

    Each person is responsible for developing an RMAF inventory of their assigned SO.

    Each person will be responsible for a set number of RMAFs to map and evaluate.

    Group members in Toronto will focus on mapping completed audits and evaluations to the PAA to complement findings from the RMAF review.

    Individual group members will be given specific additional responsibilities (liaising with the members of the evaluation team, chairing meetings, taking minutes, etc.).


    Appendix C: RMAF Review Template

    INAC-AINC RMAF Review Template for the Interns, Summer 2008*
    (Treasury Board RMAF Review Template, modified)

    Header fields: Reader; RMAF #; RMAF Title; TB Submission #(s); Date Approved; SO/PA/SA/SSA; Objectives and Expected Results.

    Each review criterion below is given a score from 1 to 3 (1 = incomplete; 2 = meets requirement but needs work; 3 = meets all requirements), together with an explanation for the score given.

    Review Criteria

    Objectives:
       o The objectives are stated clearly.
       o The expected outcomes (immediate, intermediate, and ultimate) are logically articulated and connected to each other (i.e., there is a logical flow/chain).
       o The outcomes are consistent with and/or linked to the above-stated objectives.

    Planned Results:
       o The expected outcomes are realistic and achievable.

    Logic Model:
       o The logic model is clear and well articulated (i.e., there is a clear logical flow/chain).

    Evaluation Review Criteria

    Performance Measurement Plan:
       o Overall, the performance measurement strategy is sound (i.e., there is a logical cascading of indicators and a solid data collection plan).
       o The performance indicators for all levels of outcomes stated are relevant and logical (indicators are appropriately aligned to outcomes, not outputs).
       o The data sources and frequencies of data collection are clearly identified.

    Evaluation Plan:
       o Evaluation plans and commitments are well developed, especially for summative evaluations (e.g., type of evaluation, key evaluation issues, data sources and collection strategy, proposed methodologies, estimated cost, and proposed timing).

    Reporting Review Criteria
       o The nature/scope, the responsibility for preparation, the frequency, and the intended user(s) of the reports are well defined.


    Appendix D: RMAF Special Study Survey to Program Managers on Data Collection

    RMAF Special Study

    Survey for Program Managers on Data Collection

    RMAF Title:
    Sector:
    Strategic Outcome / Activity / Sub-Activity:
    Your Name and Title:
    Contact Information:

    1. To what extent is the data identified in the RMAF's performance measurement framework (PMF) currently being collected? (All performance indicators for this program are outlined in the PMF section of the RMAF.) (Please check a box.)

       0-25%
       25-50%
       50-75%
       75-100%
       Do not know

    2. Please specify, in the table that follows (and expand if necessary to provide complete information):
       a. the indicators (only) for which data is currently being collected;
       b. how often data is collected for each indicator;
       c. how the collected data is being used (e.g. Quarterly Reporting, DPR, etc.).


    The table includes the following columns:
       o Performance indicators for which data is currently and consistently being collected;
       o Frequency/cycle of the data collection (annual, quarterly, etc.);
       o Do you currently use this data? (Y/N);
       o If you use the data, how is this data currently being used (i.e. what is the reporting format used)?;
       o If you do not use this data, please specify why.

    Additional Questions:

    3. Is the data being collected sufficient to provide an accurate picture of the effectiveness of the program?

    4. If data for indicators listed in the RMAF are not being collected, why is this the case?

    5. What are some of the barriers to more effective data collection?

    Additional Comments (optional):