ISACA’s COBIT® Assessment Programme (based on COBIT® 5)
Presented by:
Understanding the COBIT Assessment Programme, its ISO/IEC 15504 base and the use of COBIT 5 content in it
Understanding the relationship to ISO/IEC 15504 and why ISACA selected this standard and approach
Understanding the COBIT Assessment Programme materials and support from ISACA
Session Objectives
Copyright ISACA 2014 All rights reserved Slide 2
ISO/IEC 15504-4 identifies process assessment as an activity that can be performed either as part of a process improvement initiative or as part of a capability determination approach.
The purpose of process improvement is to continually improve the enterprise’s effectiveness and efficiency.
The purpose of process capability determination is to identify the strengths, weaknesses and risks of selected processes with respect to a particular specified requirement, through the processes used and their alignment with the business need.

Process assessment provides an understandable, logical, repeatable, reliable and robust methodology for assessing the capability of IT processes.
What Is a Process Assessment?
The COBIT Assessment Programme brings together two proven heavyweights in the IT arena, ISO and ISACA.
The process assessment standard from ISO, ISO/IEC 15504, is combined with the process model from COBIT 5 to provide an understandable, logical, repeatable, reliable and robust methodology for assessing the capability of IT processes.
What is the COBIT Assessment Programme?
Programme support

The COBIT Assessment Programme (www.isaca.org/Knowledge-Center/cobit/Pages/COBIT-Assessment-Programme.aspx) products include:
• COBIT Self-assessment Guide: Using COBIT 5
• A self-assessment tool kit
• COBIT Assessor Guide: Using COBIT 5
• COBIT Process Assessment Model (PAM): Using COBIT 5
In addition, Accredited Training Organizations (ATOs) deliver the COBIT Assessor training course to candidates who have obtained the COBIT 5 Foundation certification.

ISACA has established a Certified COBIT Assessor certification to allow appropriately trained and experienced assessors to demonstrate their competence to assessment project sponsors: www.isaca.org/COBIT/Pages/COBIT-5-Certified-Assessor-Program.aspx.
Self-assessment approach
Simple, standalone guidance (10 pages plus short appendices and a supporting tool kit) has been developed as a Self-assessment Guide to support completion of a simplified assessment approach.

This approach can be used to perform a less rigorous status assessment, perhaps to identify problem or issue areas for internal discussion or to target a formal, ISO/IEC 15504-compliant assessment in the future.

This approach is aligned with the formal approach but does not require evidence collection. It is a good way to learn about the programme initially.
Assessment Overview
This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
Process Assessment Model
Assessment Process
Process Reference Model (PRM)
The COBIT process reference model (PRM) is defined in the Process Assessment Model publication.
The PRM content is directly based on COBIT 5: Enabling Processes content, with adjustments only made to reflect ISO/IEC 15504 terminology as necessary.
Process domain and scope, purpose and outcomes are defined for each of the 37 COBIT 5 processes.
Assessment Overview
This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
The COBIT assessment process measures the extent to which a given process achieves specific attributes defined for that process (‘process attributes’).

The COBIT assessment process defines nine process attributes (based on ISO/IEC 15504-2):
• PA 1.1 Process performance
• PA 2.1 Performance management
• PA 2.2 Work product management
• PA 3.1 Process definition
• PA 3.2 Process deployment
• PA 4.1 Process measurement
• PA 4.2 Process control
• PA 5.1 Process innovation
• PA 5.2 Continuous optimisation
Measurement Framework
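The nine attributes and the capability levels they belong to can be captured as a small lookup table. The following Python sketch is purely illustrative (the dictionary name and structure are assumptions, not ISACA tooling):

```python
# Illustrative sketch: the nine ISO/IEC 15504-2 process attributes,
# grouped by the capability level each belongs to (levels 2-5 have
# two attributes each; level 1 has one).
PROCESS_ATTRIBUTES = {
    1: {"PA 1.1": "Process performance"},
    2: {"PA 2.1": "Performance management",
        "PA 2.2": "Work product management"},
    3: {"PA 3.1": "Process definition",
        "PA 3.2": "Process deployment"},
    4: {"PA 4.1": "Process measurement",
        "PA 4.2": "Process control"},
    5: {"PA 5.1": "Process innovation",
        "PA 5.2": "Continuous optimisation"},
}

# Sanity check: nine attributes in total across levels 1 to 5.
total = sum(len(attrs) for attrs in PROCESS_ATTRIBUTES.values())
print(total)  # 9
```

A table like this is the natural input for any tooling that derives a capability level from per-attribute ratings.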
Process Capability Levels

Level 0 Incomplete process: The process is not implemented or fails to achieve its purpose.

Level 1 Performed process (PA 1.1 Process performance attribute): The process is implemented and achieves its process purpose.

Level 2 Managed process (PA 2.1 Performance management and PA 2.2 Work product management attributes): The process is managed and work products are established, controlled and maintained.

Level 3 Established process (PA 3.1 Process definition and PA 3.2 Process deployment attributes): A defined process is used based on a standard process.

Level 4 Predictable process (PA 4.1 Process measurement and PA 4.2 Process control attributes): The process is enacted consistently within defined limits.

Level 5 Optimizing process (PA 5.1 Process innovation and PA 5.2 Process optimization attributes): The process is continuously improved to meet relevant current and projected business goals.
PA 1.1 Process performance
• The process performance attribute is a measure of the extent to which the process purpose is achieved.
• As a result of full achievement of this attribute, the process achieves its defined outcomes.
Process Attributes (example)
PA 2.1 Performance management
• A measure of the extent to which the performance of the process is managed. As a result of full achievement of this attribute:
a. Objectives for the performance of the process are identified.
b. Performance of the process is planned and monitored.
c. Performance of the process is adjusted to meet plans.
d. Responsibilities and authorities for performing the process are defined, assigned and communicated.
e. Resources and information necessary for performing the process are identified, made available, allocated and used.
f. Interfaces between the involved parties are managed to ensure effective communication and clear assignment of responsibility.

PA 2.2 Work product management
• A measure of the extent to which the work products produced by the process are appropriately managed. As a result of full achievement of this attribute:
a. Requirements for the work products of the process are defined.
b. Requirements for documentation and control of the work products are defined.
c. Work products are appropriately identified, documented and controlled.
d. Work products are reviewed in accordance with planned arrangements and adjusted as necessary to meet requirements.
Process Attributes (example)
The COBIT assessment process measures the extent to which a given process achieves the ‘process attributes’:
Process Attribute Rating Scale
N Not achieved (0 to 15% achievement): There is little or no evidence of achievement of the defined attribute in the assessed process.

P Partially achieved (>15% to 50% achievement): There is some evidence of an approach to, and some achievement of, the defined attribute in the assessed process. Some aspects of achievement of the attribute may be unpredictable.

L Largely achieved (>50% to 85% achievement): There is evidence of a systematic approach to, and significant achievement of, the defined attribute in the assessed process. Some weakness related to this attribute may exist in the assessed process.

F Fully achieved (>85% to 100% achievement): There is evidence of a complete and systematic approach to, and full achievement of, the defined attribute in the assessed process. No significant weaknesses related to this attribute exist in the assessed process.
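As an illustration only (not part of the official programme materials), the N/P/L/F scale can be expressed as a simple mapping from an achievement percentage to a rating; the function name is an assumption:

```python
def rate_attribute(achievement_pct: float) -> str:
    """Map a percentage of attribute achievement to the N/P/L/F
    rating scale defined in ISO/IEC 15504-2 (illustrative sketch)."""
    if achievement_pct <= 15:
        return "N"   # Not achieved: 0 to 15%
    elif achievement_pct <= 50:
        return "P"   # Partially achieved: >15% to 50%
    elif achievement_pct <= 85:
        return "L"   # Largely achieved: >50% to 85%
    return "F"       # Fully achieved: >85% to 100%

print(rate_attribute(10), rate_attribute(40),
      rate_attribute(70), rate_attribute(95))  # N P L F
```

In a real assessment the percentage itself is an assessor judgement backed by objective evidence; the mapping only fixes the band boundaries.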
To reach a given capability level, a process must Fully achieve the attributes of all lower levels and at least Largely achieve the attributes of the target level:

Level 1 Performed: PA 1.1 rated L or F
Level 2 Managed: PA 1.1 rated F; PA 2.1 and PA 2.2 rated L or F
Level 3 Established: PA 1.1 to PA 2.2 rated F; PA 3.1 and PA 3.2 rated L or F
Level 4 Predictable: PA 1.1 to PA 3.2 rated F; PA 4.1 and PA 4.2 rated L or F
Level 5 Optimizing: PA 1.1 to PA 4.2 rated F; PA 5.1 and PA 5.2 rated L or F

L/F = Largely or Fully achieved; F = Fully achieved

Process Attribute Ratings and Capability Levels
This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
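The ISO/IEC 15504-2 rule for deriving a process’s capability level from its attribute ratings can be sketched in Python as follows. This is an illustrative implementation, not official ISACA tooling; the function and dictionary names are assumptions:

```python
# Attributes belonging to each capability level (ISO/IEC 15504-2).
ATTRS_BY_LEVEL = {
    1: ["PA 1.1"],
    2: ["PA 2.1", "PA 2.2"],
    3: ["PA 3.1", "PA 3.2"],
    4: ["PA 4.1", "PA 4.2"],
    5: ["PA 5.1", "PA 5.2"],
}

def capability_level(ratings: dict) -> int:
    """Return the achieved capability level given per-attribute
    N/P/L/F ratings: attributes of all lower levels must be Fully
    achieved, and the target level's attributes at least Largely."""
    achieved = 0
    for level in range(1, 6):
        level_ok = all(ratings.get(a, "N") in ("L", "F")
                       for a in ATTRS_BY_LEVEL[level])
        lower_ok = all(ratings.get(a, "N") == "F"
                       for lvl in range(1, level)
                       for a in ATTRS_BY_LEVEL[lvl])
        if level_ok and lower_ok:
            achieved = level
        else:
            break
    return achieved

# A process with PA 1.1 Fully and both level 2 attributes Largely
# achieved sits at capability level 2.
print(capability_level({"PA 1.1": "F", "PA 2.1": "L", "PA 2.2": "L"}))  # 2
```

Note that a Largely achieved rating is enough to claim the target level, but blocks progression to the level above, which requires Full achievement of everything below it.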
COBIT Assessment Process Overview
This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
Process Capability Levels and Attributes

[Figure: mapping of the COBIT 5 process capability levels to the ISO process attributes.] This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
Process Attribute Rating
Assessment indicators in the PAM are used to support the assessors’ judgement in rating process attributes:
• They provide the basis for repeatability across assessments.
A rating is assigned based on objective, validated evidence for each process attribute.
Traceability needs to be maintained between an attribute rating and the objective evidence used in determining that rating.
Overview
This figure is reproduced from ISO 15504-2:2003 with the permission of ISO at www.iso.org. Copyright remains with ISO.
1 Initiation
2 Planning the assessment
3 Briefing
4 Data collection
5 Data validation
6 Process attributes rating
7 Reporting the results
Assessment Process Activities
Identify the sponsor and define the purpose of the assessment:
• Why is it being carried out?

Define the scope of the assessment:
• Which processes are being assessed?
• What constraints, if any, apply to the assessment?

Identify any additional information that needs to be gathered.

Select the assessment participants and the assessment team, and define the roles of team members.

Define assessment inputs and outputs:
• Have them approved by the sponsor.
1. Initiation
An assessment plan describing all activities performed in conducting the assessment:
• Is developed
• Is documented
• Contains an assessment schedule

Identify the project scope.

Secure the necessary resources to perform the assessment.

Determine the method of collating, reviewing, validating and documenting the information required for the assessment.

Co-ordinate assessment activities with the organisational unit being assessed.
2. Planning the Assessment
The assessment team leader ensures that the assessment team understands the assessment:
• Input
• Process
• Output

Brief the organisational unit on the performance of the assessment:
• PAM, assessment scope, scheduling, constraints, roles and responsibilities, resource requirements, etc.
3. Briefing
The assessor obtains (and documents) an understanding of the process(es) including process purpose, inputs, outputs and work products, sufficient to enable and support the assessment.
Data required for evaluating the processes within the scope of the assessment are collected in a systematic manner.
The strategy and techniques for the selection, collection and analysis of data and justification of the ratings are explicitly identified and demonstrable.
Each process identified in the assessment scope is assessed on the basis of objective evidence:
• The objective evidence gathered for each attribute of each process assessed must be sufficient to meet the assessment purpose and scope.

Objective evidence that supports the assessors’ judgement of process attribute ratings is recorded and maintained in the assessment record:
• This record provides evidence to substantiate the ratings and to verify compliance with the requirements.
4. Data Collection
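As an illustration of the traceability the assessment record must provide between each attribute rating and its supporting evidence, the following Python sketch models one record entry. All class and field names are hypothetical, not part of the programme materials:

```python
from dataclasses import dataclass, field

@dataclass
class EvidenceItem:
    """One piece of objective evidence (hypothetical structure)."""
    source: str        # e.g. interview, document review, system output
    description: str   # what was observed

@dataclass
class AttributeRating:
    """An attribute rating traceable to its objective evidence."""
    process_id: str    # e.g. a COBIT 5 process identifier
    attribute: str     # e.g. "PA 1.1"
    rating: str        # one of N / P / L / F
    evidence: list = field(default_factory=list)

# Example record entry: the rating carries its evidence with it,
# so the assessment record can substantiate the judgement later.
record = AttributeRating(
    process_id="Process A",
    attribute="PA 1.1",
    rating="L",
    evidence=[EvidenceItem("document review",
                           "Process outputs sampled and reviewed")],
)
print(record.rating, len(record.evidence))  # L 1
```

Keeping evidence attached to each rating, rather than in a separate log, makes the traceability required by the standard hard to lose.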
Actions are taken to ensure that the data are accurate and sufficiently cover the assessment scope, including:
• Seeking information from firsthand, independent sources
• Using past assessment results
• Holding feedback sessions to validate the information collected
Some data validation may occur as the data is being collected.
5. Data Validation
For each process assessed, a rating is assigned for each process attribute up to and including the highest capability level defined in the assessment scope.
The rating is based on data validated in the previous activity.

Traceability must be maintained between the objective evidence collected and the process attribute ratings assigned.

For each process attribute rated, the relationship between the indicators and the objective evidence is recorded.
6. Process Attribute Rating
The results of the assessment are analysed and presented in a report.

The report also covers any key issues raised during the assessment, such as:
• Observed areas of strength and weakness
• Findings of high risk, i.e., the magnitude of the gap between assessed capability and desired/required capability
7. Reporting the Results
[Figure: example target and assessed capability profiles for Processes A, B and C across capability Levels 1 to 3 (process attributes PA 1.1 to PA 3.2), with attribute ratings shown as L (Largely achieved) and F (Fully achieved).]
Target Process Capabilities (example)
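A target-versus-assessed comparison like the example above can be sketched as a small gap analysis; the process names and levels below are hypothetical:

```python
def capability_gaps(targets: dict, assessed: dict) -> dict:
    """Per-process gap between target and assessed capability level;
    0 means the process meets or exceeds its target (illustrative)."""
    return {p: max(0, t - assessed.get(p, 0)) for p, t in targets.items()}

# Hypothetical example data.
targets = {"Process A": 2, "Process B": 3, "Process C": 3}
assessed = {"Process A": 2, "Process B": 1, "Process C": 2}

for process, gap in sorted(capability_gaps(targets, assessed).items()):
    status = "meets target" if gap == 0 else f"gap of {gap} level(s)"
    print(f"{process}: target {targets[process]}, "
          f"assessed {assessed[process]} -> {status}")
```

The size of each gap feeds directly into the reporting activity: larger gaps against required capability indicate higher-risk findings.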
Consequence of Gaps at Various Capability Levels
Consequence of Capability Gaps
This figure is reproduced from ISO 15504-4:2006 with the permission of ISO at www.iso.org. Copyright remains with ISO.
Risk Associated With Each Capability Level
Capability Gaps and Risk
This figure is reproduced from ISO 15504-4:2006 with the permission of ISO at www.iso.org. Copyright remains with ISO.
COBIT process assessment roles:
• Lead assessor: a ‘competent’ assessor responsible for overseeing the assessment activities
• Assessor: an individual, developing assessor competencies, who performs the assessment activities

Assessor competencies:
• Knowledge, skills and experience:
• With the process reference model; the process assessment model, methods and tools; and rating processes
• With the processes/domains being assessed
• Personal attributes that contribute to effective performance
Assessor roles and competencies
Assessor training and certification opportunities
Accredited Training Organizations (ATOs) deliver the COBIT Assessor training course to candidates who have obtained the COBIT 5 Foundation certification.

ISACA has established a Certified COBIT Assessor certification to allow appropriately trained and experienced assessors to demonstrate their competence to assessment project sponsors: www.isaca.org/COBIT/Pages/COBIT-5-Certified-Assessor-Program.aspx.
COBIT Assessment Programme:
www.isaca.org/cobit-assessment-programme
Contact Information:
Goodbye and thank you . . .