IVT Computer and Software Validation
Dublin, 27-29 September 2010
Optimize CSV by Selecting Qualified Vendors
Claus le Fevre, Consulting Director, NNIT A/S
Slide 1
Agenda
1. Introduction
2. Approach to vendor selection
3. Risk based validation
4. Conclusion
Slide 2
NNIT - In brief
NNIT is one of the four largest providers of IT services in Denmark
Focus areas: IT consultancy, development, implementation and operations for regulated industries
Approx. 1,450 employees
Turnover in 2009 - EUR 213 million
Head office in Lyngby, Denmark - offices in five countries including China and the Philippines
Customers throughout Europe
Subsidiary of Novo Nordisk A/S
Slide 3
NNIT Life Sciences Organisation
[Organisation chart: focus on the international Life Sciences industry and critical business processes. Corporate functions: CVP, Prod. mgmt., Administration. Units: Life Sciences Consulting, Life Sciences Projects, Life Sciences AMS, Life Sciences Service Delivery, Life Sciences International. Service Areas (Offerings): Clinical, Pharmacovigilance, Compliance Management, Production. Lifecycle: Strategy, Design/Transition, Operations/Continual Improvement.]
Slide 4
The Right Quality Level…
                           Common sense: No   Common sense: Yes
Standards & methods: Yes   Bureaucracy        Quality
Standards & methods: No    Chaos              Creative Chaos
Slide 5
Agenda
1. Introduction
2. Approach to vendor selection
3. Risk based validation
4. Conclusion
Slide 6
Computer validation model
[V-model diagram, plotted over time across the phases Inception, Elaboration/Construction and Transition, with vendor management running throughout, a validation plan at the start and a validation report at the end:
User requirements specification <-> Performance qualification
System specification - functional part <-> Operational qualification
Detailed specification / System specification - technical part <-> Installation qualification
Coding, with unit test and code review, at the bottom of the V]
Slide 7
Computer validation model
[Simplified V-model diagram: same phases (Inception, Elaboration/Construction, Transition) and vendor management, validation plan and validation report; user requirements specification verified through installation qualification and performance qualification]
Slide 8
Selection of vendor
Identification of potential vendors
Business application
Prior experience
Indication of capabilities
Potential vendors
Vendors chosen for proof-of-concept
Functionality of application
Capabilities and competencies
Knowledge of business processes
Technical competencies
Proof-of-concept
Validation Management
Test Management
Selection
The selection of vendor
The formal vendor audit
Slide 9
Proof-of-concept workshop – an example
Demonstration and discussion of general validation approach
content of and interfaces between processes (workflows) used for validation activities (for example Computer system validation SOP, Test Management SOP and handling of non-conformities SOP)
linkages to project management workflows
Demonstration and discussion of validation package & services
content of and walkthrough of package
the full portfolio of documents (e.g. plans, protocols, reports, test cases & CILs)
content of validation services (e.g. activities to facilitate the planning and execution of validation)
Presentation and discussion of general development approach
content of processes for development of user requirements, functional and technical design requirements
how design reviews and code reviews are performed and documented
Slide 10
Proof-of-concept workshop – an example
Demonstration and discussion of configuration management approach
content of configuration processes (workflows) used in connection with development and validation activities
How are the processes used for handling of
configuration item lists (CILs)
source code
documents in general
Presentation and discussion of release management activities
content of processes (workflows) used for releasing of upgrades and patches
content of deployment activities
Slide 11
Proof-of-concept workshop – an example
Demonstration of tool for automatic OQ test generation and test execution
structure and functionality of the tool
validation of the tool and relevant documentation
maturity of the tool (e.g. extent it has been used in projects)
Presentation of activities performed to prepare for the automated OQ test execution
how the OQ test cases (TCs) are built and what guidelines are used
what kind of reviewing and approval activities are performed on the OQ TCs
documentation for quality assuring activities
Demonstration of activities performed to execute the automated OQ test
presentation of a live run of a section of the OQ test cases (or equivalent)
what activities are performed and what documentation is produced by the test
how to handle stopping and resuming of test runs; logging of data
Slide 12
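As an aside on the stop/resume and logging point above: the mechanics can be sketched in a few lines of Python. This is an illustrative sketch only, not NNIT's actual tool; the function name, log file name and result format are invented for the example.

```python
import json
from pathlib import Path

def run_oq_tests(test_cases, log_path="oq_results.json"):
    """Execute OQ test cases in order, persisting each result so an
    interrupted run can be resumed without repeating passed tests.

    test_cases: list of (test-case id, callable) pairs; a test case
    signals a defect by raising AssertionError.
    """
    log = Path(log_path)
    # Resume: reload any results logged by a previous (interrupted) run
    results = json.loads(log.read_text()) if log.exists() else {}
    for tc_id, tc_func in test_cases:
        if results.get(tc_id) == "PASS":
            continue  # already passed in a previous run; skip on resume
        try:
            tc_func()
            results[tc_id] = "PASS"
        except AssertionError as exc:
            results[tc_id] = f"FAIL: {exc}"  # defect logged for re-run
        # Persist after every test case so a stop loses at most one result
        log.write_text(json.dumps(results, indent=2))
    return results
```

Re-running the same suite after fixing a defect skips the already-passed test cases and re-executes only the failed ones, which mirrors the re-run handling discussed on the next slide.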
Proof-of-concept workshop – an example
Presentation and discussion of activities performed after execution of the automated OQ test
handling of defects and re-runs including what guidelines are used
what kind of reviewing and approval activities are performed on the results of executed OQ TCs
documentation of executed OQ TCs
OQ test coverage
structure and focus of the OQ test cases
predefined workflows
requirements for adding additional OQ TCs
impact upon predefined and executed OQ TCs
What are the known challenges of using the tool in projects?
Slide 13
Value of the proof-of-concept?
Significant knowledge of vendor strengths and weaknesses
What are the risks
Where do we need to pay increased attention
To what extent can we reuse vendor validation deliverables
Input to which quality assuring activities to implement
Vendor better able to understand our requirements
How do we approach validation
Which requirements do we have regarding validation deliverables
Increased openness between the parties
Strengthened personal business relationship
Input for special focus areas in the formal audit
Slide 14
A friendly business visit…
Discussion of competence development activities
How are competence development, training, resource planning and resource prioritization performed?
How are the future competencies identified, planned for and monitored?
Demonstration of development activities
What development model(s) are used, what roles are performed, what are the activities, how are they planned, executed and monitored?
Walkthrough of a representative release project from initial planning to closure
Demonstration of handling of application support
what roles are performed
scope of technical and functional support
structure and operation of Helpdesk (what are the activities, how are they planned, executed and monitored)
tracking of a number of support cases from registration to closure in connection with present customers
Slide 15
How to select the vendor
Slide 16
The formal vendor audit
Use knowledge from proof-of-concept activities
Focus on evidence of an efficient QMS
Select carefully what to scrutinize
Focus on critical SW development and maintenance aspects
Project management
Software development practices
Release and configuration management
Ongoing maintenance and support
Can you still minimize your validation when upgrading to the next releases?
Get behind the surface – talk to employees
Slide 17
The formal vendor audit
Look for consistency and accuracy of specifications and test evidence
Use an auditor specialised in IT audit
Accept tools, e.g. automated test
Look for qualification of tools
Accept terminology other than IQ, OQ, PQ
As long as content complies with requirements
Emphasize in audit report and validation plan whether vendor specification and test are adequate
Supports that you only need to specify and test your own configuration.
Use vendor documentation by referencing it in your own validation documents.
Slide 18
Agenda
1. Introduction
2. Approach to vendor selection
3. Risk based validation
4. Conclusion
Slide 19
Risk based validation
Risk based approach
Risk based approach starts by identifying risks associated with the selected vendor
Risk based approach includes identifying impact on aspects of regulatory concern
Base risk analysis on GAMP 5 categories
Different levels of risk analysis dependent on GAMP category
Different levels of GxP criticality in the same system
Different levels of risk analysis during the project phases
Initial phase: Identify GxP modules and functions
Design phase: Identify GxP critical records and controls
Test phase: Identify different levels of testing
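Scaling the validation rigour by GAMP 5 category and GxP criticality amounts to a simple lookup. The category descriptions below paraphrase the GAMP 5 software categories; the mapping and function are an invented illustration, not a rule prescribed by GAMP 5 or by this presentation.

```python
# Illustrative mapping from GAMP 5 software category to validation depth.
# Category 2 was retired in GAMP 5, so only 1, 3, 4 and 5 appear here.
GAMP_RIGOUR = {
    1: "infrastructure software: record version, rely on supplier testing",
    3: "non-configured product: risk-based functional testing",
    4: "configured product: test the configuration against requirements",
    5: "custom application: full lifecycle validation incl. design and code review",
}

def validation_rigour(category: int, gxp_critical: bool) -> str:
    """Return the validation depth for a system component.

    Non-GxP-critical components get a reduced, documented-rationale
    scope, reflecting different GxP criticality levels in one system.
    """
    base = GAMP_RIGOUR[category]
    return base if gxp_critical else f"reduced scope (non-GxP): {base}"
```

In practice this lookup would be one input among several (vendor risks, regulatory impact), but it captures the slide's point that a category-5 custom module and a category-3 standard module in the same system warrant different levels of risk analysis.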
Slide 20
Validation strategy – SW categories
(GAMP 5 guidelines, ISPE)
Slide 21
Risk assessment – small system example
Risk assessment part of URS - appendix
Slide 22
Defining QA activities for vendor deliverables (for category 5) – an example
Vendor deliverables
Configuration Item List
Listing of versioned and uniquely named software modules
Development Review Statement
IQ test cases
OQ test cases (automated scripts)
PQ test cases
Traceability matrix (to FRS)
Software Test Statement
Installation guide
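One QA activity on the traceability matrix (to FRS) deliverable lends itself to automation: checking that every FRS requirement is covered by at least one test case. The sketch below assumes the matrix is a mapping from test-case IDs to the requirement IDs they verify; all IDs are hypothetical examples, not vendor artefacts from the presentation.

```python
def check_traceability(frs_requirements, trace_matrix):
    """Return FRS requirement IDs not covered by any test case.

    trace_matrix maps test-case IDs to the list of requirement IDs
    that each test case verifies.
    """
    covered = {req for reqs in trace_matrix.values() for req in reqs}
    return sorted(set(frs_requirements) - covered)

# Example: FRS-3 has no test case tracing to it
gaps = check_traceability(
    ["FRS-1", "FRS-2", "FRS-3"],
    {"OQ-TC-01": ["FRS-1"], "OQ-TC-02": ["FRS-1", "FRS-2"]},
)
# gaps == ["FRS-3"]
```

An empty result supports the acceptance decision; a non-empty result feeds the defect walk-through meeting described on the next slide.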
Slide 23
Acceptance criteria for vendor deliverables (category 5) – an example
Design walk-through meeting
Vendor receives FRS
Formal walk-through meeting where vendor presents design and test approach
Conclusion whether design can start
Defect walk-through meeting
Review of vendor deliverables
Defect-logging
Evaluation of fulfilment of acceptance criteria
Input for formal design review
Reuse of test cases in validation
Slide 24
Agenda
1. Introduction
2. Approach to vendor selection
3. Risk based validation
4. Conclusion
Slide 25
Conclusion
Evaluate the vendor capability
Screening
Establish knowledge of vendor strengths and weaknesses by evaluation of
capabilities and competencies
documentation
results
Face-to-face interaction
proof-of-concept sessions
business visit
Structured selection of the vendor
Formal audit
Focus especially on areas for special scrutiny
Risk based approach
Which parts will the vendor be involved in?
Which QA activities will we implement to be able to benefit from vendor capabilities?
Slide 26
Thank you!
Slide 27
Contact data
Claus le Fevre
Consulting Director, Life Sciences Consulting
NNIT A/S
Buddingevej 197
DK-2860 Søborg
+45 3075 3003 (mobile)
Slide 28