End of Semester Presentation
Common Test Case Project
Agenda
- Introduction
- Project Plan
- Common Test Case Identification – Updates
- Test Case Comparison System – Updates
- Risks
- Reflections
- Plan for the Fall Semester
Introduction
Team Members: Dilip Narayanan, Gaurav Jalan, Nithya Janarthanan
Client: Jason Weighley, Rail Control Systems, Bombardier Transportation
Mentors: Eduardo Miranda, Associate Teaching Professor, ISR; Vijay Sai, Software Engineering Institute
External Consultants: Prof. Anthony Tomasic, Frederick Pfisterer
[Diagram: the FAT Testing Division with its FAT Test Document, and the V&V Testing Division with its V&V Test Case Document.]
"I see that there are a lot of test cases common to FAT and V&V. If only I could identify the common test cases!"
The Problem
Are there any test cases similar to the one I am going to write now?
Match found: I don't have to write a new test case.
Match not found: I have to write a new test case.
Identifying matches would reduce 20-30% of my costs (for both FAT and V&V).
Project Overview
Customer Goal: to help Bombardier's testing divisions identify at least 20% of the test cases common to their test groups.
Academic Goal: to formalize the BT testing domain and organize its information in a more searchable and retrievable manner.
Project Plan
Dilip – can you update the Visio diagram you made last time to reflect the latest milestone plan? Show the milestone plan and/or the milestones along a timeline (like the one Nithya made for the last mentor meeting).
Nithya – can you put in your timeline plan, taking into account the updated macro plan?
Say why there are two tracks – why are the usual software processes not applicable to our project?
Why two plans?
Customer Track
- Identify the common test cases between the two testing groups before October.
- The customer suspects a commonality of at least 20%.
- Solves the problem at hand, but is not repeatable.
Academic Track
- Provide a repeatable solution for comparing the test cases between the two testing groups.
- We should identify the common test cases by manual comparison to acquire domain knowledge, but we can stop identifying once we are comfortable with the domain.
Status – CTCI (assigned to Gaurav)
- Graphs from the mentor meeting (no need for the individual test-cases-per-week graph; the cumulative one should suffice).
- By-products of CTCI (ask Eduardo how to phrase this in the MPP, because findings like the absence of test cases for VA and VA integral mode in V&V are pretty serious).
- Domain learning (?) – or we could elaborate on what we have done toward each milestone.
Common Test Case Identification
[Chart: cumulative number of test cases examined per week. Actual series: 4, 8, 16, 28, 33, 40, 73, then flat at 112. Constant-rate baseline: 19 per week (19, 38, 57, ..., 285). X-axis: week; Y-axis: # of test cases (0-300).]
Common Test Case Identification – Updates
[Pie chart, "% of success threshold achieved": Approved Common, 27 (49%); Need approval for success, 27.8 (51%).]
Overall Status
[Pie chart: Examined, 112 (41%); Remaining, 162 (59%).]
[Bar chart: # cases examined per week, 08/Jun through 27/Jul (Y-axis 0-120); series: Examined, Actual Cumulative.]
Test Case Comparison System – Updates
Technical Research – identification of suitable approaches:
- Literature review
- Mentor meetings
- Meetings with subject matter experts
Areas of Research:
- Natural Language Processing
- Text processing systems
- Vector space models
- Ontology
- Hybrid approach
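The "vector space models" item above can be made concrete with a small sketch: represent each test case description as a bag-of-words vector and score pairs by cosine similarity, flagging high-scoring pairs as candidate common test cases. This is a minimal, hypothetical illustration (the sample test case texts are invented), not the project's actual comparison system.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase the text and split it into alphanumeric tokens."""
    return re.findall(r"[a-z0-9]+", text.lower())

def cosine_similarity(a, b):
    """Cosine similarity between the bag-of-words vectors of two texts."""
    va, vb = Counter(tokenize(a)), Counter(tokenize(b))
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

# Invented example texts: a FAT case, a reworded V&V case, and an
# unrelated case. Pairs scoring above a chosen threshold would be
# flagged as candidate common test cases.
fat_case = "Verify brake command is issued when speed exceeds limit"
vnv_case = "Verify that a brake command is issued when the speed limit is exceeded"
unrelated = "Check door open indicator on operator console"

print(cosine_similarity(fat_case, vnv_case)
      > cosine_similarity(fat_case, unrelated))  # prints True
```

A production version would likely add stop-word removal and TF-IDF weighting, but the ranking idea is the same.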
Context-Based Approach Evaluation
- Identify the approach usage context and the evaluation goals
- Plan the evaluation: form the evaluation team, identify stakeholders, select an approach, estimate effort and schedule
- Develop model problems: develop a hypothesis, develop criteria, design a model solution, implement and evaluate the solution against the criteria
- Analyze the results against the approach usage context

[Flow diagram: Identify technology usage context and evaluation goals → Plan the evaluation → Develop model problems → Analyze model problem results against the technology usage context → either "Technology is a good fit" or "Technology is not a good fit"; if it is not a good fit, modify the technology or context and repeat.]
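The evaluation loop described above (try the approach on model problems, modify it if it misses the criteria, repeat or declare it unfit) can be sketched as code. This is a toy illustration with invented names and data, not part of the actual evaluation process:

```python
def run_evaluation(approach, model_problems, passes, modify, max_rounds=3):
    """Return True ("technology is a good fit") if the approach
    eventually satisfies every model problem, modifying the approach
    between rounds; return False ("not a good fit") otherwise."""
    for _ in range(max_rounds):
        if all(passes(approach, p) for p in model_problems):
            return True                 # good fit: stop evaluating
        approach = modify(approach)     # modify technology or context
    return False                        # not a good fit within budget

# Toy usage: the "approach" is a similarity threshold (integer, for
# exactness); a model problem passes when the threshold is at most
# its required bound; "modify" lowers the threshold by one.
problems = [7, 6]
passes = lambda threshold, bound: threshold <= bound
modify = lambda threshold: threshold - 1
print(run_evaluation(8, problems, passes, modify))  # prints True
```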
Plan for the Fall Semester
- Completion of common test case identification
- Verification and validation of the selected approach to provide the solution for identifying common test cases
Top 3 Risks
1. Customer unavailability. Mitigation: schedule meetings with the customer a week in advance; negotiate the time period of the customer meeting.
2. Research and learning activity not resulting in concrete results. Mitigation: a separate learning plan is created, exclusively for the learning activities.
3. Not finding any approach to formalize the test cases. Mitigation: meet with domain specialists from the client's side to get more insight into the testing process, and schedule meetings with technology experts.
Meeting Process
- Agenda: all agenda items are time-boxed; buffer time = 15% of total duration
- Ideas list
- To-do list
- Notes/decisions/etc.
- The scribe uses a shared Google doc and projects it
Reflections
- Customer communication: lack of clarity of project scope
- Roles and responsibilities: lack of a proper process for each activity
- Ineffective meetings: lack of a plan; activity-based planning
Lessons Learnt
- Deadlines for the tasks: send across a deliverable on its deadline irrespective of its completion status
- Well-defined roles and responsibilities are important
- Importance of having a macro plan
Questions for the mentors, etc.
Do we have anything for the mentors?
Questions for us?
Backup slides
How we are tracking
Gaurav thinks this slide can be removed, or at least put in the backup slides. For CTCI, I think we don't need to say how we track because it's too trivial; we can say something on how we are tracking the test case comparison system.
Processes (we should delete this)
Dilip and Gaurav agreed to remove this slide; we need Nithya's opinion. A brief and generic overview of some of our processes – I don't think this is required, as we don't really have any processes other than CTCI and the meeting process. The CTCI process we should put in the backup slides.
CTCI Process
I'm not sure if this is required even for backup. Explain the manual comparison diagrammatically, trying to de-emphasize the manual aspect of this process.
Backup slides
SLRC – System Level Requirements Catalog
Past projects were Word documents; newer projects are moving to DOORS/SLATE.
Development Phase
- Design/Development Groups: a) perform manual tests unique to the Development group
- Safety Group: a) write automatic scripts unique to the Safety Group
- System/Factory Test Groups: a) write a test plan unique to the Test Group; b) test scripts describe manual tests

Test Phase
- Design/Development Groups: a) perform manual regression tests unique to the Development group
- Safety Group: a) run scripts unique to the Safety Group; b) perform manual tests
- System/Factory Test Groups: a) perform manual tests unique to the Test Group
Project Overview – Current Scenario
- Duplication of effort: three different test methods are independently developed
- No clear method to determine whether the software is ready for release to V&V and factory tests
- Inconsistency across groups
- Manual testing yields slower turn-around times
- Low confidence that all requirements were implemented and tested
- Test scripts/plans are not continuously updated as the software is debugged
- Manual and incomplete mapping of requirements from the SLRC to test scripts