Test Analysis of Data-Collection Software: A
Case Study
Lewis Sykalski
Background (cont.)
• Net Centric Warfare Data Collector
• Approximately 180KLOC
• Project testing covers ~50% of the software
• Written in Java; makes heavy use of JDBC and RMI from the J2EE package
• CMMI Level 1
• Testing covers ~80% of the interface
GLOBAL VISION NETWORK (GVN)
[Network diagram: GVN sites — Integrated Warfare Development Center (Fort Worth, TX), Light House (Suffolk, VA), LM – Mission Sys (Colorado Springs, CO), and LM – Sim & Training (Orlando, FL) — connecting the Data Collector (DC) with FUSION, CAOC, WCS, JSAF, JIMM, JTAC, JABE, VBMS, other simulators, and threat sims.]
Background
Result Summary
• Functional Checklist (Black Box)
– 14 interface defects
– Used both a DISCOM playback file & a hand-made test surrogate
– Test cases: list of 16 outgoing functions to check
– Creation of Test Tool: total 15 hours (Example Packet)
– Test & Verification: total 6 hours (using Oracle EMC)
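The hand-made test surrogate above can be sketched roughly as follows: build an example XML message packet for one outgoing function and confirm it parses, standing in for the DISCOM playback feed. The packet layout, field names, and the function name `sendEntityState` are all hypothetical illustrations; the slides do not give the real packet schema.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class XmlSurrogate {
    // Build a hand-made example packet for one outgoing function.
    // (Hypothetical layout; the real XML schema is not shown in the slides.)
    static String examplePacket(String function) {
        return "<packet><function>" + function + "</function>"
             + "<payload>test</payload></packet>";
    }

    public static void main(String[] args) throws Exception {
        String xml = examplePacket("sendEntityState"); // hypothetical function name
        // Parse the packet to confirm the surrogate emits well-formed XML.
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        System.out.println("root = " + doc.getDocumentElement().getTagName());
    }
}
```

In the real tool, a packet like this would be emitted for each of the 16 outgoing functions on the checklist and the collector's response verified against the database.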
• Boundary testing (Black Box)
– 2 interface defects
– Test cases:
• 1-D EPC Testing (under, min, max, over, interior)
• 5 cases per XML message packet type
– Test & Verification: total 3 hours
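The five 1-D EPC cases per field (under, min, max, over, interior) can be sketched as below. The field and its [MIN, MAX] range are hypothetical stand-ins; the slides do not give the actual bounds of the XML message packet fields.

```java
// Minimal sketch of 1-D equivalence-class partitioning (EPC) boundary tests.
public class BoundaryEpc {
    static final int MIN = 1;      // hypothetical lower bound of a packet field
    static final int MAX = 65535;  // hypothetical upper bound

    // Validator under test: accepts a value only inside [MIN, MAX].
    static boolean accept(int value) {
        return value >= MIN && value <= MAX;
    }

    public static void main(String[] args) {
        // Five cases per field: under, min, interior, max, over.
        int[] inputs   = { MIN - 1, MIN,  (MIN + MAX) / 2, MAX,  MAX + 1 };
        boolean[] want = { false,   true, true,            true, false };
        for (int i = 0; i < inputs.length; i++) {
            boolean got = accept(inputs[i]);
            System.out.println(inputs[i] + " -> " + got
                    + (got == want[i] ? " PASS" : " FAIL"));
        }
    }
}
```

With one such 5-case set per XML message packet type, the total case count stays small, which is consistent with the 3-hour test-and-verification effort reported above.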
Result Summary (cont.)
• Finite State Machine (White Box)
– 3 defects found (same problems were also found in functional checklist)
– Modeled & Tested receipt of PDU
– Test cases: Transitions thru five states using path coverage sensitization
– Analysis: total 2 hours
– Test & Verification: total 3 hours
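The FSM approach above can be sketched as an explicit state machine for PDU receipt, driven along sensitized paths. The five state names and events here are hypothetical stand-ins; the slides do not name the actual states of the PDU receive logic.

```java
import java.util.List;

public class PduFsm {
    enum State { IDLE, RECEIVING, VALIDATING, LOGGING, DONE }
    enum Event { PACKET_IN, COMPLETE, VALID, INVALID, WRITTEN }

    State state = State.IDLE;

    // Transition function: returns the next state; an event that is illegal
    // in the current state leaves the machine where it is.
    State step(Event e) {
        switch (state) {
            case IDLE:       if (e == Event.PACKET_IN) state = State.RECEIVING;  break;
            case RECEIVING:  if (e == Event.COMPLETE)  state = State.VALIDATING; break;
            case VALIDATING:
                if (e == Event.VALID)        state = State.LOGGING;
                else if (e == Event.INVALID) state = State.IDLE;
                break;
            case LOGGING:    if (e == Event.WRITTEN)   state = State.DONE;       break;
            case DONE:       break;
        }
        return state;
    }

    public static void main(String[] args) {
        // Path-coverage sensitization: one event sequence per distinct path.
        PduFsm ok = new PduFsm();
        for (Event e : List.of(Event.PACKET_IN, Event.COMPLETE, Event.VALID, Event.WRITTEN))
            ok.step(e);
        System.out.println("happy path -> " + ok.state);   // DONE

        PduFsm bad = new PduFsm();
        for (Event e : List.of(Event.PACKET_IN, Event.COMPLETE, Event.INVALID))
            bad.step(e);
        System.out.println("invalid PDU -> " + bad.state); // IDLE
    }
}
```

Each event sequence sensitizes one path through the model; comparing the model's end state against the implementation's behavior is what surfaced the code-logic defects.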
• Data Flow (White Box)
– 1 interface defect
– Start/Resume and Stop/Freeze PDUs
– Test cases:
• Sensitization through Data slicing
– Analysis: total 3 hours
– Test & Verification: total 2 hours
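The data-flow sensitization above can be sketched as choosing inputs that exercise each definition-use pair of the variable of interest. The `collecting` flag is a hypothetical model of the collector's reaction to the Start/Resume and Stop/Freeze PDUs named in the slides; the real collector internals are not shown.

```java
public class DataFlowSlice {
    private boolean collecting = false; // def #1: initialization
    private int recorded = 0;

    void onStartResume() { collecting = true;  } // def #2: Start/Resume PDU
    void onStopFreeze()  { collecting = false; } // def #3: Stop/Freeze PDU

    void onDataPdu() {
        if (collecting) recorded++;              // use: guards recording
    }

    public static void main(String[] args) {
        DataFlowSlice dc = new DataFlowSlice();
        dc.onDataPdu();       // exercises pair (def #1, use): packet dropped
        dc.onStartResume();
        dc.onDataPdu();       // exercises pair (def #2, use): packet recorded
        dc.onStopFreeze();
        dc.onDataPdu();       // exercises pair (def #3, use): packet dropped
        System.out.println("recorded = " + dc.recorded); // 1
    }
}
```

One test per def-use pair keeps the suite small, matching the modest 2-hour test-and-verification effort reported for this technique.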
Result Analysis and Follow-up
• Effort Comparison:
– Functional Checklist: since there was no documentation, this took significant effort, but it was not very complex and had the most test coverage
– Boundary Testing: insignificant effort after the test surrogate was built; moderate test coverage
– FSM/DFT testing: significant effort to develop the FSM and DDG models; very little coverage
• Defect Comparison:
– Functional Checklist: incorrect or incomplete implementations
– Boundary Testing: interface issues caused by violated boundary conditions
– FSM/DFT testing: code logic issues
• Follow-up actions:
– Secure funding for QA
– Fix defects found during testing
– Expand the XML Tester to send DIS packets
– Script to invoke regression testing
Conclusion / More Follow-up
• Too much scope was attempted…
• White-box testing (WBT) does not yield enough cost/benefit to be useful for our organization
• In my free time, as a hobby, I plan to apply UBST concepts to modify fault-bountiful websites to be even more bountiful! (Example)