PATFrame Workshop #2
Tool Concepts
Dan Ligett, Softstar Systems
[email protected]
(603) 672-0987
www.SoftstarSystems.com
Decision Support System
• Risk tradeoff tool (a la Madachy/Valerdi for SE)
• Testing process tool
• Management process tool
• Leading Indicators tool
• Cost/Schedule/Effort/Risk/Quality tradeoff tool
• Game, Training tool, Simulation
• Risk Identification
• Risk Mitigation
• Tell testers when they're done
• Real Options
• Give advice on split to Live/Virtual/Constructive
• It’s a tool to help you win arguments
Requirements Elicitation
(~) Mix & Match -- Technology vs. Question

Questions: Done? Invest? Improve? Healthy? Trade? Risk? Adapt?

• Defect Model – Tool 1
• Real Options – Tool 2
• System Dynamics – Tool 3
• Leading Indicators – Tool 4
• Parametric Model – Tool 5
• Risk Model – Tool 6
• Architecture Modeling – Tool N
PATFrame Dimensions

Can imagine thousands of different tools. For example, Real Options can be applied to the SUT or at the Test Infrastructure level.

1) The question to answer
2) The technology to apply
3) SUT or Infrastructure
4) OT or DT
5) Now vs. 15/25 years from now
6) Army/Navy/AF
7) Program of Record vs. Rapid Acquisition
8) Space/Air/Sea/Land/Undersea
9) Degree of autonomy
10) Degree of netcentricity
11) Manned vs. unmanned
12) Complexity of SUT
13) Tester vs. Evaluator

I don't claim that these dimensions are perfectly orthogonal.
Need Feedback!
• Which of these might help you?
• Which are too hard?
• Which make no sense for you?
• What have we missed?

DSS will be driven by MIT / UTA / USC research AND feedback we get on these tool concepts.
Tool Concept #1
• Question:
– When am I Done testing?
• Technology:
– Defect estimation model
– Trade Quality for Delivery Schedule
• Inputs:
– Defects discovered
• Outputs:
– Defects remaining, cost to quit, cost to continue
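As a rough illustration of the defect-estimation idea behind Tool 1, here is a minimal sketch. It assumes weekly defect-discovery counts decay roughly geometrically and projects the tail of that series as "defects remaining"; the function names and the stopping threshold are invented for illustration, not part of PATFrame.

```python
def remaining_defects(weekly_found):
    """Estimate latent defects from a decaying discovery trend.

    Assumes discoveries decay geometrically week over week; the tail
    of that geometric series approximates the defects not yet found.
    """
    if len(weekly_found) < 2:
        raise ValueError("need at least two weeks of data")
    # Average week-over-week decay ratio of the discovery rate.
    ratios = [b / a for a, b in zip(weekly_found, weekly_found[1:]) if a > 0]
    r = sum(ratios) / len(ratios)
    if r >= 1.0:
        return float("inf")  # discovery rate not yet declining
    # Sum of the geometric tail: last*r + last*r^2 + ...
    return weekly_found[-1] * r / (1.0 - r)

def done_testing(weekly_found, threshold=5.0):
    """'Done' when projected latent defects drop below a chosen threshold."""
    return remaining_defects(weekly_found) < threshold
```

The cost-to-quit vs. cost-to-continue output would then price those remaining defects against the cost of further test cycles.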
Tool 1, Data Input
Tool 1, Trends
Tool 1, Analysis
Tool Concept #2
• Question(s):
– Should I invest $$$ in infrastructure?
– (What should I test?)
• Technology:
– Real Options
• Inputs:
– Cost of each investment, decisions & risks
• Outputs:
– Value of the investment, value of flexibility
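A minimal sketch of the Real Options calculation Tool 2 might perform, under a one-period binomial assumption (an assumption made here for illustration; the slide does not specify the model): compare committing to the infrastructure investment now against waiting and investing only in the favorable state. The difference is the value of flexibility.

```python
def deferral_option_value(cost, v_up, v_down, p_up, discount=1.0):
    """One-period real-options view of an infrastructure investment.

    cost: up-front investment; v_up/v_down: payoff in the good/bad
    state; p_up: probability of the good state.
    Returns (value with option to wait, value of flexibility).
    """
    # Committing now: pay the cost, take the expected payoff.
    commit_now = discount * (p_up * v_up + (1 - p_up) * v_down) - cost
    # Waiting: invest only in states where the payoff beats the cost.
    wait = discount * (p_up * max(v_up - cost, 0.0)
                       + (1 - p_up) * max(v_down - cost, 0.0))
    return wait, wait - max(commit_now, 0.0)
```

For example, a $100 test range that is worth $150 if the program scales up but only $60 if it is cut has positive expected value either way, but most of that value comes from being able to defer the decision.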
Tool 2, Data Input
Tool 2, Data Input
Tool 2, Analysis
Tool Concept #3
• Question:
– Will XXX improve my testing process?
• Technology:
– System Dynamics simulation
• Inputs:
– Description of process
• Outputs:
– Prediction of progress/quality/testing rate
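The System Dynamics idea can be sketched as a tiny stock-and-flow model integrated with Euler steps. Everything here (two stocks, a diminishing-returns find rate) is a toy assumption chosen for illustration, not the actual PATFrame simulation.

```python
def simulate_testing(total_defects, testers, find_rate, weeks, dt=1.0):
    """Euler-integrated stock-and-flow sketch.

    An 'undiscovered defects' stock drains into a 'found' stock at a
    rate proportional to tester effort and to the fraction of defects
    remaining (so progress slows as the easy defects are exhausted).
    Returns cumulative defects found at each step.
    """
    undiscovered = float(total_defects)
    found = 0.0
    history = []
    for _ in range(int(weeks / dt)):
        flow = find_rate * testers * (undiscovered / total_defects)
        flow = min(flow * dt, undiscovered)  # cannot find more than remain
        undiscovered -= flow
        found += flow
        history.append(found)
    return history
```

A "will XXX improve my process?" question becomes: change a parameter (staffing, find rate), rerun, and compare the predicted curves.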
Tool Concept #4
• Question:
– Is my project healthy?
• Technology:
– Leading Indicators
• Inputs:
– Description of progress (available early)
• Outputs:
– Comparison to completed projects
– Odds of success
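One hedged way to turn a leading indicator into "odds of success": look up completed projects whose early indicator was similar and report their success rate. The peer-matching tolerance and the data shape below are invented for illustration.

```python
def odds_of_success(current, history, tol=0.1):
    """Compare a project's early indicator to completed projects.

    history: list of (early_metric, succeeded) pairs from finished
    projects. Returns the success rate among past projects whose
    early indicator was within `tol` of the current value, or None
    when there is no comparable history.
    """
    peers = [ok for metric, ok in history if abs(metric - current) <= tol]
    if not peers:
        return None  # no basis for comparison
    return sum(peers) / len(peers)
```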
Tool 4, Trends
Tool 4, Analysis
Tool Concept #5
• Question:
– When will I finish testing?
– Can I trade X for Y?
• Technology:
– Parametric estimation model
• Inputs:
– Size drivers, cost drivers
• Outputs:
– Cost, Effort, Duration, odds of success
Tool 5, Size Input
Tool 5, Cost Driver Input
Tool 5, Output per Phase
Tool 5, Output per Activity
Tool 5, Effort PDF
Tool 5, Effort CDF
Tool 5, Duration PDF
Tool 5, Duration CDF
Tool Concept #6
• Question:
– What can I do to mitigate risks?
• Technology:
– Risk trade model (a la Madachy & Valerdi)
• Inputs:
– Descriptions of system under test
• Outputs:
– Risks prioritized
– Mitigation strategies
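The prioritization step of a risk trade model can be sketched as exposure = probability × impact, sorted highest first; the data shape here is an assumption for illustration, not the Madachy & Valerdi formulation itself.

```python
def prioritize_risks(risks):
    """Rank risks by exposure.

    risks: list of (name, probability, impact) tuples, probability in
    [0, 1] and impact on any consistent scale.
    Returns the list sorted by exposure = probability * impact,
    highest exposure first, so mitigation effort goes to the top items.
    """
    return sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
```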
Tool 6, Data Input
Madachy & Valerdi, 24th COCOMO Forum, 11/4/2009
Tool 6, Prioritized Risks
Tool 6, Mitigation Guidance
Tool Concept #N
• Question:
– If the test instrumentation software is reconfigured during the adaptation, will X run-time requirement be met?
– Can the test system automatically adapt in appropriate ways to changing data rates from external C4ISR systems?
– Are the probes in the test system sufficient to detect and evaluate autonomous behavior X in the system-under-test?
• Technology:
– Architectural modeling
• Inputs:
– Models of SUT & infrastructure
• Outputs:
– Simulation of testing architecture
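The data-rate question above can be caricatured as a buffer simulation: feed a probe buffer a time-varying input rate against a fixed drain rate and ask whether it ever overflows. A real architecture model would capture queues, reconfiguration latency, and probe placement; this toy only shows the shape of the analysis.

```python
def buffer_overflows(input_rates, drain_rate, buffer_size):
    """Step a probe buffer through a sequence of input rates.

    input_rates: data arriving per time step (e.g., from external
    C4ISR systems); drain_rate: what instrumentation can process per
    step; buffer_size: capacity before data is lost.
    Returns True if the buffer ever exceeds its capacity.
    """
    level = 0.0
    for rate in input_rates:
        level = max(0.0, level + rate - drain_rate)  # net fill, floor at 0
        if level > buffer_size:
            return True
    return False
```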
Remember, each of these Generalizes…
• The “Question” is just an example
• Could look at micro or macro questions
• Trade-off tool
– Schedule/Cost/Quality/Risk/Training/etc.
• System Dynamics tool
– Schedule/Defect Rates/Staffing Levels
• Prediction tool…
Backup Slides
Other Candidate Questions…
• My access to resource X ends next quarter. What work depends on that access?
• What testing can I perform in parallel?
• I can't analyze the test data in time to plan the next day's tests. What are my alternatives? Can I reschedule work? Can I take short cuts? Can I use heuristics to guide me? Which ones?
• Are Rev 12 of System X and Rev 314 of System Y compatible?
• System X has changed. What do I need to retest?
• Can I test Rev 12 of System X and Rev 314 of System Y simultaneously? How many of the tests are valid in this environment?
• System X and System Y can't currently operate within 2 miles of each other because of radio interference. What useful testing can I perform under that constraint? What remedies have worked in the past? Historically, how much has this delayed a comparable program?
Other Candidate Questions II…
• After testing System X, we must wait 24 hours for the nerve gas to dissipate. What useful testing can we perform during those 24 hours?
• Was Test #1234 performed with the prototype of System X, or the production version?
• Are the results of Run #123 substantially the same as Run #122?
• Test #1234 is failing. Trace that back, and tell me which requirements of which systems are not being met.
• I have more than the usual number of contractors/systems. How will that impact risk/schedule? What can I do to mitigate risk/reduce schedule?
• What are the odds that the SoS will work in the field?
• Has anyone ever faced this situation before?
• What's worked before? What's failed before? Why?
Other Candidate Questions III…
• Is the test plan/case good enough?
• Is my test plan/case in compliance with the standards?
• Do I have the right people/tools/methodology?
• Where should I put my resources?
• Am I in compliance with XXX std?
• How many test cases should I have?
• What part of the program should I focus on next?
• Have I reached the point of diminishing returns in my testing/planning?
• How can I lower the risk?
Other Candidate Questions IV…
• I need to prioritize my testing. What are some safe & easy tests I can start with, to get my team warmed up?
• I need to prioritize my testing. What tests should I perform to ensure that the SoS can meet its fundamental objectives?
• I need to prioritize my testing. I want to do the riskiest parts first -- where should I start?
• System X is not ready to be tested. What useful work can I do now without it?
• Things aren't going well. Can your system dynamics model help me understand the problem? What if we tweak the process by doing X?
• Your system dynamics model indicates that we can meet the schedule if we hire 10,000 people to analyze the test results. Can we try to model other solutions?
• Everything has been going as planned. But, now they need systems X, Y, and Z in Iraq right now, as is. What capabilities have been tested?
Other Candidate Questions V…
• Remember that SoS we tested last year? Well, now we need to certify it with the F-99 rather than the old F-35. Do we need to do anything more than edit the Word docs?
• What test should I do next?
• What part of my plan deviates the most from the normative framework?
• My program is in progress. How does it compare to the norm?
• Is this program/project/plan more risky than the norm?
• What are the odds of success?
• Do we have the right people?
• What is the most similar program performed in the last N years?
• How long did a similar project take?
Tool 3, Causal Loop Model
http://en.wikipedia.org/wiki/Systems_dynamics
Tool 3, Stock & Flow Model
Tool 3, Simulation Results
Tool 6, Cost CDF
Tool 6, Cost PDF
Requirements (Early!)
• V1 Costar manual vs. V7 Costar manual
• Eliciting, not soliciting
• Platform?
• Some of these are poor ideas?
• Some of these outside PATFrame scope?
• Some of these may be hard to build?
• But, need to start a discussion...
• …are any of these ideas close?