Slide 1
A Reference Model for Requirements and Specifications
NOTES
Slide 2
Software Artifacts in the Real World
Example from the ESA software engineering standard:
• User Requirements Specification.
• Software Requirements Specification.
Example from Lucent:
• Customer Specification Document.
• Feature Specification Document.
Common scenario: just the code.
Slide 3
Reference Models
Provide an idealized model of the architecture: common concepts, standards, and a reference implementation.
Accommodate both variation and precision.
Successful example: the ISO 7-layer reference model for networks. Benefits include:
• Improved communication.
• Guide for good designs.
• Vehicle for analysis of tradeoffs.
Slide 4
Key Concepts for a Reference Model for Software Artifacts
Kinds of artifacts:
• Informal vs. rigorous vs. formal.
• Verbal vs. document vs. code.
Environment and System.
Visibility and Control.
Refinement Obligations.
Slide 5
Reference Model Artifacts
[Diagram: the five artifacts W, R, S, P, M span from the Environment to the System.]
Domain ("World") knowledge W provides presumed facts about the environment.
Slide 6
Reference Model Artifacts
Requirements R indicate what the customer needs from the system, described in terms of its effect on the environment.
Slide 7
Reference Model Artifacts
Specification S provides enough information for a programmer to build a system that satisfies the requirements.
Slide 8
Reference Model Artifacts
Program P implements the specification on the given programming platform.
Slide 9
Reference Model Artifacts
Programming platform ("Machine") M provides a basis for programming a machine to satisfy the specification.
Slide 10
Reference Model Artifacts
Given (indicative): the domain knowledge W and the machine M describe how things are.
Slide 11
Reference Model Artifacts
"Wished for" (optative): the requirements R describe how we want things to be.
Slide 12
Reference Model Artifacts
Common-language refinement: the specification S sits at the boundary between the Environment and the System.
Slide 13
Designations
Four classes of variables span the Environment–System boundary:
eh – controlled by the environment, hidden from the system.
ev – controlled by the environment, visible to the system.
sv – controlled by the system, visible to the environment.
sh – controlled by the system, hidden from the environment.
Slide 14
Example: Patient Monitoring System
Requirement: a system to warn a nurse if a patient’s heart stops beating.
Slide 15
Artifacts
Domain knowledge W:
• A nurse is always at the nurse’s station and can hear a bell.
• If the patient’s heart has stopped, then a sensor on the patient’s chest ceases detecting the sound of a heartbeat.
Programming platform M:
• Sound sensor.
• Bell.
• Computer that can activate the bell based on information from the sensor.
Slide 16
Designations
eh – The nurse and the heart of the patient.
ev – The sound of the heartbeat.
sv – The ringing bell.
sh – Internal variables used in calculations.
All sets distinct.
Let e = eh ∪ ev and s = sh ∪ sv.
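These designations can be made concrete in a small executable sketch. The Python model below is illustrative only — the boolean predicates for W, R, and the program are assumptions, not taken from the slides. It enumerates every assignment to the four designations and confirms that the program on the machine satisfies the requirement whenever the domain knowledge holds:

```python
from itertools import product

# Illustrative boolean encoding of the patient-monitoring designations
# (the predicate bodies below are assumptions, not from the slides):
#   eh: heart_beats   ev: sound   sv: bell   sh: flag
VARS = ["heart_beats", "sound", "bell", "flag"]

def W(b):   # domain knowledge: the sensor detects sound iff the heart beats
    return b["sound"] == b["heart_beats"]

def R(b):   # requirement: if the heart stops, the bell must warn the nurse
    return b["heart_beats"] or b["bell"]

def MP(b):  # program on the machine: flag tracks "no sound", bell tracks flag
    return b["flag"] == (not b["sound"]) and b["bell"] == b["flag"]

def behaviors():
    for bits in product([False, True], repeat=len(VARS)):
        yield dict(zip(VARS, bits))

# Adequacy-style check: every behavior allowed by W, M and P satisfies R.
adequate = all(R(b) for b in behaviors() if W(b) and MP(b))
# Consistency: the domain knowledge admits at least one behavior.
consistent = any(W(b) for b in behaviors())
print(adequate, consistent)  # True True
```

Because everything is boolean, exhaustive enumeration over the 16 behaviors suffices; a richer state space would call for a real model checker.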
Slide 17
Anecdote
Safety requirement in an aircraft: reverse thrusters should not activate while the plane is in the air or stopped on the ground.
Domain knowledge assumption: wheels are turning iff the plane is rolling on the ground.
Hidden from the system (eh): whether the plane is on the ground.
Visible to the system (ev): whether the wheels are turning.
Fallacy in 2, the domain knowledge assumption: during hydroplaning the wheels may not turn even though the plane is rolling on the ground.
Slide 18
Fundamental Refinement Obligations
FO1 Adequacy: ∀e, s. W ∧ M ∧ P ⇒ R — implementing the program P on the machine M implies that the requirements R are met so long as the domain W behaves as expected.
FO2 Consistency: ∃e, s. W — the domain assumptions W are consistent.
FO3 Relative Consistency: ∀ev. (∃eh, s. W) ⇒ (∃eh, s. W ∧ M ∧ P) — allowed actions of the environment must be consistent with the system behavior.
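FO3 can be checked mechanically on a toy model. The sketch below uses an assumed boolean encoding of the patient monitor (ev = the heartbeat sound, eh = the heart's state, s = the alarm flag and the bell); all predicate bodies are invented for illustration:

```python
from itertools import product

# Assumed split of variables for the patient-monitor illustration:
#   ev: sound detected?   eh: heart beating?   s = (flag, bell)
EV = [False, True]
EH = [False, True]
S = list(product([False, True], repeat=2))

def W(ev, eh):      # domain knowledge: sound is detected iff the heart beats
    return ev == eh

def MP(ev, s):      # machine + program: flag = "no sound", bell = flag
    flag, bell = s
    return flag == (not ev) and bell == flag

# FO3: forall ev. (exists eh, s. W) => (exists eh, s. W /\ M /\ P)
fo3 = all(
    any(W(ev, eh) and MP(ev, s) for eh in EH for s in S)
    for ev in EV
    if any(W(ev, eh) for eh in EH for s in S)
)
print(fo3)  # True
```

The guarded `if` implements the antecedent: environment choices the domain rules out impose no obligation on the system.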
Slide 19
Relative Consistency: How to Get It Wrong!
FO3: ∀ev. (∃eh, s. W) ⇒ (∃eh, s. W ∧ M′), where M′ = M ∧ P.
Wrong attempt: some choice of environment events makes the system consistent: ∃e, s. W ∧ M′.
Too weak: the system doesn’t get to pick what the environment will do.
Slide 20
Second Way to Get it Wrong
FO3: ∀ev. (∃eh, s. W) ⇒ (∃eh, s. W ∧ M′)
Wrong attempt: every set of events consistent with the domain assumptions is allowed by the system: ∀e, s. W ⇒ M′.
Too strong: this defeats the point of the requirements, which say how the system should constrain possibilities.
Slide 21
Third Way to Get it Wrong
FO3: ∀ev. (∃eh, s. W) ⇒ (∃eh, s. W ∧ M′)
Wrong attempt: if every system behavior is consistent with the domain assumptions, then the system must allow one: ∀e. (∀s. W) ⇒ (∃s. W ∧ M′).
Too weak: given e, if there is even one s that is inconsistent with domain knowledge, the antecedent fails and the system can implement anything.
Slide 22
Fourth Way to Get it Wrong
FO3: ∀ev. (∃eh, s. W) ⇒ (∃eh, s. W ∧ M′)
Wrong attempt: if there is a system behavior consistent with the domain assumptions, then the system must allow it: ∀e. (∃s. W) ⇒ (∃s. W ∧ M′).
Too strong: consider the patient monitoring system. It is consistent with W for the patient’s heart to stop beating (reflected in ev) without the nurse being warned (part of eh)!
Slide 23
Role of the Specification
The specification provides communication between the user and the developer, expressed in the common vocabulary of the environment and system.
It enables a factorization of responsibilities between user and developer.
• Users work with designations visible in the environment (viz. W and R).
• Developers work with designations visible in the system (viz. P and M).
Slide 24
Environment-Side Obligations
EO1 ∀e, s. W ∧ S ⇒ R
EO2 ∃e, s. W
EO3 ∀ev. (∃eh, s. W) ⇒ ((∃s. S) ∧ (∀s. S ⇒ ∃eh. W))
Slide 25
System-Side Obligations
SO1 ∀e. (∃s. S) ⇒ ((∃s. M ∧ P) ∧ (∀s. M ∧ P ⇒ S))
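On a small model the factored obligations can be checked directly. The sketch below uses an assumed boolean encoding of the patient monitor with a hypothetical specification S_spec ("no sound implies ring the bell"); it verifies EO1 and SO1 by enumeration:

```python
from itertools import product

# Assumed encoding: ev = sound, eh = heart_beats, s = (flag, bell).
# S_spec is a hypothetical specification in the shared vocabulary.
EV = [False, True]
EH = [False, True]
SVALS = list(product([False, True], repeat=2))

def W(ev, eh):          # domain: sound is detected iff the heart beats
    return ev == eh

def R(ev, eh, s):       # requirement: heart stopped => bell rings
    return eh or s[1]

def S_spec(ev, s):      # specification: no sound => ring the bell
    return ev or s[1]

def MP(ev, s):          # program on machine: flag = "no sound", bell = flag
    return s[0] == (not ev) and s[1] == s[0]

# EO1: forall e, s. W /\ S => R  (environment side, no mention of M or P)
eo1 = all(R(ev, eh, s) for ev in EV for eh in EH for s in SVALS
          if W(ev, eh) and S_spec(ev, s))
# SO1: forall e. (exists s. S) => (exists s. M /\ P) /\ (forall s. M /\ P => S)
so1 = all((any(MP(ev, s) for s in SVALS)
           and all(S_spec(ev, s) for s in SVALS if MP(ev, s)))
          for ev in EV for eh in EH
          if any(S_spec(ev, s) for s in SVALS))
print(eo1, so1)  # True True
```

Note the factorization at work: the EO1 check never consults MP, and the SO1 check never consults W or R.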
Slide 26
How to Get it Wrong
EO3: ∀ev. (∃eh, s. W) ⇒ ((∃s. S) ∧ (∀s. S ⇒ ∃eh. W))
FO3: ∀ev. (∃eh, s. W) ⇒ (∃eh, s. W ∧ M′)
Replace EO3 with a weaker obligation formed by close analogy with FO3: ∀ev. (∃eh, s. W) ⇒ (∃eh, s. W ∧ S).
This fails to imply that the fundamental obligations are met.
Slide 27
The Key Objective
Theorem: EO1 ∧ EO2 ∧ EO3 ∧ SO1 ⇒ FO1 ∧ FO2 ∧ FO3.
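A sketch of why the theorem holds in the FO1 case, assuming the obligations in the reconstructed forms EO1: ∀e, s. W ∧ S ⇒ R; EO3 yielding ∃s. S whenever W is satisfiable for the given ev; and SO1 yielding ∀s. M ∧ P ⇒ S whenever ∃s. S:

```latex
\begin{align*}
&\text{Fix } e, s \text{ and assume } W \wedge M \wedge P.\\
&W \text{ holds, so } \exists e_h, s.\, W \text{ for this } e_v;\
  \text{EO3 then gives } \exists s.\, S.\\
&\text{SO1 then yields } \forall s.\, M \wedge P \Rightarrow S,\
  \text{so } S \text{ holds for our } s.\\
&\text{Finally EO1 } (W \wedge S \Rightarrow R) \text{ gives } R,
  \text{ establishing FO1.}
\end{align*}
```

The specification S is the hinge: the system side discharges M ∧ P ⇒ S without seeing W or R, and the environment side discharges W ∧ S ⇒ R without seeing M or P.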
Slide 28
Related Work
Functional documentation model (Parnas and Madey).
• Feasibility similar to relative consistency.
• Obligations weaker than WRSPM.
• FD model allows implementations with unpredictable behavior.
Domain descriptions (Jackson and Zave).
Reactive Modules (Alur and Henzinger).
Village Telephone System (Karthik, Obradovic, and authors).
Slide 29
Conclusions
Reference model based on 5 artifacts: WRSPM.
Four variables reflect visibility and control between environment and system.
Proof obligations can be factored between environment and system by the use of a specification in the common language.
Slide 30
Functional Documentation Model
[Diagram: monitored variables m and controlled variables c lie in the environment; inputs i and outputs o lie in the system. NAT(m,c) and REQ(m,c) relate m to c directly, while IN(m,i), SOF(i,o), and OUT(o,c) connect them through the system.]
Slide 31
FD Proof Obligations
FD1 Acceptability: ∀m, i, o, c. NAT(m,c) ∧ IN(m,i) ∧ SOF(i,o) ∧ OUT(o,c) ⇒ REQ(m,c)
FD2 Feasibility: ∀m. (∃c. NAT(m,c)) ⇒ (∃c. NAT(m,c) ∧ REQ(m,c))
FD3: ∀m. (∃c. NAT(m,c)) ⇒ (∃i. IN(m,i))
Slide 32
FD Counterexample
The following definitions satisfy all FD conditions, but cannot be realized.
NAT: (∀t. c(t) > 0) ∧ (∀t. m(t) < 0)
REQ: ∀t. c(t + 3) = −m(t)
IN: ∀t. i(t + 1) = m(t)
SOF: ∀t. o(t + 1) = i(t)
OUT: ∀t. c(t + 1) = o(t)
Slide 33
FD to WRSPM
m ↔ ev
c ↔ sv
i, o ↔ sh
W = NAT(m,c)
R = REQ(m,c)
M′ = IN(m,i) ∧ SOF(i,o) ∧ OUT(o,c)
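Under this translation, and assuming the reconstructed FO3 with no hidden environment variables (eh is empty here, since m is visible), FO3 specializes directly to a relative-consistency condition on the FD artifacts:

```latex
\begin{align*}
\mathrm{FO3}:\quad &\forall e_v.\ (\exists e_h, s.\, W)
  \Rightarrow (\exists e_h, s.\, W \wedge M')\\
&\text{with } e_v \mapsto m,\ s \mapsto (i, o, c),\
  e_h \text{ empty, becomes}\\
\mathrm{T3}:\quad &\forall m.\ (\exists c.\, \mathrm{NAT})
  \Rightarrow (\exists i, o, c.\
  \mathrm{NAT} \wedge \mathrm{IN} \wedge \mathrm{SOF} \wedge \mathrm{OUT})
\end{align*}
```

Since NAT mentions only m and c, the antecedent ∃i, o, c. NAT collapses to ∃c. NAT, matching the form of T3 on the next slide.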
Slide 34
WRSPM Conditions for FD Artifacts
T1 Admissibility: ∀m, i, o, c. NAT(m,c) ∧ IN(m,i) ∧ SOF(i,o) ∧ OUT(o,c) ⇒ REQ(m,c)
T2 Consistency: ∃m, c. NAT(m,c)
T3 Relative Consistency: ∀m. (∃c. NAT(m,c)) ⇒ (∃i, o, c. NAT(m,c) ∧ IN(m,i) ∧ SOF(i,o) ∧ OUT(o,c))
Theorem: T1 ∧ T2 ∧ T3 ⇒ FD1 ∧ FD2 ∧ FD3.