Rod Chapman
Directions in Secure Software Development
High Assurance Software Symposium
Contents
• Some observations
• Experiences with secure systems…
• Trends and Concerns
• A future?
Some observations
• The safety-critical industry has been building remarkably reliable and high-quality systems for years…
• Can we apply the same approach to secure systems?
• At Altran Praxis, we first tested this hypothesis in 1999 with the MULTOS CA system.
• It seems the answer is “Yes… but with a few key assumptions changed…”
Some observations
• In the last ten years or so, we have seen a huge rise in interest in “security vulnerabilities” in:
  • operating systems,
  • embedded systems,
  • SCADA systems, etc.
• What changed?
What changed?
• In “Software Security: Building Security In”, Gary McGraw cites three main areas of change…
• Connectivity: “Let’s network everything!”
• Extensibility: “Dynamically loadable everything!”
• Growth in complexity: more code implies more bugs.
Safety and security… the difference?
• Possibly the most notable difference is the environment itself.
• Safety-related systems mostly operate in a benign environment that we can control to some extent.
• Security-critical systems operate in an openly malicious environment where we may have no control over inputs, access, and so on…
• …and attackers are arbitrarily intelligent, well-funded, and motivated.
• “Programming Satan’s Computer” [Anderson and Needham]
Experiences with Security
• In the worst case, we must assume systems will be attacked in ways we haven’t even thought of (let alone tested)…
• Therefore any claim of “we’ve tested it lots” or “we’ve tested it more…” offers limited confidence in security properties.
• So why not use formal methods and sound static verification – i.e. Correctness-by-Construction?
• Examples: the MULTOS CA, Tokeneer, SPARKSkein, and other SPARK users.
Experiences with Security
• At a technical level, strong static verification offers confidence in a number of basic areas:
  • absence of “dumb bugs”;
  • verification of specific program properties of interest.
• But… no silver bullet. There is still a need for solid requirements engineering, security engineering, architecture, and so on.
Trends and Concerns
• Trend: a massive resurgence of interest in static analysis in general. (How many companies are selling “security vulnerability finder” tools?)
• Good! It raises awareness and average maturity.
• Concern: almost all effort is aimed at the “low-hanging fruit” of retrospective analysis of legacy code bases. This gives the impression that such analysis is the only thing you can do…
Trends and Concerns
• Trend: lots of effort on “enumeration” of defects, vulnerabilities, etc.
  • Examples: the SANS “Top 25” list, CVE, CWE, the ISO OWGV effort, etc.
• Good! It provides a starting point…
• Concern: it encourages a “box-ticking” mentality.
• Concern: application-specific security requirements (i.e. the interesting stuff) form an infinite set. Trying to enumerate it is unlikely to terminate!
A Future?
• We have shown that the CbyC/SPARK approach can yield strong results, particularly for the highest-grade secure systems.
• But… many systems incorporate legacy code, COTS, “SOUP”, and many languages, developed by many teams…
• The boundary between “safety” and “security” is becoming blurred – many systems exhibit “both”…
A Future?
• Can we use architecture to isolate (sub-)systems of differing security requirements?
• Can we combine verification evidence from multiple sources to gain confidence where it is most needed?
  • Formal methods for the most critical components?
  • Test, isolate, and contain for the less critical?
• What can we do to contractualize and enforce the boundary between such components?
Altran Praxis Limited
22 St. Lawrence Street, Southgate,
Bath BA1 1AN
United Kingdom
Telephone: +44 (0) 1225 466991
Facsimile: +44 (0) 1225 469006
Website: www.altran-praxis.com