CSCE 548: Code Review
CSCE 548 - Farkas 2
Reading
This lecture:
– McGraw: Chapter 4
– Recommended: Best Practices for Peer Code Review, http://www.smartbear.com/docs/BestPracticesForPeerCodeReview.pdf
Next lecture:
– Architectural Risk Analysis – Chapter 5
Application of Touchpoints
[Diagram: the security touchpoints mapped onto software artifacts]
Artifacts: Requirements and Use Cases; Architecture and Design; Test Plans; Code; Tests and Test Results; Feedback from the Field
Touchpoints: 1. Code Review (Tools); 2. Risk Analysis; 3. Penetration Testing; 4. Risk-Based Security Tests; 5. Abuse Cases; 6. Security Requirements; 7. Security Operations; External Review
Code Review (Tool)
Artifact: code
– Targets implementation bugs
– Uses static analysis tools
– A white-hat activity
Software Bugs
Programming bugs:
– Compiler catches the error, the developer corrects the bug, and development continues
Security-relevant bugs:
– May lie dormant for years
– Potentially much costlier than an ordinary programming error
Who should be responsible for security bugs?
– The software developer?
– A security expert?
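The distinction can be made concrete with a short Python sketch (the function names here are invented for illustration): the interpreter accepts both versions without complaint, and both behave identically on ordinary input, so the unsafe one can lie dormant for years.

```python
import subprocess

def greet_unsafe(name: str) -> str:
    # Runs fine for benign input, so the bug stays hidden: a name like
    # "world; id" is parsed by the shell as TWO commands (command injection).
    return subprocess.run(f"echo Hello {name}", shell=True,
                          capture_output=True, text=True).stdout.strip()

def greet_safe(name: str) -> str:
    # Argument list, no shell: `name` is always treated as data, never code.
    return subprocess.run(["echo", "Hello", name],
                          capture_output=True, text=True).stdout.strip()
```

`greet_safe("world; id")` returns the literal string `Hello world; id`, while the unsafe variant would execute `id` as a second shell command; nothing in the ordinary toolchain flags the difference, which is exactly why security-relevant bugs need dedicated review.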
Manual vs. Automated Code Review
Manual code review
– Tedious, error prone, exhausting
– Needs an expert with the mindset of an attacker!
Static analysis tools
– Identify many common coding problems
– Faster than manual review
– Need a developer with a basic understanding of security problems and how to fix the ones detected
Best Practices
Peer code review recommendations from SmartBear Software
– Based on a Cisco code review study: over 6,000 programmers and 100 companies
– "Lessons learned" results
– Lightweight code review
Best Practices Recommendations 1.
1. Review fewer than 200–400 lines at a time
– Optimizes the number of detected vulnerabilities (70–90%)
2. Aim for an inspection rate below 300–500 lines of code/hour
– Faster is not better, based on the number of detected vulnerabilities
3. Do not spend more than 60–90 minutes on a review at a time
– Efficiency drops after about an hour of intense work
4. Make developers annotate their code
– Encourages developers to "double-check" their work
– Reduces the number of vulnerabilities in the code
Best Practices Recommendations 2.
5. Establish quantifiable goals for code review
– External metrics: e.g., reduced number of support calls
– Internal metrics: e.g., defect rate
6. Maintain a checklist
– Prevents omission of important security checks
7. Verify that defects are actually fixed
– Needs good collaborative review of the software
8. Managers must support code review
– Supports team building and acceptance of the process
Best Practices Recommendations 3.
9. Beware of the "Big Brother" effect
– Use of metrics – role of the manager
10. The ego effect
– Use code review to encourage good coding habits
– Review at least 20–33% of the code
11. Lightweight style of review
– Tool assisted
– Just as efficient as a formal, heavyweight review, but requires about 1/5 of the time
Source Code vs. Binary Code Check
What to check: source code or binary code?
Source code:
– See the logic, control, and data flow
– See explicit code lines
– Fixes can be applied directly to the source code
Compiled code:
– May need reverse engineering (disassembly, decompilation)
– Finding a few vulnerabilities is easy; finding all of them is difficult
– Fixes may be incorporated as binary modules or external filters
How Static Analysis Works
Looks for a fixed set of patterns or rules
– Syntactic matches
– Lexical analysis
– Flow analysis (control flow, call chains, data flow)
False negatives and false positives
Sound tool: given a set of assumptions, the static analysis tool produces no false negatives
Commercial tools: unsound
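As a rough illustration of the lexical end of this spectrum, here is a toy scanner in Python (the rule set and names are invented; real tools layer flow analysis on top of this). Because it matches tokens without understanding context, it also demonstrates how false positives arise.

```python
import re

# Hypothetical rule set: C function names with known-risky uses.
RULES = {
    r"\bgets\s*\(":   "gets() cannot bound its input (buffer overflow)",
    r"\bstrcpy\s*\(": "strcpy() copies without a length check",
}

def scan(source: str):
    """Return (line number, warning) pairs for every rule match."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, message in RULES.items():
            if re.search(pattern, line):
                findings.append((lineno, message))
    return findings

c_code = """\
char buf[64];
gets(buf);
/* strcpy(dst, src) appears only in this comment */
"""
# Line 2 is a true positive; line 3 is a false positive, because a
# purely lexical scanner cannot tell code from comments.
print(scan(c_code))
```

A sound tool would be one that, under its stated assumptions, never misses a real instance of these patterns (no false negatives); as the comment-matching example shows, keeping false positives low at the same time is the hard part.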
Static Analysis
Identifies vulnerable constructs
Similar to a compiler: preprocesses the source file and evaluates it against known vulnerabilities
Scope of analysis:
– Local: one function at a time
– Module-level: one class (or compilation unit) at a time; incorporates relationships between functions
– Global: entire program – all relationships between functions
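To make the scope distinction concrete: a global analysis needs cross-function information, such as a call graph, that a local one-function-at-a-time pass never builds. A minimal sketch using Python's own `ast` module (the analyzed program and function names are invented for illustration):

```python
import ast
from collections import defaultdict

def call_graph(source: str) -> dict:
    """Map each top-level function to the function names it calls directly."""
    tree = ast.parse(source)
    graph = defaultdict(set)
    for func in tree.body:
        if isinstance(func, ast.FunctionDef):
            # Walk the whole function body and record simple-name calls.
            for node in ast.walk(func):
                if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
                    graph[func.name].add(node.func.id)
    return dict(graph)

program = """
def sanitize(s):
    return s.replace(";", "")

def handler(request):
    run_query(sanitize(request))

def run_query(q):
    pass
"""
# With this global view, an analyzer can check whether every path into
# run_query() first passes through sanitize(); a purely local analysis of
# run_query() alone cannot ask that question.
print(call_graph(program))
```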
Rule Coverage
Taxonomy of coding errors:
– Language-specific (e.g., C/C++, Java, etc.)
– Functions or APIs
Academia vs. industry
Commercial Tools
Easy to use – but still needs expert knowledge
Can process large code bases (millions of lines) efficiently
Needs a competent reviewer of the results
Encapsulates knowledge (known vulnerabilities) and efficient flow analysis
Encourages efficient and secure coding
Tool Characteristics
Be designed for security
Support multiple tiers
Be extensible
Be useful for both security analysts and developers
Support existing development processes
Make sense for multiple stakeholders
Next Class
Architectural Risk Analysis – Chapter 5