Charalambos Ioannides. "Introducing LCS to Digital Design Verification". IWLCS, 2011
Introducing LCS to Digital Design Verification
Charalambos Ioannides
What is a Design?
• Simply – a collection of code files that defines the functionality of a single digital electronics component
• A piece of code in a hardware description language (Verilog or VHDL)
• A Design goes through various phases:
Requirements → Code → List of Gates → Silicon
What is Design Verification?
• “Process used to ensure a design’s functional correctness with respect to requirements and specification prior to manufacture”
• The ultimate goal is to discover as many bugs in a design as possible before shipping the product to the customer
• We maximize bug discovery by maximizing coverage (code, functional) on the design by means of testing
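To make the coverage idea concrete, here is a minimal sketch (not from the talk; statement IDs and test traces are hypothetical) of code coverage as the fraction of design statements exercised by a test suite:

```python
# Hypothetical example: code coverage as the fraction of design
# statements exercised by at least one test during simulation.

design_statements = {"s1", "s2", "s3", "s4", "s5"}

# Each test "executes" some subset of statements in simulation.
test_traces = {
    "test_a": {"s1", "s2"},
    "test_b": {"s2", "s3", "s5"},
}

# Union of everything any test touched, intersected with the design.
covered = set().union(*test_traces.values())
coverage = len(covered & design_statements) / len(design_statements)
print(f"{coverage:.0%}")  # 80%
```

Functional coverage works the same way, except the items counted are user-defined functional events rather than code statements.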
Simulation-based DV
[Diagram: simulation-based DV flow — a test generator (TG), steered by biases, produces tests for the simulator (SIM) running the design under verification (DUV); the resulting coverage feeds an ML technique, closing the loop via directed test generation, biased random test generation, test learning or bias learning]
Why is DV Hard?
• Increasing digital design complexity
• Greater automation of the Design process than of the Verification process
• Increasing miniaturization level of silicon chips
• Competition – increasing feature demands by customers
• Power management
• Increasing Verification Effort
• Many hundreds of tests, 7-month projects
Why is DV Hard?
[Diagram: competing pressures — Design Complexity (Technology, Engineers, Process) against Product to Market (Get it Right!, Make it Fast!, Limited Budget, Limited Resources, Reputation)]
Previous attempts
CDG via ML
• GA – Pros: mature platform; decent results. Cons: non-universal environment; finds one/few solutions for the entire search space
• GP – Pros: good results; no domain knowledge. Cons: code diversity issues; finds one/few solutions for the entire search space
• BN – Pros: good results; approximates the CDG process well. Cons: requires domain knowledge; learnt knowledge difficult to interpret
• Markov Models – Pros: excellent bug discovery; approximates the CDG process well. Cons: effort to set up the environment; learnt knowledge difficult to interpret
For details: http://www.cs.bris.ac.uk/Publications/pub_master.jsp?id=2001405
Why LCS (XCS) on DV?
• Adaptive Learning Systems
  Ø Could formulate the problem in a range of different ways
  Ø Designs change over time, and coverage requirements also change during a simulation run
• Develop a complete, accurate and minimal representation of a problem
  Ø Achieve coverage in more than one way and balance it
• Rules developed are easy to understand, analyse, combine and alter. No domain knowledge required.
Why LCS (XCS) on DV?
[Chart: XCS Generalization on MUX — Generalization Ratio and Time(h)/experiment plotted on a logarithmic scale against MUX size (6, 11, 20, 37, 70)]
Why not LCS (XCS) on DV?
• XCS has issues with Boolean problems that require overlapping rules
• The problem itself is too big for XCS, but it can be scaled down
• Any future attempt needs to be made noise-proof
FUTURE RESEARCH WILL TELL!
First XCS attempt on DV
• Single-step problem
• Learn the relationship between the biases fed to a Test Generator (Condition) and the coverage (Action) they achieve
• Noiseless environment, as a single randomization seed is used by the TG
• Both Conditions and Actions are bit strings; the ternary alphabet {0,1,#} is used for expressing the learnt relationships
• Use the standard XCS parameters as in the 2002 XCS algorithmic description
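The ternary-alphabet representation can be sketched as follows (an illustrative sketch, not the author's code; function names are mine): a condition over {0,1,#} matches a bias bit string if every non-'#' position agrees, and each '#' doubles the number of bias vectors the rule covers.

```python
# Sketch of XCS-style ternary matching for the DV formulation:
# conditions over {0,1,#} describe sets of test-generator bias vectors.

def matches(condition: str, bias: str) -> bool:
    """True if every non-'#' position of the condition equals the
    corresponding bit of the bias string; '#' is a don't-care."""
    return all(c == "#" or c == b for c, b in zip(condition, bias))

def match_count(condition: str) -> int:
    """Number of distinct bias vectors a condition covers:
    2 ** (number of '#' positions)."""
    return 2 ** condition.count("#")

if __name__ == "__main__":
    print(matches("0###1##", "0101100"))  # True
    print(matches("0###1##", "1101100"))  # False
    print(match_count("0###1##"))         # 32
```

This is why a condition with five '#' symbols, such as 0###1## in the rule table later in the talk, describes exactly 32 bias vectors.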
Proposed Solution
DV1 Function
XCS (orig.) on DV1
XCS (impr.) on DV1
Proposed Solution
DV3 Function
XCS (impr.) on DV3
Learnt Rules (DV3)

Classifiers
ID | Cond. : Action | R    | E | F    | AS    | EXP   | NUM
1  | 0###1## : 0000 | 1000 | 0 | 1.00 | 26.98 | 22805 | 21
2  | ###10## : 0000 | 0    | 0 | 0.48 | 31.76 | 390   | 6
3  | 0#110## : 1110 | 1000 | 0 | 0.63 | 23.31 | 803   | 6
4  | 01#10## : 1110 | 1000 | 0 | 0.58 | 26.06 | 392   | 2
5  | 1#01100 : 0001 | 1000 | 0 | 0.42 | 9.34  | 102   | 3
6  | 0#100## : 0010 | 1000 | 0 | 0.50 | 13.29 | 1534  | 2
7  | 01#00## : 0010 | 1000 | 0 | 0.82 | 17.97 | 3520  | 8
8  | 1###0#0 : 1100 | 1000 | 0 | 0.62 | 20.85 | 403   | 4
9  | ####### : 0011 | 0    | 0 | 1.00 | 36.16 | 6285  | 36
10 | ####### : 0110 | 0    | 0 | 1.00 | 44.35 | 6157  | 30
• ID1 – which 32 biases to avoid
• ID2 – wrong, but tells us that the 32 bias vectors will cover at least one signal
• ID3 & 5 or ID4 & 5 achieve max coverage; longer are ID 5, 6 & 8 or 5, 7 & 8
• ID9 & 10 – tell us what cannot be achieved
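The kind of post-hoc reading done above can be partly automated. A minimal sketch (assumed tuple representation; the values are a subset of the table above) that separates accurate, rewarded rules from the fully general zero-reward rules that mark unachievable coverage:

```python
# Sketch: sorting learnt XCS classifiers into interpretable groups.
# Tuples are (condition, action, reward, error, fitness), taken
# from a subset of the DV3 rule table.

classifiers = [
    ("0###1##", "0000", 1000, 0, 1.00),
    ("###10##", "0000",    0, 0, 0.48),
    ("0#110##", "1110", 1000, 0, 0.63),
    ("01#10##", "1110", 1000, 0, 0.58),
    ("#######", "0011",    0, 0, 1.00),
]

# Accurate rules the system is confident about: zero error, full reward.
confident = [c for c in classifiers if c[3] == 0 and c[2] > 0]

# Fully general conditions ('#' everywhere) with zero reward flag
# coverage actions that no bias vector can achieve.
unachievable = [c for c in classifiers if set(c[0]) == {"#"} and c[2] == 0]

print([c[0] for c in confident])     # ['0###1##', '0#110##', '01#10##']
print([c[1] for c in unachievable])  # ['0011']
```

This mirrors the manual analysis: rewarded accurate rules identify useful (or, as with ID1, to-be-avoided) bias regions, while the all-'#' zero-reward rules correspond to ID9 and ID10.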
Why deal with DV?
• DV is a hard real-world problem
• Designs have complex interactions and are becoming more complex
• Maximise coverage while minimizing the resources spent on it
• Wicked fitness landscape resembling needle-in-a-haystack or deceptive problems
• The 80/20 Rule applies
• Chance to compete with other EA and probabilistic ML techniques
• Formulate the problem as either multistep or single-step, using a variety of representations (binary, integers, real numbers, graphs etc.)
Fame and Fortune!!!
THANK YOU
Any Questions?
Previous attempts 1
• Genetic Algorithms
• Test structure or bias for maximising coverage
• Pros:
  • Decent results in both Code and Functional Coverage (>70%)
  • Easy to understand evolved knowledge
  • Mature platforms
• Cons:
  • Some techniques required domain knowledge (setting the fitness function or tweaking other parameters)
  • Non-universal verification environment
  • Searches for a single solution for the entire search space – not very helpful for DV problems
Previous attempts 2
• Genetic Programming
• Test structure for maximising coverage by learning DAGs
• Pros:
  • Good results in Code Coverage (>90%)
  • Only point of user involvement is the Instruction Library
  • Mature platform
• Cons:
  • Earlier versions had problems with code diversity
  • Verification environment mostly for microprocessors
  • Searches for a single solution for the entire search space – not very helpful for DV problems
Previous attempts 3
• Bayesian Networks
• Probabilistic Network model to answer MPE questions on the coverage to be achieved
• Pros:
  • Good results in Functional Coverage (~90-100%)
  • Approximates the CDG process well
• Cons:
  • Domain Knowledge in constructing the initial Network (though automation techniques have been tried)
  • Verification environment mostly for sub-systems of microprocessors (i.e. doesn’t scale to larger systems)
  • Difficult to understand what has been learnt, difficult to later manually improve
Previous attempts 4
• Markov Models
• Probabilistic Network model (FSM) to generate stimuli for achieving maximum coverage
• Pros:
  • Excellent results in bug coverage (100%)
  • Approximates the CDG process very well
• Cons:
  • Effort in constructing the template files (TG) and activity monitors
  • Difficult to understand what has been learnt, difficult to later manually improve