Testbench Organization and Design




Testbench Organization and Design


2

Simulation Technology

Testbenches are discarded once the design is verified

The structure of testbenches is often at the mercy of verification engineers; testbenches
− frequently generate wrong stimuli,
− compare with wrong results,
− miss corner cases.

diverting valuable engineering time to debugging the testbench (instead of the design)

Without well-organized guidelines, testbenches can be a nightmare to maintain.

Important to understand testbench design.


3

Testbench Environment

[Figure: Testbench environment. The testbench contains Initialization & Input Stimuli, Response Assessment, Verification Utility, and Clock Generation & Synchronization blocks, connected to the Design Under Verification through the Testbench-to-Design Interface.]


4

Testbench Environment: Example

• Compute the remainder of CRC-8:

  remainder = input vector % (10000111)

[Figure: CRC-8 circuit, eight D flip-flops Q0–Q7 forming a feedback shift register, driven by Input and Clock.]

Instantiation:

CRC8 DUV (.IN(in), .CLK(clk), .Q0(q0), … , .Q7(q7));


5

Testbench Environment: Example

Description of the design under verification and input stimuli:
− Need to apply a bit stream:
− Store bits in an array and apply the array.

initial i = size_of_input;

always @(posedge clk) begin
  if (i != 0) begin
    in <= input_array[i];
    i <= i - 1;
  end
end

Response assessment:

remainder = input_array % 8'b10000111;
if (remainder != {q7, q6, q5, q4, q3, q2, q1, q0}) print_error();
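The response assessment above relies on a reference computation of the remainder. As a sketch of such a golden model, the GF(2) polynomial remainder (the operation behind `%` on bit vectors here) can be written in C. The function name and the 64-bit dividend width are our assumptions, not the slides'.

```c
#include <assert.h>

/* GF(2) polynomial remainder: long division of `dividend` by `divisor`
 * by XOR-ing out each high bit.  A minimal golden-model sketch; the
 * 8-bit divisor 10000111 (0x87) is the pattern used in the slides. */
unsigned gf2_mod(unsigned long long dividend, unsigned divisor)
{
    int dbits = 0;
    unsigned d = divisor;

    if (divisor == 0)
        return 0;                          /* guard: no division by zero */
    while (d) { dbits++; d >>= 1; }        /* bit length of the divisor  */
    for (int i = 63; i >= dbits - 1; i--)  /* cancel bits from the top   */
        if ((dividend >> i) & 1ULL)
            dividend ^= (unsigned long long)divisor << (i - (dbits - 1));
    return (unsigned)dividend;             /* degree < degree of divisor */
}
```

The result is what the shift-register DUV should hold after the whole bit stream has been clocked in, so it can be compared against {q7, …, q0}.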


6

Testbench Environment: Example

FF initialization:

initial begin
  DUV.Q0 = 1'b0;
  DUV.Q1 = 1'b0;
  …
  DUV.Q7 = 1'b0;
end

Clock generation:

initial clk = 1'b0;
always clk = #1 ~clk;


7

Testbench Environment: Example

Testbench-to-design interface:
− access to the design signals through primary inputs/outputs and hierarchical paths.

Verification utility:
− functions and modules shared by various parts of the testbench, e.g. print_error().


8

Test Cases

• Test Case: Properties (scenarios) to be verified.

• Example: ALU:

− TC1: Verifying integer operations,
− TC2: Verifying Boolean operations.


9

Test Cases

Each TC may have its own
− initial values,
− input stimuli,
− expected responses.

• Example: TC1:

− Verify integer add/subtract,
− Input vectors chosen to cause corner cases (e.g. overflow).

TC2:
− Verify Boolean operations,
− Input vectors: certain bit patterns (e.g. 101010101, 11111111).

• Reusability: Use the same testbench for multiple test cases.

− To maximize portability, TCs must be separated from the testbench,
− e.g. read initial values from a file (that contains a TC).


10

Initialization

• Initialization: Assign values to state elements (FFs, memories). Although this is the task of on-chip circuitry (at power-on), it is often done in the testbench. Reasons:

− The initialization circuit has not been designed yet.
− Simulation is to start from a point long after power-on (e.g. simulating through the initialization stage takes too long).
− Simulation emulates an exception condition (one that normal operation never reaches from its legal initial states).

To gain reusability, initialization code should be encapsulated inside a procedure.


11

Initialization

• Initialization @ time zero:

Some simulators create a transition (event) from unknown X (uninitialized) to the initial value and some others don't.
− Inconsistent results from simulator to simulator.

Initialize at a positive time. Even safer:
− Initialize to X (or 'U') at time zero,
− then initialize to the intended value at a positive time.

task init_later;
  input [N:0] value;
  begin
    design.usb.xmit.Q = 1'bx;
    …
    #1;
    design.usb.xmit.Q = value[0];
    …
  end
endtask


12

Clock Generation and Synchronization

• Explicit Method:

initial clock = 1'b0;
always begin
  #1 clock = 1'b1;
  #1 clock = 1'b0;
  #2 clock = 1'b1;
  #2 clock = 1'b0;
end

[Waveform: clock, with the period marked.]

• Toggle Method:

initial clock = 1'b0;
always begin
  #1 clock = ~clock; // rising
  #1 clock = ~clock; // falling
  #2 clock = ~clock; // rising
  #2 clock = ~clock; // falling
end
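To make the timing of these snippets concrete, the waveform they produce can be modeled in C: with the #1/#1/#2/#2 delays shown, the clock rises at t=1, falls at t=2, rises at t=4, falls at t=6, and the 6-unit pattern repeats. This is our illustration, not code from the slides.

```c
#include <assert.h>

/* Level of the explicit-method clock at integer time t, assuming the
 * #1/#1/#2/#2 delay pattern above (one period = 6 time units).
 * Within a period: high during [1,2) and [4,6), low otherwise. */
int explicit_clock_level(int t)
{
    int phase = t % 6;                 /* position inside one period */
    return (phase == 1) || (phase >= 4);
}
```

The irregular duty cycle (high 3 of every 6 units, but in unequal stretches) is exactly what makes the explicit method easy to read and the toggle method hard to eyeball.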


13

Clock Generation and Synchronization

• Toggle Method: Difficult to see the value of the clock at a given time.
− Mitigate with comments: rising/falling.

If left uninitialized, the clock doesn't toggle (stays at x).
− A potential bug.

Easy to change the phase or initial value:
− other statements are kept intact.


14

Clock Generation and Synchronization

• Time Unit and Resolution: During verification, the clock period/duty cycle may change.

− Use parameters (rather than hard-coded delays).


15

Clock Generation and Synchronization

• Multiple Clock Systems with a Base Clock: Clock divider:

initial i = 0;
always @(base_clock) begin
  i = i % N;
  if (i == 0) derived_clock = ~derived_clock;
  i = i + 1;
end
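The divider above can be mirrored in C to check the divide ratio: toggling on every Nth base-clock event gives a derived-clock period of 2N base events. A sketch (function name is ours):

```c
#include <assert.h>

/* Model of the slide's divider: the derived clock toggles on every
 * Nth base-clock event, so one derived period spans 2N base events. */
int derived_level_after(int events, int N)
{
    int i = 0, level = 0;
    for (int e = 0; e < events; e++) {
        i = i % N;
        if (i == 0)
            level = !level;        /* toggle once per N events */
        i = i + 1;
    }
    return level;
}
```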

Clock multiplier:
− Note: synchronized with the base clock.

always @(posedge base_clock) begin
  repeat (2*N) clock = #(period/(2*N)) ~clock;
end


16

Clock Generation and Synchronization

• Multiple Clock Systems with a Base Clock:

If the period of the base clock is not known, measure it:

initial begin
  derived_clock = 1'b0; // assume starting at 0
  @(posedge base_clock) T1 = $realtime;
  @(posedge base_clock) T2 = $realtime;
  period = T2 - T1;
  T1 = T2;
  ->start; // start generating derived_clock
end

// continuously measure the base clock's period
always @(start)
  forever @(posedge base_clock) begin
    T2 = $realtime;
    period = T2 - T1;
    T1 = T2;
  end

// generate derived_clock at N times the frequency of base_clock
always @(start)
  forever derived_clock = #(period/(2*N)) ~derived_clock;


17

Clock Generation and Synchronization

If two periods are independent, don't generate one clock from the other.

initial clock1 = 1'b0;
always clock1 = #1 ~clock1;

initial clock2 = 1'b0;
always clock2 = #2 ~clock2;

Right

initial clock1 = 1'b0;
always clock1 = #1 ~clock1;

initial clock2 = 1'b0;
always @(negedge clock1) clock2 = #2 ~clock2;

Wrong


18

Clock Generation and Synchronization

initial clock1 = 1'b0;
always clock1 = #1 ~clock1;

jitter = $random(seed) % RANGE;
assign clock1_jittered = #(jitter) clock1;

Both versions may show the same waveforms on any simulator, but they are different:

− adding jitter to clock1 must not affect clock2.


19

Clock Synchronization

When two independent waveforms arrive at the same gate, glitches may be produced:

− intermittent behavior.

Independent waveforms should be synchronized before propagation.

[Figure: waveforms w1, w2, w3 entering a synchronizer ("Synch'er"); a synchronizing signal triggers sampling to produce the synchronized signal.]


20

Clock Synchronization

• Synchronizer: A latch. Uses one signal (the synchronizing signal) to trigger sampling of another, creating a dependency between them.
− Removes uncertainty in their relative phase.

always @(fast_clock)
  clock_synchronized <= clock;

Some transitions may be missed.
− The signal with the highest frequency is chosen as the synchronizing signal.
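The warning that transitions may be missed can be illustrated with a small C model of the sampling: the synchronizer only sees the input at synchronizing-clock edges. This sketch assumes signals are arrays of per-time-step values; names are ours.

```c
#include <assert.h>

/* Latch-style synchronizer model: the synchronizing clock samples `sig`
 * every `period` time steps.  Returns how many samples saw the signal
 * high; pulses that fall between sampling edges are simply missed. */
int count_high_samples(const int *sig, int n, int period)
{
    int high = 0;
    for (int t = 0; t < n; t += period)
        high += sig[t];            /* value captured at each edge */
    return high;
}
```

With a signal toggling every step and a sampling period of 2, every high pulse lands between edges and none is captured, which is why the fastest signal should be the synchronizing one.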


21

Stimulus Generation

• Synchronous Method:

Applying vectors to primary inputs synchronously.

[Figure: a stimuli memory in the testbench holds input vectors; a stimulus clock reads one vector at a time and applies it to the design's I/O, feeding the design's memory, control, and data blocks.]


22

Stimulus Generation

Vectors are stored in the stimuli memory (read from a file).
Triggered by a stimulus clock, the memory is read one vector at a time.
The stimulus clock must be synchronized with the design clock.
Encapsulate the code for applying data in a task (procedure):
− stimulus application is separated from the particular memory or design.
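The encapsulation point can be sketched in C: one routine hands out the next vector regardless of where the stimuli came from. The array-based "memory" and the names are our assumptions.

```c
/* Encapsulated stimulus application: each call returns the next vector
 * from the stimulus memory, independent of how the memory was filled
 * (file, generator, ...).  Returns -1 when the stimuli are exhausted. */
int next_vector(const unsigned *stim_mem, int size, int *index)
{
    if (*index >= size)
        return -1;                     /* no more stimuli */
    return (int)stim_mem[(*index)++];  /* apply and advance */
}
```

The caller (the stimulus-clock handler, in the slides' terms) only sees "give me the next vector", which is exactly the separation the bullet asks for.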


23

Stimulus Generation

• Asynchronous Method: Sometimes inputs are to be applied asynchronously.

− e.g. Handshaking.


24

Response Assessment

• Two parts of response assessment:

1. Monitoring the design nodes during simulation,
2. Comparing the node values with expected values.

• Absence of discrepancy could have many meanings:
− Erroneous nodes were not monitored,
− Bugs were not exercised by the input stimuli,
− There are bugs in the "expected" values that mask the real problem,
− The design is indeed free of bugs.


25

Response Assessment

• Comparison Methods:

1. Offline (post-processing):
− Node values are dumped to a file during simulation,
− then the file is processed after the simulation finishes.

2. On the fly:
− Node values are gathered during simulation and compared with expected values.


26

Response Assessment

• Offline: Design State Dumping:

In text or VCD (Value change Dump)− To be viewed by a waveform viewer.

With dumping simulation speed: 2x-20x decreased.− If simulation performance needed, avoid dumping out all

nodes.

− Must use measures to locate bugs efficiently.


27

Response Assessment: Design State Dumping

• Measures for efficient bug locating:

Scope:
− Restrict dumping to certain areas where bugs are more likely to occur.

Interval:
− Turn on dumping only within a time window.

Depth:
− Dump out only to a specified depth (from the top module).

Sampling:
− Sample signals only when they change.
− Sampling signals on each clock is not recommended because:
  1. some signal values are not caught,
  2. it is slow if some signals do not change over several clocks.


28

Response Assessment: Design State Dumping

• Scope: The range of signals to be printed out.

1. HDL scope: function, module, block.

2. User-defined scope: group of functionally similar modules.


29

Response Assessment:

• Run-Time Restriction: Dumping routines should have parameters to turn dumping on/off.

$dump_nodes(top.A, depth, dump_flag)

− Scope: module A,
− restricted down to the given depth,
− dump_flag: e.g. turn dumping on if a forbidden state is reached.


30

Response Assessment:

• Golden Response:

Visual inspection of signal traces is suitable only
− for a small number of signals,
− when the user knows where to look for clues (i.e. the scope is narrow).

For large designs, the entire set of dumped traces needs to be examined.
− Manual inspection is not feasible.

• Common Method:

Compare with a "golden response" automatically.
− Unix "diff" if in text form.
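A minimal sketch of the automatic comparison, assuming both traces are available as arrays of text lines (a stand-in for the files that would be fed to diff); the function name is ours.

```c
#include <string.h>

/* "diff"-style check between a dumped trace and the golden trace:
 * returns the index of the first mismatching line, or -1 if the
 * first n lines agree. */
int first_mismatch(const char **dump, const char **golden, int n)
{
    for (int i = 0; i < n; i++)
        if (strcmp(dump[i], golden[i]) != 0)
            return i;              /* point the engineer at the line */
    return -1;                     /* no discrepancy found */
}
```

In practice the returned line number (plus the time stamps printed in the trace) is what localizes the bug in simulation time.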


31

Response Assessment: Golden Response

• Golden Response: Can be generated directly, or by a different model of the design,
− e.g. a non-synthesizable higher-level model or a C/C++ model.

If the responses differ, there are
− bugs in the design, or
− bugs in the golden response.


32

Response Assessment: Golden Response

• Things to be printed in a golden file:

I/O ports are the minimum:
− printing every node is overkill,
− the reference model and the design usually do not have the same internal nodes.

Time stamps.
State variables (if well-defined in the reference model).


33

Response Assessment: Golden Response

• Time Window: A wider window means
− more coverage,
− more simulation time,
− more disk space.


34

Response Assessment: Golden Response

• Hard to update: There may be thousands of golden files in large designs.

Golden files may need to change
− if a bug is found in it (in the supposedly correct design),
− if the design is changed to meet other constraints (e.g. pipelining),
− if specifications are modified,
− if the printed variables (or formats) are changed.

All golden files may then need to change. A golden file may be very large:
− gigabytes are commonplace.

Maintenance problems.


35

Response Assessment: Self-Checking Codes

Dumping signals and comparing with golden files:
− Disk I/O slows down simulation by 10x.

• Self Checking:

Checking is moved into the testbench.
− Signals are monitored and compared constantly in the testbench.

Two parts:
1. Detection:
− compares monitored signals with expected values.
2. Alert:
− different severity levels trigger different actions.


36

Response Assessment: Self-Checking Codes

Generating the expected behavior can be:

1. Offline:
− A generator runs the stimuli and creates the expected behavior in a file.
− During simulation, the file is read and searched for the expected values of variables.

2. On the fly:
− A model of the design runs alongside.
− At the end of each step, the model generates and sends the expected behavior to the comparison routine.


37

Response Assessment: Self-Checking Codes

• On-the-fly Checking: Example: Multiplier

// RTL code of a multiplier
multiplier inst(.in1(mult1), .in2(mult2), .prod(product));

// behavior model of the multiplier
expected = mult1 * mult2;

// compare results
if (expected != product)
begin // alert component
  $display("ERROR: incorrect product, result = %d, …");
  $finish;
end


38

Response Assessment: Self-Checking Codes

• Good practice:

Separate checking code from design code.
− Checking code is not part of the design.
− The verification engineer should make it straightforward to remove.
− E.g. encapsulate it in a task/procedure along with other verification utility routines, in a separate file.

Use C/C++ routines to derive expected behavior.
− Use VHDL PLI (VHPI) or Verilog PLI to construct a function in HDL.


39

Response Assessment: Self-Checking Codes

• Example:

// RTL code of a multiplier
multiplier inst(.in1(mult1), .in2(mult2), .prod(product));

// behavior model of the multiplier
$multiplication(mult1, mult2, expected);

// compare results
if (expected != product)
begin // alert component
  $display("ERROR: incorrect product, result = %d, …");
  $finish;
end

void multiplication()
{
  …
  m1 = tf_getp(1); // get the first argument
  m2 = tf_getp(2); // get the second argument
  ans = m1 * m2;
  tf_putp(3, ans); // return the answer through the third argument
}
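The C side of this example can be exercised outside a simulator by stubbing tf_getp()/tf_putp() with a plain parameter array. The stubs below are hypothetical stand-ins for the real PLI 1.0 calls, kept only so the checking logic can run standalone:

```c
/* Stubbed PLI argument store: slot n holds system-task argument n.
 * tf_getp()/tf_putp() here are stand-ins, not the real PLI routines. */
static int params[4];

static int  tf_getp(int n)         { return params[n]; }
static void tf_putp(int n, int v)  { params[n] = v; }

/* Same logic as the slide's PLI routine. */
void multiplication(void)
{
    int m1 = tf_getp(1);           /* get the first argument      */
    int m2 = tf_getp(2);           /* get the second argument     */
    tf_putp(3, m1 * m2);           /* return via the third slot   */
}
```

With the real PLI, the same body runs inside the simulator whenever the RTL calls $multiplication.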


40

Response Assessment: Self-Checking Codes

• Very versatile technique: C/C++ can easily compute complex behavior that is difficult to write in Verilog/VHDL.
− E.g. Fourier transformation, encryption, simulated annealing(!), …

RTL and C/C++ are very different:
− more confidence in the verification.

• Disadvantage: Communication overhead between the languages (performance penalty).
− The simulation must pause for PLI task execution,
− and may wait a long time on data transfer.


41

Response Assessment: Self-Checking Codes

• Off-line checking:

When the expected behavior takes a long time to compute, the on-the-fly mode is not suitable.
− Store expected responses in a table or database.
− If small, it can reside in the RTL code (faster);
− otherwise, a PLI user task must be called for the RTL to access it (more portable).
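The table-based offline lookup can be sketched in C; the entry layout, the stimulus-id key, and the linear search are all illustrative assumptions:

```c
/* Off-line checking sketch: expected responses are pre-computed into a
 * table and looked up by stimulus id during simulation.  `not_found`
 * is returned when the table has no entry for that stimulus. */
struct expected_entry { unsigned stimulus_id; unsigned expected; };

unsigned lookup_expected(const struct expected_entry *tab, int n,
                         unsigned id, unsigned not_found)
{
    for (int i = 0; i < n; i++)
        if (tab[i].stimulus_id == id)
            return tab[i].expected;
    return not_found;
}
```

Whether this table lives in RTL arrays or behind a PLI task is the speed/portability trade-off the slide describes; the lookup logic is the same either way.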


42

Response Assessment: Temporal Specification

After functional behavior, timing correctness is verified.

• Two types:

Synchronous timing specification:
− timing requirements expressed in terms of a clock.

Asynchronous timing specification:
− timing requirements expressed as absolute time intervals.


43

Response Assessment: Temporal Specification

• Example:

FIGURE 4.14, PAGE 170

Synchronous: out rises when in_a and in_b are both low between the 2nd and 3rd rising clock edges.

Asynchronous: out falls when in_a and in_b are both high between 2 ns and 6 ns.


44

Response Assessment: Temporal Specification

• For interval [t1, t2]:

Checking can be done in two stages: