LECTURE B 4 FSM Encoding Intro


    FSM (Finite State Machine) Optimization

    State tables
      -> State minimization: identify and remove equivalent states
      -> State assignment: assign a unique binary code to each state;
         use the unassigned state codes as don't-cares
      -> Combinational logic optimization
      -> net-list


    Sequential Circuits

    Primitive sequential elements
    Combinational logic

    Models for representing sequential circuits
      finite-state machines (Moore and Mealy)
      representation of memory (states)
      changes in state (transitions)

    Basic sequential circuits
      shift registers
      counters

    Design procedure
      state diagrams
      state transition table
      next-state functions


    State Assignment

    Choose bit vectors to assign to each symbolic state

    With n state bits for m states there are 2^n! / (2^n - m)! possible state
    assignments, where n >= ceil(log2 m).
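    As a quick check of this count, a minimal sketch (plain Python; the function
    name is just illustrative) that evaluates 2^n! / (2^n - m)! for a given
    number of states:

        from math import ceil, log2, perm

        def num_state_assignments(m, n=None):
            """Number of ways to assign distinct n-bit codes to m symbolic states."""
            if n is None:
                n = ceil(log2(m))      # minimum-bit encoding
            codes = 2 ** n
            return n, perm(codes, m)   # = codes! / (codes - m)!

        # Example: 5 states with the minimum 3 state bits -> 8!/3! = 6720 assignments
        print(num_state_assignments(5))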


    One-hot State Assignment

    Simple
      easy to encode and debug

    Small logic functions
      each state function requires only predecessor state bits as input

    Good for programmable devices
      lots of flip-flops readily available
      simple functions with small support (the signals they depend on)

    Impractical for large machines
      too many states require too many flip-flops
      decompose FSMs into smaller pieces that can be one-hot encoded

    Many slight variations to one-hot (e.g. two-hot)
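    A minimal sketch (plain Python; names are illustrative) of generating a
    one-hot code table, one flip-flop per state:

        def one_hot_codes(states):
            """Assign each symbolic state a code with exactly one bit set."""
            n = len(states)
            return {s: format(1 << i, f"0{n}b") for i, s in enumerate(states)}

        # Example: four states -> four flip-flops, exactly one asserted per state
        print(one_hot_codes(["HG", "HY", "FG", "FY"]))
        # {'HG': '0001', 'HY': '0010', 'FG': '0100', 'FY': '1000'}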


    Heuristics for State Assignment

    Give adjacent codes to states that share a common next state
      (group the 1's in the next-state map)

    Give adjacent codes to states that share a common ancestor state
      (group the 1's in the next-state map)

    Give adjacent codes to states that have common output behavior
      (group the 1's in the output map)


    General Approach to Heuristic State Assignment

    All current methods are variants of this:
      1) determine which states attract each other (weighted pairs)
      2) generate constraints on codes (which states should lie in the same cube)
      3) place codes on the Boolean cube so as to maximize the constraints
         satisfied (weighted sum)

    Different weights make sense depending on whether we are optimizing for
    two-level or multi-level forms.

    We cannot consider all possible embeddings of state clusters in the Boolean
    cube, so heuristics are used: ordering the embedding, pruning the search for
    the best embedding, and expanding the cube (more state bits) to satisfy more
    constraints.  A brute-force version of step 3 for a tiny machine is sketched
    below.
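    A minimal brute-force sketch of step 3 (plain Python; the states, constraint
    groups, and weights in the example are made up).  An assignment satisfies a
    group constraint when the smallest cube spanning the group's codes contains
    no code of a state outside the group:

        from itertools import permutations

        def face(codes):
            """Smallest cube (face) containing all equal-length binary codes."""
            return "".join(b[0] if len(set(b)) == 1 else "*" for b in zip(*codes))

        def satisfied_weight(assign, constraints):
            """Weighted sum of face constraints that hold under this assignment."""
            total = 0
            for group, weight in constraints:
                cube = face([assign[s] for s in group])
                outside = [s for s in assign if s not in group and
                           all(c in ("*", b) for b, c in zip(assign[s], cube))]
                if not outside:
                    total += weight
            return total

        def best_embedding(states, n_bits, constraints):
            """Exhaustively place codes on the Boolean n-cube (tiny FSMs only)."""
            codes = [format(i, f"0{n_bits}b") for i in range(2 ** n_bits)]
            return max((dict(zip(states, p))
                        for p in permutations(codes, len(states))),
                       key=lambda a: satisfied_weight(a, constraints))

        # Toy example: prefer {S1,S2} on one face (weight 2) and {S2,S3} on another
        print(best_embedding(["S1", "S2", "S3", "S4"], 2,
                             [({"S1", "S2"}, 2), ({"S2", "S3"}, 1)]))

    Real tools replace the exhaustive search with the ordering and pruning
    heuristics mentioned above.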


    Output-Based Encoding

    Reuse the outputs as state bits: use the outputs to help distinguish states.
    Why create new functions for the state bits when the outputs can serve as well?

    Fits in nicely with synchronous Mealy implementations.


    HG = ST' H1' H0' F1 F0'  +  ST H1 H0' F1' F0
    HY = ST  H1' H0' F1 F0'  +  ST' H1' H0 F1 F0'
    FG = ST  H1' H0  F1 F0'  +  ST' H1 H0' F1' F0'
    FY = ST  H1  H0' F1' F0' +  ST' H1 H0' F1' F0

    Output patterns are unique to states, so we do not need ANY state bits:
    implement 5 functions (one for each output) instead of 7 (5 outputs plus
    2 state bits).

    Inputs     Present state  Next state  Outputs (ST H1H0 F1F0)
    C TL TS
    0  -  -    HG             HG          0 00 10
    -  0  -    HG             HG          0 00 10
    1  1  -    HG             HY          1 00 10
    -  -  0    HY             HY          0 01 10
    -  -  1    HY             FG          1 01 10
    1  0  -    FG             FG          0 10 00
    0  -  -    FG             FY          1 10 00
    -  1  -    FG             FY          1 10 00
    -  -  0    FY             FY          0 10 01
    -  -  1    FY             HG          1 10 01

    Example of KISS Format
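    The KISS body on the original slide is not fully legible here; the sketch
    below shows the general shape of a KISS2 description (the format used by
    tools such as SIS and NOVA) for the traffic-light FSM above, assuming input
    order C TL TS and output order ST H1 H0 F1 F0:

        .i 3
        .o 5
        .s 4
        .p 10
        .r HG
        0-- HG HG 00010
        -0- HG HG 00010
        11- HG HY 10010
        --0 HY HY 00110
        --1 HY FG 10110
        10- FG FG 01000
        0-- FG FY 11000
        -1- FG FY 11000
        --0 FY FY 01001
        --1 FY HG 11001
        .e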



    Current State Assignment Approaches

    For tight encodings using close to the minimum number of state bits
      best of 10 random assignments seems to be adequate (averages as well as
      heuristics)
      heuristic approaches are not even close to optimality
      used in custom chip design

    One-hot encoding
      easy for small state machines
      generates small equations with easy-to-estimate complexity
      common in FPGAs and other programmable logic

    Output-based encoding
      ad hoc - no tools
      most common approach taken by human designers
      yields very small circuits for most FSMs


    State Assignment: Various Methods

    Assign a unique code to each state to produce a logic-level description;
    utilize the unassigned codes effectively as don't-cares.

    Choices for an S-state machine
      minimum-bit encoding: ceil(log2 S) bits
      maximum-bit encoding: one-hot encoding, using one bit per state
      something in between

    Modern techniques
      hypercube embedding of face constraints derived for collections of states
      (KISS, NOVA)
      adjacency embedding guided by weights derived between state pairs (Mustang)


    Hypercube Embedding Technique

    Observation: one-hot encoding is the easiest to decode.

    Am I in state 2, 5, 12, or 17?
      binary:  x4' x3' x2' x1 x0' (00010) +
               x4' x3' x2 x1' x0  (00101) +
               x4' x3 x2 x1' x0'  (01100) +
               x4 x3' x2' x1' x0  (10001)
      one-hot: x2 + x5 + x12 + x17
    But one-hot uses too many flip-flops.

    Exploit this observation:
      1. two-level minimization after one-hot encoding identifies useful state
         groups for decoding
      2. assigning the states in each group to a single face of the hypercube
         allows a single product term to decode the group of states


    FSM Optimization

    [Figure: a four-state machine (S1-S4) with input-labeled transitions such as
    01, -0, 00, 10, -1, 11, and the generic FSM structure: a combinational logic
    block with primary inputs (PI), primary outputs (PO), present-state (PS) and
    next-state (NS) signals.]


    State Group Identification

    Example state machine:

    input  current state  next state  output
    0      start          S6          00
    0      S2             S5          00
    0      S3             S5          00
    0      S4             S6          00
    0      S5             start       10
    0      S6             start       01
    0      S7             S5          00
    1      start          S4          01
    1      S2             S3          10
    1      S3             S7          10
    1      S4             S6          10
    1      S5             S2          00
    1      S6             S2          00
    1      S7             S6          00

    Symbolic implicant: represents a transition from one or more states to a
    next state under some input condition.


    Representation of Symbolic Implicants

    The symbolic cover representation is related to multiple-valued logic.

    Positional cube notation: a p-valued logic variable is represented by p bits
    (v1, v2, ..., vp).

    Example, for a 5-valued variable:
      the single value v = 4 is written            (00010)
      a set of values, e.g. v = 2 or v = 4, is     (01010)
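    A minimal sketch (plain Python; the helper name is illustrative) of the
    positional cube encoding used above, with bit i (1-indexed, leftmost first)
    set exactly when value i is in the set:

        def positional_cube(values, p):
            """Encode a set of values of a p-valued variable as a p-bit string."""
            return "".join("1" if v in values else "0" for v in range(1, p + 1))

        print(positional_cube({4}, 5))      # 00010  (the single value 4)
        print(positional_cube({2, 4}, 5))   # 01010  (value 2 or value 4)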


    Minimization of Multi-valued Logic

    Find a minimum multiple-valued-input cover (e.g. with espresso).

    Example: a minimal multiple-valued-input cover

    0  0110001  0000100  00
    0  1001000  0000010  00
    1  0001001  0000010  10


    State Group

    Consider the first symbolic implicant:

    0  0110001  0000100  00

    This implicant shows that input 0 maps state 2, state 3, or state 7 into
    state 5 and asserts output 00.

    This example shows that the effect of symbolic logic minimization is to
    group together the states that are mapped by some input into the same
    next state and assert the same output.  We call such a set of states a
    state group: if we give the states in the group adjacent binary codes and
    no other state's code lies on the group face, then the state group can be
    decoded by a single cube (product term).


    Group Face

    Group face: the minimal-dimension subspace containing the encodings
    assigned to that group.

    Example:  codes 0010, 0100, 0110  ->  group face 0**0
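    A minimal sketch (plain Python; the function name is illustrative) of the
    group-face computation, the same spanning-cube calculation used in the
    earlier embedding sketch:

        def group_face(codes):
            """Minimal-dimension subspace (cube) containing the given codes."""
            return "".join(col[0] if len(set(col)) == 1 else "*"
                           for col in zip(*codes))

        print(group_face(["0010", "0100", "0110"]))   # 0**0, as in the example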


    Hypercube Embedding

    [Figure: embedding the state groups {2,5,12,17} and {2,6,17} on a cube;
    one of the placements shown is marked "wrong!".]


    Hypercube Embedding

    Advantages:
      uses a two-level logic minimizer to identify good state groups
      keeps almost all of the advantage of one-hot encoding, but with fewer
      state bits


    Adjacency-Based State Assignment

    Basic algorithm:
      (1) Assign a weight w(s,t) to each pair of states
          the weight reflects the desire to place the states adjacent on the
          hypercube
      (2) Define a cost function for an assignment of codes to the states
          penalize weights by the distance between the state codes,
          e.g. w(s,t) * distance(enc(s), enc(t))
      (3) Find an assignment of codes which minimizes this cost function summed
          over all pairs of states
          a heuristic finds an initial solution; pair-wise interchange (or
          simulated annealing) improves it

    A small sketch of steps (2) and (3) follows.
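    A minimal sketch (plain Python; the states, codes, and weights in the
    example are made up) of the cost function and a greedy pair-wise
    interchange pass:

        from itertools import combinations

        def hamming(a, b):
            return sum(x != y for x, y in zip(a, b))

        def cost(assign, weights):
            """Sum of w(s,t) * distance(enc(s), enc(t)) over weighted state pairs."""
            return sum(w * hamming(assign[s], assign[t])
                       for (s, t), w in weights.items())

        def pairwise_interchange(assign, weights):
            """Swap two state codes whenever the swap lowers the cost."""
            improved = True
            while improved:
                improved = False
                for s, t in combinations(list(assign), 2):
                    before = cost(assign, weights)
                    assign[s], assign[t] = assign[t], assign[s]
                    if cost(assign, weights) < before:
                        improved = True
                    else:
                        assign[s], assign[t] = assign[t], assign[s]   # undo
            return assign

        # Toy example: heavily weighted pairs end up on adjacent (distance-1) codes
        weights = {("S1", "S2"): 4, ("S2", "S3"): 2, ("S1", "S4"): 1}
        assign = {"S1": "00", "S2": "11", "S3": "01", "S4": "10"}
        print(pairwise_interchange(assign, weights))

    This only reaches a local minimum; simulated annealing accepts some
    cost-increasing swaps in order to escape such minima.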


    Adjacency-Based State Assignment

    Mustang: a weight assignment technique based on loosely maximizing common
    cube factors.


    How to Assign Weights to State Pairs

    Assign weights to state pairs based on the ability to extract a common-cube
    factor if the two states are adjacent on the hypercube.


    Fanout-Oriented (examine present-state pairs)

    A present-state pair asserts the same output.

    $$$ S1 S2 $$$1$
    $$$ S3 S4 $$$1$

    [Figure: transitions S1 -> S2 and S3 -> S4, each asserting output j]

    Add 1 to w(S1,S3) because of output j.


    Fanin-Oriented (examine next-state pairs)

    The same present state causes transitions to a next-state pair.

    $$$ S1 S2 $$$$
    $$$ S1 S4 $$$$

    [Figure: transitions S1 -> S2 and S1 -> S4]

    Add n/2 to w(S2,S4) because of S1 (n: number of state-encoding bits).


    Fanin-Oriented (examine next-state pairs)

    The same input causes transitions to a next-state pair.

    $0$ S1 S2 $$$$
    $0$ S3 S4 $$$$

    [Figure: transitions S1 -> S2 and S3 -> S4, both under the same input i]

    Add 1 to w(S2,S4) because of input i.
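    A loose sketch of these three weight rules (plain Python; not the exact
    Mustang weighting, and the transition tuples in the example are made up).
    Each transition is (input, present state, next state, output bits):

        from collections import defaultdict
        from itertools import combinations

        def adjacency_weights(transitions, n_bits):
            """Accumulate state-pair weights from the fanout/fanin rules above."""
            w = defaultdict(float)
            pair = lambda a, b: tuple(sorted((a, b)))
            for (i1, p1, n1, o1), (i2, p2, n2, o2) in combinations(transitions, 2):
                if p1 != p2:
                    # fanout-oriented: present states asserting the same output bit
                    w[pair(p1, p2)] += sum(a == "1" and b == "1"
                                           for a, b in zip(o1, o2))
                if n1 != n2:
                    if p1 == p2:
                        # fanin-oriented: same present state feeds both next states
                        w[pair(n1, n2)] += n_bits / 2
                    if i1 == i2:
                        # fanin-oriented: same input feeds both next states
                        w[pair(n1, n2)] += 1
            return dict(w)

        # Toy fragment: two transitions asserting the same output bit j
        print(adjacency_weights([("0", "S1", "S2", "01"),
                                 ("0", "S3", "S4", "01")], n_bits=2))
        # {('S1', 'S3'): 1.0, ('S2', 'S4'): 1.0}

    These weights then feed the distance-weighted cost function sketched earlier.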


    Which Method Is Better?

    FSMs with no useful two-level face constraints   =>  adjacency embedding
    FSMs with many two-level face constraints        =>  face embedding


    Summary

    Models for representing sequential circuits
      abstraction of sequential elements
      finite state machines and their state diagrams
      inputs/outputs
      Mealy, Moore, and synchronous Mealy machines

    Finite state machine design procedure
      deriving the state diagram
      deriving the state transition table
      determining next-state and output functions
      implementing the combinational logic

    Implementation of sequential logic
      state minimization
      state assignment
      support in programmable logic devices


    Some Tools

    History: combinational logic -> single FSM -> hierarchy of FSMs

    MISII: combinational logic optimization
    SIS:   sequential circuit optimization (single machine);
           facilities for handling latches
    VIS:   handles hierarchy; sequential circuit partitioning;
           facilities for managing networks of FSMs