ARTIFICIAL NEURAL NETWORKS-moduleIII.ppt


  • 8/20/2019 ARTIFICIAL NEURAL NETWORKS-moduleIII.ppt


     ARTIFICIAL NEURAL

    NETWORKS

    MODULE-3


    Module-3.

Counter Propagation Networks: Kohonen layer - Training the Kohonen layer - Pre-initializing the weight vectors - Statistical properties - Training the Grossberg layer - Full counter propagation network - Applications


     INTRODUCTION 

    Perceptron Training.

    Back Propagation Networks.

    Self-organizing Maps & Counter Propagation.

Unsupervised Training.


Self-organized clustering may be defined as a mapping through which N-dimensional pattern space can be mapped into a smaller number of points in an output space.

The mapping is achieved autonomously, without supervision, i.e. the clustering is done in a self-organized manner.


    T%e ter" self-organizing refers to t%e a$ilit# to learn

    and organize infor"ation wit%out $eing gi!en t%e

    correct answer.

    Self-organized network perfor" unsuper!ised

    learning.


    Co"petiti!e Networks

    When more t

    han one neuron in the out put layerfires an a!!itional structure is inclu!e! in the

    network so that the net is force! to trigger only one

    neuron"

    This mechanism is terme! as competition" When one

    competition is complete! only one neuron in the

    competing group will have a non zero out put"

    The competition is base! on the ‘winner take all’  

     policy


Counter Propagation Networks.

Counter propagation is a combination of two well-known algorithms:

Kohonen's self-organizing maps.
Grossberg's outstar.

Counter propagation is a network with high representational power compared to a single-layer perceptron.

It has a high speed of training.


KOHONEN SELF-ORGANIZING MAPS

Kohonen self-organizing maps assume a topological structure among the clustering units. The Kohonen network aims at using Kohonen learning to adjust the weights so that they finally settle into a pattern.

Structure of Kohonen

There are m clustering units arranged in a linear or two-dimensional array. The inputs are arranged as n-tuples.

All inputs are given to all the neurons.

The weights of a clustering unit will serve as an exemplar of the input pattern.


The Kohonen network also follows the 'Winner Takes All' policy.

The cluster unit whose weight vector matches most closely with the input pattern is considered the 'Winner'.

The winner is usually decided based on the Euclidean distance:

D(j) = Σ (xi − wij)²
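The winner-selection rule above can be sketched in Python (a minimal illustration; the weight and input values are hypothetical):

```python
import numpy as np

def find_winner(x, W):
    """Return the index of the cluster unit whose weight vector is
    closest (smallest squared Euclidean distance) to input x.
    W has one row per cluster unit."""
    distances = np.sum((W - x) ** 2, axis=1)  # D(j) = sum_i (x_i - w_ij)^2
    return int(np.argmin(distances))

# Hypothetical example: two cluster units, three input units
W = np.array([[0.2, 0.6, 0.5],
              [0.8, 0.4, 0.7]])
x = np.array([0.5, 0.2, 0.1])
print(find_winner(x, W))  # -> 0 (unit 0 is closer to x)
```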


[Figure: cluster units arranged on a two-dimensional square grid]

Kohonen's square grid clustering unit structure


    'o%onen Training /lgorit%"

    &tep 5$ (nitialize weights set learning rate an! neighborhoo!

     parameters

    &tep 1$ While stopping con!ition is false !o the following

    steps36 to 72 "

    &tep 6$For each input vector calculate the .ucli!ean !istance"

    /3)20Σ(wij-xi)2

    &tep 8$ 9ocate the winner"

    &tep :$ A!)ust the weightswi)3new20wi)3ol!2;   (x i-wij(old))
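The five steps can be sketched as a training loop (a minimal sketch assuming a fixed learning rate, a fixed epoch count as the stopping condition, and no neighborhood shrinking):

```python
import numpy as np

def train_kohonen(X, W, alpha=0.5, epochs=10):
    """Kohonen training: for each input, locate the winning cluster
    unit (smallest squared Euclidean distance) and move its weight
    vector toward that input. W has one row per cluster unit."""
    W = W.astype(float).copy()
    for _ in range(epochs):                   # Step 2: repeat until stopping condition
        for x in X:
            d = np.sum((W - x) ** 2, axis=1)  # Step 3: distances D(j)
            j = int(np.argmin(d))             # Step 4: locate the winner
            W[j] += alpha * (x - W[j])        # Step 5: w(new) = w(old) + alpha*(x - w(old))
    return W

# Hypothetical data: four 2-D inputs forming two clusters, two cluster units
X = np.array([[0.0, 0.0], [0.1, 0.1], [1.0, 1.0], [0.9, 0.9]])
W0 = np.array([[0.3, 0.3], [0.7, 0.7]])
W = train_kohonen(X, W0)
```

After training, each weight vector has drifted toward the centre of one input cluster, i.e. it serves as an exemplar of that group.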


    *a"ple 4efer Pro$le"-56

    #onsi!er a Kohonen net with two cluster units an! five

    input units" The weight vector for the cluster units are

    w50 ="61 ="1?@


Counter Propagation Network Structure

Counter propagation networks consist of TWO layers:

Kohonen Layer
Grossberg Layer


[Figure: counter propagation network structure. Inputs 1, 2, 3 feed the Kohonen layer (K1, K2, K3) through weights w11 ... w33; the Kohonen outputs feed the Grossberg layer (G1, G2, G3).]


NORMAL OPERATION OF KOHONEN LAYER

NET = X·W
NETj = Σ xi·wij

The Kohonen neuron with the largest value of net is considered the 'winner'.

NORMAL OPERATION OF GROSSBERG LAYER

If k1, ..., kn are the Kohonen layer outputs, then the Grossberg layer net output is the weighted Kohonen layer output.


NETj = Σ ki·vij

Y = KV

where V = Grossberg layer weight matrix,
K = Kohonen layer output vector,
Y = Grossberg layer output vector.
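In matrix form, a winner-takes-all (one-hot) Kohonen output means Y = KV simply selects one row of the Grossberg weight matrix (a sketch with hypothetical numbers):

```python
import numpy as np

# Hypothetical Grossberg weight matrix V: one row per Kohonen unit
V = np.array([[0.9, 0.1],
              [0.2, 0.8],
              [0.5, 0.5]])

# Kohonen layer output after winner-takes-all: only unit 1 fires
K = np.array([0.0, 1.0, 0.0])

# Y = K V, i.e. NET_j = sum_i k_i * v_ij
Y = K @ V
print(Y)  # -> [0.2 0.8], row 1 of V
```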


Preprocessing of the input vectors

Kohonen Training (Physical Interpretation)

xi' = xi / √(x1² + x2² + ... + xn²)

Need for preprocessing.

Example: x1 = […]
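The normalization above, dividing each component by the vector's Euclidean length, can be sketched as:

```python
import numpy as np

def normalize(x):
    """Scale x to unit length: x_i' = x_i / sqrt(x_1^2 + ... + x_n^2)."""
    norm = np.sqrt(np.sum(x ** 2))
    return x / norm

x = np.array([3.0, 4.0])
print(normalize(x))  # -> [0.6 0.8], a unit vector
```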


[Figure: representation of input vectors before and after normalization]


[Figure: two-dimensional unit vectors on the unit circle]


[Figure: the winning weight vector Wold moves toward the input Xi by a fraction α of the difference (Xi − Wold), giving Wnew]

Training process of Kohonen Layer weights


Pre-initialization of weight vectors

All the weight vectors are to be set to initial values before training starts.

Initial values are randomly selected, and small values are chosen.

For Kohonen, the initial training vectors should be normalized.

The weight vectors must end up equal to normalized input vectors.

Pre-normalization will shorten the training process.


    Pro$le"s wit% rando"izing 'o%onen la#er weig%ts

    (t will uniformly !istribute weight vectors aroun! the

    hypersphere"

    Iost of the input vectors are groupe! an!

    concentrate! at relatively in small area"

    Nence most of the Kohonen neuron may be waste!

    !ue to zero output"

    The remaining weights may be too few in number

    to categorize the input into groups"


The most desirable solution is to distribute the weight vectors according to the density of the input vectors that must be separated. This is impractical to implement.

Method-1 (convex combination method)

Set all weights to the same value 1/√n, where n is the number of components of the input vectors (hence of the weight vectors). All the inputs xi are given a value equal to

xi' = α·xi + (1 − α)/√n

where n is the number of inputs, and α is gradually increased from 0 toward 1 during training.
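A sketch of the convex combination blending (the α values shown are illustrative assumptions, not from the source):

```python
import numpy as np

def convex_combination_input(x, alpha):
    """Blend the true input with the uniform vector 1/sqrt(n):
    x_i' = alpha*x_i + (1 - alpha)/sqrt(n).
    At alpha = 0 every component equals the common initial weight
    value 1/sqrt(n); as alpha -> 1 the true input is restored."""
    n = len(x)
    return alpha * x + (1.0 - alpha) / np.sqrt(n)

x = np.array([1.0, 0.0, 0.0, 0.0])
print(convex_combination_input(x, 0.0))  # all components 0.5 (= 1/sqrt(4))
print(convex_combination_input(x, 1.0))  # the original input
```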


    /ssign"ent :No.84/6

    .*plain the nee! of initialization of weight matri* inKohonen layerO

    What are the !ifferent metho!s use!O

    Statistical propert# of T%e trained Network 

    Kohonen network has a useful an! interesting ability to

    e*tract statistical properties of the input !ata set"

    (t is shown by Kohonen that probability of ran!omly selecte!

    input vectors closest to any given weight vector is 5kwhere

    k is the number of Kohonen neuron"


Training of Grossberg layer

Grossberg training is supervised training.

An input vector is applied from the output of the Kohonen layer.

The output from the Grossberg layer is calculated and compared with the desired output.

The amount of weight adjustment is proportional to this difference:

vij(new) = vij(old) + β(yj − vij(old))·ki

ki = output from the Kohonen layer.
yj = desired output component.
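Because ki = 0 for all but the winning Kohonen unit, the update moves only the winner's fan-out weights toward the desired output (a minimal sketch with hypothetical values):

```python
import numpy as np

def grossberg_update(V, k, y, beta=0.1):
    """v_ij(new) = v_ij(old) + beta*(y_j - v_ij(old))*k_i.
    With a one-hot Kohonen output k, only the winning unit's row
    of V moves toward the desired output y."""
    V = V.astype(float).copy()
    for i, ki in enumerate(k):
        if ki != 0:
            V[i] += beta * (y - V[i]) * ki
    return V

V = np.array([[0.0, 0.0],
              [0.5, 0.5]])
k = np.array([0.0, 1.0])   # Kohonen unit 1 won
y = np.array([1.0, 0.0])   # desired output
print(grossberg_update(V, k, y, beta=0.5))  # only row 1 changes
```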


The unsupervised Kohonen layer produces outputs at indeterminate positions.

These are mapped into the desired outputs by the Grossberg layer.


CPN

Forward Counter Propagation
Full Counter Propagation


FULL COUNTER PROPAGATION NETWORK

[Figure: an x-input layer (x1, x2, x3) and a y-input layer (y1 ... yn) feed the cluster units z1 ... zn, which in turn drive a y*-output layer and an x*-output layer through the weight matrices W and T.]


The major aim of a full counter propagation network is to provide an efficient means of representing a large number of vector pairs x:y by adaptively constructing a look-up table.

It produces an approximation of x:y

with x alone,
with y alone,
with x:y.


It uses the 'Winner Takes All' policy.

Vectors are normalized.

The learning algorithms for the Kohonen layer are

vij(new) = vij(old) + α(xi − vij(old))
wij(new) = wij(old) + β(yj − wij(old))

The learning algorithms for the Grossberg layer are

tij(new) = tij(old) + a(yj − tij(old))·ki
uij(new) = uij(old) + b(xj − uij(old))·ki
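One combined training step under these four update rules might look like the sketch below. The variable names follow the equations above; the learning rates and data values are hypothetical, and for simplicity the winning unit's output is taken as ki = 1:

```python
import numpy as np

def full_cpn_step(x, y, V, W, T, U, alpha=0.3, beta=0.3, a=0.1, b=0.1):
    """One full-CPN step: find the winning cluster unit using both
    x and y, move its Kohonen-phase weights (V, W) toward the inputs,
    then move its Grossberg-phase weights (T, U) toward the targets."""
    d = np.sum((V - x) ** 2, axis=1) + np.sum((W - y) ** 2, axis=1)
    j = int(np.argmin(d))           # winner-takes-all
    V[j] += alpha * (x - V[j])      # v(new) = v(old) + alpha*(x - v(old))
    W[j] += beta * (y - W[j])       # w(new) = w(old) + beta*(y - w(old))
    T[j] += a * (y - T[j])          # t(new) = t(old) + a*(y - t(old)), k_j = 1
    U[j] += b * (x - U[j])          # u(new) = u(old) + b*(x - u(old)), k_j = 1
    return j

# Hypothetical setup: 2 cluster units, 2-D x vectors, 1-D y vectors
V = np.array([[0.2, 0.2], [0.8, 0.8]])
W = np.array([[0.2], [0.8]])
T = np.zeros((2, 1))
U = np.zeros((2, 2))
j = full_cpn_step(np.array([0.9, 0.9]), np.array([0.7]), V, W, T, U)
```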


A full counter propagation network to implement the function y = 1/x.


Applications of CPN

Vector Mapping
Data Compression
Image Compression


A CPN can be used to compress data before transmission.

The image to be transmitted is first divided into sub-images. Each sub-image is further classified into pixels.

Each pixel represents either one (light) or zero (dark).

If there are n pixels, n bits are required to transmit the sub-image.

If some distortion can be tolerated, fewer bits are sufficient for transmission.

A CPN can perform vector quantization.

Only one neuron of the Kohonen layer output becomes 1.

The Grossberg layer will generate a code for that neuron, and this code is transmitted.

At the receiving end an identical CPN accepts the binary code and produces the inverse function.
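Stripped to its essentials, this compression scheme is a codebook lookup: the sender transmits only the index of the winning unit, and the receiver reconstructs an approximate sub-image from a shared table (a toy sketch of the idea, not the CPN training itself; the codebook values are hypothetical):

```python
import numpy as np

# Hypothetical codebook learned by the Kohonen layer:
# each row is an exemplar 4-pixel sub-image
codebook = np.array([[0, 0, 0, 0],
                     [1, 1, 0, 0],
                     [1, 1, 1, 1]], dtype=float)

def encode(sub_image):
    """Sender: transmit only the index of the closest exemplar."""
    d = np.sum((codebook - sub_image) ** 2, axis=1)
    return int(np.argmin(d))

def decode(index):
    """Receiver: reconstruct the sub-image from the shared codebook."""
    return codebook[index]

# A noisy sub-image is reduced to a single small integer code
code = encode(np.array([0.9, 1.1, 0.1, 0.0]))
print(code, decode(code))
```

Instead of n pixel values, only one index is sent, which is where the bit savings come from when some distortion is tolerable.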
