
Genetic Programming of Logic-Based Neural Networks

Vincent Charles Gaudet
Department of Electrical and Computer Engineering

University of Manitoba
Winnipeg, Manitoba, Canada R3T 5V6

e-mail: [email protected]

March 1995


Abstract

Genetic algorithms and genetic programming are optimization methods in which

potential solutions evolve via operators such as selection, crossover and mutation.

Logic-Based Neural Networks are a variation of artificial neural networks which fill

the gap between distributed, unstructured neural networks and symbolic programming.

In this thesis, the Genetic Programming Paradigm is modified in order to obtain

Logic-Based Neural Networks. Modifications include connection weights on the parse

trees, a new mutation operator, a new crossover operator, and a new method for randomly

generating individuals. The algorithm is part of a two-level development process where, at

first, satisfactory logic-based neural networks are obtained using our algorithm; then,

gradient-based learning methods are used to refine the networks. Results are obtained for a

6-input Logic-Based Neural Network problem.


Acknowledgements

Support from the Natural Sciences and Engineering Research Council and

MICRONET is gratefully acknowledged.

Also, I would like to thank Professor Witold Pedrycz for the numerous hours spent

discussing genetic programming and neural networks and for the advice and assistance he

has given me over my years of collaboration with him.


Table of Contents

Abstract
Acknowledgements
Table of Contents
List of Figures
List of Tables
1.0 Introduction
2.0 Description of Genetic Algorithms
   2.1 General Description
   2.2 Description of Coding
   2.3 Description of Operators
   2.4 Example Problem
   2.5 Non-Randomness of Genetic Algorithms
   2.6 Modifications to the Simple Genetic Algorithm
3.0 Description of Genetic Programming
   3.1 Genetic Programming: Data Structure
   3.2 Crossover
   3.3 Mutation
   3.4 Summary of Differences Between the GA and the GPP
4.0 Description of Logic-Based Neural Networks
   4.1 The AND Neuron
   4.2 The OR Neuron
   4.3 Connection Weights
   4.4 Example Logic-Based Neural Network
5.0 Description of the GPP for LNNs
   5.1 Coding
   5.2 Genetic Mechanisms
   5.3 Fitness Function
6.0 Using the C Program for the GPP
   6.1 Changing Parameters
   6.2 Description of Constants
   6.3 The Input File
   6.4 Changing the Size of the Problem
7.0 Experimental Studies
   7.1 Description of Tests
   7.2 Results
8.0 Conclusions
9.0 References
Appendix 1.0: Data Set Used to Test Algorithm
Appendix 2.0: C Program Code


List of Figures

Figure 2.1: Multivariable Chromosome
Figure 2.2: Roulette Wheel with Population of Individuals
Figure 2.3: One Point Crossover on Bit Strings
Figure 2.4: Mutation on a Bit String
Figure 2.5: Illustration of the Permutation Operator
Figure 3.1: Example of Parse Tree for Genetic Programming
Figure 3.2: Two Trees Before Crossover
Figure 3.3: Two Trees After Crossover
Figure 3.4: Tree Before and After Mutation
Figure 3.5: Comparison Between GA and GPP
Figure 4.1: Graphical Representation of the AND Function
Figure 4.2: Graphical Representation of the OR Function
Figure 4.3: Example Logic-Based Neural Network
Figure 5.1: Adjacency Between Weights
Figure 5.2: Other Potential Adjacency Configuration
Figure 7.1: Best Obtained Logic-Based Neural Network
Figure 7.2: Average Fitness of Population by Generation
Figure 7.3: Average Fitness of Best Individual by Generation


List of Tables

Table 2.1: Individuals in Initial Population
Table 2.2: Individuals in Second Population
Table 4.1: 2-Input AND Function
Table 4.2: 2-Input OR Function
Table 5.1: Comparison Between Generic GPP and GPP for LNNs
Table 6.1: Example 2-Input Problem Outputs
Table 7.1: Description of Three Parameter Sets
Table 7.2: Description of Weight Intervals
Table 7.3: Probabilities of Nodes Getting Selected at Given Levels
Table 7.4: Best Individuals from Each Run


1.0 Introduction

Genetic algorithms [7][8] and genetic programming [10] are both optimization

methods; populations of potential solutions to a problem evolve from generation to

generation through genetics-based operators such as selection, crossover and mutation.

Genetic algorithms tackle problems which are numerical in nature [1][3][5][6], such

as determining weights for standard artificial neural networks, whereas the “Genetic

Programming Paradigm” (GPP) [10] tackles more symbolic-oriented problems such as

Boolean function learning and game playing strategy learning.

Logic-Based Neural Networks (LNN), defined in [12], arise from the gap between

well-structured symbolic programming methods and less structured, more distributed neural

networks. While the lack of rigid structure in neural networks permits incredible learning

capabilities at the numerical level [11], it also makes it harder for the user to understand

what is going on within the network at the logical level. In symbolic programming, on the

other hand, knowledge is very explicitly represented, but the learning capabilities are not

great. LNNs have the general structure of neural networks, with neurons and weighted

connections, but the neurons are based on fuzzy-set theory, thus incorporating a more

symbolic structure within the network.

In this thesis we introduce modifications to the GPP that make it possible for the

GPP to obtain LNNs [4]. The modifications include a new, more general data structure

which is very suitable to LNNs; also, new operators that are suitable to LNNs are

developed.

The LNNs obtained using the GPP method are starting architectures from which

more refined, gradient-based learning is performed using LNN-adapted learning methods.

Thus, for our version of the GPP, we are concentrating on two-level semi-parametric

development of LNNs, rather than on one-level parametric learning.

Note that the class of problems tackled here by our version of the GPP is more


specific as opposed to the wider class of problems tackled by Koza’s original version. In

particular, we have developed an algorithm whose data structure can include weighted

connections; also, the problems tackled by our algorithm have a more layered structure.

The material is organized in sections. Sections 2.0 and 3.0 describe the simple

genetic algorithm and the Genetic Programming Paradigm, respectively. Section 4.0 deals

with Logic-Based Neural Networks. Then, in Section 5.0, we introduce modifications to the

GPP which make it possible to obtain LNNs. Section 6.0 describes the C implementation

of our algorithm. Finally, Section 7.0 presents results of numerous tests performed using a

6-input LNN.

We now proceed with a concise description of genetic algorithms followed by a

description of genetic programming.


2.0 Description of Genetic Algorithms

2.1 General Description

The simple genetic algorithm (GA) is an optimization method, where potential

optimal solutions are usually coded in the form of bit strings [7].

In contrast with other optimization methods, such as gradient-based search or simulated

annealing, genetic algorithms operate on a “population” of potential solutions to the given

problem, and allow greater modifications of the individuals from one iteration or generation

to the other. Thus, a wider coverage of the solution space is possible while maintaining a

reasonable number of potential solutions.

The GA creates each new generation of the population from the previous one, in an

attempt to improve performance and tend towards an optimum value. It does this

through operators called “selection”, “crossover”

and “mutation”.

For each string in the population, there is an associated fitness value that describes

how “good” that particular string is or how well it solves the given problem.

The selection operator determines which members of the population will be retained,

intact or modified, for the next generation. This is based on fitness values; strings (or

individuals) with higher fitness values are assigned higher probabilities of getting retained

for the next generation, compared to those with lower fitness values. From two selected

strings, the crossover operator creates two new strings by cutting the input strings at the

same point and switching their tail ends. The cut-point is chosen randomly. Finally, the

mutation operator changes the value of a randomly chosen bit in a selected bit string.

2.2 Description of Coding

Usually, potential solutions are coded as bit strings.


For numerical optimization problems, the bit string is a binary representation of the

solution. For example, suppose we are attempting to optimize a function, f(x), over an

interval x ∈ [a, b). Then, we would divide the interval [a, b) into 2^n evenly distributed

potential solutions, where n is the length of the bit string. The bit strings have

corresponding integer values, V, in the interval [0, 2^n − 1]. The following transformation is

used to determine the actual value of x from the value of V:

x = a + V (b − a) / 2^n
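As a concrete illustration, the transformation above can be sketched in C. The function name `decode` and the use of an unsigned integer for V are our own choices, not taken from the thesis program.

```c
/* Decode the integer value V of an n-bit string into a real number
   x = a + V * (b - a) / 2^n, mapping [0, 2^n - 1] onto [a, b).
   Illustrative sketch; assumes n < 32. */
double decode(unsigned long V, int n, double a, double b)
{
    return a + (double)V * (b - a) / (double)(1UL << n);
}
```

For example, with a = 0, b = 32 and n = 5, the value V = 31 decodes to x = 31.0.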

In most cases, however, the function to be optimized is a multivariable function, f(x1,

x2, ..., xm). In such a case, the bit string must contain all values of xi. What is usually done

is to code the variables sequentially, one after the other, as depicted in Figure 2.1.

Figure 2.1: Multivariable Chromosome

x1 | x2 | x3 | ... | xm−2 | xm−1 | xm

Some non-numerical problems, such as developing classifier systems, are optimized

using genetic algorithms [8]. In such cases, each bit can represent a particular attribute; if

the bit is a one (1), then the attribute is present, and vice versa.

2.3 Description of Operators

The selection operator selects individuals from the current population which will be

retained, intact or modified, for the next population. Selection of individuals is based on

fitness values, where highly fit individuals have a higher probability of survival than less

highly fit individuals (survival of the fittest).

A common analogy to demonstrate the selection operator is to represent the


population of individuals as a biased roulette wheel, as depicted in Figure 2.2, where each

individual is assigned a slice proportional to its fitness, as compared to the total fitness of

the population. When the GA selects an individual, it “spins” the wheel and selects the

winning individual.
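This mechanism can be sketched in C as follows; the random spin r is passed in as an argument so the function itself stays deterministic, and the name `roulette_select` is ours.

```c
/* Fitness-proportionate ("roulette wheel") selection: each individual i
   owns a slice of width fitness[i] on a wheel whose circumference is the
   total population fitness; a spin r in [0, total) picks the winner. */
int roulette_select(const double *fitness, int pop_size, double r)
{
    double cumulative = 0.0;
    for (int i = 0; i < pop_size; i++) {
        cumulative += fitness[i];
        if (r < cumulative)
            return i;
    }
    return pop_size - 1; /* guard against floating-point rounding at the edge */
}
```

With fitnesses {276, 640, 51, 667} (Table 2.1), individual A owns the slice [0, 276), B owns [276, 916), and so on; a spin of r = 300 therefore selects B.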

Figure 2.2: Roulette Wheel with Population of Individuals

The crossover operator chooses two selected individuals and from them creates two

new individuals for the new population. This is done by randomly selecting a crossover

point at which the two bit strings are cut; then, the two tails are switched from one

individual to the other, and the bit strings are reassembled. An example crossover is shown

in Figure 2.3.
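A minimal C sketch of one-point crossover, with each bit string stored in an unsigned integer and the cut expressed as the number of tail bits, so cutting a 5-bit string after its third bit means cut = 2. The representation and names are ours.

```c
/* Exchange the tails (the `cut` rightmost bits) of two bit strings. */
void crossover(unsigned *a, unsigned *b, int cut)
{
    unsigned tail_mask = (1U << cut) - 1U;
    unsigned tail_a = *a & tail_mask;
    unsigned tail_b = *b & tail_mask;
    *a = (*a & ~tail_mask) | tail_b;  /* keep head of a, take tail of b */
    *b = (*b & ~tail_mask) | tail_a;
}
```

Crossing 11001 (25) and 01011 (11) with a two-bit tail yields 11011 (27) and 01001 (9), the example worked through in Section 2.4.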


Figure 2.3: One Point Crossover on Bit Strings
(Individuals A and B are cut at the same randomly chosen crossover point; exchanging their tails produces individuals A' and B'.)

Crossover is likely the most important of the GA’s “modification” operators

(crossover and mutation) since it creates new individuals that are quite often very different

from the original individuals. One would hope that the qualities which make one of the

original individuals highly fit will remain part of one of the new individuals, and that it will

mix with some of the qualities that made the other original individual fit; this would

potentially lead to an even better individual.

A less important genetic operator is the mutation operator which randomly selects a

bit on a bit string and flips it, as depicted in Figure 2.4.


Figure 2.4: Mutation on a Bit String

Before mutation: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
After mutation:  0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0  (chosen bit flipped)

The mutation operator is usually assigned a very low probability of flipping a bit

since it would be destructive to start flipping many bits. Rather, the mutation operator is

used sporadically with the hope of increasing or reintroducing genetic diversity.
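A C sketch of per-bit mutation; splitting the uniform random draw out of the operator is our choice, made so the behaviour is deterministic and testable.

```c
/* Flip bit `pos` of bit string s with probability p_mut; `draw` is a
   uniform random number in [0, 1) supplied by the caller. */
unsigned mutate_bit(unsigned s, int pos, double draw, double p_mut)
{
    if (draw < p_mut)
        return s ^ (1U << pos);  /* XOR flips exactly the chosen bit */
    return s;
}
```

Applied to the all-zero string of Figure 2.4 with pos = 5 and a draw below p_mut, the operator reintroduces a one in that bit position; with a draw at or above p_mut the string is unchanged.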

To illustrate this, suppose all the individuals in the population had a zero (0) in their

least significant bit position. Then the mutation operator, if used to flip an individual’s least

significant bit, would reintroduce a one (1) in that bit position. None of the other operators

can do that. Note however that the decision to flip a bit is purely random and is not based in

any way on the necessity for genetic diversity in a given bit position.

2.4 Example Problem

In this subsection, an example run of the simple genetic algorithm will be given,

where we will attempt to maximize the function 676 − (x − 5)^2 on the interval x ∈ [0, 31],

where x is an integer (676 is simply a bias to keep function values positive). Some

properties of the function make it a desirable toy problem. First, we know in advance that

the optimal solution is for x = 5. Second, the potential solutions can be coded as bit strings

of length 5, where the bit string corresponds to the binary representation of x. (In contrast

with another toy problem often used to demonstrate the GA, where the objective function is

x^2 instead [7], the optimal solution in this case is not at one extremity of the solution space.)

The population size, probability of crossover and probability of mutation are user defined;


in an effort to maintain simplicity, we will select a population size of 4 (population sizes of

100 to 500 are common), a probability of crossover of 100% (usually around 80%, with the

other 20% of the new population being made up of non-modified selected individuals), and

a very low probability of mutation (as usual).

The first step the GA must take is to generate the initial random population.

Suppose the GA “randomly” creates the four individuals in Table 2.1.

Table 2.1: Individuals in Initial Population

Individual   Value   Bit String   Fitness
A            25      11001        276
B            11      01011        640
C            30      11110        51
D            2       00010        667

The fitness of each individual is obtained by evaluating the objective function for the

individual’s value. A higher fitness value is more desirable.
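For this toy problem the fitness function is simply the objective itself; a one-line C version:

```c
/* Objective/fitness of the toy problem: f(x) = 676 - (x - 5)^2 for
   integer x in [0, 31]. The bias 676 = 26^2 makes the worst case,
   x = 31, score exactly 0, so all fitness values are non-negative. */
int fitness(int x)
{
    return 676 - (x - 5) * (x - 5);
}
```

This reproduces the fitness column of Table 2.1, e.g. fitness(25) = 276 and fitness(2) = 667.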

Now, the GA must iteratively create a new population from an old one until one of

two things happen: either the GA finds the optimal solution or a user specified number of

generations go by. At that point, the GA will declare the best individual obtained as the

winner.

The first step when creating a new population from an old one is to select the

individuals from the old population which will become part of the new population (selection

operator). Suppose that roulette wheel selection chooses individual A once, individual B

twice and individual D once to become part of the mating pool. Individuals in the mating

pool are now paired for crossover; for this purpose, we will pair individual A with one


instance of individual B and the other instance of individual B with individual D.

For the crossover between A and B, suppose we cross after the third bit, thus

producing individuals 11011 (27) and 01001 (9). Similarly, for the crossover between B

and D, suppose we cross to the right of the fourth bit, producing individuals 01010 (10) and

00011 (3). The new individuals and their respective fitnesses are shown in Table 2.2.

Table 2.2: Individuals in Second Population

Individual   Value   Bit String   Fitness
A            27      11011        192
B            9       01001        660
C            10      01010        651
D            3       00011        672

Notice that the best individual’s fitness in this new population (672) is higher than the

fitness for the best individual in the previous population (667). Also, the average fitness has

gone up from 408.5 to 543.75.

Note however that none of the individuals contain a one (1) in their third bit

position. Since the optimal solution, 00101, contains a one (1) in its third bit position, the

only way the GA could potentially obtain the optimal solution would be to have the mutation

operator flip one of the individual’s third bit in one of the subsequent generations. This

might not happen very quickly, and the user may be forced to settle for a near-optimal

solution, as is nearly always the case.

In this toy example, we obtained the near-optimal solution x = 3 after one

generation. In reality, where the GA is run on a computer rather than by an individual

attempting to solve a toy problem, optimal or near-optimal solutions are not usually obtained

that quickly. Often, the GA needs to run for somewhere in the order of 50 to 100

generations. For larger problems, the GA might require even larger numbers, say 1000


generations with population sizes of 1000 to 5000 individuals. Of course, if the GA is

running on a fast parallel supercomputer with large amounts of memory, it is feasible to

increase the population size and maximum number of generations.

2.5 Non-Randomness of Genetic Algorithms

The following example serves to convince potential GA users that a GA is not

merely a random search through the solution space but rather a directed search.

In [6] a problem requiring 144-bit accuracy (9 variables with 16 bits per variable) is

studied. Suppose there are approximately 2^50 acceptable solutions in the solution space,

consisting of 2^144 members. A purely random search would normally examine

approximately 2^144 / 2^50 ≈ 2.0 × 10^28 potential solutions before finding an acceptable one.

In that example, approximate solutions were generally obtained using a population size of

201 individuals evolving over 100 generations; therefore, only about 2 × 10^4 potential solutions

were examined. This is significantly more efficient than a mere random search.

2.6 Modifications to the Simple Genetic Algorithm

Some modifications have improved the performance of the GA.

One such modification, called “elitism” [7], preserves the best individual obtained

at any given point during a run of the GA. The best individual from the current population

is copied intact into the next population. Such a modification guarantees that the best

individual found during a run of the GA will not be destroyed before the end of the run,

provided that the fitness function does not change from one generation to the next.
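Elitism reduces to finding the fittest member and copying it intact; a small C sketch of the search step (the function name is ours):

```c
/* Return the index of the fittest individual, to be copied intact into
   the next population when elitism is enabled. */
int best_index(const int *fitness, int pop_size)
{
    int best = 0;
    for (int i = 1; i < pop_size; i++)
        if (fitness[i] > fitness[best])
            best = i;
    return best;
}
```

On the fitnesses of Table 2.1, {276, 640, 51, 667}, this picks individual D (index 3), which elitism would then copy unchanged into the next generation.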

Another modification, introduced in [6], is useful for multivariable functions where

some symmetry exists in the problem domain. The “permutation” operator randomly

chooses two variables represented on a chromosome and switches their corresponding


portions of the bit string, as illustrated in Figure 2.5.

Figure 2.5: Illustration of the Permutation Operator

Individual Before Permutation

Individual After Permutation
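Assuming the chromosome is stored one bit per char and every variable occupies `bits` consecutive positions (a representation we choose here for clarity), the permutation operator is a field swap:

```c
/* Swap the bit fields of variables i and j in a chromosome where each
   variable is coded on `bits` consecutive characters. */
void permute(char *chrom, int bits, int i, int j)
{
    for (int k = 0; k < bits; k++) {
        char tmp = chrom[i * bits + k];
        chrom[i * bits + k] = chrom[j * bits + k];
        chrom[j * bits + k] = tmp;
    }
}
```

For example, with two 3-bit variables coded as "000111", swapping variables 0 and 1 yields "111000".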

Many optimization problems are solvable using genetic algorithms. However, the

bit string coding is not always the most easily usable coding or the most efficient. Often,

the GA user has to create a data structure suitable to the problem [5][6]. Also, a new data

structure implies new genetic operators.

This will be the case in the next section, on genetic programming, where the data

structure is no longer a bit string, but rather a tree.


3.0 Description of Genetic Programming

3.1 Genetic Programming: Data Structure

The Genetic Programming Paradigm [10] uses the general ideas found in the

standard GA, but operates on a different data structure: trees replace bit strings; each tree

represents the parse tree for a given program.

For example, the tree in Figure 3.1 represents the Boolean function

F = (A AND B) OR ((NOT A) AND (NOT B)).

Figure 3.1: Example of Parse Tree for Genetic Programming

            OR
           /  \
        AND    AND
       /  \    /  \
      A    B  NOT  NOT
              |     |
              A     B

Note that many different trees can represent the same given function. Observe also that

there are no numerical values except 0 or 1 associated with the connections; in other words,

connections either exist or don’t exist and are not truly weighted.

Parse trees may be represented in many ways. Koza represents the parse trees as

LISP s-expressions [10]. The following is the LISP s-expression for the parse tree in

Figure 3.1:


(OR (AND A B) (AND (NOT A) (NOT B)))

Other possible codings are discussed in [9].
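To make the data structure concrete, here is a C sketch of a parse-tree node for the function set {AND, OR, NOT} and terminals A and B, with a recursive evaluator and the tree of Figure 3.1. The struct layout and names are our own, not Koza's or the thesis program's.

```c
#include <stddef.h>

enum op { VAR_A, VAR_B, NOT, AND, OR };

struct node {
    enum op op;
    struct node *left, *right;  /* right is unused for NOT and terminals */
};

/* Recursively evaluate the Boolean function represented by the tree. */
int eval(const struct node *t, int A, int B)
{
    switch (t->op) {
    case VAR_A: return A;
    case VAR_B: return B;
    case NOT:   return !eval(t->left, A, B);
    case AND:   return eval(t->left, A, B) && eval(t->right, A, B);
    default:    return eval(t->left, A, B) || eval(t->right, A, B);
    }
}

/* The tree of Figure 3.1: F = (A AND B) OR ((NOT A) AND (NOT B)). */
static struct node nA = {VAR_A, NULL, NULL}, nB = {VAR_B, NULL, NULL};
static struct node notA = {NOT, &nA, NULL}, notB = {NOT, &nB, NULL};
static struct node andL = {AND, &nA, &nB}, andR = {AND, &notA, &notB};
static struct node F = {OR, &andL, &andR};
```

F evaluates to 1 exactly when A and B agree, i.e. it implements Boolean equivalence.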

3.2 Crossover

This tree data structure, being more complex and sophisticated than bit strings,

requires new crossover and mutation operators.

In a standard bit-string-driven GA, crossover is performed by selecting two strings,

choosing a “crossover point” and cutting and splicing the two strings at that same point.

An identical concept of “crossover point” is not applicable to trees, since it would

imply that the two input trees in question are isomorphic from the root to the cut-point, an

excessively restrictive assumption. This is addressed by choosing a cut-point for each of

the input trees and switching subtrees, as illustrated in Figures 3.2 and 3.3.

Figure 3.2 shows the two input trees, with selected crossover points. The crossover

points are the two nodes, numbered “1” and “2”; the subtrees being switched are shown

in white and with dashed connections. Note that the connection from the crossover point to

its parent node is included with the subtree; this will become important later. For the

problems tackled by the generic GPP, the connections are not weighted, and thus it is not

necessary, in that case, to include the connection to the parent node. Also, to make things

consistent, the root node is connected “to nowhere” via an “invisible” connection that has

no effect on the functioning of tree; if the root node is chosen as crossover point, this

invisible connection becomes the connection to the tree’s new parent node, and the other

subtree’s “real” connection becomes invisible.


Figure 3.2: Two Trees Before Crossover


After crossover, the two trees look as visualized in Figure 3.3. The two subtrees

have been switched, along with their respective connections to their parent nodes.

Figure 3.3: Two Trees After Crossover

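In a pointer-based tree representation, the subtree swap described above amounts to exchanging the child pointers held by the two parent nodes; a minimal sketch (the generic `struct node` is ours):

```c
struct node { int op; struct node *left, *right; };

/* Exchange the subtrees rooted at the two chosen crossover points.
   slot1 and slot2 are the parents' child pointers; passing the tree
   roots instead covers the "invisible connection" case described above. */
void swap_subtrees(struct node **slot1, struct node **slot2)
{
    struct node *tmp = *slot1;
    *slot1 = *slot2;
    *slot2 = tmp;
}
```

Because only pointers move, each subtree carries its entire structure with it, mirroring how the dashed subtrees of Figures 3.2 and 3.3 change owners.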

3.3 Mutation

The mutation operator in a GA flips a randomly selected bit within the bit string. In

a certain way, one can think of this as randomly assigning a different value to the selected

bit; there being only one such thing, we are not faced with the question of what we are

mutating to.

One could create an operator that simply mimics the GA’s mutation operator and

changes the nature of a randomly selected node (e.g. change from an AND to an OR node).

However, this would be infeasible in many cases for the simple fact that two given nodes


will very often have different arities, thus resulting in conflicts for the children of the node.

The GPP’s mutation operator solves this problem. It randomly picks a mutation

point (a node in the tree); it then destroys the subtree (along with the connection to its

parent) whose root is the mutation point, and replaces it with a new randomly-generated

subtree with its own connection to the parent node.

In Figure 3.4, the tree on the left has been selected for mutation; the subtree that will

get mutated is dashed and white. The tree after mutation is shown on the right; the new

subtree is dashed and gray.

Figure 3.4: Tree Before and After Mutation

Before Mutation After Mutation
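A C sketch of this mutation; `grow` is a hypothetical random subtree generator of our own design (opcodes 0 and 1 stand for the terminals A and B, 2 for NOT, 3 for AND, 4 for OR), and the depth limit keeps the new subtree small. Allocation checking is omitted in this sketch.

```c
#include <stdlib.h>

struct node { int op; struct node *left, *right; };

/* Free a subtree, including the mutation point itself. */
void free_tree(struct node *t)
{
    if (t == NULL) return;
    free_tree(t->left);
    free_tree(t->right);
    free(t);
}

/* Randomly grow a subtree of at most `depth` further levels; opcodes
   0-1 are terminals, 2 is NOT (one child), 3-4 are AND/OR (two). */
struct node *grow(int depth)
{
    struct node *t = malloc(sizeof *t);
    t->op = (depth <= 0) ? rand() % 2 : rand() % 5;
    t->left = t->right = NULL;
    if (t->op >= 2) {
        t->left = grow(depth - 1);
        if (t->op >= 3)
            t->right = grow(depth - 1);
    }
    return t;
}

/* Mutation: destroy the subtree at the chosen point and replace it
   with a freshly generated random subtree. */
void mutate(struct node **point, int depth)
{
    free_tree(*point);
    *point = grow(depth);
}
```

Passing the parent's child pointer as `point` replaces the dashed subtree of Figure 3.4 in place, connection to the parent included.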

3.4 Summary of Differences Between the GA and the GPP

As we have seen, genetic algorithms are used mainly for numerical or parametric

problems, where we are attempting to find numerical values for a given set of parameters,

such that the parameters will minimize some cost function. Thus the GA could be used to

determine appropriate weights given a neural network architecture and desired output

function.

On the other hand, genetic programming is used for problems which are more


symbolic in nature. In other words, instead of trying to determine an actual numerical

solution to a problem, we attempt to determine some algorithm or program which could

learn how to solve or approximately solve a problem. Thus the GPP could be used to

obtain a general neural network architecture, where the weights would then be refined

through parametric learning techniques such as backpropagation.

A comparison between genetic algorithms and genetic programming is illustrated in

Figure 3.5.

Figure 3.5: Comparison Between GA and GPP

GeneticAlgorithms

GeneticProgramming

NeuralNetworks

learning:numerical/parametric

learning:symbolic/structural

applications:learn weights

in neuralnetworks

applications:develop

topology ofneural networks


4.0 Description of Logic-Based Neural Networks

Logic-Based neural networks (LNN) are explained in detail in [12]. Rather than

having one single standardized type of neuron, as in standard artificial neural networks,

LNNs incorporate the idea of having neurons which have responses based on fuzzy set

theory. The responses are similar to those of the AND and OR functions in Boolean logic.

4.1 The AND Neuron

AND neurons are defined by AND: [0, 1] × [0, 1] → [0, 1], with response

A AND B ≡ min(A, B). For more than 2 inputs, AND is defined associatively. A graphical

representation of the AND function is shown in Figure 4.1.

Figure 4.1: Graphical Representation of the AND Function

Table 4.1 shows, for inputs restricted to {0, 1} × {0, 1}, that the AND neuron is a generalization of the AND function encountered in Boolean logic.


Table 4.1: 2-Input AND Function

A   B   min(A, B)
0   0   0
0   1   0
1   0   0
1   1   1

4.2 The OR Neuron

OR neurons have the response A OR B ≡ max(A, B) and are defined associatively

for more than 2 inputs. A graphical representation of the OR function is shown in Figure

4.2.

Figure 4.2: Graphical Representation of the OR Function

Once more we can see in Table 4.2 that the response generalizes Boolean logic.


Table 4.2: 2-Input OR Function

A   B   max(A, B)
0   0   0
0   1   1
1   0   1
1   1   1

4.3 Connection Weights

The NOT function is defined by NOT A ≡ 1 – A. Once more, this is consistent

with Boolean logic, though we should state clearly that we are defining NOT as a function

and will not use it as a specific node; rather, as we will see below, NOT will be used

implicitly within connections.

As in neural networks, connections for Logic-Based Neural Networks are weighted:

1) If the parent of a connection is an AND neuron, the input to the neuron is

<parent neuron input> = <child node output> OR <connection weight>

2) If the parent is an OR neuron, the input to the neuron is

<parent neuron input> = <child node output> AND <connection weight>

Weights always have values in the interval [0, 1]; however, they may also be

“inhibitive”; with an inhibitive weight, we apply the required AND or OR weighting and

then apply the NOT operation before the result is sent to the parent neuron.

For example, if the parent neuron is an AND function, and the weight is inhibitive,

then the neuron input would be the following:

<parent neuron input> = NOT [<child node output> OR <connection weight>].


4.4 Example Logic-Based Neural Network

The example LNN shown in Figure 4.3 will demonstrate the use of AND and OR

neurons, as well as the use of weighted connections and the NOT function.

Figure 4.3: Example Logic-Based Neural Network

(diagram: inputs A = 0.6 and B = 0.3 feed an AND neuron through weights w = 0.2 and w = 0.9; the AND neuron's output, through an inhibitive weight w = 0.5, and input C = 0.8, through weight w = 0.7, feed an OR neuron, which produces the output)

In this network, information flows from left to right. The circle adjacent to the OR

neuron indicates that the corresponding weight is inhibitive.

Suppose inputs A, B and C have numerical values 0.6, 0.3 and 0.8, respectively.

First, we must compute the input values to the AND neuron. Since it is an AND

neuron, the inputs are weighted by ORing them with the corresponding weights. For input

A, the input value to the AND neuron is (0.6 OR 0.2) = 0.6. Similarly, the input value

corresponding to B is (0.3 OR 0.9) = 0.9. Now, we can compute the output from the AND

neuron, which is (0.6 AND 0.9) = 0.6.

Then, we can compute the input values to the OR neuron. Since it is an OR neuron,

the inputs are weighted by ANDing them with the corresponding weights. The input value

corresponding to C is (0.8 AND 0.7) = 0.7. The input to the OR neuron corresponding to

the AND neuron includes an inhibitive weight. Thus we must use the NOT function in

calculating the OR neuron’s input value. The input value is equal to NOT(0.6 AND 0.5) =

1 – (0.6 AND 0.5) = 0.5. Now the output from the OR neuron (and from the network) can

be calculated; the output is equal to (0.5 OR 0.7) = 0.7.


5.0 Description of the GPP for LNNs

There are four basic differences between our new algorithm and the generic GPP:

our version introduces connection weights, a new mutation operator, a new crossover

operator, and a new method of generating the individuals in the initial random population.

5.1 Coding

Neurons in the LNN are coded as nodes in Koza’s parse trees. The function set

(set of all nodes with arity ≥ 1) is:

F = {AND, OR}

where we use arities of 2 and 3 for both the AND and OR functions. The terminal set (set

of all possible nodes with arity = 0) is:

T = {Inputs to the given problem}.

The generic version of the GPP does not use weighted connections. They are not

necessary in the problems it tackles. However, since we are dealing here with problems

whose solutions have a weighted structure, we find it natural and also necessary to introduce

weights to the connections between neurons. The weight from a node to its parent is coded alongside the node.

In order to control the eventual mutation of weights, we will assign weights from

symbolic categories named small (S), medium (M), or large (L); also, we determine whether

the weight is normal or inhibitive. Then, when determining fitness values, we will assign a

numerical value to each symbolic value; this numerical value will be the actual weight.

There are many ways to assign numerical weight values. All are probabilistic in

nature. As an economy measure, we have decided to use uniform distributions on user pre-defined intervals:

S: [0, s]
M: [m1, m2]
L: [l, 1]

where there may or may not be overlap between intervals. Also, we assign equal

probabilities to normal and inhibitive weights.
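Assigning a numerical value to a symbolic weight is then a uniform draw from the category's interval; a sketch, using the bounds of Table 7.2 for concreteness (the function name and the hard-coded bounds are ours):

```c
#include <stdlib.h>

/* Draw a numerical weight uniformly from the interval of a symbolic
   category 'S', 'M' or 'L'; the bounds follow Table 7.2. */
double sample_weight(char category)
{
    double lo, hi;
    switch (category) {
    case 'S': lo = 0.0; hi = 0.2; break;
    case 'M': lo = 0.3; hi = 0.7; break;
    default:  lo = 0.8; hi = 1.0; break;  /* 'L' */
    }
    return lo + (hi - lo) * ((double)rand() / RAND_MAX);
}
```

Whether the weight is normal or inhibitive is tracked separately; only the magnitude is sampled here.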

5.2 Genetic Mechanisms

The generic GPP mutation operator destroys a randomly chosen subtree and

replaces it with a newly generated subtree; this supposedly reintroduces genetic diversity.

The generic GPP crossover operator picks a random subtree in each of two input

trees and switches them. This crossover, alone, is sufficient to attain genetic diversity

because the two crossover points are not necessarily at the same point in the two trees and

switching subtrees from different locations can and will introduce new nodes to locations

that previously had never seen such types of nodes. (Crossover in a standard binary-string-

driven GA is not sufficient to attain genetic diversity because the two crossover points must

be the same in order to maintain string length.) Thus, the generic crossover operator in

GPP is sufficient to attain genetic diversity, without requiring the generic mutation operator.

Also, the generic mutation operator is undesirable since it randomly destroys

information. A more useful version of this particular mutation operator would take into

account the “sub-fitness” of a subtree before actually destroying it.

However, a mutation operator is required to attain genetic diversity for connection

weights.

This is at the heart of the mutation operator we have defined and will be using

instead of the generic mutation operator: our new mutation operator mutates connection

weights. The operator picks a connection in the tree and replaces it with an “adjacent”

weight; adjacency is determined as shown in Figure 5.1, where an arrow denotes adjacency

Page 30: Genetic Programming of Logic-Based Neural Networksvgaudet/PAPERS/GAUDET_UGTHESIS.pdf · i Abstract Genetic algorithms and genetic programming are optimization methods in which potential

23

between two weights. Inhibitive weights are represented by the weight symbol with an

overstrike. For example, if the old connection weight is M, the new connection weight will

be either S or L.

Figure 5.1: Adjacency Between Weights

S <-> M <-> L    (normal weights)

S̄ <-> M̄ <-> L̄    (inhibitive weights)
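The adjacency mutation can be sketched in C, encoding the six weights as integers (the encoding and function name are ours; any adjacency between the normal and inhibitive rows of Figure 5.1 is not modelled here):

```c
#include <stdlib.h>

/* Weights encoded as 0 = S, 1 = M, 2 = L and 3, 4, 5 for their
   inhibitive counterparts.  Mutation moves to an adjacent weight
   within the chain S <-> M <-> L: the ends have one neighbour,
   the middle (M) has two, chosen at random. */
int mutate_weight(int w)
{
    int pos = w % 3;                      /* position within its chain */
    if (pos == 0) return w + 1;           /* S -> M */
    if (pos == 2) return w - 1;           /* L -> M */
    return (rand() & 1) ? w - 1 : w + 1;  /* M -> S or L */
}
```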

Other adjacency configurations such as the one depicted in Figure 5.2 could

potentially be studied.

Figure 5.2: Other Potential Adjacency Configuration

(an alternative adjacency arrangement over S, M, L and the inhibitive S̄, M̄, L̄)

Next, the standard method of generating random individuals for the initial

population is far too general, especially for Boolean problems. Boolean expressions can be

rewritten as a sum-of-products (SOP) or a product-of-sums (POS); knowledge of this

makes it unnecessary and also wasteful to use the most general trees possible.

For a Boolean function represented as a SOP, we have an input layer followed by

NOT, AND and OR layers, the OR layer representing outputs; each layer is composed of

one type of function only; there is a similar result for POS.

LNNs, which incorporate certain properties of Boolean expressions, also tend

towards a similar layered structure. It is therefore natural to consider, in LNNs, only trees

Page 31: Genetic Programming of Logic-Based Neural Networksvgaudet/PAPERS/GAUDET_UGTHESIS.pdf · i Abstract Genetic algorithms and genetic programming are optimization methods in which potential

24

with a tendency towards such a layered structure, each layer consisting predominantly of

one type of neuron.

For this purpose, we incorporate a probability of nodes getting chosen at a given

level. For example, we may require the second level to consist of 80% OR nodes and 20%

terminals (inputs); to accomplish this, we would set the probabilities for the OR nodes to

total 0.8 and the probabilities for the terminals to total 0.2. When the algorithm creates an

individual, it looks at which level it is creating, and chooses a node at random, with more

“preferable” nodes having a higher probability of getting chosen. This tends to create

individuals whose structure is pertinent to the problem.
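Choosing a node type with level-dependent probabilities amounts to a roulette-wheel draw over the probabilities for that level; a sketch (the function name is ours):

```c
#include <stdlib.h>

/* Pick an index in [0, n) with probability proportional to probs[i];
   the probabilities for one level are assumed to sum to 1. */
int choose_node(const double *probs, int n)
{
    double r = (double)rand() / RAND_MAX;
    double acc = 0.0;
    for (int i = 0; i < n; i++) {
        acc += probs[i];
        if (probs[i] > 0.0 && r <= acc)
            return i;  /* r fell within this node's slice */
    }
    return n - 1;      /* guard against rounding */
}
```

For example, with the level-3 column of Table 7.3, {0.2, 0.2, 0.0, 0.0, 0.6}, a terminal is drawn 60% of the time.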

It then follows that we should create a crossover operator that takes advantage of this

layered structure. Other crossover operators have been proposed [2]; these operators limit

the crossover points to two “identical” points within the trees or two points that are within

the “same” subtree. However, our new crossover operator picks two crossover points that

are at the same level in the tree. Thus the individuals resulting from our new crossover will

have the same desired, layered structure as the individuals did before crossover. This new

crossover is more general than the crossover proposed in [2], but it accomplishes the same

effect of preserving the underlying structure of the individuals and it is very well suited to

the layered nature of the problems we are attempting to solve.

5.3 Fitness Function

When determining fitness, numerical values of S, M and L are assigned for each

particular trial, as described in Section 5.1. Raw fitness for an individual is naturally

defined as the sum of absolute values of deviations between observed and desired outputs,

calculated over all possible inputs. These are averaged over the collection of trials to

produce the individual’s actual raw fitness; then a transformation of the form

Page 32: Genetic Programming of Logic-Based Neural Networksvgaudet/PAPERS/GAUDET_UGTHESIS.pdf · i Abstract Genetic algorithms and genetic programming are optimization methods in which potential

25

fitness = 1 / (ε + average raw fitness)

is applied, for a small ε > 0; this transformed fitness serves a computational purpose only, when selecting individuals for a new population. The average raw fitness is the quantity of real interest to the user. Raw fitness varies between 0 and 2^n, though in practice it never reaches 2^n; a value of 0 signifies perfection.

A different measure of performance is the number of hits for a particular individual.

A hit occurs when a particular combination of inputs produces an output within a user-

specified threshold of the desired output. The numbers of hits over all combinations of

inputs are averaged over the number of trials, and then rounded to the nearest integer. The

hits are essentially a measure of how many outputs, on average, are acceptable to the user.

Hits vary between 0 and 2^n; a value of 2^n signifies perfection.

Both these measures inform the user of the degree of approximation supplied by the

individual’s output, but in different ways. One might say that a good number of hits

signifies an individual that currently produces the most correct outputs whereas a good raw

fitness signifies a more robust individual, capable of adapting to new weights without

significant deterioration.
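Both measures reduce to simple loops over the 2^n input combinations; a sketch (the function names are ours):

```c
#include <math.h>

/* Raw fitness: sum of absolute deviations between observed and
   desired outputs over all input combinations (0 = perfect). */
double raw_fitness(const double *observed, const double *desired, int n_combos)
{
    double sum = 0.0;
    for (int i = 0; i < n_combos; i++)
        sum += fabs(observed[i] - desired[i]);
    return sum;
}

/* Transformed fitness used when selecting individuals, for small eps > 0. */
double selection_fitness(double avg_raw, double eps)
{
    return 1.0 / (eps + avg_raw);
}

/* Hits: outputs within a user-specified threshold of the desired output. */
int count_hits(const double *observed, const double *desired, int n_combos,
               double threshold)
{
    int hits = 0;
    for (int i = 0; i < n_combos; i++)
        if (fabs(observed[i] - desired[i]) < threshold)
            hits++;
    return hits;
}
```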

Also note that in the fitness function, there is no measure of the size of the network.

Table 5.1 summarizes the basic differences between the various operators, for the

algorithm we tested and the generic GPP. The C program code for the algorithm we tested

is given in Appendix 2.0.

Page 33: Genetic Programming of Logic-Based Neural Networksvgaudet/PAPERS/GAUDET_UGTHESIS.pdf · i Abstract Genetic algorithms and genetic programming are optimization methods in which potential

26

Table 5.1: Comparison Between Generic GPP and GPP for LNNs

Algorithm                   Generic GPP          GPP for LNNs
Data Structure              General Tree         Weighted Tree
Crossover                   Generic Crossover    Generic Crossover and Same-Layer Crossover
Mutation                    Generic Mutation     Connection Weight Mutation
Generation of Individuals   Random with General  Random with Layered
in Original Population      Structure            Structure

Page 34: Genetic Programming of Logic-Based Neural Networksvgaudet/PAPERS/GAUDET_UGTHESIS.pdf · i Abstract Genetic algorithms and genetic programming are optimization methods in which potential

27

6.0 Using the C Program For The GPP

The C program code for the genetic programming algorithm for logic-based neural

networks is given in Appendix 2.0. This section explains how to use the program and to

modify it to obtain different LNNs of varying sizes.

When using this version of the GPP for Logic-Based Neural Networks, three things

may be changed: parameters, desired outputs, and, if the number of inputs changes, the list

of functions and terminals.

The program can be compiled using the “acc” C compiler. Also, an input file

which specifies the desired outputs of the LNN must exist; the format of this file is

explained in Section 6.3.

6.1 Changing Parameters

There are three sets of parameters which may be modified in order to change the

problem.

The first is the list of constants at the beginning of the program; these relate to

crossover rates, mutation rates, population size, size of individuals, number of trials,

maximum number of generations, the hit threshold, and the random number generator seed.

The constants are explained in detail in the next subsection.

Next, the probabilities that nodes will get chosen at any given level can be modified.

These probabilities are located in the create_functions() function. The field containing the

probability of getting chosen at level L is called “probs[L]”. The root node is at level 0.

Finally, the weight intervals are determined in the create_weights() function. The

field “lowbound” contains the lower bound on a weight and the field “highbound”

contains the upper bound.

6.2 Description of Constants

Page 35: Genetic Programming of Logic-Based Neural Networksvgaudet/PAPERS/GAUDET_UGTHESIS.pdf · i Abstract Genetic algorithms and genetic programming are optimization methods in which potential

28

The following is a list of the user-specified constants located at the beginning of the

program, accompanied by a brief description of each.

DNA_length: The maximum length of a chromosome (i.e. the maximum number of nodes

in a tree).

POP_SIZE: The number of individuals in the population.

MAX_GENS: The maximum number of generations (not including the first,

random, generation). The number of generations will reach MAX_GENS if and

only if the number of “hits” in the population’s best individual never reaches

MAX_HITS.

NUM_FUNCTIONS: The number of functions (non-terminals). We use the 2 and

3-input AND and the 2 and 3-input OR functions, so the value here is 4.

NUM_TERMINALS: The number of terminals (i.e. the number of inputs to the problem

being solved).

NUM_WEIGHTS: The number of possible weights. Here, we use the inhibitive weights L̄, M̄ and S̄ along with the normal weights S, M and L, so the value is 6.

NUM_TRIALS: The number of times the raw fitness of an individual is measured before calculating fitness. The actual raw fitness is the average of the raw fitnesses measured during the trials.

MAX_LEVELS: The maximum number of levels in a tree.

Page 36: Genetic Programming of Logic-Based Neural Networksvgaudet/PAPERS/GAUDET_UGTHESIS.pdf · i Abstract Genetic algorithms and genetic programming are optimization methods in which potential

29

NUM_CROSS_1: The number of generic (Koza) crossovers.

NUM_CROSS_2: The number of new (same level) crossovers.

W_MUT_PROB: The probability that a selected “non-crossover” individual will have

one of its weights mutated.

RAND_SEED: The random number generator seed, which must be an integer. For a

given seed, the pseudo random numbers are in fact deterministic, which allows the

user to reproduce a given run.

MAX_HITS: The maximum number of hits (2^NUM_TERMINALS).

threshold: If the difference between the desired and actual outputs is smaller than this

threshold, we have a hit.

6.3 The Input File

The input file allows the user to describe the desired outputs from the network. The

input file is opened in the main() function and the desired outputs are read in the

get_test_vectors() function. In the C code in Appendix 2.0, the input file is called

“GPR6.in”. A new name can be specified by changing the name in the main() function.

The desired outputs are written one per line (one line for every combination of

inputs) in numerical order of inputs. Suppose the desired outputs for a 2-input problem

were as given in Table 6.1.

Page 37: Genetic Programming of Logic-Based Neural Networksvgaudet/PAPERS/GAUDET_UGTHESIS.pdf · i Abstract Genetic algorithms and genetic programming are optimization methods in which potential

30

Table 6.1: Example 2-Input Problem Outputs

A   B   Output
0   0   0.1
0   1   0.7
1   0   0.2
1   1   1.0

Then, the input file would contain the values 0.1, 0.7, 0.2 and 1.0 in that order, one per line, as follows:

0.1
0.7
0.2
1.0

Note that one output must be stated for every combination of 0 or 1 input values.
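Reading this format is a matter of scanning one value per line; a sketch (the function name is ours, and it reads from an open stream rather than the hard-coded file name used in main()):

```c
#include <stdio.h>

/* Read up to n desired outputs, one per line, from an open stream.
   Returns the number of values actually read. */
int read_desired_outputs(FILE *f, double *desired, int n)
{
    int count = 0;
    while (count < n && fscanf(f, "%lf", &desired[count]) == 1)
        count++;
    return count;
}
```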

Future versions should include the possibility of having incompletely specified outputs or

outputs specified for real inputs.

6.4 Changing the Size of the Problem

The following instructions will turn the problem from an m-input problem into an n-

input problem.

1) Change the list of terminals in the create_functions() function to include n instead of

m terminals.

2) Change the probabilities of nodes getting chosen, in the create_functions() function.

3) Change the input file, as described in Section 6.3.


4) Change constant values NUM_TERMINALS and MAX_HITS at the beginning of

the program.

5) Change the evaluate_fitness() function by adding or deleting depths of “for” loops

for every input. Also change the number of parameters passed to the

eval_individual() function.

6) Change the eval_individual() function by adding or deleting inputs that need to be

evaluated.


7.0 Experimental Studies

7.1 Description of Tests

The GPP for LNNs was tested using a data set (given in Appendix 1.0) derived

from a 6-input LNN. For testing purposes, the inputs are limited to 0 and 1.

Tests were performed using three different sets of parameters; for each set of

parameters, there were 12 runs. The parameters used are listed in Tables 7.1, 7.2 and 7.3.

Parameters in Tables 7.2 and 7.3 remain constant throughout the three parameter sets.

Table 7.1: Description of Three Parameter Sets

Parameter Set                      1      2      3
Maximum Number Of Nodes In Tree    60     60     60
Population Size                    300    300    300
Number Of Generations              300    300    300
Number Of Fitness Trials           4      4      4
Number Of Generic Crossovers       200    100    0
Number Of Same Level Crossovers    0      100    200
Probability Of Weight Mutation     0.05   0.05   0.05
Threshold For Hits                 0.10   0.10   0.10

Table 7.2: Description of Weight Intervals

S Range   [0.0, 0.2]
M Range   [0.3, 0.7]
L Range   [0.8, 1.0]


Table 7.3: Probabilities of Nodes Getting Selected at Given Levels

          level 0   level 1   level 2   level 3   level 4
AND-2     0.1       0.2       0.4       0.2       0.0
AND-3     0.1       0.2       0.4       0.2       0.0
OR-2      0.4       0.3       0.05      0.0       0.0
OR-3      0.4       0.3       0.05      0.0       0.0
inputs    0.0       0.0       0.1       0.6       1.0

The first parameter set tests our algorithm using only standard crossover; the

second tests our algorithm using a mix of 100 standard crossovers and 100 new crossovers

per generation; the third parameter set tests our algorithm using only the new crossover

operator.

The general structure of the individuals is as follows: there is an input layer, then a

layer of AND nodes, and the output layer consists of OR nodes. Note that this is only a

general structure and that there may be deviations from this structure.

7.2 Results

Table 7.4 gives the average raw fitness of the best individual obtained during each of

the 36 runs, and the generation in which the individual was obtained (in brackets).

The “optimal” (64 hits) solution was not obtained within the allotted number of

generations for any of the 12 runs of any of the three parameter sets. This is as expected

since the data set was derived using a network containing weights of 0.1, 0.5 and 0.9 only.

It would be nearly impossible to obtain the exact same weight values and thus small

differences between the obtained and desired outputs will always occur. Rather, the

algorithm obtains a “family” of networks where the weights can and will be adjusted, using

parametric learning techniques, to replicate more precisely the desired function.


Table 7.4: Best Individuals from each Run

Run   200 Generic Crossovers   100/100 Crossover Mix   200 New Crossovers
0     4.99 (250)               6.91 (300)              6.35 (299)
1     7.22 (146)               9.74 (285)              10.35 (237)
2     10.14 (81)               7.16 (292)              6.29 (231)
3     9.27 (295)               4.39 (256)              6.82 (250)
4     9.74 (189)               4.96 (283)              6.25 (266)
5     7.44 (299)               7.20 (261)              6.12 (217)
6     8.19 (219)               7.62 (270)              3.82 (111)
7     4.06 (258)               8.09 (210)              9.07 (162)
8     6.36 (261)               10.04 (300)             9.16 (151)
9     4.47 (267)               4.23 (121)              8.33 (67)
10    8.73 (244)               8.30 (199)              6.11 (254)
11    4.52 (266)               4.31 (229)              4.15 (246)

For the parameter set using 200 generic crossovers, the algorithm’s best solution (in

terms of average raw fitness) on average was 7.09. For the 100/100 mix, the algorithm’s

best solution on average was 6.91. For the parameter set using 200 new crossovers, the

algorithm’s best solution on average was 6.90. The new crossover operator, working alone

or with the generic crossover operator, produces slightly better solutions than those obtained

with generic crossovers only.

The overall best solution was obtained with the third parameter set (200 new

crossovers) in generation 111. It has a raw fitness of 3.82 and achieved 58 hits. Its tree structure

is shown in Figure 7.1. Large black circles represent OR nodes whereas large white circles

represent AND nodes. Weights are indicated by their symbolic value, shown next to their

respective connections; small white circles represent inhibitive weights. The inputs are at

the bottom; the information travels up to the root, or output node.


Figure 7.1: Best Obtained Logic-Based Neural Network

Suppose that each of the 58 hits is a “true” hit where the output is identical to the

desired output; then, the estimated deviation from the desired output for the 6 non-hits

would be 0.64. However, a more realistic approximation of deviation from desired output

would assume that each hit has a deviation of approximately half the threshold value; then,

the estimated deviation from the desired output for the non-hits would be reduced to 0.15.

This is most certainly acceptable since now, more refined, gradient-based learning

techniques will be used in order to complete the learning process.
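The two estimates quoted above follow from spreading the residual raw fitness over the six non-hits; a sketch of the arithmetic (the function name is ours):

```c
/* Estimated average deviation of the non-hit outputs, given the raw
   fitness, the hit count, the total number of input combinations and
   the deviation assumed for each hit. */
double nonhit_deviation(double raw, int hits, int total, double dev_per_hit)
{
    return (raw - hits * dev_per_hit) / (total - hits);
}
```

With raw fitness 3.82 and 58 of 64 hits: assuming the hits deviate by 0 gives about 0.64 per non-hit; assuming each hit deviates by half the 0.10 threshold gives about 0.15.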

The two following graphs illustrate the improvement in fitness over the generations.

Figure 7.2 shows the general improvement in average fitness in the populations for

each of the three parameter sets.

Figure 7.3 shows the general improvement of the fitness of the best individual in

each population with respect to generation number.

These figures must be interpreted with a grain of salt since the fitness function,

being based on randomly determined weights, changes from one generation to the next;

thus changes in its values are not easy to interpret. Rather, we are looking for long-term


trends and “large” changes in fitness values.

This explains the apparent non-monotonicity of the fitness of the best individual per

generation, as shown in Figure 7.3. Since we are using elitism, decreases in fitness result

from the use of new weights and not from the loss of the best individual.

Figure 7.2: Average Fitness of Populations by Generation

(plot: average population fitness, on a vertical axis from 0 to 0.12, versus generation, 0 to 300; one curve per parameter set, labelled I, II and III)


Figure 7.3: Average Fitness of Best Individual by Generation

(plot: fitness of the best individual, on a vertical axis from 0 to 0.14, versus generation, 0 to 300; one curve per parameter set, labelled I, II and III)


8.0 Conclusions

We have presented a Genetic Programming-based algorithm which obtains

satisfactory solutions to medium-sized Logic-Based Neural Network problems. In

developing this algorithm, we have introduced new operators that take advantage of

knowledge we have about logic-based neural networks. We have also introduced a new

weighted tree structure. Some of the methods introduced, such as the method of generating

individuals in the initial random population and the new crossover operator, could possibly

be applied to other types of problems where structured, layered solutions are desired.

The algorithm we have presented is meant to be part of a two-level development

process. First, satisfactory LNNs are obtained using our algorithm; these networks are

then refined using gradient-based learning techniques.

Some problems still need to be studied.

First, a data structure incorporating many outputs for one given network should be

developed. Currently multiple runs can be used to produce results for each of the separate

outputs. This, however, does not produce a globally optimal solution since, very often,

portions of networks might be combined.

Second, testing has to be performed using more general “continuous-to-

continuous” intervals rather than the “discrete-to-continuous” intervals used so far.


9.0 References

[1] Bornholdt, S., and D. Graudenz; “General Asymmetric Neural Networks And

Structure Design By Genetic Algorithms”; Neural Networks; Vol. 5 #2; pp. 327-334;

1992.

[2] D’haeseleer, P.; “Context Preserving Crossover In Genetic Programming”; 1994;

available via anonymous ftp from ftp.cc.utexas.edu under pub/genetic-programming/papers

with name WCCI94_CPC.ps.Z

[3] Dill, F. A., and B. C. Deer; “An Exploration Of Genetic Algorithms For The

Selection Of Connection Weights In Dynamical Neural Networks”; IEEE 1991 National

Aerospace And Electronics Conference NAECON 1991; Vol. 3 pp. 1111-1115; 1991.

[4] Gaudet, V.; “Genetic Programming Of Logic-Based Neural Networks”;

submitted to Information Sciences 95/01/15; 1995.

[5] Gaudet, V.; “NNAGA: A Genetic Algorithm Designed To Obtain Optimal Neural

Network Architectures”; 1993.

[6] Gaudet, V.; “Applying Genetic Algorithms To Determining The Weights For A

Neural Network”; 1992.

[7] Goldberg, D. E.; Genetic Algorithms In Search, Optimization, And Machine

Learning; Addison-Wesley; Reading, Mass.; 1989.

[8] Holland, J. H.; “Genetic Algorithms”; Scientific American; Vol. 267 #1; 1992.


[9] Keith, M., and M. Martin; "Genetic Programming In C++: Implementation Issues"; Advances In Genetic Programming; MIT Press; Cambridge, Mass.; pp. 285-310; 1994.

[10] Koza, J. R.; Genetic Programming; MIT Press; Cambridge, Mass.; 819 pages; 1992.

[11] Lippmann, R. P.; "An Introduction To Computing With Neural Nets"; Artificial Neural Networks: Theoretical Concepts; pp. 36-54; 1987.

[12] Pedrycz, W., and A. F. Rocha; "Fuzzy-Set Based Models Of Neurons And Knowledge-Based Networks"; IEEE Transactions On Fuzzy Systems; Vol. 1 #4; pp. 254-266; 1993.


Appendix 1.0: Data Set Used to Test Algorithm

A B C D E F   Output
0 0 0 0 0 0    0.1
0 0 0 0 0 1    0.1
0 0 0 0 1 0    0.1
0 0 0 0 1 1    0.5
0 0 0 1 0 0    0.1
0 0 0 1 0 1    0.1
0 0 0 1 1 0    0.1
0 0 0 1 1 1    0.9
0 0 1 0 0 0    0.1
0 0 1 0 0 1    0.1
0 0 1 0 1 0    0.1
0 0 1 0 1 1    0.5
0 0 1 1 0 0    0.1
0 0 1 1 0 1    0.1
0 0 1 1 1 0    0.1
0 0 1 1 1 1    0.9
0 1 0 0 0 0    0.5
0 1 0 0 0 1    0.5
0 1 0 0 1 0    0.5
0 1 0 0 1 1    0.5
0 1 0 1 0 0    0.5
0 1 0 1 0 1    0.5
0 1 0 1 1 0    0.5
0 1 0 1 1 1    0.9
0 1 1 0 0 0    0.9
0 1 1 0 0 1    0.9
0 1 1 0 1 0    0.5
0 1 1 0 1 1    0.5
0 1 1 1 0 0    0.9
0 1 1 1 0 1    0.9
0 1 1 1 1 0    0.5
0 1 1 1 1 1    0.9
1 0 0 0 0 0    0.1
1 0 0 0 0 1    0.1
1 0 0 0 1 0    0.1
1 0 0 0 1 1    0.5
1 0 0 1 0 0    0.5
1 0 0 1 0 1    0.5
1 0 0 1 1 0    0.5
1 0 0 1 1 1    0.9
1 0 1 0 0 0    0.1
1 0 1 0 0 1    0.1
1 0 1 0 1 0    0.1
1 0 1 0 1 1    0.5
1 0 1 1 0 0    0.1
1 0 1 1 0 1    0.1
1 0 1 1 1 0    0.1
1 0 1 1 1 1    0.9
1 1 0 0 0 0    0.5
1 1 0 0 0 1    0.5


1 1 0 0 1 0    0.1
1 1 0 0 1 1    0.5
1 1 0 1 0 0    0.5
1 1 0 1 0 1    0.5
1 1 0 1 1 0    0.5
1 1 0 1 1 1    0.9
1 1 1 0 0 0    0.9
1 1 1 0 0 1    0.9
1 1 1 0 1 0    0.1
1 1 1 0 1 1    0.5
1 1 1 1 0 0    0.9
1 1 1 1 0 1    0.9
1 1 1 1 1 0    0.1
1 1 1 1 1 1    0.9


Appendix 2.0: C Program Code

The following is a complete listing of the C code used to test the genetic programming algorithm for logic-based neural networks.

/* Genetic Programming Algorithm, written in C */
/* Program written by: */
/*     Vincent Gaudet */
/*     Department of Electrical and Computer Engineering */
/*     University of Manitoba */
/*     Winnipeg, Manitoba, Canada */
/*     1994 */
/* -------------------------------------------------------------------- */
/* This program implements J. R. Koza's Genetic Programming Paradigm. */
/* The program is designed to obtain logic-based neural network */
/* architectures. */
/* For this purpose, some modifications to the GPP were necessary: */
/*     - connection weights */
/*     - new crossover operator which crosses at the same level */
/*     - new mutation operator which mutates weights */
/*     - new method of generating individuals in initial population */
/* Individual chromosomes are coded in postfix notation in an array */

/* Include I/O and mathematical libraries */
#include <stdio.h>
#include <math.h>

/* Constant declarations */
#define DNA_length 60      /* length of chromosome */
#define POP_SIZE 300       /* population size */
#define MAX_GENS 300       /* number of generations */
#define NUM_FUNCTIONS 4    /* number of functions (non-terminals) */
#define NUM_TERMINALS 6    /* number of terminals */
#define NUM_WEIGHTS 6      /* number of weight intervals */
#define NUM_TRIALS 4       /* number of fitness trials */
#define MAX_LEVELS 5       /* maximum number of levels in trees */
#define NUM_CROSS_1 0      /* number of standard crossovers */
#define NUM_CROSS_2 200    /* number of crossovers at same level */
#define W_MUT_PROB 0.05    /* probability of weight mutation */
#define RAND_SEED 11       /* random number generator seed */
#define RAND_MAX 32767     /* random number generation constant */
#define MAX_HITS 64        /* maximum number of hits */
#define threshold 0.1      /* threshold for hits */

/* type declarations */
typedef struct node NODE;
typedef struct dna_string DNA_STRING;
typedef struct chromosome CHROMOSOME;
typedef struct function_node FUNCTION;
typedef struct population POPULATION;
typedef struct functions FUNCTIONS;
typedef struct weight_node WEIGHT;
typedef struct weights WEIGHTS;

struct node {


    int value;   /* terminal or function in node */
    int arity;   /* arity of function (terminal has arity 0) */
    int weight;  /* weight of connection to parent */
    int level;   /* level of node in tree */
};

struct dna_string {
    NODE DNA[DNA_length];  /* string of DNA */
};

struct chromosome {
    DNA_STRING DNA;     /* program */
    float fitness;      /* fitness of chromosome */
    float raw_fitness;  /* raw, pre-processed fitness */
    float cum_fitness;  /* cumulative normalized fitness */
    int hits;           /* hits, used when needed */
    int length;         /* length of DNA string */
};

struct function_node {
    char *name[5];            /* name of function or terminal */
    int value;                /* value of function or terminal in chromosome */
    int arity;                /* arity of function or terminal */
    float probs[MAX_LEVELS];  /* probability of getting chosen at level */
};

struct population {
    CHROMOSOME individuals[POP_SIZE];  /* individuals in population */
    float total_fitness;               /* total fitness of population */
    float aver_fitness;                /* average fitness of population */
};

struct functions {
    FUNCTION function_list[NUM_FUNCTIONS + NUM_TERMINALS];
};

struct weight_node {
    char *name[3];    /* name of weight */
    int value;        /* value of weight in chromosome */
    float lowbound;
    float highbound;  /* low and high bounds on weight values */
    float magnitude;  /* magnitude of weight */
};

struct weights {
    WEIGHT weight_list[NUM_WEIGHTS];
};

/* function & procedure declarations */
float float_rand();
int int_rand(int base, int size);
void create_functions(FUNCTIONS *function_array);
void create_weights(WEIGHTS *weight_array);
void get_test_vectors();
void create_population(POPULATION *curr_population);
void create_subtree(DNA_STRING *DNA, int *position, int *length, int level, int MAX_LENGTH);
FUNCTION choose_element(int level);
WEIGHT choose_weight();
void print_gen_report(POPULATION *curr_population, int generation, int *h_hits);
void print_population(POPULATION *curr_population, int generation);
void print_individual(CHROMOSOME *individual);
void print_DNA(DNA_STRING DNA, int length);
void evaluate_fitness(POPULATION *curr_population);
float eval_individual(DNA_STRING DNA, int length, int A, int B, int C, int D, int E, int F);


float min2(float A, float B);
float min3(float A, float B, float C);
float max2(float A, float B);
float max3(float A, float B, float C);
void calc_pop_fitness(POPULATION *population);
void generate_weights(WEIGHTS *weight_array);
int parent(DNA_STRING DNA, int location);
void create_new_population(POPULATION *curr_population);
int best_individual(POPULATION *pop);
int pick_individual(POPULATION *pop);
void crossover_1(CHROMOSOME *indiv1, CHROMOSOME *indiv2);
void crossover_2(CHROMOSOME *indiv1, CHROMOSOME *indiv2);
int count_same_level(CHROMOSOME *indiv, int level);
void mutate_weight(CHROMOSOME *old_indiv);

/* variable declarations */
static FUNCTIONS function_array;  /* list of functions and terminals */
static WEIGHTS weight_array;      /* list of weights */

FILE *ptr_infile;                   /* input file containing desired outputs */
static float des_output[MAX_HITS];  /* desired outputs */

main() {
    /* more variable declarations */
    int i;                              /* index */
    int generation = 0;                 /* generation number */
    int highest_hits;                   /* number of hits for best individual */
    CHROMOSOME best;                    /* best individual in population */
    static POPULATION curr_population;  /* current population */

    /* Initialize genetic programming algorithm */
    srand(RAND_SEED);                    /* set random number generator */
    ptr_infile = fopen("GPR6.in", "r");  /* open test vector file */
    get_test_vectors();                  /* read test vectors */
    create_functions(&function_array);   /* initialize functions and terminals */
    create_weights(&weight_array);       /* initialize weights */

    /* Create and evaluate initial random population of programs */
    create_population(&curr_population);
    evaluate_fitness(&curr_population);
    calc_pop_fitness(&curr_population);
    print_gen_report(&curr_population, generation, &highest_hits);

    /* genetic algorithm */
    for (generation = 1; (generation <= MAX_GENS) && (highest_hits != MAX_HITS); generation++) {
        create_new_population(&curr_population);
        evaluate_fitness(&curr_population);
        calc_pop_fitness(&curr_population);
        print_gen_report(&curr_population, generation, &highest_hits);
    }
}

/**************************************/
/* RANDOM NUMBER GENERATION FUNCTIONS */
/**************************************/

float float_rand()
/* Returns a random float in the interval [0, 1) */
{


    /* divide by RAND_MAX + 1 so the result is strictly less than 1 */
    return ((1.0 * rand()) / (RAND_MAX + 1.0));
}

int int_rand(int base, int size)
{
    /* Returns a random integer in the range [base, (base+size-1)] */
    /* dividing by RAND_MAX + 1 keeps the result below base + size */
    return ((int) (base + (1.0 * rand()) / (RAND_MAX + 1.0) * size));
}

/************************************************/
/* FUNCTIONS TO INITIALIZE LISTS OF POSSIBLE    */
/* FUNCTIONS, TERMINALS AND WEIGHTS             */
/* WARNING: PROBLEM-SPECIFIC                    */
/************************************************/

void create_functions(FUNCTIONS *function_array)
{
    /* Problem-specific; initializes the list of functions and terminals for */
    /* the genetic programming algorithm. */
    /* NOTE: This is a problem-specific function */
    /* In this particular case, the functions and terminals are those used */
    /* for the 6-input boolean-input, real-output [0, 1], boolean networks. */
    /* Possible functions are the 2 and 3-input AND functions (minimum) */
    /* and the 2 and 3-input OR functions (maximum). */
    /* Possible terminals are the 6 possible inputs, A, B, C, D, E and F. */

    /* 2-input AND */
    *(function_array->function_list[0].name) = "And2";
    function_array->function_list[0].value = 0;
    function_array->function_list[0].arity = 2;
    /* Probability of function or terminal being chosen at given level */
    function_array->function_list[0].probs[0] = 0.1;
    function_array->function_list[0].probs[1] = 0.2;
    function_array->function_list[0].probs[2] = 0.4;
    function_array->function_list[0].probs[3] = 0.2;
    function_array->function_list[0].probs[4] = -0.1;

    /* 3-input AND */
    *(function_array->function_list[1].name) = "And3";
    function_array->function_list[1].value = 1;
    function_array->function_list[1].arity = 3;
    function_array->function_list[1].probs[0] = 0.1;
    function_array->function_list[1].probs[1] = 0.2;
    function_array->function_list[1].probs[2] = 0.4;
    function_array->function_list[1].probs[3] = 0.2;
    function_array->function_list[1].probs[4] = 0.0;

    /* 2-input OR */
    *(function_array->function_list[2].name) = "Or2";
    function_array->function_list[2].value = 2;
    function_array->function_list[2].arity = 2;
    function_array->function_list[2].probs[0] = 0.4;
    function_array->function_list[2].probs[1] = 0.3;
    function_array->function_list[2].probs[2] = 0.05;
    function_array->function_list[2].probs[3] = 0.0;
    function_array->function_list[2].probs[4] = 0.0;

/* 3-input OR */


    *(function_array->function_list[3].name) = "Or3";
    function_array->function_list[3].value = 3;
    function_array->function_list[3].arity = 3;
    function_array->function_list[3].probs[0] = 0.4;
    function_array->function_list[3].probs[1] = 0.3;
    function_array->function_list[3].probs[2] = 0.05;
    function_array->function_list[3].probs[3] = 0.0;
    function_array->function_list[3].probs[4] = 0.0;

    /* Terminal A */
    *(function_array->function_list[4].name) = "A";
    function_array->function_list[4].value = 4;
    function_array->function_list[4].arity = 0;
    function_array->function_list[4].probs[0] = 0.0;
    function_array->function_list[4].probs[1] = 0.0;
    function_array->function_list[4].probs[2] = 0.01666666;
    function_array->function_list[4].probs[3] = 0.1;
    function_array->function_list[4].probs[4] = 0.26666666;  /* 0.2666 to ensure that a terminal is picked at the last level */

    /* Terminal B */
    *(function_array->function_list[5].name) = "B";
    function_array->function_list[5].value = 5;
    function_array->function_list[5].arity = 0;
    function_array->function_list[5].probs[0] = 0.0;
    function_array->function_list[5].probs[1] = 0.0;
    function_array->function_list[5].probs[2] = 0.01666667;
    function_array->function_list[5].probs[3] = 0.1;
    function_array->function_list[5].probs[4] = 0.16666667;

    /* Terminal C */
    *(function_array->function_list[6].name) = "C";
    function_array->function_list[6].value = 6;
    function_array->function_list[6].arity = 0;
    function_array->function_list[6].probs[0] = 0.0;
    function_array->function_list[6].probs[1] = 0.0;
    function_array->function_list[6].probs[2] = 0.01666667;
    function_array->function_list[6].probs[3] = 0.1;
    function_array->function_list[6].probs[4] = 0.16666667;

    /* Terminal D */
    *(function_array->function_list[7].name) = "D";
    function_array->function_list[7].value = 7;
    function_array->function_list[7].arity = 0;
    function_array->function_list[7].probs[0] = 0.0;
    function_array->function_list[7].probs[1] = 0.0;
    function_array->function_list[7].probs[2] = 0.01666666;
    function_array->function_list[7].probs[3] = 0.1;
    function_array->function_list[7].probs[4] = 0.16666666;

    /* Terminal E */
    *(function_array->function_list[8].name) = "E";
    function_array->function_list[8].value = 8;
    function_array->function_list[8].arity = 0;
    function_array->function_list[8].probs[0] = 0.0;
    function_array->function_list[8].probs[1] = 0.0;
    function_array->function_list[8].probs[2] = 0.01666667;
    function_array->function_list[8].probs[3] = 0.1;
    function_array->function_list[8].probs[4] = 0.16666667;


    /* Terminal F */
    *(function_array->function_list[9].name) = "F";
    function_array->function_list[9].value = 9;
    function_array->function_list[9].arity = 0;
    function_array->function_list[9].probs[0] = 1.0;
    function_array->function_list[9].probs[1] = 1.0;
    function_array->function_list[9].probs[2] = 1.01666667;
    function_array->function_list[9].probs[3] = 1.1;
    function_array->function_list[9].probs[4] = 1.16666667;
    /* large values to ensure that SOMETHING gets picked */
}

void create_weights(WEIGHTS *weight_array)
{
    /* Initializes the list of possible connection weights */
    /* Note: Problem-specific */
    /* In this case, the weights can be small (S), medium (M) or large (L). */
    /* The output can be positive or negated (NOT, i.e. a negative weight). */

    /* Large, negated */
    *(weight_array->weight_list[0].name) = "-L";
    weight_array->weight_list[0].value = 0;
    weight_array->weight_list[0].lowbound = 0.8;
    weight_array->weight_list[0].highbound = 1.0;

    /* Medium, negated */
    *(weight_array->weight_list[1].name) = "-M";
    weight_array->weight_list[1].value = 1;
    weight_array->weight_list[1].lowbound = 0.3;
    weight_array->weight_list[1].highbound = 0.7;

    /* Small, negated */
    *(weight_array->weight_list[2].name) = "-S";
    weight_array->weight_list[2].value = 2;
    weight_array->weight_list[2].lowbound = 0.0;
    weight_array->weight_list[2].highbound = 0.2;

    /* Small, non-negated */
    *(weight_array->weight_list[3].name) = "S";
    weight_array->weight_list[3].value = 3;
    weight_array->weight_list[3].lowbound = 0.0;
    weight_array->weight_list[3].highbound = 0.2;

    /* Medium, non-negated */
    *(weight_array->weight_list[4].name) = "M";
    weight_array->weight_list[4].value = 4;
    weight_array->weight_list[4].lowbound = 0.3;
    weight_array->weight_list[4].highbound = 0.7;

    /* Large, non-negated */
    *(weight_array->weight_list[5].name) = "L";
    weight_array->weight_list[5].value = 5;
    weight_array->weight_list[5].lowbound = 0.8;
    weight_array->weight_list[5].highbound = 1.0;
}

void get_test_vectors()


{
    /* Reads the test vectors from the input file. */
    /* The input file contains a list (one per line) of desired outputs */
    /* arranged in numerical order of inputs, e.g. output for 2 inputs in the */
    /* order (0, 0), (0, 1), (1, 0), (1, 1). */
    int i;  /* index */
    for (i = 0; i < MAX_HITS; i++) {
        fscanf(ptr_infile, "%f\n", &(des_output[i]));
    }
}

/*************************************************/
/* FUNCTIONS TO CREATE INITIAL RANDOM POPULATION */
/*************************************************/

void create_population(POPULATION *curr_population)
{
    /* Randomly creates the initial population; i.e. individuals (of */
    /* varying length) are randomly created until the population is full. */
    int i;         /* index */
    int length;    /* length of chromosome */
    int level;     /* level of node in tree; here it is always 0 */
    int position;  /* position of node on the chromosome */

    /* Create individuals one by one */
    for (i = 0; i < POP_SIZE; i++) {
        /* initialization */
        level = 0;
        length = 1;
        position = DNA_length - 1;
        curr_population->individuals[i].hits = 0;
        curr_population->individuals[i].fitness = 0;
        curr_population->individuals[i].cum_fitness = 0;
        /* Actually create individual */
        create_subtree(&(curr_population->individuals[i].DNA), &position, &length, level, DNA_length);
        curr_population->individuals[i].length = length;
    }
}

void create_subtree(DNA_STRING *DNA, int *position, int *length, int level, int MAX_LENGTH)
{
    /* Randomly creates a subtree of maximum size MAX_LENGTH at position */
    /* *position in the tree (chromosome). */
    FUNCTION element;  /* function to be added */
    WEIGHT weight;     /* weight to be added */
    int i;             /* index */
    int elem_loc;      /* location of element */

    /* Choose root of subtree */
    element = choose_element(level);
    /* Check so that length doesn't exceed MAX_LENGTH */
    while ((element.arity + *length) > MAX_LENGTH) {
        element = choose_element(level);
        if ((*length == MAX_LENGTH) && (element.arity > 0)) {
            elem_loc = int_rand(NUM_FUNCTIONS, NUM_TERMINALS);
            element = function_array.function_list[elem_loc];
        }
    }
    /* Choose connection weight from root to its parent */
    weight = choose_weight();


    /* recalculate appropriate fields */
    *length += element.arity;
    DNA->DNA[*position].value = element.value;
    DNA->DNA[*position].arity = element.arity;
    DNA->DNA[*position].weight = weight.value;
    DNA->DNA[*position].level = level;
    *position = *position - 1;
    for (i = 0; i < element.arity; i++) {
        /* create sub-subtrees */
        create_subtree(DNA, position, length, level + 1, MAX_LENGTH);
    }
}

FUNCTION choose_element(int level)
{
    /* Randomly chooses a function or terminal from the list. */
    float location = float_rand();  /* choose fcn or tml */
    int count = 0;                  /* Count to find integer location of fcn or tml */
    float total = 0.0;              /* total float location */

    /* Error checking */
    if (level >= MAX_LEVELS) printf("ERROR! Over maximum level: %i\n", level);
    /* Get to location */
    total += function_array.function_list[0].probs[level];
    while (total < location) {
        count++;
        total += function_array.function_list[count].probs[level];
    }
    /* Return appropriate fcn or tml */
    return (function_array.function_list[count]);
}

WEIGHT choose_weight()
{
    /* Randomly picks a connection weight from the list. */
    return (weight_array.weight_list[int_rand(0, NUM_WEIGHTS)]);
}

/**********************************/
/* FUNCTIONS TO PRINT OUT REPORTS */
/**********************************/

void print_gen_report(POPULATION *curr_population, int generation, int *h_hits)
{
    /* prints out population statistics and stats for the best individual in */
    /* population. */
    CHROMOSOME best_indiv;
    printf("Generation report, generation: %i\n", generation);
    printf("Average fitness in population: %f\n", curr_population->aver_fitness);
    printf("Best individual this generation: \n");
    best_indiv = curr_population->individuals[best_individual(curr_population)];
    *h_hits = best_indiv.hits;
    print_individual(&best_indiv);
}

void print_population(POPULATION *curr_population, int generation)
{
    /* prints out statistics for every individual in the population */


    int i;  /* index */
    printf("List of individuals in population, generation: %i\n", generation);
    printf("Average fitness in population: %f\n\n", curr_population->aver_fitness);
    for (i = 0; i < POP_SIZE; i++) {
        print_individual(&(curr_population->individuals[i]));
    }
    printf("\n\n");
}

void print_individual(CHROMOSOME *individual)
{
    /* prints out the statistics for the given individual. */
    printf("fitness: %f\n", individual->fitness);
    printf("hits: %i\n", individual->hits);
    printf("raw fitness: %f\n", individual->raw_fitness);
    printf("length: %i\n", individual->length);
    printf("DNA: ");
    print_DNA(individual->DNA, individual->length);
    printf("\n");
}

void print_DNA(DNA_STRING DNA, int length)
{
    int i;
    /* Prints the tree (individual/chromosome) in postfix notation */
    for (i = DNA_length - length; i < DNA_length; i++) {
        if ((DNA.DNA[i].value < 0) || (DNA.DNA[i].value > 9))
            printf("MAJOR ERROR!!!!!\n");
        printf("%s ", *(function_array.function_list[DNA.DNA[i].value].name));
        printf("(%s) ", *(weight_array.weight_list[DNA.DNA[i].weight].name));
    }
    printf("\n");
}

/************************************/
/* FUNCTIONS FOR EVALUATING FITNESS */
/* WARNING: PROBLEM-SPECIFIC        */
/************************************/

void evaluate_fitness(POPULATION *curr_population)
{
    /* Evaluates the raw fitness of every individual in the population. */
    /* Note: Problem-specific. In this case, raw fitness is the sum of */
    /* absolute values of differences between desired and actual outputs */
    /* for every possible combination of inputs, taken over the desired number */
    /* of trials (NUM_TRIALS). The type of problem used here is the 6-input */
    /* boolean-input, real-output boolean network, whose outputs are specified */
    /* by the input file. */
    int i;                       /* index */
    int A, B, C, D, E, F;        /* inputs */
    float value, desired_value;  /* Value returned by program, desired value */
    int trial, count;            /* trial no., count number of combinations of outputs */

    for (i = 0; i < POP_SIZE; i++) {
        /* initialization for every individual */
        curr_population->individuals[i].raw_fitness = 0.0;
        curr_population->individuals[i].hits = 0;
    }
    for (trial = 0; trial < NUM_TRIALS; trial++) {
        /* repeat trials */
        generate_weights(&weight_array);


        for (i = 0; i < POP_SIZE; i++) {
            /* Evaluate raw_fitness of given individual */
            count = 0;
            for (A = 0; A <= 1; A++) {
            for (B = 0; B <= 1; B++) {
            for (C = 0; C <= 1; C++) {
            for (D = 0; D <= 1; D++) {
            for (E = 0; E <= 1; E++) {
            for (F = 0; F <= 1; F++) {
                /* evaluate output from individual for given inputs */
                value = eval_individual(curr_population->individuals[i].DNA,
                                        curr_population->individuals[i].length,
                                        A, B, C, D, E, F);
                desired_value = des_output[count];
                /* add error to raw fitness */
                curr_population->individuals[i].raw_fitness += fabs(value - desired_value);
                if (fabs(value - desired_value) < threshold) {
                    curr_population->individuals[i].hits++;
                }
                count++;
            }}}}}}
        }
    }
}

float eval_individual(DNA_STRING DNA, int length, int A, int B, int C, int D, int E, int F)
{
    /* Returns the value returned by the program represented by the DNA */
    /* This function is problem-specific and needs to be rewritten for */
    /* different types of problems. Here, the implementation is for */
    /* the boolean network problem. */
    float stack[100];  /* stack to evaluate postfix expression */
    int tos = -1;      /* top of stack */
    int i;             /* index */
    int parent_val;    /* what is the parent node */

    /* evaluate every node in the tree */
    for (i = DNA_length - length; i < DNA_length; i++) {
        /* And2 */
        if (DNA.DNA[i].value == 0) {
            stack[tos-1] = min2(stack[tos-1], stack[tos]);
            tos--;
        }
        /* And3 */
        else if (DNA.DNA[i].value == 1) {
            stack[tos-2] = min3(stack[tos-2], stack[tos-1], stack[tos]);
            tos -= 2;
        }
        /* Or2 */
        else if (DNA.DNA[i].value == 2) {
            stack[tos-1] = max2(stack[tos-1], stack[tos]);
            tos--;
        }
        /* Or3 */
        else if (DNA.DNA[i].value == 3) {
            stack[tos-2] = max3(stack[tos-2], stack[tos-1], stack[tos]);
            tos -= 2;
        }
        /* Input A */
        else if (DNA.DNA[i].value == 4) {


            stack[++tos] = 1.0 * A;
        }
        /* Input B */
        else if (DNA.DNA[i].value == 5) {
            stack[++tos] = 1.0 * B;
        }
        /* Input C */
        else if (DNA.DNA[i].value == 6) {
            stack[++tos] = 1.0 * C;
        }
        /* Input D */
        else if (DNA.DNA[i].value == 7) {
            stack[++tos] = 1.0 * D;
        }
        /* Input E */
        else if (DNA.DNA[i].value == 8) {
            stack[++tos] = 1.0 * E;
        }
        /* Input F */
        else if (DNA.DNA[i].value == 9) {
            stack[++tos] = 1.0 * F;
        }
        /* Error checking */
        else printf("ERROR - invalid function value %i\n", DNA.DNA[i].value);

        if (i < DNA_length - 1) {
            /* Check parent and weight the output */
            parent_val = parent(DNA, i);
            if ((parent_val == 0) || (parent_val == 1)) {
                /* Do OR */
                stack[tos] = max2(stack[tos],
                                  weight_array.weight_list[DNA.DNA[i].weight].magnitude);
                if (DNA.DNA[i].weight <= 2) stack[tos] = 1.0 - stack[tos];
            }
            else {
                /* Do AND */
                stack[tos] = min2(stack[tos],
                                  weight_array.weight_list[DNA.DNA[i].weight].magnitude);
                if (DNA.DNA[i].weight <= 2) stack[tos] = 1.0 - stack[tos];
            }
        }
    }
    /* Return output from root node */
    return (stack[tos]);
}

float min2(float A, float B)
{
    /* Returns the minimum of (A, B) */
    if (A < B) {
        return (A);
    }
    else {
        return (B);
    }
}

float min3(float A, float B, float C)
{
    /* Returns the minimum of (A, B, C) */


    return (min2(min2(A, B), C));
}

float max2(float A, float B)
{
    /* Returns the maximum of (A, B) */
    if (A > B) {
        return (A);
    }
    else {
        return (B);
    }
}

float max3(float A, float B, float C)
{
    /* Returns the maximum of (A, B, C) */
    return (max2(max2(A, B), C));
}

void calc_pop_fitness(POPULATION *population)
/* Calculates the total fitness of the population, as well */
/* as fitness and normalized cumulative fitness for every individual in */
/* the population. */
{
    int i;                  /* index */
    float cum_fitness = 0;  /* normalized cumulative fitness */

    population->total_fitness = 0;  /* total fitness of population */
    for (i = 0; i < POP_SIZE; i++) {
        /* calculate individual fitnesses */
        population->individuals[i].raw_fitness /= (1.0 * NUM_TRIALS);
        population->individuals[i].hits =
            (int) (population->individuals[i].hits / (1.0 * NUM_TRIALS) + 0.5);
        population->individuals[i].fitness =
            1.0 / (0.001 + population->individuals[i].raw_fitness);
        population->total_fitness += population->individuals[i].fitness;
    }
    /* calculate normalized cumulative fitness */
    for (i = 0; i < POP_SIZE; i++) {
        population->individuals[i].cum_fitness =
            (cum_fitness += (population->individuals[i].fitness / population->total_fitness));
    }
    population->aver_fitness = population->total_fitness / (1.0 * POP_SIZE);
}

void generate_weights(WEIGHTS *weight_array)
{
    /* Generates the magnitude of the S, M & L weights */
    int i;  /* index */
    for (i = 0; i < 3; i++) {
        weight_array->weight_list[i].magnitude =
            weight_array->weight_list[i].lowbound +
            float_rand() * (weight_array->weight_list[i].highbound -
                            weight_array->weight_list[i].lowbound);
        weight_array->weight_list[5-i].magnitude =
            weight_array->weight_list[i].magnitude;
    }
}


int parent(DNA_STRING DNA, int location)
{
    /* Returns the value of the node DNA.DNA[location]'s parent */
    int count = 0;
    int parent_loc = location;
    do {
        parent_loc++;
        count++;
        count -= DNA.DNA[parent_loc].arity;
        /* Error checking */
        if (parent_loc >= DNA_length) printf("Error, no parent %i\n", parent_loc);
    } while (count > 0);
    return (DNA.DNA[parent_loc].value);
}

/********************************************************/
/* FUNCTIONS TO CREATE NEW POPULATION GIVEN OLD ONE     */
/* USES OPERATIONS OF SELECTION, CROSSOVER AND MUTATION */
/********************************************************/

void create_new_population(POPULATION *curr_population)
{
    /* Via selection, crossover and mutation, creates a new population. */
    /* Includes elitism. */
    static POPULATION new_population;  /* new population */
    int i;                             /* index */
    static CHROMOSOME indiv1, indiv2;  /* 2 individuals chosen for crossover */

    /* elitism */
    new_population.individuals[0] =
        curr_population->individuals[best_individual(curr_population)];

    /* crossover 1 */
    for (i = 1; i <= NUM_CROSS_1; i++) {
        indiv1 = curr_population->individuals[pick_individual(curr_population)];
        indiv2 = curr_population->individuals[pick_individual(curr_population)];
        crossover_1(&indiv1, &indiv2);
        new_population.individuals[i++] = indiv1;
        new_population.individuals[i] = indiv2;
    }
    /* crossover 2 */
    for ( ; i <= (NUM_CROSS_1 + NUM_CROSS_2); i++) {
        indiv1 = curr_population->individuals[pick_individual(curr_population)];
        indiv2 = curr_population->individuals[pick_individual(curr_population)];
        crossover_2(&indiv1, &indiv2);
        new_population.individuals[i++] = indiv1;
        new_population.individuals[i] = indiv2;
    }
    /* selection */
    for ( ; i < POP_SIZE; i++) {
        new_population.individuals[i] =
            curr_population->individuals[pick_individual(curr_population)];
        if (float_rand() < W_MUT_PROB) {
            mutate_weight(&(new_population.individuals[i]));
        }
    }
    /* replace old population with new one */
    *curr_population = new_population;
}


int best_individual(POPULATION *pop)
/* returns the location within the population of the individual with */
/* the highest fitness */
{
    float highest_fit = pop->individuals[0].fitness;  /* highest fitness */
    int location = 0;  /* location of individual with highest fitness */
    int i;             /* index */
    for (i = 1; i < POP_SIZE; i++) {
        if (pop->individuals[i].fitness > highest_fit) {
            location = i;
            highest_fit = pop->individuals[i].fitness;
        }
    }
    return (location);
}

int pick_individual(POPULATION *pop)
{
    /* Picks an individual (randomly, with highly fit individuals having a higher */
    /* probability of getting chosen than unfit individuals). */
    float c_fit_location = float_rand();  /* normalized cumulative fitness of individual chosen */
    int location = 0;                     /* location of chosen individual */
    while (c_fit_location > pop->individuals[location].cum_fitness) location++;
    /* Error checking */
    if ((location >= POP_SIZE) || (location < 0)) location = 0;
    return (location);
}

void crossover_1(CHROMOSOME *indiv1, CHROMOSOME *indiv2)
{
    /* Performs standard GP crossover, as described by Koza */
    int count, i;                                /* indices */
    int r_point1, r_point2, l_point1, l_point2;  /* crossover points */
    static CHROMOSOME new_indiv1, new_indiv2;    /* new individuals created by crossover */

    do {
        /* Pick crossover points */
        r_point1 = int_rand(DNA_length - indiv1->length, indiv1->length);
        r_point2 = int_rand(DNA_length - indiv2->length, indiv2->length);
        /* get l_points */
        count = indiv1->DNA.DNA[r_point1].arity;
        l_point1 = r_point1;
        while (count-- > 0) {
            l_point1--;
            count += indiv1->DNA.DNA[l_point1].arity;
        }
        count = indiv2->DNA.DNA[r_point2].arity;
        l_point2 = r_point2;
        while (count-- > 0) {
            l_point2--;
            count += indiv2->DNA.DNA[l_point2].arity;
        }
    } while (((indiv1->length + (r_point2 - l_point2 - r_point1 + l_point1)) > DNA_length)
          || ((indiv2->length + (r_point1 - l_point1 - r_point2 + l_point2)) > DNA_length));

    /* Crossover */
    /* create first individual */
    new_indiv1.length = indiv1->length + (r_point2 - l_point2 - r_point1 + l_point1);
    new_indiv1.fitness = 0;
    new_indiv1.hits = 0;


  new_indiv1.cum_fitness = 0;
  for (i = DNA_length - 1; i > r_point1; i--) {
    new_indiv1.DNA.DNA[i] = indiv1->DNA.DNA[i];
  }
  for (i = r_point1; i >= (r_point1 - r_point2 + l_point2); i--) {
    new_indiv1.DNA.DNA[i] = indiv2->DNA.DNA[r_point2 + i - r_point1];
    new_indiv1.DNA.DNA[i].level -= (indiv2->DNA.DNA[r_point2].level -
                                    indiv1->DNA.DNA[r_point1].level);
  }
  for (i = (r_point1 - r_point2 + l_point2 - 1); i >= (DNA_length - new_indiv1.length); i--) {
    new_indiv1.DNA.DNA[i] = indiv1->DNA.DNA[l_point1 + i - r_point1 + r_point2 - l_point2];
  }

  /* create second individual */
  new_indiv2.length = indiv2->length + (r_point1 - l_point1 - r_point2 + l_point2);
  new_indiv2.fitness = 0;
  new_indiv2.hits = 0;
  new_indiv2.cum_fitness = 0;
  for (i = DNA_length - 1; i > r_point2; i--) {
    new_indiv2.DNA.DNA[i] = indiv2->DNA.DNA[i];
  }
  for (i = r_point2; i >= (r_point2 - r_point1 + l_point1); i--) {
    new_indiv2.DNA.DNA[i] = indiv1->DNA.DNA[r_point1 + i - r_point2];
    new_indiv2.DNA.DNA[i].level -= (indiv1->DNA.DNA[r_point1].level -
                                    indiv2->DNA.DNA[r_point2].level);
  }
  for (i = (r_point2 - r_point1 + l_point1 - 1); i >= (DNA_length - new_indiv2.length); i--) {
    new_indiv2.DNA.DNA[i] = indiv2->DNA.DNA[l_point2 + i - r_point2 + r_point1 - l_point1];
  }

  /* replace old individuals with new ones */
  *indiv1 = new_indiv1;
  *indiv2 = new_indiv2;
}
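Both crossover routines locate the left boundary of the subtree rooted at a crossover point with the same leftward arity scan: starting from the chosen node, the count of still-unaccounted children is decremented for each node passed and incremented by that node's own arity. The hedged sketch below (not from the thesis; array layout and names are illustrative) isolates that scan for a tree stored in postfix order, i.e. children before their parent, as the `DNA` array appears to be:

```c
/* Sketch of the arity scan used in crossover_1/crossover_2: nodes are
 * stored in postfix order, so the subtree rooted at index r_point
 * occupies indices [subtree_start(arity, r_point), r_point]. */
int subtree_start(const int *arity, int r_point)
{
  int count = arity[r_point]; /* children still to account for */
  int l_point = r_point;
  while (count-- > 0) {
    l_point--;
    count += arity[l_point];  /* each child brings its own children */
  }
  return l_point;
}
```

For the postfix expression `x y AND z OR` (arities 0, 0, 2, 0, 2), the subtree rooted at the `AND` node (index 2) starts at index 0, and a leaf's subtree is just the leaf itself.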

void crossover_2(CHROMOSOME *indiv1, CHROMOSOME *indiv2)
{
/* Performs crossover at the same level */
/* This is essentially the same crossover as crossover_1, only that */
/* the 2 crossover points are at the same level. */
  int count, i;                                /* indices */
  int r_point1, r_point2, l_point1, l_point2;  /* crossover points */
  int r_point_loc;                             /* used to determine r_point2 */
  static CHROMOSOME new_indiv1, new_indiv2;    /* new individuals created by crossover */
  int num_same_level = 0; /* number of points in other tree with same level */

  do {
    /* Pick crossover points at same level */
    do {
      do {
        r_point1 = int_rand(DNA_length - indiv1->length, indiv1->length);
        num_same_level = count_same_level(indiv2, indiv1->DNA.DNA[r_point1].level);
      } while (num_same_level <= 0);
      r_point_loc = int_rand(0, num_same_level);
      r_point2 = get_r_point(indiv2, r_point_loc, indiv1->DNA.DNA[r_point1].level);
    } while (r_point2 >= DNA_length);

    /* get l_points */
    count = indiv1->DNA.DNA[r_point1].arity;
    l_point1 = r_point1;
    while (count-- > 0) {
      l_point1--;


      count += indiv1->DNA.DNA[l_point1].arity;
    }

    count = indiv2->DNA.DNA[r_point2].arity;
    l_point2 = r_point2;
    while (count-- > 0) {
      l_point2--;
      count += indiv2->DNA.DNA[l_point2].arity;
    }
  } while (((indiv1->length + (r_point2 - l_point2 - r_point1 + l_point1)) > DNA_length)
        || ((indiv2->length + (r_point1 - l_point1 - r_point2 + l_point2)) > DNA_length));

  /* Crossover */
  /* create first individual */
  new_indiv1.length = indiv1->length + (r_point2 - l_point2 - r_point1 + l_point1);
  new_indiv1.fitness = 0;
  new_indiv1.hits = 0;
  new_indiv1.cum_fitness = 0;
  for (i = DNA_length - 1; i > r_point1; i--) {
    new_indiv1.DNA.DNA[i] = indiv1->DNA.DNA[i];
  }
  for (i = r_point1; i >= (r_point1 - r_point2 + l_point2); i--) {
    new_indiv1.DNA.DNA[i] = indiv2->DNA.DNA[r_point2 + i - r_point1];
  }
  for (i = (r_point1 - r_point2 + l_point2 - 1); i >= (DNA_length - new_indiv1.length); i--) {
    new_indiv1.DNA.DNA[i] = indiv1->DNA.DNA[l_point1 + i - r_point1 + r_point2 - l_point2];
  }

  /* create second individual */
  new_indiv2.length = indiv2->length + (r_point1 - l_point1 - r_point2 + l_point2);
  new_indiv2.fitness = 0;
  new_indiv2.hits = 0;
  new_indiv2.cum_fitness = 0;
  for (i = DNA_length - 1; i > r_point2; i--) {
    new_indiv2.DNA.DNA[i] = indiv2->DNA.DNA[i];
  }
  for (i = r_point2; i >= (r_point2 - r_point1 + l_point1); i--) {
    new_indiv2.DNA.DNA[i] = indiv1->DNA.DNA[r_point1 + i - r_point2];
  }
  for (i = (r_point2 - r_point1 + l_point1 - 1); i >= (DNA_length - new_indiv2.length); i--) {
    new_indiv2.DNA.DNA[i] = indiv2->DNA.DNA[l_point2 + i - r_point2 + r_point1 - l_point1];
  }

  /* replace old individuals with new ones */
  *indiv1 = new_indiv1;
  *indiv2 = new_indiv2;
}

int count_same_level(CHROMOSOME *indiv, int level)
/* returns the number of nodes in indiv->DNA at the given level */
{
  int count = 0; /* initialize count to 0 */
  int i;         /* index */

  for (i = DNA_length - indiv->length; i < DNA_length; i++) {
    if (indiv->DNA.DNA[i].level == level) count++;
  }
  return (count);
}

void mutate_weight(CHROMOSOME *old_indiv)
{
/* Picks a weight in the chromosome and replaces it with an adjacent weight */


/* Adjacency is determined by the cycle (-L, -M, -S, S, M, L, -L) but */
/* could be modified by adding a link from -M to M. */
  int location;   /* location of weight in chromosome */
  int new_weight; /* value of new weight */

  location = int_rand(DNA_length - old_indiv->length, old_indiv->length);
  if (int_rand(0, 2) == 0) {
    new_weight = old_indiv->DNA.DNA[location].weight - 1;
  }
  else {
    new_weight = old_indiv->DNA.DNA[location].weight + 1;
  }
  if (new_weight == -1) new_weight = NUM_WEIGHTS - 1;
  if (new_weight == NUM_WEIGHTS) new_weight = 0;
  old_indiv->DNA.DNA[location].weight = new_weight;
}
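The wraparound logic in mutate_weight treats the six weight values as a cycle, so stepping down from -L lands on L and stepping up from L lands on -L. This hedged sketch (not from the thesis; NUM_WEIGHTS = 6 is assumed from the six-value cycle in the comment above) isolates the step-and-wrap arithmetic:

```c
/* Sketch of mutate_weight's cyclic adjacency: weights are indices
 * 0..NUM_WEIGHTS-1 standing for (-L, -M, -S, S, M, L), and a mutation
 * steps one position around the cycle, wrapping at both ends. */
#define NUM_WEIGHTS 6

int step_weight(int weight, int direction) /* direction is +1 or -1 */
{
  int new_weight = weight + direction;
  if (new_weight == -1) new_weight = NUM_WEIGHTS - 1; /* -L wraps to L */
  if (new_weight == NUM_WEIGHTS) new_weight = 0;      /* L wraps to -L */
  return new_weight;
}
```

Because each mutation moves only one step along the cycle, a weight's sign and magnitude change gradually rather than jumping to an arbitrary value.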