
Page 1: Artificial Neural Network Topology

Artificial Neural Network Topology

JMHM Jayamaha SEU/IS/10/PS/104

PS0372

Page 2: Artificial Neural Network Topology

Contents

Artificial Neural Network
Feed-forward neural networks
Neural Network Architectures
Single layer feedforward network
Multilayer feedforward network
Recurrent network
Summary
References

Page 3: Artificial Neural Network Topology

Artificial Neural Network (ANN)

An artificial neural network is defined as a data processing system consisting of a large number of simple, highly interconnected processing elements (artificial neurons) in an architecture inspired by the structure of the cerebral cortex of the brain (Tsoukalas and Uhrig, 1997).

Page 4: Artificial Neural Network Topology

Neural Network Architectures

Generally, an ANN structure can be represented using a directed graph. A graph G is an ordered 2-tuple (V, E) consisting of a set V of vertices and a set E of edges. When each edge is assigned an orientation, the graph is directed and is called a directed graph, or digraph.

There are several classes of neural networks, often classified according to their learning mechanisms. Based on architecture, however, we identify three fundamentally different classes of networks:

Single layer feedforward network
Multilayer feedforward network
Recurrent network

All three classes employ the digraph structure for their representation, as illustrated in the sketch below.
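As a minimal sketch (not from the original slides), the digraph view can be made concrete by storing the vertices (neurons) and the directed, weighted edges (synaptic links) explicitly; the neuron names and weights used here are illustrative assumptions.

```python
# Minimal sketch: an ANN topology as a weighted digraph G = (V, E).
# Vertices are neuron ids; each directed edge (u, v) carries a synaptic weight.

graph = {
    "vertices": ["x1", "x2", "y1"],        # two input neurons, one output neuron
    "edges": {                             # orientation: input -> output only
        ("x1", "y1"): 0.5,                 # weight on the link x1 -> y1
        ("x2", "y1"): -0.3,                # weight on the link x2 -> y1
    },
}

def successors(graph, u):
    """Return the vertices reachable from u along directed edges."""
    return [v for (src, v) in graph["edges"] if src == u]

print(successors(graph, "x1"))  # ['y1'] -- acyclic: no edge leads back to x1
```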

Page 5: Artificial Neural Network Topology

Feed-forward neural networks

These are the commonest type of neural network in practical applications. The first layer is the input and the last layer is the output. If there is more than one hidden layer, we call them "deep" neural networks.

They compute a series of transformations that change the similarities between cases.

Page 6: Artificial Neural Network Topology

Single Layer Feedforward Network

This type of network comprises two layers, namely the input layer and the output layer.
The input layer neurons receive the input signals and the output layer neurons produce the output signals.
The synaptic links carrying the weights connect every input neuron to the output neurons, but not vice versa.
Such a network is said to be feedforward in type, or acyclic in nature.
Despite the two layers, the network is termed single layer since the output layer alone performs computation.
The input layer merely transmits the signals to the output layer; hence the name single layer feedforward network. A minimal sketch of this computation follows.
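A minimal sketch of the single layer computation, assuming a threshold activation at the output neurons; the weights and inputs here are illustrative, not taken from the slides.

```python
# Minimal sketch: single layer feedforward network.
# The input layer only passes signals on; each output neuron computes
# a weighted sum of the inputs followed by an activation function.

def step(u):
    """Threshold activation used by the output neurons."""
    return 1.0 if u >= 0.0 else 0.0

def single_layer_forward(x, W):
    """x: list of input signals; W[j][i]: weight from input i to output neuron j."""
    return [step(sum(w_ji * x_i for w_ji, x_i in zip(row, x))) for row in W]

# Two inputs, two output neurons (weights chosen arbitrarily for illustration).
W = [[0.5, -0.3],
     [0.2,  0.8]]
print(single_layer_forward([1.0, 0.5], W))  # [1.0, 1.0]
```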

Page 7: Artificial Neural Network Topology

Figure: an example single layer feedforward network.

Page 8: Artificial Neural Network Topology

Multilayer Feedforward Network

This network, as its name indicates, is made up of multiple layers.
Architectures of this class, besides possessing an input and an output layer, also have one or more intermediary layers called hidden layers.
The computational units of a hidden layer are known as hidden neurons or hidden units.
The hidden layers aid in performing useful intermediary computation before directing the input to the output layer.
The input layer neurons are linked to the hidden layer neurons, and the weights on these links are referred to as input-hidden layer weights.
Again, the hidden layer neurons are linked to the output layer neurons, and the corresponding weights are referred to as hidden-output layer weights.

Page 9: Artificial Neural Network Topology

Multilayer Feedforward Network (con’t)

A multilayer feedforward network with l input neurons, m1 neurons in the first hidden layer, m2 neurons in the second hidden layer, and n output neurons in the output layer is written as l – m1 – m2 – n. A minimal sketch of the forward pass of such a network follows.
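A minimal sketch of an l – m1 – m2 – n network's forward pass, assuming a sigmoid activation at each computational layer; the layer sizes and random weights are illustrative assumptions.

```python
# Minimal sketch: forward pass of an l - m1 - m2 - n multilayer feedforward network.
import numpy as np

def sigmoid(u):
    """Nonlinear activation applied by the hidden and output neurons."""
    return 1.0 / (1.0 + np.exp(-u))

def forward(x, weights):
    """Propagate input x through each layer's weight matrix in turn."""
    a = x
    for W in weights:          # W has shape (next_layer_size, current_layer_size)
        a = sigmoid(W @ a)     # input-hidden, hidden-hidden, then hidden-output weights
    return a

# A 3 - 4 - 4 - 2 network (l=3, m1=4, m2=4, n=2) with random illustrative weights.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 3)),
           rng.standard_normal((4, 4)),
           rng.standard_normal((2, 4))]
print(forward(np.array([0.2, -0.1, 0.7]), weights))  # two output activations
```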

Page 10: Artificial Neural Network Topology

Multilayer Feedforward Network (con’t)

Applications:
Pattern classification
Pattern matching
Function approximation (any nonlinear mapping is possible with nonlinear processing elements)

Page 11: Artificial Neural Network Topology

Multi-layer Networks: Issues

How many layers are sufficient?
How many processing elements are needed in each hidden layer?
How much data is needed for training?

Page 12: Artificial Neural Network Topology

Examples of Multi-layer NNs

Backpropagation

Neocognitron

Probabilistic NN (radial basis function NN)

Cauchy machine

Radial basis function networks

Page 13: Artificial Neural Network Topology

Recurrent Network

These networks differ from the feedforward architecture in the sense that there is at least one feedback loop.
They are not necessarily stable; symmetric connections can ensure stability.

Why use recurrent networks?
They can learn temporal patterns (time series or oscillations).
They are biologically realistic: the majority of connections to neurons in the cerebral cortex are feedback connections from local or distant neurons.

Examples
Hopfield network
Boltzmann machine (a Hopfield-like net with input and output units)
Recurrent backpropagation networks: for small sequences, unfold the network in the time dimension and use backpropagation learning.
A minimal sketch of a Hopfield-style update follows.
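A minimal sketch of the feedback idea using a Hopfield-style update, assuming bipolar (+1/−1) neuron states and a symmetric weight matrix with zero diagonal; the stored pattern and weights are illustrative assumptions.

```python
# Minimal sketch: one asynchronous update sweep of a Hopfield-style recurrent network.
# Every neuron feeds back to every other neuron through a symmetric weight matrix.
import numpy as np

def hopfield_sweep(state, W):
    """Update each bipolar neuron in turn from the weighted feedback of the others."""
    s = state.copy()
    for i in range(len(s)):
        s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Store the pattern [1, -1, 1] with a Hebbian outer-product rule (zero diagonal).
pattern = np.array([1, -1, 1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)

noisy = np.array([1, 1, 1])       # corrupted version of the stored pattern
print(hopfield_sweep(noisy, W))   # feedback settles back to [ 1 -1  1]
```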

Page 14: Artificial Neural Network Topology

Recurrent Networks (con’t)

Example: Elman networks
Partially recurrent
Context units keep an internal memory of past inputs
Fixed context weights
Backpropagation is used for learning
E.g. can disambiguate ABC and CBA

Figure: an Elman network. A minimal sketch of its forward step follows.
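A minimal sketch of an Elman-style forward step, assuming a tanh hidden layer whose activations are copied into the context units after each input; all sizes and weights here are illustrative assumptions.

```python
# Minimal sketch: Elman-style recurrent step. The context units hold a copy of the
# previous hidden activations, giving the network an internal memory of past inputs.
import numpy as np

def elman_step(x, context, W_in, W_ctx, W_out):
    """One time step: the hidden state depends on the current input and the context."""
    hidden = np.tanh(W_in @ x + W_ctx @ context)   # context feeds back through fixed weights
    output = np.tanh(W_out @ hidden)
    return output, hidden                          # hidden becomes the next context

rng = np.random.default_rng(1)
W_in, W_ctx, W_out = (rng.standard_normal((4, 3)),
                      rng.standard_normal((4, 4)),
                      rng.standard_normal((2, 4)))

context = np.zeros(4)
for x in [np.array([1.0, 0.0, 0.0]),   # "A"
          np.array([0.0, 1.0, 0.0]),   # "B"
          np.array([0.0, 0.0, 1.0])]:  # "C"
    y, context = elman_step(x, context, W_in, W_ctx, W_out)
print(y)  # final output after the sequence A, B, C; feeding C, B, A instead gives a different result
```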

Page 15: Artificial Neural Network Topology

Recurrent Network (con’t)

Advantages
Unlike feedforward neural networks, RNNs can use their internal memory to process arbitrary sequences of inputs. This makes them applicable to tasks such as unsegmented connected handwriting recognition or speech recognition.
They have the ability to remember information in their hidden state for a long time, but it is very hard to train them to use this potential.

Disadvantages
Most RNNs used to have scaling issues: they could not be easily trained for large numbers of neuron units nor for large numbers of input units.

Page 16: Artificial Neural Network Topology

Summary

Artificial Neural Network
Feed-forward neural networks
Neural network topologies:
Single layer feedforward network
Multilayer feedforward network
Recurrent network

Page 17: Artificial Neural Network Topology

References

Neural Networks, Fuzzy Logic, and Genetic Algorithms: Synthesis and Applications by S. Rajasekaran and G. A. Vijayalakshmi Pai

Neural Networks: A Comprehensive Foundation by Simon Haykin (second edition)

https://en.wikipedia.org/wiki/Recurrent_neural_network (accessed 2016-05-14)
https://class.coursera.org/neuralnets-2012-001/lecture

Page 18: Artificial Neural Network Topology

Thank you…….