Intelligent Database Systems Lab
國立雲林科技大學 National Yunlin University of Science and Technology
Self-Organizing Potential Field Network:A New Optimization Algorithm
Lu Xu and Tommy Wai Shing Chow
TNN, Vol.21 2010, pp. 1482–1495
Presenter : Wei-Shen Tai
2010/10/20
N.Y.U.S.T.
I. M.
Outline
Introduction
Background
SOMA
Self-Organizing Potential Field Network
Simulations and results
Analysis of the SOPFN algorithm
Conclusion
Comments
Motivation
In most optimization algorithms:
Individuals learn only from the best candidate solution, even if it is far from the global optimum.
They explore a larger search space, but at the expense of convergence rate.
Objective
A new optimization algorithm
Each candidate solution can effectively reach the optimum with a small search space and low computational complexity.
Background
Self-organizing migrating algorithm (SOMA)
Updates every individual through a "migration loop" to generate a series of candidate solutions.
Particle swarm optimization (PSO)
At each time step, every particle moves toward the best position found among all particles' previous positions.
Self-organizing and self-evolving neurons (SOSEN)
Each neuron evolves using simulated annealing (SA) and cooperates with other neurons through a self-organizing operator.
The search space is enlarged by multiple neurons to enhance the convergence rate for finding the optimum.
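Of the three baselines above, PSO has the most widely known update rule. As a point of reference, a minimal sketch of textbook global-best PSO (a generic implementation, not code from any of the cited papers; all parameter values are common defaults):

```python
import random

def pso(f, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best PSO minimizing f over [-5, 5]^dim."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # each particle's best-known position
    gbest = min(pbest, key=f)[:]             # best position among all particles
    for _ in range(iters):
        for i in range(n_particles):
            for k in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia + pull toward own best + pull toward the swarm's best
                vel[i][k] = (w * vel[i][k]
                             + c1 * r1 * (pbest[i][k] - pos[i][k])
                             + c2 * r2 * (gbest[k] - pos[i][k]))
                pos[i][k] += vel[i][k]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest
```

This illustrates the motivation slide's point: every particle is drawn toward the single global best `gbest`, which can mislead the swarm when that position is far from the global optimum.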
Self-organizing potential field strategy
The cooperative behavior is a self-organizing procedure in which the neurons within the winning neuron's neighborhood are trained.
The competitive behavior models the network as a potential field, similar to the vector potential field used in mobile robot navigation.
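For context on the analogy, the mobile-robot potential field combines an attractive force toward a goal with a repulsive force away from obstacles. A minimal sketch of the standard robotics formulation (this is the classic artificial-potential-field method, not the paper's equations; gains `k_att`, `k_rep` and influence radius `rho0` are illustrative):

```python
import math

def potential_step(pos, goal, obstacle, k_att=1.0, k_rep=0.5, rho0=2.0, step=0.1):
    """One gradient-descent step on a combined attractive/repulsive potential."""
    # Attractive force: pulls pos linearly toward the goal
    fx = -k_att * (pos[0] - goal[0])
    fy = -k_att * (pos[1] - goal[1])
    # Repulsive force: active only within distance rho0 of the obstacle
    dx, dy = pos[0] - obstacle[0], pos[1] - obstacle[1]
    rho = math.hypot(dx, dy)
    if 0 < rho < rho0:
        mag = k_rep * (1.0 / rho - 1.0 / rho0) / rho ** 2
        fx += mag * dx / rho
        fy += mag * dy / rho
    return (pos[0] + step * fx, pos[1] + step * fy)
```

In SOPFN the same intuition is transferred to weight space: the target neuron plays the role of the goal and the obstacle neuron the role of the obstacle.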
Self-organizing potential field network algorithm
1. Initialization: randomize the initial weights of the M × N neurons.
2. Construction of the potential field: determine the target neuron and the obstacle neuron.
3. 1-D weight updating: for every neuron i, randomly choose an integer k ∈ [1, D] and update the k-th weight.
4. Self-adaptation: reassign the target neuron c and obstacle neuron r.
5. Stop when the stopping criteria are satisfied; otherwise, go to step 3.
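The five steps can be sketched as a loop. Note this is an illustrative reconstruction, not the authors' implementation: the attraction/repulsion update rule, the coefficients `alpha` and `beta`, and the choice of best/worst neurons as target/obstacle are our assumptions based on the slide's description:

```python
import random

def sopfn(f, dim, rows=5, cols=5, iters=500, alpha=0.5, beta=0.3, bound=5.0):
    """Illustrative SOPFN-style loop (assumed update rules, not the paper's).
    Each neuron's weight vector is a candidate solution; the best neuron acts
    as the target (attractive) and the worst as the obstacle (repulsive)."""
    n = rows * cols
    # Step 1: randomize the initial weights of the M x N neurons
    w = [[random.uniform(-bound, bound) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        # Steps 2 and 4: (re)assign target neuron c and obstacle neuron r
        fitness = [f(wi) for wi in w]
        c = min(range(n), key=lambda i: fitness[i])   # target neuron (best)
        r = max(range(n), key=lambda i: fitness[i])   # obstacle neuron (worst)
        # Step 3: 1-D weight updating on one randomly chosen dimension k
        for i in range(n):
            if i == c:
                continue                               # keep the target neuron fixed
            k = random.randrange(dim)
            attract = alpha * random.random() * (w[c][k] - w[i][k])
            repel = -beta * random.random() * (w[r][k] - w[i][k])
            w[i][k] += attract + repel
    # Step 5 (simplified): stop after a fixed iteration budget
    return min(w, key=f)
```

Because the current best neuron is never perturbed, the best fitness in the population is non-increasing across generations in this sketch.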
Cooperative and Competitive Behaviors of SOPFN
Simulations and results
Analysis of SOPFN algorithm
Conclusion
SOPFN
A new evolutionary algorithm that models the search space as a self-organizing potential field.
In the competitive behavior, the target and obstacle neurons are identified to speed up convergence and increase the probability of escaping from local optima.
In the cooperative behavior, the winner's neighboring neurons are updated to generate new weights at each generation.
Comments
Advantage
The proposed model can effectively find the optimum with low computational complexity and a high convergence speed.
The search space is constrained within a fixed neural network, and the self-organizing potential field strategy makes the candidate solutions more abundant.
Drawback
The map size is a crucial factor in determining the search space and computational complexity. Nevertheless, the paper does not compare the performance of different map sizes.
Application
Optimization problems