A Self-Organizing Fuzzy Neural Network
H. S. LIN, X. Z. GAO, XIANLIN HUANG, AND Z. Y. SONG
Abstract
This paper proposes a novel clustering algorithm for the structure learning of fuzzy neural networks. Our clustering algorithm uses a reward-and-penalty mechanism to adapt the fuzzy neural network prototypes at every training sample. Compared with classical clustering algorithms, the new algorithm can partition the input data on-line, update the clusters pointwise, and self-organize the fuzzy neural structure. All rules are self-created, and they grow automatically with more incoming data. There are no conflicting rules in the created fuzzy neural networks. Our approach shows that supervised clustering algorithms can be suitable for the structure learning of self-organizing fuzzy neural networks. The identification of several typical nonlinear dynamical systems and the prediction of time-series data are employed to demonstrate the effectiveness of the proposed fuzzy neural networks and their learning algorithm.
Introduction
It is well known that fuzzy logic provides human reasoning capabilities to capture uncertainties that cannot be described by precise mathematical models. Neural networks offer remarkable advantages, such as adaptive learning, parallelism, fault tolerance, and generalization. They have proven to be powerful techniques in the discipline of system control, especially when the controlled system is difficult to model accurately, or has large uncertainties and strong nonlinearities.
Therefore, fuzzy logic and neural networks have been widely adopted in model-free adaptive control of nonlinear systems. Numerous kinds of neural fuzzy systems have been proposed in the literature, and most of them are suitable only for off-line cases. Some on-line learning methods for neural fuzzy systems have been studied as well.
In this paper, we propose a novel on-line clustering algorithm for the structure learning of our fuzzy neural networks. This new clustering algorithm employs the reward-and-penalty mechanism used in Learning Vector Quantization (LVQ). Our fuzzy neural network, with its on-line structure and parameter learning capability, is a suitable candidate for real-time applications due to its fast convergence.
Structure of Fuzzy Neural Networks
[Figure: Five-layer network architecture with inputs x1, x2, outputs y1, y2, and desired outputs y1d, y2d.]
Layer 1: Each node in this layer only transmits input values to the next layer directly. Thus, the function of the ith node is defined as
f = u_i^(1) = x_i,  a = f.
Layer 2: Each node in this layer corresponds to one linguistic label of one of the input variables in Layer 1. The operation performed in this layer is
f = -(u_i^(2) - m_ij)^2 / sigma_ij^2,  a = e^f.
Layer 3: Nodes in this layer are rule nodes and constitute the antecedents of the fuzzy rule base. The input and output functions of the ith rule node are
f = prod_{i=1..n} u_i^(3),  a = f.
Layer 4: The nodes in this layer are called "output-term nodes". The links in Layer 4 perform the fuzzy OR operation on rules that have the same consequent:
f = sum_{j=1..J} u_j^(4),  a = min(1, f).
Layer 5: These nodes and the attached Layer-5 links act as the defuzzifier. The following functions perform the Center Of Area (COA) defuzzification method:
f = sum_ij w_ij^(5) u_ij^(5) = sum_ij (m_ij sigma_ij) u_ij^(5),  a = f / sum_ij sigma_ij u_ij^(5).
Based on the above structure, an on-line learning algorithm will be proposed to determine the proper centers and widths of the term nodes.
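The five layers above can be sketched as a single forward pass. This is a minimal illustration, not the authors' implementation: the variable names, the data layout, and the assumption that each rule owns its own output term are ours.

```python
import numpy as np

def forward(x, Im, Isigma, rules, Om, Osigma):
    """One forward pass through the five-layer fuzzy neural network.

    Illustrative sketch (names are ours, not from the paper):
      Im, Isigma : (n_inputs, n_terms) centers/widths of input-term Gaussians
      rules      : list of index arrays, one input-term index per input variable
      Om, Osigma : (n_rules,) centers/widths of the output-term nodes
    """
    # Layer 1: pass the inputs through unchanged.
    a1 = np.asarray(x, dtype=float)
    # Layer 2: Gaussian membership a = exp(-(x_i - m_ij)^2 / sigma_ij^2).
    a2 = np.exp(-((a1[:, None] - Im) ** 2) / Isigma ** 2)
    # Layer 3: each rule node takes the product of its antecedent memberships.
    a3 = np.array([np.prod(a2[np.arange(len(a1)), r]) for r in rules])
    # Layer 4: fuzzy OR as a bounded sum (identity here, since each rule
    # feeds its own output term).
    a4 = np.minimum(1.0, a3)
    # Layer 5: centre-of-area defuzzification.
    return np.sum(Om * Osigma * a4) / np.sum(Osigma * a4)
```

With two inputs, two terms per input, and two rules, the output is simply the sigma-weighted average of the output-term centers.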
Learning Algorithm for Structure Identification
In our fuzzy neural networks, for every on-line incoming training pattern, we first use the novel clustering algorithm to identify the structure, and then apply the backpropagation algorithm to optimize the parameters. In our learning method, only the training data is needed. The input/output-term nodes and rule nodes are created dynamically as learning proceeds upon receiving on-line incoming training data. During the learning process, new input-term, output-term, and rule nodes will be added.
The main idea of our clustering algorithm is that, for every input datum, we first find the winner clusters in the input and output spaces, respectively. Next, as in fuzzy ARTMAP, we check whether the winner cluster in the input space is connected to the winner cluster in the output space. If so, we assume that the winner cluster in the output space is the correct prediction of the winner cluster in the input space, analogous to the way the fuzzy ARTb category activated by an input is the correct prediction of the fuzzy ARTa category activated by that input in fuzzy ARTMAP.
If not, we assume that a mismatch occurs between the winner cluster in the input space and the winner cluster in the output space, and we begin to search for another cluster in the input space that will match the winner cluster in the output space. The reward-and-penalty mechanism is employed in our clustering algorithm. We can describe the novel clustering algorithm as follows.
Step 1: Initialize the fuzzy system with zero clusters: nI = 0, nO = 0.
Step 2: The first input and output training vectors are selected as the centers of the first clusters in the input and output spaces, respectively. We connect the first cluster in the input space to the first cluster in the output space, and set the number of data belonging to each cluster to one.
Step 3: For an input data point in the input space, we compute the distances between the input vector and the existing input space clusters using the Euclidean metric function:
d_p = sum_{i=1..q} (x_i - Im_pi)^2,  p = 1, ..., nI.
The nearest cluster (winner neuron) j is chosen by selecting the minimum d_j. Next, the following algorithms are used:
If d_j is larger than a certain value d_vigilance, we assume that this input datum does not belong to any existing cluster, and we form a new cluster; the newly added cluster is the winner cluster in the input space. If d_j is smaller than d_vigilance, cluster j is the winner cluster in the input space. The procedure of Step 3 in the input space is also adopted in the output space, so we can likewise find the winner cluster in the output space.
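Step 3 can be sketched as follows. This is a minimal illustration of the winner search with the vigilance test; the function and variable names (`find_winner`, `d_vigilance`) are ours, not the paper's.

```python
import numpy as np

def find_winner(x, centers, d_vigilance):
    """Find the winner cluster for sample x; create a new cluster when the
    minimum squared Euclidean distance exceeds d_vigilance (sketch of Step 3).

    Returns (winner_index, created_new_cluster)."""
    x = np.asarray(x, dtype=float)
    if len(centers) == 0:
        centers.append(x.copy())          # first sample seeds the first cluster
        return 0, True
    # Squared Euclidean distance to every existing cluster center.
    d = [np.sum((x - m) ** 2) for m in centers]
    j = int(np.argmin(d))
    if d[j] > d_vigilance:
        centers.append(x.copy())          # new cluster becomes the winner
        return len(centers) - 1, True
    return j, False
```

The same routine is run in the input space and in the output space, yielding the two winner clusters compared in Step 4.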
Step 4: We check the mapping process from input clusters to the output clusters.
(1). If the winner cluster in the input space is a new cluster, we connect this new cluster to the winner cluster in the output space, and update the parameters of the winner cluster in the output space:
eph_Om_winner = (Om_winner × Oc_winner + y) / (Oc_winner + 1)
(2). If the winner cluster of the input space is already connected to the winner cluster of the output space, we adopt the following algorithm to update the centers, variances, and counters of the winner clusters (the output-space updates are shown; the input-space winner is updated in the same way, with x in place of y):
eph_Om_winner = (Om_winner × Oc_winner + y) / (Oc_winner + 1)
Osigma²_winner = [Oc_winner × (Osigma²_winner + Om²_winner) + y²] / (Oc_winner + 1) − eph_Om²_winner
Om_winner = eph_Om_winner
Oc_winner = Oc_winner + 1
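The reward update above is the standard incremental mean/variance recursion. A scalar sketch (the function name and scalar form are ours; the paper applies this per coordinate):

```python
def reward(m, var, c, y):
    """Reward update of case (2): fold sample y into the winner cluster's
    running mean, variance, and counter."""
    m_new = (m * c + y) / (c + 1)                               # updated center
    # Recover E[y^2] from the old stats, add the new sample, subtract m_new^2.
    var_new = (c * (var + m * m) + y * y) / (c + 1) - m_new * m_new
    return m_new, var_new, c + 1
```

For example, a cluster holding the single sample 1 that is rewarded with sample 3 ends with mean 2 and (population) variance 1, which is exactly the variance of {1, 3}.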
(3). If the winner cluster of the input space is not yet connected to the winner cluster of the output space, we use the following algorithm to punish the winner cluster of the input space, removing the current sample's influence and pushing the cluster away from x:
eph_Im_winner = (Im_winner × Ic_winner − x) / (Ic_winner − 1)
Isigma²_winner = [Ic_winner × (Isigma²_winner + Im²_winner) − x²] / (Ic_winner − 1) − eph_Im²_winner
Im_winner = eph_Im_winner
Ic_winner = Ic_winner − 1
After that, we return to Step 3 to search for another cluster in the input space that matches the winner cluster in the output space. The input cluster that does match is rewarded:
eph_Im_winner = (Im_winner × Ic_winner + x) / (Ic_winner + 1)
Isigma²_winner = [Ic_winner × (Isigma²_winner + Im²_winner) + x²] / (Ic_winner + 1) − eph_Im²_winner
Im_winner = eph_Im_winner
Ic_winner = Ic_winner + 1
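The penalty step is the exact inverse of the reward recursion: it deletes one sample's contribution, so the center moves away from x and the counter decreases. A scalar sketch with our own names:

```python
def penalize(m, var, c, x):
    """Penalty update of case (3): remove sample x's influence from the
    wrongly-winning cluster, pushing its center away from x."""
    m_new = (m * c - x) / (c - 1)                               # center moves away
    # Reverse of the reward recursion on the running second moment.
    var_new = (c * (var + m * m) - x * x) / (c - 1) - m_new * m_new
    return m_new, var_new, c - 1
```

Penalizing a cluster with statistics (mean 2, variance 1, count 2) by the sample 3 restores (mean 1, variance 0, count 1), i.e. it undoes the reward of that same sample.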
Our structure learning algorithm is actually a supervised clustering method for identifying the structure of the fuzzy neural networks. Supervised clustering algorithms are effective and converge quickly. Furthermore, our fuzzy neural network has a remarkable self-learning ability: it can self-generate fuzzy rules and self-adapt its structure and synaptic weights. Note that there are no conflicting rules in the generated fuzzy neural networks. In summary, the proposed structure learning strategy provides a new way to utilize a class of supervised clustering algorithms for the on-line structure learning of fuzzy neural networks.
Parameter Learning of Fuzzy Neural Networks
We use the backpropagation algorithm to tune the parameters of the fuzzy neural networks.
The centers and variances of the cluster in Layer 5 are updated by
Om_k(t+1) = Om_k(t) + η × [y_d(t) − y(t)] × Osigma_k(t) f_k^(4)(t) / sum_j Osigma_j(t) f_j^(4)(t)
Osigma_k(t+1) = Osigma_k(t) + η × [y_d(t) − y(t)] × f_k^(4)(t) × [Om_k(t) − y(t)] / sum_j Osigma_j(t) f_j^(4)(t)
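The Layer-5 step can be sketched in vectorized form. This is our compact rendering of the COA gradient, assuming the output y = sum(Om·Osig·a4) / sum(Osig·a4); the function and argument names are ours.

```python
import numpy as np

def update_output_params(Om, Osig, a4, y_d, y, eta):
    """One gradient step on the Layer-5 centers Om and widths Osig for the
    centre-of-area output y = sum(Om*Osig*a4)/sum(Osig*a4) (sketch)."""
    err = y_d - y                      # desired minus actual output
    b = np.sum(Osig * a4)              # COA denominator
    dOm = err * (Osig * a4) / b        # dy/dOm_k = Osig_k*a4_k / b
    dOsig = err * a4 * (Om - y) / b    # dy/dOsig_k = a4_k*(Om_k - y) / b
    return Om + eta * dOm, Osig + eta * dOsig
```

Both gradients are computed from the pre-update parameters before either is applied, so the two updates use a consistent linearization point.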
The centers and variances of the cluster in Layer 2 are updated by
Im_ij(t+1) = Im_ij(t) + η × error_j(t) × f_j^(3)(t) × 2(x_i − Im_ij(t)) / (Isigma_ij(t))²
Isigma_ij(t+1) = Isigma_ij(t) + η × error_j(t) × f_j^(3)(t) × 2(x_i − Im_ij(t))² / (Isigma_ij(t))³
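A scalar sketch of one such membership update, assuming the Gaussian membership exp(-(x_i - Im)^2 / Isig^2) and a backpropagated rule error; the names are ours.

```python
def update_membership(Im, Isig, x_i, err_j, f3_j, eta):
    """One gradient step on a Gaussian membership (center Im, width Isig)
    of input x_i feeding rule j, following the Layer-2 update rules."""
    # Center moves toward x_i in proportion to the rule firing strength f3_j.
    Im_new = Im + eta * err_j * f3_j * 2.0 * (x_i - Im) / Isig ** 2
    # Width grows when the sample is far from the center.
    Isig_new = Isig + eta * err_j * f3_j * 2.0 * (x_i - Im) ** 2 / Isig ** 3
    return Im_new, Isig_new
```

In practice this step is applied to every antecedent membership of every fired rule for each incoming sample.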
Simulations
Example 1 — Identification of an SISO dynamic system: The plant to be identified is described by the following difference equation
y(t+1) = y(t) / (1 + y²(t)) + u³(t).
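The plant of Example 1 is easy to simulate for generating identification data. A minimal sketch (function name and the zero initial condition are our choices):

```python
def siso_plant(u_seq, y0=0.0):
    """Simulate the Example-1 plant y(t+1) = y(t)/(1 + y(t)^2) + u(t)^3
    for a given input sequence, returning the full state trajectory."""
    ys = [y0]
    for u in u_seq:
        y = ys[-1]
        ys.append(y / (1.0 + y * y) + u ** 3)  # one step of the difference equation
    return ys
```

Feeding such trajectories to the network as (y(t), u(t)) -> y(t+1) pairs yields the on-line training patterns used in this example.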
[Figure: Root-mean-square errors during learning.]
[Figure: Outputs of the SISO system and the identification model.]
Example 2 — Identification of an MISO dynamic system: the plant to be identified is a two-input, one-output system described by the following equation:
y = (x₁² − x₂)² + (1 − x₁)²
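As reconstructed here, the Example-2 map is the classic Rosenbrock function, which makes the training data trivial to generate. A minimal sketch (the function name is ours):

```python
def miso_plant(x1, x2):
    """Example-2 static two-input, one-output map, as reconstructed here:
    y = (x1^2 - x2)^2 + (1 - x1)^2 (the Rosenbrock function)."""
    return (x1 ** 2 - x2) ** 2 + (1.0 - x1) ** 2
```

The map attains its minimum y = 0 at (x1, x2) = (1, 1), so identification quality is easiest to judge near that point.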
[Figure: Outputs of the MISO system and the identification model.]
Example 3 — Identification of an MIMO dynamic system: the plant is described by
y₁(t+1) = y₁(t) / (1 + y₂²(t)) + u₁(t)
y₂(t+1) = y₁(t) y₂(t) / (1 + y₂²(t)) + u₂(t)
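One step of the MIMO plant above can be sketched directly (the function name is ours):

```python
def mimo_plant(y1, y2, u1, u2):
    """One step of the Example-3 MIMO plant:
    y1(t+1) = y1(t)/(1 + y2(t)^2) + u1(t)
    y2(t+1) = y1(t)*y2(t)/(1 + y2(t)^2) + u2(t)."""
    den = 1.0 + y2 * y2          # shared denominator of both channels
    return y1 / den + u1, y1 * y2 / den + u2
```

Iterating this step with chosen input sequences u1(t), u2(t) produces the two-channel trajectories compared against the identification model below.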
[Figure: Response y1 of the MIMO system and the identification model.]
[Figure: Response y2 of the MIMO system and the identification model.]
Example 4 — Prediction of time series: The performance of our fuzzy neural networks in dealing with real-world prediction problems is demonstrated here by predicting the time series of an automobile gearbox. This slightly nonlinear time series was collected using a sound level meter, and the analyzed acoustic data were provided by a Japanese automobile manufacturer.
It represents the external sound level of an automatic transmission system. Below is a brief summary of the experimental measurement conditions and the instrumentation used: ① 1000 r/min rotational velocity of the automobile engine; ② 5 Nm load torque; ③ integrated sound level meter (Ono Sokki LA-5110); ④ 5 ms sampling period.
[Figure: Actual and predicted time series of the gearbox.]
Conclusions
In this paper, a novel clustering algorithm is proposed for the structure learning of fuzzy neural networks. This clustering algorithm can partition the input data on-line and self-organize the fuzzy neural structure. Therefore, no a priori knowledge of the distribution of the input data is needed for the initialization of fuzzy rules; they are generated automatically with the incoming training data.
Our fuzzy neural networks can use this on-line training algorithm for both structure and parameter training. The effectiveness of the proposed learning algorithm is verified by the identification of nonlinear dynamical systems and the prediction of time series.