Application of Intelligent Computing to Wireless Sensor Networks

Eun-Ae Park, Kheon-Hee Lee, Dong-Chul Park, and Soo-Young Min


Abstract— An efficient method for selecting cluster heads in wireless sensor networks using an intelligent computing algorithm is evaluated with a large data set in this paper. The CHS-CNN (Cluster Head Selection using Centroid Neural Network) finds cluster heads optimally by using the Centroid Neural Network (CNN). The CHS-CNN takes advantage of both the CNN and LEACH to minimize battery (energy) consumption in sensor nodes. The CHS-CNN is compared with another intelligent computing method, the Self-Organizing Map (SOM). Initial results show that the CHS-CNN can be effective in terms of the total energy consumption of the network and the cluster head selection speed.

Keywords— sensor networks, cluster head, intelligent computing, centroid neural network, LEACH, SOM.

I. INTRODUCTION

A wireless sensor network (WSN) is a physical system with a few to hundreds or thousands of sensor nodes. Since each node in the network connects with one or several other nodes, a WSN can have various topologies [1]-[4]. Each sensor node in a WSN collects various kinds of information such as temperature, sound, pressure, and remaining battery life. The collected information is then transmitted to a base station over a limited data bandwidth. With thousands of nodes, this task consumes a large amount of energy and can overload the bandwidth when many nodes communicate at the same time. Therefore, an effective routing protocol for wireless sensor networks is required. Various routing protocols have been proposed for wireless networks [5]-[7]. One of the main goals of these protocols is minimizing the energy consumption at each sensor. Among the routing protocols proposed for WSNs, LEACH (Low Energy Adaptive Clustering Hierarchy) [1] can successfully minimize the energy consumption required for communication and thereby improve the lifetime of a wireless sensor network by selecting the cluster heads stochastically. Although LEACH has meaningful advantages, there is room for improvement in system lifetime and system performance capacity because the cluster selection process in LEACH is far from optimal. This issue can be addressed by using computational intelligence methods such as the Self-Organizing Map (SOM) or the Centroid Neural Network (CNN) [8]-[11].

The rest of this paper is organized as follows: Section II summarizes LEACH and the CNN. A cluster head selection method based on the CNN is summarized in Section III. Section IV presents experiments and results on several data sets for the evaluation of the CNN-based method. Section V concludes this paper.

Eun-Ae Park, Kheon-Hee Lee, and Dong-Chul Park are with the Department of Electronics, Myongji University, YongIn, Rep. of KOREA (phone: +82-31-330-6756, fax: +82-31-330-6977, e-mail: [email protected]).
Soo-Young Min is with the Software Device Research Center at the Korea Electronics Technology Institute, SongNam, Rep. of KOREA (e-mail: [email protected]).

II. RELATED WORKS

A. LEACH (Low-energy adaptive clustering hierarchy)

LEACH assumes that sensor nodes have enough power to transmit data to the base station (BS). LEACH also assumes that each node always has data to send. LEACH divides the nodes into groups, and each group has one node designated as the cluster head (CH). The CH collects data from its neighboring member nodes and then sends the collected data to the BS. For selecting CHs, LEACH uses a stochastic process that ensures all nodes are selected with equal probability. Once all CHs are chosen, they broadcast an advertisement message to the other, non-CH nodes to let them know which nodes are the CHs. More details on LEACH can be found in [1].
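The paper does not reproduce the selection rule itself; as an illustration, a minimal sketch of the standard LEACH threshold test from [1] is given below, assuming a desired CH fraction p and that nodes which have already served as CH in the current rotation cycle are excluded (function and variable names are ours).

import random

def leach_is_cluster_head(node_id, round_no, p, recent_chs):
    """Decide whether a node elects itself CH in this round (LEACH threshold test, per [1]).

    p          : desired fraction of cluster heads per round (e.g. 0.05)
    round_no   : current round index r
    recent_chs : set of node ids that served as CH in the last 1/p rounds
    """
    if node_id in recent_chs:
        return False                        # already served in this rotation cycle
    # Threshold T(n) = p / (1 - p * (r mod 1/p)) for eligible nodes
    threshold = p / (1.0 - p * (round_no % int(round(1.0 / p))))
    return random.random() < threshold      # elect itself CH with probability T(n)

Over one full rotation cycle of 1/p rounds, every node becomes CH exactly once in expectation, which is the "equal probability" property mentioned above.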

B. CNN (Centroid Neural Network)

The CNN algorithm [9] is an unsupervised competitive learning algorithm. It finds the centers of clusters optimally at each presentation of a data vector. The CNN first introduces the definitions of the winner neuron and the loser neuron. When a data vector x is presented to the network at epoch k, the winner neuron at epoch k is the neuron with the minimum distance to x. The loser neuron at epoch k for x is the neuron that was the winner of x at epoch k-1 but is not the winner of x at epoch k. The CNN updates its weights only when the status of the output neuron for the presented data has changed compared with its status in the previous epoch. The objective function of the CNN is the sum, over all data, of the distances between each data vector and the center of the group to which it belongs.
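Written out, with M clusters, cluster sets C_j, and weight (centroid) vectors w_j, this objective takes the usual centroid-clustering form (stated here with the squared Euclidean distance commonly used for centroid clustering; the exact formulation is given in [9]):

$$ J = \sum_{j=1}^{M} \sum_{\mathbf{x}_i \in C_j} \lVert \mathbf{x}_i - \mathbf{w}_j \rVert^{2} $$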

The CNN has several advantages over conventional algorithms such as the SOM or the k-means algorithm when used for clustering and unsupervised competitive learning. The CNN requires neither a predetermined schedule for the learning gain nor a preset total number of iterations for clustering. It always converges to sub-optimal solutions, whereas conventional algorithms such as the SOM may give unstable results depending on the initial learning gains and the total number of iterations. A more detailed description of the CNN can be found in [9]-[11].
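As an illustration only, the sketch below follows the winner/loser description above with centroid-style incremental updates in the spirit of [9]; the initialization and stopping rule are our own simplifications, not the exact algorithm of [9].

import numpy as np

def cnn_cluster(data, n_clusters, max_epochs=100):
    """Minimal Centroid Neural Network style clustering sketch (after [9])."""
    data = np.asarray(data, dtype=float)
    rng = np.random.default_rng(0)
    weights = data[rng.choice(len(data), n_clusters, replace=False)].copy()
    counts = np.zeros(n_clusters, dtype=int)          # current members per cluster
    assign = np.full(len(data), -1)                   # previous winner of each vector

    for _ in range(max_epochs):
        changed = False
        for i, x in enumerate(data):
            winner = int(np.argmin(((weights - x) ** 2).sum(axis=1)))
            loser = assign[i]
            if winner == loser:
                continue                              # status unchanged: no weight update
            changed = True
            # winner gains x: its centroid moves toward x
            counts[winner] += 1
            weights[winner] += (x - weights[winner]) / counts[winner]
            # loser (previous winner) releases x: its centroid moves away from x
            if loser >= 0:
                counts[loser] -= 1
                if counts[loser] > 0:
                    weights[loser] -= (x - weights[loser]) / counts[loser]
            assign[i] = winner
        if not changed:                               # no status changes: converged
            break
    return weights, assign

The incremental updates keep each weight vector at the exact centroid of its current members, which is why no learning-gain schedule is needed.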




III. CHS-CNN (CLUSTER HEAD SELECTION USING CENTROID NEURAL NETWORK)

Although LEACH has some advantages, there is room for improvement because its grouping is not optimal and rather random. LEACH ignores the physical locations of the nodes when forming groups and selecting cluster heads. Thus, the CHs chosen by LEACH may demand more communication energy from their cluster members than CHs located at the center positions would. Fig. 1 shows examples of this problem. To overcome it, the CNN is applied in the cluster selection process. From the first round, nodes at the cluster centers are chosen as CHs, so the CHS-CNN consumes less system energy than LEACH from the very first round. The CHS-CNN ensures that the energy is optimized locally in each group and thus decreases the energy consumption of the whole network, rather than randomly changing the clusters as in LEACH [8].
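The paper does not detail the selection step beyond choosing nodes at the cluster centers; the sketch below shows one plausible reading, assuming the node closest to each CNN cluster centroid is appointed CH. It reuses the cnn_cluster sketch from Section II, and all names are ours.

import numpy as np

def select_cluster_heads(node_positions, n_clusters):
    """Pick one CH per cluster: the node nearest to its CNN cluster centroid.

    node_positions : (N, 2) array of sensor (x, y) coordinates
    Returns the indices of the chosen cluster head nodes.
    """
    node_positions = np.asarray(node_positions, dtype=float)
    centroids, assignment = cnn_cluster(node_positions, n_clusters)  # sketch in Section II
    heads = []
    for j, c in enumerate(centroids):
        members = np.where(assignment == j)[0]
        if len(members) == 0:
            continue                                   # empty cluster: no CH needed
        dists = np.linalg.norm(node_positions[members] - c, axis=1)
        heads.append(int(members[np.argmin(dists)]))   # member closest to the centroid
    return heads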

IV. EXPERIMENTS AND RESULTS

For the experiments, the same environment setup as in LEACH is used. We randomly generate 1,000 sensor nodes located at positions (x, y) with 0 ≤ x ≤ 200 and 0 ≤ y ≤ 200, and the BS is located at (100, 350). Each data message is 500 bytes sent over 1 Mb/s of bandwidth, and the header of each packet is 25 bytes. The communication energy cost of transmitting a message of l bits over a distance d is evaluated as follows:

$$E_{Tx}(l,d) = \begin{cases} l\,E_{elec} + l\,\epsilon_{fs}\,d^{2}, & d < d_{0} \\ l\,E_{elec} + l\,\epsilon_{mp}\,d^{4}, & d \ge d_{0} \end{cases} \qquad (1)$$

and for a received message:

$$E_{Rx}(l) = l\,E_{elec} \qquad (2)$$

where $d_{0}$ is the threshold distance, $\epsilon_{fs}$ and $\epsilon_{mp}$ are the amplifier energies with respect to the free space model and the multipath model, respectively, and $E_{elec}$ is the electronics energy. In our experiments, these parameters are set as $E_{elec} = 50$ nJ/bit, $\epsilon_{mp} = 0.0013$ pJ/bit/m$^{4}$, $\epsilon_{fs} = 10$ pJ/bit/m$^{2}$, and $d_{0} = 86.202$ m. These parameters are also used in [1].
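For concreteness, the first-order radio model above can be written as a small sketch with the stated parameter values converted to joules (the function names are ours):

E_ELEC = 50e-9        # electronics energy, 50 nJ/bit
EPS_FS = 10e-12       # free space amplifier energy, 10 pJ/bit/m^2
EPS_MP = 0.0013e-12   # multipath amplifier energy, 0.0013 pJ/bit/m^4
D0 = 86.202           # threshold distance in meters, as stated above

def tx_energy(l_bits, d):
    """Energy in joules to transmit l_bits over distance d, Eq. (1)."""
    if d < D0:
        return l_bits * E_ELEC + l_bits * EPS_FS * d ** 2
    return l_bits * E_ELEC + l_bits * EPS_MP * d ** 4

def rx_energy(l_bits):
    """Energy in joules to receive l_bits, Eq. (2)."""
    return l_bits * E_ELEC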

Fig. 1 shows the locations of the CHs and the grouping information for LEACH, EBCS (Energy Based Clustering using Self-organizing map), and the CHS-CNN on one instance of the data set. In Fig. 1, the selected CHs are shown as red dots. Light blue dots represent nodes that have been selected as CHs in previous rounds. Blue dots are nodes that have never been selected as CHs in previous rounds and are the candidates for future CHs. The total distances from the chosen CHs to the nodes in their groups in these instances are 11,423.8, 11,598.4, and 8,862.7 for LEACH, EBCS, and the CHS-CNN, respectively. This clearly shows the difference in the communication energies required by the different protocols. Table I summarizes the average numbers of rounds for different cases: first death, half death, and last death. These criteria are very important in evaluating different routing protocols. Table I shows that the CHS-CNN compares favorably with the other algorithms in terms of node deaths. Similar results are reported in [8] for smaller-scale problems.

Fig. 1: Examples of cluster head locations: (a) LEACH, (b) EBCS, and (c) CHS-CNN (red dots: current CHs, light blue dots: past CHs, blue dots: never been CHs).

In order to evaluate the speed of cluster head selection in EBCS and the CHS-CNN, the CPU times are measured under the following computing environment:
- CPU: Intel(R) Core(TM) i5-2400, 3.1 GHz,
- RAM size: 2 GB, OS: Windows 7 Enterprise, 64-bit.
Table II shows the clustering time required by the different protocols. Note that LEACH does not require clustering time because it selects CHs randomly. On average, EBCS requires 5,399.1 ms while the CHS-CNN finds its clusters in 1,620.0 ms. This clearly shows that the CHS-CNN reduces the clustering time by about 70% compared to EBCS. When the number of nodes is of practical size, this effect is even more pronounced, and this saving in selection time makes the CNN-based protocol very valuable.


TABLE I
AVERAGE NUMBERS OF DEATH ROUNDS FOR DIFFERENT CASES

Protocol  | First death | Half death | Last death
LEACH     | 101.1       | 111.7      | 139.0
EBCS      | 282.2       | 575.5      | 601.8
CHS-CNN   | 139.0       | 601.8      | 1,018.5

TABLE II
AVERAGE CPU TIME (ms) REQUIRED FOR CLUSTERING

Data      | Random  | Grid    | Gaussian
EBCS      | 4,993.6 | 5,412.8 | 5,790.9
CHS-CNN   | 1,488.6 | 1,583.2 | 1,788.3

V. CONCLUSIONS

A wireless sensor network protocol based on the centroid neural network is evaluated in this paper. The CHS-CNN protocol finds clusters of sensor nodes optimally by adopting the Centroid Neural Network and follows the cluster head selection scheme of the LEACH algorithm. Experiments are performed on the example platform used for the LEACH algorithm in order to evaluate the performance of the CNN-based algorithm. Several different sensor network maps with 2,000 nodes are designed for the experiments, and the number of nodes alive after a given period is evaluated. The results show that the CHS-CNN achieves performance comparable to EBCS and considerably improved performance over LEACH. The computing speeds for the selection of cluster heads in each round are evaluated for the CHS-CNN and EBCS. The results show that the CHS-CNN can reduce the clustering time required for the selection of cluster heads by about 70%. From the experiments and results, we can conclude that the CHS-CNN can be more effective than conventional LEACH in terms of the total energy consumption of the network and the system lifetime, while the CHS-CNN can find its cluster heads about three times faster than EBCS.

ACKNOWLEDGMENT

This work was supported by the IT R&D program of the MKE/KEIT (10040191, The development of Automotive Synchronous Ethernet combined IVN/OVN and Safety control system for 1 Gbps class).

REFERENCES

[1] W. Heinzelman, A. Chandrakasan, and H. Balakrishnan, "An application-specific protocol architecture for wireless microsensor networks," IEEE Trans. on Wireless Communications, pp. 660-670, 2002.
[2] M.A.A. Kashani and H. Ziafat, "A method for reduction of energy consumption in Wireless Sensor Network with using Neural Networks," Proc. of IEEE ICCIT Conference, 6: 476-481, 2011.
[3] M. Cordina and C.J. Debono, "Increasing Wireless Sensor Network Lifetime through the Application of SOM Neural Networks," ISCCSP, IEEE, Malta, pp. 467-481, May 2000.
[4] N. Enami, R.A. Moghadam, and K.D. Ahmadi, "A new neural network based energy efficient clustering protocol for Wireless Sensor Networks," Proc. of IEEE ICCIT Conference, 5: 40-45, 2010.
[5] M. Sharma and K. Sharma, "An Energy Efficient Extended LEACH," Proc. IEEE CSNT Conference, pp. 377-382, May 2012.
[6] Asaduzzaman and Hyung-Yun Kong, "Energy Efficient Cooperative LEACH Protocol for Wireless Sensor Networks," IEEE Trans. on Communications and Networks, Vol. 12, No. 4, pp. 358-365, Aug. 2010.
[7] C.P. Subha, S. Malarkan, and K. Vaithinathan, "A Survey on Energy Efficient Neural Network Based Clustering Models in Wireless Sensor Networks," Proc. IEEE ICEVENT Conference, pp. 1-6, Jan. 2013.
[8] E. Park, K. Lee, D. Park, and S. Min, "Centroid Neural Networks for Wireless Sensor Networks," Proc. of WCSN, Dec. 2013.
[9] Dong-Chul Park, "Centroid Neural Network for Unsupervised Competitive Learning," IEEE Trans. on Neural Networks, 11(2): 520-528, May 2000.
[10] Dong-Chul Park and Y. Woo, "Weighted centroid neural network for edge preserving image compression," IEEE Trans. on Neural Networks, 12: 1134-1146, Mar. 2001.
[11] Dong-Chul Park, O.-H. Kwon, and Jio Chung, "Centroid neural network with a divergence measure for GPDF data clustering," IEEE Trans. on Neural Networks, 19(6): 948-957, June 2008.
