Sensors and Actuators A 179 (2012) 297–311

Contents lists available at SciVerse ScienceDirect
Sensors and Actuators A: Physical
journal homepage: www.elsevier.com/locate/sna

Wearable obstacle detection system fully integrated to textile structures for visually impaired people

Senem Kursun Bahadir a,b,∗, Vladan Koncar a, Fatma Kalaoglu b
a University of Lille North of France, ENSAIT, GEMTEX F-59100, Roubaix, France
b ITU, Textile Engineering Department, 34437 Istanbul, Turkey

Article history: Received 6 July 2011; received in revised form 20 February 2012; accepted 20 February 2012; available online 10 March 2012

Keywords: Smart clothing; Obstacle detection; Visually impaired; Ultrasonic sensors; Vibration motors; Intelligent textiles

Abstract

In this study, an innovative wearable obstacle detection system fully integrated into textile structures, which enables the detection of obstacles for visually impaired people, has been developed. In order to guide visually impaired people safely and quickly among obstacles, an innovative approach based on the integration of electronics onto textiles has been studied. The adaptation of a sensor and actuator methodology to textile structures has been realized. Finally, a smart clothing prototype including ultrasonic sensors, vibration motors, power supplies and a microcontroller has been developed. The working principle of the system is based on two main functions: sensing the surrounding environment and detecting obstacles via sonar sensors, and guiding the user by actuators using a novel control algorithm based on a neuro-fuzzy controller implemented on a processing unit. The system is able to identify an obstacle's position accurately within the detection range. It is easily worn as a garment that is flexible, lightweight and comfortable for the human body, as well as washable.
The proposed smart clothing system could become an integral part of visually impaired people's lifestyle, and it could help them overcome navigation concerns seamlessly, without imposing any physical or cognitive load on them.

© 2012 Elsevier B.V. All rights reserved.

1. Introduction

During the last decades, research has focused on visually impaired individuals' navigation concerns in their living environment, and on developing new devices by adapting advanced technologies. The development of these devices and the application of technologies for orientation and mobility have evolved since the 1960s.

The devices developed for the visually impaired can be categorized as ETAs (Electronic Travel Aids) or RTAs (Robotic Travel Aids). These ETAs generally provide feedback to the user through a range of tones and fixed-intensity vibrations, and are usually implemented as portable systems. They consist of cameras to capture images and detect obstacles or find pathways using image processing techniques. Moreover, they include sonar sensors to measure the distances to obstacles, and/or GPS (Global Positioning System) or RFID (Radio Frequency Identification) to identify their local position. Besides, in order to guide the user, vibrators, earphones, audio etc. are generally used as actuators in the aforementioned systems.

∗ Corresponding author at: ITU, Textile Engineering Department, 34437 Istanbul, Turkey. Tel.: +90 212 2931300x2499. E-mail address: [email protected] (S.K. Bahadir).

In the literature, a set of systems representing four principal approaches to solving the identification and navigation problems of visually impaired people has been studied: camera-, RFID-, GPS-, and sonar-based systems.

In camera-based systems, cameras are used to capture visual information from the surrounding environment. The captured image is processed via image processing methods, and then mapped to stereo sound patterns or vibrations [1–4].
The earliest camera-based system was NAVI (Navigation Assistant for Visually Impaired), which was designed to convert images captured by a vision sensor into verbal messages through stereo earphones [5,6]. Similarly, Balakrishnan et al. developed the SVETA (Stereo Vision based Electronic Travel Aid) system, composed of a wearable computer, stereo earphones and a helmet moulded with stereo cameras capturing the images [7,8]. In some studies, mounting a visual camera on the head was found to be an inconvenient solution for the blind user. Thus, another system in which the camera was mounted on the chest of the user was proposed [9,10]. Differently, in the Tyflos system developed by Bourbakis et al., two tiny cameras were placed at the dark glasses. In that system, visual information was converted into either vibrations on a 2D vest for navigation purposes or spoken natural language sentences for reading purposes [11–13]. Another approach is to use radio-frequency identification (RFID) for the identification and tracking of an object by using radio waves.

0924-4247/$ – see front matter © 2012 Elsevier B.V. All rights reserved. doi:10.1016/j.sna.2012.02.027

The basic concepts of RFID tags used for visually impaired navigation are as follows: first, tags are mounted on the objects in the environment that are significant for the user; then the signal sent by the RFID tags is read by a receiver that the user carries [14].

In the last two decades, a number of research projects have focused on the development of suitable guidance systems using RFID [15–19]. Chang et al. developed the iCane system, equipped with an RFID reader, a Personal Digital Assistant (PDA) and earphones communicating via Bluetooth. The RFID reader embedded in iCane sends the information gathered from the RFID tags to the PDA to show local points (e.g. crossing points, stairs etc.). By this way, instructions are transmitted to the user via speech through the Bluetooth earphones [20]. Similar to the iCane system, Shiizu et al. developed a white cane system in which RFID tags are set on coloured navigation lines. Along the navigation line, if the user is oriented to a wrong line he/she is informed by vibration of the white cane. Moreover, information is sent to him/her by pre-recorded voice [21,22]. Differently, an interesting approach among the RFID tag systems was suggested by Szeto et al.: embedding RFID reader antennas in a jacket. They suggested that microstrip antennas could be etched or sewn onto wearable conducting clothing, while the pocket PC and a rechargeable battery are placed in the pocket of the jacket [23].

Since the mid-1980s, researchers have focused on GPS-based systems in order to overcome the visually impaired's navigation concerns, maintain orientation, and store and retrieve information about specific locations [24,25]. The basic concept of a GPS-based system is taking the location of the user, as reported by a GPS device, through a Geographic Information System (GIS). This information can then provide the user with his/her localization, such as by identifying which building the user is close to or how far the user is from the desired destination [26,27]. Loomis et al. developed a Personal Guidance System including a GPS module with a stereo headset [28]. Another portable system including a GPS module was the ODILLA system with a white cane. In that system, information about the localization of the user was provided by audible text-to-speech information [29]. Ran et al. developed the DRISTHI system for outdoor and indoor environments. In their system, a GPS module was used for outdoor navigation, whereas ultrasound-positioning tags were used for the indoor environment [30].

Sonar is a kind of instrument used for detecting, locating and determining objects, or measuring the distance to an object, through reflected sound waves. During the past decades, several researchers have introduced devices that use a sonar system to provide mobility to the visually impaired. The cane or stick is one approach into which the sonar system is integrated.

In cane systems, as soon as the information is generated by the sensors, it is processed and, by this way, an encoder generates control pulses for the servos as a routing system [31–33]. Another approach is to implement the sonar system in a wearable concept. Shoval et al. developed the Navbelt system, which involves ultrasonic sensors mounted on a belt together with a computer. In this system, signals acquired by the sensors were processed in the computer, and the resulting signals were sent to the user by stereophonic headphones using a stereo imaging technique [34,35]. Recently, researchers have suggested novel wearable devices in which sonars are attached to textile structures, e.g. Andha Astra, an ARM9-based embedded system [36,37]. Additionally, in a study by Cardin et al., sonars were placed on the shoulders of a cloth, and vibrators were mounted on the same cloth. The prototype was composed of sonars, vibration motors, a microcontroller and a PDA [38]. However, in those studies, the integration of the electronic components into the textile structures was not described in detail; this was the missing part. It seems that they only suggested a system, or attached these components to the garment using conductive wires [36–39].

Based on the literature review, it is important for a usable electronic travel aid to let the visually impaired be hands-free and comfortable during navigation. The most suitable approach to let the user be hands-free is embedding the whole system into clothes. Although there is numerous research and development on mobility aids for visually impaired people, no interactive garment fully integrated with the textile structure has been developed that can help visually impaired people overcome navigation concerns. Therefore, a study to develop an innovative wearable obstacle detection system fully integrated to textile structures for visually impaired people has been conducted and is presented in this paper. The prototype of the whole system has also been realized and successfully tested. The main difference of this system compared to others is the position of the vibration motors and sensors on the garment, which differs from the studies suggesting the use of sonars and vibrators. In our study, before designing the smart clothing system, the positions of the vibration motors were investigated in order to optimize the user's vibrotactile sensations [40,41]. Correspondingly, in order to find the optimum number and placement of sensors on the garment, several experiments were also conducted [42,43]. Finally, this system is lightweight, flexible and comfortable for the user to wear rather than to carry. Thus, the comprehensive investigation with respect to sensing performance, as well as guidance in an indoor environment, can be considered an advantage of the newly developed system.

2. E-textile architecture for obstacle detection

2.1. Basis of system architecture

A smart clothing system is essentially a hierarchical process. At each level of the hierarchy there are different factors which have to be taken into account in order to transform a garment into an interactive, intelligent infrastructure that facilitates information processing. During the design of an interactive garment, two main critical issues, namely the requirements for the electronic software and hardware components, and the wearability performance, have to be elaborated together and balanced.

Therefore, a smart clothing system is a combination of different research fields, especially electronics, information technology, control engineering, and textile design. Towards our objective, the issues related to the electronic system architecture that were stressed before designing the smart clothing system are:

• What types of sensors are required?
• What types of actuators can be used?
• How will the data be processed, and which types of signal processing units are required?
• What will be the decision parameters in the signal processing unit?
• How many sensors of each type are required?
• How many actuators of each type are required?
• What is the optimum placement of sensors and actuators on the human body?
• What is the most useful placement for the microcontroller?
• What algorithms are needed to provide accuracy in analysing the data gathered by sensors?
• What is the required power consumption for the system?
• Which types of power supplies are adequate?
• Which types of conductive fibers are suitable for this system architecture?
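As a concrete illustration, the checklist above can be collected into a single configuration record. This is only a sketch: the counts follow the prototype described later in Section 3.1 (four ultrasonic sensors, eight vibration motors, one microcontroller, two power supplies), and the electrical limits follow the component data quoted in Section 2.2.1; the field names themselves are illustrative.

```cpp
#include <cstdint>

// Illustrative configuration record answering the design checklist above.
// Counts follow the prototype (Section 3.1); voltage limits follow the
// component specifications cited in Section 2.2.1.
struct SmartClothingConfig {
    std::uint8_t numUltrasonicSensors = 4;  // obstacle detection (sonar)
    std::uint8_t numVibrationMotors   = 8;  // guidance actuators, 4 per side
    std::uint8_t numMicrocontrollers  = 1;  // central signal processing unit
    std::uint8_t numPowerSupplies     = 2;  // battery packs
    float sensorSupplyMinV = 2.5f;          // LV-MaxSonar-EZ3 range [44]
    float sensorSupplyMaxV = 5.5f;
    float motorSupplyMaxV  = 5.5f;          // LilyPad Vibe Board limit [45]
    float yarnResistanceOhmPerM = 50.0f;    // silver-plated nylon yarn, upper bound
};
```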

In the wearability concept, some performance requirements such as being lightweight, breathable, comfortable, easy to wear etc. have to be taken into consideration. The performance requirements expected for our system are shown in Table 1.


Table 1
Smart clothing system expected requirements.

Functionality: detection of obstacles; guidance alert.
Maintainability: launderable; easy drying; colour fastness; repairable.
Manufacturability: ease of fabrication; suitable size ranges.
Wearability: comfortable; no skin irritation; no pressure points; breathable (air permeable); moisture absorption; lightweight; dimensional stability; easy to wear and take off; maintains operational mobility; maximizes range of motion.
Durability: strength; tear-tensile burst; abrasion resistance.


2.2. Adaptation of sensor and actuator methodology to textile structures

2.2.1. Materials
In order to detect obstacles within the e-textile circuit, the LV-MaxSonar®-EZ3™ (MaxBotix) ultrasonic sensor was used due to its small dimensions, low power requirements (2.5–5.5 V) and optimal detection angle. The detection capacity of this ultrasonic sensor ranges from 6 to 254 inches, and the sensor operates at 42 kHz [44].

The Arduino LilyPad Vibe Board® vibration motor was used as an actuator due to its small dimensions (outer diameter: 20 mm, thin, 0.002 kg), low power requirement (maximum 5.5 V) and easy implementation, in order to give vibrotactile sensations to guide the user [45].

To integrate the vibration motor and ultrasonic sensor into textile structures, as well as to form the electric circuit in the structures, silver-plated nylon yarn with a linear resistance of <50 ohm/m and a yarn count of 312/34f × 4 dtex was used. Silver-plated nylon was chosen as the conductive yarn because our previous experiments showed that silver-plated nylon yarns offer the best compromise between signal quality and textile properties, e.g. handle, stable and elastic, easy to weave, easy to integrate with the sensor etc. [46]. The linear resistance of the conductive yarns was measured in ohms per meter (Ω/m) using a TTi 1906 computing multimeter. Electrical insulation in the textile circuit was provided by PA 66 yarns with a linear density of 78/68 × 4 dtex.
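For orientation, the sensor's analog output can be turned into a distance reading in a few lines. The scaling below assumes the LV-MaxSonar-EZ behaviour of roughly Vcc/512 per inch, which with a 10-bit ADC referenced to the same Vcc gives about two counts per inch; this factor should be checked against the datasheet [44], and the clamping simply enforces the 6–254 inch range quoted above.

```cpp
// Convert a 10-bit ADC reading of the sensor's analog pin to inches.
// Assumes the LV-MaxSonar-EZ scaling of ~Vcc/512 per inch, i.e. ~2 ADC
// counts per inch when the ADC reference equals the sensor supply.
int adcToInches(int adcCounts) {
    int inches = adcCounts / 2;
    if (inches < 6)   inches = 6;    // minimum reported range
    if (inches > 254) inches = 254;  // maximum reported range
    return inches;
}

// The rest of the paper works in centimetres (e.g. the 0-125 cm and
// 125-250 cm bands of Table 2), so a cm helper is convenient.
double adcToCentimetres(int adcCounts) {
    return adcToInches(adcCounts) * 2.54;
}
```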

Fig. 1. Vibration motors integrated to woven fabric to attach over the hip bone area of the garment.

2.2.2. Formation of the e-textile structure and integration of sensors and actuators into the structure
To prevent the formation of short circuits, the conductive yarns were hidden inside the structure. The fabric was designed as a double-woven fabric, and the conductive yarns were placed in the middle layer of the structure [47]. The set of warp yarns of the upper layer was linked to the set of weft yarns of the bottom layer, and thus the two layers were held together.

To obtain transmission lines at the desired locations on the structure, some of the weft yarns were replaced with conductive yarns during the weaving process. Throughout the production of the samples, to create the electrical circuit and to connect the vibration motors and ultrasonic sensors to the fabric, loops were formed among the conductive yarns and then snap fasteners were sewn onto these loops.

Eventually, signals passing through the conductive yarns via the snap fasteners were transmitted to the vibration motors and ultrasonic sensors. For instance, Fig. 1 shows the integration of vibration motors into the textile structure.

To integrate multi-connected sensors into the textile structure, it was necessary to use the Voltage, Ground, AN, TX, RX and BW pins of the sensor. Therefore, with reference to the dimensions of the ultrasonic sensor's output pins, six electrical connection points (Voltage, Ground, AN, TX, RX and BW pins) at specified distances were taken into account. Thus, in each sample, conductive yarn was inserted six times in the weft direction at the desired distances to satisfy the six electrical connection points.
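The yarn's linear resistance matters here, because every transmission line woven this way adds series resistance between the supply and the component it feeds. A rough check, using the <50 Ω/m bound given above (the line lengths and load currents are illustrative, not from the paper):

```cpp
// Round-trip resistance of a woven transmission line: supply and return
// conductor, each of the given length, at the yarn's linear resistance.
double lineResistanceOhm(double lengthM, double ohmPerM = 50.0) {
    return 2.0 * lengthM * ohmPerM;
}

// Voltage lost across the line for a given load current (Ohm's law).
double lineVoltageDropV(double lengthM, double loadCurrentA,
                        double ohmPerM = 50.0) {
    return lineResistanceOhm(lengthM, ohmPerM) * loadCurrentA;
}
```

For example, a 0.5 m run at the 50 Ω/m bound presents 50 Ω round trip, so a load drawing 20 mA would lose about 1 V across the yarn; keeping such runs short is consistent with the placement constraints discussed later in Section 3.2.3.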



Fig. 2. Integration of ultrasonic sensor.

The final e-textile structure corresponding to the sensor's connection points is shown in Fig. 2. As seen in the figure, the conductive yarns are in grey colour in the middle part of the fabric, and the non-conductive PA 66 yarns are in white.

3. Design of smart clothing prototype

During the development of the prototype, firstly the electronic circuit of the system was designed according to the electronic software and hardware requirements; then, by considering the wearability requirements and the comfort of the user, the layout of the system was devised.

Fig. 3. Schematic diagram of smart clothing system circuit.


3.1. Circuit design

The circuit was designed mainly considering the multi-connection principles of the ultrasonic sensors. The schematic diagram of the smart clothing system circuit is shown in Fig. 3.

The function of this circuitry is to digitize the analog signals acquired by the sensors and transform them into vibration signals. It modulates the analog signals into different levels of vibration by identifying the correlation between the position of the obstacle and the required turning action (direction and angle) for the user.

There are four key connections and elements in this circuitry: (i) one microcontroller, (ii) four ultrasonic sensors, (iii) eight vibration motors, and (iv) two power supplies. Four sensors were used to detect obstacles, and eight vibration motors (four on the left and four on the right) were used to guide the user by recommending a turning direction and angle. The microcontroller was used to process the data and transform it into commands.
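As a simplified stand-in for this modulation step (the actual mapping is produced by the neuro-fuzzy controller, which this section does not detail), distance can be discretized into vibration levels. The band edges below merely echo the 0–125 cm and 125–250 cm ranges that appear later in Table 2; the three-level scheme itself is an assumption.

```cpp
#include <cstdint>

// Toy distance-to-vibration mapping: the closer the obstacle, the stronger
// the cue. Band edges (125 cm, 250 cm) follow Table 2; the three-level
// scheme itself is an illustrative assumption.
std::uint8_t vibrationLevel(double obstacleDistanceCm) {
    if (obstacleDistanceCm > 250.0) return 0;  // nothing in the guarded zone
    if (obstacleDistanceCm > 125.0) return 1;  // far band: gentle vibration
    return 2;                                  // near band: strong vibration
}
```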



Fig. 4. Circuit layout design on the garment.


3.2. Design layout

Considering Fig. 3, the circuit layout of the prototype was designed as shown in Fig. 4. In Fig. 4(a) and (b), the circuit is laid out over the garment front and back, and over the sleeves, respectively.

3.2.1. Sensor placement
Since the distance between the four sensors was adjusted to 20 cm considering human body physiology, the sensors were placed under the breast zone of the garment, taking account of both women's and men's bodies.

Indeed, the position of the sensors plays a great role in the detection of obstacles. They should be placed in a region where the garment does not move significantly during walking. Within this concept, there are two alternative zones: the shoulder zone or the zone under the breast. If the sensors are placed over the shoulder zone, then obstacles of greater height, such as a wardrobe or a wall, will be detected. On the other hand, if the sensors are placed over the zone under the breast, then not only tall obstacles but also lower ones, such as tables, can be detected. According to our objective, and considering that in typical environments there are more obstacles of lower height, the zone under the breast was chosen for the position of the sensors in order to avoid more obstacle collisions.
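The height trade-off described above can be made concrete with a little geometry. The sketch below tests whether an obstacle's top falls inside the sensor's vertical beam; the mount heights and the beam half-angle are illustrative values, since the paper fixes neither.

```cpp
#include <cmath>

// True if an obstacle whose top is at height obstacleH (m) lies inside the
// sensor's vertical beam at horizontal distance dist (m), for a sensor
// mounted at height mountH (m) with vertical half-angle halfAngleDeg.
bool inVerticalBeam(double obstacleH, double dist,
                    double mountH, double halfAngleDeg) {
    const double kPi = 3.14159265358979323846;
    double angleDeg = std::atan2(obstacleH - mountH, dist) * 180.0 / kPi;
    return std::fabs(angleDeg) <= halfAngleDeg;
}
```

With an assumed ±30° beam, a 0.75 m table top one metre ahead is inside the beam for a chest-height mount (1.2 m) but outside it for a shoulder-height mount (1.5 m), which mirrors the rationale for choosing the under-breast zone.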

Fig. 5. Base structure of developed interactive garment.


Fig. 6. Removable fabrics of the prototype for sensor, actuator, microcontroller and power supplies.

3.2.2. Actuator (vibration motor) placement
In accordance with our previous study [40], it was found that the highest level of vibrotactile sensation was perceived over the outer wrist and the hip bone area of the evaluators' bodies. Therefore, to guide the user, the vibration motors were placed over the outer wrist and hip bone areas of the garment. Three of the vibration motors were placed on the wrist of the left arm, and another three on the wrist of the right arm.

The left and right hip bone areas were chosen for summer clothing usage. The garment was designed for both summer and winter periods. Therefore, when the sleeves of the proposed garment are taken off, the system is designed to generate control through the vibration motors placed on the left and right hip bone areas of the garment. To sum up, the eight vibration motors were placed as follows: three on each of the left and right arms over the outer wrist of the garment, and the other two on the left and right hip bone areas of the garment. Three vibration motors are used on one arm in order to give the user information about the location of obstacles as well as about the required turning angle. For instance, in the case of a right turn with a small angle, only the 1st vibration motor on the right arm will act. Similarly, if the required turning action is a right turn with a large angle, then all three vibration motors on the right arm will act.
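The one-motor/three-motor rule just described can be sketched as a lookup from the required turning angle to a wrist-motor pattern. Only the small-angle and large-angle cases come from the text; the intermediate two-motor step and the 15°/45° thresholds are assumptions.

```cpp
#include <array>
#include <cmath>

// Pattern for the three wrist motors on the arm matching the turn direction:
// small angle -> 1st motor only, large angle -> all three (per the text);
// the medium step and the thresholds are illustrative assumptions.
std::array<bool, 3> wristMotorPattern(double turnAngleDeg) {
    double a = std::fabs(turnAngleDeg);
    if (a < 15.0) return {true, false, false};  // small turn
    if (a < 45.0) return {true, true,  false};  // medium turn (assumed)
    return {true, true, true};                  // large turn
}
```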

3.2.3. Microcontroller and power supply placement
After deciding on the actuator and sensor placement, the positions of the microcontroller and power supplies were planned. Considering circuit and resistance constraints, the microcontroller and power supplies should be placed as close as possible to each other. Moreover, a critical point in the microcontroller placement is that it is the network of inputs and outputs. Therefore, it should be placed in a region that is able to gather all analog outputs from the sensors and send inputs to the actuators without any overlap. The best possible position for the microcontroller is the centre of the garment with regard to the whole circuit.

Due to the microcontroller position and circuit constraints, the positions of the power supplies were determined to be close to the microcontroller. Hence, they were placed around the vertical centreline of the garment.


Fig. 7. Interactive garment with its removable parts.

3.3. Smart clothing prototype

3.3.1. Base structure of the intelligent garment
In order to obtain the circuit layout design shown in Fig. 4 on the garment, the production of seamless products was considered. For this purpose, a single-jersey circular knitting machine with a cylinder diameter of 13 inches and E28 gauge was used to produce the base structure of the interactive garment (see Fig. 5).

Considering the wearability and durability performance requirements mentioned in Table 1, such as being comfortable, breathable, moisture absorbing, lightweight, strong etc., polyamide 66 yarns with a linear density of 78/68 × 2 dtex were used. Additionally, in order to obtain a tight fit in the garment, elastomeric yarns composed of PA (22 denier) including Lycra® (16 denier) were also used during the production of the base structure of the interactive garment.

The sleeves and body parts of the garment were produced seamlessly. Then, they were sewn together in order to build the base structure of the interactive garment (see Fig. 5). As shown in that figure, the conductive yarns (silver-plated nylon 66) are in grey colour, whereas the polyamide (non-conductive) yarns are in white. This structure is also washable. The other electronic parts are removable, and washing them is not recommended.

3.3.2. Removable structures of the intelligent garment
The compactness of a woven structure provides better impact resistance than a knitted structure and thereby prevents the swinging of sensors and actuators during walking. Therefore, instead of a knitted structure, a woven structure was chosen to produce the removable parts, as mentioned in Section 2.2.2.

All the removable parts for the sensor and actuator connections are shown in Fig. 6a. In total, four sensors and eight vibration motors are integrated into the woven fabrics. The connection of these removable parts to the main circuit in the base structure of the interactive garment was provided by snap fasteners.

The removable fabric for the microcontroller connection was produced by sewing. Similarly, snap fasteners provided the connection


Fig. 8. Flow diagram of microcontroller’s main program.

Fig. 9. Timing diagram for microcontroller.


Fig. 10. Measurements for detection capability of the developed system and layout of environment including obstacles.


Fig. 11. Measured distance data taken by sensors versus time when obstacle at left (−40, 80) and right (40, 80).

between the main circuit and the microcontroller. The fabric used for producing the base structure of the interactive garment was also used to produce both the microcontroller connection and the pockets for the batteries (see Fig. 6b). The conductive yarns were again inserted and hidden in the middle part of the knitted fabric.

3.3.3. Interactive garment prototype
Finally, the interactive garment with its removable parts is shown in Fig. 7a. The sensors were positioned on the front of the garment under the breast zone. The vibration motors were positioned over the wrist and hip bone areas of the garment. The microcontroller and batteries were positioned along the vertical centreline of the garment.

The parts for the ultrasonic sensors, vibration motors and batteries were designed to be attached from the inner side of the garment and thus are not visible when attached (see Fig. 7b). Only the microcontroller is visible on the garment when attached. In this way, the user can switch the system on and off easily via the button on the microcontroller. Fig. 7b shows the final smart clothing prototype worn by a mannequin. The weight of the final prototype without batteries is about 250 g, whereas the total weight including the two batteries is 458 g.

The smart garment enabling the detection and avoidance of obstacles is easy to handle, light enough to wear and carry, and washable when the removable parts are detached from the main structure.

e when obstacle at left (−40, 80) and right (40, 80).

3.3.4. Microcontroller programming

In our study, a LilyPad Arduino® microcontroller board [45] was used. The board is based on the ATmega328 (20 MHz, 6-channel 10-bit ADC, 14 programmable I/O lines); the instruction set and technical specifications of the chip are given in the ATMEL® Technical Data Manual [48]. To program the microcontroller, the Arduino® software [45] was preferred over assembly language because of its ease of programming.

The program is designed to analyse the signals acquired by the ultrasonic sensors and, when obstacles are present, transform them into different vibration intervals that guide the person with a recommended turning action to avoid the obstacle. Fig. 8 shows the flow diagram of the microcontroller's main program used in our study.

The main program works as follows. First, the program goes through an initialization phase in which all variables are set, all I/O (Input/Output) ports are initialized and the external devices are enabled. Next, the processor waits 5 s for calibration. In the calibration phase, all sensor outputs are assigned to the same range, so that every sensor measures over the same interval.
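The flow just described can be sketched as a small simulation, assuming nothing about the authors' firmware beyond the text: `read_sensors`, `assess` and `actuate` below are illustrative placeholders.

```python
import time

def run(read_sensors, assess, actuate, n_decisions,
        calib_s=5.0, sample_s=0.010, window_len=10):
    """Simulate the main-program flow: wait calib_s for calibration (all
    sensors settle to a common range), then repeatedly fill a window of
    window_len samples (one ~10 ms sampling loop each), assess it, and
    send the resulting decision to the vibration motors."""
    time.sleep(calib_s)                      # calibration phase (5 s on the device)
    decisions = []
    for _ in range(n_decisions):
        window = []
        for _ in range(window_len):          # one sampling loop per window element
            window.append(read_sensors())
            time.sleep(sample_s)
        decision = assess(window)            # decision from the sampled window
        actuate(decision)                    # actuation signals -> vibration motors
        decisions.append(decision)
    return decisions
```

Here `assess` would implement the data assessment of Fig. 8 and `actuate` would drive the vibration motors; passing `calib_s=0` and `sample_s=0` lets the simulation run instantly for testing.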

Then the data acquisition and sampling loop starts. Signals acquired by the ultrasonic sensors are processed within a sampling period. In that period, the data are processed in order to determine whether there is an obstacle in the user's way. According to this data assessment, a decision output is given such that, if the obstacle is detected

at the right, then actuation signals are sent to the left vibration motors in order to guide the person to turn left, and vice versa. When obstacles are at the front, the system outputs "turn right" and "turn left" at the same time; the user thus understands that there is an obstacle directly ahead and can choose either direction at that moment. If the data assessment concludes that there is no obstacle in the user's way, no turn-left or turn-right decision is output, which means no actuation signals are sent to the vibration motors (go straight).

In microcontroller programming, the critical point is the sampling period for data acquisition and the control signal. For our study, in order to determine the sampling period as well as the control-signal frequency needed to guide the user in time before hitting an obstacle, the walking speed of visually impaired people was first determined. Some studies report that the walking speed of a normal pedestrian is between 1.22 m/s (younger pedestrians) and 0.91 m/s (older pedestrians) [49,50]. Considering this and our own observations, the walking speed of a visually impaired person was assumed to be approximately 0.6 m/s. Furthermore, the distance to be checked during walking for efficient obstacle avoidance was defined as 2.5 m.

Table 2
The relation between turning angle and detected object position. Rows give the object position on the y-axis (cm), columns the object position on the x-axis (cm); cells give the turning angle (ϕ).

Object at y-axis (cm)   −∞   −40   −30   −20   −10   0         10   20   30   40   ∞
0–125                   Z    RS    RS    RM    RL    RVL-LVL   LL   LM   LS   LS   Z
125–250                 Z    RS    RS    RS    RM    RL-LL     LM   LS   LS   LS   Z
∞                       Z    Z     Z     Z     Z     Z         Z    Z    Z    Z    Z

Fig. 12. Detection capability of the developed smart clothing.
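Table 2 can be transcribed directly as a lookup. In this sketch the column labels are the discrete x positions tested in the table; mapping positions that fall outside the table to Z is our own assumption, not the authors' rule.

```python
# Columns of Table 2: tested object positions on the x-axis (cm).
X_LABELS = ["-inf", -40, -30, -20, -10, 0, 10, 20, 30, 40, "inf"]

# Rows of Table 2: y-axis bands (cm) -> linguistic turning angles.
ROW_NEAR = ["Z", "RS", "RS", "RM", "RL", "RVL-LVL", "LL", "LM", "LS", "LS", "Z"]  # 0-125 cm
ROW_FAR  = ["Z", "RS", "RS", "RS", "RM", "RL-LL",   "LM", "LS", "LS", "LS", "Z"]  # 125-250 cm

def turning_command(x, y):
    """Linguistic turning command for an object at (x, y) cm; Z beyond 250 cm
    or for positions not listed in the table (assumed fallback)."""
    if y > 250 or x not in X_LABELS:
        return "Z"
    row = ROW_NEAR if y <= 125 else ROW_FAR
    return row[X_LABELS.index(x)]
```

For example, an object dead ahead and close gives the strongest command, while one far to the left gives only a small right turn.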

Hence, the maximum timing diagram for the microcontroller program, including the sampling loop and the decision-output period with safety margins, is shown in Fig. 9. In the 1st second, data acquisition and sampling are performed. The minimum sampling time was calculated to be approximately 10 ms (1 sampling loop ≈ 10 ms). After each sample is assessed, one element of the decision matrix is updated; hence, for a new condition, the decision output can be given after at least ten sampling loops, i.e. in approximately 100 ms. As soon as the decision output is given, actuation signals are transferred to the vibration motors, and the user can thus sense the vibrations within 1 s.

Sensation of the vibrations can start before the 2nd time interval thanks to the early decision output (the intervals given in Fig. 9 show the maximum timing, including safety margins, needed to guide the user before hitting an obstacle). After the sensation, one more second is allowed for the user to compensate the forward motion during the turning action. In this manner, the user's avoidance of an obstacle within a 2–2.5 m range is guaranteed.
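As a rough check of this timing budget with the figures above (0.6 m/s walking speed, and at most one second each for acquisition, vibration sensing and turning):

```python
WALK_SPEED_MS = 0.6  # m/s, assumed walking speed of a visually impaired user
ACQUIRE_S = 1.0      # Fig. 9: first second for data acquisition/sampling (with margin)
SENSE_S   = 1.0      # vibration is sensed within the next second
TURN_S    = 1.0      # one more second to compensate forward motion while turning
DETECT_M  = 2.5      # distance at which obstacle checking starts

# Worst-case distance covered between first detection and completed turn:
walked_m = WALK_SPEED_MS * (ACQUIRE_S + SENSE_S + TURN_S)  # 1.8 m

# The turn therefore completes with roughly 0.7 m to spare, i.e. the
# avoidance happens within the 2-2.5 m range stated in the text.
spare_m = DETECT_M - walked_m
```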

Regarding the turning action, in order to control the user's motion in an environment and to establish the relation between sensor values and turning angle, an algorithm had been developed before the microcontroller programming. In that algorithm, rules were defined using human experience based on observed data taken from real-time measurements. The recommended turning angle for the user to avoid an obstacle, according to its position, was decided as shown in

Fig. 13. Experiments during movement towards an obstacle.


Fig. 14. Interactive garment worn by the mannequin moving towards an object located on the left (1st case) and in the front (2nd case).

Table 2. In the table, "R" and "L" indicate "turn right" and "turn left", respectively. Additionally, the values {Z, S, M, L, VL} denote the turning angle in terms of linguistic variables: Z (Zero, go straight), S (Small), M (Medium), L (Large) and VL (Very Large).

The terms indicating the turning angle {Z, S, M, L, VL} correspond to the number of active vibration motors as follows. Z = no vibration, neither at left nor at right. RS/LS = only the 1st vibration motor is acting, either at right or at left. RM/LM = the 1st and 2nd vibration motors on the same side (either right or left) are acting simultaneously. RL/LL = the 1st, 2nd and 3rd vibration motors on the same side are acting simultaneously. RL-LL = the 1st and 2nd vibration motors on both sides (both left and right) are acting simultaneously. RVL-LVL = the 1st, 2nd and 3rd vibration motors on both sides are acting simultaneously.

Fig. 15. Measured signals over the 1st, 2nd and 3rd vibration motors on the left and right arms during the movement towards an obstacle located at the left.

Fig. 16. Measured signals over the 1st, 2nd and 3rd vibration motors on the left and right arms during the movement towards a front obstacle.

4. Performance of smart clothing system

The developed smart clothing was tested for the following purposes: (i) detection capability and robustness; (ii) power consumption; (iii) heating behaviour; and (iv) washability performance.

4.1. Detection capability and robustness of the developed system

Detection capability of the developed system was defined as the degree of detection range, in terms of distance along the y and


x-axis, over which obstacles can be detected during operation. Robustness was defined as the ability of the developed system to detect obstacles' positions accurately in order to navigate while avoiding collisions.

For experimental purposes, the intelligent garment placed on the mannequin was tested for its detection range as shown in Fig. 10. Before conducting the experiments, since the lengths and widths of obstacles are critical issues for obstacle detection, some assumptions were made: (i) the widths of the obstacles used in the experiments were larger than 30 cm; (ii) their heights were higher than 90 cm. During the measurements, obstacles were placed in front of the mannequin in different positions in order to find the maximum detection range. For instance, Fig. 11 shows the measured distance data taken by the sensors when the obstacle was positioned at the left (−40, 80) and at the right (40, 80). These measurements were recorded in MATLAB by using a National Instruments® DAQ (Data Acquisition) card. It is noticeable from the figure that when the obstacle is at the left, only the lower left sensor (sensor 1) is capable of detecting it; conversely, when the obstacle is at the right, only the lower right sensor (sensor 2) is capable of detecting the obstacle's position, as logically expected.

Fig. 17. (a) Thermal image of the structure at 5 V and (b) its temperature distribution along the conductive yarn over a distance of |AP|.

In Fig. 10, the white drawings on the ground were obtained during operation in different time intervals. According to the measurement results, during the first two hours the detection capability of the system was up to 2.5 m along the y-axis, as shown in Fig. 12. However, as the operation time increased, the detection range decreased: after 4 h and 6 h of working time, it decreased to 2.2 m and 1.8 m, respectively. This result can be attributed to a decrease in battery voltage; as time passes, the batteries run down and the feeding voltage supplied to the sensors decreases.

Since our ultrasonic sensor's analogue voltage output works with a scaling factor of Vcc (feeding voltage)/512 per inch, the measured distance values acquired by the sensors and processed in the microcontroller decrease as Vcc decreases, which results in a smaller detection range as time passes. Moreover, as seen in Fig. 12, the detection capability of the left and right sensors is slightly different; the areas detected by the left and right sensors are not symmetric. This can be linked to the sensitivity of the individual sensors.
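This shrinkage can be illustrated with a small sketch. It assumes the conversion keeps using the nominal 5 V scale while the actual supply sags, which is a simplification of the real ADC chain:

```python
def reported_distance_inch(true_inch, vcc_actual, vcc_nominal=5.0):
    """The sensor outputs Vout = true_inch * (vcc_actual / 512) volts;
    converting back with the nominal Vcc/512-per-inch scale under-reports
    the distance when the battery voltage drops."""
    vout = true_inch * (vcc_actual / 512.0)
    return vout / (vcc_nominal / 512.0)
```

For instance, an obstacle really 100 in away reads as only 80 in once Vcc has dropped to 4 V, which is why the effective detection range shrinks over the working time.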

In order to test the system's reaction to obstacles during movement, the experimental set-up shown in Fig. 13 was used. The intelligent garment worn by the mannequin was moved at a speed of 0.6 m/s towards an obstacle (see Fig. 14). While the mannequin was between 3 m and 2.5 m away from the obstacle, there was no actuation of the vibration motors, which corresponds to the go-straight state (Z).

However, in the 1st case (Fig. 14a), when the mannequin reached a distance of 2.5 m from the obstacle located on the left side, only the 1st vibration motor on the right arm acted in order to issue the warning. The other vibration motors did not exhibit

any vibrations until the mannequin reached a distance of 1.25 m from the obstacle. On the other hand, during the movement between 1.25 m and 15 cm, in addition to the 1st vibration motor, the 2nd vibration motor on the right arm also vibrated in order to indicate the proximity of the obstacle (see Table 2). The signals over the conductive yarns, measured by the oscilloscope at the points of connection with the 1st, 2nd and 3rd vibration motors on the left and right arms during the movement towards the obstacle located at the left side, are shown in Fig. 15.

In the 2nd case (Fig. 14b), when the obstacle was located in front of the mannequin, there were again no vibrations at the beginning (between 3 m and 2.5 m), neither at left nor at right. But when the mannequin reached 2.5 m, and during the movement down to 1.25 m, the 1st and 2nd vibration motors on both sides (both left and right) acted simultaneously. After passing the 1.25 m mark, all three vibration motors on both sides started to act concurrently. Fig. 16 clearly shows the differentiation of the signals taken over the vibration motors during the movement of the mannequin towards the front obstacle.

Based on the experimental results, it can be said that the developed system detects an obstacle's position accurately and shows encouraging results for warning the user to avoid obstacles. It identified the obstacle's position without failure within the detection range: when obstacles are at the right, the system outputs "turn left", and vice versa; when obstacles are at the front, it outputs "turn right" and "turn left" at the same time, so the user can choose either direction to avoid the obstacle ahead. It can therefore be concluded that the accuracy and robustness of the developed smart clothing system are promising.
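The linguistic terms map onto motor activations as described in Section 3.3.4; transcribed as a table of active-motor counts per arm:

```python
# Number of active vibration motors (out of three per arm) for each
# linguistic turning term, as defined for Table 2.
MOTORS = {
    "Z":       (0, 0),  # go straight: no vibration on either arm
    "LS":      (1, 0),  # small left turn: 1st motor, left arm
    "RS":      (0, 1),  # small right turn: 1st motor, right arm
    "LM":      (2, 0),  # 1st and 2nd motors, left arm
    "RM":      (0, 2),  # 1st and 2nd motors, right arm
    "LL":      (3, 0),  # all three motors, left arm
    "RL":      (0, 3),  # all three motors, right arm
    "RL-LL":   (2, 2),  # front obstacle, far band: two motors on both arms
    "RVL-LVL": (3, 3),  # front obstacle, near band: all motors on both arms
}

def active_motors(term):
    """(left_count, right_count) of vibrating motors for a turning term."""
    return MOTORS[term]
```

This reproduces the two test cases above: an obstacle on the left triggers right-arm motors (RS, then RM), while a front obstacle drives both arms (RL-LL, then RVL-LVL).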

4.2. Power consumption

Our experiments showed that the system can work for a 1-day working period (8 h) without any additional battery, although this result can change depending on the environment. Users are therefore advised to carry a spare battery for longer usage and to avoid interruptions.

4.3. Heating behaviour

Thermal analysis was carried out on the base structure of the developed smart clothing system in order to find out whether the garment heats up to the point where user comfort is affected or the user may even be injured.

A thermal camera (Testo 880®, Testo Inc.) was used to takeinfrared images of the structure. Testo 880® Thermal Camera has a

thermal resolution of <0.1 ◦C at 30 ◦C and was set to record temperatures every 5 s. A multichannel DC power source (Keithley 2400 SourceMeter®, Keithley Instruments Inc.) was used to supply voltages to the structure. Experiments were done in standard laboratory conditions (20 ◦C, 65% RH). The base structure of the interactive garment was placed on a plastic stand about 50 cm away from the thermal camera. Then, the conductive parts of the garment were clamped with the probes of the DC power supply. Measurement started with supplying 5 V to the garment, considering the system's real working condition. During the measurements, Testo IRSoft® software was used to acquire images of the temperature distribution on the fabric sample. Fig. 17 shows the thermal image of a structure at 5 V with its temperature distribution along the conductive yarn.

According to Fig. 17, the average temperature along the conductive yarn is about 22.8 ◦C. Concerning our intelligent garment, since the system's working range is around 5 V, it can be concluded that during the operation mode the temperature on the intelligent garment will be around 22 ◦C along the conductive yarns (transmission lines). This result was found satisfying with regard to the thermal comfort of a human body, because according to ASHRAE STANDARD 55-2010 [51], to maintain comfort, the amount of insulation required to keep a resting person warm in a windless room at 70 ◦F (21.1 ◦C) is equal to one clo.

4.4. Washability

In order to see the conductivity changes of the system after home laundering, a sample of the base structure of the garment was washed for 10 cycles under AATCC Test Method 135-2004 [52]. After each washing cycle, the conductivity of the transmission line was tested. No significant difference could be noticed along the transmission line after 10 washing cycles.

5. Conclusion

In this article, a smart clothing prototype enabling detection of obstacles, called the "Wearable Obstacle Detection System", has been successfully developed. The design and development concept of the smart clothing prototype has involved four key areas of research, namely (i) electronics, (ii) information technology, (iii) control engineering, and (iv) textiles. The prototype has been tested for detection capability, power consumption, heating behaviour, and washability. Results showed that the developed system is able to identify an obstacle's position without failure within its detection range. The system is capable of detecting left, right, and front obstacles' positions accurately and giving the right output: when obstacles are at the right, the system outputs "turn left"; when obstacles are at the left, it outputs "turn right". When obstacles are at the front, the system outputs "turn right" and "turn left" at the same time, so the user understands that there is an obstacle directly ahead and can choose his/her way by turning right or left at that moment. It can therefore be concluded that the developed system is successful, reliable and robust.

Regarding power consumption, it was found that the system can work for a 1-day working period (8 h) without any additional battery, although this result can change depending on the environment. Users are therefore advised to carry a spare battery for longer usage and to avoid interruptions.

Concerning the heating behaviour of the system, during the operation mode (voltage ≈ 5 V) it was found that the temperature along the transmission lines on the garment is around 22 ◦C, which approximately corresponds to ASHRAE STANDARD 55-2010's thermal comfort degree (21.1 ◦C).


Comprehensive investigation of the developed smart clothing system, with respect to sensing the indoor environment and guiding the visually impaired accurately, will provide new scientific understanding of interactive garment design and development. It represents a great challenge and a significant contribution to the knowledge of sensor and actuator integration into textile structures. In addition, intelligent textiles are a recently developing area with much still to be invented; the successful implementation and integration of the electronics used in our smart clothing system are therefore valuable for smart textiles research.

6. Recommendations for future work

For future studies, especially for outdoor environments, the initial prototype could be fully integrated with GPS, RFID, a camera and vocal guidance; it could then not only track the user, but also find a route to a specific destination and guide the user there using synthesized speech, providing localization information such as the street address of the current location. In addition, in order to detect obstacles of lesser height, the system could be extended to other garments of the user, such as trousers; to detect obstacles on the ground, such as large holes or descending stairs, the system could be integrated into shoes as well. Regarding the power consumption of the developed system, a means of warning the user about the battery level or supply voltage could be implemented, e.g. a voltage monitoring circuit. Using this information on the voltage decrease, coefficients related to the detection-range capability should also be added to the microcontroller program in order to prevent the decrease in detection range. Thanks to technology miniaturization and cost reduction in the electronics and textile industries, new sensing elements, flexible technologies, actuators and functional yarns could be implemented in our developed system. For instance, flexible textile-based solar cells, which are expected to become thinner, lighter and more powerful in the future, could be embedded into the smart clothing system as a power supply; as actuators, artificial muscles would be interesting. Finally, a fully textile, flexible sonar system may be developed in order to replace the existing miniaturized rigid sensors.
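Such a compensation coefficient could, for example, simply rescale readings by the ratio of nominal to monitored supply voltage, inverting the Vcc/512-per-inch shrinkage discussed in Section 4.1. This is a hypothetical scheme, not one implemented in the prototype:

```python
def compensate(distance_reading, vcc_measured, vcc_nominal=5.0):
    """Correct a distance reading that was scaled down by a sagging supply:
    readings shrink by vcc_measured / vcc_nominal, so multiply by the
    inverse ratio (hypothetical correction; requires voltage monitoring)."""
    return distance_reading * (vcc_nominal / vcc_measured)
```

With the battery sagged to 4 V, a reading of 80 (distance units) would be restored to the true 100, keeping the effective detection range constant over the working time.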

Acknowledgments

This work was accomplished through the collaboration of Istanbul Technical University, Istanbul (Turkey) and Ecole Nationale Supérieure des Arts et Industries Textiles, Roubaix (France). Appreciation is extended to the ENSAIT GEMTEX Laboratory and the ITU Textile-Clothing Control and Research Laboratory for their support in supplying materials and performing the experimental work. We also wish to thank the French Embassy in Turkey for financial support. Furthermore, we would like to offer our special thanks to Sim-Art Tekstil San. Tic. Ltd. Şti. for producing the final prototype.

References

[1] W. Fink, M. Tarbell, CYCLOPS: a mobile robotic platform for testing and validating image processing and autonomous navigation algorithms in support of artificial vision prostheses, Computer Methods and Programs in Biomedicine 96 (3) (2009) 226–233.

[2] D. Tsai, J.W. Morley, G.J. Suaning, N.H. Lovell, A wearable real-time image processor for a vision prosthesis, Computer Methods and Programs in Biomedicine (2009), ISSN 0169-2607.

[3] C. Lee Wee, M.K.H. Leung, G. Sainarayanan, SINVI: smart indoor navigation for the visually impaired, in: Control, Automation, Robotics and Vision Conference, vol. 2, 6–9 December 2004, pp. 1072–1077.


[4] K. Alhajri, N. Al-Salihi, V. Garaj, W. Balachandran, The performance of WiFi network for application in a navigation system for visually impaired people, 24–26 April 2008, pp. 243–249.

[5] R. Nagarajan, S. Yaccob, G. Sainarayanan, Fuzzy clustering in vision recognition applied in NAVI, in: Fuzzy Information Processing Society, Proceedings, NAFIPS Annual Meeting of the North American, 27–29 June 2002, pp. 261–266.

[6] R. Sainarayanan, R. Nagarajan, Y. Sazali, Fuzzy image processing scheme for autonomous navigation of human blind, Applied Soft Computing 7 (1) (2007) 257–264.

[7] G. Balakrishnan, G. Sainarayanan, R. Nagarajan, S. Yaccob, On stereo processing procedure applied towards blind navigation aid, in: SVETA Proceedings – 8th International Symposium on Signal Processing and its Applications 2, 2005, pp. 567–570.

[8] G. Balakrishnan, G. Sainarayanan, R. Nagarajan, S. Yaccob, Fuzzy matching scheme for stereo vision based electronic travel aid, in: IEEE Region 10 Annual International Conference, Proceedings, 2007.

[9] R. Jirawimut, S. Prakoonwit, F. Cecelja, W. Balachandran, Visual odometer for pedestrian navigation, Instrumentation and Measurement 52 (4) (2009) 1166–1173.

[10] M. Nicholas, S. Stephen, B. Michael, L. David, P. Penny, Robotic sensing for the partially sighted, Robotics and Autonomous Systems 26 (2) (1999) 185–201.

[11] N. Bourbakis, D. Kavraki, An intelligent assistant for navigation of visually impaired people, in: Proceedings of the IEEE 2nd International Symposium on Bioinformatics and Bioengineering, 4–6 November 2001, pp. 230–235.

[12] N. Bourbakis, R. Keefer, D. Dakopoulos, A. Esposito, Multimodal interaction scheme between a blind user and the tyflos assistive prototype, Tools with Artificial Intelligence 2 (2008) 487–494.

[13] D. Dakopoulos, N. Bourbakis, Preserving visual information in low resolution images during navigation of visually impaired, in: 1st International Conference on Pervasive Technologies Related to Assistive Environments, 2008.

[14] C.P. Gharpure, Orientation-free radio frequency identification-based navigation in a robotic guide for the visually impaired, M.S. dissertation, Utah State University, United States, Publication Number AAT 1423820, 2004.

[15] W.S. Mooi, E.N.H. Tan Chong, Efficient RFID tag placement framework for in building navigation system for the blind, in: Information and Telecommunication Technologies (APSITT), 2010, pp. 1–6.

[16] K. Yelamarthi, D. Hass, D. Nielsen, S. Mothersell, RFID and GPS integrated navigation system for the visually impaired, in: IEEE Engineering and Technology, 2010, pp. 1149–1152.

[17] B. Ji Zou, Duanyuan research of region navigation based on radio frequency identification, in: International Conference on Computer, Mechatronics, Control and Electronic Engineering (CMCE), 2010, pp. 300–303.

[18] S. Chumkamon, P. Tuvaphanthaphiphat, K. Keeratiwintakorn, A blind navigation system using RFID for indoor environments, in: Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, 5th International Conference 2, 14–17 May 2008, pp. 765–768.

[19] J. Fariaj, S. Lopes, H. Fernandes, P.B. Martins, Electronic white cane for blind people navigation assistance, in: World Automation Congress, 2010, pp. 1–7.

[20] T.H. Chang, C.J. Ho, D.C. Hsu, Y.H. Lee, M.S. Tsai, M.C. Wang, J. Hsu, iCane – a partner for the visually impaired, Lecture Notes in Computer Science, LNCS 3823 (2005) 393–402.

[21] T.K.M. Seto, A navigation system for the visually impaired using colored navigation lines and RFID tags, in: IEEE Engineering in Medicine and Biology Society Conference, 2009, pp. 831–834.

[22] Y. Shiizu, Y. Hirahara, K. Yanashima, K. Magatani, The development of a white cane which navigates the visually impaired, in: 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 22–26 August 2007, pp. 5005–5008.

[23] Y.J. Szeto, K. Andrew, S. Sharma, RFID based indoor navigational aid for persons with severe visual impairments, in: 29th Annual International Conference of IEEE-EMBS, Engineering in Medicine and Biology Society, 2007, pp. 6360–6363.

[24] P.E. Ponchillia, E.C. Rak, A.L. Freeland, S.J. LaGrow, Accessible GPS: reorientation and target location among users with visual impairments, Journal of Visual Impairment & Blindness 101 (7) (2007) 389–401.

[25] J.R. Marston, J.M. Loomis, R.L. Klatzky, R. Golledge, Nonvisual route following with guidance from a simple haptic or auditory display, Journal of Visual Impairment & Blindness 101 (4) (2007) 203–211.

[26] Y. Ebrahim, M. Abdelsalam, M. Ahmed, C. Siu-Cheung, Proposing a hybrid tag-camera-based identification and navigation aid for the visually impaired, in: Consumer Communications and Networking Conference, 3–6 January 2005, pp. 172–177.

[27] K. Yelamarthi, D. Haas, D. Nielsen, S. Mothersell, RFID and GPS integrated navigation system for the visually impaired, IEEE Engineering and Technology (2010) 1149–1152.

[28] R. Loomis, G. Marston, K. Golledge, Personal guidance system for people with visual impairment: a comparison of spatial displays for route guidance, Journal of Visual Impairment & Blindness 99 (2005) 219–232.

[29] B. Mayerhofer, B. Pressl, M. Wieser, ODILIA – a mobility concept for the visually impaired, Lecture Notes in Computer Science, LNCS 5105 (2008) 1109–1116.

[30] L. Ran, S. Helal, S. Moore, Drishti: an integrated indoor/outdoor blind navigation system and service, in: Pervasive Computing and Communications, Proceedings of the Second IEEE Annual Conference, 2004, pp. 23–30.

[31] D. Dakopoulos, N.G. Bourbakis, Wearable obstacle avoidance electronic travel aids for blind: a survey, IEEE Transactions on Systems, Man, and Cybernetics – Part C: Applications and Reviews 40 (1) (2010) 25–35.

[32] I. Ulrich, J. Borenstein, The GuideCane – applying mobile robot technologies to assist the visually impaired, IEEE Transactions on Systems, Man, and Cybernetics Part A: Systems and Humans 31 (2) (2001) 131–136.

[33] M. Bousbia-Salah, M. Fezari, An ultrasonic mobility system for blind and visually impaired people, Automatic Control and Computer Sciences 40 (3) (2006) 55–59.

[34] S. Shraga, U. Iwan, B. Johann, Computerized obstacle avoidance systems for the blind and visually impaired, in: Intelligent Systems and Technologies in Rehabilitation Engineering, CRC Press LLC, 2001.

[35] S. Shoval, J. Borenstein, Y. Koren, Auditory guidance with the Navbelt – a computerized travel aid for the blind, Systems, Man, and Cybernetics, Part C: Applications and Reviews 28 (3) (1998) 459–467.

[36] S. Sethu Selvi, U.R. Kamath, M.A. Sudhin, Andha Asthra – a navigation system for the visually impaired, in: IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, 20–22, 2008, pp. 137–142.

[37] M. Bousbia-Salah, A. Redjati, M. Fezari, M. Bettayeb, An ultrasonic navigation system for blind people, in: IEEE International Conference on Signal Processing and Communications, 24–27 November 2007, pp. 1003–1006.

[38] S. Cardin, F. Thalmann, F. Vexo, A wearable system for mobility improvement of visually impaired people, The Visual Computer 23 (2) (2007).

[39] S. Byeong-Seo, L. Cheol-Su, Y. Oakley, S. Brewster, Obstacle detection and avoidance system for visually impaired people, LNCS 4813 (2007) 78–85.

[40] S. Kursun, V. Koncar, F. Kalaoglu, S. Thomassey, Fuzzy based evaluation of vibrotactile perception by using vibration motor embedded to woven fabric, in: International Conference on Intelligent Textiles 2010, 16–18 June, Seoul, Korea, 2010, pp. 49–51.

[41] S. Kursun, F. Kalaoglu, V. Koncar, S. Thomassey, Comparison on perceived vibrotactile stimuli of e-textile structures by using fuzzy logic, in: 4th International Conference of Applied Research in Textile, Monastir, Tunisia, 2–5 December 2010.

[42] S. Kursun Bahadir, F. Kalaoglu, V. Koncar, Comparison of multi-connected miniaturized sonar sensors mounted on textile structure in a different position angle for obstacle detection, in: International Conference on Intelligent Textiles & Mass Customisation, Casablanca, Morocco, 27–29 October 2011.

[43] S. Kursun Bahadir, F. Kalaoglu, V. Koncar, Multi-connection of miniaturized sonar sensors onto textile structure for obstacle detection, in: AUTEX 2011, Mulhouse, France, 8–10 June 2011.

[44] Maxbotix Inc., LV-MaxSonar®-EZ3™ Data Sheet and LV Chaining Application Notes, available at www.maxbotix.com (accessed February 2011).

[45] Arduino LilyPad Vibe Board and LilyPad Arduino microcontroller, available at http://www.arduino.cc/ (accessed December 2010).

[46] S. Kursun Bahadir, V. Koncar, F. Kalaoglu, I. Cristian, S. Thomassey, Assessing the signal quality of ultrasonic sensor on different conductive yarns used as transmission lines, Fibres & Textiles in Eastern Europe 19 (5(88)) (2011) 75–81.

[47] S. Kursun Bahadir, F. Kalaoglu, S. Thomassey, I. Cristian, V. Koncar, A study on the beam pattern of ultrasonic sensor integrated to textile structure, Journal of Clothing Science and Technology 23 (4) (2011) 232–241.

[48] ATMEL® ATmega328 Technical Data Manual, http://www.atmel.com/dyn/resources/prod_documents/doc8271.pdf (accessed February 2011).

[49] R.L. Knoblauch, M.T. Pietrucha, M. Nitzburg, Field studies of pedestrian walking speed and start-up time, Transportation Research Record 1538 (2006) 27–38.

[50] T.J. Gates, D.A. Noyce, A.R. Bill, V. Nathanael, Recommended walking speeds for pedestrian clearance timing based on pedestrian characteristics, TRB 2006 Annual Meeting, Paper No. 06-1826, 2006.

[51] ASHRAE STANDARD 55-2010, Thermal Environmental Conditions for Human Occupancy.

[52] AATCC Test Method 135-2004, Dimensional Changes of Fabrics after Home Laundering.

Biographies

Senem Kursun Bahadir received her B.Sc. degree in Textile Engineering and her M.Sc. degree in Industrial Engineering from Istanbul Technical University (ITU), Turkey, in 2005 and 2007, respectively. She then completed her Ph.D. in Automatic/Computer Engineering at the University of Lille, France, in 2011, and in Textile Engineering at Istanbul Technical University, Turkey, in 2011. She is currently working as a research assistant at ITU, Turkey. Her research interests include electronic textiles, smart clothing system design and simulation techniques in the clothing area.

Professor Vladan Koncar is Director of the GEMTEX research laboratory and Head of Research at ENSAIT in Roubaix, France. Prior to this, he was President of the Association of Universities for Textiles (AUTEX) for 3 years (2007–2010); he is currently AUTEX Vice President. Professor Koncar completed his M.Sc. in electronics at the University of Belgrade, Serbia, and his Ph.D. in informatics and automation at the University of Lille, France, in 1991. Since then he has taught in fields such as communicative and intelligent textiles, automation, virtual reality, and computer networks. He is Editor in Chief of the International Journal of Textile Science and Technology and Special Editor of the Research Journal of Textile and Apparel, and sits on the editorial boards of a number of other publications including the Textile Progress Journal and the Nordic Textile Journal. He served as chairman of the ITMC (Intelligent Textiles and Mass Customisation) scientific conferences in 2007, 2009 and 2011, and of the Futurotextiles International Scientific conferences in 2006 and 2008. Currently, Professor Koncar is scientific coordinator of MAPICC 3D, an FP7 large-scale integrated project on textile composite structures. His research interests include multifunctional and intelligent textiles, flexible sensors and actuators, hybrid systems, and modelling, identification and control. He is author and co-author of more than 150 scientific articles, patents, book chapters and scientific communications.

Fatma Kalaoglu received her B.Sc. degree in Textile/Mechanical Engineering from Uludag University. She received her M.Sc. degree in Mechanical Engineering and her Ph.D. in Textile Engineering from Istanbul Technical University. She is currently a Professor in the Textile Engineering Department of the Faculty of Textile Technologies and Design of Istanbul Technical University. Her current topics include garment production, clothing comfort, fabric hand and drapability, and smart garments.