
Project HandTalk


Page 1: Project handtalk

HANDTALK
PRESENTATION BY: GAURAV RAJESH (1017415), JOSEPH EMMANUEL (1017417), SHRUTHI S (1017430)

Page 2: Project handtalk

Abstract: It is well known that deaf and mute people find it very challenging to interact with those around them. HandTalk is a novel technology that aids the differently-abled (deaf and mute) who have difficulty communicating with others. Existing gesture recognition gloves might solve the problem, but they are very expensive. HandTalk comes into play at this point as a problem-solver: it is accurate and inexpensive.

Page 3: Project handtalk

Introduction: Of all the available sign languages, American Sign Language (ASL) serves as the major form of communication for the deaf and mute in most communities.

By incorporating ASL into a portable device, the gestures made by the user can be recognized accurately through reliable software built to serve its purpose.

By ‘serve its purpose’, we mean that the software can detect the gestures made by the user, map the recognized gestures to the appropriate alphabets/words, and ultimately convert that text into speech.

Page 4: Project handtalk

Requirements

Hardware
◦ P5 virtual glove
◦ IR receiver
◦ Computer with a minimum of 512 MB RAM and 100 MB of free HDD space

Software
◦ Text-to-speech synthesizer
◦ Gesture-to-text converter

Page 5: Project handtalk

UI: P5 Virtual Reality Glove
◦ Designed for 3D and virtual environments
◦ Built around a proprietary bend sensor

Page 6: Project handtalk

Motivation
◦ 250-300 million people
◦ The situation in India
◦ The situation worldwide

Page 7: Project handtalk

The Design: We characterised four main modules that comprise the system.

1. The UI module deals with the P5 glove, its features and composition, and its interaction with the system.

2. The Communication module accounts for the communication technology employed: the infrared link bundled with the P5 glove.

3. The Processing module is the vital one: the finger-bend data is received continuously and interpreted by the software, which looks it up in the gesture dictionary and, once a match is found, emits the appropriate text. That text is then passed to a speech synthesizer, which transforms it into audio for the user (sketched after this list).

4. The Output module has only a minor role compared with the other three; its main concern is delivering the output to the user.
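To make the Processing module concrete, here is a minimal Python sketch of its receive-match-speak loop. Everything glove-specific is an assumption: read_bend_values stands in for whatever driver call returns the P5's finger-bend readings, and the gesture dictionary entries are illustrative only; pyttsx3 is just one off-the-shelf text-to-speech synthesizer.

```python
# Sketch of the Processing module's receive-match-speak loop.
# read_bend_values() is a stand-in for the real glove driver call.
import pyttsx3

# Illustrative gesture dictionary: a tuple of quantized finger-bend
# levels (thumb..pinky) maps to the text it represents.
GESTURE_DICTIONARY = {
    (0, 1, 1, 1, 1): "A",
    (0, 0, 0, 0, 0): "B",
}

def quantize(raw_bends, levels=2):
    """Reduce raw readings in [0, 1] to coarse bend levels so small
    variations between repetitions of a gesture still match."""
    return tuple(min(int(v * levels), levels - 1) for v in raw_bends)

def process_loop(read_bend_values):
    engine = pyttsx3.init()            # text-to-speech synthesizer
    while True:
        raw = read_bend_values()       # five floats in [0, 1] (assumed)
        if raw is None:                # glove disconnected
            break
        text = GESTURE_DICTIONARY.get(quantize(raw))
        if text is not None:
            print(text)                # text preview for the user
            engine.say(text)           # speak the matched gesture
            engine.runAndWait()
```

The quantization step is the design choice that keeps the dictionary lookup a plain hash-table match instead of a nearest-neighbour search, which is what keeps the conversion delay low.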

Page 8: Project handtalk
Page 9: Project handtalk

Key features:
◦ Conversion delay
◦ Text preview and lower gesture preview time
◦ Switch-free support for the gesture dictionary
◦ Editable gesture database (a possible format is sketched after this list)
◦ Instant speech
◦ Speak anytime
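The slides do not specify how the editable gesture database is stored; as an assumption, a human-editable JSON file would satisfy the "editable" and "switch-free" goals. A minimal sketch:

```python
import json

# Hypothetical on-disk format: each named gesture records its quantized
# finger-bend pattern (thumb..pinky) and the text it should produce.
SAMPLE_DB = {
    "letter_a": {"bends": [0, 1, 1, 1, 1], "text": "A"},
    "hello":    {"bends": [0, 0, 0, 0, 0], "text": "Hello"},
}

def save_db(path, db):
    """Write the database as indented JSON so users can edit it by hand."""
    with open(path, "w") as f:
        json.dump(db, f, indent=2)

def load_lookup(path):
    """Load the file and flatten it into the bend-pattern -> text mapping
    that the matching loop consumes."""
    with open(path) as f:
        db = json.load(f)
    return {tuple(e["bends"]): e["text"] for e in db.values()}
```

Keeping the stored format name-keyed while flattening it to a pattern-keyed dictionary at load time lets users add or rename gestures without touching the matching code.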

Page 10: Project handtalk

The User Interface: HandTalk uses a commercial off-the-shelf (COTS) component, the P5 Virtual Reality Glove, which is mainly used for 3D gaming.

It offers a user-friendly interface that the deaf can use to interact with the software and get what they want.

It is inexpensive, lightweight, and portable, which fits every aspect of our scheme.

The glove is built upon flex sensors. As the name indicates, the sensors are flexible, and they work on the unique property that their resistance varies with the degree of finger bend (illustrated below).
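As an illustration of that resistance-to-bend property (the P5's actual sensor is proprietary, so every constant here is an assumption), a flex sensor wired as one leg of a voltage divider can be converted to a bend estimate as follows:

```python
def flex_resistance(adc_reading, adc_max=1023, vcc=5.0, r_fixed=10_000.0):
    """Infer the flex sensor's resistance from a voltage-divider reading,
    assuming the sensor is the lower leg of the divider."""
    adc_reading = min(adc_reading, adc_max - 1)  # avoid divide-by-zero at full scale
    v_out = vcc * adc_reading / adc_max
    # V_out = Vcc * R_flex / (R_fixed + R_flex)  =>  solve for R_flex
    return r_fixed * v_out / (vcc - v_out)

def bend_fraction(r_flex, r_flat=25_000.0, r_full_bend=100_000.0):
    """Map resistance to a 0..1 bend estimate by linear interpolation
    between the flat and fully-bent resistances (illustrative values)."""
    frac = (r_flex - r_flat) / (r_full_bend - r_flat)
    return min(max(frac, 0.0), 1.0)
```

The resulting 0..1 bend values are exactly what the quantization step in the Processing module sketch consumes.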

The glove is embedded with eight infrared transmission points that constantly transfer data to the IR receiver, which in turn connects to the computer through a USB port.

Page 11: Project handtalk
Page 12: Project handtalk

Problems with existing systems (e.g., Meta Motion's Data Glove series and UC Berkeley's Acceleration Sensing Glove prototype):

◦ Expensive
◦ Only gesture recognition
◦ Time delay problem
◦ Buffer problem

Page 13: Project handtalk
Page 14: Project handtalk

AND MORE…! Future enhancements:

◦ Wired glove into wireless glove
◦ Bluetooth instead of Infrared
◦ Mobile replaces laptop
◦ Support for Natural Language Processing
◦ Embedded chip

Page 15: Project handtalk
Page 16: Project handtalk
Page 17: Project handtalk

Literature survey

Isaac Garcia Incertis, Jaime Gomez Garcia-Bermejo, Eduardo Zalama Casanova, "Hand Gesture Recognition for Deaf People Interfacing," Proc. 18th International Conference on Pattern Recognition (ICPR'06), vol. 2, pp. 100-103, 2006.

John Kangchun Perng, Brian Fisher, Seth Hollar, Kristofer S.J. Pister, "Acceleration Sensing Glove," Proc. Third International Symposium on Wearable Computers (ISWC'99), p. 178, 1999.

Ali Farhadi, David Forsyth, Ryan White, "Transfer Learning in Sign Language," Proc. 2007 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1-8, 2007.

Page 18: Project handtalk

Jiyong Ma, Wen Gao, Jiangqin Wu, Chunli Wang, "A Continuous Chinese Sign Language Recognition System," Proc. Fourth IEEE International Conference on Automatic Face and Gesture Recognition (FG'00), p. 428, 2000.

Sylvie C.W. Ong, Surendra Ranganath, "Automatic Sign Language Analysis: A Survey and the Future beyond Lexical Meaning," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 6, pp. 873-891, June 2005, doi:10.1109/TPAMI.2005.112

"Experiments in Virtual Reality" by David Harrison and Mark Jaques. Butterworth-Heinemann (Newton, USA), 1996.

"The Virtual Reality Homebrewer's Handbook" by Robin Hollands, Sean Clark and Chris Hand. John Wiley & Sons, Inc. (New York, USA), 1996.

http://www.cwonline.com

"The American Sign Language Handshape Dictionary" by Richard A. Tennant and Marianne Gluszak Brown. Gallaudet University Press, 1998.

Page 19: Project handtalk

THANK YOU