
Page 1: ASLLENGE

ASLLENGE
He Zhou, Hui Zheng, William Mai, Xiang Guo

Faculty Advisor: Prof. Patrick A. Kelly

Department of Electrical and Computer Engineering

ECE 415/ECE 416 – SENIOR DESIGN PROJECT 2012
College of Engineering - University of Massachusetts Amherst

SDP 12

Abstract
Human gesture recognition can be extremely challenging, since gestures provide no auditory information to the learner. The human brain is accustomed to distinguishing one word from another by hearing sounds of different intensity, pitch, and tone. Gestures instead rely on streams of visual information that are much more physically demanding to produce and interpret: they let people express themselves through distinctive facial, eye, head, and body movements. The overall goal of ASLLENGE is to design and implement a modern tool that can easily record and compare human gestures to enhance communication. The main components of the system are a Microsoft Kinect based arm motion tracking device and a finger tracking module.

Name of Component     Quantity   Unit Price
Arduino Mega 2560     1          $56
Bluetooth Module      1          $40
Flex Sensor           10         $12
A Pair of Gloves      1          $30
Kinect Sensor         1          $150
Packing Materials     1          $20
Conductive Pen        1          $20
Total Price                      $451

Cost for The Project

The table above shows the detailed cost of our project. If it were to be recreated, the total cost would be $431.

Circuitry between the flex sensor glove and the Arduino board

Major Components
• Hardware
  • 1 Arduino Mega 2560
  • 10 Flex sensors
  • 1 BlueSMiRF Silver
  • 1 Kinect
• Software
  • Visual Studio (C#)
  • Arduino Software (C)
  • Terminal Emulator

[Photos: Flex Sensor, BlueSMiRF Silver, Arduino Mega 2560, User Interface]

Page 2: ASLLENGE

Detailed Flow Chart

[Flow chart: Check Threshold → Within Threshold / Exceed Threshold]

Motion Recognition Steps

1: The starting/ending of a gesture is when both hands are on the hip
2: When using one hand, the other hand must be resting on the hip
3: The start and stop are controlled by the controller

Two Simple Motions

Motion 1:
1: Raise both hands in a frontal motion above your head at the same time
2: Lower both hands in a frontal motion to the stop position at the same time

Motion 3:
1: Raise left hand in a frontal motion above your head
2: Lower left hand in a frontal motion to the stop position
3: Raise right hand in a frontal motion above your head
4: Lower right hand in a frontal motion to the stop position

• User does a gesture
• Data from the Kinect camera is collected through the 3D depth sensors and RGB camera
• The flex sensor inserted into each glove collects an analog signal while the fingers are bending
• The Arduino Mega 2560 works as an A/D converter
• The BlueSMiRF Silver transmits data to a PC wirelessly
• Data from the finger tracker is split into the left and right hands
• The Kinect skeleton is generated by the Kinect SDK

[Photo: Flex sensor being inserted into the glove]