
Human Activity Recognition in Android


Page 1: Human Activity Recognition in Android

Help-in-Hand

Major Project 2013-2014

Made by: Surbhi Jain, Aishwarya Jain

Page 2: Human Activity Recognition in Android
Page 3: Human Activity Recognition in Android

The Android Framework

The Android platform is an open platform for mobile devices, consisting of an operating system, applications and middleware.

Android gives users the opportunity to build and publish their own applications by providing an open development environment. Android treats all applications (native and third-party) as equals.

Therefore, having such an open development environment requires security measures to be taken in order to protect the integrity of the Android platform and the privacy of its users.

Page 4: Human Activity Recognition in Android

Why Android?

• Android is an open source mobile operating system with a Linux kernel.

• The Android SDK integrates with the Eclipse IDE.

• Android treats both native and third-party applications the same, so we can easily build and develop our own applications.

• The Android software development kit includes a set of development tools such as a debugger, libraries, a handset emulator, documentation, sample code and tutorials.

• The Android SDK has a Java framework and a powerful API for the hardware embedded in smartphones.

Page 5: Human Activity Recognition in Android

Android Architecture

Page 6: Human Activity Recognition in Android

Various Android Sensors

Android has several sensors available:

• Accelerometer
• Orientation
• Ambient Light
• Proximity
• Magnetic Force

Use of these sensors does not require direct user permission!
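As a rough illustration of how an app reaches these sensors, the Java sketch below (class and log-tag names are ours, not from the slides) queries Android's SensorManager for the available sensors and for the default accelerometer used throughout this project.

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

import java.util.List;

// Minimal sketch: list the sensors available on the device.
// No manifest permission is needed to read these motion sensors.
public class SensorListActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        SensorManager sensorManager =
                (SensorManager) getSystemService(SENSOR_SERVICE);

        // Enumerate every sensor the hardware exposes.
        List<Sensor> sensors = sensorManager.getSensorList(Sensor.TYPE_ALL);
        for (Sensor sensor : sensors) {
            Log.d("Sensors", sensor.getName() + " (type " + sensor.getType() + ")");
        }

        // Grab the default accelerometer, which this project relies on.
        Sensor accelerometer =
                sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        if (accelerometer == null) {
            Log.w("Sensors", "No accelerometer available on this device");
        }
    }
}
```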

Page 7: Human Activity Recognition in Android

Accelerometer Usage (contd.)

Activity Recognition

Distance travelled

Page 8: Human Activity Recognition in Android

Accelerometer Usage

Activity Recognition

Desired Outputs:
- Physical Activities (e.g., Running, Walking)
- Approximate time spans
- Quick detection of change

Page 9: Human Activity Recognition in Android

Our Objective

To explore the accelerometer as the basis for context-aware applications for physical activity recognition on the Android framework.

Page 10: Human Activity Recognition in Android

Literature Survey

List of Papers Studied:

• Paper 1: Applications of Mobile Activity Recognition
  Authors: Jeffrey W. Lockhart, Tony Pulickal and Gary M. Weiss

• Paper 2: User, Device and Orientation Independent Human Activity Recognition on Mobile Phones: Challenges and a Proposal
  Authors: Yunus Emre Ustey, Ozlem Durmaz Incel and Cem Ersoy

• Paper 3: Simple and Complex Activity Recognition Through Smart Phones
  Authors: Das, B., Krishnan, Narayanan C., Thomas, B.L. and Cook, D.J.

• Paper 4: Fall Detection by Built-In Tri-Accelerometer of Smartphone
  Authors: Yi He, Ye Li and Shu-Oi Bao

• Paper 5: Feature Selection Based on Mutual Information for Human Activity Recognition
  Authors: Khan, A., Chehade, N.H., Chieh Chien and Pottie, G.

contd..

Page 11: Human Activity Recognition in Android

• Paper 6: Smartphone-based Monitoring System for Activities of Daily Living for Elderly People and Their Relatives Etc.
  Authors: Kazushige Ouchi and Miwako Doi

• Paper 7: Environment Feature Extraction and Classification for Context Aware Physical Activity Monitoring
  Authors: Troped, P.J., Evans, J.J. and Pour, G.M.

• Paper 8: Fall Detection Based on Movement in Smartphone Technology
  Authors: Gueesang Lee and Deokjai Cho

• Paper 9: Activity Logging Using Lightweight Classification Techniques in Mobile Devices
  Authors: Henar Martín, Ana M. Bernardos, Josué Iglesias and José R. Casar

• Paper 10: Privacy Control in Smart Phones Using Semantically Rich Reasoning and Context Modeling
  Authors: Dibyajyoti Ghosh, Joshi, A., Finin, T. and Jagtap, P.

contd..

Page 12: Human Activity Recognition in Android

• Paper 11: Towards Successful Design of Context-Aware Application Frameworks to Develop Mobile Patient Monitoring Systems Using Wireless Sensors
  Authors: Al-Bashayreh, M.G., Hashim, N.L. and Khorma, O.T.

• Paper 12: ActivityMonitor: Assisted Life Using Mobile Phones
  Authors: Matti Lyra and Hamed Ketabdar

Page 13: Human Activity Recognition in Android

Comparison among the papers

Paper | Parameters Used | Algorithm Used/Proposed
------|-----------------|------------------------
Paper 1 | Nil | Neural networks and J48 decision trees
Paper 2 | Mean, variance, std. dev., zero crossing rate, period | Autocorrelation, K-nearest neighbors (KNN), fast Fourier transform (FFT) coefficients
Paper 3 | Mean, min, max, std. dev., zero crossing rate, correlation | Multi-layer Perceptron, Naïve Bayes, Bayesian network, Decision Table, Best-First Tree, and K-star
Paper 4 | 1) Acceleration due to body movement; 2) gravitational acceleration | Signal Magnitude Vector, Signal Magnitude Area (SMA), Tilt Angle (TA), median filter
Paper 5 | Standard deviation, mean, absolute mean, energy ratio, ratio of DC to sidelobe, first sidelobe location, max value, short time energy, correlation | Tree-based feature selection algorithm based on mutual information; binary decision tree with a naïve Bayes classifier
Paper 6 | Average, minimum, maximum and variance, MFCC (Mel-Frequency Cepstral Coefficient), RMS (Root Mean Square) and ZCR (Zero-Crossing Rate) | Stochastic model, Neural Networks, SVM (every 1 sec.)

Cont…

Page 14: Human Activity Recognition in Android

Paper | Parameters Used | Algorithm Used/Proposed
------|-----------------|------------------------
Paper 7 | Mean and sigma of the Gaussian function | K-nearest neighbor
Paper 8 | Nil | Lower Threshold (LT) and Upper Threshold (UT)
Paper 9 | Mean, variance, zero crossing rate, 75th percentile | Naïve Bayes, Decision Table and Decision Tree
Paper 10 | Nil | Nil
Paper 11 | Nil | Nil
Paper 12 | Average magnitude value, average rate of change, weighted sum | Multi-Layer Perceptron (MLP)

Page 15: Human Activity Recognition in Android

Current Problems

• First and foremost is the use of body-worn sensors. In most of the apps we surveyed, external sensors are used to detect the physical movements of a person. Carrying an external device everywhere is not practical, and people sometimes forget to wear it.

• In most apps, the positioning of the device is critical to the success of the application, i.e. most apps are built for a specific device position. If the device is held in the hand, the readings differ from those generated when the device is kept in a pocket.

• The use of multiple sensors to achieve the same goal makes the application bulky, leading to slower processing of the data, and also increases its cost.

Page 16: Human Activity Recognition in Android

Restating the Problem

We primarily focused on the Activity Recognition project.

Inputs:
- X acceleration
- Y acceleration
- Z acceleration

Desired Outputs:
- Physical Activities (e.g., Running, Walking)
- Approximate time spans
- Quick detection of change

Page 17: Human Activity Recognition in Android

The Activity Recognition Process

Page 18: Human Activity Recognition in Android

Data Collection Process

The first step of the project was to collect raw accelerometer data and transform it into features that WEKA, the machine-learning tool we used, could take to train a classifier. To accomplish this, we first took in sensor samples made up of acceleration readings in the x, y and z directions and computed their magnitudes. All of the data was labeled manually as running, walking, standing or sitting. To make the data more accurate, more than 20 minutes of data were collected. Data gathering was done by performing experiments on four subjects. Each subject was asked to collect data for each activity in turn, placing the smartphone at the positions mentioned above. Each subject performed the set of 6 activities one by one for a duration of two minutes each, and the respective data was recorded in a .csv file in the external storage of the smartphone.
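A minimal sketch of this collection step is given below; it assumes the listener is registered from an Activity or Service, and the file name, label string and sampling rate are illustrative choices rather than details taken from the slides.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Environment;

import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

// Minimal accelerometer logger: writes time, x, y, z, magnitude and a manual
// activity label to a CSV file on external storage.
public class AccelerometerLogger implements SensorEventListener {

    private final FileWriter writer;
    private final String activityLabel; // e.g. "walking", set manually per session

    public AccelerometerLogger(SensorManager sensorManager, String activityLabel)
            throws IOException {
        this.activityLabel = activityLabel;
        File out = new File(Environment.getExternalStorageDirectory(),
                "accel_" + activityLabel + ".csv");
        this.writer = new FileWriter(out, true);

        Sensor accelerometer =
                sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        // SENSOR_DELAY_GAME gives roughly 50 Hz on most devices (assumed rate).
        sensorManager.registerListener(this, accelerometer,
                SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0];
        float y = event.values[1];
        float z = event.values[2];
        // The magnitude makes the signal less sensitive to phone orientation.
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        try {
            writer.write(event.timestamp + "," + x + "," + y + "," + z + ","
                    + magnitude + "," + activityLabel + "\n");
        } catch (IOException e) {
            // Ignored in this sketch; a real app should stop logging and report.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this logger.
    }

    public void stop(SensorManager sensorManager) throws IOException {
        sensorManager.unregisterListener(this);
        writer.close();
    }
}
```

Writing the .csv file to external storage also requires the WRITE_EXTERNAL_STORAGE permission in the application manifest.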

Contd...

Page 19: Human Activity Recognition in Android

Data Collection Process

Page 20: Human Activity Recognition in Android

Feature Extraction

Feature Extraction is the process of extracting key “features” from a signal. Features will be extracted from every sample window of 512 samples. The following features will be used in our project:

1. Fundamental Frequencies: the average of the three dominant frequencies of the signal over the sample window, found via a Discrete Fourier Transform.

2. Average Acceleration: the arithmetic mean of the acceleration magnitudes over the sample window.

3. Max Amplitude: the maximum acceleration value of the signal in the sample window.

4. Min Amplitude: the minimum acceleration value of the signal in the sample window.
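The sketch below shows one possible implementation of these four features for a single 512-sample window; the 50 Hz sampling rate used to convert DFT bins into frequencies is an assumption, not a value stated in the slides.

```java
// Sketch of the four window features described above, computed from one
// window of acceleration magnitudes (512 samples per the slides).
public class FeatureExtractor {

    public static final double SAMPLING_RATE_HZ = 50.0; // assumed sampling rate

    /** Returns {avgFundamentalFreqHz, mean, max, min} for one window. */
    public static double[] extract(double[] magnitudes) {
        double sum = 0;
        double max = Double.NEGATIVE_INFINITY;
        double min = Double.POSITIVE_INFINITY;
        for (double m : magnitudes) {
            sum += m;
            max = Math.max(max, m);
            min = Math.min(min, m);
        }
        double mean = sum / magnitudes.length;

        // Naive magnitude DFT over the positive-frequency bins (skip bin 0 = DC).
        int n = magnitudes.length;
        double[] spectrum = new double[n / 2];
        for (int k = 1; k < n / 2; k++) {
            double re = 0, im = 0;
            for (int t = 0; t < n; t++) {
                double angle = 2 * Math.PI * k * t / n;
                re += magnitudes[t] * Math.cos(angle);
                im -= magnitudes[t] * Math.sin(angle);
            }
            spectrum[k] = Math.sqrt(re * re + im * im);
        }

        // Average the frequencies of the three strongest bins.
        double freqSum = 0;
        for (int i = 0; i < 3; i++) {
            int best = 1;
            for (int k = 2; k < spectrum.length; k++) {
                if (spectrum[k] > spectrum[best]) best = k;
            }
            freqSum += best * SAMPLING_RATE_HZ / n;
            spectrum[best] = 0; // remove it so the next pass finds the next peak
        }
        double avgDominantFreq = freqSum / 3;

        return new double[] {avgDominantFreq, mean, max, min};
    }
}
```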

Page 21: Human Activity Recognition in Android

Classification

Classification is the process of labeling unknown patterns based on the knowledge of known patterns of data. Four different classifiers were used:

K-Nearest Neighbor: based on the shortest Euclidean distance between the feature vectors of the unknown and known data.

Naïve Bayes: assumes the absence of one feature does not disqualify a candidate (e.g., an object that is red and round is classified as an apple, even if it is not known to be a fruit).

J48 (Decision Tree): J48 builds decision trees from a set of labeled training data using the concept of information entropy. It uses the fact that each attribute of the data can be used to make a decision by splitting the data into smaller subsets.

Random Forest: an ensemble classifier using many decision tree models. It can be used for classification or regression.
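Since the features are prepared for WEKA, a classifier such as J48 can be trained and evaluated with only a few lines of Java. The sketch below is illustrative; the ARFF file name is assumed, and any of the other classifiers above could be swapped in for J48.

```java
import java.util.Random;

import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

// Sketch: train and cross-validate a J48 decision tree in WEKA on the
// extracted window features. "features.arff" is an illustrative file name.
public class ActivityClassifierTrainer {

    public static void main(String[] args) throws Exception {
        // Load the labeled feature vectors (one instance per 512-sample window).
        Instances data = DataSource.read("features.arff");
        // The last attribute is assumed to hold the activity label.
        data.setClassIndex(data.numAttributes() - 1);

        Classifier classifier = new J48();

        // 10-fold cross-validation gives an estimate of classification accuracy.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(classifier, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());

        // Train on the full data set so the model can be used on the phone.
        classifier.buildClassifier(data);
    }
}
```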

Page 22: Human Activity Recognition in Android

Use Case Diagram

Page 23: Human Activity Recognition in Android

Overall Description

The application will be divided into several modules, which will be implemented one by one.

Module-1: Activity Recognition
Physical activities like cycling, running, walking, standing etc. performed by the user will be recognized in this module. The user will click on the app icon or the start button in the app, which starts the sensors so that his motion can be detected.

Module-2: Location Based Activity Recognition
In addition to activity recognition, the GPS sensor will be used to find the user's location and what activity he is performing at that location (a minimal sketch of the location part is given below). This will help in the Fall Detection module discussed later.
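The sketch below outlines one way Module-2 could attach GPS fixes to the most recently recognised activity; the update interval, distance threshold and class names are assumptions, and the ACCESS_FINE_LOCATION permission would be required in the manifest.

```java
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;
import android.util.Log;

// Sketch of the location part of Module-2: log the current GPS fix together
// with the activity most recently reported by the recognition module.
public class ActivityLocationTracker implements LocationListener {

    private String currentActivity = "unknown"; // updated by the recognition module

    public void start(LocationManager locationManager) {
        locationManager.requestLocationUpdates(
                LocationManager.GPS_PROVIDER,
                60000L,  // minimum time between updates, in milliseconds (assumed)
                10.0f,   // minimum distance between updates, in metres (assumed)
                this);
    }

    public void setCurrentActivity(String activity) {
        this.currentActivity = activity;
    }

    @Override
    public void onLocationChanged(Location location) {
        Log.d("ActivityLocation", currentActivity + " at "
                + location.getLatitude() + ", " + location.getLongitude());
    }

    @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
    @Override public void onProviderEnabled(String provider) { }
    @Override public void onProviderDisabled(String provider) { }
}
```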

Contd…

Page 24: Human Activity Recognition in Android

Module-3: Fall Recognition
In addition to physical movement recognition, fall recognition will be included. Whenever fall detection gives a positive result, an alarm will be raised instantly; the app will then monitor physical activity and, if no motion is detected, send an emergency message to the guardian informing them about the accident. (A minimal threshold-based sketch of the detection step is given below.)

Module-4: Physical Activity Chart
The user will be able to see his/her daily physical activity chart, i.e. how much workout was done today and what types of physical activity were performed during the day.

Module-5: Calories Burnt
The user will be able to see the calories burnt within a day and how many calories he/she should burn to stay physically fit.
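One simple way to implement the detection step, in the spirit of the lower/upper threshold approach listed for Paper 8 in the comparison table, is sketched below; the threshold values are assumptions and would need tuning against real fall data.

```java
// Hedged sketch of a threshold-based fall check over one window of
// acceleration magnitudes: a free-fall dip followed by an impact spike.
public class FallDetector {

    // Free fall: magnitude well below 1 g; impact: magnitude well above 1 g.
    private static final double LOWER_THRESHOLD = 3.0;   // m/s^2 (assumed)
    private static final double UPPER_THRESHOLD = 25.0;  // m/s^2 (assumed)

    /** Returns true if a free-fall sample is followed by an impact sample. */
    public static boolean isFall(double[] magnitudes) {
        boolean freeFallSeen = false;
        for (double m : magnitudes) {
            if (m < LOWER_THRESHOLD) {
                freeFallSeen = true;
            } else if (freeFallSeen && m > UPPER_THRESHOLD) {
                return true;
            }
        }
        return false;
    }
}
```

If isFall() returns true, the app would raise the alarm and, after a period with no detected motion, send the emergency message described above (for example via Android's SmsManager).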

 

Contd…

Page 25: Human Activity Recognition in Android

Module-6: Medical Reminder
The user can set a medical reminder if he/she wants. For this, the prescription has to be fed into the phone along with the timings, and the reminder made active.

Module-7: Distance Travelled
The user will be able to see the distance travelled by running, cycling or walking.

Page 26: Human Activity Recognition in Android

Data Flow Diagram

Page 27: Human Activity Recognition in Android

Algorithm Flow Diagram

Page 28: Human Activity Recognition in Android

IMPLEMENTATION

Page 29: Human Activity Recognition in Android
Page 30: Human Activity Recognition in Android

Risk and Mitigation

Risk Id | Description of Risk | Risk Area | Probability (P) | Impact (I) | RE (P*I) | Selected for Mitigation (Y/N) | Mitigation Plan
--------|---------------------|-----------|-----------------|------------|----------|-------------------------------|----------------
1 | Position of the mobile phone, i.e. whether the phone is held in the hand or kept in a chest pocket or pant's pocket | Sensor readings variation | High | High | High | Y | Taking data by considering all the possible locations
2 | Battery drainage | Hardware | High | Medium | Medium | N | Nil
3 | Sending each SMS to a hidden 3rd party address/device | Security | Medium | Low | Medium | N | Nil
4 | Computational limitations | Hardware | Medium | Low | Medium | Y | On cloud storage service

Page 31: Human Activity Recognition in Android

Sources

• International Journal of Distributed Sensor Networks, provided by Hindawi.com

• IEEE Sensors Journal, provided by http://www.ieee-sensors.org/journals

• IJCA Proceedings on the International Conference on Recent Trends in Information Technology and Computer Science 2012

• Acoustics, Speech and Signal Processing (ICASSP), 2012 IEEE International Conference

• Sensors Applications Symposium (SAS), 2013 IEEE

• 2012 IEEE RIVF International Conference

• IEEE Symposium on Security and Privacy Workshops, 2012

• ACM Transactions on Knowledge Discovery from Data (TKDD)

• ACM Journal of Data and Information Quality

Page 32: Human Activity Recognition in Android

The End