
Emotion Recognition: Benefits and Challenges Sujith Kumar Anand

Affective Computing and Human Robot Interaction

University College London [email protected]

ABSTRACT This project aims to capture body postures with sensing devices and a video recorder for automatic recognition of affective states, and to motivate ACL patients through feedback and a personalized exercise plan.

INTRODUCTION An injury sustained while participating in a sport is a traumatic experience for a person who has dedicated their time and energy to their fitness and achievements[1]. The most common knee injury among sports professionals is a tear of the ACL, or anterior cruciate ligament[2]. A targeted set of strength-training fitness activities can be useful for people with ACL injuries[3]. Researchers have tried to generalize the emotional aspects of sports injury[4], but Crossman[5] and Smith[6] argue that the post-injury responses of an ACL-injured person are different and more complex. Emotions commonly shown in response to injury include disengagement, frustration, anger, tension, disbelief, fear, rage, depression and fatigue[6]. Johnston and Carroll[7] studied the differences between uninjured and injured athletes and found that injured athletes showed greater negative affect, lower self-esteem, and considerably more disengagement, depression and anxiety. ACL patients require regular fitness activities during their rehab period to improve their short- and long-term outcomes with respect to pain. However, emotions such as fear of the pain and/or of re-injury prevent many individuals with ACL injuries from adhering to an exercise regime[4][8]. Being an athlete requires commitment, determination and, most importantly, passion; during these periods athletes may question their identity as athletes if they are unable to practice and compete[6]. The squat is one of the exercises used by physiotherapists for ACL athletes during their rehab period: performed correctly, it puts every muscle below the waist to work[9]. The use of technology to support physical activity has increased over time, but its use for people with chronic pain performing physical activity is still in its infancy[10]. Research by [11][12][13] showed that pedometers can automatically track exercise and motivate a person to be physically active. However, pedometers suit activities such as running and do not transfer well to strength-training exercises[14]. Large companies (Nike, FitBit, Microsoft) have come forward with their own devices to support fitness activities with wearable fitness sensors such as the FuelBand and the Flex. These can be worn while exercising to track calorie expenditure and overall activity, but they fail to capture information such as repetitions, sets, and time taken[14]. Moreover, camera-plus-sensor technology does not yet robustly handle the different body motions and postures of strength-training exercises such as squats. In this work, the goal is to understand the advantages and disadvantages of using a sensing device (the Microsoft Kinect sensor) and external devices (an Empatica wristband and a video camera) to design a system that identifies affective states, in our case engagement and disengagement, in a fitness context.

BACKGROUND & RELATED WORK

What is the ACL? ACL stands for Anterior Cruciate Ligament. The injury occurs in the knee, the largest and most complex joint in the human body. The knee depends on ligaments, muscles, tendons and secondary ligaments to function consistently [15].

How is the ACL Injured? The most common way the ACL is injured is through direct contact to the knee, as often occurs in football: one or more ligaments are torn when the knee is struck by another object from an abnormal position. An ACL injury can also occur without external contact, for example when a running athlete changes direction or hyperextends the knee while landing from a jump [15].

What does ACL injury recovery entail? Recovery from ACL reconstruction is achieved through rehabilitation, which requires time and hard work. Between 6 weeks and 6 months are needed, depending on the individual's activity and the severity of the injury. Research studies have shown that about 90 percent of ACL patients are able to return to normal sporting duty without symptoms of knee instability [15].

Body Expression de Gelder[16] notes that 95% of studies of human emotion use facial expressions, while information from sounds (voice, music, environment) and from whole-body expressions makes up the remaining 5%. Kleinsmith et al.[17] note that body expressions are important for non-verbal communication. Changes in body posture tend to reflect changes in a person's affective state[18][19], and some affective states are communicated better through the body than the face[20]. Studies have shown that people tend to control their facial expressions more than their body expressions[21], and that people trust body expressions more than facial expressions[22]. Therefore, we focus extensively on body expressions to identify a participant's engagement in performing a squat. Studies in psychology use body expressions to evaluate which postural features convey the affective states to be recognized. Aronoff et al. [23] used acted movements with angular and diagonal postures to signify threatening behavior. James[24] showed the importance of body openness, leaning direction and head position for distinguishing several affective states. Similarly, Wallbott[19] used arm, shoulder and head positions to distinguish between 14 emotions, and Coulson[25] used computer-generated avatars to examine which postural features lead people to assign affect to body posture. In the same spirit, our work uses body expressions to identify affective states.

Recognition of Body Expressions

Acted sets Most current automatic recognition systems for identifying affective states are based on Laban movement analysis [26]. Camurri et al. [27] examined body expressions to identify emotional expressions in dance; automatic recognition of four emotions ranged between 31% and 46%, whereas observers' ratings reached 56% [28]. De Silva and Bianchi-Berthouze [29] used acted postures, labeled by observers, to recognize emotions from low-level features of the body joints; their recognition model yielded 90% accuracy. Similarly, Bernhardt and Robinson [30] and Pollick et al. [31] built recognition models using acted sets. In our work, we use acted sets whose labels come automatically from the acting instructions, without observers, to identify affective states.

Unacted or non-acted sets Picard and colleagues [32][33] used non-acted sets to build multimodal recognition models from body expressions, identifying levels of interest [32] and self-narrated frustration [33]. They obtained 55.1% recognition accuracy combining facial expressions, body postures and game-state information. Similarly, Kleinsmith et al. [17] proposed a computational model to identify affective states from low-level non-acted body postures; in that study, external observers were used, with an extensive agreement analysis, to identify the base rate. In our work, we use non-acted body postures to recognize the affective states engaged and disengaged, with the labeling of the non-acted sets performed by observers.

Technology support for physical activity in ACL

Technology support for self-monitoring and for motivating physical activity is increasing[10]. However, it falls short in addressing chronic-pain issues such as fear of movement: existing technology does not provide the detailed support that chronic pain users value[34]. Smartphone apps are better suited to acute pain than to chronic pain. WebMD Pain Coach (www.webmd.com/webmdpaincoach.app) was developed with healthcare professionals to enable users to monitor their pain, set and track activity goals, and receive related messages. Similarly, Google PACO, a self-monitoring app, allows users to create personalized monitoring for specific exercises and to identify relevant factors. Such apps support the monitoring of physical and psychological states, but fail to address the behaviors present in people with chronic pain, e.g., guarding. People with chronic pain report that supportive, friendly experiences help with recovering from pain[10].


This suggests a need for supportive experiences in pain management, built on a better understanding of engagement.

Other technologies for physical activity Consoles such as the Nintendo Wii are used in rehabilitation areas such as stroke therapy, and the Microsoft Kinect has been used in prototype games to support physical activity in older adults[10]. Sensing devices that give feedback on the correctness of movement and help increase fitness are also growing in number, for example Riablo (www.corehab.com), which collects accelerometer data from patients and remotely sends it to clinicians for monitoring. However, technology that adapts to people's emotions is still immature, with no use yet in the physical context of rehabilitation[35]. In our study, the Microsoft Kinect sensor is used to record the physical activity, i.e., the squat, and to build the recognition model, together with two other devices (explained in the section below).

Full squat vs. Half squat Drinkwater et al. [36] explain that the fitness industry has a poor understanding of the squat and, in particular, of squat depth. There is still confusion over whether a full squat means a parallel squat or a half squat. Here are a few definitions[37]:
Half or parallel squat: the person lowers the hip joint until it is parallel to the knees, producing roughly 90 degrees of knee flexion.
Deep or full squat: the hip joint goes well below the knee joint.
Partial squat: the hip joint goes down until it is just above the knees.
The squat can be performed with weights held just above the shoulders, either in front or behind, which is suitable for the gym but not for a rehabilitation setup. In our study, given the context of ACL athletes, full, half and partial squats are considered and used when labeling the affective states (explained in a section below).

Physiotherapy exercises Athletes with ACL injuries go through a rehab process of exercising to stay healthy. Physiotherapy is the process through which they recover, manage pain, and improve strength and flexibility. Physiotherapists assess the problem, then advise and help the athletes perform a variety of simple exercises, including leg stretches, leg crosses, leg raises, sit/stands, step-ups, and knee squats or half squats[38][39]. Ideally, an ACL patient performs squats only in the advanced stage of treatment, given the exercise's complexity and the patient's capacity to withstand pain[39].

DATA COLLECTION

Use of sensors

Which ones? Two different sensing devices were used for collecting body postures: the Microsoft Kinect for Xbox One and the Empatica wristband. Alongside these two devices, the video recorder of an Apple laptop was also used.

Why were they selected? The Kinect sensor was used to collect the positions of the various body joints while the user was in action, i.e., the different body postures. The Empatica was considered in order to explore what could be identified from its accelerometer and galvanic skin response sensor. These two sensing devices collected the numerical data used to identify the affective states of the users. However, after an initial assessment the Empatica data was discarded because of the difficulty of identifying squats in its numerical data, whereas the numerical data from the Kinect clearly differentiated the various squats performed by the users. A video recorder was also used as an external device to record users performing squats; observers were later given the non-acted squat recordings to label the affective state of each user while performing the squat, i.e., engaged or disengaged.

Setup Data collection took place in a closed university room, similar to a laboratory setup. The room was spacious and comfortable enough to accommodate the participants, who were allowed to wear clothes comfortable to them. The Kinect sensor was placed above the video camera, in front of the users, at a position from which all the body joints could be identified; the aim was to capture the whole body of the user performing the squat in front of the device. Participants were asked to wear the Empatica on the left or right hand, as they preferred; most of the time they wore it on the right wrist. During feature labeling and extraction it became clear that a fixed position for both feet would have been needed, since the participants placed their feet differently in each squat session. This made normalization very difficult to define, so it was agreed to neglect it. In addition, the participants' body measurements differed from each other, making it difficult to identify a ground truth.
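Although normalization was ultimately neglected, the kind of normalization that was considered can be sketched. The following is a minimal, hypothetical sketch (our assumption, not code from the study): each joint is expressed relative to the spine base and scaled by torso length, which would make different foot placements and body sizes more comparable.

```python
import numpy as np

def normalize_skeleton(joints):
    """Body-relative normalization of one Kinect frame.

    joints: dict mapping joint name -> np.array([x, y, z]) in sensor
    coordinates. Translates so the spine base becomes the origin, then
    scales by torso length (spine base to spine shoulder) so that
    participants of different sizes become comparable.
    """
    origin = joints["spine_base"]
    torso = np.linalg.norm(joints["spine_shoulder"] - joints["spine_base"])
    if torso == 0:
        raise ValueError("degenerate frame: spine joints coincide")
    return {name: (pos - origin) / torso for name, pos in joints.items()}

# Example frame (coordinates in metres, made up for illustration):
frame = {
    "spine_base": np.array([0.10, 0.90, 2.40]),
    "spine_shoulder": np.array([0.10, 1.40, 2.40]),
    "left_knee": np.array([0.00, 0.50, 2.40]),
}
print(normalize_skeleton(frame)["left_knee"])  # joint in torso-length units
```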


Participants Due to ethical limitations, ACL patients were not involved in this study; instead, the three students conducting the study served as participants (P1, P2, P3). Their ages ranged from 25 to 28 years, all had previous experience of performing squats, and all were healthy while performing the activity.

Process The participants were asked to perform two kinds of sets: acted and non-acted. Four affective states were considered initially: engaged, disengaged, energetic and tired. For the acted sets, the participants were advised to perform 5 squats per acted affective state, after clear verbal agreement among the participants on how to perform the squats for each of the four states. After completing the acted sets, the participants performed 40 non-acted squats, with a 2-minute interval after the 20th squat. The whole process was repeated twice with the three participants, giving a total of 120 acted and 240 non-acted squats at the end of the process. The Kinect and video-recorder data were segregated individually to ease further processing.

FEATURE LABELLING The acted sets were labeled automatically by the acting instructions, since the participants had discussed and agreed together on how to perform the squats in each affective state. This may not be an ideal approach: there is a high chance that participants failed to portray the intended affective state correctly or mixed states while performing. The non-acted sets totaled 240 after the process, with each participant performing 40 squats per repetition. Because of the time it would take external observers to label all of them, the non-acted sets were divided into 12 segments, each containing 5 non-acted squats from each participant (a total of 60 non-acted squats); it was also ensured that each segment contained at least one non-acted squat from each participant. Twelve observers were recruited from Ifor Evans Hall, Camden Road, London (a UCL university residence). The observers consented to participate in the labeling; the whole process was explained to them, and it was made clear that the research would yield no benefit to them. Their ages ranged from 21 to 28 years, and they were screened and selected only if they had previous experience of performing squats. Each of the 12 observers was asked to label the 4 affective states over 2 segments of 10 non-acted squats, i.e., 5 squats as engaged or disengaged and the other 5 as energetic or tired. The non-acted squats recorded by the video recorder were used for the observer labeling; each squat was cropped individually into a clip of 30 to 50 seconds, and the whole labeling process lasted 2 to 4 minutes per observer. The users' faces were not masked in the recordings, so there is a risk of facial-expression bias, and observers may have labeled without full engagement; blob videos from the sensor could have been used instead. One notable observer comment asked why the squats were not performed sideways to the Kinect and video recorder instead of facing the setup. Of the four affective states (engaged, disengaged, energetic and tired), energetic and tired were disregarded in the subsequent process: the literature indicates that engagement and disengagement are the relevant states, and the other two are not well suited to ACL chronic-pain users.

FEATURE EXTRACTION We used the skeleton data captured by the Kinect sensor, which comprises three-dimensional coordinates (see Figure 2) for the markers, or attributes, depicted in Figure 1. The Kinect Xbox sensor identified 75 markers/attributes per participant (see Table 1). These attributes can be grouped into 5 segments: cervical and thoracic spine, left arm, right arm, left leg and right leg.

Figure 1. Skeleton data


Figure 2. Skeleton data (3D)

Segment                        Attributes
Cervical and thoracic spine    Head, neck, spine, spine base, spine shoulder
Left arm                       Left shoulder, left elbow, left wrist, left hand, left fingertip, left thumb
Right arm                      Right shoulder, right elbow, right wrist, right hand, right fingertip, right thumb
Left leg                       Left hip, left knee, left ankle, left foot
Right leg                      Right hip, right knee, right ankle, right foot

Table 1. Markers identified by the Kinect Xbox sensor.
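To make the attribute count concrete: 25 joints, each with x, y and z coordinates, give the 75 attributes in Table 1. Below is a minimal sketch of the flattening, assuming the standard Kinect v2 joint set (an assumption on our part; Table 1 groups the joints into segments rather than listing the sensor's exact names).

```python
import numpy as np

# The 25 Kinect v2 joints (assumed naming), grouped as in Table 1.
JOINTS = [
    "head", "neck", "spine_shoulder", "spine_mid", "spine_base",
    "shoulder_left", "elbow_left", "wrist_left", "hand_left",
    "hand_tip_left", "thumb_left",
    "shoulder_right", "elbow_right", "wrist_right", "hand_right",
    "hand_tip_right", "thumb_right",
    "hip_left", "knee_left", "ankle_left", "foot_left",
    "hip_right", "knee_right", "ankle_right", "foot_right",
]
assert len(JOINTS) * 3 == 75  # 25 joints x (x, y, z) = 75 attributes

def flatten_frame(joints):
    """Concatenate the (x, y, z) of all 25 joints into one
    75-dimensional feature vector, in a fixed joint order."""
    return np.concatenate([np.asarray(joints[name]) for name in JOINTS])
```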

As explained in the literature, the half squat is performed in a vertical motion, i.e., only along the y-axis (see Figure 2): the person lowers the hip joint until it is parallel to the knees, producing roughly 90 degrees of knee flexion (see Figure 3).

Figure 3. Half Squat
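The 90-degree criterion can be checked directly from Kinect joint positions. The paper does not compute joint angles, but a hypothetical helper based on the hip, knee and ankle coordinates could look like this:

```python
import numpy as np

def knee_flexion_deg(hip, knee, ankle):
    """Clinical knee flexion in degrees: ~0 for a straight leg,
    ~90 at the bottom of a half (parallel) squat. Computed as 180
    minus the angle at the knee between the thigh (knee -> hip)
    and shank (knee -> ankle) vectors."""
    thigh = np.asarray(hip, float) - np.asarray(knee, float)
    shank = np.asarray(ankle, float) - np.asarray(knee, float)
    cos_a = thigh.dot(shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    included = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return float(180.0 - included)

# Standing (nearly straight leg) vs. half squat; coordinates in metres:
print(knee_flexion_deg([0, 1.0, 0], [0, 0.5, 0], [0, 0.0, 0.05]))  # ~6 deg
print(knee_flexion_deg([0.4, 0.55, 0], [0, 0.5, 0], [0, 0.0, 0]))  # ~83 deg
```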

For this reason, we disregarded the x- and z-axis attributes, leaving 25 y-axis attributes.

Furthermore, we disregarded 4 of the y-axis attributes (right ankle, left ankle, right foot, left foot), since they move minimally along the y-axis and remain more or less motionless compared to the other, highly varying attributes (see Figure 4). This reduces the 25 attributes to 21. The sensor records 30 frames per second, which produces unwanted frames: the sensor was not always stopped while a participant rested, or in the pre- and post-squat phases when the participant is motionless. We therefore refined each attribute into 3 attributes per squat: upper 1, lower and upper 2. Upper 1 is the position at which the person starts to perform the squat; lower is when the participant's hip joint has descended to be parallel to the knees, i.e., the lowest point of the hip before pulling back up; and upper 2 is the return to the initial position. This refinement was performed for every squat in both the acted (63 squats) and non-acted (61 squats) sets. In total, 63 attributes result from taking these 3 points for each of the 21 attributes; for example, each squat has spine_base_Y_Upper_1, spine_base_Y_Lower and spine_base_Y_Upper_2.
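Assuming each repetition has already been segmented, the three keyframes can be read straight off a joint's y-trajectory: the start, the minimum, and the end. A minimal sketch of this reduction (not the study's code):

```python
import numpy as np

def squat_keyframes(y_trajectory):
    """Reduce one joint's y-coordinate series for a single squat
    repetition to the three attributes described above:
    (upper_1, lower, upper_2) = start, lowest point, end."""
    y = np.asarray(y_trajectory, dtype=float)
    return y[0], y.min(), y[-1]

# spine_base height (metres) over one squat, sampled at 30 fps:
y = [0.92, 0.90, 0.78, 0.61, 0.55, 0.54, 0.58, 0.74, 0.88, 0.91]
upper_1, lower, upper_2 = squat_keyframes(y)
# Applying this to each of the 21 retained y-axis attributes gives
# 21 * 3 = 63 features per squat.
```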

Figure 4. A chart of the 25 attributes over the y-axis; four attributes are highlighted in red to show their inconsistency with the remaining attributes.

FEATURE MODELLING

We tested our model's 63 attributes with seven classification algorithms (ZeroR, OneR, Naive Bayes, nearest neighbour (IBk), SMO, J48 and Random Forest) to identify the attributes most discriminative of the affective states. Weka version 3.6.12 [40] was used for the implementation. The acted and non-acted data sets were evaluated with all the above algorithms using 10-fold cross-validation, except ZeroR, which was evaluated on the training set to establish the baseline accuracy.
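The original evaluation used Weka; purely as an illustration, the same protocol can be sketched with scikit-learn analogues. The algorithm mapping is our assumption (OneR has no direct scikit-learn equivalent and is omitted), and the data here is a random placeholder for the 63-attribute squat features.

```python
import numpy as np
from sklearn.dummy import DummyClassifier            # ~ ZeroR baseline
from sklearn.naive_bayes import GaussianNB           # ~ NaiveBayes
from sklearn.neighbors import KNeighborsClassifier   # ~ IBk
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier      # ~ J48 (C4.5-style)
from sklearn.svm import SVC                          # ~ SMO
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 63))        # placeholder: squats x 63 attributes
y = rng.integers(0, 2, size=120)      # placeholder: engaged/disengaged

models = {
    "ZeroR": DummyClassifier(strategy="most_frequent"),
    "NaiveBayes": GaussianNB(),
    "IBk": KNeighborsClassifier(n_neighbors=1),
    "RandomForest": RandomForestClassifier(random_state=0),
    "J48-like": DecisionTreeClassifier(random_state=0),
    "SMO-like": SVC(kernel="linear"),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10)  # 10-fold cross-validation
    print(f"{name}: {scores.mean():.0%}")
```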


Method                    Accuracy
ZeroR                     51%
OneR                      57%
Naïve Bayes               68%
Nearest neighbour (IBk)   75%
Random Forest             73%
J48                       81%
SMO                       78%

Table 2. Algorithms applied to the acted sets to identify the one with the best accuracy.

a    b    <-- classified as
27   4    a = Yes
8    24   b = No

Table 3. Confusion matrix for the J48 algorithm with 10-fold cross-validation.

On the acted sets, J48 achieved the highest accuracy (81%) of the algorithms used; this matches Table 3, with 27 + 24 = 51 correct classifications out of 63 instances, i.e., approximately 81%. All algorithms also exceeded the baseline accuracy (ZeroR).

Method                    Accuracy
ZeroR                     65%
OneR                      47%
Naïve Bayes               43%
Nearest neighbour (IBk)   52%
Random Forest             55%
J48                       60%
SMO                       60%

Table 4. Algorithms applied to the non-acted sets to identify the one with the best accuracy.

For the non-acted sets, the accuracies of OneR, Naive Bayes, SMO, J48 and Random Forest are clearly below the ZeroR accuracy of 65%. According to Ian Witten [41], if the baseline (ZeroR) accuracy is higher than that of the learning algorithms, then the dataset is not good. The labeling of the non-acted data sets was performed by external observers, which may have led to this situation.

Method                    Accuracy
ZeroR                     50%
OneR                      38%
Naïve Bayes               43%
Nearest neighbour (IBk)   52%
Random Forest             52%
J48                       52%
SMO                       67%

Table 5. Leave one person out: participant P3 used as the test set against the training set of P1+P2.

Method                    Accuracy
ZeroR                     50%
OneR                      65%
Naïve Bayes               45%
Nearest neighbour (IBk)   55%
Random Forest             50%
J48                       45%
SMO                       55%

Table 6. Leave one person out: participant P1 used as the test set against the training set of P2+P3.


Method                    Accuracy
ZeroR                     50%
OneR                      55%
Naïve Bayes               55%
Nearest neighbour (IBk)   50%
Random Forest             73%
J48                       82%
SMO                       59%

Table 7. Leave one person out: participant P2 used as the test set against the training set of P1+P3.

Case   Accuracy
1      67%
2      65%
3      82%

Table 8. Leave one person out: maximum accuracy from each of the three possibilities.

To conclude, we performed leave-one-person-out validation, in which the acted sets from two participants (e.g., P1 and P2) were used for training and the third (P3) for testing. This was repeated for all three possibilities (see Tables 5, 6 and 7), and the highest accuracy from each possibility (see Table 8) was used to compute the mean performance of the model: 71%. Leave-one-person-out was not applied to the non-acted sets because of the baseline-accuracy problem explained above. During validation on the acted sets, the J48 algorithm achieved 81% accuracy with 5 leaves and a tree of size 9; the discriminative attributes identified were right_fingertip_lower (parent node), right_wrist_lower (child node) and right_elbow_lower (grandchild node). A few more iterations were performed with the dominant attributes removed, but the accuracy dropped drastically. The dominant attributes were consistently from the left or right fingers, hands or shoulders, i.e., the hands.

We therefore concluded that this model is the best of those tested, and that it relies on the hands to automatically predict the affective states, i.e., engaged and disengaged.
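Leave-one-person-out corresponds to grouping squats by participant and holding one group out at a time. A sketch using scikit-learn's LeaveOneGroupOut with placeholder data (again an analogue of the Weka workflow, not the study's code):

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 63))              # placeholder acted-set features
y = rng.integers(0, 2, size=120)            # placeholder labels
groups = np.repeat(["P1", "P2", "P3"], 40)  # participant id per squat

for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    # Train on two participants, test on the held-out one.
    model = DecisionTreeClassifier(random_state=0)
    model.fit(X[train_idx], y[train_idx])
    held_out = groups[test_idx][0]
    print(f"test on {held_out}: {model.score(X[test_idx], y[test_idx]):.0%}")
```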

PERSONALIZED EXERCISE PLAN As explained by Johnston and Carroll[7], injured athletes show a greater amount of disengagement, depression and anxiety. As a result, engaged and disengaged were kept as target states, leaving out energetic and tired; asking ACL patients to be energetic in the initial phase of their treatment is unrealistic. As noted earlier, ACL injuries can be treated through a rehabilitation process. ACL patients need to strengthen the muscles around the knee, particularly the hamstrings. As explained in lecture 5 of the class, persuasion, an attempt to change attitudes or behaviors, can be applied when creating a personalized exercise plan for an ACL patient, e.g., motivating the patient to perform a correct half squat. It is also important to perform the squat with correct positioning, so ACL patients can be given an instruction sheet on how to perform a squat, or be given assistance. Some of the positions to remember while performing a squat are [42]:
Neck: keep the neck in a neutral position; the ideal way is to look forward and hold that gaze until the squat or the set is finished.
Lower back: similarly, keep the lower back in a neutral position; squeezing the chest up as much as possible helps achieve this.
Back at the hips first: the effective way to squat is to push the butt back rather than the knees forward, which stabilizes the hamstrings and glutes and puts less stress on the knee.
Knees: keep the knees in line with the toes.
Stay on the heels: if "back at the hips first" cannot be followed, stay on the heels while performing the squat.
After careful consideration, and since the majority of ACL patients visit a physiotherapist or knee doctor only once a week, they are given a home exercise program to perform in their home environment[39].


As explained on sportshealth.com, ACL patients are able to perform half, partial and wall squats during weeks 2 to 4 of their rehab sessions. With this in mind, asking patients to perform a half squat in week 2 is too pushy, so, as explained in lecture 5, it is better to use macrosuasion and let patients work at their own pace and ability. It is also recommended to hold each squat for a minimum of 10 seconds, with 10 repetitions. During week two, patients can be asked to perform wall squats or partial squats with the support of a wall, once every day, for three or more sets. For a wall squat, the user stands with their back against a wall, feet shoulder-width apart and about 40 centimetres from the wall; the patient then slides down until they start to sense pain in the knee and holds for 5 seconds (see Figure 5).

Figure 5. An ACL patient performing a wall squat during week two of the exercise plan.

In week 3, they can perform partial squats without the support of the wall, following the same procedure of 3 sets of 10 repetitions, holding for 5 seconds at the point where knee pain starts while pushing the hips downwards[43] (see Figure 6).

Figure 6. An ACL patient performing a partial squat during week three of the exercise plan, without wall support.

Finally, in week 4 they are asked to perform half squats with the same procedure of 3 sets of 10 repetitions, with a holding time of 10 seconds.

Figure 7. An ACL patient performing a half squat during week four of the exercise plan.
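The weeks 2 to 4 progression above can be encoded as a simple configuration that a feedback device could read. This is a hypothetical sketch; the exercise names, set/repetition counts and hold times follow the plan described above.

```python
# Hypothetical encoding of the weeks 2-4 plan described above.
EXERCISE_PLAN = {
    2: {"exercise": "wall squat",    "sets": 3, "reps": 10, "hold_s": 5,
        "notes": "back against wall, feet shoulder-width, ~40 cm out"},
    3: {"exercise": "partial squat", "sets": 3, "reps": 10, "hold_s": 5,
        "notes": "no wall support; stop at the onset of knee pain"},
    4: {"exercise": "half squat",    "sets": 3, "reps": 10, "hold_s": 10,
        "notes": "hips parallel to knees (~90 deg knee flexion)"},
}

def todays_session(week):
    """Render one day's session for the given rehab week."""
    plan = EXERCISE_PLAN.get(week)
    if plan is None:
        raise KeyError(f"no plan defined for week {week}")
    return (f"Week {week}: {plan['sets']} sets x {plan['reps']} "
            f"{plan['exercise']}, hold {plan['hold_s']} s ({plan['notes']})")

print(todays_session(3))
```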

FEEDBACK

Delivering the exercise plan on handheld devices is ambitious: ACL patients may already be in too much pain to walk, or unable to move their legs at all with or without the support of their hands, so a handheld device displaying the plan is impractical. The functional triad model of persuasive tools could be used to identify the best-suited way of displaying a personalized plan; alternatively, a mounted display or an anthropomorphic interface could provide feedback. Current work on chronic pain[44] and on the use of technology for chronic pain[45] shows that sound feedback eases self-monitoring and reduces anxiety, and aural feedback has been used to teach mindfulness skills linked to body function[45]. Sound feedback has an advantage over a visual display because it can accompany movement and does not require fixating on a screen[10]. It also has positive effects in motor rehabilitation, e.g., through smartphone devices[46] and sonification[47], such as encouraging movement, facilitating coordination and improving performance. Sound feedback therefore suits our study best, where people have chronic pain in their knees and must initiate movements to perform activities. Similarly, [10][48] used aural feedback to guide chronic-pain patients as they bend forward and backward. As noted, the affective states engaged and disengaged can be used to characterize the person, and ACL patients perform engaged and disengaged squats along the vertical axis, so aural feedback can be provided based on the patient's movement up and down (see Figure 7). A problem arises during the wall squat, however. This is the starting phase of recovery, when patients rest their hands on the wall to support themselves, yet in our feature modeling the left or right hand was the dominant attribute. After a second iteration, the spine was the dominant attribute, which suggests placing the sensor or feedback device at the central hip position (see Figure 8), where it can still automatically identify affective states from the squat.

Figure 8. A look-alike wearing the sensing device over the hip.

Figure 9. Sound feedback and exercise poses.

As shown in Figures 8 and 9[48], the feedback device can be placed behind or in front of the hip. Via an external button, the patient can set the minimum and maximum postures of the up-and-down movement performed during weeks 2 to 4. The pitch of the sound rises as the patient moves down and falls as they come back to the starting position. If the patient performs the squat too fast, a faster sound can be generated to tell them to slow down. In this way the patient's self-monitoring improves and over-active movements are reduced.
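The mapping just described can be sketched as follows: hip height is normalized between the two calibrated postures, the pitch rises as the patient descends, and a fast tempo warns when the movement is too quick. The frequency range and speed threshold are illustrative assumptions, not values from the study.

```python
def feedback_tone(hip_y, y_upper, y_lower, speed,
                  max_speed=0.5, f_low=220.0, f_high=440.0):
    """Map the current hip height to a tone frequency (Hz).

    hip_y:   current spine-base height from the sensor (metres)
    y_upper: calibrated standing height; y_lower: calibrated lowest point
    speed:   |dy/dt| in m/s; above max_speed a fast tempo warns the
             patient to slow down
    The pitch rises from f_low toward f_high as the patient moves down.
    """
    depth = (y_upper - hip_y) / (y_upper - y_lower)  # 0 at top, 1 at bottom
    depth = min(max(depth, 0.0), 1.0)
    freq = f_low + depth * (f_high - f_low)
    tempo = "fast" if speed > max_speed else "normal"
    return freq, tempo

print(feedback_tone(hip_y=0.70, y_upper=0.92, y_lower=0.54, speed=0.2))
# -> (~347 Hz, 'normal'): the pitch has risen as the patient squats down
```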

DISCUSSION Identifying the ground truth for our affective states was genuinely difficult, as pointed out in the lecture[49]. The use of external observers could also have been handled better: with the current observer labeling, the data was not good enough for automatic recognition of affective states. Participants could have faced sideways to the Kinect while performing squats rather than facing it directly, and biased observer labeling could have been prevented by providing blob videos from the Kinect instead of ordinary video recordings.

CONCLUSION

The most important factor for an ACL athlete is to stay motivated, yet athletes report a lack of motivation in performing strength-training exercises. This paper points out the lack of technology support for chronic-pain patients, especially ACL athletes, and the need for a system dedicated to ACL patients. We have highlighted the various problems we faced in using sensing devices and a video recorder for automatic recognition of affective states. Nevertheless, the acted sets yielded an accuracy of 81%, which is encouraging; the discriminative features identified were the arms, in particular the hands. Finally, a personalized exercise and feedback plan was proposed to increase the motivation of ACL patients.

REFERENCES

[1] “A Pain in the Brain: The Psychology of Sport and Exercise Injury,” www.ideafit.com. [Online]. Available: http://www.ideafit.com/fitness-library/a-pain-in-the-brain-the-psychology-ofsport-and-exercise-injury. [Accessed: 28-May-2015].

[2] “Statistics on ACL Injuries in Athletes,” LIVESTRONG.COM. [Online]. Available: http://www.livestrong.com/article/548782-statistics-on-acl-injuries-in-athletes/. [Accessed: 28-May-2015].

[3] A. W. Kiefer, A. M. Kushner, J. Groene, C. Williams, M. A. Riley, and G. D. Myer, “A Commentary on Real-Time Biofeedback to Augment Neuromuscular Training for ACL Injury Prevention in Adolescent Athletes,” J. Sports Sci. Med., vol. 14, no. 1, pp. 1–8, Jan. 2015.

[4] C. Klenk, “Psychological response to injury, recovery, and social support: A Survey of athletes at an NCAA Division I University.,” 2006.

[5] J. Crossman, “Psychological rehabilitation from sports injuries,” Sports Med. Auckl. NZ, vol. 23, no. 5, pp. 333–339, May 1997.

[6] A. M. Smith, S. G. Scott, and D. M. Wiese, “The psychological effects of sports injuries. Coping,” Sports Med. Auckl. NZ, vol. 9, no. 6, pp. 352–369, Jun. 1990.

[7] L. Johnston and D. Carroll, “The psychological impact of injury: effects of prior sport and exercise involvement,” Br. J. Sports Med., vol. 34, no. 6, pp. 436–439, Dec. 2000.

[8] M. L. Shuer and M. S. Dietrich, “Psychological effects of chronic injury in elite athletes.,” West. J. Med., vol. 166, no. 2, pp. 104–109, Feb. 1997.

[9] “Research Review: Front or back squats,” Precision Nutrition.

[10] A. Singh, A. Klapper, J. Jia, A. Fidalgo, A. Tajadura-Jiménez, N. Kanakam, N. Bianchi-Berthouze, and A. Williams, “Motivating People with Chronic Pain to Do Physical Activity: Opportunities for Technology Design,” in Proceedings of the 32Nd Annual ACM Conference on Human Factors in Computing Systems, New York, NY, USA, 2014, pp. 2803–2812.

[11] D. M. Bravata, C. Smith-Spangler, V. Sundaram, A. L. Gienger, N. Lin, R. Lewis, C. D. Stave, I. Olkin, and J. R. Sirard, “Using pedometers to increase physical activity and improve health: a systematic review,” JAMA, vol. 298, no. 19, pp. 2296–2304, Nov. 2007.

[12] C. B. Chan, D. A. J. Ryan, and C. Tudor-Locke, “Health benefits of a pedometer-based physical activity intervention in sedentary workers,” Prev. Med., vol. 39, no. 6, pp. 1215–1222, Dec. 2004.

[13] D. Merom, C. Rissel, P. Phongsavan, B. J. Smith, C. Van Kemenade, W. J. Brown, and A. E. Bauman, “Promoting walking with pedometers in the community: the step-by-step trial,” Am. J. Prev. Med., vol. 32, no. 4, pp. 290–297, Apr. 2007.

[14] D. Morris, T. S. Saponas, A. Guillory, and I. Kelner, “RecoFit: Using a Wearable Sensor to Find, Recognize, and Count Repetitive Exercises,” in Proceedings of the 32Nd Annual ACM Conference on Human Factors in Computing Systems, New York, NY, USA, 2014, pp. 3225–3234.

[15] “The Anterior Cruciate Ligament (ACL),” Foundry Sports Medicine. [Online]. Available: http://www.foundrysportsmedicine.com/our-blog/bid/47663/The-Anterior-Cruciate-Ligament-ACL. [Accessed: 30-May-2015].

[16] B. de Gelder, “Why bodies? Twelve reasons for including bodily expressions in affective neuroscience,” Philos. Trans. R. Soc. Lond. B. Biol. Sci., vol. 364, no. 1535, pp. 3475–3484, Dec. 2009.

[17] A. Kleinsmith, N. Bianchi-Berthouze, and A. Steed, “Automatic Recognition of Non-Acted Affective Postures,” Trans Sys Man Cyber Part B, vol. 41, no. 4, pp. 1027–1038, Aug. 2011.

[18] A. Mehrabian and J. T. Friar, “Encoding of attitude by a seated communicator via posture and position cues,” J. Consult. Clin. Psychol., vol. 33, no. 3, pp. 330–336, 1969.

[19] H. G. Wallbott and K. R. Scherer, “Cues and channels in emotion recognition,” J. Pers. Soc. Psychol., vol. 51, no. 4, pp. 690–699, 1986.

[20] P. Ekman and W. V. Friesen, “Nonverbal leakage and clues to deception,” Psychiatry, vol. 32, no. 1, pp. 88–106, Feb. 1969.

[21] P. Ekman and W. V. Friesen, “Detecting deception from the body or face,” J. Pers. Soc. Psychol., vol. 29, no. 3, pp. 288–298, 1974.

[22] C. C. R. J. van Heijnsbergen, H. K. M. Meeren, J. Grèzes, and B. de Gelder, “Rapid detection of fear in body expressions, an ERP study,” Brain Res., vol. 1186, pp. 233–241, Dec. 2007.

[23] J. Aronoff, B. A. Woike, and L. M. Hyman, “Which are the stimuli in facial displays of anger and happiness? Configurational bases of emotion recognition,” J. Pers. Soc. Psychol., vol. 62, no. 6, pp. 1050–1066, 1992.

[24] W. T. James, “A study of the expression of bodily posture,” J. Gen. Psychol., vol. 7, pp. 405–437, 1932.

[25] M. Coulson, “Attributing Emotion to Static Body Postures: Recognition Accuracy, Confusions, and Viewpoint Dependence,” J. Nonverbal Behav., vol. 28, no. 2, pp. 117–139, Jun. 2004.

[26] R. von Laban, The mastery of movement. Macdonald and Evans, 1980.

[27] A. Camurri, B. Mazzarino, M. Ricchetti, R. Timmers, and G. Volpe, “Multimodal Analysis of Expressive Gesture in Music and Dance Performances,” in Gesture-Based Communication in Human-Computer Interaction, A. Camurri and G. Volpe, Eds. Springer Berlin Heidelberg, 2004, pp. 20–39.

[28] A. Camurri, I. Lagerlöf, and G. Volpe, “Recognizing emotion from dance movement: comparison of spectator recognition and automated techniques,” Int. J. Hum.-Comput. Stud., vol. 59, no. 1–2, pp. 213–225, Jul. 2003.

[29] P. R. De Silva and N. Bianchi-Berthouze, “Modeling Human Affective Postures: An Information Theoretic Characterization of Posture Features: Research Articles,” Comput Animat Virtual Worlds, vol. 15, no. 3–4, pp. 269–276, Jul. 2004.

[30] D. Bernhardt and P. Robinson, “Detecting Affect from Non-stylised Body Motions,” in Affective Computing and Intelligent Interaction, A. C. R. Paiva, R. Prada, and R. W. Picard, Eds. Springer Berlin Heidelberg, 2007, pp. 59–70.

[31] Y. Ma, H. M. Paterson, and F. E. Pollick, “A motion capture library for the study of identity, gender, and emotion perception from biological motion,” Behav. Res. Methods, vol. 38, no. 1, pp. 134–141, Feb. 2006.

[32] A. Kapoor, R. W. Picard, and Y. Ivanov, “Probabilistic combination of multiple modalities to detect interest,” in Proceedings of the 17th International Conference on Pattern Recognition, 2004. ICPR 2004, 2004, vol. 3, pp. 969–972 Vol.3.

[33] A. Kapoor, W. Burleson, and R. W. Picard, “Automatic Prediction of Frustration,” Int J Hum-Comput Stud, vol. 65, no. 8, pp. 724–736, Aug. 2007.

[34] B. A. Rosser, P. McCullagh, R. Davies, G. A. Mountain, L. McCracken, and C. Eccleston, “Technology-mediated therapy for chronic pain management: the challenges of adapting behavior change interventions for delivery with pervasive communication technology,” Telemed. J. E-Health Off. J. Am. Telemed. Assoc., vol. 17, no. 3, pp. 211–216, Apr. 2011.

[35] M. S. H. Aung, B. Romera-Paredes, A. Singh, S. Lim, N. Kanakam, A. C. de C Williams, and N. Bianchi-Berthouze, “Getting RID of pain-related behaviour to improve social and self perception: A technology-based perspective,” in 2013 14th International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS), 2013, pp. 1–4.

[36] E. J. Drinkwater, N. R. Moore, and S. P. Bird, “Effects of changing from full range of motion to partial range of motion on squat kinetics,” J. Strength Cond. Res. Natl. Strength Cond. Assoc., vol. 26, no. 4, pp. 890–896, Apr. 2012.

[37] “How are partial squats and full squats different?,” Strength & Conditioning Research.

[38] G. St Thomas, “Physiotherapy following anterior cruciate ligament (ACL) reconstruction.” © 2013 Guy’s and St Thomas’ NHS Foundation Trust, Feb-2013.

[39] “Your Quick Guide to ACL Rehab Exercises,” About.com Health, 15-Dec-2014. [Online]. Available: http://sportsmedicine.about.com/od/surgeryrehab/a/ACLSurgeryRehab.htm. [Accessed: 30-May-2015].

[40] “Weka 3 - Data Mining with Open Source Machine Learning Software in Java.” [Online]. Available: http://www.cs.waikato.ac.nz/ml/weka/. [Accessed: 31-May-2015].

[41] “Data Mining with Weka (2.4: Baseline accuracy) - YouTube.” [Online]. Available: https://www.youtube.com/watch?v=MrQhW4FyNW4. [Accessed: 31-May-2015].

[42] “How To Do Squats Properly - The Ultimate Guide.” [Online]. Available: http://johnalvino.com/how-to-do-squats/. [Accessed: 31-May-2015].

[43] “Iowa Governor’s Council on Physical Fitness & Nutrition, Exercise #6: Half Squat.”

[44] M. S. Cepeda, D. B. Carr, J. Lau, and H. Alvarez, “Music for pain relief,” in Cochrane Database of Systematic Reviews, John Wiley & Sons, Ltd, 1996.

[45] J. Vidyarthi, B. E. Riecke, and D. Gromala, “Sonic Cradle: Designing for an Immersive Experience of Meditation by Connecting Respiration to Music,” in Proceedings of the Designing Interactive Systems Conference, New York, NY, USA, 2012, pp. 408–417.

[46] B. A. Rosser and C. Eccleston, “Smartphone applications for pain management,” J. Telemed. Telecare, vol. 17, no. 6, pp. 308–312, Sep. 2011.

[47] K. Vogt, D. Pirrò, I. Kobenz, R. Höldrich, and G. Eckel, “PhysioSonic - Evaluated Movement Sonification as Auditory Feedback in Physiotherapy,” in Auditory Display, S. Ystad, M. Aramaki, R. Kronland-Martinet, and K. Jensen, Eds. Springer Berlin Heidelberg, 2010, pp. 103–120.

[48] “Chronic Pain Rehabilitation Assistant | Ye’s Stories.” [Online]. Available: http://yelin.in/works/chronic.html. [Accessed: 31-May-2015].

[49] R. W. Picard, E. Vyzas, and J. Healey, “Toward machine emotional intelligence: analysis of affective physiological state,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 23, no. 10, pp. 1175–1191, Oct. 2001.