Implementation of EyesWeb low-level features

Antonio Camurri, Paolo Coletta, Barbara Mazzarino, Stefano Piana

DELIVERABLE D4.2

Grant Agreement no.: 289021
Project acronym: ASC-Inclusion
Project title: Integrated Internet-Based Environment for Social Inclusion of Children with Autism Spectrum Conditions
Contractual date of delivery: 31 October 2012
Actual date of delivery: 31 October 2012
Deliverable number: D4.2
Deliverable title: Implementation of EyesWeb low-level features
Type: Report, Public
Number of pages: 15
WP contributing to the deliverable: WP 4 (Nonverbal Expressive Gesture Analysis)
Responsible for task: Antonio Camurri (UNIGE) [email protected]
Author(s): Antonio Camurri (UNIGE) [email protected], Paolo Coletta (UNIGE) [email protected], Barbara Mazzarino (UNIGE) [email protected], Stefano Piana (UNIGE) [email protected]



Table of Contents

1. Introduction
2. Features
   2.1 Kinematic features
      2.1.1 Positions and Trajectories
      2.1.2 Velocities, accelerations and jerk
   2.2 Expressive features
      2.2.1 Kinetic Energy / Quantity of Motion
      2.2.2 Spatial extent: 2D/3D Contraction Index (2D/3D CI) and Density
      2.2.3 Smoothness
      2.2.4 Symmetry
      2.2.5 Forward-backward leaning of the upper body and relative positions
      2.2.6 Directness
      2.2.7 Periodicity
      2.2.8 Rigidity
      2.2.9 Impulsivity
3. Implementation of EyesWeb XMI blocks
   3.1 Kinematic features
      3.1.1 Positions and Trajectories
      3.1.2 Velocities, accelerations and jerk computation
   3.2 Expressive features
      3.2.1 Kinetic energy
      3.2.2 2D/3D Contraction Index
      3.2.3 Smoothness 2D/3D
      3.2.4 Symmetry
      3.2.5 Density index
      3.2.6 Directness
      3.2.7 Periodicity
      3.2.8 Rigidity
      3.2.9 Impulsivity
4. Conclusions
5. References


1. Introduction

This deliverable briefly describes the results of the work on the implementation of EyesWeb low-level features, that is: (i) defining algorithms to compute the features identified in [1], and (ii) implementing each algorithm in the EyesWeb environment in the form of an EyesWeb XMI block or patch. The implemented modules and patches will be used to develop a system for automated real-time analysis of non-verbal full-body expressive gesture, which will be integrated and validated in the platform developed by WP5.

2. Features

In this section we describe the set of expressive features identified during months 1-6 of the project (see D4.1) and the algorithms used to compute them.

2.1 Kinematic features

Kinematic features are computed directly from MoCap data or images; each of the proposed features can be computed from either two-dimensional or three-dimensional information.

2.1.1 Positions and Trajectories

Joint positions are extracted from MoCap data. In the case of three-dimensional positions, MoCap systems such as the Microsoft Kinect can be used to track each user (see [1]). A trajectory is a collection of consecutive positions representing the movement performed during a gesture by a single tracked point.

2.1.2 Velocities, accelerations and jerk

The velocity, acceleration, and jerk of each point identified during tracking are computed directly from its coordinates, using a Savitzky–Golay filter [11].

2.2 Expressive features

2.2.1 Kinetic Energy / Quantity of Motion

The first expressive feature concerns the overall energy spent by the user during movement, calculated as the total amount of displacement of all tracked points. The perceived amount of movement activity is deemed important in differentiating emotions: the highest energy values are related to anger, joy, and terror, while the lowest values correspond to sadness and boredom. Camurri et al. [3] showed that movement activity is a relevant feature in recognizing emotion from full-body movement. For these reasons, we include in the set of expressive features an approximate measure of the overall motion energy at time frame f. Given three-dimensional or two-dimensional tracking information, a Kinetic Energy index representing the total amount of energy spent by the user can be computed as follows.


Let $v_i(f) = \sqrt{\dot{x}_i^2(f) + \dot{y}_i^2(f) + \dot{z}_i^2(f)}$ denote the magnitude of the velocity of the i-th tracked point at time frame f for three-dimensional coordinates, and $v_i(f) = \sqrt{\dot{x}_i^2(f) + \dot{y}_i^2(f)}$ the magnitude of the velocity calculated on two-dimensional coordinates. We then define Etot(f), an approximation of the body's kinetic energy, as the weighted sum of the limbs' kinetic energies:

$$E_{tot}(f) = \frac{1}{2} \sum_{i=1}^{n} m_i \cdot v_i(f)^2,$$

where $m_i$ is the approximate mass of the i-th joint; the m values are derived from anthropometric tables.
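For illustration only (not the EyesWeb implementation), a minimal sketch of this sum in Python, assuming per-joint masses taken from anthropometric tables:

```python
import numpy as np

def kinetic_energy_index(velocities, masses):
    """Approximate kinetic energy E_tot(f) for one frame.

    velocities: (n, d) array of per-joint velocity vectors, d = 2 or 3.
    masses:     (n,) array of approximate per-joint masses m_i
                (e.g., fractions of body mass from anthropometric tables).
    """
    v_sq = np.sum(np.asarray(velocities, dtype=float) ** 2, axis=1)  # |v_i(f)|^2
    return 0.5 * float(np.sum(np.asarray(masses, dtype=float) * v_sq))
```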

2.2.2 Spatial extent: 2D/3D Contraction Index (2D/3D CI) and Density

The Contraction Index (CI) is a measure, ranging from 0 to 1, of how the user's body occupies the space surrounding it. It is related to Laban's "personal space" [12], [13]. The 3D Contraction Index is calculated by comparing the volume of the Bounding Volume (BV), i.e., the minimum parallelepiped surrounding the user's body, with an approximation of the volume occupied by the body, derived from the density index (DI) of the 3D coordinates, calculated as follows:

$$DI = \frac{4}{3}\pi \, DI_x \, DI_y \, DI_z,$$

where $DI_x, DI_y, DI_z$ are the approximated density indexes calculated on the x, y, and z axes respectively:

$$DI_x = \frac{1}{n}\sum_{i=1}^{n} dx_i, \qquad DI_y = \frac{1}{n}\sum_{i=1}^{n} dy_i, \qquad DI_z = \frac{1}{n}\sum_{i=1}^{n} dz_i,$$

and $dx_i, dy_i, dz_i$ are the squared per-axis distances between the center of mass $(x_c, y_c, z_c)$ and the i-th point:

$$dx_i = (x_i - x_c)^2, \qquad dy_i = (y_i - y_c)^2, \qquad dz_i = (z_i - z_c)^2, \qquad i = 1, \ldots, n.$$


The 3D Contraction Index is then calculated as the normalized ratio between DI and BV. The 2D Contraction Index can be calculated using a technique based on the bounding box, i.e., the minimum rectangle surrounding the user's body: the algorithm compares the area covered by this rectangle with the area currently covered by the user's silhouette. If the limbs of the user are fully stretched and not lying along the body, both the 3D and 2D CI values will be low; if the limbs are kept tightly close to the body, they will be high (near 1).

A different index of spatial extent is density, which can be computed from a set of 2D or 3D coordinates in one of several ways:

$$Density = \operatorname{mean}(d_0, \ldots, d_n) \quad \text{or} \quad Density = \operatorname{median}(d_0, \ldots, d_n) \quad \text{or} \quad Density = \max(d_0, \ldots, d_n),$$

where $d_i$ is the Euclidean distance of the i-th point of the set from the centroid C, which in turn can be computed as the mean or median of the set.
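A minimal sketch of the density computation, mirroring the centroid and distance options exposed by the block described in Section 3.2.5 (hypothetical names):

```python
import numpy as np

def density_index(points, centroid="mean", distance="mean"):
    """Density of a set of 2D or 3D points, per Section 2.2.2.

    points: (n, d) array of coordinates, d = 2 or 3.
    """
    pts = np.asarray(points, dtype=float)
    c = np.mean(pts, axis=0) if centroid == "mean" else np.median(pts, axis=0)
    d = np.linalg.norm(pts - c, axis=1)  # Euclidean distances d_i from centroid
    stat = {"mean": np.mean, "median": np.median, "max": np.max}[distance]
    return float(stat(d))
```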

2.2.3 Smoothness

In general, "smoothness" is synonymous with "having small values of high-order derivatives". Wallbott [2], in his analysis of qualitative aspects of psychiatric patients' hand movements, noticed that movements judged as smooth "are characterized distally by large circumference, long wavelength, high mean velocity, but not abrupt changes in velocity or acceleration (standard deviations of velocity and acceleration). Thus, smooth movements seem to be large in terms of space and exhibit a high but even velocity". We have therefore adapted Wallbott's observations on the qualitative dimensions of under-constrained arm movements, and we compute the curvature of hand trajectories to characterize their smoothness. Curvature (k) measures the rate at which a tangent vector turns as a trajectory bends; it is defined as the reciprocal of the radius of the curve described by the trajectory:

$$k = \frac{1}{R}$$

A hand trajectory following the contour of a small circle bends sharply, and hence has high curvature; by contrast, a point trajectory following a straight line has zero curvature. The curvature is computed for each point trajectory as

$$k_i = \frac{\lVert \dot{\mathbf{r}}_i \times \ddot{\mathbf{r}}_i \rVert}{\lVert \dot{\mathbf{r}}_i \rVert^3}, \qquad i = 1, \ldots, n,$$

where $\dot{\mathbf{r}}_i$ is the velocity of the trajectory of the i-th point and $\ddot{\mathbf{r}}_i$ is its acceleration. The smoothness index can be computed on multi-dimensional trajectories; we compute mono- (i), bi- (ii), and three-dimensional (iii) curvatures:


$$\text{(i)} \quad k_i = \frac{\lvert \ddot{x}_i \rvert}{\left(1 + \dot{x}_i^2\right)^{3/2}}, \qquad i = 1, \ldots, n$$

$$\text{(ii)} \quad k_i = \frac{\lvert \dot{x}_i \ddot{y}_i - \dot{y}_i \ddot{x}_i \rvert}{\left(\dot{x}_i^2 + \dot{y}_i^2\right)^{3/2}}, \qquad i = 1, \ldots, n$$

$$\text{(iii)} \quad k_i = \frac{\sqrt{(\ddot{z}_i \dot{y}_i - \ddot{y}_i \dot{z}_i)^2 + (\ddot{x}_i \dot{z}_i - \ddot{z}_i \dot{x}_i)^2 + (\ddot{y}_i \dot{x}_i - \ddot{x}_i \dot{y}_i)^2}}{\left(\dot{x}_i^2 + \dot{y}_i^2 + \dot{z}_i^2\right)^{3/2}}, \qquad i = 1, \ldots, n$$

where the first- and second-order derivatives of the coordinates of a point are combined to calculate the curvature.
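As an illustration, the 3D formula can be computed compactly via the cross product of velocity and acceleration. This sketch uses finite-difference derivatives for simplicity; the EyesWeb blocks use Savitzky–Golay derivatives instead (see Section 3.1.2):

```python
import numpy as np

def curvature(trajectory, dt=1.0):
    """Curvature k_i along an (n, 2) or (n, 3) trajectory sampled every dt."""
    r = np.asarray(trajectory, dtype=float)
    if r.shape[1] == 2:                       # embed 2D in 3D for the cross product
        r = np.hstack([r, np.zeros((len(r), 1))])
    v = np.gradient(r, dt, axis=0)            # first derivative (velocity)
    a = np.gradient(v, dt, axis=0)            # second derivative (acceleration)
    num = np.linalg.norm(np.cross(v, a), axis=1)
    den = np.linalg.norm(v, axis=1) ** 3
    return num / np.maximum(den, 1e-12)       # guard against zero velocity
```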


2.2.4 Symmetry

Lateral asymmetry of emotion expression has long been studied in facial expressions, yielding valuable insights about a general hemispheric dominance in the control of emotional expression. An established example is the expressive advantage of the left hemiface, demonstrated with chimeric face stimuli: static pictures of emotional expressions with one side of the face replaced by the mirror image of the other. A study by Roether et al. on human gait demonstrated pronounced lateral asymmetries also in human emotional full-body movement [5]. Twenty-four actors (with an equal number of right- and left-handed subjects) were recorded with a motion capture system during neutral walking and emotionally expressive walking (anger, happiness, sadness). For all three emotions, the left body side moved with significantly higher amplitude and energy. Perceptual validation of the results was conducted by creating chimeric walkers, using the joint-angle trajectories of one body half to animate completely symmetric puppets. Given that the literature points out the relevance of symmetry as a behavioural and affective feature, we address the symmetry of gestures and its relation to emotional expression. Symmetry is measured by evaluating the limbs' spatial symmetry with respect to the body, computing it on each available dimension (x and y in the 2D case; x, y, and z in the 3D case). Each partial index (SIx, SIy, SIz) is computed from the position of the barycenter and of the left and right joints (e.g., hands, shoulders, feet, knees) as described below:

$$SI_{X_i} = \frac{(x_B - x_{L_i}) - (x_B - x_{R_i})}{x_{L_i} - x_{R_i}}, \qquad i = 1, \ldots, n$$

$$SI_{Y_i} = \frac{(y_B - y_{L_i}) - (y_B - y_{R_i})}{y_{L_i} - y_{R_i}}, \qquad i = 1, \ldots, n$$

$$SI_{Z_i} = \frac{(z_B - z_{L_i}) - (z_B - z_{R_i})}{z_{L_i} - z_{R_i}}, \qquad i = 1, \ldots, n$$

where $x_B, y_B, z_B$ are the coordinates of the center of mass, $x_{L_i}, y_{L_i}, z_{L_i}$ are the coordinates of a left joint (e.g., left hand, left shoulder, left foot), and $x_{R_i}, y_{R_i}, z_{R_i}$ are the coordinates of the corresponding right joint (e.g., right hand, right shoulder, right foot). The three partial indexes are then combined into a normalized index that expresses the overall estimated symmetry.
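A sketch of the per-axis computation for one left/right joint pair, under the reconstruction above (hypothetical names; a small guard avoids division by zero when the two joints coincide):

```python
import numpy as np

def partial_symmetry(barycenter, left, right):
    """Per-axis symmetry indexes (SI_x, SI_y[, SI_z]) for one joint pair.

    barycenter, left, right: length-d coordinate arrays (d = 2 or 3).
    Returns one partial index per axis.
    """
    b, l, r = (np.asarray(a, dtype=float) for a in (barycenter, left, right))
    num = (b - l) - (b - r)   # (x_B - x_L) - (x_B - x_R), per axis
    den = l - r               # x_L - x_R, per axis
    return num / np.where(np.abs(den) < 1e-12, 1e-12, den)
```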

2.2.5 Forward-backward leaning of the upper body and relative positions

Head and body movements and positions are relied upon as important features for distinguishing between various emotional expressions [14]. The amount of forward or backward leaning of a joint is measured by the velocity of the joint's displacement along its z (depth) component, relative to the body position and orientation:

$$Joint_{leaning}(f) = \dot{z}_{joint}(f)$$

2.2.6 Directness

Directness is one of the features deemed important in the process of recognizing emotions [15]. A direct movement is characterized by almost rectilinear trajectories. The Movement Directness Index is computed from the trajectory drawn in space by a joint, as the ratio between the Euclidean distance from the starting point to the ending point of the trajectory and the trajectory's actual length. The directness index tends to assume values near 1 if a movement is direct, and low values (near 0) otherwise. In the case of a three-dimensional trajectory the index is computed as follows:

$$DI = \frac{\sqrt{(x_E - x_S)^2 + (y_E - y_S)^2 + (z_E - z_S)^2}}{\displaystyle\sum_{k=1}^{N} \sqrt{(x_k - x_{k-1})^2 + (y_k - y_{k-1})^2 + (z_k - z_{k-1})^2}},$$

where $x_S, y_S, z_S$ are the coordinates of the starting point, $x_E, y_E, z_E$ are the coordinates of the ending point, and N is the length of the trajectory. The computation extends directly to two-dimensional trajectories.
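A minimal sketch of this ratio (hypothetical names; NumPy assumed):

```python
import numpy as np

def directness_index(trajectory):
    """Directness of an (n, d) trajectory: chord length over path length."""
    r = np.asarray(trajectory, dtype=float)
    chord = np.linalg.norm(r[-1] - r[0])                # start-to-end distance
    steps = np.linalg.norm(np.diff(r, axis=0), axis=1)  # per-frame displacements
    path = float(np.sum(steps))
    return chord / path if path > 0 else 0.0
```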


2.2.7 Periodicity

Periodicity can be calculated using the periodicity transform [18], which decomposes a sequence into a sum of periodic sequences by projecting it onto a set of "periodic subspaces". The periodicity transform searches for the best periodic characterization of the length-N sequence x. The underlying technique is to project x onto some periodic subspace $P_p$; this periodicity is then removed from x, leaving the residual r stripped of its p-periodicities. A sequence of real numbers is called p-periodic if there is an integer p with $x(k + p) = x(k)$ for all integers k. Let $\pi(x, P_p)$ represent the projection of x onto $P_p$. Then

$$\pi(x, P_p) = \sum_{s=0}^{p-1} \alpha_s \, \delta_p^s,$$

where the $\delta_p^s$ are the (orthogonal) p-periodic basis elements of $P_p$, and the coefficients $\alpha_s$ can be calculated as

$$\alpha_s = \frac{p}{\tilde{N}} \sum_{n=0}^{\tilde{N}/p - 1} \tilde{x}_N(s + np),$$

where $\tilde{x}_N$ is the sequence constructed from the first $\tilde{N} = p \lfloor N/p \rfloor$ elements of x, i.e., x truncated to a whole number of periods. The residual r can then be calculated as

$$r = x - \pi(x, P_p).$$

The process can be iterated, computing a new projection onto another periodic subspace starting from r. Since the choice of projection subspaces affects the detection of periodicities, we chose to implement two algorithms that apply the periodicity transform differently. The first algorithm, "small to large", assumes a threshold T and calculates the projections onto $P_p$ beginning with p = 1 and progressing through p = N/2; whenever a projection contains at least T percent of the energy in x, it is chosen as a basis element. The second algorithm, "M-best", maintains a list of the M best periodicities and the corresponding basis elements; when a new (sub)periodicity is detected that removes more power from the signal than one currently on the list, the new one replaces the old, and the algorithm iterates.
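A sketch of the single projection step under the definitions above (hypothetical names): averaging the samples at each phase s over all whole periods is equivalent to the coefficient formula for $\alpha_s$.

```python
import numpy as np

def project_periodic(x, p):
    """Projection of x onto the p-periodic subspace P_p, plus the residual.

    x is truncated to a whole number of periods before averaging, as in the text.
    """
    x = np.asarray(x, dtype=float)
    if p < 1 or p > len(x):
        raise ValueError("period out of range")
    n_periods = len(x) // p
    xt = x[: n_periods * p].reshape(n_periods, p)
    alpha = xt.mean(axis=0)            # alpha_s: average of x(s + np) over periods
    proj = np.tile(alpha, n_periods)   # the p-periodic component pi(x, P_p)
    resid = x[: n_periods * p] - proj  # r = x - pi(x, P_p)
    return proj, resid
```

A "small to large" scan would call this for p = 1, ..., N/2 and keep the projections whose energy exceeds T percent of the energy of x.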


2.2.8 Rigidity

Rigidity (or stiffness) is a measure of the relative movement of different parts of the body and is related to the expression of emotions [17] (rigidity of the human body is sometimes related to difficulties in expressing emotions). Rigidity can be evaluated [16] by observing a set of P points belonging to the body during a window of F frames. The absolute point coordinates $\{(u'_{fp}, v'_{fp}) \mid f = 1, \ldots, F;\; p = 1, \ldots, P\}$ are transformed to central coordinates, i.e., they are made relative to their geometric center, which is defined as

$$(u_f, v_f) = \left( \frac{1}{P} \sum_{i=1}^{P} u'_{fi}, \; \frac{1}{P} \sum_{i=1}^{P} v'_{fi} \right).$$

This yields the set of points $\{(u_{fp}, v_{fp}) \mid f = 1, \ldots, F;\; p = 1, \ldots, P\}$, where $(u_{fp}, v_{fp}) = (u'_{fp} - u_f, \; v'_{fp} - v_f)$. The obtained set of points can be used to construct the observation matrix P:

$$P = \begin{bmatrix} U \\ V \end{bmatrix} = \begin{bmatrix} u_{11} & \cdots & u_{1P} \\ \vdots & & \vdots \\ u_{F1} & \cdots & u_{FP} \\ v_{11} & \cdots & v_{1P} \\ \vdots & & \vdots \\ v_{F1} & \cdots & v_{FP} \end{bmatrix}$$

P has size $2F \times P$. Evaluating the rank of this matrix (i.e., estimating the number of linearly independent rows or columns) provides information about the rigidity of the set of points: if the points belong to a rigid body, then the rank of P is 3 or lower, whereas in the case of non-rigid bodies the rank is higher than 3.
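A sketch of the rank test (hypothetical names; in practice the rank is estimated with a tolerance on the singular values, which is what NumPy's matrix_rank does):

```python
import numpy as np

def rigidity_rank(u, v, tol=1e-6):
    """Rank of the centered 2F x P observation matrix of Section 2.2.8.

    u, v: (F, P) arrays of point coordinates over F frames.
    A rank <= 3 indicates a rigid configuration of points.
    """
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    uc = u - u.mean(axis=1, keepdims=True)  # subtract per-frame geometric center
    vc = v - v.mean(axis=1, keepdims=True)
    obs = np.vstack([uc, vc])               # observation matrix, size 2F x P
    return int(np.linalg.matrix_rank(obs, tol=tol))
```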

2.2.9 Impulsivity

In human motion analysis, impulsivity can be defined as a temporal perturbation of a regime motion [22]. Impulsivity refers to the physical concept of impulse as a variation of momentum, which contributes to defining a reference measure for impulsivity. According to psychological studies [23], [24], an impulsive gesture lacks premeditation, that is, it is performed without a significant preparation phase. We developed a first algorithm for impulsivity detection, derived from the work of Mancini and Mazzarino [19], in which a gesture is considered an impulse if it is characterized by short duration and high magnitude. The algorithm is the following:

    let Δt = 0.45 s;
    let gesture threshold = 0.02;
    if (energy ≥ threshold)
        evaluate the gesture time duration dt;
        if (dt ≥ 0 and dt ≤ Δt) then
            ImpulsivityIndex = ΔCI / dt;

The maximum duration for an impulse, Δt = 0.45 s, was chosen empirically based on our test data. The condition on the energy being greater than 0.02 reflects that we are interested in gestures with high magnitude, i.e., we consider only gestures with high energy. Results show that the above-described algorithm is able to identify impulsive gestures.
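For clarity, the same algorithm sketched in Python with the same empirical constants (energy, dt, and the contraction-index variation delta_ci are assumed to be computed upstream):

```python
DT_MAX = 0.45     # maximum impulse duration, seconds (empirical)
THRESHOLD = 0.02  # minimum gesture energy (empirical)

def impulsivity_index(energy, dt, delta_ci):
    """Return delta_CI / dt for short, high-energy gestures, else None.

    dt must be strictly positive here to avoid division by zero.
    """
    if energy >= THRESHOLD and 0 < dt <= DT_MAX:
        return delta_ci / dt
    return None
```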


3. Implementation of EyesWeb XMI blocks

Each of the features presented in Section 2 has been implemented as a software module and integrated into the EyesWeb XMI block library; these blocks can be used to create patches (programs) that perform user segmentation and human motion analysis. In this section we present the list of the developed EyesWeb XMI blocks and some example patches that show how the EyesWeb system works.

3.1 Kinematic features

3.1.1 Positions and Trajectories

User tracking can be performed using inexpensive depth-camera-based motion capture systems such as the Kinect [20] and the ASUS Xtion [21], which can extract both 2D and 3D MoCap data simultaneously, or using traditional webcams, in which case only 2D coordinates are available. EyesWeb XMI offers a large catalog of software modules for image analysis, motion capture (from both depth devices and webcams), and visualization (both 2D and 3D rendering tools). The patches shown in Figure 1 and Figure 2 are examples of how data can be processed and visualized in the EyesWeb XMI environment.

Figure 1: Example patch for 3D Coordinates capture and visualization

Figure 2: Example patch for 2D Coordinates capture and visualization

3.1.2 Velocities, accelerations and jerk computation

2D and 3D velocities and higher-order derivatives of the coordinates are computed by two EyesWeb XMI blocks developed specifically for these operations. The blocks implement the algorithms described in Section 2.1.2; they are named 3DTrajectoryFeatures and 2DTrajectoryFeatures. They accept a coordinate as input and calculate a single quantity chosen among speed, acceleration, jerk,


and curvature. Each quantity can be calculated as a tangential value (i.e., the magnitude of the vector) or as its components (e.g., the velocity components vx, vy, vz). The blocks are configurable using the following set of parameters (a sketch of an equivalent computation with an off-the-shelf Savitzky–Golay filter follows the list):

• Input Mode: the blocks accept two different types of input, a single coordinate of a point or a temporal sequence of coordinates stored in a multi-channel time series; the type of input is chosen with this parameter.

• Output Dim: used to choose the dimensionality of the desired output, which can be tangential (mono-dimensional) or components (multi-dimensional).

• Output Feature: the desired output quantity; the user can choose among velocity, acceleration, jerk, and curvature.

• Nominal frequency: the nominal frequency of the input; if a value of 0 is given, the frequency is computed automatically.

• Filter Order: the order of the Savitzky–Golay filter used in the block; the higher the order, the more smoothing is applied to the computed values.

3.2 Expressive features

3.2.1 Kinetic energy

Kinetic energy is computed by a single EyesWeb block named "KineticEnergyIndex", which, given a set of coordinates captured by a MoCap system (3D) or by image analysis (2D), computes their kinetic energy. The block can be configured through a number of parameters: each point of the input set is associated with a weight that is used during the index computation. The block provides two sets of predefined weights, associated with the Microsoft Kinect for Windows SDK and the OpenNI SDK sets of points; in addition, the predefined lists can be edited, and a custom set of weights can be created by the user. The created sets of weights can be saved to and loaded from file.

3.2.2 2D/3D Contraction Index

The contraction index is computed by two different blocks, shown in Figure 3. The first block (top right corner of Figure 3) is designed to compute the 2D contraction index starting from the user's blob (a black and white silhouette); it internally computes the blob's bounding rectangle (the minimum rectangle that contains the entire silhouette) and the blob's area. The second block (bottom right corner of Figure 3) computes the 3D contraction index starting from a set of three-dimensional coordinates; it internally evaluates the point density and the volume of the bounding parallelepiped. In both blocks the computation is performed as described in Section 2.2.2.


Figure 3: Example patch for 3D and 2D Contraction Index computation

3.2.3 Smoothness 2D/3D

Smoothness is derived from the computation of curvature. To extract 2D or 3D curvature from different trajectories, we use the 2DTrajectoryFeatures block for two-dimensional data or the 3DTrajectoryFeatures block for 3D MoCap data (see Section 3.1.2 for more details).

3.2.4 Symmetry

Symmetry is computed by two different blocks: the first calculates symmetry over a silhouette, the second over a set of 3D coordinates. The 2D symmetry block (top right corner of Figure 4) requires a silhouette as input and computes a pixel-wise symmetry index of the given silhouette. The 3D symmetry block (bottom right corner of Figure 4) implements the algorithm presented in Section 2.2.4. Both blocks output normalized indexes that indicate the amount of symmetry (0 not symmetric, 1 symmetric).

Figure 4: Example patch for 3D and 2D Symmetry Index computation


3.2.5 Density index

Density can be computed for 2D and 3D sets of coordinates as described at the end of Section 2.2.2. A single block that computes both 2D and 3D density indexes has been developed; it is configurable using the following parameters:

• Input Mode: the block accepts two different types of input, 2D or 3D data; the type of input is chosen with this parameter.

• Centroid, type of computation: used to choose how the centroid of the set of points is computed; the user can select between the mean centroid and the median centroid.

• Distance, type of computation: used to choose how the distances between the centroid and each point of the set are evaluated; the user can select among mean, median, and maximum distance.

• Output Dimensionality: the user can select how the computed distance is output (i.e., as a single index or as multiple components).

3.2.6 Directness

2D and 3D directness of a movement is computed by a single EyesWeb XMI block that, given the trajectory of a joint, calculates its directness as described in Section 2.2.6. The block can compute the directness index from both two-dimensional and three-dimensional trajectories, and outputs an index that expresses the directness of the trajectory (close to 1 if the movement is direct, close to 0 otherwise; see Figure 5).

Figure 5: Example of a non-direct (left) vs. a direct trajectory (right)

3.2.7 Periodicity

Periodicity is computed by a block that implements the "small to large" and "M-best" algorithms described in Section 2.2.7. Given a sequence of numbers as input, the block computes a periodicity transform based on the chosen algorithm; the algorithm to be used is selectable through a block parameter. The block analyzes the input sequence and outputs a sequence of periods and their contributions to the input sequence.


3.2.8 Rigidity

Rigidity is computed by a single block that implements the algorithm described in Section 2.2.8. The block accepts both two-dimensional and three-dimensional coordinates as input; given a set of coordinates, it computes a rigidity index. In the case of three-dimensional coordinates, the computation is performed by projecting the coordinates onto two planes, calculating the rigidity on each plane, and then combining the results into a single index that represents the overall rigidity.

3.2.9 Impulsivity

The Impulsivity Index is computed by a single EyesWeb XMI block that implements the algorithm presented in Section 2.2.9. The block requires the kinetic energy as input; the kinetic energy can be computed from both two-dimensional and three-dimensional data by the block presented in Section 3.2.1.

4. Conclusions

The developed modules and patches enable an accurate extraction and analysis of low-level expressive movement features. The extracted features can be computed from both three-dimensional and two-dimensional information, enabling the use of traditional webcams and cameras as well as more advanced (but inexpensive) depth sensors (Microsoft Kinect, ASUS Xtion). Future work will concentrate on the analysis of the low-level features and on the development of an emotion recognition system based on the extracted features.

5. References

[1] A. Camurri, P. Coletta, B. Mazzarino, and S. Piana, "Identification of expressive gestures relevant in improving capabilities of ASC children," Public Deliverable D4.1, The ASC-Inclusion Project (EU-FP7/2007-2013 grant No. 289021), Apr. 2012.
[2] H. G. Wallbott, "Bodily Expression of Emotion," European Journal of Social Psychology, vol. 28, pp. 879-896, 1998.
[3] A. Camurri, I. Lagerlöf, and G. Volpe, "Recognizing Emotion from Dance Movement: Comparison of Spectator Recognition and Automated Techniques," International Journal of Human-Computer Studies, vol. 59, pp. 213-225, July 2003.
[4] S. M. Khan and M. Shah, "Detecting group activities using rigidity of formation," Proceedings of the 13th Annual ACM International Conference on Multimedia, pp. 403-406, 2005.
[5] C. L. Roether, L. Omlor, and M. A. Giese, "Lateral Asymmetry of Bodily Emotion Expression," Current Biology, vol. 18, no. 8, pp. R329-R330, 2008.
[6] R. Q. Quiroga, T. Kreuz, and P. Grassberger, "Event synchronization: A simple and fast method to measure synchronicity and time delay patterns," Physical Review E, vol. 66, no. 041904, pp. 1-9, 2002.
[7] N. E. Dunbar and J. Burgoon, "Perceptions of power and interactional dominance in interpersonal relationships," Journal of Social and Personal Relationships, vol. 22, no. 2, pp. 207-233, 2005.
[8] R. Norton, Communicator Style: Theory, Applications, and Measures. Beverly Hills,


CA: Sage, 1983.
[9] M. S. Mast, "Dominance as expressed and inferred through speaking time: A meta-analysis," Human Communication Research, vol. 28, no. 3, pp. 420-450, 2002.
[10] J. E. Driskell, G. Goodwin, E. Salas, and P. O'Shea, "What makes a good team player? Personality and team effectiveness," Group Dynamics: Theory, Research, and Practice, vol. 10, pp. 249-271, 2006.
[11] A. Savitzky and M. J. E. Golay, "Smoothing and Differentiation of Data by Simplified Least Squares Procedures," Analytical Chemistry, vol. 36, no. 8, pp. 1627-1639, 1964.
[12] R. Laban and F. C. Lawrence, Effort. Macdonald & Evans Ltd., London, 1947.
[13] R. Laban, Modern Educational Dance. Macdonald & Evans Ltd., London, 1963.
[14] S. J. Schouwstra and J. Hoogstraten, "Head position and spinal position as determinants of perceived emotional state," Perceptual and Motor Skills, vol. 81, pp. 673-674, 1995.
[15] M. de Meijer, "The contribution of general features of body movement to the attribution of emotions," Journal of Nonverbal Behavior, vol. 13, no. 4, pp. 247-268, 1989.
[16] C. Tomasi and T. Kanade, "Shape and motion from image streams under orthography: a factorization method," International Journal of Computer Vision, vol. 9, no. 2, pp. 137-154, 1992.
[17] M. M. Gross, E. A. Crane, and B. L. Fredrickson, "Methodology for Assessing Bodily Expression of Emotion," Journal of Nonverbal Behavior, pp. 223-248, 2010.
[18] W. A. Sethares and T. W. Staley, "Periodicity Transforms," IEEE Transactions on Signal Processing, vol. 47, no. 11, pp. 2953-2964, 1999.
[19] B. Mazzarino and M. Mancini, "The Need for Impulsivity & Smoothness - Improving HCI by Qualitatively Measuring New High-Level Human Motion Features," Signal Processing and Multimedia Applications, pp. 62-67, 2010.
[20] Kinect for Windows, http://www.microsoft.com/en-us/kinectforwindows/
[21] ASUS Xtion Pro, http://www.asus.com/Multimedia/Motion_Sensor/Xtion_PRO/
[22] P. Heiser, J. Frey, J. Smidt, C. Sommerlad, P. M. Wehmeier, J. Hebebrand, and H. Remschmidt, "Objective measurement of hyperactivity, impulsivity, and inattention in children with hyperkinetic disorders before and after treatment with methylphenidate," European Child & Adolescent Psychiatry, vol. 13, no. 2, pp. 100-104, 2004.
[23] J. L. Evenden, "Varieties of impulsivity," Psychopharmacology, vol. 146, no. 4, pp. 348-361, 1999.
[24] C. T. Nagoshi, J. R. Wilson, and L. A. Rodriguez, "Impulsivity, Sensation Seeking, and Behavioral and Emotional Responses to Alcohol," Alcoholism: Clinical and Experimental Research, vol. 15, pp. 661-667, 1991.