RESEARCH ARTICLE

Vol. 35, No. 3, December 2014, pp. 241-252

Instantaneous Position and Orientation of the Body Segments as an Arbitrary Object in 3D Space by Merging Gyroscope and Accelerometer Information

J.A. Barraza-Madrigal∗, R. Muñoz-Guerrero∗, L. Leija-Salas∗, R. Ranta∗∗

∗ Centro de Investigación y de Estudios Avanzados (CINVESTAV-IPN).

∗∗ Centre de Recherche en Automatique de Nancy (CRAN-CNRS)

ABSTRACT
This work presents an algorithm to determine the instantaneous orientation of an object in 3D space. The orientation was determined by using a Direction Cosine Matrix (DCM), performed by the combination of three consecutive rotations around each of the main axes of the evaluated system, using quaternions. An inertial measurement unit (IMU), consisting of a 3-axis gyroscope and a 3-axis accelerometer, was used in order to establish two coordinate systems. The first one describes object movement, using the gyroscope as the main source of information and relating the angular rate of change over time. The second defines a coordinate reference system, relating the acceleration of gravity to an inertial direction vector. A proportional integral (PI) feedback controller was used to combine the sensors' information, eliminating offset, cancelling drift and improving the accuracy of the orientation. The proposed algorithm can be used for assessing both the position and orientation of the body segments, which is very important in orthopedics, traumatology and rheumatology for diagnosis, prognosis, therapy and research, as well as for the design and fabrication of measuring devices used in surgical instrumentation, prostheses and orthoses. It is important to note that the developed system opens the opportunity to be implemented in ambulatory joint evaluation through a wearable system, due to the dimensions and requirements of the sensors.

Keywords: instantaneous orientation, direction cosine matrix, quaternions, inertial measurement unit, offset and drifting deviation.


Correspondence: Barraza-Madrigal J.A.

Av. IPN 2508, Zacatenco, 07360, México DF

E-mail: [email protected]

Received: May 5, 2014

Accepted: October 10, 2014

RESUMEN
This work presents an algorithm to determine the instantaneous orientation of an object in 3D space. The orientation was determined using a direction cosine matrix (DCM) formed by the combination of three consecutive rotations around each of the axes of the evaluated system, using quaternions. An inertial measurement unit (IMU), composed of a 3-axis gyroscope and a 3-axis accelerometer, was used in order to establish two coordinate systems: a coordinate system describing the movement of the object, using the gyroscope as the main source of information and establishing its rate of change with respect to time, and a reference coordinate system, relating the gravitational acceleration to an inertial vector. A proportional integral (PI) feedback control was used to combine the information from the sensors, eliminating offset and drift deviations and improving the accuracy of the orientation. Given its characteristics, the proposed algorithm can be used in the evaluation of the position and orientation of body segments, which is of great importance in orthopedics, traumatology and rheumatology for the determination of diagnoses, therapeutic prognoses and research, as well as for the design and fabrication of measuring devices, surgical instrumentation, prostheses and orthoses. It is worth noting that the developed system opens opportunities for implementation in the design of ambulatory joint evaluation systems through wearable elements, given the small dimensions and requirements of the sensors employed.

Palabras clave: instantaneous orientation, direction cosine matrix, quaternions, inertial measurement unit, offset and drift deviation.

INTRODUCTION

Joint evaluation is very important in orthopedics, traumatology and rheumatology for assessing diagnosis, prognosis, therapy and research, as well as for the design and fabrication of measuring devices used in surgical instrumentation, prostheses and orthoses [1].

The kinematics of joint evaluation requires the determination of both the position and orientation of the body segments involved, which are normally modelled as rigid bodies related to an inertial reference system [2]. This is commonly achieved by tracking the movement of a group of reflective point markers attached to the body segment of interest using optical motion systems [3]. Nonetheless, such systems are expensive and cannot be used outside the laboratory, since they require a specific and controlled environment.

During the last decade, body-mounted sensors such as gyroscopes and/or accelerometers, also known as inertial sensors, have been used to obtain kinematic values, offering the advantage of identifying human motion in a wide variety of environments due to their low cost, small size and low power consumption [4]. However, inertial sensors do not measure the position or orientation of the body segment directly but the physical quantities related to the motion of the objects where the sensors are fixed [5]. On the one hand, the gyroscope measures angular velocity which, if the initial conditions are known, may be integrated over time to compute the sensor's orientation [6], [7]. In theory, both the position and orientation of an arbitrary object or body segment can be estimated by integrating the angular rate data obtained from a gyroscope sensor attached to it. Although these data can be translated into meaningful three-dimensional information, it has been reported that the integration of any amount of signal noise in the angular velocity data will cause an accumulative error in the estimated orientation, also known as drifting deviation [8]. On the other hand, the accelerometer measures the earth's gravitation and provides an absolute reference orientation, allowing it to measure the angle of the sensor unit with respect to gravity [9], [10]. This approach is appropriate when the acceleration due to motion is negligible with respect to gravity, and less accurate when accelerations due to motion disturb the measured direction of gravity. Furthermore, accelerometer signals do not contain information about the rotation around the vertical axis and therefore do not give a complete description of the orientation [11].

Hence, computing a single estimate of orientation through the optimal fusion of accelerometer and gyroscope sensors is needed [12]. Recently, many studies of human motion based on inertial sensors have proposed different methods to combine gyroscope and accelerometer information for kinematic joint evaluation. Dejnabadi et al. [4] proposed a method for measuring uniaxial joint angles using a combination of accelerometers and gyroscopes. The model was based on the acceleration of the joint center of rotation, obtained mathematically by shifting the location of the physical sensors. Joint angles were computed by estimating absolute angles without any integration of sensor information. Luinge et al. [13] described a method that used constraints in the elbow to measure the orientation of the lower arm with respect to the upper arm. Accelerometer and gyroscope orientations were obtained by using rotation matrices. A calibration method to determine the orientation of each sensor was determinant for the accuracy of the measured orientation. Zhang et al. [14] used a particle filter to fuse gyroscope and accelerometer information, compensating drifting deviation and also improving the estimation accuracy. Additionally, quaternions were used to represent orientations of upper limb segments, thereby avoiding singularities. Tadano et al. [8] proposed a method of three-dimensional gait analysis using wearable sensors and quaternion calculations. Accelerometer information was used to estimate the initial orientation of the sensors. Gyroscope information was used to estimate angular displacements during gait by integrating angular velocity. The orientations of the sensors were converted to the orientations of the body segments by a rotation matrix obtained from a calibration trial. Favre et al. [15] proposed two different methods to fuse gyroscope and accelerometer information: quaternion-based integration of angular velocity and orientation correction using gravity. Both methods are based on recognizing characteristic samples which provide the vertical orientation of the sensor.

This work presents an algorithm to determine both the position and orientation of an arbitrary object in 3D space, oriented to be used in kinematic joint evaluation. The proposed algorithm combines gyroscope and accelerometer information, thereby establishing two suitable coordinate systems describing the object movement with respect to a coordinate reference system. Both the position and orientation were achieved by estimating the spatial relationship between the coordinate systems through a Direction Cosine Matrix (DCM) computed from the combination of three consecutive rotations around each of the main axes of the evaluated system using quaternions. A PI feedback controller is used to apply the amount of rotation needed to align the orientations estimated from the gyroscope and accelerometer information, forcing them to converge with each other proportionally, eliminating drifting deviation and also improving the accuracy by decreasing the noise response.

METHODOLOGY

A combination of a 3-axis gyroscope and a 3-axis accelerometer was used in order to estimate both the position and orientation of an arbitrary object in 3D space. Gyroscope information (Gx, Gy, Gz) was used to define a coordinate system describing object movement (Omov) through the gyroscope angular rate (ωx, ωy, ωz). Accelerometer information (Ax, Ay, Az) was used to establish an absolute coordinate reference (Oref) based on the acceleration of gravity (λ), in order to compensate the drifting deviation given by the integration of gyroscope measurement errors over time.

Sensors and data acquisition

Gyroscope and accelerometer information was obtained from an IMU (InvenSense MPU-6050) configured at its lowest feature rates [16]: a full-scale range of ±250°/s and ±2g with a bandwidth of 256 Hz and 260 Hz, respectively. The full-scale range and bandwidth parameters were defined considering the common values reported in kinematic joint evaluation: gyroscope from ±200°/s to ±600°/s, accelerometer from ±2g to ±5g, and bandwidth from 2 Hz to 200 Hz [4], [8], [11], [15], [17], [18]. A 16-bit Microchip Digital Signal Controller (dsPIC30F6014A) was used to perform the optimal fusion of gyroscope and accelerometer information and to implement the algorithm for assessing the orientation of an object in 3D space. Communication between the IMU and the dsPIC was established through the Inter-Integrated Circuit serial protocol (I2C) at 400 kHz, which is the standard operating frequency. The response of the implemented algorithm was assessed through a virtual model designed in V-Realm Builder 2.0. Communication between the dsPIC and the virtual environment was achieved by using MATLAB R2013a.
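The register-level setup used on the dsPIC is not listed in the paper; the following is only a minimal sketch of an equivalent MPU-6050 configuration, assuming a Linux host with the sensor on I2C bus 1 and the smbus2 Python package (register addresses and sensitivities are taken from the MPU-6050 register map; the bus number and wiring are assumptions).

```python
# Illustrative sketch only: the paper's implementation runs on a dsPIC30F6014A.
from smbus2 import SMBus

MPU_ADDR     = 0x68   # default MPU-6050 I2C address (AD0 low)
PWR_MGMT_1   = 0x6B
CONFIG       = 0x1A   # DLPF_CFG = 0 -> accel 260 Hz / gyro 256 Hz bandwidth
GYRO_CONFIG  = 0x1B   # FS_SEL  = 0 -> +/-250 deg/s
ACCEL_CONFIG = 0x1C   # AFS_SEL = 0 -> +/-2 g
ACCEL_XOUT_H = 0x3B   # start of the 14-byte accel/temp/gyro burst

GYRO_LSB  = 131.0     # LSB per deg/s at +/-250 deg/s
ACCEL_LSB = 16384.0   # LSB per g at +/-2 g

def to_int16(hi, lo):
    """Combine two bytes into a signed 16-bit value."""
    val = (hi << 8) | lo
    return val - 65536 if val > 32767 else val

with SMBus(1) as bus:                                   # bus 1 is an assumption
    bus.write_byte_data(MPU_ADDR, PWR_MGMT_1, 0x00)     # wake the device
    bus.write_byte_data(MPU_ADDR, CONFIG, 0x00)
    bus.write_byte_data(MPU_ADDR, GYRO_CONFIG, 0x00)
    bus.write_byte_data(MPU_ADDR, ACCEL_CONFIG, 0x00)

    raw = bus.read_i2c_block_data(MPU_ADDR, ACCEL_XOUT_H, 14)
    ax, ay, az = (to_int16(raw[i], raw[i + 1]) / ACCEL_LSB for i in (0, 2, 4))
    gx, gy, gz = (to_int16(raw[i], raw[i + 1]) / GYRO_LSB for i in (8, 10, 12))
    print(ax, ay, az, gx, gy, gz)                       # accel in g, gyro in deg/s
```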

Orientation estimate

The orientation estimation problem involves two coordinate systems: one attached to the arbitrary object, describing its movement, Omov, and the other located at some point on the earth but not attached to it, establishing a reference frame, Oref. Since the coordinates of a vector depend on the frame in which it is represented, an arbitrary vector can be represented in the rotation frame. Consequently, these two frames can rotate relative to each other independently [19]. This rotation is estimated by a 4D complex number, also called a quaternion (Q), which is conformed by a scalar (q0) and a 3D vector q, with q = [q1, q2, q3], Eq. (1).

Q = {q0,q} (1)

The rotational vector Q is associated with the sine (S) and the cosine (C) of the rotation angle σ = [α, φ, θ] over each of the main axes (x, y, z) of the evaluated coordinate system in time, Eq. (2).

$$Q = \tfrac{1}{2}\left[\,C(\sigma),\; \mathbf{q}\,S(\sigma)\,\right] \qquad (2)$$

Quaternions can be composed to express several rotations through the quaternion product Qcomp, Eq. (3). Qcomp can also be used to express the rotation in 3D space, Eq. (4).

$$Q_{comp} = Q_a \otimes Q_b = (q_{a0}, \mathbf{q}_a) \otimes (q_{b0}, \mathbf{q}_b) = \left\{\, q_{a0}q_{b0} - \mathbf{q}_a \cdot \mathbf{q}_b,\; \mathbf{q}_a \times \mathbf{q}_b + q_{a0}\mathbf{q}_b + q_{b0}\mathbf{q}_a \,\right\} \qquad (3)$$

Therefore, Qcomp = Qz(θ) ⊗ Qy(φ) ⊗ Qx(α):

$$Q_{comp} = \tfrac{1}{2}\begin{bmatrix} C(\theta)\,C(\phi)\,C(\alpha) + S(\theta)\,S(\phi)\,S(\alpha) \\ C(\theta)\,C(\phi)\,S(\alpha) - S(\theta)\,S(\phi)\,C(\alpha) \\ C(\theta)\,S(\phi)\,C(\alpha) + S(\theta)\,C(\phi)\,S(\alpha) \\ S(\theta)\,C(\phi)\,C(\alpha) - C(\theta)\,S(\phi)\,S(\alpha) \end{bmatrix} \qquad (4)$$
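As a concrete illustration of Eqs. (3)-(4), the sketch below composes the three axis rotations with the quaternion product. It uses the conventional half-angle axis-angle quaternion rather than the paper's compact notation, and the function names are ours, not the paper's.

```python
import numpy as np

def quat_product(qa, qb):
    """Quaternion product Qa (x) Qb, Eq. (3): {a0*b0 - a.b, a x b + a0*b + b0*a}."""
    a0, a = qa[0], np.asarray(qa[1:])
    b0, b = qb[0], np.asarray(qb[1:])
    scalar = a0 * b0 - np.dot(a, b)
    vector = np.cross(a, b) + a0 * b + b0 * a
    return np.concatenate(([scalar], vector))

def quat_from_axis_angle(axis, angle):
    """Unit quaternion for a rotation of 'angle' radians about a principal axis (0=x, 1=y, 2=z)."""
    q = np.zeros(4)
    q[0] = np.cos(angle / 2.0)
    q[1 + axis] = np.sin(angle / 2.0)
    return q

def quat_compose(alpha, phi, theta):
    """Compose the three consecutive rotations of Eq. (4): Qz(theta) (x) Qy(phi) (x) Qx(alpha)."""
    qx = quat_from_axis_angle(0, alpha)
    qy = quat_from_axis_angle(1, phi)
    qz = quat_from_axis_angle(2, theta)
    return quat_product(quat_product(qz, qy), qx)
```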

To transform vectors from one coordinate system to another using Eq. (4), a Direction Cosine Matrix (DCM) is constructed as shown in Eq. (5), which is used to establish the spatial relationship between the coordinate systems Omov and Oref and allows estimating the orientation [20], [21].

$$DCM = (q_0^2 - \mathbf{q}'\mathbf{q})\,I + 2\,\mathbf{q}\mathbf{q}' - 2\,q_0\,\mathbf{q}^{*} \qquad (5)$$

When expanded this yields


$$DCM = \begin{bmatrix} q_0^2 + q_1^2 - q_2^2 - q_3^2 & 2(q_1q_2 + q_0q_3) & 2(q_1q_3 - q_0q_2) \\ 2(q_1q_2 - q_0q_3) & q_0^2 - q_1^2 + q_2^2 - q_3^2 & 2(q_2q_3 + q_0q_1) \\ 2(q_1q_3 + q_0q_2) & 2(q_2q_3 - q_0q_1) & q_0^2 - q_1^2 - q_2^2 + q_3^2 \end{bmatrix}$$
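A direct transcription of the expanded DCM of Eq. (5) might look as follows (a sketch; the normalization guard is our addition, not part of the paper's formulation).

```python
import numpy as np

def dcm_from_quaternion(q):
    """Direction Cosine Matrix of Eq. (5), written out as the expanded 3x3 form above."""
    q0, q1, q2, q3 = np.asarray(q, dtype=float) / np.linalg.norm(q)  # guard against de-normalization
    return np.array([
        [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 + q0*q3),             2*(q1*q3 - q0*q2)],
        [2*(q1*q2 - q0*q3),             q0*q0 - q1*q1 + q2*q2 - q3*q3, 2*(q2*q3 + q0*q1)],
        [2*(q1*q3 + q0*q2),             2*(q2*q3 - q0*q1),             q0*q0 - q1*q1 - q2*q2 + q3*q3],
    ])
```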

Direction cosine matrix improvement

To estimate the orientation of an object in 3D space, the DCM is updated by incremental changes when either Omov or Oref rotates. These incremental changes can be established by integrating the DCM as shown in Eq. (6).

$$DCM = DCM_{t-1} \cdot DCM_t \qquad (6)$$

Because the DCM is conformed from Qcomp, Eq. (4), it is noted that by expressing the elements of Qcomp as approximations rather than identities, Qcomp can be reduced to Qred, Eq. (7).

$$\cos(\sigma) \approx 1, \quad \sin(\sigma)\cdot\sin(\sigma) \approx 0 \;\;\xrightarrow{\;yields\;}\;\; \begin{bmatrix} \sin(\alpha) \\ \sin(\phi) \\ \sin(\theta) \end{bmatrix} \approx \begin{bmatrix} \alpha \\ \phi \\ \theta \end{bmatrix}$$

$$Q_{red} = \tfrac{1}{2}\,[\,1,\; \alpha,\; \phi,\; \theta\,] \qquad (7)$$

For assessing the DCM improvement in order to estimate orientation, incremental changes were estimated from the integration of Qred over time, improving Qred as shown in Eq. (8).

$$Q_{imp} = Q_{red_{t-1}} \otimes Q_{red_t} \qquad (8)$$

When expanded this yields:

$$Q_{imp} = \begin{bmatrix} q_0 \\ q_1 \\ q_2 \\ q_3 \end{bmatrix} + \tfrac{1}{2}\begin{bmatrix} -\alpha q_1 - \phi q_2 - \theta q_3 \\ \alpha q_0 + \theta q_2 - \phi q_3 \\ \phi q_0 - \theta q_1 - \alpha q_3 \\ \theta q_0 + \phi q_1 - \alpha q_2 \end{bmatrix}$$

It should be noted that numerical errors in the approximations will reduce the orthogonality between the rotational axes of the coordinate systems [22]. Consequently, Qimp is normalized before being substituted in Eq. (5) in order to perform an improved DCM, Eq. (9).

$$Q_{imp(norm)} = \frac{Q_{imp}}{\|Q_{imp}\|}, \qquad DCM = DCM_t \qquad (9)$$
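The reduced-quaternion update of Eqs. (7)-(9) can be sketched as below; the expansion reproduces the terms exactly as printed above, and the per-sample incremental rotations are assumed to be in radians.

```python
import numpy as np

def update_quaternion(q, sigma):
    """Small-angle update of Eqs. (7)-(8), q (x) (1/2)[1, alpha, phi, theta],
    followed by the normalization of Eq. (9)."""
    q0, q1, q2, q3 = q
    alpha, phi, theta = sigma                  # incremental rotations over one sample (rad)
    dq = 0.5 * np.array([
        -alpha * q1 - phi * q2 - theta * q3,   # same expansion as printed above
         alpha * q0 + theta * q2 - phi * q3,
         phi * q0 - theta * q1 - alpha * q3,
         theta * q0 + phi * q1 - alpha * q2,
    ])
    q_imp = np.asarray(q, dtype=float) + dq
    return q_imp / np.linalg.norm(q_imp)       # Q_imp(norm), Eq. (9)
```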

Coordinate system that describes the object movement (Omov)

The gyroscope output (ωx, ωy, ωz) is used as the main source of information in order to define the coordinate system that describes the object movement. Omov then has its origin in the tri-axial gyroscope and each point along the gyroscope axes (Gx, Gy, Gz). Therefore, by relating the angular rate of change along time (Δt) to its rotation during the movement of the IMU, a single estimation of the gyroscope orientation (σgyro) can be determined, Eq. (10).

$$\sigma_{gyro} = (\omega - \omega_{off})\,\Delta t \qquad (10)$$

where ωoff is the intrinsic offset of the gyroscope sensor, estimated only at the beginning of the test by computing the mean value of the data recorded during one second in static position.
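A minimal sketch of Eq. (10) and of the offset estimation described above (function names are ours, and the angular rate is assumed to be already converted to rad/s):

```python
import numpy as np

def gyro_offset(static_samples):
    """w_off: mean of the angular rates recorded during one second at rest (Nx3 array)."""
    return np.mean(np.asarray(static_samples), axis=0)

def gyro_increment(omega, omega_off, dt):
    """sigma_gyro = (w - w_off) * dt, Eq. (10); omega in rad/s, dt in seconds."""
    return (np.asarray(omega) - omega_off) * dt
```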

It is noted that σgyro could be used to estimate both the position and orientation of an arbitrary object or body segment by conforming the DCM. However, constructing the DCM only from σgyro is not only inefficient but also inconvenient, considering that any amount of noise in the angular velocity data will result in integration errors and, consequently, an accumulative gyro drift in the DCM. Hence, an additional source of information is needed to assess drifting compensation.

Drifting estimation

Accelerometer information (λx, λy, λz) defines the coordinate reference system, which is used to assess drifting deviation. Therefore, Oref has its origin at the tri-axial accelerometer and each point along the accelerometer axes (Ax, Ay, Az). Assuming that the accelerometer is only measuring the acceleration of gravity, an appropriate convention is to assume that the direction of gravity g = [0, 0, 1] defines the vertical axis z [12], [23].


Thus, the inertial direction vector ~v can be computed from the normalized accelerometer measurements, Eq. (11).

$$\vec{v} = \frac{\lambda}{\|\lambda\|} \qquad (11)$$

Given a DCM estimation from the gyroscope measurement, a gravity vector is determined as in Eq. (12).

$$v = g \cdot DCM \qquad (12)$$

Finally, considering that the magnitude of the cross product between two vectors is proportional to the sine of the angle between them and that its direction is perpendicular to both of them [24], the drifting deviation is detected by taking the cross product between the vectors ~v and v, which provides both the axis and the amount of rotation that are needed to rotate v to become parallel to the estimated vector σgyro, Eq. (13).

σaccel = ~v × v (13)
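Eqs. (11)-(13) can be sketched as a single correction step; the gravity convention g = [0, 0, 1] follows the text, and the row-vector product of Eq. (12) is assumed.

```python
import numpy as np

def drift_correction(accel, dcm, g=np.array([0.0, 0.0, 1.0])):
    """Eqs. (11)-(13): normalize the measured acceleration, project gravity through the
    current DCM estimate, and take the cross product as the axis/amount of correction."""
    v_meas = np.asarray(accel, dtype=float) / np.linalg.norm(accel)  # Eq. (11), ~v
    v_est = g @ dcm                                                  # Eq. (12), gravity seen through the DCM
    return np.cross(v_meas, v_est)                                   # Eq. (13), sigma_accel
```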

Data integration

Slow movements are related to low noise levels [25]. Hence, drifting deviation can be corrected by adding the total amount of rotation σaccel to the estimated orientation σgyro, in order to align the coordinate systems that they represent, Eq. (14).

σ = σgyro + σaccel (14)

Nonetheless, σaccel is susceptible to high noise levels due to variations in velocity during movement. Therefore, a half portion of σaccel, computed by multiplying σaccel by a weight ωσ, is used as feedback to a Proportional Integral (PI) controller, Eq. (15), before being added to σgyro to obtain σ, Eq. (14). Hence, it is possible to apply a proportional amount of the rotation needed to align Omov and Oref, instead of applying the total correction. Consequently, both coordinate systems are forced to converge proportionally, thereby cancelling drifting.

$$\sigma_{accel} = K_p\, f(\omega_\sigma \cdot \sigma_{accel}) + K_i \int_0^t f(\omega_\sigma \cdot \sigma_{accel})\, \Delta t \qquad (15)$$

The proportional and integral parameters (Kp and Ki) were established based on the Good Gain method for PI(D) controller tuning reported in [26], [27].
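Putting Eqs. (14)-(15) together, one PI fusion step might be sketched as follows. Kp, Ki (tuned with the Good Gain method) and the weight ωσ are left as user-supplied parameters; the function f(·) is not specified in the text, so the identity is assumed, and the integral state is carried between calls.

```python
import numpy as np

def pi_fusion_step(sigma_gyro, sigma_accel, integral, kp, ki, w_sigma, dt):
    """One step of the data integration of Eqs. (14)-(15): the weighted accelerometer
    correction passes through a PI controller before being added to sigma_gyro."""
    err = w_sigma * np.asarray(sigma_accel)        # weighted correction term, f(.) assumed identity
    integral = integral + err * dt                 # running integral of Eq. (15)
    sigma_corr = kp * err + ki * integral          # PI output
    sigma = np.asarray(sigma_gyro) + sigma_corr    # Eq. (14)
    return sigma, integral
```

Each returned σ would then feed the quaternion update of Eq. (8) and, through Eq. (9), the DCM of Eq. (5).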

RESULTS AND DISCUSSION

Proposed algorithms

Gyroscope and accelerometer information was used in order to establish two suitable coordinate systems describing the object movement, Omov, and the reference coordinate system, Oref. The description of an object in 3D space was assessed by estimating the spatial relationship between the coordinate systems. Commonly, this relationship is given by both the position and orientation of the object with respect to a reference system. However, assuming that both systems overlap at the origin, there is no change of position between them [24]. Therefore, an arbitrary object in 3D space can be described by its orientation. This orientation is determined through a DCM computed from the combination of three consecutive rotations around each of the main axes of the evaluated system using quaternions. Fig. 1 shows a block diagram representing the proposed algorithms.

In Fig. 1 it can be observed that two different solutions are proposed to estimate the amount of rotation σ, from which the quaternion Q defining the rotation in 3D space is computed: 1) estimation of σ by combining σgyro and σaccel without any noise reduction, (σgyro & σaccel); 2) estimation of σ by combining σgyro and σaccel using a PI feedback controller, (σgyro & σaccel)&PI. It is noted that Q is used to perform the DCM from which the orientation of an object in 3D space is defined. It is noteworthy that two different solutions were also proposed to estimate the DCM: 1) estimation of the DCM from Qcomp, which defines the orientation through the DCM product; 2) estimation of the DCM from Qimp, which defines the orientation through the quaternion product.


Fig. 1. Estimation of Q from σ as (σgyro & σaccel) without any reduction of noise; estimation of Q from σ as (σgyro & σaccel)&PI by using a PI feedback controller; estimation of the DCM from Qcomp; estimation of the DCM from Qimp.

Estimating differences between proposed algorithms

In order to estimate the differences between the proposed algorithms, the analysis of the behavior of the sensor is carried out by estimating the orientation both in static position and during movement. Hence, the orientation is represented as Euler angles computed from the DCM, Eq. (16).

$$\alpha = \operatorname{atan2}(DCM_{23}, DCM_{33}), \qquad \phi = \operatorname{asin}(DCM_{13}), \qquad \theta = \operatorname{atan2}(DCM_{12}, DCM_{11}) \qquad (16)$$
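Eq. (16) translates directly into code; note that the text uses 1-based matrix indices while the sketch below uses 0-based indexing.

```python
import numpy as np

def euler_from_dcm(dcm):
    """Euler angles of Eq. (16) from the DCM, returned in degrees."""
    alpha = np.arctan2(dcm[1, 2], dcm[2, 2])   # atan2(DCM23, DCM33)
    phi = np.arcsin(dcm[0, 2])                 # asin(DCM13)
    theta = np.arctan2(dcm[0, 1], dcm[0, 0])   # atan2(DCM12, DCM11)
    return np.degrees([alpha, phi, theta])
```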

Behavior analysis of the estimated orientation in static position

For the analysis of behavior in static position, five minutes of the obtained information were recorded, which allows the comparison of drifting deviation over time. It should be noted that, to assess the behavior of the proposed algorithms, the estimation of the orientation using either σgyro or σaccel alone is included in the analysis. A comparison is made by using the sensors' information separately and also by fusing the sensors' information through the proposed algorithms.

In Fig. 2 it can be observed that the orientation estimated from σgyro alone does not seem to be very noisy but drifts over time. On the other hand, the orientation estimated from σaccel does not drift over time but is quite noisy, especially when compared to σgyro. When analyzing the estimated orientation through either (σgyro & σaccel) or (σgyro & σaccel)&PI, drifting deviation appears to be cancelled and the noise reduced. Despite the considerable reduction of noise of (σgyro & σaccel)&PI in comparison to (σgyro & σaccel), it is noted that drifting deviation is compensated but not eliminated, thereby maintaining the behavior of σgyro in the Z axis. The main reason is that both algorithms depend on the sensor fusion in order to compensate drifting deviation, and the accelerometer information can only be used to estimate the orientation of an object in the X and Y axes but not in the Z axis because of its dependence on the acceleration of gravity, as was reported in [11].



Fig. 2. Behavior analysis of the estimated orientation in static position, comparing the drifting deviation of the orientation estimated from σgyro and σaccel independently, as well as through the proposed algorithms (σgyro & σaccel) and (σgyro & σaccel)&PI.

Table 1. Drifting deviation and RMSE in static position

                      X axis               Y axis               Z axis
                      Drift (5 min)  RMSE  Drift (5 min)  RMSE  Drift (5 min)  RMSE
σgyro                 22.85°       1.3e-4  45.15°       1.4e-4  19.87°       1.8e-4
σaccel                0.34°        0.18    0.17°        0.21    0.002°       0.007
σgyro & σaccel        0.33°        0.18    0.12°        0.21    9.26°        0.007
(σgyro & σaccel)&PI   0.001°       0.04    0.10°        0.05    9.21°        0.007

Consequently, the drifting deviation in this axis cannot be completely eliminated. Finally, in order to verify these assumptions, Table 1 shows a comparison of the amount of drifting deviation in degrees as well as the amount of noise in static position, represented by the Root Mean Square Error (RMSE). Similarly, Fig. 3 shows a comparison of the estimation of the orientation in the Z axis, in a range of -90 to 90 degrees, using only σgyro or σaccel, hence demonstrating the mentioned dependence on the acceleration of gravity.


Fig. 3. Comparison of the estimated orientation in the Z axis.



Fig. 4. Behavior analysis of the estimated orientation during movement within a range of motion from -90 to 90 degrees, comparing the drifting deviation between σgyro and the proposed algorithm (σgyro & σaccel)&PI, for each axis.

Behavior analysis of the estimated orientation during movement

Although σaccel is susceptible to high noise levels, it can be used to estimate both the position and orientation of an arbitrary object or body segment in the X and Y axes, which is enough in some applications [2], [17]. Although σaccel does not produce drifting deviation during the static analysis, it was demonstrated that it cannot be used to estimate either the position or the orientation in 3D space because of its dependence on the acceleration of gravity. Despite the fact that σgyro is subject to drifting deviation, some authors suggest that it could be useful in the evaluation of body segments due to its low susceptibility to noise [2], [17]. Accordingly, for the analysis of behavior during movement, six repetitions from -90 to 90 degrees for each axis were performed, comparing σgyro with the proposed algorithm (σgyro & σaccel)&PI in order to determine both the amount of drifting deviation and the noise during movement. As was assumed during the static analysis, in Fig. 4 (X and Y axes) it can be noted that, because of the drifting deviation in σgyro, the estimated orientation is lost over time during movement, even in a short period of time. On the other hand, it can be noted that the orientation estimated using (σgyro & σaccel)&PI appears to be as low in noise as σgyro and as resistant to drift as σaccel, therefore demonstrating an optimal sensor fusion. In addition, although the Z axis maintains the behavior of σgyro, it was demonstrated that the sensor fusion not only allows compensating and eliminating both drifting deviation and noise, as can be seen in Table 2, but also allowed estimating the orientation of an arbitrary object or body segment in 3D space successfully.

To represent the orientation of an object in 3D space, the DCM was conformed from σ instead of using only Euler angles because of its singularity-free orientation representation. Accordingly, considering that an arbitrary object in 3D space can be represented as a vector through the coordinate triplet Oxyz, the orientation of the object in 3D space will be given by Eq. (17).

$$O_{xyz} = O_{xyz_{t-1}} \cdot DCM \qquad (17)$$

Table 2. Drifting deviation and RMSE during movement

                      X axis               Y axis               Z axis
                      Drift (30 s)   RMSE  Drift (30 s)   RMSE  Drift (30 s)   RMSE
σgyro                 48.82°       24.12°  4.39°        11.36°  15.15°       7.14°
(σgyro & σaccel)&PI   2.59°        2.75°   1.32°        2.98°   15.15°       7.14°



Fig. 5. Virtual representation of the estimated orientation of an object during movement, in a range of motion from 0 to 90 degrees about the X axis, based on the sensor orientation.

Finally, Oxyz defines the orientation of an object in 3D space based on the sensor orientation, representing it through a virtual environment as a cube while performing a trajectory from 0 to 90 degrees about the X axis, as shown in Fig. 5.
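For the virtual representation of Fig. 5, Eq. (17) amounts to multiplying the object's coordinate triplets by the current DCM; the unit cube below is a hypothetical stand-in for the V-Realm Builder model.

```python
import numpy as np

def rotate_object(vertices, dcm):
    """Eq. (17): update the coordinate triplets O_xyz of a virtual object with the current DCM."""
    return np.asarray(vertices, dtype=float) @ dcm

# hypothetical unit cube centred at the origin (8 vertices)
cube = np.array([[x, y, z] for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)], dtype=float)
cube_rotated = rotate_object(cube, np.eye(3))   # identity DCM leaves the cube unchanged
```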

CONCLUSION

The orientation of an arbitrary object or body segment in 3D space cannot be properly estimated using gyroscope or accelerometer information separately. Although the gyroscope angular rate data can be translated into meaningful 3D information, it was demonstrated that the integration of any amount of signal noise will cause an accumulative drifting deviation over time. On the other hand, accelerometer information provides an absolute reference orientation which allows estimating the orientation with respect to gravity. Nonetheless, it was demonstrated that the accelerometer is not able to give a complete description of the orientation because the accelerometer signals do not contain information about the rotation around the vertical Z axis.

The results from the present study show that the proposed algorithm including the PI feedback controller not only eliminates drifting deviation but also reduces noise, both in static position and during movement, thereby estimating the orientation of an arbitrary object in 3D space. It should be noted that the proposed algorithm simplifies the equations by reducing the operations to be computed and, consequently, the computational cost, in contrast to methods based on the usual computation of quaternions and rotation matrices.

Finally, from the obtained results, and considering that the parameters of the proposed algorithm were configured according to kinematic joint evaluation, it was demonstrated that it is possible to estimate the instantaneous position and orientation of the body segments as an arbitrary object in 3D space by merging gyroscope and accelerometer information.

The results obtained, as well as the low cost, small size and low power consumption of the employed sensors, are encouraging for the design of a wearable system for ambulatory kinematic joint evaluation as future work.

ACKNOWLEDGMENT

The authors thank the Consejo Nacional de Ciencia y Tecnología (CONACyT, México) for the scholarship granted to Barraza-Madrigal J.A. and for its support of this research. The authors also thank Ing. Eladio Cardiel Perez for his valuable assistance in carrying out this study.

REFERENCES

1. T. Shimada, “Normal Range of Motion of Joints in Young Japanese People,” Bulletin of Allied Medical Sciences Kobe (BAMS Kobe), 1988; 4:103-109.

2. D. Giansanti, V. Macellari, G. Maccioni and A. Cappozzo, “Is it Feasible to Reconstruct Body Segment 3-D Position and Orientation Using Accelerometric Data?” IEEE Transactions on Biomedical Engineering, 2003; 50(4):476-483.

3. Sen Suddhajit, R. Abboud, D. Ming, B. Wan, Y. Liao and Q. Gong, “A motion simulation and biomechanical analysis of the shoulder joint using a whole human model,” in 4th International Conference on Biomechanical Engineering and Informatics (BMEI), 2011; 4:2322-2326.

4. H. Dejnabadi, B. M. Jolles and K. Aminian, “A New Approach to Accurate Measurement of Uniaxial Joint Angles Based on a Combination of Accelerometers and Gyroscopes,” IEEE Transactions on Biomedical Engineering, 2005; 52(8):1478-1484.

5. M. A. Sabatini, “Estimating Three-Dimensional Orientation of Human Body Parts by Inertial/Magnetic Sensing,” Sensors, 2011; 11(2):1489-1525.

6. J. Bortz, “A New Mathematical Formulation for Strapdown Inertial Navigation,” IEEE Transactions on Aerospace and Electronic Systems, 1971; AES-7(1):61-66.

7. M. Ignagni, “Optimal Strapdown Attitude Integration Algorithms,” Journal of Guidance, Control and Dynamics, 1990; 13(2):363-369.

8. S. Tadano, R. Takeda and H. Miyagawa, “Three Dimensional Gait Analysis Using Wearable Acceleration and Gyro Sensors Based on Quaternion Calculations,” Sensors, 2013; 13(7):9321-9343.

9. E. Bernmark and C. Wiktorin, “A triaxial accelerometer for measuring arm movements,” Applied Ergonomics, 2002; 33(6):541-547.

10. G. Hansson, P. Asterland, N. Holmer and S. Skerfving, “Validity and reliability of triaxial accelerometers for inclinometry in posture analysis,” Medical and Biological Engineering and Computing, 2001; 39(4):405-413.

11. H. Luinge and P. Veltink, “Measuring orientation of human body segments using miniature gyroscopes and accelerometers,” Medical and Biological Engineering and Computing, 2005; 43(2):273-282.

12. S. O. Madgwick, A. J. Harrison and R. Vaidyanathan, “Estimation of IMU and MARG orientation using a gradient descent algorithm,” in IEEE International Conference on Rehabilitation Robotics, Zurich, Switzerland, 2011.


13. H. J. Luinge and P. H. Veltink, “Inclination Measurement of Human Movement Using a 3-D Accelerometer With Autocalibration,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2004; 12(1):112-121.

14. Z. Zhang, Z. Huang and J. Wu, “Hierarchical Information Fusion for Human Upper Limb Motion Capture,” in 12th International Conference on Information Fusion, Seattle, WA, USA, 2009.

15. J. Favre, B. Jolles, O. Siegrist and K. Aminian, “Quaternion-based fusion of gyroscopes and accelerometers to improve 3D angle measurement,” Electronics Letters, 2006; 42(11):612-614.

16. InvenSense, MPU-6000 and MPU-6050 Register Map and Descriptions, Revision 4.0, 03/09/2012: 13-15.

17. E. L. Morin, S. A. Reid and J. M. Stevenson, “Characterization of Upper Body Accelerations for Task Performance in Humans,” in Proceedings of the 25th Annual Conference of the IEEE EMBS, Cancún, México, 2003.

18. A. Olivares, J. Górriz, J. Ramirez and G. Olivares, “Accurate human limb angle measurement: sensor fusion through Kalman, least mean squares and recursive least-squares adaptive filtering,” Measurement Science and Technology, 2010; 22(2):1-15.

19. E. Bekir, Introduction to Modern Navigation Systems, World Scientific Publishing Co. Pte. Ltd., 2007.

20. H. Schneider and G. Barker, Matrices and Linear Algebra, Holt, Rinehart and Winston Inc., New York, 1968.

21. C. Cullen, Matrices and Linear Transformations, Addison-Wesley Co., Reading, MA, 1972.

22. W. Premerlani and P. Bizard, Direction Cosine Matrix: Theory, 17 May 2009.

23. S. O. H. Madgwick, “An efficient orientation filter for inertial and inertial/magnetic sensor arrays,” Technical report, Department of Mechanical Engineering, University of Bristol, 2010.

24. B. Antonio, Fundamentos de Robótica, McGraw-Hill, 1997.

25. R. Mahony, T. Hamel and J.-M. Pflimlin, “Nonlinear Complementary Filters on the Special Orthogonal Group,” IEEE Transactions on Automatic Control, 2008; 53(5):1203-1218.

26. F. Haugen, “The Good Gain method for PI(D) controller tuning,” TechTeach, 2010: 1-7.

27. F. Haugen, “The Good Gain method for simple experimental tuning of PI controllers,” Modeling, Identification and Control, 2012; 33(4):141-152.