

Human-like Arm Motion Generation for Humanoid Robots Using Motion Capture Database

Seungsu Kim∗, ChangHwan Kim† and Jong Hyeon Park‡

∗‡School of Mechanical Engineering, Hanyang University, Seoul, 133-791, Korea.
∗†Intelligent Robotics Research Center, Korea Institute of Science and Technology, Seoul, 130-650, Korea.
Email: ∗[email protected]; †[email protected]; ‡[email protected]

Abstract— When communicating and interacting with a human through motions or gestures, a humanoid robot needs not only to look like a human but also to behave like one, to avoid confusion in the communication and interaction. Among human-like behaviors, the arm motions of a humanoid robot are essential for communicating with people through motion. In this work, a mathematical representation for characterizing human arm motions is first proposed. The human arm motions are characterized by the elbow elevation angle, which is determined using the position and orientation of the human hand. That representation is obtained mathematically using an approximation tool, the Response Surface Method (RSM). A method to generate human-like arm motions in real time using the proposed representation is then presented. The proposed method was evaluated by generating human-like arm motions when the humanoid robot was asked to move its arms from one point to another, including the rotation of the hand. An example motion was performed using the KIST humanoid robot, MAHRU.

    I. INTRODUCTION

A few humanoid robots have been developed and shown to the public in the last decades, aiming to provide people with useful services. Most interactions between a humanoid robot and a human happen through voices and behaviors. Such behaviors need to look like a human's; otherwise they may cause people to misunderstand their meaning. It is natural to expect a humanoid robot's behavior to be comfortable for and predictable by a human. Human-like arm motions are therefore a basic requirement for humanoid robots to act like humans.

Some studies have been done to generate a human-like motion by imitating a human motion as closely as possible. The human motion is measured by a motion capture system and then adapted to a humanoid robot or an animation character. When an optical motion capture system is used, the human motions are captured in the form of time trajectories of markers attached to the human body. This approach has been developed by several researchers. Kim et al. [1] proposed a method to convert the captured marker data of a human arm into motions available to a humanoid robot using an optimization scheme. The position and orientation of the hand, and the orientation of the upper arm of a human, were imitated by a humanoid robot under bounded capacities of the joint motors. However, this method was not able to generate a new human-like motion. Pollard et al. [2] also developed a method to adapt captured human motions to a humanoid robot that consists of only an upper body. The captured upper-body motions of an actor were optimized by minimizing the posture differences between the humanoid robot and the actor, considering the limits of joint position and velocity. Nakaoka et al. [3] explored a procedure to let a humanoid robot (HRP-1S) imitate a Japanese folk dance captured by a motion capture system. Symbolic representations for primitive motions were presented. The time trajectories of joint positions were first generated to imitate the primitive motions. These trajectories were then modified to satisfy mechanical constraints of the humanoid robot. In particular, for dynamic stability, the trajectory of the waist was modified to be consistent with the desired ZMP trajectory. The imitation of the Japanese folk dance was performed in the dynamics simulator OpenHRP, and was realized by the real humanoid robot HRP-1S as well. These methods are all used to imitate given human motions. They may have difficulties generating a variety of new human-like motions from a human motion capture database, since they adapt only the given captured motions.

Another approach to generating human-like arm motions using a mathematical representation of a human's arm motion was taken by Asfour et al. [4], who used the mathematical representation in [5] and [6]. In those papers, four parameters were defined and represented in terms of the wrist positions of a human in a spherical coordinate system at the shoulder. However, the proposed representation approximated the arm movements, so the method developed in [4] produced erroneous results for the position and orientation of the humanoid hand. In addition, the four parameters used in that work may not have physical meanings.

For a humanoid robot not only to imitate human motions but also to perform human-like motions whenever needed using a motion database, a new method is required. In this paper, a method for extracting the movement characteristics of human arms from the motion capture database will be presented.

The characteristics will be described in terms of the elbow elevation angle. This angle will be determined by the position of the wrist and the angle between the palm and the ground. Using this representation of a human's natural elbow elevation angle, a human-like motion will be generated.

II. ELBOW ELEVATION ANGLE: CHARACTERIZING A HUMAN ARM MOTION

Fig. 1. The definition of the elbow elevation angle for a human arm

In this section, the process of characterizing the movement of a human arm in the motion capture database is described. The motion database is constructed using a commercially available optical motion capture system, as seen in Fig. 2. The human model in Fig. 3 was built using the software provided with the motion capture system.

In daily life, hand motions that move from one point to another while varying the hand's orientation occur all the time, for example when pointing with a hand, moving a hand to grasp an object on a table or in the air, or talking to people with hand gestures. The posture of a human arm may be described in terms of the position of the wrist, the orientation of the hand, the elbow posture relative to the body, and more. From the captured arm motion database it was observed that the elbow posture might be determined mainly by the position of the wrist and the direction of the vector normal to the palm, called the palm direction. In other words, a posture of an arm at a certain instant can be described in terms of the wrist position, the palm direction and the elbow posture; moreover, the elbow posture can be expressed by the wrist position and the palm direction.

The wrist position is obtained using the markers on the human arm, first with respect to the global Cartesian coordinate system on the ground and then converted to the reference frame attached at the shoulder. The elbow posture is defined by the angle between a plane vertical to the ground (the red dashed triangle in Fig. 1) and the plane defined by the three markers at the shoulder, elbow and wrist (the blue dashed triangle in Fig. 1). This angle between the two planes is called the elbow elevation angle throughout this paper. Using this angle, human arm motions are characterized, since the angle is represented in terms of the wrist position and the palm direction, which are the key factors for natural postures of human arms. The elbow elevation angle is defined as zero when the blue dashed plane in Fig. 1 is parallel to the vertical plane (the red dashed plane in the figure).
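The angle between the two planes above can be computed as the angle between their normal vectors. The following is a minimal sketch, not the paper's code: the function name and the convention that the world z axis points up are our assumptions.

```python
import numpy as np

def elbow_elevation_angle(shoulder, elbow, wrist):
    """Angle (degrees) between the arm plane (shoulder-elbow-wrist)
    and the vertical plane containing the shoulder-wrist line.
    Assumes the world z axis points up (our convention)."""
    sw = wrist - shoulder                               # shoulder-to-wrist line
    n_arm = np.cross(elbow - shoulder, sw)              # normal of the arm plane
    n_vert = np.cross(sw, np.array([0.0, 0.0, 1.0]))    # normal of the vertical plane
    cos_ang = np.dot(n_arm, n_vert) / (np.linalg.norm(n_arm) * np.linalg.norm(n_vert))
    return np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))

# An elbow lying in the vertical x-z plane gives zero elevation;
# lifting the elbow sideways (in +y) gives a positive angle.
s = np.array([0.0, 0.0, 0.0])
w = np.array([0.5, 0.0, -0.3])
print(elbow_elevation_angle(s, np.array([0.25, 0.0, -0.4]), w))  # ~0
print(elbow_elevation_angle(s, np.array([0.25, 0.1, -0.4]), w))  # > 0
```

Clipping the cosine guards against round-off just outside [−1, 1] when the three markers are nearly collinear.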

To obtain the natural elbow postures of a human, the kinematic analysis was performed as seen in Fig. 3. During the experiments, the actor relaxed his arm and moved without planned arm postures. A number of wrist positions and palm directions were examined under the rule given to the actor. For the experiments, the volume reachable by the human arms was divided by six planes vertical to the ground as equally as possible. The actor then drew five circles with different diameters over 5 seconds. The same experiment was repeated 3 times with varying palm directions.

    Fig. 2. Motion capture system and actor

    Fig. 3. Kinematic analysis for a variety of human arm postures

The human arm motions were captured using the Hawk Digital System, commercially available from Motion Analysis Inc., as seen in Fig. 2. 29 markers were attached to the upper body of the actor and 8 cameras were used. The capture rate was 120 frames per second. The time trajectories of the markers representing the human motions were stored. Using these marker trajectories, the wrist positions were obtained with respect to the reference frame at the shoulder, and the palm directions were calculated at each frame as well.

    III. EQUATION OF ELBOW ELEVATION ANGLE

From the kinematic analysis in the foregoing section it was observed that the arm posture could be characterized by the elbow elevation angle, which is represented in terms of the wrist position and the palm direction. In this section, the representation of the elbow elevation angle is obtained using the Response Surface Methodology (RSM) given in [7].

    A. Response Surface Methodology

The Response Surface Methodology (RSM) in [7] is a technique for representing the relationship between controllable input variables and a response. In the methodology, a response function is defined to approximate experimental results. A brief description follows.

The response of an experiment is approximated using a response function as

    y(x) = ŷ(x) + e (1)

where y denotes the given response of the experiment, ŷ is the unknown response function of y, and e is the error between the response and the response function; x is a vector of controllable input variables. The response function approximates the response using shape functions as

ŷ(x) = Σ_{i=1}^{N_b} b_i ξ_i(x)   (2)

where N_b is the number of terms of the response function. The ξ_i for i = 1 ∼ N_b are called shape functions (or basis functions by some researchers). The unknown coefficients of the shape functions, b_i for i = 1 ∼ N_b, need to be determined by curve-fitting the experimental results.

When multiple responses are given, the corresponding errors are obtained using Eq. (1) and Eq. (2) as

e_j = y_j − ŷ(x_j) = y_j − Σ_{i=1}^{N_b} b_i ξ_i(x_j)   for j = 1 ∼ N   (3)

where N is the number of responses (or experiments); y_j and e_j are the value of the jth response and the corresponding error, respectively; x_j is the input variable vector corresponding to the jth response. Equation (3) can be rewritten in vector form as

e = y − Xb   (4)

where the matrix X has dimension N × N_b, with ξ_i(x_j) as its elements. The unknown constant vector b is then determined by minimizing the root mean square (RMS) of e,

e_RMS = √( (1/N) Σ_{i=1}^{N} e_i² ) = √( (1/N) eᵀe ).   (5)

Note that minimizing e_RMS is equivalent to minimizing eᵀe. Using the optimality conditions, the vector b can be obtained as

b = (XᵀX)⁻¹ Xᵀ y   (6)

so that the response function is obtained. It should be noted that the process in this section is the least squares method.

    B. Normalization of input variables

In the solution process of the previous section, it is worthwhile to normalize the input variables separately, since large differences in the magnitudes of the variables may exist. This normalization may help reduce the approximation error. Moreover, since the size of the humanoid is different from that of the human, the normalization makes it easy to apply the human database to the humanoid.

As mentioned in Sec. II, the characteristics of human arm motions can be represented using the wrist position and the palm angle. The wrist positions are obtained in the spherical coordinate system at the shoulder using the trajectories of the marker at the wrist. The palm direction denotes the direction of the vector normal to the palm, as defined in Sec. II; the angle between this direction and the ground is used as one of the input variables. These representation parameters are normalized to a dimensionless range of magnitude 2 as

0 ≤ r̄ ≤ 2 ;  −1 ≤ ᾱ ≤ 1 ;  −1 ≤ β̄ ≤ 1 ;  −1 ≤ θ̄ ≤ 1   (7)

where r̄ is the normalized distance from the shoulder to the wrist; ᾱ and β̄ are the angles of the spherical coordinate system at the shoulder, as seen in Fig. 7; and θ̄ is the angle between the palm direction and the ground.
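A minimal normalization sketch for Eq. (7). The exact scalings are not stated in the text, so the choices below are assumptions: r is scaled by the total arm length, and each angle by an assumed symmetric bound.

```python
import numpy as np

ARM_LENGTH = 0.60      # L_u + L_l in metres (assumed value)
ANGLE_BOUND = np.pi    # symmetric range for alpha, beta, theta (assumed)

def normalize_inputs(r, alpha, beta, theta):
    """Map physical inputs to the dimensionless ranges of Eq. (7)."""
    return (2.0 * r / ARM_LENGTH,     #  0 <= r_bar <= 2
            alpha / ANGLE_BOUND,      # -1 <= alpha_bar <= 1
            beta / ANGLE_BOUND,       # -1 <= beta_bar <= 1
            theta / ANGLE_BOUND)      # -1 <= theta_bar <= 1

print(normalize_inputs(0.60, np.pi, -np.pi / 2, 0.0))
```

Because the ranges are dimensionless, the same normalized database can be applied to a humanoid with different limb lengths by changing only `ARM_LENGTH`.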

    C. Characteristic equation for elbow elevation angle

For the shape functions, a second-order polynomial is widely used in the response surface methodology. Using the parameters defined in the previous section, the response function for the elbow elevation angle is written as

γ̂ = b₀ + b₁x₁ + b₂x₂ + b₃x₃ + b₄x₄ + b₅x₁x₂ + · · · + b₁₃x₃² + b₁₄x₄²   (8)

[ x₁ x₂ x₃ x₄ ] = [ r̄ ᾱ β̄ θ̄ ]   (9)

where γ̂ is the normalized response function for the elbow elevation angle, and the input variable vector x is given by Eq. (9). The unknown coefficient vector b for the shape function above is then obtained using Eq. (6) and the results of the kinematic analysis of the human arm in Sec. II. Once the response function for the elevation angle of a human is completed, the most natural elbow elevation angle of a humanoid robot is determined by the wrist positions and the palm directions of the humanoid robot. In addition, the motions generated using this response function should look like those of a human. Figure 4 shows the effects of the parameters on the elbow elevation angle as the input parameters vary.
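A full second-order polynomial in four inputs has 15 terms (constant, 4 linear, 6 cross, 4 squared), which matches the coefficients b₀ … b₁₄ in Eq. (8). A sketch of building the shape-function vector and evaluating γ̂ (the term ordering is our choice):

```python
import numpy as np
from itertools import combinations

def quadratic_features(x):
    """Shape-function vector for Eq. (8): constant, linear terms,
    pairwise cross terms, then squares -- 15 terms for 4 inputs."""
    x = np.asarray(x, dtype=float)
    feats = [1.0] + list(x)
    feats += [x[i] * x[j] for i, j in combinations(range(len(x)), 2)]
    feats += list(x * x)
    return np.array(feats)

def gamma_hat(b, x):
    """Evaluate the response function, Eqs. (2)/(8)."""
    return float(quadratic_features(x) @ b)

phi = quadratic_features([1.7, 0.2, -0.1, 0.5])
print(len(phi))  # 15
```

Fitting `b` then reduces to stacking `quadratic_features` rows into the matrix X of Eq. (4) and applying Eq. (6).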

    IV. INVERSE KINEMATICS

In the previous sections, the elbow elevation angle of a human was obtained using RSM and the motion capture database. In this section, the inverse kinematics problem of generating a human-like arm motion, using the elbow elevation angle and a typical inverse kinematics solution procedure from robotics, is solved for the joint positions.

As a testbed, the humanoid robot MAHRU in Fig. 5, developed by the Korea Institute of Science and Technology (KIST) with 6 degrees of freedom in each arm, was used. To solve the inverse kinematics problem, 6 holonomic constraints are needed. The input for the desired posture is a wrist position in the

[Figure 4 consists of four plots of the elbow elevation angle γ (in degrees): versus the palm direction angle θ for r = 1.7, β = 0 (curves for α = −45, 0, 45); versus the wrist distance r for β = 0.0, θ = 0.0 (curves for α = −45, 0, 45); versus the wrist pitch angle β for r = 1.7, α = 0.0 (curves for θ = −90, 0, 90); and versus the wrist yaw angle α for r = 1.7, θ = 0.0 (curves for θ = −45, 0, 45).]

Fig. 4. Elbow elevation angles of a human with respect to the four parameters, r, α, β, and θ

shoulder-centered spherical coordinate system and a palm direction angle. The wrist stoop angle can also be an input, but in this paper it was set to zero. To generate a human-like posture, the human arm characteristic equation is used; therefore, six constraints are set. Our approach to solving the inverse kinematics is derived from a geometric analysis of the problem.

    Fig. 5. The KIST humanoid robot, MAHRU

Figure 6 shows the home position of the left arm. In this posture the arm stretches down toward the ground and the palm faces the hip.

Once the elbow elevation angle is obtained as in the previous section, the remaining joint angles θ₀ to θ₄ are obtained through the procedure in this section.

    First, the joint angle θ3 depends only on the distance r asseen in Fig. 7.

θ₃ = π − cos⁻¹( (Lu² + Ll² − r²) / (2 Lu Ll) )   (10)

The joint angles θ₀ and θ₁ depend on the vector ~E. Here ~E₀ is the elbow position when α, β and γ are set to zero at the given wrist position and palm direction. The plane built from ~E₀ and the vector from the shoulder to the wrist lies on the x-z plane of the shoulder-centered coordinate system.

~E₀ = [ (r² + Lu² − Ll²)/(2r),  0,  −Lu sin( cos⁻¹( (r² + Lu² − Ll²)/(2rLu) ) ) ]ᵀ   (11)

~E can then be calculated from the elbow elevation angle γ̂ of Eq. (8) and the wrist position:

~E = Rx(γ) · Ry(β) · Rz(α) · ~E₀   (12)

θ₁ = sin⁻¹( ~E_y / Lu )   (13)

θ₀ = atan2( ~E_x / (Lu cos θ₁), ~E_z / (Lu cos θ₁) )   (14)

Fig. 6. Coordinates of the left arm (shoulder, elbow, wrist and hand; upper-arm length Lu, lower-arm length Ll)

Fig. 7. Parameters for human arm posture (shoulder-centered x-y-z frame; wrist distance r, arm lengths Lu and Ll)

where ~E∗ denotes the ∗ component of ~E (∗ = x, y, z).

The wrist position is expressed as

A_01 · A_12 · A_23 · A_34 · ~W₄ = ~W   (15)

where A_ij is the homogeneous transformation matrix from the ith reference frame to the jth reference frame, and ~W₄ = [0 −Ll 0 0]ᵀ is the wrist position vector in the 4th frame. The wrist position ~W is given, and θ₀, θ₁ and θ₃ are known from the equations above. Therefore, θ₂ can be obtained as

s₂ = ( ~W_z + (Lu + Ll c₃) s₁ ) / ( Ll c₁ s₃ )   (16)

c₂ = ( ~W_y + c₀ ( c₁ (Lu + Ll c₃) + Ll s₁ s₃ ) ) / ( Ll s₀ s₃ )   (17)

θ₂ = atan2(s₂, c₂)   (18)

Fig. 8. Coordinates of the left arm (shoulder-centered x-y-z frame with the elbow vector ~E, the wrist vector ~W, and the elbow-to-wrist vector ~EW)

where cᵢ denotes cos(θᵢ) and sᵢ denotes sin(θᵢ).

In this paper, the wrist stoop angle θ₅ was set to zero. To find θ₄, the angle between the two vectors below was used:

~N_c = ~E × ~EW

~N_v = ~EW × [0 0 1]ᵀ

θ_diff = cos⁻¹( ( ~N_c · ~N_v ) / ( ‖~N_c‖ ‖~N_v‖ ) )

where ~N_v is the normal vector of the plane formed by the vector from the elbow to the wrist and the vertical direction from the ground, and ~N_c is the normal vector of the plane formed by the shoulder origin, the wrist position and the elbow position under the given input variables.

θ₄ = θ − θ_diff   (19)

V. AN EXAMPLE

In the preceding sections, the equation for the elbow elevation angle was derived. Using this equation, the most natural human-like posture can be obtained; moreover, the inverse kinematics solution of the KIST humanoid MAHRU can be obtained for any reachable wrist position and palm direction. To evaluate the equation and the inverse kinematics solution, the humanoid robot was required to follow desired trajectories of the wrist position and palm direction. The wrist trajectory is given by a sine wave in the y-z plane of the Cartesian coordinate system at the shoulder, at a distance of 0.44 m in the x direction. The desired trajectory of the palm direction was generated from the tangential vectors of the sine-wave function for the wrist position at each time frame. Using these desired trajectories, the desired joint-angle trajectories giving human-like arm motions were calculated and then executed by the KIST humanoid robot, MAHRU.
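The desired trajectories of the example can be sketched as below. The 0.44 m offset and the 5 ms period come from the text; the amplitude, sweep and duration are assumed values, since the paper does not state them.

```python
import numpy as np

# Wrist traces a sine wave in the shoulder y-z plane at x = 0.44 m.
t = np.linspace(0.0, 4.0, 801)               # 5 ms control period (assumed 4 s run)
dt = t[1] - t[0]
y = 0.30 * (t / 4.0) - 0.15                  # sweep across y (assumed)
z = 0.10 * np.sin(np.pi * t) - 0.25          # sine wave in z (assumed)
wrist = np.column_stack([np.full_like(t, 0.44), y, z])

# Palm direction angle from the curve's tangent in the y-z plane.
palm_theta = np.arctan2(np.gradient(z, dt), np.gradient(y, dt))
print(wrist.shape, palm_theta.shape)
```

Each sample of `wrist` and `palm_theta` would then be fed through the characteristic equation (8) and the inverse kinematics of Sec. IV to produce the joint-angle trajectories.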

The experiment was performed using a PC running real-time Linux (RTAI) and DSP control boards at each joint motor. The Linux PC sent the desired joint angle and

Fig. 9. Comparison of the human arm motion and the human-like arm motion by MAHRU using the developed method

desired joint velocity to each DSP board over the CAN protocol every 5 ms. The real-time Linux (RTAI) system guaranteed the 5 ms sampling time. Each DSP board controlled its motor to track the desired joint values with a PD controller.

Figure 9 shows snapshots of the experimental result. The left and right wrist positions are symmetric, and the palm directions in the first, third and last scenes of the figure are the same with respect to the Cartesian coordinate systems at each shoulder. Note that the resultant arm postures of the humanoid robot in these scenes are not symmetric, so that one elbow was lifted more than the other, as a human's would be.

VI. CONCLUSION

A mathematical representation for characterizing human arm motions has been proposed. A motion capture database was used for the representation. The representation was implemented and evaluated successfully on the KIST humanoid robot, MAHRU. The developed method for characterizing human arm motions is very simple to implement and generates a human-like posture for an arbitrary arm configuration. The method can be used to generate arm motions in real time. In addition, the generated motion followed the desired wrist positions exactly, since the elbow elevation angle does not affect the wrist positions. Furthermore, the method may be used where the humanoid robot is required to move the wrist or hand from one point to another, such as an approaching arm action toward an object in visual servoing.

The method may not fully satisfy a desired hand orientation, since the elbow elevation angle uses only one of the three angles of the desired orientation, the one relative to the palm direction. If the desired orientation of the hand is to be fully satisfied, more degrees of freedom are needed in the humanoid robot. Arm motion generation considering dynamics, and the self-collision problem, remain for future work.

REFERENCES

[1] C. Kim, D. Kim, and Y. Oh, "Solving an inverse kinematics problem for a humanoid robot's imitation of human motions using optimization," in Proc. of Int. Conf. on Informatics in Control, Automation and Robotics, 2005, pp. 85–92.
[2] N. S. Pollard, J. K. Hodgins, M. J. Riley, and C. G. Atkeson, "Adapting human motion for the control of a humanoid robot," in Proc. of IEEE Int. Conf. on Robotics and Automation, 2002, vol. 2, pp. 1390–1397.
[3] S. Nakaoka, A. Nakazawa, K. Yokoi, H. Hirukawa, and K. Ikeuchi, "Generating whole body motions for a biped humanoid robot from captured human dances," in Proc. of IEEE Int. Conf. on Robotics and Automation, 2003, pp. 3905–3910.
[4] T. Asfour and R. Dillmann, "Human-like motion of a humanoid robot arm based on a closed-form solution of the inverse kinematics problem," in Proc. of IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, 2003, vol. 2, pp. 1407–1412.
[5] J. F. Soechting and M. Flanders, "Errors in pointing are due to approximations in targets in sensorimotor transformations," Journal of Neurophysiology, 1989, vol. 62, pp. 595–608.
[6] J. F. Soechting and M. Flanders, "Sensorimotor representations for pointing to targets in three-dimensional space," Journal of Neurophysiology, 1989, vol. 62, pp. 582–594.
[7] R. T. Haftka, Experimental Optimum Engineering Design Course Notes, Department of Aerospace Engineering, Mechanics and Engineering Science, University of Florida, Gainesville, FL, U.S.A., 2000.