



An Improved Framework for Robotic Door-opening Task Using Kinect and Tactile Information∗

Zhou Li1,2,3, Zeyang Xia1,2†, Ying Hu1,2, Jianwei Zhang4

1. Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, China
2. The Chinese University of Hong Kong, Hong Kong, China
3. University of Chinese Academy of Sciences, Beijing, China

4. TAMS, Department of Informatics, University of Hamburg, Hamburg, Germany
† To whom correspondence should be addressed: Email: [email protected]; Phone: +(86)-755-86392181

Abstract— Building on our previous work, this paper proposes an improved robotic door-opening framework with three components: grasping pattern, door-opening pattern, and semantic monitoring and exception handling. (1) The grasping pattern covers the grasping points and grasping ways for different types of door knob. The point cloud of the target area is captured with a Kinect sensor, from which the door edge and knob are extracted. (2) The door-opening pattern comprises the sub-operations of the task and their corresponding trajectories. (3) Semantic monitoring and exception handling monitors the tactile data from the gripper during task execution, assesses exceptional situations based on it, and handles them. The proposed framework was validated by two normal door-opening experiments and two exception-handling experiments on a KUKA prototype.

Index Terms— Robotic manipulation, Door-opening task, Kinect, Tactile, Exception monitoring and handling

I. INTRODUCTION

As society ages, the service-robot industry is growing steadily. Service robots can help disabled people with basic tasks in daily life. Door opening is a fundamental task for a service robot and an important component of many composite tasks.

Doors and door knobs vary widely in shape and size, and many exceptions may occur during the door-opening process, so the task is not easy for a robot to complete. Many methods [1]–[6] have been studied to accomplish it. Gray et al. [3] propose a planning framework for a robot to open non-spring and spring-loaded doors. They use a succinct graph-based representation of the door-opening task to reduce planning time, and validate the framework in simulation and in physical experiments on the PR2 [7] robot. Ahmad et al. [2] use a modular and reconfigurable robot mounted on a wheeled mobile platform to open doors. Their work introduces the concept of exploiting the robot's multiple working modes: the control design is greatly simplified by switching selected joints of the robot into passive mode during the

∗This work was supported by the National Science Foundation of China (No. 51305436, 61105132), the Guangdong Innovative Research Team Program (No. 201001D0104648280), and the Shenzhen Science and Technology Project (No. ZDSY20120617113312191).

door-opening process. Our previous work [4] proposed a lightweight and generalized framework for robotic door-opening manipulation, which divides the task into two parts: the determination of the grasping pattern and the calculation of the door-opening pattern. That framework uses only the Kinect [8] visual sensor to collect environment data and accomplish the task.

Our previous work has several shortcomings. (1) Door extraction and door-knob recognition and extraction are two independent processes; neither informs the other. (2) When the gripper grasps the door knob, there is no feedback to guarantee grasping stability. (3) There is no monitoring or exception handling during the door-opening process.

To address these shortcomings, this paper equips the gripper with tactile sensors that record tactile data during the door-opening process. These data are used to adjust the grasping posture and to monitor the door-opening process, which helps both grasping stability and exception handling. In addition, the door-edge and door-knob recognition and extraction process is modified: the result of door extraction is now used for door-knob recognition and extraction.

The rest of this work is organized as follows. Section II presents the improved framework for robotic door opening. The modified process of door-edge and door-knob recognition and extraction is described in Section III. Section IV shows how tactile feedback is used in knob grasping and monitoring. The experiments on normal door opening and exception handling are described in Section V. Conclusions and future work are discussed in Section VI.

II. FRAMEWORK

The improved framework for Kinect- and tactile-based robotic door-opening manipulation is shown in Fig. 1.

In detail, the framework divides the door-opening task into three parts: the determination of the grasping pattern, the determination of the door-opening pattern, and semantic monitoring and exception handling. In the grasping-pattern stage, the Kinect sensor captures a depth image of the environment, and point cloud processing methods extract the door edge and door knob with their corresponding parameters. In the door-opening stage, the motion trajectory for the given door type is generated; the mechanical arm and

978-1-4799-5825-2/14/$31.00 ©2014 IEEE

Proceedings of the 11th World Congress on Intelligent Control and Automation, Shenyang, China, June 29 - July 4, 2014


gripper then follow this trajectory to complete the door-opening task. In the semantic monitoring and exception handling stage, tactile feedback is used to monitor the door-opening process and the corresponding treatment measures are formulated. Fig. 2 shows the processing flow of the door-opening task.

[Fig. 1 depicts three stages: determination of grasping pattern (classification, recognition, extraction), task planning and execution (motion planning, task execution), and monitoring with exception handling.]

Fig. 1. Main framework of robotic door-opening task

[Fig. 2 traces the flow: door knob classification → door edge and door knob recognition using Kinect → corresponding grasping pattern (grasping point, grasping way) and door-opening pattern (sub-operation sequence) → motion trajectory planning → task execution and semantic monitoring → success, or exception handling on exception.]

Fig. 2. Processing flow of robotic door-opening task

A. Grasping Pattern

1) Feature recognition and extraction of door and knob: The point cloud captured by the Kinect sensor contains both depth information and RGB color information. For RGB information, researchers usually describe features by color, texture, shape, spatial relationships, and statistical properties. A depth image is composed of thousands of spatial points. K-nearest-neighbor (kNN) search [9] is applied to obtain the neighbors of a given point, and the geometric features of this neighborhood form the three-dimensional features of the point cloud. Local point cloud features include the surface normal, surface curvature, etc.; global features include the Viewpoint Feature Histogram (VFH) [10], the Clustered VFH (CVFH) [11], etc. Section III discusses the detailed methods for door recognition and extraction.
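The neighbor-search step can be sketched as a brute-force kNN query over 3-D points. This is an illustrative stand-in, not the paper's implementation (in practice PCL provides accelerated kd-tree search for this):

```cpp
#include <algorithm>
#include <array>
#include <cstddef>
#include <vector>

using Point = std::array<double, 3>;

// Squared Euclidean distance between two 3-D points.
double dist2(const Point& a, const Point& b) {
    double s = 0.0;
    for (int i = 0; i < 3; ++i) s += (a[i] - b[i]) * (a[i] - b[i]);
    return s;
}

// Brute-force k-nearest-neighbor query: returns the indices of the k
// points in `cloud` closest to `query` (the query point itself is
// included if it belongs to the cloud).
std::vector<std::size_t> knn(const std::vector<Point>& cloud,
                             const Point& query, std::size_t k) {
    std::vector<std::size_t> idx(cloud.size());
    for (std::size_t i = 0; i < idx.size(); ++i) idx[i] = i;
    k = std::min(k, cloud.size());
    std::partial_sort(idx.begin(), idx.begin() + k, idx.end(),
        [&](std::size_t a, std::size_t b) {
            return dist2(cloud[a], query) < dist2(cloud[b], query);
        });
    idx.resize(k);
    return idx;
}
```

The geometric features (normal, curvature) would then be estimated from each returned neighborhood.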

2) Grasping point: Once the knob is recognized and extracted, the grasping point must be determined. The two main factors influencing its selection are grasping stability and door-opening simplicity. The experiments use a two-finger gripper to grasp the door knob. For the knob in Fig. 3(a), the grasping points should be the top-center and bottom-center points of the knob: the two-finger gripper can then open wide, and the risk of slipping while the knob rotates is reduced. For the knob in Fig. 3(b), the grasping points should lie a certain distance from the right edge of the knob. If that distance is set to α times the length of the knob, α should be chosen between 0.2 and 0.5; the gripper then neither slips off the knob edge nor needs a large force to rotate the knob. Corresponding rules determine the grasping point for other types of door knob to ensure a successful grasp.
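The α rule above can be sketched as a small helper; the function name, units, and the argument check are ours, added for illustration:

```cpp
#include <stdexcept>

// Horizontal grasping-point offset for a lever-style knob (Fig. 3(b)):
// the point sits alpha * knobLength away from the knob's free (right)
// edge. Alpha between 0.2 and 0.5 trades slip margin against the torque
// needed to rotate the knob. Names and units here are illustrative.
double graspOffsetFromRightEdge(double knobLength, double alpha) {
    if (alpha < 0.2 || alpha > 0.5)
        throw std::invalid_argument("alpha must lie in [0.2, 0.5]");
    return alpha * knobLength;
}
```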

(a) Door knob: type 1 (b) Door knob: type 2

Fig. 3. Grasping point for different types of door knob

3) Grasping way: Once the grasping point is selected, the grasping way of the two-finger gripper is determined. In Fig. 3(a) and Fig. 3(b) the two-finger gripper is parallel to the ground, but in other cases it may be perpendicular to it. During the door-opening process, the grasping way of the gripper may change to a certain extent.

B. Door-opening Pattern

1) Task planning: The whole door-opening task can be decomposed into a series of sub-operations; planning and recombining these sub-operations yields the door-opening pattern. Taking the knob in Fig. 3(b) as an example, the two-finger gripper first moves to the pre-grasping location with a proper posture, then grasps the door knob with the help of tactile feedback (the detailed algorithm is discussed in Section IV). Next, the two-finger gripper rotates


the knob by a certain angle to unlock the door. Finally, it pulls or pushes the door around the door axis until the door is open wide enough to pass through. Fig. 4 shows the sub-operation sequence of the door-opening task. Hierarchical Task Network (HTN) planning [12] is an appropriate method for planning it.

[Sub-operation sequence: reaching pre-grasping location → grasping door knob → rotating door knob → pushing/pulling door.]

Fig. 4. Sub-operation sequence of the door-opening task
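As a minimal sketch, the sequence of Fig. 4 can be written out as an ordered plan; a full HTN planner would derive this by decomposing the top-level "open door" task, whereas here the decomposition for the lever-style knob is hand-written and the step names are our own labels:

```cpp
#include <string>
#include <vector>

// The four sub-operations of Fig. 4 as an ordered plan for the
// lever-style knob. Step names are illustrative identifiers only.
std::vector<std::string> doorOpeningPlan() {
    return {"reach_pre_grasp", "grasp_knob", "rotate_knob", "push_or_pull_door"};
}
```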

2) Reasoning: Reasoning occurs both in the choice of door-opening pattern and in the execution of the task. The former occurs after the door knob has been recognized: if the knob type already exists in the door-knob library, the corresponding door-opening pattern is applied; if not, the most similar type in the library is chosen as the type of the recognized knob. The latter occurs when the two-finger gripper chooses a direction in which to twist the knob: previous successful door-opening examples (prior knowledge) are taken into consideration to make the best choice.

3) Motion trajectory: Different types of door knob lead to different door-opening motion trajectories. The robot arm and two-finger gripper move along the generated trajectory until the door-opening task is finished. In Fig. 5, the gripper grasps the door knob stably at the initial position, then follows an arc to twist the knob until it reaches the final position; the center of this arc coincides with the knob axis. After the knob rotation, the robot arm and gripper apply the same method to rotate around the door axis and open the door.

Fig. 5. Motion trajectory of two-finger gripper
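The arc-following idea can be sketched as waypoint sampling about a center; the pose type and sampling scheme below are illustrative assumptions (a real controller would also interpolate the gripper orientation along the arc):

```cpp
#include <cmath>
#include <vector>

struct Pose2D { double x, y; };

// Sample n waypoints (n >= 1) on a circular arc of radius r about
// center c, sweeping from angle a0 to a1 (radians). This mirrors the
// idea of Fig. 5: the gripper follows an arc centered on the knob
// (or door) axis. Position only; orientation interpolation is omitted.
std::vector<Pose2D> arcWaypoints(Pose2D c, double r, double a0, double a1, int n) {
    std::vector<Pose2D> path;
    for (int i = 0; i < n; ++i) {
        double t = (n == 1) ? a0 : a0 + (a1 - a0) * i / (n - 1);
        path.push_back({c.x + r * std::cos(t), c.y + r * std::sin(t)});
    }
    return path;
}
```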

C. Semantic Monitoring and Exception Handling

1) Semantic representation of the door-opening effect: The door-opening task may or may not execute successfully. To represent the effect of the execution, this paper introduces semantic monitoring, which both forms the basis of the subsequent exception handling and enhances the human-machine interaction ability of the robot. The result of a door-opening execution falls into three statuses: (a) Success, the door is completely open; (b) Fail, the state of the door is unchanged; (c) Middle state, the door is only partially open, for whatever reason.

Features must be chosen to represent these semantic results. They include the following:

– The force on the tactile sensors of the two-finger gripper. The gripper is closed on the door knob during the door-opening process; once the tactile force drops to zero, either the door has been opened successfully or an exception has occurred.

– The location of the two-finger gripper. If the door opens successfully, the gripper can pass through the doorway to the other side; if it cannot, an exception has occurred.

2) Monitoring: To promptly obtain the state of the robot during the door-opening process, monitoring is needed. Put simply, monitoring means sampling the values of the above features regularly; analyzing these values yields the semantic state of the robot.
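A minimal sketch of mapping the two monitored features to the three statuses; the threshold and the exact decision rule are our assumptions for illustration, not the paper's tuned logic:

```cpp
enum class DoorState { Success, Fail, Middle };

// Map the two monitored features (tactile force on the gripper and
// whether the gripper reached the far side of the doorway) to the
// three semantic statuses. Illustrative rule: passing through means
// success; losing grip without passing through means failure;
// otherwise the task is still in a middle state.
DoorState classify(double tactileForce, bool passedThrough, double releaseThreshold) {
    if (passedThrough) return DoorState::Success;
    if (tactileForce < releaseThreshold) return DoorState::Fail;
    return DoorState::Middle;
}
```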

3) Exception handling: Various exceptions may occur during the door-opening process. Some are caused by the outside environment, such as the door being locked or blocked by something; others are caused by mistakes during execution, such as the two-finger gripper slipping or rotating in the wrong direction. For exceptions caused by the environment, the robot cannot recover effectively, so it reports a semantic failure to the system. For exceptions caused by mistakes, the robot returns to the initial position and redoes the task until it completes successfully.

III. DOOR AND KNOB RECOGNITION AND EXTRACTION

Door and knob recognition and extraction is the precondition of door opening. The environment image collected by the Kinect sensor is organized as a point cloud by the Point Cloud Library (PCL) [13]. Based on the approaches proposed in our previous work, we make some modifications to improve the recognition and extraction results. First, a pass-through filter is applied to the original point cloud to remove points that are obviously outside the door and knob.

The recognition and extraction process is divided into two phases: door extraction, then door-knob recognition and extraction; door extraction is the basis of the latter. This paper applies the same methods as [4] to extract the door parameters, in three parts: point cloud extraction of the door, extraction of the door edge, and computation of the door axis.

The door-knob recognition and extraction phase is likewise split into three steps: point cloud extraction of the door-knob region, door-knob recognition, and computation of the rotation center and grasping point. In the first step, we treat the points lying within a certain distance in front of the door plane and within the extent of the door as candidate door-knob points; the door itself was extracted in the previous phase.
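The first step can be sketched as a signed-distance filter against the extracted door plane; the plane representation and threshold below are illustrative assumptions, and the within-door-rectangle check is omitted for brevity:

```cpp
#include <array>
#include <vector>

using Point = std::array<double, 3>;

// Keep points lying at most maxDist in front of the door plane, given
// by unit normal n and offset d (plane: n.p + d = 0). A simplified
// version of the knob-candidate selection described above; the check
// that points also fall inside the door rectangle is left out.
std::vector<Point> knobCandidates(const std::vector<Point>& cloud,
                                  const Point& n, double d, double maxDist) {
    std::vector<Point> out;
    for (const auto& p : cloud) {
        double signedDist = n[0] * p[0] + n[1] * p[1] + n[2] * p[2] + d;
        if (signedDist > 0.0 && signedDist <= maxDist) out.push_back(p);
    }
    return out;
}
```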


In the second step, we construct a point cloud template library containing 20 objects, including both actual door knobs and other objects of similar shape. We recognize the two types of door knob in Fig. 3 at different locations and from different viewpoints; a total of 21 tests were carried out. The following figure shows the result of one test.

Fig. 6. Recognition results of VFH and CVFH methods for door knob oftype 2: left - VFH result, right - CVFH result

Fig. 6 shows the 12 objects most similar to the recognized knob. The number below each graph is the distance from that object's feature to the recognized knob's feature: the smaller the distance, the more similar the two. Fig. 6 shows that the knob type is recognized successfully with both features, though the similarity ranks of the other objects differ somewhat.
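The "smallest feature distance wins" rule can be sketched as nearest-template matching. Real VFH histograms are high-dimensional and PCL offers faster search structures, so this brute-force version is purely illustrative:

```cpp
#include <cstddef>
#include <limits>
#include <vector>

// Nearest-template recognition: return the index of the library
// feature (e.g. a VFH histogram) with the smallest squared Euclidean
// distance to the query feature. The real system ranks the 12 closest
// templates; this sketch returns only the best match.
std::size_t bestMatch(const std::vector<std::vector<double>>& library,
                      const std::vector<double>& query) {
    std::size_t best = 0;
    double bestDist = std::numeric_limits<double>::max();
    for (std::size_t i = 0; i < library.size(); ++i) {
        double s = 0.0;
        for (std::size_t j = 0; j < query.size(); ++j) {
            double diff = library[i][j] - query[j];
            s += diff * diff;
        }
        if (s < bestDist) { bestDist = s; best = i; }
    }
    return best;
}
```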

The following table shows the recognition rate and calculation time.

TABLE I
RECOGNITION RESULTS OF VFH AND CVFH

Methods   Recognition rate   Feature calculation time (s)   Feature recognition time (s)
VFH       90.48%             0.268                          0.084
CVFH      85.71%             0.292                          0.155

The table shows that, with the template library constructed in this work, the VFH feature beats the CVFH feature in both recognition rate and time. This is because the unknown knobs in this paper suffer neither over-segmentation nor partial occlusion, so the advantages of the CVFH feature do not come into play. This paper therefore chooses the VFH feature to recognize door knobs.

IV. USAGE OF TACTILE FEEDBACK

The two-finger gripper is mounted at the end of the mechanical arm, and two FlexiForce tactile sensors are attached to the inner sides of its fingers. Once the mechanical arm reaches the pre-grasping location, the gripper starts to grasp the door knob, using tactile feedback to adjust the grasping posture during the process. The grasping procedure is shown in Procedure 1.
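The tactile grasp-adjustment loop can be made runnable with the hardware mocked out; all type names, counters, and the threshold below are stand-ins for the real sensor and actuator interfaces:

```cpp
// Mocked hardware state: finger force readings plus command counters
// standing in for gripper/arm actuation.
struct MockRig {
    double up = 0.0, down = 0.0;        // tactile readings (upper/lower finger)
    int closes = 0, ups = 0, downs = 0; // issued commands
};

constexpr double FORCE_THRESHOLD = 1.0; // illustrative value

// One call per control tick; returns true once the knob is held stably
// on both fingers. Mirrors the branch structure of Procedure 1:
// neither finger loaded -> close gripper; only one finger loaded ->
// shift the arm toward the unloaded finger.
bool graspStep(MockRig& rig) {
    bool upOk = rig.up >= FORCE_THRESHOLD;
    bool downOk = rig.down >= FORCE_THRESHOLD;
    if (upOk && downOk) return true;  // stable grasp reached
    if (!upOk && !downOk) rig.closes++;
    else if (upOk) rig.ups++;         // upper loaded only: move arm up
    else rig.downs++;                 // lower loaded only: move arm down
    return false;
}
```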

Many exceptions, such as the door being locked or the gripper slipping off the knob, may occur during the door-opening process. This paper proposes a strategy for handling the gripper-slipping exception: the tactile feedback of the two-finger gripper is monitored during the door rotation process. Once the

Procedure 1 Grasping door knob
Input:
  tactileForce: tactile feedback from the tactile sensors
  FORCE_THRESHOLD: the knob is grasped stably if the force exceeds this value
  GRIPPER_INTERVAL: the distance interval of each two-finger gripper movement
  ARM_INTERVAL: the distance interval of each mechanical arm movement
Output:
  None
 1: while true do
 2:   if tactileForce.get(UP) < FORCE_THRESHOLD and
 3:      tactileForce.get(DOWN) < FORCE_THRESHOLD then
 4:     gripper.close(GRIPPER_INTERVAL)
 5:   end if
 6:   if tactileForce.get(UP) >= FORCE_THRESHOLD and
 7:      tactileForce.get(DOWN) >= FORCE_THRESHOLD then
 8:     break
 9:   end if
10:   if tactileForce.get(UP) >= FORCE_THRESHOLD and
11:      tactileForce.get(DOWN) < FORCE_THRESHOLD then
12:     arm.up(ARM_INTERVAL)
13:   end if
14:   if tactileForce.get(UP) < FORCE_THRESHOLD and
15:      tactileForce.get(DOWN) >= FORCE_THRESHOLD then
16:     arm.down(ARM_INTERVAL)
17:   end if
18: end while

tactile force falls below a given threshold, the gripper has slipped off the knob. Since the knob is then out of control, the robot cannot continue the door-opening task; it returns to the initial position in order to re-execute the task. Procedure 2 shows this logic.

Procedure 2 Exception handling
Input:
  SLIP_FORCE_THRESHOLD: the gripper has slipped if the force is smaller than this value
Output:
  None
1: procedure CHECKEXCEPTION()
2:   if tactileForce.get(UP) < SLIP_FORCE_THRESHOLD and
3:      tactileForce.get(DOWN) < SLIP_FORCE_THRESHOLD then
4:     throw exception("knob slip out")
5:   end if
6: end procedure

1: procedure HANDLEEXCEPTION()
2:   gripper.open()
3:   arm.initialPosition()
4: end procedure

1: procedure MAIN()
2:   try
3:     gripper.graspHandle()
4:     new Procedure CHECKEXCEPTION()
5:     arm.open()
6:   catch exception
7:     HANDLEEXCEPTION()
8: end procedure

V. EXPERIMENTS

This paper uses a KUKA robotic platform for the experiments. The platform contains a KUKA [14] KR5 Sixx mechanical arm with 6 DOF, a Microsoft Kinect sensor, and a parallel


two-finger gripper fitted with FlexiForce [15] tactile sensors. Fig. 7 shows the control architecture of this platform.

[Fig. 7 shows the components of the platform: Kinect (depth image), FlexiForce tactile sensors (tactile feedback), DAQ card, upper PC and laptop (control data, door and knob parameters file), KUKA IPC, Copley controller, KUKA mechanical arm, and parallel two-finger gripper.]

Fig. 7. The control architecture of the KUKA platform

This work applies PCL to process the point cloud data captured by the Kinect. The program is implemented in C++ with Qt [16].

Two normal door-opening experiments and two exception-handling experiments were designed to verify the effectiveness of the framework. The two doors with knobs used in this work are analogous to those in Fig. 3.

A. Normal Experiment 1: Door-opening with a Cuboid-like Door Knob

In this experiment, the distance between the actual door axis and the door edge is 1.1 cm, and the distance between the rotation center and the left edge of the door knob is 1.5 cm. The grasping point is chosen at a certain distance from the right edge of the knob; this distance is set to 1/3 of the total length of the knob. Fig. 8 (left) shows the result of color-based region-growing clustering.

Fig. 8. Results of color-based region growing clustering: left - ExperimentA, right - Experiment B

The extracted door point cloud is shown in Fig. 9 (left), and Fig. 10 (left) shows the point cloud of the door knob. The door-opening process is shown in Fig. 11.

B. Normal Experiment 2: Door-opening with a Non-rotatable Door Knob

In this experiment, the distance between the actual door axis and the door edge is the same as in Experiment 1. This type of door

Fig. 9. Door point clouds: left - Experiment A, right - Experiment B

Fig. 10. Door knob point clouds: left - Experiment A, right - Experiment B

knob has a grasping center but no rotation center; the grasping center is chosen as the center of the door knob. The result of color-based region-growing clustering is shown in Fig. 8 (right), and the extracted door edge and door knob are shown in Fig. 9 (right) and Fig. 10 (right). Fig. 12 shows the door-opening process.

C. Exception Handling 1: Door-opening with a Cuboid-like Door Knob

In this experiment, the two-finger gripper twists the door knob and opens the door to a certain degree, then moves up a certain distance and opens its fingers to lose contact with the door knob. At this point the monitoring module receives the tactile feedback data, determines that the knob has slipped, and starts the exception-handling process: the two-finger gripper moves back a certain distance to ensure it no longer touches the door knob, then returns to the initial position, finishing exception handling. Fig. 13 shows this process.

Fig. 11. Normal Experiment 1: door-opening with a cuboid-like door knob


Fig. 12. Normal Experiment 2: door-opening with a non-rotatable door knob

Fig. 13. Exception Handling 1: door-opening with a cuboid-like door knob

D. Exception Handling 2: Door-opening with a Non-rotatable Door Knob

In this experiment, the two-finger gripper opens the door to a certain degree, then opens its fingers to lose contact with the door knob. The monitoring module determines that the knob has slipped and starts exception handling: the gripper moves to the right by a certain distance to get clear of the door knob, then returns to the initial position, completing exception handling. This process is shown in Fig. 14.

Fig. 14. Exception Handling 2: door-opening with a non-rotatable door knob

The successful experiments confirm the effectiveness and robustness of the proposed framework and demonstrate a degree of exception-handling ability.

VI. CONCLUSION AND FUTURE WORKS

This paper proposes an improved robotic framework for the door-opening task based on our previous work. Tactile sensors are used to ensure grasping stability and monitor the door-opening process, and appropriate strategies handle the gripper-slipping exception. This work also finds that the VFH feature outperforms the CVFH feature in knob recognition accuracy and speed. Two normal door-opening experiments and two exception-handling experiments demonstrate the effectiveness and robustness of the framework.

Door-opening task planning on a mobile robotic platform, and the generalization and transfer of this framework, will be the focus of our future work.

REFERENCES

[1] A. H. Adiwahono, Y. Chua, K. P. Tee, and B. Liu, "Automated door opening scheme for non-holonomic mobile manipulator," in Control, Automation and Systems (ICCAS), 2013 13th International Conference on, 2013, pp. 839-844.

[2] S. Ahmad, H. Zhang, and G. Liu, "Multiple working mode control of door-opening with a mobile modular and reconfigurable robot," Mechatronics, IEEE/ASME Transactions on, vol. 18, pp. 833-844, 2013.

[3] S. Gray, S. Chitta, V. Kumar, and M. Likhachev, "A single planner for a composite task of approaching, opening and navigating through non-spring and spring-loaded doors," in Robotics and Automation (ICRA), 2013 IEEE International Conference on, 2013, pp. 3839-3846.

[4] Z. Li, Z. Xia, G. Chen, Y. Gan, Y. Hu, and J. Zhang, "Kinect-based robotic manipulation for door opening," in Robotics and Biomimetics (ROBIO), 2013 IEEE International Conference on, 2013, pp. 656-660.

[5] T. Winiarski and K. Banachowicz, "Opening a door with a redundant impedance controlled robot," in Robot Motion and Control (RoMoCo), 2013 9th Workshop on, 2013, pp. 221-226.

[6] M. Zucker, Y. Jun, B. Killen, T.-G. Kim, and P. Oh, "Continuous trajectory optimization for autonomous humanoid door opening," in Technologies for Practical Robot Applications (TePRA), 2013 IEEE International Conference on, 2013, pp. 1-5.

[7] S. Cousins, "ROS on the PR2 [ROS Topics]," Robotics & Automation Magazine, IEEE, vol. 17, pp. 23-25, 2010.

[8] J. Smisek, M. Jancosek, and T. Pajdla, "3D with Kinect," in Computer Vision Workshops (ICCV Workshops), 2011 IEEE International Conference on, 2011, pp. 1154-1160.

[9] T. Cover and P. Hart, "Nearest neighbor pattern classification," Information Theory, IEEE Transactions on, vol. 13, pp. 21-27, 1967.

[10] R. B. Rusu, G. Bradski, R. Thibaux, and J. Hsu, "Fast 3D recognition and pose using the Viewpoint Feature Histogram," in Intelligent Robots and Systems (IROS), 2010 IEEE/RSJ International Conference on, 2010, pp. 2155-2162.

[11] A. Aldoma, M. Vincze, N. Blodow, D. Gossow, S. Gedikli, R. B. Rusu, et al., "CAD-model recognition and 6DOF pose estimation using 3D cues," in Computer Vision Workshops (ICCV Workshops), 2011 IEEE International Conference on, 2011, pp. 585-592.

[12] K. Erol, J. Hendler, and D. S. Nau, "HTN planning: Complexity and expressivity," in AAAI, 1994, pp. 1123-1128.

[13] R. B. Rusu and S. Cousins, "3D is here: Point Cloud Library (PCL)," in Robotics and Automation (ICRA), 2011 IEEE International Conference on, 2011, pp. 1-4.

[14] G. Hirzinger, J. Bals, M. Otter, and J. Stelter, "The DLR-KUKA success story: robotics research improves industrial robots," Robotics & Automation Magazine, IEEE, vol. 12, pp. 16-23, 2005.

[15] Tekscan, FlexiForce Sensors Manual. Available: http://www.tekscan.com/pdf/FLX-FlexiForce-Sensors-Manual.pdf

[16] M. Summerfield, Advanced Qt Programming: Creating Great Software with C++ and Qt 4: Prentice Hall Press, 2010.
