
3D-TRACKING OF A PRIORI UNKNOWN OBJECTS IN CLUTTERED DYNAMIC ENVIRONMENTS

by

Hans de Ruiter

A thesis submitted in conformity with the requirements for the degree of Doctor of Philosophy

Graduate Department of Mechanical and Industrial Engineering
University of Toronto

Copyright © 2008 by Hans de Ruiter

3D-Tracking of A Priori Unknown Objects in Cluttered Dynamic Environments

Hans de Ruiter

Doctor of Philosophy

Graduate Department of Mechanical and Industrial Engineering

University of Toronto

2008

Abstract

Tracking of an object's full six degree-of-freedom (6-dof) position and orientation (pose) would allow a robotic system to autonomously perform a variety of complex tasks, such as docking from any preferred angle, surveillance of moving subjects, etc. Computer vision has been commonly advocated as an effective tool for 3D (i.e., 6-dof) tracking of Objects of Interest (OIs). However, the vast majority of vision-based 6-dof pose trackers reported in the literature require a model of the OI to be provided a priori. Finding/selecting the OI to track is also essential to autonomous operation, a problem that has often been neglected.

This Thesis proposes a novel, real-time object-tracking system that solves all of the aforementioned problems. The tracking procedure begins with OI selection. Since what constitutes an OI is application dependent, selection is achieved via a customizable framework of Interest Filters (IFs) that highlight regions of interest within an image. The region of greatest interest becomes the selected OI. Next, an approximate visual 3D model of the selected OI is built on-line by a real-time modeller. Unlike previously proposed techniques, this modeller can build the model of the OI even in the presence of background clutter, an essential task for tracking one object amongst many. Once a model is built, a real-time 6-dof tracker (i.e., the third sub-component) performs the actual 6-dof object tracking via 3D model projection and optical flow.


Performing simultaneous modelling and tracking presents several challenges requiring novel solutions. For example, a novel data-reduction scheme based on colour-gradient redundancy is proposed herein that facilitates using colour input images whilst still maintaining real-time performance on current computer hardware. Likewise, a per-pixel occlusion-rejection scheme is proposed which enables tracking in the presence of partial occlusions. Various other techniques have also been developed within the framework of this Thesis in order to achieve real-time efficiency, robustness to lighting variations, ability to cope with high OI speeds, etc.

Extensive experiments with both synthetic and real-world motion sequences have demonstrated the ability of the proposed object-tracking system to track a priori unknown objects. The proposed algorithm has also been tested within two target applications: autonomous convoying, and dynamic camera reconfiguration.


To my parents, Annie and Conradus


Acknowledgements

This Thesis is the culmination of years of research that would not have been possible without the support of others. I wish to acknowledge and thank all those whose help and support have been so valuable over the duration of my time at the University of Toronto.

First, I would like to thank my supervisor, Professor Beno Benhabib, whose patience and guidance have kept my research focussed on the overall goal of object tracking for autonomous systems. His ability to ask the pertinent questions about experimental results ensured that every aspect was researched thoroughly, whilst helping me to develop the critical thinking required.

My Ph.D. advisory committee, Professors R. Ben Mrad and F. Ben Amara, also gave valuable suggestions whilst monitoring my progress. Their collective guidance is greatly appreciated.

My colleagues in the Computer Integrated Manufacturing Laboratory provided great company and a sounding board for ideas. They are: Ardevan Bakhtari, Usman Ghumman, Hamid Golmakani, Faraz Kunwar, Imtehaz Heerah, Matthew Mackay, Goldie Nejat, Bill Owen, Felix Wong, Joseph Wong, Leslie Wong, Hu Yongbiao, and Michael Zheng. In particular, Michael Zheng implemented a depth-map extractor that my research required, as part of his Master of Engineering degree. The research presented in this Thesis was also aided by Ling Xiao's work as a summer student.

I am also extremely grateful for the support from my parents, Annie and Conradus, and my siblings, Anton, Ingrid, and Niels. It was my eldest brother Anton who first alerted me to the University of Toronto's research programs in robotics and had an apartment ready for me before I had even arrived in Canada. My parents gave continual advice and support despite the distance between Toronto, Canada, and Wellington, New Zealand. Ingrid's visits to Canada were always enjoyable, as were brief discussions with my youngest brother Niels, via email and web-cam.

Life is much more than just university research. Fr. Patrick O'Dea and the community at the Newman Centre (the Catholic Chaplaincy at the University of Toronto) served both my social and spiritual needs. Whilst they are too numerous to name, being amongst friends in this community made life halfway around the world from home enjoyable.

Finally, I would like to gratefully acknowledge the financial support of the University of Toronto Graduate Fellowship and the Natural Sciences and Engineering Research Council (NSERC).


List of Publications Generated from this Thesis

So far, the research detailed in this Thesis has produced nine publications: three journal papers and six conference papers.

Journal Papers:

H. de Ruiter and B. Benhabib, "On-line Modeling for Real-Time 3D Target Tracking," Machine Vision and Applications, Springer, Published Online, April 2008.

H. de Ruiter and B. Benhabib, "Visual-Model-Based, Real-Time 3D Pose Tracking for Autonomous Navigation: Methodology and Experiments," Autonomous Robots, Published Online, July 2008.

H. de Ruiter, Matthew Mackay, and B. Benhabib, "Autonomous 3D Tracking for Reconfigurable Active-Vision Based Object-Recognition," Int. Journal of Computer Vision, In Review, Submitted June 2008.

Conference Papers:

H. de Ruiter and B. Benhabib, "Tracking of rigid bodies for autonomous surveillance," IEEE Int. Conf. on Mechatronics and Automation, Niagara Falls, Canada, July 2005, pp. 928-933.

H. de Ruiter and B. Benhabib, "Colour-Gradient Redundancy for Real-time Spatial Pose Tracking in Autonomous Robot Navigation," Canadian Conference on Computer and Robot Vision, Québec City, Canada, June 2006, pp. 20-28.

H. de Ruiter and B. Benhabib, "Real-time Visual Vehicle Tracking for Autonomous Convoy Control," Int. Conf. on Autonomous Robots and Agents, Palmerston North, New Zealand, December 2006, pp. 195-200.

H. de Ruiter and B. Benhabib, "Online Modeling for Real-Time, Model-Based, 3D Pose Tracking," in Advances and Innovations in Systems, Computing Sciences and Software Engineering, Khaled Elleithy, Ed., Dordrecht, The Netherlands, Springer, 2007, ch. 96, pp. 555-560.

H. de Ruiter and B. Benhabib, "Object-of-Interest Selection for Model-Based 3D Pose Tracking with Background Clutter," Int. (On-line) Joint Conferences on Computer, Information, and Systems Sciences, and Engineering, December 2007.

H. de Ruiter and B. Benhabib, "Visual-Model Based Spatial Tracking in the Presence of Occlusions," Canadian Conf. on Computer and Robot Vision, Windsor, Canada, May 2008, pp. 163-170.


Contents

Abstract
Acknowledgements
List of Tables
List of Figures
Nomenclature and Acronyms

1 Introduction
1.1 Motivation
1.2 Problem Statement
1.3 Literature Review
1.3.1 OI Selection
1.3.2 Object Modelling
1.3.3 Visual Object Tracking
1.3.4 Object Tracking for Autonomous Navigation and other Real-Time Applications
1.4 Research Objectives
1.5 Thesis Overview

2 Tracking Initialization
2.1 Overview
2.2 Object of Interest Selection and Tracking Initialization
2.2.1 OI Selection Framework
2.2.2 Initial-Pose Estimation
2.2.3 Experiments