
Live Ego-Motion Estimation

David Nistér
Sarnoff Corporation

CN5300, Princeton, NJ 08530
[email protected]

Figure 1: Reconstruction made from a sequence of turntable images.

1. Description

This demo will show live (real-time, low-delay) estimation of the motion of a perspective camera. Some scene points are also reconstructed. The reconstruction of structure and motion is based on the video input alone and is performed on a regular PC laptop without special hardware. The estimation will be shown in three ways:

• Real-time, directly from a video file stored on the hard drive.
• Real-time, through the capture card with camera playback as the external video source.
• Live from the camera, using both turntable and freehand motion.

The system uses the techniques described in [1] and [2], among other things. Examples of reconstructions made from disk are shown in Figures 1 and 2. A screenshot of the demo running is shown in Figure 3. An example of real-time reconstruction from camera playback is shown in Figure 4. An example of live reconstruction is shown in Figure 5.
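The robust estimation step follows the preemptive RANSAC scheme of [2], with relative-pose hypotheses generated by the five-point solver of [1]. The sketch below illustrates only the preemptive scoring schedule, on a toy line-fitting problem rather than on relative pose; the function names, the hypothesis budget, the block size, and the toy model are illustrative assumptions, not the demo's actual implementation.

```python
# Minimal sketch of a preemptive RANSAC scoring schedule (after [2]).
# The hypothesis generator and per-observation score are passed in as
# callables; in the real system they would be the five-point solver of
# [1] and a robust reprojection (or Sampson) score, not shown here.
import random

def preemptive_ransac(observations, sample_size, gen_hypothesis, score_one,
                      n_hypotheses=500, block_size=100):
    """Pick the best hypothesis under breadth-first, preemptive scoring.

    All hypotheses are generated up front from random minimal samples and
    then scored block by block over the observations; after each block the
    candidate set is halved, so the total work is bounded in advance,
    which is what makes the scheme usable on live video.
    """
    hypotheses = []
    while len(hypotheses) < n_hypotheses:
        h = gen_hypothesis(random.sample(observations, sample_size))
        if h is not None:
            hypotheses.append(h)

    scores = [0.0] * len(hypotheses)
    order = list(range(len(observations)))
    random.shuffle(order)                      # score observations in random order

    active = list(range(len(hypotheses)))
    i = 0
    while len(active) > 1 and i < len(order):
        block = order[i:i + block_size]        # next block of observations
        for hi in active:
            scores[hi] += sum(score_one(hypotheses[hi], observations[oi])
                              for oi in block)
        i += block_size
        active.sort(key=lambda hi: scores[hi], reverse=True)
        active = active[:max(1, len(active) // 2)]   # preemption: keep best half

    return hypotheses[active[0]]

# Toy usage: robustly fit a 2D line y = a*x + b despite heavy outliers.
if __name__ == "__main__":
    truth = (2.0, -1.0)
    pts = [(x, truth[0] * x + truth[1] + random.gauss(0, 0.05)) for x in range(300)]
    pts += [(random.uniform(0, 300), random.uniform(-100, 700)) for _ in range(200)]

    def fit_line(sample):
        (x1, y1), (x2, y2) = sample
        if abs(x2 - x1) < 1e-9:
            return None
        a = (y2 - y1) / (x2 - x1)
        return (a, y1 - a * x1)

    def inlier(line, pt):
        a, b = line
        return 1.0 if abs(pt[1] - (a * pt[0] + b)) < 0.2 else 0.0

    print("estimated line:", preemptive_ransac(pts, 2, fit_line, inlier))
```

The key property illustrated is that the number of score evaluations is fixed by the hypothesis budget and block size, not by the inlier ratio, so the robust estimator fits in a real-time frame budget.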

References

[1] D. Nistér. An Efficient Solution to the Five-Point Relative Pose Problem. In IEEE Conference on Computer Vision and Pattern Recognition, Volume 2, pp. 195-202, 2003.

[2] D. Nistér. Preemptive RANSAC for Live Structure and Motion Estimation. In IEEE International Conference on Computer Vision, pp. 199-206, 2003.

Figure 2: Reconstruction made at real-time rate from a freehand motion. The camera makes an outer circle and then an inner circle with some forward motion in between.

Figure 3: A screenshot of the demo running. The window at the top displays feature tracks on the original video. The window at the bottom displays the camera positions and some structure points. The graphic display is done concurrently with the estimation of camera motion, including feature detection, feature tracking, robust estimation and local bundle adjustment. The estimation is done directly on video from the capture card, with camera playback or live video feed as the external video source.
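The caption above summarizes the per-frame processing. Purely as an illustration of how such a loop can keep the drawing off the estimation's critical path, here is a minimal sketch; the callables (capture_frame, detect_and_track, estimate_pose, local_bundle_adjust, draw) are hypothetical stand-ins for the real components and do not come from the demo system.

```python
# Hedged skeleton of a per-frame estimation loop with concurrent display.
# Estimation runs in the main thread; drawing runs in its own thread and
# stale results are simply skipped so display never stalls the estimator.
import queue
import threading

def run_demo(capture_frame, detect_and_track, estimate_pose,
             local_bundle_adjust, draw):
    display_q = queue.Queue(maxsize=2)    # tiny queue: keep only fresh results

    def display_loop():
        while True:
            item = display_q.get()
            if item is None:               # sentinel: shut the display thread down
                return
            draw(*item)                    # feature-track window + 3D scene window

    threading.Thread(target=display_loop, daemon=True).start()

    cameras, points = [], []
    while True:
        frame = capture_frame()            # capture card, file, or live feed
        if frame is None:
            break
        tracks = detect_and_track(frame)                         # detection + tracking
        cameras.append(estimate_pose(tracks, cameras, points))   # robust estimation
        cameras, points = local_bundle_adjust(cameras, points, tracks)
        try:
            display_q.put_nowait((frame, tracks, cameras, points))
        except queue.Full:                 # never block estimation on drawing
            pass

    display_q.put(None)
```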


Figure 4: Estimation of the motion of a handheld camera, induced by walking down a 160-meter-long corridor. The whole motion is successfully integrated into the same coordinate frame. Note how the straight trajectory builds up from left to right. The estimation was done in real time from the tape playback. Thus, the system has to deal with capture, dropping frames if it is not able to keep up, etc. The estimation was made with both the feature-track window and the 3D scene window displaying. In this mode, the delay from the tape to the reconstruction displaying on the screen is less than a second. No prior knowledge of the motion was used, and the system is identical to the one used in Figure 5.

Figure 5: Estimation of a freehand motion created with a calibration cylinder in one hand and a camcorder in the other. The estimation was done live. Thus, the system has to deal with capture, dropping frames if it is not able to keep up, etc. The estimation was made with both the feature-track window and the 3D scene window displaying. Note how the trajectory builds up from top to bottom. No knowledge of the structure or motion was used, and the system is identical to the one used in Figure 4.
