Interactive Inverse Kinematics for Monocular Motion Estimation

Introduction:

This website is a companion to the VRIPHYS 2009 paper "Interactive Inverse Kinematics for Monocular Motion Estimation". Its main purpose is to host the videos referenced in the paper; these can be found in the next section.

We present an application of a fast interactive inverse kinematics method as a dimensionality-reduction technique for monocular motion tracking. The method deals efficiently and robustly with box constraints on the joint angles and does not suffer from shaking artifacts. The presented system uses a single camera to estimate the motion of a human, and the results show that inverse kinematics significantly speeds up the estimation process.
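The paper's solver is not reproduced here, but the core idea of inverse kinematics under box constraints can be sketched with a planar two-link arm: iterate a Jacobian-transpose step toward the goal and project the joint angles back onto their limits after each step. This is a minimal illustrative sketch, not the paper's implementation; all function names, step sizes, and limits are assumptions.

```python
import numpy as np

def forward(theta, lengths):
    """End-effector position of a planar kinematic chain."""
    angles = np.cumsum(theta)          # absolute link angles
    return np.array([np.sum(lengths * np.cos(angles)),
                     np.sum(lengths * np.sin(angles))])

def jacobian(theta, lengths):
    """Analytic Jacobian of the end-effector w.r.t. joint angles."""
    angles = np.cumsum(theta)
    J = np.zeros((2, len(theta)))
    for i in range(len(theta)):
        # joint i moves every link from i onward
        J[0, i] = -np.sum(lengths[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(angles[i:]))
    return J

def ik_box(goal, theta, lengths, lo, hi, iters=500, step=0.1):
    """Jacobian-transpose IK with joint limits enforced by
    projection onto the box [lo, hi] after each update."""
    for _ in range(iters):
        err = goal - forward(theta, lengths)
        theta = theta + step * jacobian(theta, lengths).T @ err
        theta = np.clip(theta, lo, hi)   # box-constraint projection
    return theta
```

The projection step is what keeps the iteration inside the feasible joint-angle box without any extra machinery, which is one reason box constraints can be handled cheaply.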

Demonstration Videos:

The following videos illustrate the usefulness of the chosen state space. In these videos we are not aiming at perfect tracking, as that would require a much more carefully tuned system for making visual measurements. The point is simply that, using only 25 particles in end-effector space, we obtain results comparable to using 5000 particles in the full pose space. Since each particle requires an evaluation of the likelihood function (which is computationally expensive), this provides a vast speed-up: the end-effector tracker is about 700 times faster than the full-pose tracker.
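The cost structure behind this speed-up can be made explicit with a generic particle-filter step: the per-frame work is dominated by one likelihood evaluation per particle, so shrinking the particle set from 5000 full poses to 25 end-effector states shrinks the dominant cost accordingly. This is a minimal sketch of a standard sampling-importance-resampling step, not the paper's tracker; `likelihood` and `propagate` are hypothetical placeholders for the image measurement and the state diffusion.

```python
import numpy as np

def track_frame(particles, likelihood, propagate):
    """One particle-filter step. The loop over likelihood() dominates
    the cost, so fewer particles means proportionally less work."""
    particles = propagate(particles)                       # diffuse in state space
    w = np.array([likelihood(p) for p in particles])       # expensive image term
    w = w / w.sum()                                        # normalize weights
    idx = np.random.choice(len(particles), size=len(particles), p=w)
    return [particles[i] for i in idx]                     # resample
```

With a fixed per-particle likelihood cost, 25 particles instead of 5000 already cuts the measurement work by a factor of 200; the remaining gain reported in the paper comes from the rest of the pipeline.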

To see the videos, click on the images below.

- Full-Pose, 100 particles: tracking result using 100 particles in angle space
- Full-Pose, 5000 particles: tracking result using 5000 particles in angle space
- End-Effector, 25 particles: tracking result using 25 particles in end-effector space

The Paper:

You can download the VRIPHYS 2009 paper from here. The BibTeX for the paper is

This BibTeX entry will be updated once the paper is published:

@inproceedings{engell_et_al09,
  title = {Interactive Inverse Kinematics for Monocular Motion Estimation},
  author = {Morten Engell-Nørregård and Søren Hauberg and Jerome Lapuyade and 
           Kenny Erleben and Kim Steenstrup Pedersen},
  booktitle = {VRIPHYS},
  year = {2009},
}

For comments or questions please contact Morten Engell-Nørregård.