Correspondence-free
Synchronization and Reconstruction in a Non-rigid Scene
Lior Wolf and Assaf Zomet
School of Computer Science and Engineering,
The Hebrew University,
Jerusalem 91904, Israel
3D reconstruction of a dynamic non-rigid scene from features in two cameras
usually requires synchronization and correspondences between the cameras.
These may be hard to achieve due to occlusions, a wide baseline, different
zoom scales, etc.
In this work we present an algorithm for reconstructing a non-rigid scene
from sequences acquired by two uncalibrated, unsynchronized, fixed
orthographic cameras.
It is assumed that (possibly) different points are tracked in the two
sequences.
The only constraint used to relate the two cameras is that every 3D point
tracked in one sequence can be described as a linear combination of some of
the 3D points tracked in the other sequence.
Such a constraint is useful, for example, for articulated objects.
We may track some points on an arm in the first sequence, and some other
points on the same arm in the second sequence.
At the other extreme, this model can be used for generally moving points
tracked in both sequences without knowing the correct permutation between them.
In between, this model covers non-rigid bodies with local rigidity
constraints.
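Concretely, writing P_i(t) for the 3D points tracked in the first sequence and Q_j(t) for those tracked in the second (the notation here is ours, introduced only for illustration), the assumption is a time-independent linear relation

    Q_j(t) = \sum_{i \in S_j} a_{ij} P_i(t)    for every frame t,

with fixed coefficients a_{ij} and index sets S_j. Under a fixed orthographic camera with 2x3 projection M_2 and translation m_2, the image measurements of the second sequence become

    q_j(t) = M_2 Q_j(t) + m_2 = \sum_{i \in S_j} a_{ij} M_2 P_i(t) + m_2,

i.e., every image track in the second sequence is a fixed linear function of the 3D trajectories tracked in the first.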
We present linear algorithms for synchronizing the two sequences and
reconstructing the 3D points tracked in both views. Outlier points are
automatically detected and discarded. The algorithm can handle
both 3D objects and planar objects in a unified framework, thereby avoiding
numerical problems that exist in other methods.
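As a toy illustration of how this linear structure can drive synchronization (a sketch under the assumptions above, not the authors' linear algorithm; the brute-force offset search, the rank-residual score, and all names below are ours), note that with fixed orthographic cameras every column of either measurement matrix lies in the span of the 3D trajectories of the N_1 points tracked in the first sequence plus a constant column. The concatenated F x (2N_1 + 2N_2) matrix therefore has rank at most 3N_1 + 1 when the sequences are correctly aligned, and a wrong temporal offset generically breaks this bound.

# Toy sketch only: synchronize two tracked sequences by brute-force search
# over integer frame offsets, scoring each offset by how well the concatenated
# measurement matrix respects the rank bound 3*N1 + 1 implied by the
# linear-combination assumption.  Illustration of the constraint, not the
# authors' algorithm; all names and the search strategy are assumed.
import numpy as np

def measurement_matrix(tracks):
    """tracks: (F, N, 2) image points over F frames -> F x 2N matrix."""
    F, N, _ = tracks.shape
    return tracks.reshape(F, 2 * N)

def rank_residual(W, r):
    """Energy of W beyond rank r (sum of the trailing singular values)."""
    s = np.linalg.svd(W, compute_uv=False)
    return s[r:].sum()

def synchronize(tracks1, tracks2, max_offset=25, window=120):
    """Return the integer offset dt (frame t of seq. 1 ~ frame t+dt of seq. 2)
    minimizing the rank residual of the concatenated measurements."""
    n1 = tracks1.shape[1]
    bound = 3 * n1 + 1                      # rank bound under correct alignment
    best_dt, best_score = 0, np.inf
    for dt in range(-max_offset, max_offset + 1):
        a, b = max(0, -dt), max(0, dt)      # aligned starting frames
        n = min(window, len(tracks1) - a, len(tracks2) - b)
        if n <= bound:
            continue                        # too few frames to test the bound
        W = np.hstack([measurement_matrix(tracks1[a:a + n]),
                       measurement_matrix(tracks2[b:b + n])])
        score = rank_residual(W, bound)
        if score < best_score:
            best_dt, best_score = dt, score
    return best_dt

Note that this toy criterion is informative only when 2N_2 > N_1 + 1, so that the concatenated matrix can exceed the bound at all, and it ignores the outlier handling and the reconstruction step described above.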