


Real-Time Tracking
by Dr. Axel Pinz, Graz University of Technology, Electrical Measurement and Signal Processing
Abstract: The term "tracking" is used by several scientific
communities, often with rather different meanings. Thus, the first
goal of this lecture is to define the terms from a computer vision
point of view. Tracking can denote the process of detecting and
following two-dimensional entities in image sequences. A good example
is the tracking of persons or cars in surveillance videos, where a
stationary camera watches moving objects. The lecture briefly reviews
the major techniques for tracking motion in videos, some of them
capable of real-time operation, others used in off-line video
processing.
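As a rough illustration of this kind of 2D tracking with a stationary
camera, the following Python sketch uses OpenCV background
subtraction to detect and box moving objects frame by frame; the
video file name, blob-area threshold and filter settings are
illustrative assumptions, not material from the lecture.

import cv2

# Placeholder input video of a static scene with moving objects.
cap = cv2.VideoCapture("surveillance.avi")
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)            # foreground/background segmentation
    mask = cv2.medianBlur(mask, 5)            # suppress isolated noise pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x API
    for c in contours:
        if cv2.contourArea(c) > 500:          # ignore very small blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("2D tracking", frame)
    if cv2.waitKey(1) == 27:                  # stop on Esc
        break

cap.release()
cv2.destroyAllWindows()

Per-frame detection of this kind is usually combined with a data
association step (e.g. nearest-neighbour matching or Kalman
prediction) so that object identities are maintained over time.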
However, the main part of the lecture is dedicated to tracking 3D
objects in 3D scene coordinates in real time. Three-dimensional
tracking denotes the online recovery of up to six degrees of freedom
(6 DoF) of object pose, three for position and three for orientation,
relative to a scene coordinate system. This can be achieved by
several 3D measurement methods, including, but not limited to,
vision-based tracking; the relevant methods are briefly reviewed. In
vision-based tracking we distinguish between "outside-in" tracking
(stationary cameras watching the scene) and "inside-out" tracking
(the pose of a moving camera is computed relative to stationary
landmarks).
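A common vision-based route to this 6 DoF pose is to detect a few
landmarks with known scene coordinates in the camera image and solve
the perspective-n-point (PnP) problem, which is the core of an
inside-out tracker. The sketch below uses OpenCV's solvePnP; the
landmark coordinates, image points and camera intrinsics are made-up
placeholder values, not data from the lecture.

import numpy as np
import cv2

# Scene (world) coordinates of four known landmarks, in metres.
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.2, 0.0, 0.0],
                          [0.2, 0.2, 0.0],
                          [0.0, 0.2, 0.0]], dtype=np.float64)

# Their detected positions in the current camera image, in pixels.
image_points = np.array([[320.0, 240.0],
                         [420.0, 238.0],
                         [422.0, 338.0],
                         [318.0, 340.0]], dtype=np.float64)

# Pinhole camera intrinsics, assumed calibrated beforehand.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)                       # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)               # 3x3 rotation matrix (3 DoF orientation)
print("rotation:\n", R)
print("translation:", tvec.ravel())      # 3 DoF position

Solved at frame rate on live landmark detections, the rotation and
translation together give the full 6 DoF camera pose relative to the
scene coordinate system.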
A single tracking method will often fail under certain circumstances,
which has led to the development of "hybrid" trackers. Hybrid
trackers combine more than one sensor principle or several algorithms
to benefit from complementary sensor characteristics and to gain
robustness and precision.
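To make the complementary-sensor idea concrete, here is a minimal,
hypothetical sketch of fusing a high-rate but drifting inertial
(gyro) orientation estimate with slower, drift-free vision-based
corrections through a simple complementary filter. It is reduced to a
single rotation angle, and the sample rates and the gain alpha are
illustrative assumptions rather than values from our tracker.

class HybridOrientationTracker:
    def __init__(self, alpha=0.98):
        self.alpha = alpha    # weight given to the inertial prediction
        self.angle = 0.0      # fused orientation estimate (radians)

    def predict(self, gyro_rate, dt):
        # High-rate inertial update: integrate angular velocity.
        # Fast and smooth, but integration drift accumulates.
        self.angle += gyro_rate * dt
        return self.angle

    def correct(self, vision_angle):
        # Low-rate vision update: absolute and drift-free, but slower
        # and occasionally unavailable (e.g. occluded landmarks).
        self.angle = self.alpha * self.angle + (1.0 - self.alpha) * vision_angle
        return self.angle

tracker = HybridOrientationTracker(alpha=0.98)
for _ in range(33):                           # ~33 inertial samples at 1 kHz
    angle = tracker.predict(gyro_rate=0.05, dt=0.001)
angle = tracker.correct(vision_angle=0.0016)  # one ~30 Hz vision measurement

In practice, hybrid trackers apply the same predict/correct pattern
to the full 6 DoF pose, often with a Kalman-type filter rather than a
fixed blending gain.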
The lecture closes with a presentation of our own development of a
hybrid tracker for augmented reality applications and with a
discussion of the current state of the art. Real-time tracking
constitutes a major enabling technology for many high-potential
applications, including better interfaces supporting difficult
navigation tasks in medicine.