A vision-based automatic tracking and observation system installed on an ROV has successfully tracked midwater ocean animals such as jellyfish in Monterey Bay, California. The system uses stereo vision to localize the tracking vehicle with respect to the target of interest and closes control loops to keep the target in the cameras' views. Reliance on the vision sensor constrains the control system: the target must remain in the cameras' fields of view at all times. This constraint can be expressed as maximum allowable pointing and positioning errors, which are proportional to the standoff distance to the specimen. When the system tracks small specimens at short range, keeping the target within the vision cones becomes very difficult to maintain continuously, and with the current technology the resulting out-of-frame events are unrecoverable. To expand the operational envelope of the system to include observation of smaller specimens at short standoff distances, a new approach is demonstrated that complements vision with sensors typically found on underwater vehicles. This approach softens the viewing-cone constraint of the vision system by allowing tracking to continue through brief out-of-frame events. A nonlinear multi-rate estimator, implemented as a Sigma Point Kalman Filter (SPKF), fuses vision with water-relative velocities from a Doppler Velocity Log (DVL) and other vehicle measurements. With this estimator, the target's position relative to the vehicle is propagated during periods when the specimen cannot be seen. Designing an estimator for this problem requires consideration of issues such as assumptions about the target's motion dynamics, limited knowledge of the vehicle's dynamic model, and robustness to unmodeled disturbances. Simulated tracking results and data from field experiments are presented.
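The multi-rate fusion idea can be sketched as a minimal unscented (sigma-point) Kalman filter in Python: stereo-vision fixes correct the estimate of the target's vehicle-relative position while the target is in frame, and DVL velocities alone propagate that estimate through a simulated out-of-frame event. All rates, noise covariances, and the stationary-target dynamics below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def sigma_points(mean, cov, alpha=0.5, beta=2.0, kappa=0.0):
    """Generate the 2n+1 sigma points of the unscented transform, with weights."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)       # columns are sigma directions
    pts = np.vstack([mean, mean + S.T, mean - S.T])
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return pts, wm, wc

def ukf_predict(mean, cov, f, Q):
    """Propagate the sigma points through the dynamics f and recombine."""
    pts, wm, wc = sigma_points(mean, cov)
    prop = np.array([f(p) for p in pts])
    m = wm @ prop
    d = prop - m
    return m, (wc[:, None] * d).T @ d + Q

def ukf_update(mean, cov, z, h, R):
    """Correct the estimate with a measurement z through the model h."""
    pts, wm, wc = sigma_points(mean, cov)
    zs = np.array([h(p) for p in pts])
    zhat = wm @ zs
    dz, dx = zs - zhat, pts - mean
    Pzz = (wc[:, None] * dz).T @ dz + R
    Pxz = (wc[:, None] * dx).T @ dz
    K = Pxz @ np.linalg.inv(Pzz)
    return mean + K @ (z - zhat), cov - K @ Pzz @ K.T

# --- illustrative tracking run (all numbers are assumptions) ----------------
rng = np.random.default_rng(0)
dt = 0.2                                 # assumed 5 Hz DVL/prediction rate
Q = (0.02 ** 2) * np.eye(2)              # process noise: unmodeled target drift
R = (0.05 ** 2) * np.eye(2)              # stereo-vision measurement noise

truth = np.array([1.0, 0.5])             # true target position rel. to vehicle (m)
x = truth + np.array([0.1, -0.1])        # initial estimate
P = 0.04 * np.eye(2)

trace_in_frame = trace_dropout = None
for k in range(50):
    v_dvl = np.array([0.05, 0.0])        # vehicle water-relative velocity (DVL)
    truth = truth - v_dvl * dt           # stationary target, moving vehicle
    x, P = ukf_predict(x, P, lambda s: s - v_dvl * dt, Q)
    if 10 <= k < 25:                     # simulated out-of-frame event:
        trace_dropout = np.trace(P)      #   propagate only; uncertainty grows
    else:                                # target in frame:
        z = truth + rng.normal(0.0, 0.05, 2)
        x, P = ukf_update(x, P, z, lambda s: s, R)
        trace_in_frame = np.trace(P)
```

During the dropout the covariance trace grows, since no vision corrections arrive, but the relative-position estimate keeps being dead-reckoned from the DVL velocity, so tracking can resume when the target re-enters the frame.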