SUMMARY
This thesis introduces a control method for image-based robot guidance that exploits the advantages of multiple cameras. The method provides system survivability in the event of image occlusion or camera failure, and it produces a control action based on statistically meaningful weighting of the image data, without requiring prior knowledge of robot or camera parameters. The research presents a novel control law that uses a Kalman filter and can track a moving target. Adaptive filtering improves filter performance, and the filter is decentralized for robustness and distributed processing. System stability is shown, and the control method is supported by both simulation and experiment.