Publication

Modeling & Simulation, Testing & Validation (MSTV)
2011

AN AUGMENTED REALITY UAV-GUIDED GROUND NAVIGATION INTERFACE IMPROVES HUMAN PERFORMANCE IN MULTI-ROBOT TELE-OPERATION

by Sam Lee; Nathan P. Lucas; Alex Cao; Abhilash Pandya; R. Darin Ellis

Abstract

This research proposes a human-multirobot system that combines semi-autonomous ground robots with a UAV overhead view for contaminant localization tasks. A novel Augmented Reality-based operator interface has been developed. The interface presents an over-watch camera view of the robotic environment and allows the operator to direct each robot individually or in groups. It uses an A* path-planning algorithm to avoid obstacles, freeing the operator for higher-level tasks. It also overlays sensor information from each robot directly on that robot in the video view. In addition, a combined sensor view can be displayed, which helps the user pinpoint the source. The sensors on each robot monitor contaminant levels, and a virtual display of these levels allows the operator to direct the multiple ground robots toward the hidden target. This paper reviews the user interface and describes several initial usability tests that were performed. This research demonstrates the development of a human-multirobot interface that has the potential to improve cooperative robotics for practical applications.
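The abstract states that the interface uses an A* path-planning algorithm for obstacle avoidance, but gives no implementation details. As an illustration only, a minimal grid-based A* sketch might look like the following; the 4-connected occupancy grid and Manhattan-distance heuristic are assumptions, not details from the paper.

```python
import heapq

def astar(grid, start, goal):
    """A* search on an assumed 4-connected occupancy grid.
    grid: list of lists, 0 = free cell, 1 = obstacle.
    start, goal: (row, col) tuples.
    Returns a list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan distance: admissible for 4-connected, unit-cost moves
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # (f = g + h, g, cell)
    came_from = {}
    g = {start: 0}
    while open_heap:
        _, cost, cur = open_heap[0]
        heapq.heappop(open_heap)
        if cur == goal:
            # Reconstruct path by walking parents back to start
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        if cost > g.get(cur, float("inf")):
            continue  # stale heap entry; a cheaper route was found later
        r, c = cur
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_g = cost + 1
                if new_g < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = new_g
                    came_from[(nr, nc)] = cur
                    heapq.heappush(open_heap, (new_g + h((nr, nc)), new_g, (nr, nc)))
    return None  # goal unreachable

# Example: route around a wall in a 3x3 grid
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
```

In the paper's setting, each robot would presumably plan over a grid derived from the over-watch camera view, with the operator supplying only the goal cell.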