Autonomy Artificial Intelligence Robotics (AAIR)

Intuitive Robotic Operator Control (IROC): Integration of Gesture Recognition With An Unmanned Ground Vehicle and Heads Up Display

by Lisa Baraniecki; Jack Vice; Jonathan Brown; Josh Nichols; Dave Stone; Dawn Dahn


Currently fielded ground robotic platforms are controlled by a human operator via continuous, direct input from a handheld controller. This approach demands the operator's constant attention, decreasing situational awareness (SA). In scenarios where the robotic asset is non-line-of-sight (non-LOS), the operator must also monitor visual feedback, typically a video feed and/or visualization. With the increasing use of personal radios, smart devices/wearable computers, and network connectivity by individual warfighters, an unobtrusive means of robotic control and feedback is becoming increasingly necessary. This paper describes a proposed intuitive robotic operator control (IROC) system comprising a heads-up display (HUD), an instrumented gesture recognition glove, and a ground robotic asset. Under the direction of the Marine Corps Warfighting Laboratory (MCWL) Futures Directorate, AnthroTronix, Inc. (ATinc) is implementing the described integration for completion and demonstration by 30 September 2016.