Semi-autonomous behaviors such as leader-following and “point-and-go” navigation have the potential to significantly increase the value of squad-level UGVs by freeing operators to perform other tasks. A variety of technologies have been developed in recent years to enable such semi-autonomous behaviors on board mobile robots; however, most current solutions rely on custom payloads comprising sensors such as stereo cameras, LIDAR, GPS receivers, or active transmitters. While effective, these approaches are restricted to UGV platforms that can accommodate the payload’s size, weight, and power (SWaP) requirements, and they may be cost-prohibitive for large-scale deployment. Charles River has developed a system that enables both leader-following and “point-and-go” navigation behaviors using only a single monocular camera. The system allows a user to control a mobile robot by leading the way and issuing commands through arm and hand gestures, and it can follow an operator traveling either on foot or aboard a vehicle. Alternatively, the operator may direct the robot through a lightweight interface by simply indicating an object of interest or a destination in the robot’s camera view.