Robotic platforms require accurate geospatial localization for high-level mission planning, real-time site reconnaissance, and multi-machine collaboration. Global navigation satellite system (GNSS) receivers are most commonly used to provide unmanned ground vehicles (UGVs) with accurate geolocation. However, GNSS is not reliable in contested environments because it is vulnerable to jamming, spoofing, and blackouts. To address these issues, the United States Army Corps of Engineers (USACE) Engineer Research and Development Center (ERDC) has developed the Active Terrain Localization Imagery System (ATLIS), which uses on-board perception and a priori satellite imagery to eliminate reliance on GNSS for global positioning of a ground vehicle. Using LiDAR and camera imagery, ATLIS creates a vehicle-centric, orthorectified image that is compared to an a priori satellite image using template matching, producing a global position estimate for the vehicle. We develop a method to estimate the uncertainty of this position estimate, enabling fusion with other relative positioning sensors. We demonstrate the effectiveness of ATLIS in aiding localization of a UGV in a complex outdoor environment, achieving an average planar Euclidean distance error of 1.21 m over a 5.1 km run when compared to a GPS ground truth.
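To make the template-matching step concrete, the following is a minimal sketch of how a vehicle-centric ortho image can be matched against an a priori satellite tile using zero-mean normalized cross-correlation, with the best pixel offset mapped to metric east/north coordinates. All names, the correlation metric, and the coordinate conventions here are illustrative assumptions, not the actual ATLIS implementation.

```python
import numpy as np

def template_match_position(sat, tmpl, origin_en, gsd):
    """Slide the vehicle-centric ortho image `tmpl` over the satellite
    image `sat`, score each window with zero-mean normalized
    cross-correlation, and convert the best match to (east, north).

    origin_en : (east, north) of sat's top-left pixel (assumed known)
    gsd       : ground sample distance in metres per pixel
    """
    th, tw = tmpl.shape
    sh, sw = sat.shape
    t = tmpl - tmpl.mean()
    t_norm = np.sqrt((t ** 2).sum())

    best_score, best_rc = -np.inf, (0, 0)
    for r in range(sh - th + 1):          # exhaustive search over offsets
        for c in range(sw - tw + 1):
            w = sat[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = t_norm * np.sqrt((wz ** 2).sum())
            if denom == 0:
                continue                  # flat window, undefined score
            score = (t * wz).sum() / denom
            if score > best_score:
                best_score, best_rc = score, (r, c)

    r, c = best_rc
    # Centre of the matched window, mapped to metric coordinates;
    # image rows grow southward, hence the minus sign on north.
    east = origin_en[0] + (c + tw / 2.0) * gsd
    north = origin_en[1] - (r + th / 2.0) * gsd
    return (east, north), best_score
```

A production system would replace the exhaustive double loop with an FFT-based correlation, and the shape of the correlation surface around the peak is one natural input to the kind of uncertainty estimate described above.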