Analog Lunar Robotic Site Survey at Haughton Crater

2007 
Overview: The "Human-Robot Site Survey" (HRSS) project is a multi-year activity investigating techniques for lunar site survey [1]. The system we are developing coordinates humans and multiple robots in a variety of team configurations and control modes in order to perform comprehensive surface surveys. Site survey involves producing high-quality, detailed maps, including 3D surface models, mineralogy, subsurface stratigraphy, etc. These maps are required for scientific understanding, site planning and operations, and in-situ resource utilization. In July 2007, two K10 rovers (Figs. 1 and 2) operated at Haughton Crater on Devon Island, Nunavut, Canada, autonomously surveying multiple lunar analog sites with terrain and subsurface mapping sensors. Operations were designed to simulate a near-term lunar mission, including remote sensing data, operations tools, proximity and remote operations back rooms, and limited-bandwidth data communications.

Approach: Our approach is to develop and validate system-level concepts for comprehensive site survey in a variety of terrain and over a range of scales. We are developing methods that combine information from orbital imagery with surface activity by rovers equipped with survey instruments. Our work addresses two key topics: techniques for robots to perform effective survey, and techniques to enable effective human-robot interaction in varied configurations. With our approach, robotic survey tasks can be coordinated from ground control or from inside surface habitats (or vehicles). A typical scenario involves multiple survey robots mapping a region for resources while human operators assess information from the rovers and provide physical and cognitive intervention. Coordination and dialogue between ground control, crew (EVA and IVA), and mobile robots use peer-to-peer human-robot interaction [1], [5]. During robotic surveying, software components run off-board (on ground stations) and on-board multiple survey robots. A traversability map is processed by a coverage planner, which computes survey points. A central executive coordinates task assignment and monitors execution. Acquired data is routed to a database for post-processing and analysis. Rover activity monitoring and interaction are provided by the Viz user interface [6], the Ensemble ground systems software tools [7], and Google Earth.

Sensors and mapping: The two K10 robots are identical except for their survey instruments. K10 Red (Fig. 1) carries an Optech ILRIS-3D scanning lidar, which provides mm-accurate 3D (x, y, z) points over a 40° x 40° field of view. For full panoramas, the rover turns in place to acquire overlapping scans. Area coverage is achieved by driving to waypoints, acquiring panoramas, and then fusing the multiple scans into a topographic map. K10 Black (Fig. 2) carries the JPL CRUX ground-penetrating radar (GPR). The GPR operates at 800 MHz, measuring the subsurface with 10 cm resolution to a depth of 2.5 meters. Wide-area coverage is provided by driving North-South and East-West transects within the mapping area.

A priori data: For the July 2007 test, mission planning and context imaging was provided by the QuickBird satellite. QuickBird images provided 60 cm/pixel full-color coverage over an 8 km by 8 km area. Registration against hand-collected tie points provided sub-meter accuracy in UTM coordinates. We generated a multi-resolution KML overlay for Google Earth, and some local area image tiles were imported into Viz as a context basemap for 3D visualizations.
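A KML overlay of the kind mentioned above can be produced in many ways; the following is a minimal sketch, not the project's actual tooling, of writing a single-tile Google Earth GroundOverlay for a georeferenced image. The file names and latitude/longitude bounds are placeholder values, not the real QuickBird products or site coordinates.

# Minimal sketch (assumption: one image tile with known WGS84 bounds).
# Writes a Google Earth KML GroundOverlay that drapes the tile on the terrain.

def write_ground_overlay(image_href, north, south, east, west, out_path):
    """Write a single-tile KML GroundOverlay; bounds are decimal degrees."""
    kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <GroundOverlay>
    <name>Survey area basemap</name>
    <Icon><href>{image_href}</href></Icon>
    <LatLonBox>
      <north>{north}</north>
      <south>{south}</south>
      <east>{east}</east>
      <west>{west}</west>
    </LatLonBox>
  </GroundOverlay>
</kml>
"""
    with open(out_path, "w") as f:
        f.write(kml)

if __name__ == "__main__":
    # Placeholder bounds roughly in the Haughton Crater region; illustrative only.
    write_ground_overlay("quickbird_tile_0_0.png",
                         north=75.45, south=75.35, east=-89.55, west=-89.85,
                         out_path="haughton_overlay.kml")

A multi-resolution overlay would repeat this per tile and nest the tiles using KML Region/Lod elements so that finer tiles load only when the viewer zooms in.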
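Similarly, for the lidar mapping described under Sensors and mapping, one simple way to fuse overlapping panorama scans into a topographic map is to grid the registered (x, y, z) points and keep a per-cell elevation statistic. The sketch below is not the project's mapping pipeline; it assumes the scans are already registered into a common site frame, and the cell size and synthetic points are illustrative values only.

# Minimal sketch: bin registered lidar points into a regular grid and take the
# mean elevation per cell, producing a simple digital elevation model (DEM).
import numpy as np

def grid_points_to_dem(points, cell_size=0.25):
    """points: (N, 3) array of x, y, z in meters, all in one site frame."""
    xy = points[:, :2]
    z = points[:, 2]
    origin = xy.min(axis=0)                      # grid origin at min x, y
    ij = np.floor((xy - origin) / cell_size).astype(int)
    shape = ij.max(axis=0) + 1
    dem_sum = np.zeros(shape)
    dem_cnt = np.zeros(shape)
    np.add.at(dem_sum, (ij[:, 0], ij[:, 1]), z)  # accumulate z per cell
    np.add.at(dem_cnt, (ij[:, 0], ij[:, 1]), 1)  # count points per cell
    with np.errstate(invalid="ignore"):
        dem = np.where(dem_cnt > 0, dem_sum / dem_cnt, np.nan)
    return dem, origin

if __name__ == "__main__":
    pts = np.random.rand(10000, 3) * [40.0, 40.0, 2.0]   # synthetic scan points
    dem, origin = grid_points_to_dem(pts, cell_size=0.5)
    print(dem.shape, origin)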
Mission planning: GPR coverage plans were generated automatically using a Boustrophedon decomposition [10] of designated mapping areas. The input to …

Fig. 1. NASA Ames K10 Red rover with the Optech ILRIS-3D Lidar operating at Haughton Crater.
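For a single rectangular cell of such a decomposition, the coverage plan reduces to a back-and-forth ("lawnmower") sweep; running the sweep in both orientations yields crossed North-South and East-West transects like those driven for the GPR. The sketch below illustrates that idea only, with placeholder cell dimensions and line spacing rather than actual mission parameters.

# Minimal sketch: lawnmower transect waypoints over one rectangular cell.
# Sweeps east-west lines, stepping north by a fixed line spacing.

def lawnmower_waypoints(x_min, y_min, x_max, y_max, spacing):
    """Return a list of (x, y) waypoints covering the cell in a boustrophedon sweep."""
    n_lines = int((y_max - y_min) / spacing) + 1
    waypoints = []
    for i in range(n_lines):
        y = y_min + i * spacing
        if i % 2 == 0:
            waypoints += [(x_min, y), (x_max, y)]   # drive left to right
        else:
            waypoints += [(x_max, y), (x_min, y)]   # drive right to left
    return waypoints

if __name__ == "__main__":
    # 50 m x 50 m cell with 5 m line spacing (illustrative values only)
    for wp in lawnmower_waypoints(0.0, 0.0, 50.0, 50.0, 5.0):
        print(wp)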