Designing and Testing a Robotic Avatar for Space-to-Ground Teleoperation: the Developers' Insights

2020 
In late 2019, astronaut Luca Parmitano remotely controlled a rover equipped with a robotic manipulator from the ISS, performing geology tasks at a moon-analog site. One year and seven months later, in July 2021, he will control the same rover in a more realistic moon-analog environment: a field of volcanic rock and regolith on Mount Etna, Italy. These experiments constitute the Analog-1 campaign within the framework of ESA's METERON project. As payload developers, we want to create an interface that lets astronauts intuitively operate robotic systems on a planetary or lunar surface: how can we maximise task efficiency and the sense of immersion/transparency? At the same time, how can we minimise operator fatigue and physical and mental effort? And how do we do this within the constraints of human spaceflight, with upmass and software requirements, and with delayed, low-bandwidth, unreliable communications? We show how we created a telerobotic system featuring an intuitive graphical and haptic user interface, including a force-feedback device and a custom joystick, controlling a mobile robotic platform. The platform consisted of an all-terrain chassis and two 7-DOF robotic arms with torque sensing: one arm was mounted on the front of the rover and used for manipulation; the other was mounted on top and used to reposition a camera. With this system, the astronaut was fully in control of the robot while collecting rock samples; the only external input was the choice of geological samples, suggested by a ground team of scientists over voice loop and text messenger. Full, stable 6-DOF force feedback for the manipulation arm was provided via a sigma.7 haptic input device, meaning the astronaut could, for the first time from space, feel not only full-DOF contact with the planetary surface but also the weight of the rocks they grasped.
System status was presented visually and intuitively on the user interface, which ran on a laptop on board the ISS, alongside views from two cameras. During development we continuously integrated requirements from various stakeholders, as well as feedback from astronauts and astronaut trainers, to improve the user interface. The analog tests delivered valuable insights into how to design a telepresence system for controlling robots on a planetary surface from orbit. We expect these insights to be useful for the future development of teleoperated planetary robotics, as well as for terrestrial applications in similar scenarios.
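The abstract does not describe the control architecture in detail. As a minimal illustrative sketch only (all class names, constants, and the 1-DOF simplification are assumptions, not the authors' implementation), a position-forward/force-back bilateral teleoperation loop over a delayed space-to-ground link can be modeled as two FIFO delay lines between the haptic device and the remote arm:

```python
from collections import deque

class DelayLine:
    """Fixed-latency FIFO modeling one direction of a delayed link (illustrative)."""
    def __init__(self, delay_steps, initial=0.0):
        # Pre-fill so the first `delay_steps` reads return the initial value.
        self.buf = deque([initial] * delay_steps)

    def send(self, value):
        """Enqueue the newest sample, return the sample from `delay_steps` ago."""
        out = self.buf.popleft()
        self.buf.append(value)
        return out

def simulate(steps=200, delay=20):
    """1-DOF sketch: delayed position command down, delayed contact force back."""
    downlink = DelayLine(delay)   # operator position command -> rover
    uplink = DelayLine(delay)     # rover contact force -> haptic device
    k_env = 500.0                 # N/m, assumed stiffness of the contacted surface
    surface = 0.05                # m, assumed surface position
    felt_forces = []
    for t in range(steps):
        x_master = 0.001 * t                  # operator slowly advances the handle
        x_cmd = downlink.send(x_master)       # command arrives at rover after delay
        penetration = max(0.0, x_cmd - surface)
        f_env = k_env * penetration           # spring contact model at remote arm
        f_feedback = uplink.send(f_env)       # force felt by operator, again delayed
        felt_forces.append(f_feedback)
    return felt_forces
```

Note that a naive loop like this becomes non-passive (and can go unstable) under delay against stiff contacts; real systems of this kind typically add a stabilizing layer such as a time-domain passivity controller, which this sketch omits for brevity.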