Planning Science Data Return of Mars Express with Support of Artificial Intelligence

2006 
Mars Express (MEX) has been in orbit around Mars since the beginning of 2004. A complex science payload, including a camera, spectrometers, a particle detector, a radar and a transponder for communications with Mars landers, generates a large amount of science data that needs to be transferred to the ground. For MEX, a dedicated ESA ground station and a number of DSN stations are available to receive the data. As the mission is operated in a store-and-forward manner, all data are stored on board before being downlinked. Depending on the season, the downlink data rate varies between 28 kbps and 182 kbps. The generation of science data has to take into account: a) the downlink bit rate, b) the available downlink windows and c) the available mass-memory space on the spacecraft. The amount of data to be downlinked varies between 800 Mbit and more than 5 Gbit per day. The allocation of downlink windows is driven largely by the science requirements and changes from orbit to orbit, i.e. there are no dedicated data downlink slots allocated as a fixed daily pattern; downlink slots continuously vary in length and time of day. Data have to be dumped from the onboard mass memory before more data are written into it, to avoid loss of data. Consequently, the generation of a data downlink plan has been a challenge since the start of the mission, and it became obvious that a dedicated tool was needed to help resolve the conflicting requirements. An operational software system was produced, based on experience gained from early MEX operations and on an ESA study, referred to as MEXAR (Mars Express Scheduling Architecture), which led to a prototype demonstrating that artificial-intelligence techniques for planning and scheduling can be beneficially applied to a real space mission. The tool has been successfully deployed in the MEX mission planning process as part of a suite of tools supporting planning.
The major objectives of the tool are the automation of the data-dump planning process and the maximization of the amount of dumped data. The paper describes the operational requirements for a data-dump planning tool, its iterative implementation up to the final product, and the operational assessment in terms of reduced manpower effort and increased volume of dumped data.
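To make the planning problem concrete, the constraints named in the abstract (downlink rate, window allocation, onboard mass-memory capacity) can be sketched as a toy greedy scheduler. This is an illustrative simplification, not the MEXAR algorithm; the `Window` structure, the assumption of evenly spread data generation, and all numeric values below are hypothetical, chosen only to stay within the ranges quoted in the abstract.

```python
from dataclasses import dataclass

@dataclass
class Window:
    start_s: float    # window start, seconds from epoch (hypothetical unit)
    length_s: float   # window duration in seconds
    rate_bps: float   # downlink rate; 28e3..182e3 bps per the abstract

def plan_dumps(generated_bits_per_day, windows, memory_capacity_bits):
    """Greedy sketch: at each window, dump as much buffered data as the
    window's rate and length allow. Returns (bits dumped per window,
    True if the onboard memory never overflows)."""
    buffered = 0.0
    plan = []
    prev_end = 0.0
    ok = True
    for w in sorted(windows, key=lambda w: w.start_s):
        # assume science data accumulates evenly between windows
        buffered += generated_bits_per_day * (w.start_s - prev_end) / 86400.0
        if buffered > memory_capacity_bits:
            ok = False  # data would be overwritten and lost before this dump
        dump = min(buffered, w.rate_bps * w.length_s)
        plan.append(dump)
        buffered -= dump
        prev_end = w.start_s + w.length_s
    return plan, ok
```

For example, with 2 Gbit generated per day and two irregularly placed windows, the first (one hour at 100 kbps) is rate-limited to 360 Mbit, and the remainder carries over to the next window. Real tools like MEXAR must additionally handle per-instrument packet stores, priorities, and changing window allocations orbit by orbit.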