Filter-Based Abstractions for Safe Planning of Partially Observable Dynamical Systems

2021 
We study planning problems for dynamical systems with uncertainty caused by measurement and process noise. Measurement noise causes limited observability of system states, and process noise causes uncertainty in the outcome of a given control. The problem is to find a controller that guarantees that the system reaches a desired goal state in finite time while avoiding obstacles, with at least some required probability. Due to the noise, this problem does not admit exact algorithmic or closed-form solutions in general. Our key contribution is a novel planning scheme that employs Kalman filtering as a state estimator to obtain a finite-state abstraction of the dynamical system, which we formalize as a Markov decision process (MDP). By extending this MDP with intervals of probabilities, we enhance the robustness of the model against numerical imprecision in approximating the transition probabilities. For this so-called interval MDP (iMDP), we employ state-of-the-art verification techniques to efficiently compute plans that maximize the probability of reaching goal states. We show the correctness of the abstraction and provide several optimizations that aim to balance the quality of the plan and the scalability of the approach. We demonstrate that our method is able to handle systems with a 6-dimensional state that result in iMDPs with tens of thousands of states and millions of transitions.
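The abstract's core idea of an interval MDP can be illustrated with a small sketch. The following is a hypothetical, minimal example (not the paper's implementation): robust value iteration on a tiny iMDP, where an adversary picks the worst-case distribution within each state-action pair's probability intervals, and the planner maximizes the resulting lower bound on the probability of reaching the goal. All state names, function names, and the example intervals are invented for illustration.

```python
# Illustrative sketch of robust value iteration on an interval MDP (iMDP).
# Each state-action pair maps successors to probability intervals [lo, hi];
# the adversary minimizes within the intervals, the planner maximizes over
# actions. This yields a lower bound on the goal-reaching probability.

def worst_case_expectation(intervals, values):
    """Adversarial expected value: assign as much mass as the upper bounds
    allow to low-value successors, respecting lower bounds and sum-to-one."""
    # Start every successor at its lower bound.
    probs = {s: lo for s, (lo, hi) in intervals.items()}
    remaining = 1.0 - sum(probs.values())
    # Distribute remaining mass to successors in ascending value order.
    for s in sorted(intervals, key=lambda s: values[s]):
        lo, hi = intervals[s]
        extra = min(hi - lo, remaining)
        probs[s] += extra
        remaining -= extra
    return sum(p * values[s] for s, p in probs.items())

def robust_value_iteration(transitions, goal, states, iters=100):
    """V[s] = max over actions of the worst-case expected value of V."""
    values = {s: (1.0 if s in goal else 0.0) for s in states}
    for _ in range(iters):
        new = dict(values)
        for s in states:
            if s in goal or s not in transitions:
                continue  # goal states are absorbing; others stay at 0
            new[s] = max(worst_case_expectation(iv, values)
                         for iv in transitions[s].values())
        values = new
    return values

# Tiny example: from s0, action 'a' reaches the goal with probability in
# [0.7, 0.8] and a crash state with probability in [0.2, 0.3].
transitions = {"s0": {"a": {"goal": (0.7, 0.8), "crash": (0.2, 0.3)}}}
V = robust_value_iteration(transitions, goal={"goal"},
                           states=["s0", "goal", "crash"])
print(V["s0"])  # worst case within the intervals: 0.7
```

In the paper's scheme, the interval bounds would come from the Kalman-filter-based abstraction of the continuous dynamics, and the robust optimization would be delegated to a verification tool rather than hand-rolled as above.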