One-Shot Imitation Drone Filming of Human Motion Videos

2021 
Imitation learning has recently been applied to mimic the operation of a human cameraman in autonomous camera systems. To imitate different filming styles, existing methods have to train multiple independent models, each of which requires a significant number of training samples to learn one specific style. In this paper, we propose a framework that can imitate a filming style after seeing only a single demonstration video of the target style, i.e., one-shot imitation filming. This is achieved by two key enabling techniques: 1) filming style feature extraction, which encodes the sequential cinematic characteristics of a variable-length video clip into a fixed-length feature vector, and 2) camera motion prediction, which dynamically plans the camera trajectory to reproduce the filming style of the demo video. We implement the approach with a deep neural network and deploy it on a 6-degrees-of-freedom (DOF) drone system by first predicting the future camera motions and then converting them into the drone's control commands via an odometer. Experimental results on comprehensive datasets and showcases demonstrate that the proposed approach significantly outperforms conventional baselines and can mimic footage of an unseen style with high fidelity.
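
The two enabling techniques map naturally onto a two-stage network: a recurrent encoder that compresses a variable-length demo clip into a fixed-length style embedding, and a predictor that combines that embedding with the current filming state to output the next camera motion. The PyTorch sketch below illustrates this structure only; it is not the authors' implementation, and the class names, feature dimensions, and layer sizes are all illustrative assumptions.

```python
# Hedged sketch of the one-shot imitation filming architecture described
# in the abstract. Layer sizes and the 16-dim per-frame cinematic
# features (e.g., subject position/size, relative camera pose) are
# assumptions, not the paper's specification.
import torch
import torch.nn as nn

class StyleEncoder(nn.Module):
    """Encodes a variable-length sequence of per-frame cinematic
    features into a fixed-length filming-style embedding."""
    def __init__(self, feat_dim=16, style_dim=64):
        super().__init__()
        self.gru = nn.GRU(feat_dim, style_dim, batch_first=True)

    def forward(self, clip_feats):            # (B, T, feat_dim), T varies
        _, h_n = self.gru(clip_feats)         # final hidden state
        return h_n.squeeze(0)                 # (B, style_dim), fixed length

class MotionPredictor(nn.Module):
    """Predicts the next 6-DOF camera motion from the style embedding
    and the current filming state."""
    def __init__(self, style_dim=64, state_dim=16, motion_dim=6):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(style_dim + state_dim, 128),
            nn.ReLU(),
            nn.Linear(128, motion_dim),
        )

    def forward(self, style, state):          # (B, style_dim), (B, state_dim)
        return self.mlp(torch.cat([style, state], dim=-1))

# Usage: embed one demo clip, then roll the predictor forward online.
encoder, predictor = StyleEncoder(), MotionPredictor()
demo = torch.randn(1, 120, 16)                # one 120-frame demo clip
style = encoder(demo)                         # fixed-length style vector
motion = predictor(style, torch.randn(1, 16)) # next 6-DOF camera motion
```

Because the encoder's output size is independent of the clip length, a single demonstration of any duration yields one style vector that conditions every subsequent motion prediction.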
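
The deployment step, converting predicted camera motions into drone control commands using odometer feedback, can be sketched as a simple pose-error controller. The proportional gain, pose layout, and function name below are assumptions for illustration, not the paper's controller.

```python
# Hedged sketch: closing the loop between a predicted 6-DOF camera pose
# and the drone's odometer-reported pose with a proportional controller.
# Gains and the [x, y, z, roll, pitch, yaw] layout are assumptions.
import numpy as np

def motion_to_command(predicted_pose, odom_pose, dt=0.1, k_p=0.8):
    """Return a 6-DOF velocity command that drives the odometer pose
    toward the predicted camera pose over one control interval dt."""
    error = np.asarray(predicted_pose, dtype=float) - np.asarray(odom_pose, dtype=float)
    # Wrap angular errors (roll, pitch, yaw) into [-pi, pi].
    error[3:] = (error[3:] + np.pi) % (2 * np.pi) - np.pi
    return k_p * error / dt
```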