Hierarchical Processing Model Based on Multi-modality Interaction Design

2019 
Information warfare requires diverse operations and large volumes of information. Traditional human factors engineering design methods are prone to high human–computer interaction load and lack quantitative capability. This paper focuses on five modalities (voice, eye control, touch, brain control, and gesture) and, according to the man–machine–environment cognitive decision model and the hierarchical processing model for each modality in its applicable scenarios, builds a multi-modality interaction layered processing model to enhance the naturalness and friendliness of command interaction.
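As an illustration of how a scenario-dependent, layered dispatch across the five modalities might be organized, the Python sketch below is purely hypothetical: the paper does not publish code, and the three-layer split (perception, decision, execution), the class names, and the scenario priority table are assumptions introduced here for clarity, not the authors' published design.

```python
# Hypothetical sketch of a multi-modality layered processing pipeline.
# All names and the layer structure are assumptions, not from the paper.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Modality(Enum):
    VOICE = auto()
    EYE_CONTROL = auto()
    TOUCH = auto()
    BRAIN_CONTROL = auto()
    GESTURE = auto()


@dataclass
class ModalityInput:
    modality: Modality
    payload: str        # recognized command content, e.g. speech transcript
    confidence: float   # recognition confidence in [0, 1]


class LayeredProcessor:
    """Three assumed layers: perception -> cognition/decision -> execution."""

    # Assumed per-scenario modality preference ordering (illustrative only).
    SCENARIO_PRIORITY = {
        "hands_busy": [Modality.VOICE, Modality.EYE_CONTROL, Modality.BRAIN_CONTROL],
        "noisy":      [Modality.GESTURE, Modality.TOUCH, Modality.EYE_CONTROL],
        "default":    [Modality.TOUCH, Modality.VOICE, Modality.GESTURE],
    }

    def perceive(self, inputs: list[ModalityInput]) -> list[ModalityInput]:
        # Perception layer: discard low-confidence channels.
        return [i for i in inputs if i.confidence >= 0.5]

    def decide(self, inputs: list[ModalityInput], scenario: str) -> Optional[ModalityInput]:
        # Cognition/decision layer: pick the highest-priority modality for the scenario.
        priority = self.SCENARIO_PRIORITY.get(scenario, self.SCENARIO_PRIORITY["default"])
        ranked = sorted(
            inputs,
            key=lambda i: priority.index(i.modality) if i.modality in priority else len(priority),
        )
        return ranked[0] if ranked else None

    def execute(self, chosen: Optional[ModalityInput]) -> str:
        # Execution layer: issue the selected command (or do nothing).
        return f"EXECUTE[{chosen.modality.name}]: {chosen.payload}" if chosen else "NO-OP"


if __name__ == "__main__":
    processor = LayeredProcessor()
    observations = [
        ModalityInput(Modality.VOICE, "zoom to sector 4", 0.9),
        ModalityInput(Modality.GESTURE, "swipe left", 0.4),       # dropped by perception layer
        ModalityInput(Modality.EYE_CONTROL, "fixate target A", 0.8),
    ]
    filtered = processor.perceive(observations)
    decision = processor.decide(filtered, scenario="hands_busy")
    print(processor.execute(decision))
```

The design choice illustrated here is that scenario context, rather than a fixed global ranking, determines which modality's command wins, which is one plausible reading of the paper's "applicable scenarios" framing.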