Enhanced concert experience using multimodal feedback from live performers

2017 
In this paper, we aim to enhance the interaction between performers and the audience in live idol performances. We propose a system that converts the movements of individual members of an idol group into vibrations, and their voices into light, on handheld devices held by the audience. Specifically, for each performer, the system acquires the magnitude of movement via an acceleration sensor attached to the right wrist and the magnitude of the singing voice via a microphone. The acquired data is then converted into motor vibrations and LED light. The audience-side receiving devices take the form of a pen light or a doll. A prototype system was built to record acceleration and voice magnitude data during our experiments with an idol group in Japan, in order to verify whether a performer's movements and singing voice could be measured correctly under real live performance conditions. We developed a program that presents the strength of one member's movements and singing voice as vibrations and light based on the recorded data. An experiment was then conducted with eight subjects who observed the performance. We found that seven of the eight subjects could correctly identify the performer corresponding to the vibrations and light from the device.
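To illustrate the kind of mapping the abstract describes, the following is a minimal sketch (not the authors' implementation) of converting one performer's sensor readings into actuator intensities on the handheld device. The function names, the gravity-subtracted magnitude, the scaling thresholds, and the 8-bit PWM output range are all assumptions made for illustration.

```python
# Hypothetical sketch: map one performer's accelerometer and microphone data
# to vibration-motor and LED intensities on the audience device.
# Assumptions: 3-axis acceleration in g, voice level as RMS amplitude in [0, 1],
# and actuators driven by 8-bit PWM duty cycles (0-255).

import math


def accel_magnitude(ax: float, ay: float, az: float) -> float:
    """Movement strength: acceleration magnitude with gravity (~1 g) removed."""
    return abs(math.sqrt(ax * ax + ay * ay + az * az) - 1.0)


def to_duty(value: float, max_value: float) -> int:
    """Clamp and scale a sensor value to an 8-bit PWM duty cycle."""
    ratio = max(0.0, min(value / max_value, 1.0))
    return int(round(ratio * 255))


def actuator_outputs(accel_sample, mic_rms, accel_max=3.0, mic_max=1.0):
    """Return (vibration_duty, led_duty) for one frame of performer data."""
    vibration = to_duty(accel_magnitude(*accel_sample), accel_max)  # movement -> vibration
    led = to_duty(mic_rms, mic_max)                                 # voice -> light
    return vibration, led


if __name__ == "__main__":
    # Example frame: a vigorous dance move while singing at moderate volume.
    print(actuator_outputs((0.4, -1.8, 1.2), 0.55))  # e.g. (102, 140)
```

In practice the scaling limits (here accel_max and mic_max) would need to be calibrated per performer so that each member's device channel spans a comparable perceptual range.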