Human-Machine Teaming in Music: Anchored Narrative-Graph Visualization and Machine Learning

2020 
In the traditional music analysis process, stylistic rules usually have to be deduced directly from examples of compositions or past performances. In such cases, musicians create external representations of a music style domain as a source of reflection, inspiration, and collaboration. However, given the large number of music examples, creating such representations is essential but can also be slow and costly. In this paper, we show that interactive visualization and machine learning can support and enhance musician cognition and team-based collaboration. Specifically, we propose an approach that: (1) allows musicians to visually externalize their evolving mental models of a music domain as thematically organized anchored pairs, i.e., (narrative, graph) pairs, each corresponding to a specific music pattern; and (2) uses these pairs to develop a machine-learning-based music style classification system that supports musicians during their activities (composition, performance). To this end, we introduce a novel graph representation of music stylistic patterns and discuss the advantages of linking such a representation to machine learning. A preliminary study involving 10 musicians yielded overall positive feedback about the effectiveness of our approach, as well as further directions to explore.
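The paper does not specify an implementation, but the pipeline it describes can be sketched minimally: a musician-authored (narrative, graph) pair per music pattern, a feature vector derived from each graph, and a classifier trained on those features. The `AnchoredPair` structure, the toy graph features, and the nearest-centroid classifier below are all illustrative assumptions, not the authors' actual representation or model.

```python
from dataclasses import dataclass, field

@dataclass
class AnchoredPair:
    """Hypothetical (narrative, graph) pair for one music stylistic pattern."""
    narrative: str                                 # musician's textual description
    nodes: list = field(default_factory=list)      # e.g. pitch classes or chords
    edges: list = field(default_factory=list)      # transitions between nodes
    style: str = ""                                # style label supplied by the musician

def graph_features(pair):
    """Toy feature vector: node count, edge count, average out-degree."""
    n, e = len(pair.nodes), len(pair.edges)
    return (float(n), float(e), e / n if n else 0.0)

def train_centroids(pairs):
    """Stand-in for the ML step: average feature vector per style label."""
    sums, counts = {}, {}
    for p in pairs:
        s = sums.setdefault(p.style, [0.0, 0.0, 0.0])
        for i, v in enumerate(graph_features(p)):
            s[i] += v
        counts[p.style] = counts.get(p.style, 0) + 1
    return {k: tuple(v / counts[k] for v in s) for k, s in sums.items()}

def classify(pair, centroids):
    """Assign the style whose centroid is nearest (squared Euclidean distance)."""
    f = graph_features(pair)
    return min(centroids,
               key=lambda k: sum((a - b) ** 2 for a, b in zip(f, centroids[k])))
```

A real system would use a richer graph representation and learner; the point of the sketch is only the data flow from externalized pairs to a style classifier that can then feed back into composition or performance.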