Dungeons & Swimmers
Citations: 9 · References: 10
Abstract:
We propose Dungeons & Swimmers, an interactive audio- and motion-based exergame for swimming. As the first of its kind, it lets us explore the design considerations and opportunities that stem from swimming. We gamify the four stroke types with auditory feedback. For minimal interference, we develop a single-sensor wearable prototype that detects strokes and stroke types in real time. We conduct a pilot deployment to study initial user experiences.
Keywords:
Motion sensors
Auditory feedback
Sonification
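The abstract does not disclose how the wearable detects strokes; the following Python sketch illustrates one plausible single-sensor approach, counting thresholded peaks in accelerometer magnitude with a refractory period. The function name, threshold value, and synthetic data are illustrative assumptions, not the authors' published method.

# Illustrative sketch only: the paper does not publish its detection algorithm.
# Assumes a stream of 3-axis accelerometer samples from a single wrist-worn
# sensor; strokes are counted via thresholded peaks in acceleration magnitude.
import math
from typing import Iterable, List, Tuple

def detect_strokes(samples: Iterable[Tuple[float, float, float]],
                   threshold: float = 1.8, refractory: int = 20) -> List[int]:
    """Return sample indices at which a stroke is detected.

    samples    -- iterable of (ax, ay, az) in g
    threshold  -- magnitude (in g) a peak must exceed to count as a stroke
    refractory -- minimum number of samples between two detected strokes
    """
    stroke_indices = []
    last = -refractory
    for i, (ax, ay, az) in enumerate(samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold and i - last >= refractory:
            stroke_indices.append(i)
            last = i
    return stroke_indices

if __name__ == "__main__":
    # Synthetic data: a burst every 50 samples mimics one stroke cycle.
    data = [(0.1, 0.0, 1.0)] * 200
    for k in range(0, 200, 50):
        data[k] = (1.5, 1.2, 1.0)
    print(detect_strokes(data))   # -> [0, 50, 100, 150]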
This paper introduces AcouMotion, a new hardware/software system that combines human body motion, tangible interfaces, and sonification into a closed-loop human-computer interface, allowing non-visual motor control by using sonification (non-speech auditory displays) as the major feedback channel. AcouMotion's main components are (i) a sensor device for measuring motion parameters, (ii) a computer simulation that represents the dynamical evolution of a model world, and (iii) a sonification engine that generates an auditory representation of objects and any interactions in the model world. The intended applications of AcouMotion range from new kinds of sport games that can be played without visual displays, and therefore may be particularly interesting for people with visual impairment, to further applications in data mining, physiotherapy, and cognitive research. The first application of AcouMotion presented in this paper is Blindminton, a sport game similar to badminton that is particularly adapted to the abilities of people with visual impairment. We describe our current system and its state of development, and we present first sound examples for interactive sonification using an early prototype. Finally, we discuss some interesting research directions based on the fact that AcouMotion binds auditory stimuli and body motion, and can thus serve as a counterpart to the eye tracker, which exploits the binding of visual stimuli and eye movement in cognitive research.
Here we provide sound examples and an interaction video to demonstrate our current prototype of AcouMotion, which is being developed to enable the sport game Blindminton for people with visual impairment.
This is a video of our current one-player version of Blindminton, which might be called Blindhit. Sound synthesis is implemented in SuperCollider.
Video showing some Blindhit interactions.
Sonification
Auditory feedback
Interface (matter)
Representation
Input device
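The three-component architecture described above (sensor device, model-world simulation, sonification engine) amounts to a closed loop. The Python sketch below shows that structure only; the class names, the falling-ball model, and the distance-to-pitch mapping are illustrative assumptions and do not reproduce AcouMotion's actual SuperCollider-based implementation.

# Structural sketch of the closed loop: sensor -> model world -> sonification.
import random

class Sensor:
    def read(self) -> float:
        """Stand-in for a measured motion parameter, e.g. racket height in metres."""
        return random.uniform(0.0, 2.0)

class ModelWorld:
    def __init__(self):
        self.ball_height = 1.0
    def step(self, racket_height: float) -> float:
        """Advance the simulated world; return the racket-to-ball distance."""
        self.ball_height = max(0.0, self.ball_height - 0.05)  # the ball falls
        return abs(self.ball_height - racket_height)

class Sonifier:
    def render(self, distance: float) -> None:
        """Map distance to a pitch value; a real engine would synthesize it."""
        pitch_hz = 880.0 / (1.0 + distance)  # closer -> higher pitch
        print(f"distance={distance:.2f} m -> pitch={pitch_hz:.0f} Hz")

if __name__ == "__main__":
    sensor, world, sonifier = Sensor(), ModelWorld(), Sonifier()
    for _ in range(10):                       # ten iterations of the control loop
        sonifier.render(world.step(sensor.read()))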
Auditory feedback is becoming increasingly popular in sports, providing opportunities for monitoring and gait (re)training in ecological environments. We present the design process of a sonification strategy for modifying running parameters. The sonification provides real-time feedback on performance by introducing distortion into a baseline music track. The music BPM is continuously matched to the runner's cadence. The noise-based continuous feedback significantly altered the mean running cadence in a non-instructed and non-disturbing way, and performed better than standard verbal instructions. Although some participants did not respond effectively to the feedback, a large majority rated the feedback system positively in terms of pleasantness and motivation.
Sonification
Cadence
Biofeedback
Auditory feedback
Retraining
Audio feedback
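As a rough illustration of the feedback principle described above (more distortion the further the cadence drifts from the music tempo), here is a minimal Python sketch. The tolerance, the linear ramp, and the 10 spm saturation point are assumptions, not the authors' published parameters.

# Hedged sketch: noise level mixed into the music grows with the gap between
# measured and target cadence. All parameter values are assumptions.
def noise_gain(measured_spm: float, target_spm: float,
               tolerance: float = 2.0, max_gain: float = 1.0) -> float:
    """Return a 0..max_gain noise level to mix into the music track.

    measured_spm -- runner's cadence in steps per minute
    target_spm   -- desired cadence (the music BPM is matched to this)
    tolerance    -- deviation (spm) below which no noise is added
    """
    deviation = abs(measured_spm - target_spm)
    if deviation <= tolerance:
        return 0.0
    # Linear ramp: full distortion once the error reaches 10 spm beyond tolerance.
    return min(max_gain, (deviation - tolerance) / 10.0 * max_gain)

if __name__ == "__main__":
    for cadence in (168, 172, 175, 182):
        print(cadence, "spm ->", round(noise_gain(cadence, 170), 2))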
Our sensorimotor system has developed a specific relationship between our actions and their sonic outcomes, which it interprets as auditory feedback. The development of motion sensing and audio technologies allows this relationship to be emphasized through interactive sonification of movement. We propose several experimental frameworks (visual, non-visual, tangible, virtual) to assess the contribution of sonification to sensorimotor control and learning in interactive systems. First, we show that the auditory system integrates dynamic auditory cues for online motor control, from either head or hand movements. Auditory representations of space and of the scene can be built from audio features and transformed into motor commands. The framework of a virtual sonic object illustrates that auditory-motor representations can shape exploratory movement features and allow for sensory substitution. Second, we measure that continuous auditory feedback significantly helps performance in a tracking task. Both error and task sonification can help performance but have different effects on learning. We also observe that sonification of the user's movement can increase the energy of the produced motion and prevent feedback dependency. Finally, we present the concept of a sound-oriented task, where the target is expressed as acoustic features to match. We show that motor adaptation can be driven by interactive audio cues alone. In this work, we highlight important guidelines for sonification design in auditory-motor coupling research, as well as applications through original setups we developed, such as perceptual and physical training and playful gesture-sound interactive scenarios for rehabilitation.
Sonification
Auditory feedback
Sensory Substitution
Auditory display
Audio feedback
Auditory scene analysis
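To make the distinction between error and task sonification concrete, the short Python sketch below contrasts the two mappings in a one-dimensional tracking task. Both mappings (error to pitch shift, target position to pitch) are illustrative choices, not those used in the cited experiments.

# Illustrative contrast between the two feedback schemes mentioned above.
def error_sonification(cursor: float, target: float) -> float:
    """Pitch deviation (Hz) grows with tracking error; silence when on target."""
    return 200.0 * abs(cursor - target)

def task_sonification(target: float) -> float:
    """Pitch encodes the target itself; the user matches it by ear."""
    return 220.0 + 440.0 * target          # target normalised to 0..1

if __name__ == "__main__":
    for cursor, target in [(0.2, 0.5), (0.45, 0.5), (0.5, 0.5)]:
        print(f"error feedback: {error_sonification(cursor, target):6.1f} Hz shift,"
              f"  task feedback: {task_sonification(target):5.1f} Hz")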
In this paper, a real-time interactive system for smile detection and sonification using surface electromyography (sEMG) signals is proposed. When a user smiles, a sound is played. The sEMG signal is mapped to pitch on a conventional scale, and the timbre is a synthetic sound that mimics bubbles. In a user test of smiling tasks, 14 participants used the system and were asked to produce smiles under three conditions: auditory feedback with sonification, visual feedback with a mirror, and no feedback. The impression of the system was evaluated through questionnaires and interviews with the participants. In addition, we analyzed the total amount of muscular activity and the temporal envelope patterns of the sEMG during smiling. The questionnaires and interviews showed that users felt that (1) the sonification system reflected their facial expressions well, and (2) the sonification system was enjoyable. The users also reported that it was easier to smile in the auditory feedback condition than in the visual feedback or no-feedback conditions. However, the analysis of sEMG did not reveal a quantitative difference among the three conditions, most likely due to the experimental design, which lacked socially engaging settings.
Sonification
Facial electromyography
Timbre
Biofeedback
Auditory feedback
Audio feedback
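The abstract states that the sEMG signal is mapped to pitch on a conventional scale; the sketch below shows one way such a mapping stage could look in Python, quantising a normalised sEMG amplitude onto a C-major scale. The scale choice, base frequency, and range are assumptions, and the envelope-following and bubble-like synthesis stages are omitted.

# Minimal sketch of the mapping stage only (no synthesis).
MAJOR_SCALE_SEMITONES = [0, 2, 4, 5, 7, 9, 11, 12]   # one octave of a major scale

def semg_to_pitch(semg_norm: float, base_hz: float = 261.63) -> float:
    """Map a 0..1 smoothed sEMG amplitude to a frequency on a C-major scale."""
    semg_norm = min(1.0, max(0.0, semg_norm))
    degree = MAJOR_SCALE_SEMITONES[round(semg_norm * (len(MAJOR_SCALE_SEMITONES) - 1))]
    return base_hz * 2 ** (degree / 12.0)

if __name__ == "__main__":
    for level in (0.0, 0.3, 0.6, 1.0):
        print(f"sEMG {level:.1f} -> {semg_to_pitch(level):.1f} Hz")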
This paper introduces a new hardware/software system for the interactive sonification of sports movement involving arm and leg movements. Two different sonifications are designed to convey rhythmical patterns that become an auditory gestalt, so that listeners can identify features of the underlying coordinated movement. The sonification is designed to enable visually impaired users to participate in aerobics exercises, and also to enhance the perception of movements for sighted participants, which is useful, for instance, if the scene is occluded or the head posture is incompatible with observing the instructor or fitness professional who demonstrates the exercises in parallel. Furthermore, the system allows monitoring of fine couplings in arm/leg coordination while jogging, as auditory feedback may help stabilize the movement pattern. We present the sensing system, two sonification designs, and interaction examples that lead to coordination-specific sound gestalts. Finally, some qualitative observations are reported from the first uses of the prototype.
Sonification
Auditory feedback
Auditory display
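As a minimal illustration of how limb events can form rhythmical patterns, the Python sketch below merges timestamped arm and leg events into a time-ordered list of tones, so that phase alignment between the limbs becomes audible as rhythmic regularity. Event timings and frequency assignments are assumptions for demonstration only.

# Illustrative sketch: each limb event triggers a short tone; coordination
# shows up as the regularity of the merged rhythm.
def limb_events_to_score(arm_times, leg_times,
                         arm_hz: float = 660.0, leg_hz: float = 330.0):
    """Merge timestamped arm/leg events into a time-ordered list of (time, pitch)."""
    score = [(t, arm_hz) for t in arm_times] + [(t, leg_hz) for t in leg_times]
    return sorted(score)

if __name__ == "__main__":
    arms = [0.0, 0.5, 1.0, 1.5]        # well coordinated: arms on the beat
    legs = [0.25, 0.75, 1.25, 1.75]    # legs on the off-beat
    for t, hz in limb_events_to_score(arms, legs):
        print(f"t={t:.2f}s  tone {hz:.0f} Hz")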
To date, little attention has been devoted by the research community to applications of the Internet of Things (IoT) paradigm in the field of interactive sonification. The IoT has the potential to facilitate the emergence of novel forms of interactive sonification that result from shared control of the sonification system by both the user performing gestures locally at the system itself and one or more remote users. This can, for instance, impact therapies based on auditory feedback, where control of the sound generation may be shared by patients and remotely connected doctors. This paper describes a prototype of connected shoes for interactive sonification that can be remotely controlled and can collect data about the gait of a walker. The system primarily targets clinical applications where sound stimuli are used to help guide and improve the walking actions of patients with motor impairments.
Sonification
Auditory feedback
Auditory display
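The data flow implied above (the shoe node streams gait data while accepting sonification parameters from a remote user such as a clinician) is sketched below in Python. The paper does not specify a transport, message format, or parameter set; the UDP/JSON link, the address, and the field names are assumptions made purely for illustration.

# Sketch of the data flow only, not the cited prototype's protocol.
import json
import socket

REMOTE = ("127.0.0.1", 9999)   # assumed address of the remote controller

def send_gait_event(sock: socket.socket, step_interval_s: float) -> None:
    """Publish one gait event (time between heel strikes) to the remote side."""
    sock.sendto(json.dumps({"step_interval_s": step_interval_s}).encode(), REMOTE)

def apply_remote_params(raw: bytes, current: dict) -> dict:
    """Merge sonification parameters (e.g. target tempo, volume) sent remotely."""
    current.update(json.loads(raw.decode()))
    return current

if __name__ == "__main__":
    params = {"target_tempo_bpm": 100, "volume": 0.8}
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_gait_event(sock, 0.62)                         # one detected step
    params = apply_remote_params(b'{"target_tempo_bpm": 95}', params)
    print(params)                                       # remote update applied
    sock.close()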
In this paper we examine a wearable sonification and visualisation display that uses physical analogue visualisation and digital sonification to convey feedback about the wearer's activity and environment. The display is intended to bridge a gap between art aesthetics, fashionable technologies, and informative physical computing; the user experience evaluation reveals the wearers' responses to, and understanding of, a novel medium for wearable expression. The study yields useful insights for wearable device design in general and for future iterations of this sonification and visualisation display.
Sonification
Bridge (graph theory)
Wearable Technology
Physical computing
This paper addresses the development and preliminary user testing of the Music Balance Board, an auditory force-plate feedback tool for weight-shift training in patients with impaired balance function. This newly developed system provides auditory feedback based on real-time sonification of weight distribution. In an exploratory study, twelve patients after ischemic stroke or brain trauma performed standing weight-shifting activities guided by the system. This study aimed at: (1) exploring the potential of interacting with a musical environment as a way to retrain keeping equilibrium while standing, (2) investigating usage strategies of the Music Balance Board, and (3) studying various sonification modes with different levels of complexity. A model involving associative, explorative, and anticipative sonification strategies was tested. The model supports exploring the use of auditory feedback to facilitate reinforcement learning in technology-assisted rehabilitation. Our results suggest that important requirements for designing sonification modes for balance training in people with brain damage are comprehensiveness, simplicity, attractiveness of the soundscape, and, most of all, musical pleasure.
Sonification
Auditory feedback
Soundscape
Audio feedback
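One possible mapping stage for force-plate sonification is sketched below in Python: the centre of pressure is computed from four corner load cells and mapped to stereo panning and pitch. This is a hedged illustration only; the actual Music Balance Board mappings (associative, explorative, and anticipative modes) are richer and are not reproduced here.

# Illustrative sketch: centre of pressure from four corner forces,
# then a simple left-right -> panning and forward-back -> pitch mapping.
def centre_of_pressure(fl: float, fr: float, bl: float, br: float):
    """Return (x, y) in -1..1 from front-left, front-right, back-left, back-right forces."""
    total = fl + fr + bl + br
    if total <= 0:
        return 0.0, 0.0
    x = ((fr + br) - (fl + bl)) / total    # +1 = weight fully on the right
    y = ((fl + fr) - (bl + br)) / total    # +1 = weight fully forward
    return x, y

def sonify(x: float, y: float):
    pan = x                                # -1 left .. +1 right
    pitch_hz = 440.0 * 2 ** (y * 0.5)      # leaning forward raises pitch up to half an octave
    return pan, pitch_hz

if __name__ == "__main__":
    x, y = centre_of_pressure(fl=200, fr=300, bl=150, br=350)  # weight shifted right
    print("COP:", (round(x, 2), round(y, 2)), "->", sonify(x, y))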