Unconstrained and constrained embodied interaction with audiovisual feedback during vocal performance
Conference contribution, posted on 23.09.2020, 10:42 by Balandino Di Donato
This paper presents work on unconstrained and constrained embodied interaction with live audiovisual processing parameters during singing. Building upon the concepts of affordance and embodiment and adopting a User-Centred Design approach, two case studies were realised. The first case study took place in a context where the performer was free to move and interact with the MyoSpat interactive system for live sound processing (unconstrained interaction); the second in a performative situation where the musician's movement was limited by a played instrument or interface (constrained interaction). The interaction design solution proposed in the two case studies was welcomed by the performers; its potential and limitations invited the exploration of new gesture-sound relationships.