Artistic creation and technology | Virtual Reality, Augmented Reality, interactive design, real-time performance

Friday 18 October
12:00 - 14:00

Free admission, open to all.

Hosted by the artists of AGORA 2019

Live game-audio in Japan – work in progress – by Ricardo Climent

Since the 1970s, Japanese digital culture has consistently been among the first to adopt and rapidly develop the concept of the Digital Human, allowing embodiment to transcend the boundary between the Virtual and the Real, and vice versa. Examples are found in its videogame culture, J-pop bands, digital companions and robotics, and e-mascots, but also in traditional Noh theatre plays. The talk will present Climent’s new interactive composition for vocals and Virtual Reality singer, informed by the concept of co-presence, navigating existing literature and theatre tradition to experience live immersive performance and to explore Anthropomorphism and Technomorphism in Japan in 2019.

SonicMaps and Maia – merging location-aware devices with AR/VR tools? – by Ignacio Pecino

Ignacio Pecino, EASTN-DC Artist in Residence at NOVARS, will discuss his EASTN-DC work in progress, including the latest implementation of his SonicMaps geolocative tool, presented at the RMA Conference in Manchester (11–13 September 2019), alongside his Maia AR/VR tool, presented at EASTN-CCRMA, where he created a hyper-real virtual replica (1:1 scale) of one of CCRMA’s stage rooms.
He will discuss how Maia can construct 3D parallel worlds, where user-generated content can be accurately localised, persistent, and automatically synchronised between both worlds—the real and its virtual counterpart. The presentation will also examine how Maia can dissolve the boundaries between physical and digital worlds, facilitating creative activities and social interactions in selected locations regardless of the ontological approach. The talk will conclude with some potential avenues for merging Maia with SonicMaps, and with how a collective experience could be staged with EASTN-DC participants.

Project Presentation “IDE Fantasy” – by Iannis Zannos

The presentation shows techniques for interactively controlling sound and graphics synthesis using movement sensors based on IMUs (Inertial Measurement Units) to measure the movement of a performer. It is part of a work in progress whose aim is to realize a series of interactive dance performances involving motion sensors and to explore the possibilities of telematic performance through concurrent performances in remote locations. To enable remote interconnection, the movements of the dancers are captured by wearable motion sensors and broadcast in real time over the internet to all performance locations. Sound and graphics are synthesized locally at each location on the basis of the motion data, so the actions of remote dancers are perceived locally and indirectly through the characteristics of the sounds and graphics. The challenge is to enable coordination between the dancers and to make each dancer’s actions perceptible through the characteristics of the sound and graphics designed.

In 2018 and 2019 a sensor system was developed, and several rehearsals as well as public performances were realized. For the next stages of the project it is proposed to perform pieces in venues of several partners of the EASTN-DC project, as well as other partners. The presentation will show how motion data are transmitted to the computer over Wi-Fi, and how live coding is used to control the generation of sound in SuperCollider and graphics in the Godot game engine. Techniques for sending the control data to remote locations over the internet are also shown; this enables joint performance from several remote locations, where at each location the data sent from all performers is used to synthesize the audiovisual performance.
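Sensor data of this kind is typically carried over the network as OSC (Open Sound Control) messages, which SuperCollider receives natively. The sketch below is a minimal illustration of that transport layer only, not the project’s actual code; the address `/imu/1` and the yaw/pitch/roll argument layout are assumptions for the example:

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as the OSC 1.0 spec requires."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def encode_imu_message(address: str, yaw: float, pitch: float, roll: float) -> bytes:
    """Encode one IMU reading as an OSC message with three float32 arguments."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(b",fff")                        # type-tag string: three floats
    msg += struct.pack(">fff", yaw, pitch, roll)   # arguments are big-endian float32
    return msg

def broadcast_reading(sock, targets, yaw, pitch, roll, address="/imu/1"):
    """Fan the encoded reading out over UDP to every performance location."""
    packet = encode_imu_message(address, yaw, pitch, roll)
    for host, port in targets:
        sock.sendto(packet, (host, port))

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Hypothetical targets: a local SuperCollider server and one remote venue.
    broadcast_reading(sock, [("127.0.0.1", 57120)], yaw=0.1, pitch=-0.4, roll=1.2)
```

Each location would run a matching OSC listener (e.g. `OSCdef` in SuperCollider) mapping the incoming values to synthesis parameters.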