
Somax2

Go to Somax2 project page at Ircam

Somax 2 is an application for musical improvisation and composition. It is implemented in Max and is based on a generative model that uses a process similar to concatenative synthesis to provide stylistically coherent improvisation while listening and adapting to a musician (or any other audio or MIDI source) in real time. The model operates in the symbolic domain and is trained on a musical corpus consisting of one or more MIDI files, from which it draws the material used for improvisation. It can be used with little configuration to interact autonomously with a musician, but it also allows manual control of its generative process, effectively letting the model serve as an instrument that can be played in its own right.
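To make the idea concrete, here is a deliberately simplified Python sketch of corpus-based symbolic generation in the spirit described above. It is not Somax2's actual code or API: the corpus is reduced to a list of MIDI pitches, and "listening" is reduced to matching the musician's last pitch class against indexed corpus positions.

```python
# Toy illustration of corpus-based symbolic generation (NOT Somax2's API).
# The "corpus" is a pre-analyzed sequence of MIDI pitches; on each influence
# the agent jumps to a corpus location whose content matches what it heard,
# then plays the continuation from that point, concatenative-synthesis style.

import random
from collections import defaultdict

class ToyAgent:
    def __init__(self, corpus_pitches):
        self.corpus = corpus_pitches
        # Index every corpus position by the pitch class found there,
        # so an incoming influence can be matched immediately.
        self.index = defaultdict(list)
        for i, pitch in enumerate(corpus_pitches):
            self.index[pitch % 12].append(i)
        self.pos = 0

    def influence(self, heard_pitch):
        """Listen: move the playback position to a corpus slice that
        shares the musician's current pitch class, if one exists."""
        candidates = self.index.get(heard_pitch % 12)
        if candidates:
            self.pos = random.choice(candidates)

    def generate(self):
        """Output the next event from the current corpus position."""
        out = self.corpus[self.pos % len(self.corpus)]
        self.pos += 1
        return out

# Usage: a tiny corpus (in practice extracted from MIDI files) and a
# stream of pitches "heard" from the live musician.
agent = ToyAgent([60, 62, 64, 65, 67, 69, 71, 72, 71, 69, 67, 65])
for heard in [64, 67, 60]:
    agent.influence(heard)
    print([agent.generate() for _ in range(3)])  # short stylistic response
```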

While the application can be used straight out of the box with little configuration (see Getting Started), it is also designed as a library, allowing the user to create custom models as well as set up networks of multiple models and sources that listen to and interact with each other, as in the sketch below.
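As a rough stand-in for such a network, and assuming the hypothetical ToyAgent class sketched above, two agents can be wired so that each one's output becomes the other's influence:

```python
# Hypothetical sketch (reusing ToyAgent from above): two agents in a loop,
# each treating the other's last output as its live influence.
a = ToyAgent([60, 64, 67, 72, 67, 64])   # corpus built on a C major triad
b = ToyAgent([62, 65, 69, 74, 69, 65])   # corpus built on a D minor triad

last_a, last_b = a.generate(), b.generate()
for step in range(8):
    a.influence(last_b)   # a listens to b
    b.influence(last_a)   # b listens to a
    last_a, last_b = a.generate(), b.generate()
    print(step, last_a, last_b)
```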

Somax 2 is an entirely new version of the legendary Somax reactive co-improvisation paradigm, designed in the Ircam Music Representation team but never publicly released until now. Written in Max and Python, it features a modular multithreaded implementation, multiple wirelessly interacting players, a new UI design with tutorials and documentation, and a number of new interaction parameters.

Showcasing the latest breakthroughs in responsive music-performance technology and creative AI, the Somax2 live generative environment is developed by researchers at IRCAM (Boulez's world-renowned institute for sound and music research and creation). Based on machine learning, cognitive modeling, and corpus-based generation, Somax2 is designed to provide real-time machine improvisation agents that can autonomously interact in a rich and creative way with musicians or with each other, endlessly unfolding their musical textures in a constant interplay of agreement, discord, and independence with the evolving and often unpredictable acoustic context.

Somax2 creates a new mixed musical reality by blending, in real time, the music produced by the ensemble on stage with the layers generated by the machine, through multiple feedback loops of reciprocal influence in which humans and artificial agents simultaneously listen to, learn from, and react to each other. Somax2 is an autonomous, self-guided system that can also be used as an instrument when its generation and interaction strategies are steered by a human musician.


Somax2 is a major outcome of the REACH (Raising Cocreativity in Cyber-Human Musicianship) European research project conducted at Ircam by Gérard Assayag, which explores the spontaneous emergence of joint behaviors in collective settings mixing humans and AI, leading to a rich and surprising coevolution of forms not reducible to their individual components, as often witnessed in the dynamics of natural complex systems. REACH has put forward the term "cocreativity" for these processes, asserting that creativity is not confined to an individual quality but is rather the outcome of complex interaction dynamics.
