{"id":2539,"date":"2023-02-22T12:31:00","date_gmt":"2023-02-22T11:31:00","guid":{"rendered":"https:\/\/reach.ircam.fr\/?p=2539"},"modified":"2024-12-10T12:33:59","modified_gmt":"2024-12-10T11:33:59","slug":"somax-2-a-reactive-multi-agent-environment-for-co-improvisation","status":"publish","type":"post","link":"https:\/\/reach.ircam.fr\/index.php\/2023\/02\/22\/somax-2-a-reactive-multi-agent-environment-for-co-improvisation\/","title":{"rendered":"Somax 2, a Reactive Multi-Agent Environment for Co-Improvisation"},"content":{"rendered":"\n<p>Joakim Borg, G\u00e9rard Assayag, Mikhail Malt<\/p>\n\n\n\n<p><em>SMC 2022 &#8211; Sound Music &amp; Computing<\/em>, Jun 2022, Saint-\u00c9tienne, France. Proceedings of the 19th Sound and Music Computing Conference, 2022<\/p>\n\n\n\n<p><a href=\"https:\/\/hal.science\/hal-04001271v1\/file\/SMC_Demo_poster_Somax_006.graffle.pdf\">Read full publication.<\/a><\/p>\n\n\n\n<p><strong>Abstract<\/strong>: Somax 2 is a multi-agent interactive system performing live machine co-improvisation with musicians, based on machine listening, machine learning, and generative units. The current version ([Borg 2020], [Borg 2021a], [Borg 2021b]) is a recent evolution, with improved algorithms, of the former Somax version and of previous work in the RepMus team ([Bonasse-Gahot 2012], [Bonasse-Gahot 2014], [Carsault 2017], [Carsault 2019], [Carsault 2020], [Carsault 2021]). Agents provide stylistically coherent improvisations based on learned musical knowledge while continuously listening to and adapting to input from musicians or other agents in real time. The system is trained on any musical material chosen by the user, effectively constructing a generative model (called a corpus) from which it draws its musical knowledge and improvisation skills. Corpora, inputs, and outputs can be MIDI as well as audio, and inputs can be live or streamed from MIDI or audio files. 
Somax\u00a02 is one of the improvisation systems descended from the well-known OMax software, presented here in an entirely new implementation. As such, it shares with its siblings the general loop [listen\/learn\/model\/generate], using a form of statistical modeling that builds a highly organized memory structure through which the system can navigate to create new musical organizations while preserving stylistic coherence, rather than generating unheard sounds as other ML systems do. However, Somax 2 adds a wholly new versatility: it is highly reactive to the musicians' decisions, and its creative agents communicate and work together in the same way, thanks to cognitively inspired interaction strategies and a finely optimized concurrent architecture that let all its units cooperate smoothly.<\/p>\n\n\n\n<p><a href=\"https:\/\/hal.science\/hal-04001271v1\/file\/SMC_Demo_poster_Somax_006.graffle.pdf\">Read full publication.<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Joakim Borg, G\u00e9rard Assayag, Mikhail Malt SMC 2022 &#8211; Sound Music &amp; Computing, Jun 2022, Saint-\u00c9tienne, France. Proceedings of the 19th Sound and Music Computing Conference, 2022 Read full publication. Abstract: Somax 2 is a multi-agent interactive system performing live machine co-improvisation with musicians, based on machine listening, machine learning, and generative units. 
The current version ([Borg [&hellip;]<\/p>\n","protected":false},"author":3,"featured_media":2540,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[73],"tags":[],"class_list":["post-2539","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-software"],"aioseo_notices":[],"blog_post_layout_featured_media_urls":{"thumbnail":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2024\/12\/Screenshot-2024-12-10-123304-150x150.png",150,150,true],"full":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2024\/12\/Screenshot-2024-12-10-123304.png",1337,590,false]},"categories_names":{"73":{"name":"Software","link":"https:\/\/reach.ircam.fr\/index.php\/category\/research\/software\/"}},"tags_names":[],"comments_number":"0","wpmagazine_modules_lite_featured_media_urls":{"thumbnail":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2024\/12\/Screenshot-2024-12-10-123304-150x150.png",150,150,true],"cvmm-medium":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2024\/12\/Screenshot-2024-12-10-123304-300x300.png",300,300,true],"cvmm-medium-plus":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2024\/12\/Screenshot-2024-12-10-123304-305x207.png",305,207,true],"cvmm-portrait":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2024\/12\/Screenshot-2024-12-10-123304-400x590.png",400,590,true],"cvmm-medium-square":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2024\/12\/Screenshot-2024-12-10-123304-600x590.png",600,590,true],"cvmm-large":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2024\/12\/Screenshot-2024-12-10-123304-1024x590.png",1024,590,true],"cvmm-small":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2024\/12\/Screenshot-2024-12-10-123304-130x95.png",130,95,true],"full":["https:\/\/reach.ircam.fr\/wp-content\
/uploads\/2024\/12\/Screenshot-2024-12-10-123304.png",1337,590,false]},"_links":{"self":[{"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/posts\/2539","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/comments?post=2539"}],"version-history":[{"count":1,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/posts\/2539\/revisions"}],"predecessor-version":[{"id":2541,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/posts\/2539\/revisions\/2541"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/media\/2540"}],"wp:attachment":[{"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/media?parent=2539"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/categories?post=2539"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/tags?post=2539"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}