Cocreative Interaction: Somax2 and the REACH Project – The Computer Music Journal

<p>Gérard Assayag, Laurent Bonnasse-Gahot, Joakim Borg; Cocreative Interaction: Somax2 and the REACH Project. <em>Computer Music Journal</em> 2022; 46 (4): 7–25. doi: <a href="https://doi.org/10.1162/comj_a_00662" target="_blank" rel="noreferrer noopener">https://doi.org/10.1162/comj_a_00662</a></p>

<p><a href="https://direct.mit.edu/comj/issue/46/4">https://direct.mit.edu/comj/issue/46/4</a></p>

<figure class="wp-block-image size-large"><img width="1024" height="577" src="https://reach.ircam.fr/wp-content/uploads/2024/04/Capture-decran-2024-04-20-a-17.53.16-1024x577.png" alt="" class="wp-image-1806" /></figure>

<p>Abstract:</p>

<p>Somax2 is an artificial intelligence (AI)-based multiagent system for human–machine “coimprovisation” that generates stylistically coherent streams while continuously listening and adapting to musicians or other agents. The model on which it is based can be used with little configuration to interact with humans in full autonomy, but it also allows fine real-time control of its generative processes and interaction strategies, closer in this case to a “smart” digital instrument. An offspring of the Omax system, conceived at the Institut de Recherche et Coordination Acoustique/Musique (IRCAM), the Somax2 environment is part of the European Research Council Raising Cocreativity in Cyber–Human Musicianship (REACH) project, which studies distributed creativity as a general template for symbiotic interaction between humans and digital systems. It fosters mixed musical reality involving cocreative AI agents. The REACH project puts forward the idea that cocreativity in cyber–human systems results from the emergence of complex joint behavior, produced by interaction and featuring cross-learning mechanisms. Somax2 is a first step toward this ideal, and already shows life-size achievements. This article describes Somax2 extensively, from its theoretical model to its system architecture, through its listening and learning strategies, representation spaces, and interaction policies.</p>

<p><a href="https://www.dropbox.com/scl/fi/msqjstyvtmwv7op1c61rp/CMJ-24-Assayag_Bonnasse-Gahot_Borg.pdf?rlkey=3wadkqthwq81jggj5nsmp3eyx&amp;dl=0"><strong>Read the full paper</strong></a></p>

<div class="wp-block-image">
<figure class="aligncenter size-large is-resized"><img width="721" height="1024" src="https://reach.ircam.fr/wp-content/uploads/2024/04/Capture-decran-2024-04-20-a-17.54.18-721x1024.png" alt="" class="wp-image-1808" style="width:402px;height:auto" /></figure>
</div>

<div class="wp-block-image">
<figure class="aligncenter size-large is-resized"><img width="1024" height="481" src="https://reach.ircam.fr/wp-content/uploads/2024/04/Capture-decran-2024-04-20-a-17.54.00-1024x481.png" alt="" class="wp-image-1807" style="width:650px;height:auto" /></figure>
</div>