<h1>Guiding Co-Creative Musical Agents through Real-Time Instrumental Playing Technique Recognition</h1>

<p><em>Published 3 May 2024</em></p>

<p>An article by <strong>Marco Fiorini</strong> (IRCAM, Sorbonne Université, CNRS) and <strong>Nicolas Brochec</strong> (Tokyo University of the Arts) has been accepted for the SMC2024 conference (Sound and Music Computing, 4–6 July 2024, Porto, Portugal).</p>

<p><a href="https://hal.science/hal-04635907v1/document">Read the full paper</a></p>

<p><a href="https://www.youtube.com/watch?v=iH3EE3_INgg">Watch a video demo of the system</a></p>

<p><strong>Abstract:</strong> This paper presents a novel application and integration of a state-of-the-art CNN-based classifier for real-time flute Instrumental Playing Technique (IPT) recognition within the co-creative system Somax2. Focusing on musical innovation in the framework of Corpus-Based Concatenative Synthesis (CBCS), our work addresses the critical gap of recognising instantaneous changes in instrumental playing techniques during co-improvisation with artificial agents. Our real-time IPT recognition system adds a new dimension to the Somax2 interaction paradigm, enabling artificial agents to engage responsively with the recognised techniques. Contributing to the broader field of human–machine interaction in computer music, our results have potential applications in improvisation, computer-aided composition and new interfaces for musical expression.</p>