{"id":2828,"date":"2025-06-18T13:55:19","date_gmt":"2025-06-18T11:55:19","guid":{"rendered":"https:\/\/reach.ircam.fr\/?p=2828"},"modified":"2025-06-18T13:55:20","modified_gmt":"2025-06-18T11:55:20","slug":"introducing-eg-ipt-and-ipt-a-novel-electric-guitar-dataset-and-a-new-max-msp-object-for-real-time-classification-of-instrumental-playing-techniques","status":"publish","type":"post","link":"https:\/\/reach.ircam.fr\/index.php\/2025\/06\/18\/introducing-eg-ipt-and-ipt-a-novel-electric-guitar-dataset-and-a-new-max-msp-object-for-real-time-classification-of-instrumental-playing-techniques\/","title":{"rendered":"Introducing EG-IPT and ipt~: a novel electric guitar dataset and a new Max\/MSP object for real-time classification of instrumental playing techniques"},"content":{"rendered":"\n<p>Article by <strong>Marco Fiorini<\/strong>*, <strong>Nicolas Brochec<\/strong>*, <strong>Joakim Borg<\/strong> and <strong>Riccardo Pasini<\/strong> (* equal contribution) has been accepted for the <strong><a href=\"https:\/\/nime2025.org\" title=\"\">New Interfaces for Musical Expression (NIME)<\/a><\/strong> Conference in <strong>Canberra, Australia<\/strong><a style=\"font-size: 16px; white-space: normal; color: rgb(95, 110, 98); font-family: -apple-system, system-ui, BlinkMacSystemFont, &quot;Segoe UI&quot;, Roboto, &quot;Helvetica Neue&quot;, Arial, sans-serif;\" href=\"https:\/\/nime.org\/\">.<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/hal.science\/hal-05061680v1\/document\" title=\"\">Read the full paper<\/a><\/p>\n\n\n\n<p><a href=\"https:\/\/www.youtube.com\/watch?v=PFiWNnOd-vg\" title=\"\">Video demo<\/a><\/p>\n\n\n\n<p><strong>Abstract:<\/strong> This paper presents two key contributions to the real-time classi- fication of Instrumental Playing Techniques (IPTs) in the context of NIME and human-machine interactive systems: the EG-IPT dataset and the ipt\u223c\u00a0Max\/MSP object. The EG-IPT dataset, specif- ically designed for electric guitar, encompasses a broad range of IPTs captured across six distinct audio sources (five microphones and one direct input) and three pickup configurations. This di- versity in recording conditions provides a robust foundation for training accurate models. We evaluate the dataset by employing a Convolutional Neural Network-based classifier (CNN), achieving state-of-the-art performance across a wide array of IPT classes, thereby validating the dataset\u2019s efficacy. The ipt\u223c\u00a0object is a new Max\/MSP external enabling real-time classification of IPTs via pre-trained CNN models. While in this paper it\u2019s demonstrated with the EG-IPT dataset, the ipt\u223c\u00a0object is adaptable to models trained on various instruments. By integrating EG-IPT and ipt\u223c, we introduce a novel, end-to-end workflow that spans from data collection, model training to real-time classification and human- computer interaction. This workflow exemplifies the entangle- ment of diverse components (data acquisition, machine learning, real-time processing, and interactive control) within a unified system, advancing the potential for dynamic, real-time music performance and human-computer interaction in the context of NIME.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Article by Marco Fiorini*, Nicolas Brochec*, Joakim Borg and Riccardo Pasini (* equal contribution) has been accepted for the New Interfaces for Musical Expression (NIME) Conference in Canberra, Australia. 
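For readers less familiar with this kind of pipeline, the sketch below shows what a frame-level CNN classifier for playing techniques can look like before it is exported and loaded by a real-time external such as ipt~. It is a minimal, hypothetical example: the log-mel input features, layer sizes, and class labels are illustrative assumptions, not the architecture, feature set, or label taxonomy described in the paper.

```python
# Illustrative sketch only: a minimal frame-level IPT classifier in PyTorch.
# Input features (log-mel patches), layer sizes, and class labels are assumed
# for demonstration; they do NOT reproduce the EG-IPT paper or the ipt~ external.
import torch
import torch.nn as nn

IPT_CLASSES = ["palm_mute", "harmonic", "slide", "bend", "tapping"]  # hypothetical labels

class FrameCNN(nn.Module):
    def __init__(self, n_mels: int = 64, n_frames: int = 16, n_classes: int = len(IPT_CLASSES)):
        super().__init__()
        # Two small conv/pool stages over a (n_mels x n_frames) spectrogram patch.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * (n_mels // 4) * (n_frames // 4), n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, n_mels, n_frames) log-mel spectrogram patch
        h = self.features(x)
        return self.classifier(h.flatten(1))

if __name__ == "__main__":
    model = FrameCNN()
    patch = torch.randn(1, 1, 64, 16)        # one spectrogram patch (dummy data)
    probs = model(patch).softmax(dim=-1)     # class probabilities for this frame
    print(IPT_CLASSES[int(probs.argmax())])
```

In the workflow described in the abstract, a model pre-trained along these lines is loaded by the ipt~ external inside Max/MSP, so that the predicted technique labels can drive interactive control during live performance.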