{"id":33,"date":"2020-05-29T09:30:50","date_gmt":"2020-05-29T07:30:50","guid":{"rendered":"https:\/\/reach.ircam.fr\/?p=33"},"modified":"2024-03-01T15:04:10","modified_gmt":"2024-03-01T14:04:10","slug":"cyber-human-musical-co-creativity","status":"publish","type":"post","link":"https:\/\/reach.ircam.fr\/index.php\/2020\/05\/29\/cyber-human-musical-co-creativity\/","title":{"rendered":"Cyber-human musical co-creativity: an ERC Advanced Grant for G\u00e9rard Assayag"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"33\" class=\"elementor elementor-33\">\n\t\t\t\t\t\t<section class=\"elementor-section elementor-top-section elementor-element elementor-element-55fd15f elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"55fd15f\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-1ba8c6c\" data-id=\"1ba8c6c\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-4a3c86d elementor-widget elementor-widget-text-editor\" data-id=\"4a3c86d\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><em><a href=\"https:\/\/www.ins2i.cnrs.fr\/en\/cnrsinfo\/cyber-human-musical-co-creativity-erc-advanced-grant-gerard-assayag\">Read original article<\/a><\/em><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-a2a914f elementor-widget elementor-widget-text-editor\" data-id=\"a2a914f\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div class=\"introduction\">\n<div class=\"clearfix 
text-formatted field field--name-body field--type-text-with-summary field--label-hidden field__item\">\n<div class=\"tex2jax_process\">\n<p><strong>The IRCAM research professor G\u00e9rard Assayag has been awarded a 2019 ERC Advanced Grant for the REACH project (<em>Raising co-creativity in cyber-human Musicianship<\/em>) which enables him to continue his work on cyber-human musical co-creation. He founded the\u00a0<a href=\"https:\/\/www.stms-lab.fr\/team\/representations-musicales\/\" target=\"_blank\" rel=\"noreferrer noopener\">Musical Representations<\/a>\u00a0team at the\u00a0<a href=\"https:\/\/www.stms-lab.fr\/\" target=\"_blank\" rel=\"noopener noreferrer\">Science and Technology of Music and Sound Laboratory<\/a> (STMS \u2014 CNRS\/IRCAM\/Culture and Communication Ministry\/Sorbonne University) and models creative musical intelligence. Let&rsquo;s meet this researcher, who talks about his work with passion.<\/strong><\/p>\n<p>\u00a0<\/p>\n<\/div>\n<\/div>\n<\/div>\n<div class=\"field field--name-field-entity-block field--type-entity-reference field--label-hidden field__items\">\n<div class=\"field__item\">\n<div class=\"block-description\">\n<div class=\"clearfix text-formatted field field--name-field-descriptive field--type-text-long field--label-hidden field__item\">\n<div class=\"tex2jax_process\">\n<p>G\u00e9rard Assayag&rsquo;s approach is part of a research-creation strategy which is close to the artistic process. In his research work, he makes the two fields of music and science interact and thus helps both fields progress.<\/p>\n<p><span lang=\"EN-GB\" xml:lang=\"EN-GB\">The\u00a0<a href=\"https:\/\/www.ircam.fr\/\" target=\"_blank\" rel=\"noopener noreferrer\">IRCAM<\/a>\u00a0is a place which is unique worldwide as it brings artists and scientists together in a common approach to knowledge and experimentation. This research professor began his career there as a researcher working on a project that would gain international recognition over the years. 
He and his team designed the computer-assisted composition software,\u00a0<a href=\"http:\/\/repmus.ircam.fr\/openmusic\/home\" target=\"_blank\" rel=\"noopener noreferrer\">OpenMusic<\/a>\u00a0&#8211; software featuring a visual programming language which is easily accessible and widely used by musicians and musicologists while also possessing an educational dimension. \u00ab\u00a0<em>It is a creative programme which generates new artistic ideas to test and try them out. As a result of this project, we started to look at direct forms of interaction between computers and musicians<\/em>,\u00a0\u00bb explains the researcher.<\/span><\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-10d9fd2 elementor-widget elementor-widget-image\" data-id=\"10d9fd2\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img fetchpriority=\"high\" decoding=\"async\" width=\"640\" height=\"461\" src=\"https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/openmusic_assayag-1024x738.jpg\" class=\"attachment-large size-large wp-image-525\" alt=\"\" srcset=\"https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/openmusic_assayag-1024x738.jpg 1024w, https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/openmusic_assayag-300x216.jpg 300w, https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/openmusic_assayag-768x554.jpg 768w, https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/openmusic_assayag-1536x1107.jpg 1536w, https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/openmusic_assayag-2048x1477.jpg 2048w, https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/openmusic_assayag-130x95.jpg 130w\" sizes=\"(max-width: 640px) 100vw, 640px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-548b77d elementor-widget 
elementor-widget-text-editor\" data-id=\"548b77d\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>How can musical ideas be developed jointly by a musician and a machine through a sound communication channel? How can a machine interact \u00ab\u00a0creatively\u00a0\u00bb with a musician? These questions raise new research issues involving artificial creativity. This was recently an object of a study inspired by the human creative process which focused on understanding past and present situations and anticipating future situations. Improvisation serves as a borderline case which is complex to model because it brings together the richest and most optimal characteristics of human creativity. In extremely short times (a few tens of milliseconds) improvisation produces highly structured messages in response to an evolving sonic environment and exhibits action strategies on several temporal scales.<\/p>\n<p>Musical improvisation is thus a key theme for the RepMus team at STMS directed by G\u00e9rard Assayag. This has led to numerous international collaborations with great musicians in the field (like Bernard Lubat, Steve Coleman, Roscoe Mitchell, Mike Garson, George Lewis, Evan Parker, Steve Lehman and many others). G\u00e9rard Assayag&rsquo;s team have created computer programmes linked to work on &lsquo;<em>Musical Information Dynamics<\/em>&lsquo; (an application of information theory to music). 
These include\u00a0<a href=\"http:\/\/repmus.ircam.fr\/omax\/home\" target=\"_blank\" rel=\"noopener noreferrer\"><em>Omax<\/em><\/a>, which models certain improvisation processes and is capable of dialoguing with a musician on stage, and\u00a0<a href=\"https:\/\/vimeo.com\/171790940\" target=\"_blank\" rel=\"noopener noreferrer\"><em>SoMax<\/em><\/a>, which implements a cognitive model of creative and contextually reactive musical memory.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d3231e4 elementor-widget elementor-widget-video\" data-id=\"d3231e4\" data-element_type=\"widget\" data-settings=\"{&quot;video_type&quot;:&quot;dailymotion&quot;,&quot;controls&quot;:&quot;yes&quot;}\" data-widget_type=\"video.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-wrapper elementor-open-inline\">\n\t\t\t<iframe class=\"elementor-video-iframe\" allowfullscreen allow=\"clipboard-write\" title=\"dailymotion Video Player\" src=\"https:\/\/dailymotion.com\/embed\/video\/xvd76n?ui-highlight&amp;start&amp;endscreen-enable=0&amp;controls=1&amp;mute=0&amp;ui-start-screen-info=1&amp;ui-logo=1\"><\/iframe>\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c1ebf73 elementor-widget elementor-widget-text-editor\" data-id=\"c1ebf73\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div class=\"field__item\">\n<div class=\"block-description\">\n<div class=\"clearfix text-formatted field field--name-field-descriptive field--type-text-long field--label-hidden field__item\">\n<div class=\"tex2jax_process\">\n<p class=\"text-align-right\"><em>G\u00e9rard Assayag is a pioneer in this field and has notably created the OMAX software with his team which is used by many musicians, for example at a concert in New York with Bernard Lubat 
(piano) and G\u00e9rard Assayag using the Omax programme. There have been several National Research Agency projects\u00a0<\/em><em>on this subject (IMPROTECH, SOR2, DYCI2, MERCI).<\/em><\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<div>\u00a0<\/div>\n<blockquote>\n<div class=\"field__item\">\n<div class=\"entity-block quote\">\n<div class=\"field field--name-field-citation field--type-string-long field--label-hidden field__item\">\u00ab\u00a0Embodiment represents a sensitive form of contact with digital technology which truly creates a mixed form of reality for a musician.\u00a0\u00bb<\/div>\n<div class=\"quote-author\">G\u00e9rard Assayag, IRCAM research professor<\/div>\n<\/div>\n<\/div>\n<\/blockquote>\n<div>\u00a0<\/div>\n<div class=\"field__item\">\n<div class=\"block-description\">\n<div class=\"clearfix text-formatted field field--name-field-descriptive field--type-text-long field--label-hidden field__item\">\n<div class=\"tex2jax_process\">\n<p>Now, thanks to this ERC Advanced Grant for the REACH project, the research professor is consolidating all this work and opening up new perspectives. His objective is to highlight co-creativity in cyber-human musical interactions.<\/p>\n<p><em>\u00ab\u00a0This is a sort of artificial musical intelligence, artificial musicality (\u00ab\u00a0machine musicianship\u00a0\u00bb). The term cyber-human expresses the continuity between human cognition and digital virtuality in the same way as cyber-physics expressed continuity between the digital and physical spheres. Here, the continuity involves creativity thanks to cyber-human co-action,\u00a0\u00bb\u00a0<\/em>G\u00e9rard Assayag explains<em>. 
\u00ab\u00a0Improvised musical interactions summarise many situations from everyday life in the same way as we&rsquo;re improvising our speech in this conversation,\u00a0\u00bb\u00a0<\/em>he adds<em>.<\/em><\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-10b579e elementor-widget elementor-widget-image\" data-id=\"10b579e\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t<figure class=\"wp-caption\">\n\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"640\" height=\"394\" src=\"https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/analyse_impro_stms_assayag-1024x631.jpg\" class=\"attachment-large size-large wp-image-530\" alt=\"\" srcset=\"https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/analyse_impro_stms_assayag-1024x631.jpg 1024w, https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/analyse_impro_stms_assayag-300x185.jpg 300w, https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/analyse_impro_stms_assayag-768x473.jpg 768w, https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/analyse_impro_stms_assayag-1536x947.jpg 1536w, https:\/\/reach.ircam.fr\/wp-content\/uploads\/2021\/07\/analyse_impro_stms_assayag-2048x1262.jpg 2048w\" sizes=\"(max-width: 640px) 100vw, 640px\" \/>\t\t\t\t\t\t\t\t\t\t\t<figcaption class=\"widget-image-caption wp-caption-text\">The Omax software carries out real-time analysis of improvisation structures (Assayag, L\u00e9vy)<\/figcaption>\n\t\t\t\t\t\t\t\t\t\t<\/figure>\n\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-6d48f49 elementor-widget elementor-widget-text-editor\" data-id=\"6d48f49\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div class=\"field__item\">\n<div 
class=\"block-description\">\n<div class=\"clearfix text-formatted field field--name-field-descriptive field--type-text-long field--label-hidden field__item\">\n<div class=\"tex2jax_process\">\n<p>But how can a programme generate music that humans can understand? For this, computing is involved through modelling, AI techniques and automatic learning. Today&rsquo;s research aims to understand more about the \u00ab\u00a0symbiotic interactions\u00a0\u00bb between humans and machines facilitated by enhanced captures of physical and human signal and also learning. This paves the way for new forms of mixed reality to achieve embodiment (physical involvement) and co-creation.<\/p>\n<p>A phase of field observation and experimentation is required to bring this co-creation into play and better understand creative intelligence. In this context of improvised practices, concerts are well suited to this type of experiment. From a methodological point of view, the ERC REACH project therefore aims to create dual convergent movement &#8211; from digital to human (improving artificial agents&rsquo; analysis and production capacities) and from human to digital (enhancing a subject&rsquo;s involvement in the hybrid experience through embodiment). The aim of this is to produce an immersive and embodied symbiotic experience. \u00ab\u00a0<em>This is a chiasmus: we hope that the two research streams will intersect in the middle to produce new objects,<\/em>\u00a0\u00bb summarises the researcher about his ERC grant project.<\/p>\n<p>This project therefore has a solid interdisciplinary base in the image of IRCAM which is hosting the project. 
G\u00e9rard Assayag&rsquo;s ERC project combines computer science, music, cognitive sciences, anthropology and sociology through his long-standing collaboration with\u00a0<a href=\"http:\/\/ehess.modelisationsavoirs.fr\/index.html\" target=\"_blank\" rel=\"noopener noreferrer\">Marc Chemillier<\/a>\u00a0at the \u00c9cole des Hautes \u00c9tudes en Sciences Sociales. Collaboration initiatives with industry are being set up, such as with\u00a0<a href=\"https:\/\/www.hyvibe.audio\/\" target=\"_blank\" rel=\"noopener noreferrer\"><em>Hyvibe<\/em><\/a>, an STMS start-up that develops acoustic instruments augmented with numerous sensors, like the \u00ab\u00a0Smart Guitar\u00a0\u00bb. An international network is also being set up involving several academic actors such as the University of Tokyo, the EHESS and the IRCAM. Finally, the major\u00a0<a href=\"http:\/\/ikparisathina.ircam.fr\/\" target=\"_blank\" rel=\"noopener noreferrer\"><em>ImproTech<\/em><\/a>\u00a0event is organised each year, linking scientific workshops with a music festival. The ERC grant will enable this type of international event to continue.<\/p>\n<p>Research into computational creativity involves Artificial Intelligence techniques like statistical learning, optimisation and deep learning. However, this work also raises philosophical questions, such as whether we can really talk about creativity with regard to a machine. \u00ab\u00a0<em>The very nature of the question lies in the fact that the machine is not a subject in itself, which means we think more in terms of interaction and relationships. In the past, we used statistical learning, namely observing what humans do over time to obtain models of sequences (forms, patterns). Now, the new AI techniques like generative learning of representations mean we are faced with large-scale problems. 
This is because music produces multivariate signals (several simultaneous independent and coherent dimensions) that need to be modelled at various structural or semantic levels,<\/em>\u00a0\u00bb says G\u00e9rard Assayag about the challenges facing his project.<\/p>\n<p>\u00ab\u00a0<em>When we create interactions between complex systems like humans and machines, we do not just observe a simple addition of behaviours. Instead, we focus on the &lsquo;emergence&rsquo; of concomitant and coherent forms of behaviour that cannot be simply explained by their separate components,<\/em>\u00a0\u00bb the researcher explains. These emergence processes bring forth, in a non-linear way, new musical forms that are not easy to predict, and this is what we call co-creation.<\/p>\n<p>&nbsp;<\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<blockquote>\n<div class=\"field__item\">\n<div class=\"entity-block quote\">\n<div class=\"field field--name-field-citation field--type-string-long field--label-hidden field__item\">\u00ab\u00a0In some ways, humans learn from machines as well. Humans may change their musical tactics according to what the machine is feeding back to them, but in return machines will continue to learn from human reactions. This forms a cross-learning feedback loop and brings into play reinforcement mechanisms which are well known in AI. 
This complexity is what we call co-creativity, and it is what we study.\u00a0\u00bb<\/div>\n<div class=\"quote-author\">G\u00e9rard Assayag, IRCAM research professor.<\/div>\n<\/div>\n<\/div>\n<\/blockquote>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-cd833b6 elementor-widget elementor-widget-video\" data-id=\"cd833b6\" data-element_type=\"widget\" data-settings=\"{&quot;video_type&quot;:&quot;vimeo&quot;}\" data-widget_type=\"video.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-wrapper elementor-open-inline\">\n\t\t\t<iframe class=\"elementor-video-iframe\" allowfullscreen allow=\"clipboard-write\" title=\"vimeo Video Player\" src=\"https:\/\/player.vimeo.com\/video\/376887800?color&amp;autopause=0&amp;loop=0&amp;muted=0&amp;title=1&amp;portrait=1&amp;byline=1#t=\"><\/iframe>\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-67a42f8 elementor-widget elementor-widget-text-editor\" data-id=\"67a42f8\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p class=\"text-align-right\" style=\"text-align: right;\"><em>A demonstration for the &lsquo;Neurones, les intelligences simul\u00e9es&rsquo; exhibition (Neurones, simulated intelligence) in the framework of Mutations\/Cr\u00e9ations 4 at the Georges Pompidou Centre, February 26<sup>th<\/sup>\u00a0\u2014 April 27<sup>th<\/sup>\u00a02020, illustrating autonomous creative agents developed by G\u00e9rard Assayag&rsquo;s team, one of the building blocks of the ERC REACH project.<\/em><\/p>\n<p>The work of G\u00e9rard Assayag and his team could make it possible to better decipher our creative behaviour, particularly the way in which knowledge, intuition, acts of will, communication and poiesis (the action of doing and creating) fit together in co-action situations, whether 
this involves humans, machines or both.<\/p>\n<p>Technically, AI is limited by its learning mode in this respect. It looks for regularity in patterns when it learns and analyses data, and therefore seeks stereotypes which are the opposite of creativity! This is a major epistemological problem that needs to be taken into account. \u00ab\u00a0<em>The challenge is to reproduce human know-how and therefore its creative dimension through new computational strategies to achieve less stereotyped AI learning,<\/em>\u00a0\u00bb G\u00e9rard Assayag sums up.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Read original article The IRCAM research professor G\u00e9rard Assayag has been awarded a 2019 ERC Advanced Grant for the REACH project (Raising co-creativity in cyber-human Musicianship) which enables him to continue his work on cyber-human musical co-creation. He founded the\u00a0Musical Representations\u00a0team at the\u00a0Science and Technology of Music and Sound Laboratory(STMS \u2014 CNRS\/IRCAM\/Culture and Communication Ministry\/Sorbonne 
[&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":458,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[60],"tags":[],"class_list":["post-33","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-articles"],"aioseo_notices":[],"blog_post_layout_featured_media_urls":{"thumbnail":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2020\/04\/ga_photo-374x249-1-150x150.jpg",150,150,true],"full":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2020\/04\/ga_photo-374x249-1.jpg",374,249,false]},"categories_names":{"60":{"name":"Articles","link":"https:\/\/reach.ircam.fr\/index.php\/category\/press\/articles\/"}},"tags_names":[],"comments_number":"0","wpmagazine_modules_lite_featured_media_urls":{"thumbnail":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2020\/04\/ga_photo-374x249-1-150x150.jpg",150,150,true],"cvmm-medium":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2020\/04\/ga_photo-374x249-1-300x249.jpg",300,249,true],"cvmm-medium-plus":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2020\/04\/ga_photo-374x249-1-305x207.jpg",305,207,true],"cvmm-portrait":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2020\/04\/ga_photo-374x249-1.jpg",374,249,false],"cvmm-medium-square":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2020\/04\/ga_photo-374x249-1.jpg",374,249,false],"cvmm-large":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2020\/04\/ga_photo-374x249-1.jpg",374,249,false],"cvmm-small":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2020\/04\/ga_photo-374x249-1-130x95.jpg",130,95,true],"full":["https:\/\/reach.ircam.fr\/wp-content\/uploads\/2020\/04\/ga_photo-374x249-1.jpg",374,249,false]},"_links":{"self":[{"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/posts
\/33","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/comments?post=33"}],"version-history":[{"count":22,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/posts\/33\/revisions"}],"predecessor-version":[{"id":908,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/posts\/33\/revisions\/908"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/media\/458"}],"wp:attachment":[{"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/media?parent=33"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/categories?post=33"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/reach.ircam.fr\/index.php\/wp-json\/wp\/v2\/tags?post=33"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}