3.3 aiya + chaosflöte sketches (may 2019)

Eager to pair AIYA with my de facto instrument of choice, I created a revision of the original AIYA software that takes the Chaosflöte's sensor measurements as input in place of the MIDI controller, detecting the following elements:
- Sound: frequency, amplitude, spectral centroid, spectral flatness
- Motion: xyz rotation, xyz acceleration
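As an illustrative sketch only (the actual analysis runs inside AIYA's Max/MSP patch, not Python), the two spectral descriptors in the list above can be computed from a single FFT frame like this:

```python
import numpy as np

def spectral_features(frame, sr):
    """Return (centroid_hz, flatness) for one audio frame.

    Illustrative stand-in for AIYA's sound analysis: centroid is the
    power-weighted "centre of mass" of the spectrum; flatness is the
    ratio of geometric to arithmetic mean of the power spectrum
    (near 0 for a pure tone, closer to 1 for noise).
    """
    mag = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sr)
    power = mag ** 2 + 1e-12                    # floor avoids log(0)
    centroid = np.sum(freqs * power) / np.sum(power)
    flatness = np.exp(np.mean(np.log(power))) / np.mean(power)
    return centroid, flatness

sr = 44100
t = np.arange(2048) / sr
tone = np.sin(2 * np.pi * 1000 * t)                       # pure tone
noise = np.random.default_rng(0).standard_normal(2048)    # broadband noise
c_tone, f_tone = spectral_features(tone, sr)
c_noise, f_noise = spectral_features(noise, sr)
```

For a 1 kHz sine the centroid lands near 1000 Hz with very low flatness, while noise yields much higher flatness, which is what makes these two descriptors useful as complementary controls.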

The test users of the MIDI-keyboard version of AIYA were afraid of not being able to fully control it, and at first I attributed this only to the fact that playing with my improvisation machine was completely new to them. However, I began to feel the same fear of not being able to completely control the machine ("What if the machine gets too loud and begins a runaway feedback loop when I am far from the mixer and unable to quickly turn it off?", "What if I don't like how it sounds?", among other haunting thoughts). I felt the need to program an "on/off" control: flipping the switch on my Chaosflöte board would also turn off AIYA's sound output. "How does the machine know when the improvisation is over?" a mentor once mused, years before I created AIYA. This switch was an easy answer (for now).
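The logic of the switch is deliberately simple, which a minimal sketch makes clear; note that the names here are invented, as the real switch is a hardware toggle on the Chaosflöte board read inside Max/MSP:

```python
class OutputGate:
    """Hard on/off gate between AIYA's output and the speakers (sketch)."""

    def __init__(self):
        self.enabled = True

    def set_switch(self, on):
        # Mirrors the physical switch on the Chaosflöte board.
        self.enabled = bool(on)

    def process(self, samples):
        # When the performer flips the switch off, the output is silenced
        # immediately, regardless of what the machine "wants" to play.
        return [s if self.enabled else 0.0 for s in samples]

gate = OutputGate()
gate.set_switch(False)
muted = gate.process([0.5, -0.2])
gate.set_switch(True)
passed = gate.process([0.5, -0.2])
```

The point of the design is that the override sits outside the machine's decision-making: the gate acts on the final output, so no interaction rule can keep AIYA sounding once the performer has flipped the switch.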

The question of sonic identity
This version of AIYA still retained some of the electronically generated sounds of the original version, taking the form of bell-like resonators fed through a variable delay line. Having elements of the sonic palette so distinct from the timbre of the flute was interesting, and it added another dimension to the machine's expression. It did, however, also give AIYA a somewhat more defined (permanent?) sonic identity, as these sounds would be present in its improvisation language even as its human partners changed.
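The resonator-into-delay structure can be sketched as follows; this is not the actual patch (which was built in SuperCollider), but a rough stand-in that assumes an impulse-excited two-pole ringing filter for the "bell" and a circular buffer with feedback for the delay line, with all parameter values illustrative:

```python
import math

def bell_through_delay(freq, sr=44100, dur=0.5, delay_s=0.125, feedback=0.4):
    """Impulse-excited two-pole resonator fed through a feedback delay line."""
    n = int(sr * dur)
    # Two-pole resonator: pole radius r sets the bell's decay time.
    r = 0.9995
    a1 = -2 * r * math.cos(2 * math.pi * freq / sr)
    a2 = r * r
    y1 = y2 = 0.0
    delay = [0.0] * int(sr * delay_s)    # circular delay buffer
    out = []
    for i in range(n):
        x = 1.0 if i == 0 else 0.0       # single impulse "strike"
        y = x - a1 * y1 - a2 * y2        # ringing filter
        y2, y1 = y1, y
        j = i % len(delay)
        echo = delay[j]                  # read the delayed signal
        delay[j] = y + feedback * echo   # write back with feedback
        out.append(y + echo)             # dry bell plus its echoes
    return out

sig = bell_through_delay(880.0)
```

Varying `delay_s` at runtime would give the "variable" part of the delay line; the feedback below 1.0 keeps the echo tail stable.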

Something about that bothered me, I suppose, because I wanted AIYA’s own sonic identity to be as “context-aware” as possible. I took issue with the fact that there were many improvisation machines of renown spitting out sine tones and ring-modulated sawtooth waves while something as timbrally complex as, for example, a cello was playing with them. It is not that cello and sine tone is an inherently distasteful combination; it is more the knowledge that the machine improviser does not have the capacity to even show an intention to complement the timbre of its partner, which I find problematic. I wanted AIYA’s sonic identity to be as flexible as possible so it could mimic the “human” quality of being “context-aware,” of being sensitive to the sonic landscape. This also addresses the point made in Section 2 about regarding the machine as an extension of the performer’s body: “An intuitive relationship and/or similarity can be drawn between the sonic identities of the human-generated sound and Chaosflöte-specific sound” (in this case, the AIYA-specific sound). In this version of the improvisation machine, I tried to strike a balance between the electronically generated sounds and the loop-sampling mechanisms of the machine, to create a hybrid sound that still had a connection to the performer’s audio input signal.
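One minimal way to picture such a balance is a crossfade between the synthesized voice and the loop-sampled performer audio. This is only a conceptual sketch; in AIYA the weighting is governed by its interaction rules in Max/MSP, and the `balance` knob here is a hypothetical parameter:

```python
def hybrid_mix(synth, loop, balance):
    """Blend synthesized and loop-sampled audio, sample by sample.

    balance = 0.0 -> pure loop sample (performer-derived sound)
    balance = 1.0 -> pure synthesis (AIYA's own bell-like voice)
    """
    return [balance * s + (1.0 - balance) * l for s, l in zip(synth, loop)]

mixed = hybrid_mix([1.0, 0.0], [0.0, 1.0], 0.25)
```

Because the loop-sampled half is always derived from the live input, even a synthesis-heavy balance keeps an audible thread back to the performer's signal.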

(above) signal flow of this version of AIYA for chaosflöte
(above) an example of a purely electronically synthesized sound of AIYA (made in SuperCollider), the activation of which was determined by the interaction rules programmed in the MaxMSP part of AIYA.

Recordings of AIYA + Chaosflöte

On the left are two recordings of excerpts from improvisations made with AIYA and Chaosflöte (unedited).