The Machine Learning
You Never Asked For
What makes a person feel like a system is adapting to them?
Is it necessarily machine learning?
The main goal in designing this system was that musicians would experience it as just another human performer rather than as a machine. Here’s a clip of the system playing electronic instruments with a human performer, Aram Shelton:
I asked each of them to play 10 short pieces (2-3 minutes each) with the system. For a random subset of these pieces, the system was set not to listen to the performers at all, and the improvisers were not told when this condition was in effect. After each take, they rated the system on four criteria using a 10-point scale and provided brief written qualitative comments.
At the end of the experiment, they provided further qualitative, written comments and I conducted a brief informal interview with them about the experience as a whole.
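The session design above, with its randomized non-listening condition and per-take ratings, could be sketched roughly as follows. Everything here is my own illustration: the criterion names, the fraction of "deaf" takes, and the function names are assumptions, not details from the study.

```python
import random

# Hypothetical labels for the four rating criteria; the post does not name them.
CRITERIA = ["responsiveness", "musicality", "interaction", "overall"]

def plan_session(n_pieces=10, deaf_fraction=0.3, seed=None):
    """Randomly choose which takes run with listening disabled.

    The 0.3 fraction is an assumption; the post only says "a random subset".
    """
    rng = random.Random(seed)
    n_deaf = round(n_pieces * deaf_fraction)
    conditions = ["deaf"] * n_deaf + ["listening"] * (n_pieces - n_deaf)
    rng.shuffle(conditions)  # performers never learn which takes are "deaf"
    return conditions

def record_ratings(condition, scores):
    """Store one take's 10-point ratings on the four criteria."""
    assert len(scores) == len(CRITERIA)
    assert all(1 <= s <= 10 for s in scores)
    return {"condition": condition, **dict(zip(CRITERIA, scores))}

# One session: ten takes, with the listening condition hidden from the performer.
session = plan_session(seed=42)
take = record_ratings(session[0], [7, 8, 6, 7])
```

Keeping the condition assignment in the plan, rather than deciding per take, makes it easy to guarantee a fixed proportion of "deaf" takes per session while still randomizing their order.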
For the ethnographic portion of this project, I arranged similar meetings with improvisers over time, but with a much more open-ended structure. They could play with the system as long as they liked and I let them lead the conversation about their experience playing with it.
Because free improvisation is a resolutely obscure musical subculture, recruiting participants required deliberately seeking out musicians active in that scene.
Similar results emerged from the larger ethnographic project. Over repeated interactions, improvisers suggested that the system had improved even though I had done nothing to change it. For example, at a concert featuring several improvisers who had played with the system before, several said that it had "really grown up" and was "sounding a lot better" than before.