All the songs in this collection contain small clips or samples of machine-generated musical notes. I used online platforms that produce both sounds and visuals from a user’s input, the main one being “Paint with Music” (hence the album’s name), which uses Google Magenta’s DDSP (Differentiable Digital Signal Processing) library.
On the platform, the virtual brush strokes the user makes on the screen are translated into musical notes, performed by a virtual instrument such as a flute, sax, keyboard or voice. On “Paint with Music” you can “paint” onto four types of virtual canvases: paper, sky, water and street.
Hirajōshi and Sky Music Video
The song contains two types of sound recordings. The first, a series of notes, was generated on the “Sky” canvas of the “Paint with Music” platform, which uses a heptatonic (seven-note) scale; it contains bird-like stabs of sound. The second series of notes was made on the “Paper” canvas, in the Japanese Hirajōshi scale, hence the name of the song.
The resulting melodies come literally from my hand drawing lines on the screen, as can be seen in the lyric video. I recorded myself “playing” the melodies, then extracted the audio tracks. The sound files had to be repaired, optimized, quantized, split, rearranged and rejoined to form an actual melody rather than a jumble of sounds. I then wrote an entirely new composition around those small samples. To say that this project was a challenge for the sound engineer is putting it mildly.
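The split-rearrange-rejoin step can be sketched in miniature. The snippet below cuts a toy “recording” (a plain list standing in for audio samples) at note boundaries and reorders the pieces into a melody; the boundary indices and ordering are invented purely for illustration, since the real edits were done by ear on actual audio files.

```python
def split_at(samples, boundaries):
    """Cut a sample list into segments at the given sample indices."""
    points = [0] + list(boundaries) + [len(samples)]
    return [samples[a:b] for a, b in zip(points, points[1:])]

def rejoin(segments, order):
    """Rearrange segments into a new order and concatenate them."""
    out = []
    for i in order:
        out.extend(segments[i])
    return out

# Toy "recording": three jumbled note fragments.
recording = [1, 1, 1, 2, 2, 3, 3, 3, 3]
segments = split_at(recording, [3, 5])   # [[1,1,1], [2,2], [3,3,3,3]]
melody = rejoin(segments, [2, 0, 1])     # reorder: last, first, middle
```

On real files the same idea applies, only the slices are arrays of audio samples and the cut points come from listening, not from a list of integers.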
David Byrne writes in his book How Music Works that the reality of music production is often that people create things to suit a context, not vice versa. So I already had the context, these two recordings from the A.I. platform, and I had to create the rest of the composition to fit around them. That was much easier than trying to do “something” with a random tune.