Track 1 on “Painting Music”: “Hirajōshi & Sky – Luke Garfield Mix”

(Above: Track #1 on “Painting Music” – “Hirajōshi & Sky – Luke Garfield Mix”
Luke Garfield, Sound Engineer, Banana Llama Studios)

All the songs in this collection contain small clips or samples of machine-generated musical notes. I used online platforms that produce both sounds and visuals from a user’s input, the main one being “Paint with Music” (hence the album’s name), which is built on Google Magenta’s DDSP (Differentiable Digital Signal Processing).

On the platform, the virtual brush strokes the user makes on the screen are translated into musical notes, performed by a virtual instrument such as a flute, sax, keyboard or voice. On “Paint with Music”, you can “paint” onto four types of virtual canvas: paper, sky, water and street.
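The platform doesn’t publish how it maps strokes to notes, but the basic idea — quantizing a brush stroke’s position on the canvas to pitches in a scale — can be sketched like this. Everything here (the function name, the canvas size, the choice of a major scale rooted at middle C) is my own illustrative assumption, not how “Paint with Music” actually works.

```python
# Toy sketch: snap a brush stroke's vertical position to a pitch in a scale.
# All names and parameters are hypothetical; "Paint with Music" does not
# publish its internal mapping.

MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of a major scale

def stroke_to_notes(ys, root=60, scale=MAJOR, canvas_height=400):
    """Map each y-coordinate of a stroke (0 = top of canvas) to a MIDI note."""
    notes = []
    for y in ys:
        # Higher on the canvas -> higher pitch, spread over two octaves.
        pos = 1.0 - y / canvas_height
        step = int(pos * 2 * len(scale))
        octave, degree = divmod(step, len(scale))
        notes.append(root + 12 * octave + scale[degree])
    return notes

# A stroke rising from near the bottom to near the top of the canvas:
print(stroke_to_notes([390, 200, 10]))  # → [60, 72, 83]
```

A real implementation would also derive timing from the stroke’s speed and timbre from the chosen instrument, but the pitch mapping above captures the gist of “drawing a melody”.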

Hirajōshi & Sky – Luke Garfield Mix Lyrical Video

Audio track title: “Hirajōshi & Sky – Luke Garfield Mix” ©Cōdae 2022 
Album: “Painting Music”
Album ISRC CBAKR2203000
Audio track ISRC CBAKR2203001
Audio track Mixing & Mastering: Luke Garfield, Banana Llama Studios
Lyrical video produced and published by: Red Pennant Communications Corp.
Music visualizations created on “Paint with Music” platform, developed by Simon Doury and Caroline Buttet, artists-in-residence at Google Arts & Culture Lab
Dancer-in-yellow clips: royalty-free footage attributed to Polina Tankilevitch


The song contains two types of sound recordings. The first, a series of notes in a heptatonic scale, was generated on the “Sky” canvas of the “Paint with Music” platform; it contains bird-like stabs of sound. The second series was made on the “Paper” canvas, with notes in the Japanese Hirajōshi scale, hence the name of the song.
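For readers unfamiliar with the two scales: a heptatonic scale is any seven-note scale (the familiar major scale is one example), while the Hirajōshi is a five-note Japanese scale. A small sketch, spelling out one common form of each as semitone offsets from a root — the choice of C as root is mine, not from the track:

```python
# Spell out the two scales mentioned above as note names from a C root.
# MAJOR is just one example of a heptatonic (seven-note) scale; HIRAJOSHI
# is one common form of the Japanese Hirajōshi pentatonic (1, 2, b3, 5, b6).

MAJOR = [0, 2, 4, 5, 7, 9, 11]
HIRAJOSHI = [0, 2, 3, 7, 8]

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def spell(scale, root=60):
    """Return the note names of `scale` built on MIDI note `root`."""
    return [NOTE_NAMES[(root + s) % 12] for s in scale]

print(spell(HIRAJOSHI))  # → ['C', 'D', 'D#', 'G', 'G#']
print(spell(MAJOR))      # → ['C', 'D', 'E', 'F', 'G', 'A', 'B']
```

The flattened third and sixth (D#/Eb and G#/Ab against a C root) are what give the Hirajōshi its distinctive sound.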

The resulting melodies come literally from my hand drawing lines on the screen, as can be seen in the lyrical video. I recorded myself “playing” the melodies, then extracted the audio tracks. The sound files had to be repaired, optimized, quantized, split, rearranged and rejoined to form an actual melody rather than just jumbled sounds. Then I wrote an entirely new composition around those small samples. To say that this project was a challenge for the Sound Engineer is putting it mildly.
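Of the cleanup steps listed above, “quantizing” may be the least familiar: snapping each note’s onset to a rhythmic grid so the melody sits in time. The actual work was done in the engineer’s DAW; this toy function (names and grid size are my own, purely illustrative) just shows what quantizing timing data means:

```python
# Toy illustration of quantizing: snap note onset times (in seconds) to the
# nearest point on a rhythmic grid. Real DAW quantization works the same way,
# with the grid expressed in musical divisions (e.g. 16th notes).

def quantize(onsets, grid=0.25):
    """Snap each onset time to the nearest multiple of `grid` seconds."""
    return [round(t / grid) * grid for t in onsets]

# Slightly sloppy onsets pulled onto a quarter-second grid:
print(quantize([0.02, 0.51, 1.13, 1.74]))  # → [0.0, 0.5, 1.25, 1.75]
```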

David Byrne writes in his book How Music Works that the reality of music production is often that people create things to suit a context, not vice versa. So I had the context already, these two recordings from the A.I. platform, and I had to create the rest of the composition to fit around them. That was much easier than trying to do “something” with a random tune.

Next song in the next post: “A.I. Opera”
