Hi! I’ve done some animation experiments separately; the image at the top of the SoundCloud page is from one of them. I haven’t taken the time to combine them yet, but I’ve been thinking about that.
How it works: it uses Perlin noise to tell its virtual fingers how to meander up and down the keyboard, with the note selections forced through a chord template that changes from time to time. The chord changes come from a Markov chain built from a reference corpus. The note-rate mode also changes from time to time. Things like the density of note coverage, tempo, degree of chord mutation, and the grain of the Perlin noise feed can be controlled by the user. For these compositions, I set it up and pushed the start button. The percussion tracks were added separately in most cases, though one uses a drum VST that runs through a randomizer in real time.
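For anyone curious how a setup like that might look in code, here is a minimal sketch of the general idea. Everything in it is my own invention for illustration (the chord names, the transition table, the `grain` parameter), and it substitutes simple smoothed 1-D value noise for true Perlin noise, which behaves similarly enough for a wandering-finger demo:

```python
import math
import random

def value_noise(t, seed=0):
    """Smoothed 1-D value noise: cosine-interpolated random lattice values.
    A stand-in for Perlin noise; it drifts gradually rather than jumping."""
    i = math.floor(t)
    a = random.Random(seed * 1_000_003 + i).random()
    b = random.Random(seed * 1_000_003 + i + 1).random()
    f = (1 - math.cos((t - i) * math.pi)) / 2   # cosine ease between lattice points
    return a * (1 - f) + b * f

# Pitch-class chord templates (hypothetical; just a tiny hand-picked set).
CHORDS = {
    "Cmaj": [0, 4, 7],
    "Am":   [9, 0, 4],
    "F":    [5, 9, 0],
    "G7":   [7, 11, 2, 5],
}

# Markov transition probabilities; in the real system these would be
# derived from a corpus rather than written by hand.
TRANSITIONS = {
    "Cmaj": {"Am": 0.4, "F": 0.4, "G7": 0.2},
    "Am":   {"F": 0.5, "G7": 0.5},
    "F":    {"G7": 0.6, "Cmaj": 0.4},
    "G7":   {"Cmaj": 0.9, "Am": 0.1},
}

def next_chord(current, rng):
    """Pick the next chord by weighted random choice from the Markov table."""
    names, weights = zip(*TRANSITIONS[current].items())
    return rng.choices(names, weights)[0]

def snap_to_chord(note, pitch_classes):
    """Force a raw MIDI note onto the nearest pitch in the chord template."""
    candidates = [12 * octave + pc for octave in range(11) for pc in pitch_classes]
    return min(candidates, key=lambda n: abs(n - note))

def compose(steps=16, grain=0.15, seed=1):
    """Let one virtual finger wander via noise, snapping to the current chord.
    `grain` scales how fast the noise is sampled, i.e. how jittery the walk is."""
    rng = random.Random(seed)
    chord = "Cmaj"
    melody = []
    for step in range(steps):
        if step > 0 and step % 4 == 0:          # change chords from time to time
            chord = next_chord(chord, rng)
        # noise in [0, 1] mapped to roughly two octaves around middle C
        finger = 48 + 24 * value_noise(step * grain, seed)
        melody.append((chord, snap_to_chord(round(finger), CHORDS[chord])))
    return melody

if __name__ == "__main__":
    for chord, note in compose():
        print(chord, note)
```

Multiple fingers would just be multiple noise streams with different seeds, and the density, tempo, and mutation controls mentioned above would become further parameters on `compose`.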