I’ve been experimenting with using Processing to improvise MIDI notes. If you’d like to check out some new music that wrote itself, my SoundCloud page for this project, 39 Angry Tigers, is over here.
Thanks so much for your support, likes, comments, questions, and sharing!
Interesting. How does it work? Some visualization also would be nice.
Hi! I’ve done some animation experiments separately…the image at the top of the Soundcloud page is from one of them. I haven’t taken the time to combine them yet, but I’ve been thinking about that.
How it works: it uses Perlin noise to tell its virtual fingers how to meander up and down the keyboard, with the note selections forced through a chord template which is changed from time to time. The chord changes are Markov-chained from a corpus that it refers to. The note rate mode also changes from time to time. Things like the density of note coverage, tempo, degree of mutation of chords and the grain of the Perlin noise feed can be controlled by the user. For these compositions, I set it up and pushed the start button. The percussion tracks were added separately in most cases, though one uses a drum VST that’s going through a randomizer in real time.
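The core idea (a smooth noise walk picking raw pitches, then forcing each pitch through a chord template) could be sketched roughly like this in plain Java. The names here (`NoiseWalker`, `snapToChord`) and the simple value-noise stand-in for Processing's `noise()` are my own illustrations, not from the actual sketch:

```java
import java.util.Random;

public class NoiseWalker {
    // Minimal 1D value noise standing in for Processing's noise():
    // random lattice values, linearly interpolated so the walk meanders smoothly.
    static final double[] lattice = new double[256];
    static { Random r = new Random(42); for (int i = 0; i < 256; i++) lattice[i] = r.nextDouble(); }

    static double noise(double t) {
        int i = (int) Math.floor(t);
        double f = t - i;
        double a = lattice[i & 255], b = lattice[(i + 1) & 255];
        return a + f * (b - a);                     // smooth value in [0, 1)
    }

    // Force a raw MIDI pitch through the chord template: snap it to the
    // nearest pitch whose pitch class is in the current chord.
    static int snapToChord(int rawPitch, int[] chordPitchClasses) {
        int best = rawPitch, bestDist = Integer.MAX_VALUE;
        for (int oct = -1; oct <= 1; oct++) {
            for (int pc : chordPitchClasses) {
                int candidate = (rawPitch / 12 + oct) * 12 + pc;
                int d = Math.abs(candidate - rawPitch);
                if (d < bestDist) { bestDist = d; best = candidate; }
            }
        }
        return best;
    }

    public static void main(String[] args) {
        int[] cMajor = {0, 4, 7};                   // chord template: C E G
        for (int step = 0; step < 8; step++) {
            double t = step * 0.3;                  // step size = the "grain" of the noise feed
            int raw = 48 + (int) (noise(t) * 24);   // virtual finger wanders over two octaves
            System.out.println(snapToChord(raw, cMajor));
        }
    }
}
```

Tightening or loosening the step size `0.3` changes how fast the virtual finger meanders, which is presumably what the user-facing "grain" control adjusts.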
I hear Allman Brothers – Statesboro Blues.
Don’t ask me how that got in there!
Most of them have just one lead finger doing the noodling, but that one has two of them noodling independently, so you kind of get that dueling guitar thing going on there. I really can’t say where the boogie rhythm comes from. It does some things that I don’t understand.
Maybe the Allman Brothers used Markov-chain chord progressions!
Thanks for sharing this work, and for a very interesting how-it-works breakdown. Really interesting to listen to these tracks and think about how the Perlin walk and Markov-chain chord state are being used.
Also: the NWW List (!)
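The Markov-chained chord changes mentioned above could look something like this: the next chord is drawn from a transition table derived from a corpus. The table below is invented for illustration, not learned from the actual corpus:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.Random;

public class ChordChain {
    // Hypothetical transition table: for each chord, the chords that may follow.
    // A real corpus-derived table would also carry weighted probabilities.
    static final Map<String, List<String>> transitions = Map.of(
        "I",  Arrays.asList("IV", "V", "vi"),
        "IV", Arrays.asList("I", "V"),
        "V",  Arrays.asList("I", "vi"),
        "vi", Arrays.asList("IV", "V"));

    // Draw the next chord uniformly from the current chord's successors.
    static String next(String chord, Random rng) {
        List<String> options = transitions.get(chord);
        return options.get(rng.nextInt(options.size()));
    }

    public static void main(String[] args) {
        Random rng = new Random();
        String chord = "I";
        StringBuilder progression = new StringBuilder(chord);
        for (int i = 0; i < 7; i++) {
            chord = next(chord, rng);
            progression.append(" ").append(chord);
        }
        System.out.println(progression);    // e.g. "I IV V vi V I IV I"
    }
}
```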
Thanks for checking it out, Jeremy!
Amazing work. You know, when you described the ‘virtual fingers’, I thought that if you could get together with a 3D visuals programmer, you could visualize a person playing the music, something like the unnamed soundsculpture by Daniel Franke & Cedric Kiefer(?)
That would be interesting. At the moment, I’m not capturing the MIDI stream in a file, but that would be the way to go. Send the file to a visual interpreter of some kind.
I’m using the MidiBus…it doesn’t save files, unfortunately. I’m sure there’s some way to get around that.
Amazing thing that you’ve created!
I believe that themidibus is written on top of javax.sound.midi – so you could probably use MidiFileWriter and MidiFileReader if you wish.
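Since themidibus sits on top of javax.sound.midi, one way to capture the generated notes would be to build a `Sequence` alongside the live output and write it out with `MidiSystem.write`. A minimal sketch, with invented note data, assuming the generator can hand its pitches to something like `buildSequence`:

```java
import javax.sound.midi.*;
import java.io.File;
import java.io.IOException;

public class MidiCapture {
    // Pack a list of MIDI pitches into a Sequence, one quarter note each.
    public static Sequence buildSequence(int[] notes) throws InvalidMidiDataException {
        Sequence seq = new Sequence(Sequence.PPQ, 480);   // 480 ticks per quarter note
        Track track = seq.createTrack();
        long tick = 0;
        for (int note : notes) {
            track.add(new MidiEvent(new ShortMessage(ShortMessage.NOTE_ON, 0, note, 96), tick));
            track.add(new MidiEvent(new ShortMessage(ShortMessage.NOTE_OFF, 0, note, 0), tick + 480));
            tick += 480;
        }
        return seq;
    }

    public static void main(String[] args) throws InvalidMidiDataException, IOException {
        Sequence seq = buildSequence(new int[]{60, 64, 67});  // C E G
        File out = new File("capture.mid");
        MidiSystem.write(seq, 0, out);                        // type-0 standard MIDI file
        System.out.println("wrote " + out.length() + " bytes to " + out.getName());
    }
}
```

The saved file could then be fed to a visual interpreter, as discussed above, or reloaded with `MidiSystem.getSequence`.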
Thanks for sending that my way, Jeremy. I can’t say I really understand how to implement MidiFileWriter in my code, but that’s a low priority for me right now.
At the moment I’m looking into adding an option to swing the rhythm (I’m halfway there, I think…it’s screwy) and doing nested iterations. If/when I get those to work right, I’ll post the results.
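One common way to swing a straight eighth-note grid, offered here only as a sketch of the general technique (not the author's in-progress implementation): leave on-beat eighths where they are and delay each off-beat eighth by a swing ratio, where 2/3 of a beat gives the classic triplet feel.

```java
public class Swing {
    // Map a straight eighth-note index to a swung tick position.
    // swing = 0.5 is straight time; swing = 2/3 is a triplet shuffle.
    static long swungTick(int eighthIndex, int ticksPerQuarter, double swing) {
        long beatStart = (eighthIndex / 2) * (long) ticksPerQuarter;
        if (eighthIndex % 2 == 0) return beatStart;            // on-beat: unchanged
        return beatStart + Math.round(ticksPerQuarter * swing); // off-beat: pushed late
    }

    public static void main(String[] args) {
        for (int i = 0; i < 4; i++)
            System.out.println(swungTick(i, 480, 2.0 / 3.0));  // 0, 320, 480, 800
    }
}
```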
Two recently generated pieces, “Gorgeous Charlatan” and “Light Sails,” feature a newly swingy rhythm.