Hi everybody, I recently started working on a project I’d had in mind for quite some time. I’ve reached a good point, but now that I have to integrate the interactive part, I have a question.
This is the idea: I’m creating objects dynamically and currently storing them in an ArrayList. My goal is to use The MidiBus library to animate each object based on incoming MIDI messages (Note On/Note Off). For those who are not familiar with MIDI messages: a Note On carries a note number (0-127) and a note velocity (0-127) and is sent when the note is played; a Note Off is sent when the note stops.
One of the properties of the objects I create is the MIDI note associated with each object, stored as an integer. This property is not set sequentially, so, for example, the MIDI note associated with object 0 in the array could be 24, the one associated with object 1 could be 12, etc.
Now, since my goal is real-time interaction: what’s the best way to find the object in my ArrayList with the matching MIDI note? To continue the earlier example: when the incoming MIDI note is 12, the object at array position 1 should be animated.
I don’t think looping through the whole array every time is the best solution. Could a HashMap be the answer?
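In code, the lookup I’m imagining would be something like this (Shape is just a stand-in for my actual class):

```java
import java.util.HashMap;

// Stand-in for my real class: the only relevant part here is the midiNote field.
class Shape {
  int midiNote;  // the note this object "listens" to
  Shape(int midiNote) {
    this.midiNote = midiNote;
  }
  void startAnimation() {
    // animation code would go here
  }
}

HashMap<Integer, Shape> shapesByNote = new HashMap<Integer, Shape>();

// Register each object under its MIDI note when it's created:
void registerShape(Shape s) {
  shapesByNote.put(s.midiNote, s);
}

// On an incoming Note On, the lookup is a single hash access
// instead of a linear scan over the ArrayList:
void handleNoteOn(int pitch) {
  Shape s = shapesByNote.get(pitch);  // null if nothing listens to this note
  if (s != null) {
    s.startAnimation();
  }
}
```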
Good! I’m glad it’s as I thought. I’d like to ask you one more thing: the interactive part only concerns MIDI, but for the other properties I’d like the objects to also stay in the ArrayList I’ve already created. Does it make sense to have both the ArrayList and the HashMap working in parallel? I mean: when working with MIDI I refer to the HashMap, while for everything else I keep using the ArrayList as I’m doing now.
Could there be any disadvantages to this approach?
You’re right, but the problem is that I’m assigning the MIDI note manually, after the object is created, with a dropdown menu built in ControlP5 (CP5). So maybe I will have to keep both the ArrayList and the HashMap. I will definitely write all the code to test my ideas and then refactor it at some point in the future.
Thanks! Now, thanks to you, I remembered that in a past project I used a LinkedHashMap, but I definitely need to study the documentation again.
LinkedHashMap is a subclass which extends HashMap, so I doubt it could be any speedier than its parent class.
Besides, I can’t remember seeing any sketch in this forum which required a container to keep insertion order in order to work, given that display coordinates are obtained from x & y fields, not from indices or keys.
Can’t you pre-create all the MIDI notes you’re gonna need?
If not, you can also instantiate the object for that MIDI note on demand and store it in the HashMap.
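Something along these lines, for example (MidiNote is a hypothetical class here):

```java
import java.util.HashMap;

// Hypothetical class representing one MIDI note:
class MidiNote {
  int pitch;
  MidiNote(int pitch) {
    this.pitch = pitch;
  }
}

HashMap<Integer, MidiNote> notes = new HashMap<Integer, MidiNote>();

// Returns the object for this pitch, creating and storing it
// the first time that pitch is ever seen:
MidiNote getOrCreate(int pitch) {
  MidiNote n = notes.get(pitch);
  if (n == null) {
    n = new MidiNote(pitch);
    notes.put(pitch, n);
  }
  return n;
}
```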
Compared with a plain HashMap, a LinkedHashMap:

- will be slightly slower when inserting data into the map, because of the need to maintain the linked list;
- will be the same speed when retrieving a map element, because retrieval doesn’t touch the linked list (any difference would be on the order of nanoseconds);
- provides a guaranteed order when iterating over the map elements, which is not the case with a HashMap.
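A quick sketch to see the iteration-order difference:

```java
import java.util.*;

void setup() {
  Map<Integer, String> plain = new HashMap<Integer, String>();
  Map<Integer, String> linked = new LinkedHashMap<Integer, String>();

  int[] pitches = { 64, 12, 127, 36 };
  for (int p : pitches) {
    plain.put(p, "note " + p);
    linked.put(p, "note " + p);
  }

  println(plain.keySet());   // order is unspecified by HashMap
  println(linked.keySet());  // guaranteed insertion order: [64, 12, 127, 36]
}
```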
I agree, and I have never had the need to use one myself, but the OP obviously has.
Well, I’m proposing a unified HashMap container for both the MIDI objects and the “animated” objects, which would be represented by a single class.
As for a dual-container approach, where one container is index-based and the other is key-based: that still doesn’t necessarily mean insertion order is strictly required. It’s a valid alternative for coupling both containers, although a bit fragile.
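If you do go for two containers, keep in mind that both hold references to the very same objects, so there’s no duplication; the fragility is that every add or remove has to touch both (reusing the hypothetical Shape class from the earlier sketch):

```java
import java.util.ArrayList;
import java.util.HashMap;

ArrayList<Shape> shapes = new ArrayList<Shape>();                     // iteration, drawing, etc.
HashMap<Integer, Shape> shapesByNote = new HashMap<Integer, Shape>(); // MIDI lookup

// Every mutation has to go through helpers like these,
// otherwise the two containers silently drift apart:
void addShape(Shape s) {
  shapes.add(s);
  shapesByNote.put(s.midiNote, s);
}

void removeShape(Shape s) {
  shapes.remove(s);
  shapesByNote.remove(s.midiNote);
}
```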
No, because my idea is to work on the visualisation in Processing at the same time as the music is being made. So I cannot know in advance how many MIDI messages I will send to Processing, nor which ones. I will constantly tweak the sketch until I reach a satisfying result.
The sketch could also be used and tweaked in live performances.
So there’s no clear link between an incoming MIDI note and the “animation” object.
I suppose you have a separate algorithm that decides which “animation” to display for each MIDI note.
In that case a two-container approach seems appropriate.
I haven’t started working on the animation yet, but the idea is that the animation is “embedded” in the object itself (as a function or something similar… I was also looking at the Ani library) and the incoming MIDI note is just a trigger. Each object has its own animation that starts when a specific MIDI note is received. As I said before, I’m choosing with a dropdown menu which MIDI message each object will “listen” to.
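Roughly what I have in mind with The MidiBus, then (startAnimation() and the re-keying helper are placeholders for now, and the device indices depend on the machine; MidiBus.list() prints what’s available):

```java
import themidibus.*;
import java.util.HashMap;

MidiBus bus;
HashMap<Integer, Shape> shapesByNote = new HashMap<Integer, Shape>();

void setup() {
  size(800, 600);
  MidiBus.list();                // prints available MIDI devices to the console
  bus = new MidiBus(this, 0, 1); // input/output device indices: adjust to your setup
}

void draw() {
  background(0);
  // draw all objects from the ArrayList here
}

// Called by The MidiBus whenever a Note On message arrives:
void noteOn(int channel, int pitch, int velocity) {
  Shape s = shapesByNote.get(pitch);
  if (s != null) {
    s.startAnimation();  // velocity could be passed in to scale the animation
  }
}

// Wired to the CP5 dropdown: re-key the object when I change its note.
void assignNote(Shape s, int newNote) {
  shapesByNote.remove(s.midiNote);
  s.midiNote = newNote;
  shapesByNote.put(newNote, s);
}
```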
This was an experiment from 2017:
And this is a repository from a couple of years ago: