Live coding visualisation

Hi there,

For a couple of weeks I have been experimenting with live coding (amazing :slight_smile:) using FoxDot (in Python), and I would like to write a program in Processing to visualise it. I tried the oscP5 library to receive OSC messages from FoxDot (and it worked), but that's not what I want: I can't extract the tones and notes that are being played.

I looked into capturing the sound from my computer with the minim library, as in this video, but I can't figure out how:


It sounds like you want Processing to respond to live audio rather than OSC. Did you try one of the basic minim audio input examples that come with the library, and did they work for you?

If not, which example, what didn’t work / how didn’t it work?
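For reference, here is a minimal sketch along the lines of those examples. It is an assumption of what you are after, not the exact example shipped with minim: it opens the system audio input with `getLineIn()` and draws the live waveform. You would need to route FoxDot's output into that input (for instance with a loopback or virtual audio device).

```java
// Minimal live-input waveform sketch using minim (assumes the minim
// library is installed via Processing's Contribution Manager).
import ddf.minim.*;

Minim minim;
AudioInput in;

void setup() {
  size(512, 200);
  minim = new Minim(this);
  // getLineIn opens the default system audio input with a 512-sample buffer.
  in = minim.getLineIn(Minim.STEREO, 512);
}

void draw() {
  background(0);
  stroke(255);
  // Draw the left channel's sample buffer as a waveform.
  for (int i = 0; i < in.bufferSize() - 1; i++) {
    float y1 = height / 2 + in.left.get(i) * 50;
    float y2 = height / 2 + in.left.get(i + 1) * 50;
    line(i, y1, i + 1, y2);
  }
}
```

If the waveform stays flat, the sketch is most likely listening to the wrong input device, which is a routing issue at the operating-system level rather than in the code.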

@josephh I recommend coming over and asking on talk.lurk as well. Many people in the live coding community are active there, including the author of FoxDot, and there have been various conversations about music/visual communication, including with Processing.


Thanks for your answer. I tried the minim AudioInput example, but by default I get "Input monitoring is currently disabled", and no sound comes through even when I press 'm' to enable it.

Yeah, I am already on talk.lurk; that's why I wanted to ask on the Processing forum about the minim library, since I can't get what I want with OSC.