I have a sketch which responds to some audio, and I am happy with how it works. If I feed it a soundfile, that also works as expected. My problem occurs when I want to capture the resultant images to disc, so that I can make a movie later. The frames take longer than desired to render, which means that (a) I get fewer frames than expected; and (b) a movie made with those frames quickly goes out of sync.
I wonder whether there is a way of running the sketch in an ‘offline’ mode, where the audio playback is synchronised to the frame rate? I don’t need to listen whilst it’s rendering, so I don’t care about smooth audio playback at render time. I tried calling cue() and play() from the draw() function, but that seemed to mess up the FFT I was using…!
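For what it’s worth, the mapping I want between rendered frames and audio time is simple: at 25 fps, frame n corresponds to n/25 seconds into the file, or n/25 × sampleRate samples. A minimal illustration in plain Java (the helper names are mine, not part of the Sound library):

```java
public class FrameAudioSync {
    // Seconds into the audio that frame n should correspond to, at the given fps.
    static float frameToSeconds(int frame, float fps) {
        return frame / fps;
    }

    // Sample offset into the audio for frame n, at the given sample rate.
    static int frameToSample(int frame, float fps, int sampleRate) {
        return Math.round(frame * (float) sampleRate / fps);
    }

    public static void main(String[] args) {
        // At 25 fps and 44.1 kHz, each frame covers 44100 / 25 = 1764 samples.
        System.out.println(frameToSeconds(25, 25f));      // 1.0
        System.out.println(frameToSample(1, 25f, 44100)); // 1764
    }
}
```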
Simplified code is here (the real code does some mangling of the PGraphics object, and makes the frame-render take longer still):
import processing.sound.*;

final int DISCS = 64;
final int SPACING = 30;
final int DIAMETER = 600;

PGraphics pg;
FFT fft;
SoundFile file;

float r1, r2, r3;   // accumulated rotation angles
float o = 0;        // noise offset
float[] levels = new float[DISCS * 16];
int d = 0;

void setup() {
  size(960, 540, P3D);
  frameRate(25);
  pg = createGraphics(960, 540, P3D);

  // Load a soundfile from the /data folder of the sketch and play it back
  file = new SoundFile(this, "14.wav");
  file.play();
  fft = new FFT(this, DISCS * 16);
  fft.input(file);
}

void draw() {
  fft.analyze(levels);

  pg.beginDraw();
  pg.background(0);   // was background(0), which cleared the main canvas, not pg
  pg.noFill();
  pg.lights();
  pg.translate(width / 2, height / 2, 0);
  r1 += noise(o) * PI;
  r2 += noise(o) * PI;
  r3 += noise(o) * PI;
  pg.rotateX(r1);
  pg.rotateY(r2);
  pg.rotateZ(r3);
  pg.pushMatrix();
  pg.translate(0, 0, -DISCS * levels[2] * SPACING / 2);
  for (int i = 0; i < DISCS; i++) {
    pg.translate(0, 0, levels[2] * SPACING);
    pg.stroke(255 * noise(i + o), 255 * noise(1 + i + o), 255 * noise(2 + i + o));
    pg.ellipse(0, 0, DIAMETER * levels[(d + i) % DISCS], DIAMETER * levels[(d + i) % DISCS]);
  }
  pg.popMatrix();
  pg.endDraw();

  o += 0.01;
  image(pg, 0, 0);
  // saveFrame("f-#######.png");
}