Hi all,
I’ve got a project where I’m drawing a sketch and saving the frames as a video using ffmpeg. My project is an ElectronJS desktop app, so I have access to both the browser and the operating system. The problem I’m facing is that performance is very slow because of the way I’m capturing each canvas frame. Here’s my current workflow in a nutshell:
- I create an image stream (new PassThrough) that is piped to my ffmpeg process
- After drawing each frame, I convert it to a blob with canvas.toBlob()
- I convert the blob to an ArrayBuffer with blob.arrayBuffer()
- And wrap that in a Node Buffer with Buffer.from()
- I write the buffer to the image stream with stream.write()
This works well, but canvas.toBlob() is incredibly slow compared with everything else. Would anyone know how to optimise this?
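In case it’s easier to see as code, here’s a stripped-down version of that pipeline (this assumes node integration in the renderer, and the ffmpeg flags are only an approximation of my real command):

```js
// Simplified version of my current pipeline.
const { PassThrough } = require('stream');
const { spawn } = require('child_process');

const imageStream = new PassThrough();

// ffmpeg reads a sequence of PNGs from stdin and encodes them into a video
const ffmpeg = spawn('ffmpeg', [
  '-y',
  '-f', 'image2pipe',
  '-framerate', '30',
  '-i', '-',
  'out.mp4',
]);
imageStream.pipe(ffmpeg.stdin);

// Called once per frame, after the sketch has drawn it
function writeFrame(canvas) {
  return new Promise((resolve) => {
    // canvas.toBlob() is the slow part I'd like to get rid of
    canvas.toBlob(async (blob) => {
      const arrayBuffer = await blob.arrayBuffer();
      imageStream.write(Buffer.from(arrayBuffer));
      resolve();
    }, 'image/png');
  });
}
```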
I’m guessing there is no way for p5 to draw to anything other than a canvas, i.e. something that can be converted to a blob or buffer more quickly, is there? I think even createGraphics() uses a hidden canvas under the hood. And I don’t know whether we can draw to an OffscreenCanvas (which may not be any faster to save, either).
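For what it’s worth, the OffscreenCanvas experiment I had in mind looks roughly like this. It’s untested, it assumes OffscreenCanvas.convertToBlob() is available in Electron’s Chromium, and it would still mean copying the p5 canvas into the OffscreenCanvas every frame:

```js
// Hypothetical: mirror the p5 canvas into an OffscreenCanvas and use
// convertToBlob() instead of toBlob(). I haven't verified it's any faster.
const srcCanvas = document.querySelector('canvas'); // the p5 canvas
const offscreen = new OffscreenCanvas(srcCanvas.width, srcCanvas.height);
const offCtx = offscreen.getContext('2d');

async function frameToBuffer() {
  offCtx.drawImage(srcCanvas, 0, 0);
  const blob = await offscreen.convertToBlob({ type: 'image/png' });
  return Buffer.from(await blob.arrayBuffer());
}
```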
I also want to experiment with canvas.captureStream() + MediaRecorder, but I doubt I can pipe that to ffmpeg directly.
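If I did try that route, I imagine it would look something like the sketch below, writing the MediaRecorder chunks to the same ffmpeg stdin as above, but I have no idea whether ffmpeg copes with a WebM streamed in pieces like this (the MIME type and timeslice are guesses):

```js
// Untested sketch: record the canvas and push each recorded chunk to ffmpeg.
// `ffmpeg` is the same child process as in the first snippet.
const canvas = document.querySelector('canvas');
const stream = canvas.captureStream(30); // capture at ~30 fps
const recorder = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=vp9' });

recorder.ondataavailable = async (event) => {
  if (event.data.size > 0) {
    ffmpeg.stdin.write(Buffer.from(await event.data.arrayBuffer()));
  }
};
recorder.onstop = () => ffmpeg.stdin.end();

recorder.start(1000); // ask for a chunk roughly every second
```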
I think the built-in saveFrames() wouldn’t help either, as it tries to download the files directly. And it may use toBlob() under the hood anyway. Not sure.
Any ideas are appreciated.