I’m running into a little problem while trying to improve my basic video player project (which is working) on a Pi.
I would like to move the creation of my GLMovie from setup() to my OSC callback.
I’ve read a bit about the movie player class and changed “this” to a PApplet reference, but it’s still not working…
Here is basically what I’m doing:
PApplet app;
GLMovie video;

void setup() {
  app = this;
}

void oscEvent(OscMessage theOscMessage) {
  if (theOscMessage.checkAddrPattern("/select") == true) {
    video = new GLMovie(app, playlist[val]);  // playlist and val are set elsewhere
  }
}
I can’t find more resources to understand what I’m doing wrong here, so any clue would be great!
Thanks!
(Full code below in case it’s useful.)
import gohai.glvideo.*;
import oscP5.*;
import netP5.*;

PApplet app;
GLMovie[] video;
OscP5 oscP5;
NetAddress master;
ArrayList<Integer> mIndex = new ArrayList<Integer>();
int num;
int countPos = 0;
int mCurrent = 0;
boolean lastvid = false;
boolean changevid = true;
String playlist[];
String config[];

void setup() {
  size(1280, 720, P2D);
  app = this;
  config = loadStrings("config.txt");
  oscP5 = new OscP5(this, int(config[1]));
  master = new NetAddress(config[2], int(config[3]));
  playlist = loadStrings("playlist.txt");
  video = new GLMovie[6]; // maximum number of videos here, not "playlist.length" (too heavy)
}
void draw() {
  background(0);
  if (mIndex.isEmpty() != true) {
    if (changevid == true) {
      video[countPos].jump(0);
      changevid = false;
    }
    if (lastvid == false) {
      if (video[countPos].available()) {
        video[countPos].read();
        video[countPos].volume(1.0);
        video[countPos].play();
        if (video[countPos].duration() != 0.0 && float(nf(video[countPos].time(), 3, 1)) >= video[countPos].duration()-0.8) {
          if (countPos < num-1) {
            video[countPos+1].read();
            countPos = countPos + 1;
            changevid = true;
          } else {
            video[countPos].noLoop();
            video[countPos].volume(0.0);
            lastvid = true;
            changevid = true;
            OscMessage end = new OscMessage("/end");
            end.add(1);
            oscP5.send(end, master);
          }
        }
      }
    } else {
      video[countPos].noLoop();
    }
    image(video[countPos], 0, 0, width, height);
  }
}
void oscEvent(OscMessage theOscMessage) {
  if (theOscMessage.checkAddrPattern("/select") == true) {
    println("received");
    num = theOscMessage.typetag().length();
    println("num type tag is " + num);
    mIndex.clear();
    for (int i = 0; i < num; i++) {
      int val = theOscMessage.get(i).intValue();
      println(val);
      mIndex.add(val);
      println(playlist[val]);
      video[i] = new GLMovie(app, playlist[val]);
    }
    countPos = 0;
    lastvid = false;
    changevid = true;
    println(mIndex);
  }
}
@gohai thanks for your answer.
I get a NullPointerException on the line where the video is used for the first time (the 4th line of draw(): video[countPos].jump(0);).
The GLMovie doesn’t get created: my println(playlist[val]); is executed only once, and the program basically stops right after trying to create the first GLMovie instance.
I’ve also tried it in a more straightforward way to make sure the problem doesn’t come from somewhere else, but video = new GLMovie(app, "myvideoname.mp4"); shows the same behavior.
One possibility is that video[countPos] is null at this point. I don’t know if your code fully guards this line from being executed under these conditions, but you can easily find out by changing the 4th line of draw() to:
if (video[countPos] == null) {
  println("Trying to call jump on an uninitialized GLVideo instance, something's wrong");
} else {
  video[countPos].jump(0);
}
The other possibility is that GLMovie just doesn’t like the jump() method being called before play(). (Actually: you seem to be calling play() in rather unorthodox places; why would you call it after available() returns true?)
Try a simple sketch that does just this in isolation to find out:
GLMovie video = new GLMovie(app, "myvideoname.mp4");
video.jump(0);
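Something along these lines should do, just a sketch of the idea (assuming "myvideoname.mp4" sits in the sketch’s data folder, and using this directly since it runs standalone):

import gohai.glvideo.*;

GLMovie video;

void setup() {
  size(1280, 720, P2D);
  // construct the movie and call jump() right away, before play(), to see whether that is what fails
  video = new GLMovie(this, "myvideoname.mp4");
  video.jump(0);
  video.play();
}

void draw() {
  background(0);
  if (video.available()) {
    video.read();
  }
  image(video, 0, 0, width, height);
}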
Testing video[countPos] == null confirmed that the GLVideo instance is uninitialized, as expected.
I have a version of the code working where I initialize the GLMovie instances in setup(), so even if my code is a bit clumsy, it is supposed to work.
What I intend to do is send a playlist via OSC, where each argument of the message is the index of a video sequence to play. The videos play one after another automatically; when the end of the last video is reached, playback freezes and waits for the next message, which triggers the same behavior again.
The problem with initializing the GLMovie instances in setup() is that if I go over something like 10 of them, it obviously crashes, and I intend to have hundreds of videos in my playlist.txt…
So I might be doing something wrong in the way I initialize the GLMovie in my OSC callback, but I can’t figure out what. Or is it just a bad way of doing what I want? I’ve found people wrapping video in a class using processing.video.*, but never with GLVideo… Could it be a limitation of the library somehow?
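For reference, the /select message I send from the controlling machine looks roughly like this on the sending side (a sketch only; the IP and ports are placeholders, the real ones come from config.txt):

import oscP5.*;
import netP5.*;

OscP5 oscP5;
NetAddress player;

void setup() {
  oscP5 = new OscP5(this, 9000);                  // local listening port (placeholder)
  player = new NetAddress("192.168.1.20", 12000); // address and port of the player sketch (placeholders)

  // each int argument is an index into playlist.txt on the player
  OscMessage sel = new OscMessage("/select");
  sel.add(3);
  sel.add(0);
  sel.add(7);
  oscP5.send(sel, player);
}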
Below is my working code with the GLMovie initialization in setup(), in case it’s useful.
import gohai.glvideo.*;
import oscP5.*;
import netP5.*;

GLMovie[] video;
OscP5 oscP5;
NetAddress master;
ArrayList<Integer> mIndex = new ArrayList<Integer>();
int num;
int countPos = 0;
int mCurrent = 0;
boolean lastvid = false;
boolean changevid = true;
String playlist[];
String config[];

void setup() {
  size(1280, 720, P2D);
  config = loadStrings("config.txt");
  oscP5 = new OscP5(this, int(config[1]));
  master = new NetAddress(config[2], int(config[3]));
  playlist = loadStrings("playlist.txt");
  video = new GLMovie[playlist.length];
  for (int i = 0; i < playlist.length; i++) {
    video[i] = new GLMovie(this, playlist[i]);
  }
}
void draw() {
  background(0);
  if (mIndex.isEmpty() != true) {
    if (changevid == true) {
      video[mIndex.get(countPos)].jump(0);
      changevid = false;
    }
    if (lastvid == false) {
      if (video[mIndex.get(countPos)].available()) {
        video[mIndex.get(countPos)].read();
        video[mIndex.get(countPos)].volume(1.0);
        video[mIndex.get(countPos)].play();
        if (video[mIndex.get(countPos)].duration() != 0.0 && float(nf(video[mIndex.get(countPos)].time(), 3, 1)) >= video[mIndex.get(countPos)].duration()-0.8) {
          if (countPos < num-1) {
            video[mIndex.get(countPos+1)].read();
            countPos = countPos + 1;
            changevid = true;
          } else {
            video[mIndex.get(countPos)].noLoop();
            video[mIndex.get(countPos)].volume(0.0);
            lastvid = true;
            changevid = true;
            println("last video");
            OscMessage end = new OscMessage("/end");
            end.add(1);
            oscP5.send(end, master);
          }
          println(float(nf(video[mIndex.get(countPos)].time(), 3, 1)));
          println(video[mIndex.get(countPos)].duration());
        }
      }
    } else {
      video[mIndex.get(countPos)].noLoop();
    }
    image(video[mIndex.get(countPos)], 0, 0, width, height);
  }
}
void oscEvent(OscMessage theOscMessage) {
  if (theOscMessage.checkAddrPattern("/select") == true) {
    num = theOscMessage.typetag().length();
    mIndex.clear();
    for (int i = 0; i < num; i++) {
      int val = theOscMessage.get(i).intValue();
      mIndex.add(val);
    }
    countPos = 0;
    lastvid = false;
    changevid = true;
    println(mIndex);
  }
}
Does adding synchronized to all 3 methods help? And thinking about it, make sure the GLMovie constructor is called in draw(): isn’t oscEvent called on another thread, which therefore has no valid context?
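Roughly like this, for instance (just a sketch of the idea, untested and stripped down to playing only the first requested video; pendingIndices is a made-up name and the port is a placeholder): oscEvent only records the requested indices, and draw() does the actual GLMovie construction on the sketch’s own thread, with draw() and oscEvent() marked synchronized so they can’t overlap.

import gohai.glvideo.*;
import oscP5.*;
import netP5.*;

GLMovie[] video;
String[] playlist;
OscP5 oscP5;
ArrayList<Integer> pendingIndices = new ArrayList<Integer>(); // indices received via OSC but not loaded yet

void setup() {
  size(1280, 720, P2D);
  playlist = loadStrings("playlist.txt");
  video = new GLMovie[6];
  oscP5 = new OscP5(this, 12000); // placeholder port
}

// only record what was requested; never touch GLMovie from the OSC thread
synchronized void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/select")) {
    pendingIndices.clear();
    for (int i = 0; i < m.typetag().length(); i++) {
      pendingIndices.add(m.get(i).intValue());
    }
  }
}

synchronized void draw() {
  background(0);
  // create the movies here, on the animation thread, where a valid context exists
  if (!pendingIndices.isEmpty()) {
    for (int i = 0; i < pendingIndices.size() && i < video.length; i++) {
      video[i] = new GLMovie(this, playlist[pendingIndices.get(i)]);
    }
    video[0].play();
    pendingIndices.clear();
  }
  if (video[0] != null) {
    if (video[0].available()) {
      video[0].read();
    }
    image(video[0], 0, 0, width, height);
  }
}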
I know it’s a different matter, but if you have any clue how to re-create the same GLMovie several times, that would be great (or is there a better way to do that?)
Have you synchronized oscEvent and draw? The code you have is going to get really mixed up if they overlap.
I can’t comment on the specifics of the GLVideo library, but probably the best way would be to convince @gohai to add access to change the uri on the underlying GStreamer PlayBin so that you only need one GLMovie to do this!
How many GLMovie instances do you want to have in parallel? How often would you re-instantiate them?
@neilcsmith It’s been a long time since I wrote this, so I don’t recall the exact reasons for why things ended up the way that they are now, but it sure made sense to me at the time… I am guessing that when you change the URI, the playback would be interrupted, and GStreamer would first read in the new capabilities before re-assembling a new flowgraph (the new file might use a different audio codec for example). I guess this looked just less appealing in practice than having, say, two GLMovie instances, each backed by their own PlayBin, which can be completely independently controlled, and one can switch between them seamlessly.
See the TwoVideos sketch that comes with the library for an example.
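The rough idea, stripped down (this is not the actual TwoVideos sketch, just a minimal version of the pattern; clipA.mp4 and clipB.mp4 are placeholder file names): each movie runs in its own PlayBin, so switching is only a matter of which one gets drawn and heard.

import gohai.glvideo.*;

GLMovie[] movies = new GLMovie[2];
int current = 0; // which of the two instances is on screen

void setup() {
  size(1280, 720, P2D);
  movies[0] = new GLMovie(this, "clipA.mp4"); // placeholder file names
  movies[1] = new GLMovie(this, "clipB.mp4");
  movies[0].play();
  movies[1].play();
  movies[1].volume(0.0); // mute the one that is not on screen
}

void draw() {
  background(0);
  for (int i = 0; i < movies.length; i++) {
    if (movies[i].available()) {
      movies[i].read(); // keep both movies advancing
    }
  }
  image(movies[current], 0, 0, width, height);
}

void keyPressed() {
  // switch between the two independently running movies
  movies[current].volume(0.0);
  current = 1 - current;
  movies[current].volume(1.0);
}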
I only intend to be working on the GLVideo library again when Raspbian completes their switchover to the Mesa VC4 OpenGL driver, at which point support for video decode and the camera will have to be revisited entirely.
@gohai yes, you stop playback and it reconstructs the pipeline as it needs, but it’s still markedly more efficient than recreating everything. You might need two to mix, true, although IIRC there’s also a signal for gapless playback - not tried it.
To be fair my own old video code did what you’re doing. But it was changed when rewriting for PraxisLIVE v4 (partly to support live recoding). Trying to work out feasibility of porting it across to vanilla Processing. Perhaps we can join all the dots back together when you’re relooking at this and have one video library for everyone?
@neilcsmith What I’d be advocating for: having a clean, minimal Java video library on top of your GStreamer 1.x bindings. No games with the GPU, or other risky stuff (CPU and memory are generally fast enough anyhow). Minimal maintenance needs.
And a GLVideo library just for the Pi (and possibly other ARM boards via OMX etc), which continues to be written using JNI, and does some tricks to try to extract the most out of this particular platform.
Ah, OK. I’ll be actively looking at GPU support across the board soon personally, probably behind a flag though; would that not work for you? I’ve got multiple bug reports about CPU usage being too high for FHD / multiple videos. And the Pi is a target for me too. GL will be in the bindings soon. And JNA is great, you know!
Thanks a lot @gohai @neilcsmith!
I’ve just tried it with synchronized and it works!
I think I would need approximately 10 GLMovie instances (that would be the maximum number of items in my playlist), and I would re-instantiate them about every 2 minutes (when they are done playing). I am basically doing some kind of live-edit mashup with a big video database.
The code is still a bit slow (I still get a few black frames between videos), but I’m also running it on a Pi Zero right now; the final project will be on a Pi 3.
The rest of the conversation seems really interesting, but it’s out of reach for me.