Video Crashes with JNA: could not detach thread

Hello everyone,

I’ve recently started using the video functions of Processing for a uni project. Unfortunately, I seem to run into the error below on every machine I work on.

Even with the most basic playback sketch I always get the error “JNA: could not detach thread”,
eventually crashing the playback (see screenshot).
I have tried reinstalling Processing and Java, but nothing changes. What bothers me more is that it sometimes runs through fine.

Any idea what might cause this?

import processing.video.*;
Movie myMovie;
void setup() {
  size(200, 200);
  myMovie = new Movie(this, "noisestorm_crab_rave.mp4");
  myMovie.play();
  myMovie.volume(0);
}
void draw() {
}
void movieEvent(Movie m) {
  m.read();
  println(m.time());
}


OK, I tried and checked some things.

It seems like the "void movieEvent(Movie m)" function is causing the problem.
I had not checked it before, and when I handle all the updates in void draw(), with the movie.available() function, it works.

So for now it’s working well.


Hi,
this seems strange to me, but it looks like I have it figured out now.
Maybe it was my error in not understanding how the video library works, or it’s an odd problem.
The following seems to generate a proper video stream with the correct frame count, at playback rates up to 6x, without running into any JNA errors or crashes.

import processing.video.*;
Movie myMovie;
boolean ended = false;
float playbackSpeed = 6.0;
float videoFrameRate = 29.97;
void setup() {
  size(200, 200);
  /* End of playback function by ajohnsonlaird
     https://forum.processing.org/two/discussion/26225/movie-duration-always-returns-0-0-framerate-and-speed-do-not-seem-to-affect-playback
  */
  myMovie = new Movie(this, "noisestorm_crab_rave.mp4") {
    @Override public void eosEvent() {
      super.eosEvent();
      myEoS();
    }
  };
  myMovie.play();
  myMovie.speed(playbackSpeed);
  myMovie.volume(0);
  myMovie.noLoop();
  myMovie.frameRate(playbackSpeed * videoFrameRate);
}

void draw() {
  if (myMovie.available() && !ended) {
    myMovie.read();
    println(myMovie.time());
  }
  image(myMovie, 0, 0);
}

// Called every time a new frame is available to read
void movieEvent(Movie m) {
  if (!ended) {
    m.read();
  }
}

void myEoS() {
  ended = true;
}

Hi there,

Thank you so much for the solution. I have a sketch that uses a video as input, reads the brightness from video.pixels[location], and then draws many 3D spheres at the x, y coordinates. It is based on the Brightness Mirror example from “Learning Processing” by Daniel Shiffman.
For some reason I have been having a hard time getting the whole sketch and the movie to play at the appropriate frameRate / speed, and getting the resulting video export file (made with @hamoid’s great VideoExport library) to have the correct duration. I have tried many solutions using speed() but cannot seem to get it to work: movie.speed(1.0) does not work, it plays super fast; movie.speed(0.1) makes it slower, but the resulting duration is half of the source movie duration, which is very strange. I feel that it defaults to something during setup and cannot be changed afterwards.

I have found your solution here, and had also found this solution by @GoToLoop
https://forum.processing.org/two/discussion/14990/movie-begin-and-end-events.html#Item_1

If I understand correctly, this gives you more direct access to what is running the movie under the hood?

My problem is that when I try both solutions, in the Processing IDE, I get the following error messages on the console and the sketch does not run.

 myMovie = new Movie(this, "MovieFile.mp4") {
   @Override public void eosEvent() {
     super.eosEvent();
     myEoS();
   }
 };
Both times eosEvent() appears in the quoted block, it is underlined in red by the IDE, and the console message reads:

“The function eosEvent() does not exist.”

Then when I try to run the sketch, the error message on the console reads:

The method eosEvent() of type new Movie(){} must override or implement a supertype method

My system is:
MacOs 10.13.6 High Sierra
My version of Processing is 3.5.4 (the newest I can run on my system)
Video Library for Processing 3, version 2.0 The Processing Foundation
My version of Java when I run $ java -version in my terminal is:
openjdk version “1.8.0_265”
OpenJDK Runtime Environment (AdoptOpenJDK)(build 1.8.0_265-b01)
OpenJDK 64-Bit Server VM (AdoptOpenJDK)(build 25.265-b01, mixed mode)

I read a bit online and I think it has to do with the compiler I have, but I do not seem to understand how to fix it. Many thanks for taking the time to read this.

Cheers from the Caribbean :slight_smile:

Hi :slight_smile: Movie speed related issues are very common when recording video. I should write about it and make it very visible in the video export library page :slight_smile:

In your program, is it important that the timing looks correct while running the program and also in the produced video? Or is it ok that while the video is being produced the program runs slow?

If what you care about is that the final video has exactly the duration you want, you can calculate how many animation frames you need and produce exactly that number of frames (also specifying the frame rate of the produced video).

If you are using a video as input, you can first convert that video into single frames (jpg or png), then load them one by one while producing the video clip. That way you know exactly how many input frames you have, and how many frames you output (probably the same number).
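As a rough sketch of that bookkeeping (plain Java; the names and the zero-padded ffmpeg-style naming scheme are my assumptions, not code from this thread):

```java
// Given a clip's duration and frame rate, compute how many extracted
// frames to expect and build their zero-padded filenames.
public class FrameNames {
  // Total frames in a clip of the given length at the given fps
  static int frameCount(double durationSeconds, double fps) {
    return (int) Math.round(durationSeconds * fps);
  }

  // ffmpeg's default "%04d" padding is assumed here
  static String frameName(String base, int index) {
    return String.format("%s_%04d.png", base, index);
  }

  public static void main(String[] args) {
    int n = frameCount(8.0, 30.0);            // 8 s at 30 fps -> 240 frames
    System.out.println(n);                     // 240
    System.out.println(frameName("clip", 1));  // clip_0001.png
    System.out.println(frameName("clip", n));  // clip_0240.png
  }
}
```

Knowing the exact frame count up front is what lets the export loop save exactly that many frames, regardless of how slowly the sketch runs.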

Would that work for you?


I’ve remade that old sketch as a new subclass so it works with the current video library:


Hi Sir

You are bringing a nice flavor to this community


Hola Abe :slight_smile:

 Thank you for all your tutorials and things you share to help others learn. You have a big fan in the Dominican Republic.

This would be amazing!!! I have been having a lot of trouble with this, running into the same wall again and again. At the moment I am very excited building custom filters for myself (projections, visual pieces, VJing, etc.) and I enjoy the idea of using videos and short loops as source material and transforming them into new things. Something is not right in P3D with the video library, or the source video plays too fast, or I am not understanding something fundamental about Processing that keeps me from moving forward. I just want my sketches, while running with my video as input, to play at the normal speed of the source video itself while applying my function: changing pixels, using pixels as data to draw 3D shapes, and so on.
I just feel that it defaults to something and accelerates. The resulting video from Video Export has the correct fps, but the content itself runs super fast. I have tried everything, or almost everything, even with movie.speed(). It is very strange: with movie.speed(1.0) it is super fast, over in a second, even though, if I understand the reference page correctly, that is the 100% base speed. If I try movie.speed(0.1), then a video that is 8 secs long lasts 4 secs (50% of its duration but 150% of its speed?). I would have assumed that for it to last half of 8 secs, the value would be 0.50.

I am more interested in the second one, that the program runs slow, but that the resulting video has the exact same duration / speed / frameRate / fluidity that the original source video has… but I would appreciate if you could guide me to learn BOTH approaches, that way I can learn more, and have more control in my explorations / things I am trying to accomplish with the pixels.array.

I believe so. Is this a sure way to guarantee keeping the correct time? I know the video library has some quirks, at least version 2.0, the one I can run with my Processing 3 version.
A weird quirk is that in order to read the pixels from a video in P3D, you have to display the video first; otherwise it is black / blank. I always have to use
image(mov_TamboraMar, width, 0); to display it, even outside the frame, in order to read pixels from the video. It was very hard to find this info, especially as a beginner. I feel that I am missing an important piece of the puzzle to control speed + duration.

Did not want to make a long post, but feel I need to put a minimal reproducible example, as I feel that maybe something I am doing (or not doing) is causing this weird behavior. Tried with .mp4 and .mov, it is the same.

Please keep in mind that I have tried many, many alternatives, although most of my efforts resulted in the sketch / video ending in 1 sec (super fast), or in 4. When I started to print the frames + duration on screen, I noticed it only got to 12 frames. Some of what I have tried: mov.frameRate in setup and in draw; play + frameRate + pause the movie in setup and then play in draw, with if statements; using speed, etc. When I did not play it in setup() but set speed(0.1) in setup, then played it in draw and repeated speed(0.1) in draw(), the resulting video export had the 8-sec duration of the original source video, but the video did not run normally: just a few frames, very very slow, stretched over the 8 secs of the original.

I think / feel that I need to either:
compute how long my function takes to take a frame as input, do its thing to it, and return it to the screen and/or videoExport file; or compute all of that beforehand and only then display the frames / write them to the videoExport file. This is my intuition, but I am still learning and do not know how to do that.
In case it is of some help, an error that keeps popping up in red in the console is:
In case it is of some help, an error that keeps popping up in the console in red is here:

java.lang.NullPointerException
  at processing.opengl.Texture.copyBufferFromSource(Texture.java:827)
  at sun.reflect.GeneratedMethodAccessor5.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at processing.video.Movie$NewSampleListener.newSample(Unknown Source)
  at org.freedesktop.gstreamer.elements.AppSink$2.callback(AppSink.java:232)
  at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at com.sun.jna.CallbackReference$DefaultCallbackProxy.invokeCallback(CallbackReference.java:520)
  at com.sun.jna.CallbackReference$DefaultCallbackProxy.callback(CallbackReference.java:551)

Minimal reproducible example below:
It needs an input video that is 640 (width) × 360 (height) pixels and 8 secs long.
I will attach a couple of pics so you have an idea of the resulting visual of my piece.

//    MAIN SKETCH

import processing.video.*;
import com.hamoid.*;

final String sketchName = getClass().getName();
final String  movie_w640_h360_Filename = "Tambora_frenteAl_Mar_IMG_5107_Scale_1—2_w640_x_h360_1.mp4";
Movie mov_TamboraMar;
float movieFPS = 29.97;
float movieSpeed = 1.0;
PGraphics pg_scaledUP_TamboraMar_OUT;
int videoScale = 2;
int cols;
int rows;
float mod;
int counter;
VideoExport videoExport;

void setup() {
  size(1280, 720, P3D);
  colorMode(HSB, 360, 100, 100, 100);
  frameRate(30);
  noStroke();

  mov_TamboraMar = new Movie(this, movie_w640_h360_Filename);
  mov_TamboraMar.play();   
  mov_TamboraMar.frameRate(movieFPS);

  pg_scaledUP_TamboraMar_OUT = createGraphics(width, height, P3D);
  pg_scaledUP_TamboraMar_OUT.beginDraw();
  pg_scaledUP_TamboraMar_OUT.colorMode(HSB, 360, 100, 100, 100);
  pg_scaledUP_TamboraMar_OUT.sphereDetail(3);                       
  pg_scaledUP_TamboraMar_OUT.endDraw();

  videoExport = new VideoExport(this, sketchName + "_A_.mp4");
  videoExport.setFrameRate(movieFPS);  
  videoExport.startMovie();
}

void draw() {

  if (mov_TamboraMar.available()) {
    mov_TamboraMar.read();
  }
  image(mov_TamboraMar, width, 0);  // Bug in P3D, have to display movie in order to read its pixels

  if (has_movie_started(mov_TamboraMar) == true) {  
    push();
    movIN_pgOUT_pixArray_brightMirror_scale_1_to_2(mov_TamboraMar, pg_scaledUP_TamboraMar_OUT);
    pop();
  }

  videoExport.saveFrame();

  if (frameCount % 30 == 0) {
    println( "FrameCount =  " + frameCount + "    Time:  " + mov_TamboraMar.time() );
  }

  if (is_movie_finished(mov_TamboraMar) == true) {
    videoExport.endMovie();
    exit();
  }
}

boolean is_movie_finished(Movie m) {
  return m.duration() - m.time() < 0.05;
}

boolean has_movie_started(Movie m) {
  return m.duration() - m.time() > 0.05;
}


Main function: movIN_pgOUT_pixArray_brightMirror_scale_1_to_2():



void movIN_pgOUT_pixArray_brightMirror_scale_1_to_2(Movie movIN, PGraphics movOut) {
  movIN.loadPixels();
  movOut.beginDraw();
  movOut.fill(359, 62, 80, 10);  
  movOut.noStroke();
  movOut.rect(0, 0, width, height);
  cols = movOut.width / videoScale;
  rows = movOut.height / videoScale;
  movOut.ambientLight(21, 19, 93);
  movOut.lightSpecular(0, 0, 99);
  movOut.directionalLight(128, 128, 128, 0, 1, 0);  // from above

  for (int x = 0; x < cols; x++ ) {
    for (int y = 0; y < rows; y++ ) {
      // Pixel location and color
      int newX = x * videoScale;
      int newY = y * videoScale;
      int location = x+y*movIN.width;
      color c = movIN.pixels[location];
      float sz = (brightness(c)/100)*videoScale;            

      movOut.pushMatrix();
      movOut.translate(newX + videoScale, newY + videoScale, sz * 20.5); // ALT 2      
      movOut.specular(21, 19, 93);
      movOut.shininess(5.0);                    
      movOut.ambient(21, 19, 93);
      movOut.noStroke();
      movOut.sphere(sz);   
      movOut.popMatrix();
    }
  }
  movOut.endDraw();
  pushMatrix();
  image(movOut, 0, 0);  // Main PG being displayed with main (1st) comp
  popMatrix();
}

Thank you so much for your help.

Thank you very much GoToLoop,

I tried to implement it, and I was able to change the frameRate in your provided example, but when I tried to implement it in the code above (in Abe’s reply) it did not work, or I was unable to make it work. I saved it in my library to try to implement it better.

Hi hi :slight_smile:

It’s a long post and I don’t have much time to reply in detail right now, but looking at the code I think I know what may be happening.

  • You are loading and playing a video. The duration of that video will be whatever it says in the file. If it is 8 seconds long, then that’s how long it will play. You start playing the video.
  • Your program will attempt to run at 30 frames per second, but (here’s my assumption) fail, because you are drawing 230,400 spheres (640 × 360) if I’m not wrong. Let’s say it manages to run at 5 fps.
  • That means that by the end of the loaded video, after 8 seconds it has rendered 40 frames to the video file.
  • When you play that resulting video at 30 fps it lasts 1.33 seconds.

Does that match what you are seeing?

To be sure your final video has 30 * 8 = 240 frames, you need to save 240 frames to the video file, independently of how fast or slow your program can run.
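The frame math above can be written out directly (plain Java, using the numbers from this thread: an 8-second source, a 30 fps export, a sketch that only manages about 5 fps):

```java
// Why an exported video comes out too short: saveFrame() is called once
// per draw(), so the frame count depends on the sketch's real frame rate,
// not on the rate requested with frameRate().
public class ExportMath {
  // Frames actually written while the source clip plays for sourceSeconds
  static int framesWritten(double realFps, double sourceSeconds) {
    return (int) Math.round(realFps * sourceSeconds);
  }

  // Duration of the exported file when played back at exportFps
  static double exportedSeconds(int frames, double exportFps) {
    return frames / exportFps;
  }

  public static void main(String[] args) {
    int frames = framesWritten(5.0, 8.0);        // 40 frames written
    double secs = exportedSeconds(frames, 30.0); // ~1.33 s at 30 fps
    System.out.println(frames + " frames -> " + secs + " s");
  }
}
```

Conversely, an 8-second export at 30 fps needs exactly 240 saved frames, which is why extracting the input frames first makes the output duration predictable.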

Since you are processing an input video, I think the safest thing to do is to extract the 240 frames from the input video, use your program to load, transform and save each frame into the result video. That way you can be sure it will be exactly 8 seconds.

If you want to output what you see in real time I think the simplest approach would be to use a program that does screen recording. That video may be 30 fps (if that’s what you asked for) but it can also happen that many of the frames in that video are repeated, if Processing was unable to update the screen 30 times per second because the program was too heavy.

Something that could help with this situation is adding a new feature to the Video Export library to convert videos into frames in a folder. This wouldn’t be required if the Processing video player allowed requesting specific frames by number (give me frame 7, give me frame 8), but it doesn’t do that afaik.

Let me know if I guessed right or not :slight_smile:


Hola Abe,

I apologize for this; I just wanted to share my trial-and-error process.

Yes that is exactly the behavior that I am seeing.

I sort of understand. I already managed to extract all of the frames as PNGs with ffmpeg; the total is 240 frames. I know how to load them into the sketch, but not the part about feeding them into my function, with the amount of time the function needs to process each frame (turning the pixels into spheres), and saving each transformed frame into the resulting video. How does one find this time out? I guess with a boolean variable, set when the last sphere corresponding to the last pixel in the pixels array is created? A bit lost.

Thank you for this; even though it does not apply to this situation, I feel I understand this other approach better now.

Yes, I think you are right. The closest I have found is the example in the video library called ‘Frames’
by Andres Colubri. He states in the file: “It estimates the frame counts using the framerate of the movie file, so it might not be exact in some cases.”
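That kind of estimate boils down to mapping playback time to a frame index. A minimal sketch of the mapping (the floor rounding and names are my assumptions, not Colubri’s code):

```java
// Estimate a frame index from playback time and the file's nominal fps.
// Inexact for variable-frame-rate files, as the 'Frames' example warns.
public class FrameEstimate {
  static int frameForTime(double seconds, double fps) {
    return (int) Math.floor(seconds * fps);
  }

  public static void main(String[] args) {
    System.out.println(frameForTime(0.0, 29.97)); // 0: first frame
    System.out.println(frameForTime(1.0, 29.97)); // 29: just shy of frame 30
  }
}
```

The 29.97 value matches the NTSC frame rate used in the sketches earlier in this thread; with an exact 30 fps file, frameForTime(1.0, 30.0) would land on frame 30.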

Yes you did! :slight_smile:

Mmm… I don’t see how the time has an effect.

The following is untested. But I think very few changes are needed in your program.

//    MAIN SKETCH

import processing.video.*;
import com.hamoid.*;

final String sketchName = getClass().getName();
final String  movie_w640_h360_Filename = "Tambora_frenteAl_Mar_IMG_5107_Scale_1—2_w640_x_h360_1"; // was .mp4 before
PImage mov_TamboraMar; //CHANGED
float movieFPS = 29.97;
float movieSpeed = 1.0;
PGraphics pg_scaledUP_TamboraMar_OUT;
int videoScale = 2;
int cols;
int rows;
float mod;
int counter;
VideoExport videoExport;

void setup() {
  size(1280, 720, P3D);
  colorMode(HSB, 360, 100, 100, 100);
  frameRate(30);
  noStroke();

  pg_scaledUP_TamboraMar_OUT = createGraphics(width, height, P3D);
  pg_scaledUP_TamboraMar_OUT.beginDraw();
  pg_scaledUP_TamboraMar_OUT.colorMode(HSB, 360, 100, 100, 100);
  pg_scaledUP_TamboraMar_OUT.sphereDetail(3);                       
  pg_scaledUP_TamboraMar_OUT.endDraw();

  videoExport = new VideoExport(this, sketchName + "_A_.mp4");
  videoExport.setFrameRate(movieFPS);  
  videoExport.startMovie();
}

void draw() {
  // assumes names like
  // "Tambora_frenteAl_Mar_IMG_5107_Scale_1—2_w640_x_h360_1_####.png"
  mov_TamboraMar = loadImage(movie_w640_h360_Filename + "_" + nf(frameCount, 4) + ".png");

  image(mov_TamboraMar, width, 0);  // Bug in P3D, have to display movie in order to read its pixels

  push();
  movIN_pgOUT_pixArray_brightMirror_scale_1_to_2(mov_TamboraMar, pg_scaledUP_TamboraMar_OUT);
  pop();

  videoExport.saveFrame();

  if (frameCount == 240) {  // 240 frames saved = 8 seconds at 30 fps
    videoExport.endMovie();
    exit();
  }
}

and

void movIN_pgOUT_pixArray_brightMirror_scale_1_to_2(PImage movIN, PGraphics movOut) {
  movIN.loadPixels();
  movOut.beginDraw();
  movOut.fill(359, 62, 80, 10);  
  movOut.noStroke();
  movOut.rect(0, 0, width, height);
  cols = movOut.width / videoScale;
  rows = movOut.height / videoScale;
  movOut.ambientLight(21, 19, 93);
  movOut.lightSpecular(0, 0, 99);
  movOut.directionalLight(128, 128, 128, 0, 1, 0);  // from above

  for (int x = 0; x < cols; x++ ) {
    for (int y = 0; y < rows; y++ ) {
      // Pixel location and color
      int newX = x * videoScale;
      int newY = y * videoScale;
      int location = x+y*movIN.width;
      color c = movIN.pixels[location];
      float sz = (brightness(c)/100)*videoScale;            

      movOut.pushMatrix();
      movOut.translate(newX + videoScale, newY + videoScale, sz * 20.5); // ALT 2      
      movOut.specular(21, 19, 93);
      movOut.shininess(5.0);                    
      movOut.ambient(21, 19, 93);
      movOut.noStroke();
      movOut.sphere(sz);   
      movOut.popMatrix();
    }
  }
  movOut.endDraw();
  pushMatrix();
  image(movOut, 0, 0);  // Main PG being displayed with main (1st) comp
  popMatrix();
}
}