Processing.video library gives v4l2src error

I am interested in using the processing-video library for my project, but I think I've hit a brick wall. I tried running the CustomPipeline example, since I'm trying to display a custom stream from an IP camera.

This is the error message I’m getting:

“IllegalArgumentException: No such Gstreamer factory: v4l2src”

This is the example code:

/**
 * CustomPipeline 
 * by Andres Colubri. 
 * 
 * Create a Capture object with a pipeline description to 
 * get video from non-standard sources.
 */

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  
  // Start the pipeline description with the "pipeline:" prefix, 
 // the rest could be any regular GStreamer pipeline as passed to gst-launch:
  // https://gstreamer.freedesktop.org/documentation/tools/gst-launch.html?gi-language=c#pipeline-description 
  cam = new Capture(this, 640, 480, "pipeline:videotestsrc");
  cam.start();  
}

void draw() {
  if (cam.available() == true) {
    cam.read();
  }
  image(cam, 0, 0, width, height);
}

I am using a Raspberry Pi 4 and have installed gstreamer1.0. I have even displayed a stream from the command line and it worked, so I don't think it's an installation issue. I have found a few threads on Google that explore this problem, but most of them are from 4-5 years ago and use gstreamer0.10 instead of gstreamer1.0, along with an old processing.video library. The current version of the library (correct me if I'm wrong) should support gstreamer1.0 and is supposed to work pretty smoothly.

Anyone have any ideas on what the issue might be? Thank you!!


I have a RaspberryPi4 but don't have the Processing IDE installed; instead I use PiCrate (a Ruby implementation of Processing), and this is what I get (no video device attached).


Are you using https://github.com/processing/processing-video/releases/tag/r6-v2.0-beta4? If not, you should be. Console output: Processing video library using GStreamer 1.14.4.


I replaced my version of the library with the one you mentioned, and it worked! Thank you! Now I have the problem of actually getting my pipeline to work. This is what I'm sending gst-launch through the command line: udpsrc port=5001 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink

I’m adding this line right after the "pipeline:" prefix but I’m getting a similar error:

IllegalArgumentException: No such Gstreamer factory: udpsrc port=5001

Maybe I’m not getting the syntax right?

Not a chance that the code in the video library for custom pipelines can possibly work - it's completely broken - https://github.com/processing/processing-video/blob/r6-v2.0-beta4/src/processing/video/Capture.java#L425

Your only option would be to subclass Capture and rewrite that method. :confounded:
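To make concrete why the library's manual parsing breaks, here is a minimal plain-Java illustration (not the library's actual code, just a sketch of its approach): it splits the description only on "!" and passes each token to ElementFactory.make(), so a token that carries a property, or a caps string, is never a valid factory name.

```java
public class SplitDemo {
    // Mimics the library's parsing: split on "!" and trim each token
    static String[] tokens(String pstr) {
        String[] parts = pstr.split("!");
        for (int i = 0; i < parts.length; i++) parts[i] = parts[i].trim();
        return parts;
    }

    public static void main(String[] args) {
        String pstr = "udpsrc port=5001 ! application/x-rtp,encoding-name=JPEG,payload=26"
                + " ! rtpjpegdepay ! jpegdec ! autovideosink";
        for (String t : tokens(pstr)) {
            // The first token is "udpsrc port=5001" (factory name plus property)
            // and the second is a caps string -- neither is a plain factory
            // name, hence "No such Gstreamer factory: udpsrc port=5001".
            System.out.println("[" + t + "]");
        }
    }
}
```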

@NaniAromix It is unfortunate :confounded: that you are in a position where:

  1. The RaspberryPi is no longer officially supported; the last support finished with the Stretch distro on the RaspberryPi3B+.
  2. It seems from @neilcsmith (and he should know) that CustomPipeline is still a work in progress by @codeanticode see here.
  3. I know @kll has had success with compiling glvideo afresh on Buster, albeit on a RaspberryPi3B+, but that doesn't seem too attractive to me.

Ha, yes, I’d forgotten that issue conversation! :+1: Was just about to file another one related to this thread. That’s exactly the problem - manual parsing of the pipeline dealing only with elements, not properties, caps, bins, automatic dynamic connections, etc. etc. Linked across from the issue to here.


Seems like there’s no easy solution to this. I’m thinking of rewriting the “initCustomPipeline()” method you mentioned, but I’m kind of lost at the moment. What is the main issue with the current implementation?

Might openFrameworks work better in this case? No idea how complex your project is…

Sorry for the late reply!

It should be using Gst.parseLaunch() to build the whole pipeline from a String using the underlying GStreamer support for this. If you give certain elements a name property, you can then extract them from the pipeline by name for the elements that the library needs access to (eg. the appsink).


Hey @neilcsmith

I’m really new to using gstreamer and processing so I’m not sure exactly how to go about getting my camera to work using a custom pipeline. Can you provide an example using parseLaunch to build the pipeline and have it display using Capture?

I’ve been plugging away at my camera problem for a few weeks now and I have it working using the standard capture but the colors are totally wrong. It looks like blue and red are swapped so I was hoping using a pipeline to specify the format would be the solution.
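(For what it's worth, swapped red and blue is the classic symptom of a pixel-format mismatch: BGRx data being read as if it were RGBx. A minimal sketch of the effect, assuming Processing's 0xAARRGGBB int pixel packing; this is not the library's code, only an illustration:)

```java
public class ChannelSwapDemo {
    // Pack an opaque RGB colour the way Processing stores pixels (0xAARRGGBB)
    static int pack(int r, int g, int b) {
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    // Simulate reading the same bytes with the red and blue positions
    // exchanged -- what happens when BGRx data is interpreted as RGBx
    static int swapRB(int argb) {
        int r = (argb >> 16) & 0xFF;
        int b = argb & 0xFF;
        return (argb & 0xFF00FF00) | (b << 16) | r;
    }

    public static void main(String[] args) {
        int red = pack(255, 0, 0);
        // Pure red comes out as pure blue
        System.out.println(Integer.toHexString(swapRB(red))); // prints ff0000ff
    }
}
```

So forcing the right format in the sink caps (see the RGBx/BGRx handling later in this thread) is indeed the place to fix it.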

Thank you! I’ve been doing some research on the subject and starting to see some interesting things. To my understanding, Gst.parseLaunch(String) returns the Element corresponding to that String name, but doesn’t seem to parse the whole pipeline string correctly?

In the example provided above there is only one element in the pipeline: videotestsrc, so there is no issue. On the other hand, if I use this pipeline string: udpsrc port=5001 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink it does not parse udpsrc and port=5001 individually, for example. This leads to this error:

IllegalArgumentException: No such Gstreamer factory: udpsrc port=5001

Which is the original issue we had. So, I think I can boil the problem down to parsing Elements (udpsrc) and Properties (port=5001) separately and then handling both. How would I proceed to do that? I have been looking at this example by @codeanticode. It looks like he’s trying just that, but I can’t manage to apply his implementation to my string.

Of course, if I’m wrong on any of this let me know; I wouldn’t want to be climbing down the wrong rabbit hole.
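(As a rough string-level sketch of the element-vs-property split described above -- a hypothetical helper only; as the replies point out, Gst.parseLaunch handles all of this, plus caps, bins and dynamic linking, internally:)

```java
public class SegmentParseDemo {
    // First whitespace-separated token of a segment is the factory name
    static String factory(String segment) {
        return segment.trim().split("\\s+")[0];
    }

    // Remaining tokens are name=value property pairs
    static String[] properties(String segment) {
        String[] tokens = segment.trim().split("\\s+");
        String[] props = new String[tokens.length - 1];
        System.arraycopy(tokens, 1, props, 0, props.length);
        return props;
    }

    public static void main(String[] args) {
        String segment = "udpsrc port=5001";
        System.out.println(factory(segment));   // udpsrc
        for (String p : properties(segment)) {
            System.out.println(p);              // port=5001
        }
    }
}
```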

The answer is to use Gst.parseLaunch. Can you share the full code you’re trying? That error message should be impossible from that call. Are you sure the updated code is being picked up?

I have subclassed Capture in order to override the initCustomPipeline() method. I have also added private methods that were inaccessible:

import processing.video.*; 
import org.freedesktop.gstreamer.*;
import org.freedesktop.gstreamer.Buffer;
import org.freedesktop.gstreamer.device.*;
import org.freedesktop.gstreamer.elements.*;
import org.freedesktop.gstreamer.event.SeekFlags;
import org.freedesktop.gstreamer.event.SeekType;

class subCapture extends Capture {

  public subCapture(PApplet parent, int width, int height, String device) {
    super(parent, width, height, device);
  }


  private void makeBusConnections(Bus bus) {
    bus.connect(new Bus.ERROR() {
      public void errorMessage(GstObject arg0, int arg1, String arg2) {
        System.err.println(arg0 + " : " + arg2);
      }
    }
    );
    bus.connect(new Bus.EOS() {
      public void endOfStream(GstObject arg0) {
        try {
          stop();
        } 
        catch (Exception ex) {
          ex.printStackTrace();
        }
      }
    }
    );
  }



  @Override
  protected void initCustomPipeline(String pstr) {
    String[] parts = pstr.split("!");
    int n = parts.length;

    Element[] elements = new Element[n+4];

    for (int i = 0; i < n; i++) {
      String el = parts[i].trim();
      elements[i] = ElementFactory.make(el, null);
    }

    pipeline = new Pipeline();

    String frameRateString;
    if (frameRate != 0.0) {
      frameRateString = ", framerate=" + fpsToFramerate(frameRate);
    } else {
      frameRateString = "";
    }

    Element videoscale = ElementFactory.make("videoscale", null);
    Element videoconvert = ElementFactory.make("videoconvert", null);
    Element capsfilter = ElementFactory.make("capsfilter", null);

    initSink();

    capsfilter.set("caps", Caps.fromString("video/x-raw, width=" + width + ", height=" + height + frameRateString));

    elements[n + 0] = videoscale;
    elements[n + 1] = videoconvert;
    elements[n + 2] = capsfilter;
    elements[n + 3] = rgbSink;

    pipeline.addMany(elements);
    Pipeline.linkMany(elements);

    makeBusConnections(pipeline.getBus());
  }
  
}

My main sketch looks like this:

import processing.video.*;

Capture cam;

void setup() {
  //size(640, 480);
  size(1280, 1024); // Resolution of the draw
  
  // Start the pipeline description with the "pipeline:" prefix, 
  // the rest could be any regular GStreamer pipeline as passed to gst-launch:
  // https://gstreamer.freedesktop.org/documentation/tools/gst-launch.html?gi-language=c#pipeline-description 
  cam = new subCapture(this, width, height, "pipeline:udpsrc port=5001 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink");
  //cam = new subCapture(this,1280,1024,"pipeline:videotestsrc");  // Must capture at the same resolution as draw, otherwise it's very slow.
  cam.start();  
}

void draw() {
  if (cam.available() == true) {
    cam.read();
  }
  image(cam, 0, 0, width, height);
  //println(frameRate);
}

The error message comes from the ElementFactory.make() call inside the loop. It doesn’t recognize udpsrc port=5001 as an Element.

The idea is to replace all of that code using ElementFactory with a call to Gst.parseLaunch. I’ll have a look at writing a rough outline in the next few days when I’m back at my computer. Or see if you can make sense of how the PraxisLIVE capture component does this - https://github.com/praxis-live/praxiscore/blob/master/praxiscore-video-gstreamer/src/main/java/org/praxislive/video/gstreamer/components/GStreamerVideoCapture.java

This helps a lot thank you! I’ll have a go at it and see if I can figure it out.

Hello, so I’ve been going at it and trying to figure out how praxis-live does it. I’ve finally managed to parse the pipeline with Gst.parseLaunch(), which now returns a Pipeline instead of an Element. This is the initCustomPipeline() I have modified:

  @Override
  protected void initCustomPipeline(String pstr) {

    pipeline = (Pipeline) Gst.parseLaunch(pstr);

    initSink();

    makeBusConnections(pipeline.getBus());
  }

Now there is the issue of the pipeline sink. What I have tried is to use the initSink() method to initialize the pipeline sink. This does nothing: the screen turns black but gives no error message. The praxis-live implementation includes ...appsink name=sink" at the end of the pipeline description and then retrieves the sink by name to handle it, as seen here. Do I need to write a similar implementation for my case?

I think what you want is here - written in a text editor and not tested, but should be about right. You need to remove the call to initSink() and do some of it in here instead. And you need to make sure to remove autovideosink from the pipeline you’re passing in - it should end with the element before the sink (jpegdec above), and the additional elements will then be added by the PIPELINE_END string before calling parseLaunch.

protected void initCustomPipeline(String pstr) {

    String PIPELINE_END
            = " ! videorate ! videoscale ! videoconvert ! appsink name=sink";

    pipeline = (Pipeline) Gst.parseLaunch(pstr + PIPELINE_END);

    String caps = ", width=" + width + ", height=" + height;
    if (frameRate != 0.0) {
      caps += ", framerate=" + fpsToFramerate(frameRate);
    }
    
    rgbSink = (AppSink) pipeline.getElementByName("sink");
    rgbSink.set("emit-signals", true);
    newSampleListener = new NewSampleListener();
    newPrerollListener = new NewPrerollListener();        
    rgbSink.connect(newSampleListener);
    rgbSink.connect(newPrerollListener);

    useBufferSink = Video.useGLBufferSink && parent.g.isGL();
    if (ByteOrder.nativeOrder() == ByteOrder.LITTLE_ENDIAN) {
      if (useBufferSink) rgbSink.setCaps(Caps.fromString("video/x-raw, format=RGBx" + caps));
      else rgbSink.setCaps(Caps.fromString("video/x-raw, format=BGRx" + caps));
    } else {
      rgbSink.setCaps(Caps.fromString("video/x-raw, format=xRGB" + caps));
    }

    makeBusConnections(pipeline.getBus());
}
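(As a small string-level companion to the sketch above: stripping a trailing autovideosink could also be automated before calling parseLaunch. This is a hypothetical helper, not part of the library, assuming the same PIPELINE_END:)

```java
public class PipelineRewrite {
    static final String PIPELINE_END =
            " ! videorate ! videoscale ! videoconvert ! appsink name=sink";

    // Rough helper: drop a trailing autovideosink if present, then append
    // the elements the sketch's sink needs, before Gst.parseLaunch()
    static String rewrite(String userPipeline) {
        String p = userPipeline.trim();
        int bang = p.lastIndexOf('!');
        if (bang > 0 && p.endsWith("autovideosink")) {
            p = p.substring(0, bang).trim();
        }
        return p + PIPELINE_END;
    }

    public static void main(String[] args) {
        System.out.println(rewrite(
                "udpsrc port=5001 ! application/x-rtp,encoding-name=JPEG,payload=26"
                + " ! rtpjpegdepay ! jpegdec ! autovideosink"));
    }
}
```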

Great, thanks! Although at the moment there are issues with the visibility of some classes and methods. Because I have subclassed Capture and am overriding this method, the classes NewSampleListener and NewPrerollListener, as well as their constructors, are not visible because they are private. Video.useGLBufferSink is also inaccessible.

One solution would be to modify initCustomPipeline() directly in Capture instead, but I think it would be best not to touch any source code.

I would modify the sources then, and when you have it working, make a GitHub pull request to fix the library.