In my case I solved it by uninstalling and reinstalling the Processing Foundation video library.
I got the error after moving the processing-3.5.4 folder to another place. Windows 10.
Thank you for coming up with a workaround to this; it’s been driving me nuts.
autovideosrc defaults to YUYV, which is slow at higher resolutions. To get MJPEG, I used this pipeline:
cam = new Capture(this, 1920, 1080, "pipeline: ksvideosrc device-index=0 ! image/jpeg, width=1920, height=1080, framerate=30/1 ! jpegdec ! videoconvert");
I had to specify my resolution twice so that Processing and GStreamer were on the same page, but it might not be necessary in all cases.
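For context, here is a minimal sketch built around that kind of pipeline. The device-index, resolution, and framerate are assumptions and will likely need adjusting for your camera:

import processing.video.*;

Capture cam;

void setup() {
  size(1920, 1080);
  // MJPEG pipeline through ksvideosrc (Windows only). device-index, the
  // resolution, and the framerate are assumptions; match them to your camera.
  cam = new Capture(this, 1920, 1080,
    "pipeline: ksvideosrc device-index=0 ! image/jpeg, width=1920, height=1080, framerate=30/1 ! jpegdec ! videoconvert");
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  image(cam, 0, 0, width, height);
}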
Thank you! This solved my issue.
It works! Thanks! Now it opens every time!
Thanks @neilcsmith worked like a charm
@neilcsmith Thanks for the Capture solution; it works like everyone else says, but I also get the same error for the basic video example "Frames".
I don’t understand how I can apply the pipeline fix to the normal Movie video source.
I keep getting the error:
"WARNING: no real random source present!
Processing video library using GStreamer 1.16.2"
mov = new Movie(this, "launch2.mp4");
/**
 * Frames
 * by Andres Colubri.
 *
 * Moves through the video one frame at a time using the
 * arrow keys. It estimates the frame counts using the framerate
 * of the movie file, so it might not be exact in some cases.
 */

import processing.video.*;

Movie mov;
int newFrame = 0;

void setup() {
  size(560, 406);
  background(0);
  // Load and set the video to play. Setting the video
  // in play mode is needed so at least one frame is read
  // and we can get duration, size and other information from
  // the video stream.
  mov = new Movie(this, "launch2.mp4");

  // Pausing the video at the first frame.
  mov.play();
  mov.jump(0);
  mov.pause();
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  background(0);
  image(mov, 0, 0, width, height);
  fill(0);
  text(getFrame() + " / " + (getLength() - 1), 10, 30);
}

void keyPressed() {
  if (key == CODED) {
    if (keyCode == LEFT) {
      if (0 < newFrame) newFrame--;
    } else if (keyCode == RIGHT) {
      if (newFrame < getLength() - 1) newFrame++;
    }
  }
  setFrame(newFrame);
}

int getFrame() {
  return ceil(mov.time() * 30) - 1;
}

void setFrame(int n) {
  mov.play();

  // The duration of a single frame:
  float frameDuration = 1.0 / mov.frameRate;

  // We move to the middle of the frame by adding 0.5:
  float where = (n + 0.5) * frameDuration;

  // Taking into account border effects:
  float diff = mov.duration() - where;
  if (diff < 0) {
    where += diff - 0.25 * frameDuration;
  }

  mov.jump(where);
  mov.pause();
}

int getLength() {
  return int(mov.duration() * mov.frameRate);
}
Good morning everyone. I’ve carefully read your replies and tried them on an example sketch from Processing (I am using version 4.0b3).
My webcam is connected through EOS Webcam Utility.
When using this code:
/**
* Frame Differencing
* by Golan Levin.
*
* Quantify the amount of movement in the video frame using frame-differencing.
*/
import processing.video.*;
int numPixels;
int[] previousFrame;
Capture video;
void setup() {
size(640, 480);
printArray(Capture.list());
// This is the default video input, see the GettingStartedCapture
// example if it creates an error
//video = new Capture(this, Capture.list()[0]);
video = new Capture(this, "pipeline:autovideosrc");
// Start capturing the images from the camera
video.start();
numPixels = video.width * video.height;
// Create an array to store the previously captured frame
previousFrame = new int[numPixels];
loadPixels();
}
void draw() {
if (video.available()) {
// When using video to manipulate the screen, use video.available() and
// video.read() inside the draw() method so that it's safe to draw to the screen
video.read(); // Read the new frame from the camera
video.loadPixels(); // Make its pixels[] array available
int movementSum = 0; // Amount of movement in the frame
for (int i = 0; i < numPixels; i++) { // For each pixel in the video frame...
color currColor = video.pixels[i];
color prevColor = previousFrame[i];
// Extract the red, green, and blue components from current pixel
int currR = (currColor >> 16) & 0xFF; // Like red(), but faster
int currG = (currColor >> 8) & 0xFF;
int currB = currColor & 0xFF;
// Extract red, green, and blue components from previous pixel
int prevR = (prevColor >> 16) & 0xFF;
int prevG = (prevColor >> 8) & 0xFF;
int prevB = prevColor & 0xFF;
// Compute the difference of the red, green, and blue values
int diffR = abs(currR - prevR);
int diffG = abs(currG - prevG);
int diffB = abs(currB - prevB);
// Add these differences to the running tally
movementSum += diffR + diffG + diffB;
// Render the difference image to the screen
pixels[i] = color(diffR, diffG, diffB);
// The following line is much faster, but more confusing to read
//pixels[i] = 0xff000000 | (diffR << 16) | (diffG << 8) | diffB;
// Save the current color into the 'previous' buffer
previousFrame[i] = currColor;
}
// To prevent flicker from frames that are all black (no movement),
// only update the screen if the image has changed.
if (movementSum > 0) {
updatePixels();
println(movementSum); // Print the total amount of movement to the console
}
}
}
I am getting the following console output:
Processing video library using GStreamer 1.16.2
BaseSrc: [autovideosrc0-actual-src-ksvide] : No supported formats found
BaseSrc: [autovideosrc0-actual-src-ksvide] : not negotiated
BaseSrc: [autovideosrc0-actual-src-ksvide] : Internal data stream error.
WARNING: no real random source present!
Had the same problem for months, and adding just this solved it.
Oops, this link is no longer active.
I appreciate this
See Capture / Libraries / Processing.org:
"Use Capture.list() to show the names of any attached devices."
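As a quick illustration, here is a minimal sketch along those lines (assuming at least one camera is attached and that the first entry in the list is the one you want):

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  // Print the names of any attached capture devices.
  String[] cameras = Capture.list();
  printArray(cameras);
  // Open the first listed device (an assumption; pick the entry you need).
  if (cameras.length > 0) {
    cam = new Capture(this, cameras[0]);
    cam.start();
  }
}

void draw() {
  if (cam != null && cam.available()) {
    cam.read();
    image(cam, 0, 0, width, height);
  }
}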
The “pipeline:autovideosrc” solution works for me. However, when I try to export the sketch (Windows x64), the compiled version does not work even when I include Java. Any idea?
I have the same but different problem with regards to webcams. I have an integrated webcam and a USB camera. I downloaded Processing’s Capture sketch and it only recognizes the integrated webcam. I have tried 3 different cameras, but got the same results. I used the above-mentioned “pipeline:autovideosrc” piece of code, but saw no changes. I tried entering the name of my USB camera with cam = new Capture(this, "LifeCam HD-3000"); but now it says camera not found. However, this USB camera shows up on my computer along with the integrated camera in the computer’s camera app. Has anyone found a solution to this? I am using Processing version 4.
Thank You
I would try disabling the integrated webcam in Device Manager, then see if autovideosrc finds the USB camera.
You could also try "pipeline: ksvideosrc device-index=1" (or device-index=0 if the integrated webcam is disabled) instead of "pipeline: autovideosrc".
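A minimal sketch using that approach might look like the following; device-index=1 is an assumption and depends on how Windows enumerates your cameras:

import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  // Select a specific Windows capture device by index via GStreamer's ksvideosrc.
  // device-index=1 is an assumption; use 0 if only one camera is enabled.
  cam = new Capture(this, "pipeline: ksvideosrc device-index=1");
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  image(cam, 0, 0, width, height);
}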
Hi
The "pipeline:ksvideosrc device-index = 1; was the fix.
Thank You for solving this problem for me. A job well done.
Thanks, this solved all my troubles. I used to have to restart the sketch about 5 times to get it to run; now it works every time.
I have some students trying to get the video capture to work, and when they set their Capture object like this:
cam = new Capture(this, "pipeline:autovideosrc");
they get this error in the console, and it does not run:
GstException: no element “autovideosrc”
GstException: no element “autovideosrc”
GstException: no element “autovideosrc”
I also tried the ‘ksvideosrc’ option, and that said that ‘ksvideosrc’ does not exist. Otherwise, the library loads, and it says it is using GStreamer 1.16.2
They are running Processing 4 beta 7 and beta 8.
That’s interesting. It sounds like you might have a problem with your video library installation.
First, check to make sure the camera appears and is enabled in Device Manager and works with other apps, but it sounds like that’s not your problem.
Try running this sketch:
import processing.video.*;
Capture cam;
void setup() {
size(640, 480);
cam = new Capture(this, "pipeline:videotestsrc pattern=snow");
cam.start();
}
void draw() {
if (cam.available() == true) {
cam.read();
}
image(cam, 0, 0, width, height);
}
You should see television static.
Reinstall the video library and delete any other instances of the video library or similar libraries that might be interfering. Better yet, delete all of your libraries, reinstall Processing, point your sketchbook to a clean directory, install the video library, and see if anything changes.
Does it only happen on school computers? It seems like it could easily be an issue with security software or IT administration, but that’s beyond my expertise. Try running Processing as an administrator and see if that solves anything. You could also check your environment variables in case it’s trying to run a different instance of GStreamer (I’m not sure if this would actually cause a problem, but worth a shot imo).
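If it helps, a quick way to see what a sketch picks up from the environment is to print the GStreamer-related variables. The variable names below are the usual GStreamer ones and are just a starting point, not an exhaustive list:

// Print environment variables that could point the sketch at a different
// GStreamer installation. The variable names are the common GStreamer ones;
// not all of them will be set on every machine.
void setup() {
  String[] vars = { "GST_PLUGIN_PATH", "GST_PLUGIN_SYSTEM_PATH", "PATH" };
  for (String v : vars) {
    println(v + " = " + System.getenv(v));
  }
}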
Thanks for the suggestions, lfredericks. The students each have their own laptops, so it’s all varieties of OS types / versions / hardware… 60 different computers to support… 0_O
One question: I’m not as familiar with Windows; how can you tell if you are running Processing as an administrator?