GLSL Shaders using Processing Pi on a Pi 3 B+

Hey @jeremydouglass, thanks for the additional references. I went through them but was still getting errors, so no luck yet. I tried a few different things, including rewriting the shader so it doesn’t rely on the ShaderToy-style naming, resizing the videos to be smaller, and raising the Pi’s GPU memory split to 256 MB. The sketch still runs fine on OSX, but on a Raspberry Pi 3B+ all I get is an empty white screen.
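
For reference, the GPU memory change was made in /boot/config.txt (raspi-config’s Memory Split option sets the same value):

gpu_mem=256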

The only output in the console is:

Final caps: video/x-raw(memory:GLMemory), format=(string)RGBA, width=(int)640, height=(int)360, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, texture-target=(string)2D
Final caps: video/x-raw(memory:GLMemory), format=(string)RGBA, width=(int)640, height=(int)360, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)25/1, texture-target=(string)2D

Perhaps there’s no support for setting a Processing PGraphics as a sampler2D texture in the shader on Processing for Pi? It also crossed my mind that it could be something about GLVideo images when they’re set as a texture. Or maybe I’m mixing up something between how frag and color shaders work; at the moment I think I’m only using a Processing color shader.
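
To narrow that down, one thing I plan to try is a minimal test that takes video out of the picture entirely: draw flat colors into the two PGraphics buffers and sample them with the same fadeshader.glsl. If this also comes up white on the Pi, the problem would be the PGraphics → sampler2D binding rather than GLVideo. (Only a sketch of the idea, not yet verified on the Pi; the file name is just a placeholder.)

pgSamplerTest.pde

PShader mixShader;
PGraphics pg, pg2;

void setup() {
  size(640, 360, P2D);

  pg  = createGraphics(width, height, P2D);
  pg2 = createGraphics(width, height, P2D);

  mixShader = loadShader("fadeshader.glsl");
  mixShader.set("iResolution", float(width), float(height));
  mixShader.set("iChannel0", pg);
  mixShader.set("iChannel1", pg2);
}

void draw() {
  // flat red and blue stand in for the two movie frames
  pg.beginDraw();
  pg.background(255, 0, 0);
  pg.endDraw();

  pg2.beginDraw();
  pg2.background(0, 0, 255);
  pg2.endDraw();

  mixShader.set("iTime", millis()/1000.);

  shader(mixShader);
  rect(0, 0, width, height);
}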

shaderDisolveGLSL.pde

//import processing.video.*;
import gohai.glvideo.*;

PShader mixShader;  

PGraphics pg;
PGraphics pg2;

//Movie movie;
//Movie movie2;

GLMovie movie;
GLMovie movie2;

void setup() {
  size(640, 360, P2D);
  noSmooth();

  //movie = new Movie(this, "_sm/LabspaceDawnv1blur2.mp4");
  movie = new GLMovie(this, "_sm/LabspaceDawnv1blur2.mp4");
  movie.loop();

  //movie2 = new Movie(this, "_sm/LabspaceFireblur2.mp4");
  movie2 = new GLMovie(this, "_sm/LabspaceFireblur2.mp4");
  movie2.loop();

  pg = createGraphics(width, height, P2D);
  pg2 = createGraphics(width, height, P2D);

  mixShader = loadShader("fadeshader.glsl");
  mixShader.set("iResolution", float(width), float(height));

  // bind the two offscreen buffers to the shader's sampler2D uniforms
  mixShader.set("iChannel0", pg);
  mixShader.set("iChannel1", pg2);

}  

//void movieEvent(Movie m) {
void movieEvent(GLMovie m) {
  m.read();   // pull the newly decoded frame into the movie's texture
  redraw();   // only has an effect if noLoop() is in use
}

void draw() {

  // copy the current movie frames into the two offscreen buffers
  pg.beginDraw();
  pg.image(movie, 0, 0, width, height);
  pg.endDraw();

  pg2.beginDraw();
  pg2.image(movie2, 0, 0, width, height);
  pg2.endDraw();

  // iTime has to be refreshed every frame, otherwise the fade never advances
  mixShader.set("iTime", millis()/1000.);

  shader(mixShader);
  rect(0, 0, width, height);
}
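
One thing I’m not sure about is whether GLVideo actually fires movieEvent() the way processing.video does; the library’s bundled examples poll in draw() instead. If movieEvent() never runs, read() never happens and the movie textures never update, so I may also try replacing the event handler with a check at the top of draw() (an assumption on my part, not verified):

  // at the top of draw(), instead of relying on movieEvent()
  if (movie.available()) {
    movie.read();
  }
  if (movie2.available()) {
    movie2.read();
  }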

fadeshader.glsl

#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif

// Type of shader expected by Processing
#define PROCESSING_COLOR_SHADER

uniform float iTime;
uniform sampler2D iChannel0;
uniform sampler2D iChannel1;
uniform vec2 iResolution;

void main() {

    vec2 uv = gl_FragCoord.xy / iResolution.xy;

    // debug fallback: a plain uv gradient, only visible if the two
    // texture2D lookups below are commented out
    vec4 color0 = vec4(uv.x, uv.y, 0.0, 1.0);
    vec4 color1 = vec4(uv.x, uv.y, 0.0, 1.0);

    // sample the two video buffers
    color0 = texture2D(iChannel0, uv);
    color1 = texture2D(iChannel1, uv);

    // looping 10 second cross-fade
    float duration = 10.0;
    float t = mod(iTime, duration) / duration;

    gl_FragColor = mix(color0, color1, t);
}
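
And on the frag-versus-color-shader question above: if a texture shader turns out to be the way to go, my understanding is that the same fade could be written against Processing’s texture-shader interface, where the first sampler comes from whatever image() call the shader is applied to (so draw() would use shader(mixShader); image(pg, 0, 0); instead of rect()) and only iChannel1 is set manually from the sketch. An untested sketch of that variant:

fadeshader_tex.glsl (hypothetical variant)

#ifdef GL_ES
precision mediump float;
precision mediump int;
#endif

#define PROCESSING_TEXTURE_SHADER

uniform sampler2D texture;   // bound by Processing to the image being drawn
uniform sampler2D iChannel1; // set manually via mixShader.set()
uniform float iTime;

varying vec4 vertColor;
varying vec4 vertTexCoord;

void main() {

    vec2 uv = vertTexCoord.st;

    vec4 color0 = texture2D(texture, uv);
    vec4 color1 = texture2D(iChannel1, uv);

    float duration = 10.0;
    float t = mod(iTime, duration) / duration;

    gl_FragColor = mix(color0, color1, t) * vertColor;
}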

I’ve uploaded a new version of the sample sketch with the smaller videos here, in case anyone is curious: https://www.dropbox.com/sh/fu2plxmqhf7shtp/AADxqmW9zf73EsdzworCb5ECa?dl=0

Any recommendations or areas to dig into further would be greatly appreciated!
