Realtime image processing using thread() or a class extending Thread

It’s amazing, the performance is so great, thank you so much. Does the shader use multiple threads within the .glsl file? I guess not?

Thank you brother, I learned many things from you, especially about raw pixels.


Hi brother, could I use shaders to improve FPS and computation in face recognition, also combined with the OpenCV library? Thank you

I’m not actually sure. I don’t have much experience with computer vision, I just know a bit about shaders. If you can solve a problem by running code on individual pixels, that’s what shaders are for. It’s up to you to figure out how to apply shaders in future projects.
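To show what “running code on individual pixels” looks like, here is a minimal sketch of a Processing-style filter shader that converts each pixel to grayscale. It assumes Processing’s standard filter-shader uniforms (texture and vertTexCoord); the GPU runs main() once per pixel, in parallel:

    #ifdef GL_ES
    precision mediump float;
    #endif

    uniform sampler2D texture;   // the image being filtered
    varying vec4 vertTexCoord;   // this pixel's texture coordinate

    void main() {
      // read this pixel's color (channels are 0.0 - 1.0 floats)
      vec4 c = texture2D(texture, vertTexCoord.st);
      // standard luminance weights for grayscale
      float g = dot(c.rgb, vec3(0.299, 0.587, 0.114));
      gl_FragColor = vec4(vec3(g), c.a);
    }

You would load it with loadShader() and apply it with filter(), just like the tracking shaders discussed below in this thread.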


Hi brother, do you know a website or YouTube tutorial I could learn to create fragment shader files from? Thank you

You can use the Shader tutorial for Processing: Shaders

Yes. Shaders are automatically parallelized on the GPU. I didn’t suggest them like @Shwaa did, because by the time you’ve done loadPixels() on a GL-backed PGraphics you often don’t really win: it’s doing all the same raw pixel looping to get and transform the data from the GPU. Where you can keep the data on the GPU, shaders are far better.

The Book of Shaders can be found online and is pretty good, although some of its advanced chapters are empty.

I would also make use of

And of course the best way to learn is to look at actual examples; Shadertoy can help.

Hi brother, after succeeding in applying your code in Java mode, I tried to use it in Processing Android mode and adapt the Ketai camera library, with its methods as a substitute for the video library. But there is an error saying it could not compile the fragment shader, and the error message is “sqrt”: no matching overloaded function found. Does the sqrt function within GLSL try to find its matching function in the Processing sketch? Thank you for your help.
This is the whole error message:
Process: processing.test.androcolortrackingwavgxnynthresholdwshader, PID: 18155
java.lang.RuntimeException: Cannot compile fragment shader:
0:22: S0001: No matching overload for function ‘sqrt’ found
at processing.core.PGraphics.showException(
at processing.opengl.PShader.compileFragmentShader(
at processing.opengl.PShader.compile(
at processing.opengl.PShader.init(
at processing.opengl.PShader.getUniformLoc(
at processing.opengl.PShader.setUniformImpl(
at processing.opengl.PShader.set(
at processing.test.androcolortrackingwavgxnynthresholdwshader.AndroColorTrackingWavgXnYnThresholdWshader.draw(
at processing.core.PApplet.handleDraw(
at processing.opengl.PSurfaceGLES$RendererGLES.onDrawFrame(
at android.opengl.GLSurfaceView$GLThread.guardedRun(
at android.opengl.GLSurfaceView$

This is the code with the Ketai camera:

import ketai.camera.*;

PShader colorFinder, colorPosShader;
PGraphics overlay, posBuffer;
// Variable for the capture device
KetaiCamera camera;

// A variable for the color we are searching for.
color trackColor;
float threshold = 0.1;

void setup() {
  size(640, 480, P2D);
  overlay = createGraphics(width, height, P2D);
  posBuffer = createGraphics(width, height, P2D);
  colorFinder = loadShader("colorDetect.glsl");
  colorPosShader = loadShader("colorPos.glsl");
  camera = new KetaiCamera(this, width, height, 24);
  camera.start();
  trackColor = color(255, 0, 0);
}

void onCameraPreviewEvent() {
  // pull the new preview frame into camera.pixels
  camera.read();
}

void draw() {
  colorFinder.set("threshold", threshold);
  colorFinder.set("targetColor", red(trackColor) / 255.0, green(trackColor) / 255.0, blue(trackColor) / 255.0, 1.0);
  colorPosShader.set("threshold", threshold);
  colorPosShader.set("targetColor", red(trackColor) / 255.0, green(trackColor) / 255.0, blue(trackColor) / 255.0, 1.0);

  // run the camera frame through each shader
  overlay.beginDraw();
  overlay.image(camera, 0, 0);
  overlay.filter(colorFinder);
  overlay.endDraw();

  posBuffer.beginDraw();
  posBuffer.image(camera, 0, 0);
  posBuffer.filter(colorPosShader);
  posBuffer.endDraw();

  // compute the average position by looking at pixels from the position buffer
  posBuffer.loadPixels();
  PVector avg = new PVector(0, 0);
  int count = 0;
  for (int i = 0; i < posBuffer.pixels.length; i++) {
    // encoded so blue is > 0 if a pixel is within the threshold
    if (blue(posBuffer.pixels[i]) > 0) {
      // Processing takes 0-1 (float) color values from the shader to 0-255 (int) values;
      // to decode, we need to divide the color by 255 to get the original value
      avg.add(red(posBuffer.pixels[i]) / 255.0, green(posBuffer.pixels[i]) / 255.0);
      count++;
    }
  }
  if (count > 0) {
    // we have the sum of positions, so divide by the number of additions
    avg.div((float) count);
    // convert the 0-1 position to a screen position
    avg.x *= width;
    avg.y *= height;
  } else {
    // appear offscreen
    avg = new PVector(-100, -100);
  }
  image(overlay, 0, 0);
  circle(avg.x, avg.y, 16);
  fill(0, 50);
  rect(0, 0, 150, 30);
  //text("FPS: " + frameRate, 20, 60);
  //text("Threshold: " + threshold, 20, 80);
}

void mousePressed() {
  // Save the color where the mouse is clicked in the trackColor variable
  camera.loadPixels();
  int loc = mouseX + mouseY * camera.width;
  trackColor = camera.pixels[loc];
}

void mouseWheel(MouseEvent e) {
  threshold -= e.getCount() * 0.01;
  threshold = constrain(threshold, 0, 1);
}

One of the first few lines informs you that the shader doesn’t know what the sqrt() function is.

GLSL has a lot of similarities to Processing; however, not all Processing or Java functions are native to GLSL, so perhaps sqrt isn’t native to GLSL?

Does it mean the compiler is looking for a matching function similar to the sqrt function within Processing?
As far as I know, the sqrt function’s task is calculating the distance, in short between b.x and a.x. Is it possible to use Processing’s built-in dist() function?

Hi brother!! May I know where the 3 comes from?

So GLSL does have a sqrt function; unfortunately I do not know enough about GLSL to be able to help with this one. Best of luck.

Someone on another site suggested I change sqrt(3) to sqrt(3.0), since the sqrt function takes a float, and it works.
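The likely cause: GLSL ES (used on Android) has no implicit int-to-float conversion, so sqrt(3) finds no matching overload, while desktop GLSL compilers (GLSL 1.20 and later) accept the integer and convert it. In short:

    float bad  = sqrt(3);   // GLSL ES: no matching overload for 'sqrt' found
    float good = sqrt(3.0); // OK: sqrt takes a float argument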


In the colorDetect shader, I comment // colors are from 0-1, so the max distance between colors is sqrt(3). To break this statement down: in GLSL shaders, colors are represented by a vec4, where each component of the vector stores a different color channel. The x component stores red, y stores green, z stores blue, and w stores alpha (opacity). In Processing, by default, RGB colors have integer components ranging from 0-255, but in the shader, the components scale from 0-1 as floats.
Because the color filtering code checks the “distance” between colors, you can think of the r, g, and b values storing your colors as points in 3D space. The distance function for 3 dimensions is
dist = sqrt((x2 - x1)^2 + (y2 - y1)^2 + (z2 - z1)^2)
And, since we know x, y, and z can only range from 0-1, the maximum distance occurs when one point is at (0, 0, 0) and the other is at (1, 1, 1). Plugging those numbers into the formula, you get
sqrt((1 - 0)^2 + (1 - 0)^2 + (1 - 0)^2) = sqrt(1 + 1 + 1)
which is just sqrt(3).
If you want to cut off this maximum distance to find similar colors, multiplying it by a 0-1 constant (the threshold) will ensure that the cutoff occurs within the maximum distance between colors.
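To make the arithmetic concrete, here is the same distance test in plain Java (not Processing-specific; the class and method names are just for illustration, and channel values are assumed already normalized to 0-1):

```java
public class ColorDistanceDemo {
    // Euclidean distance between two colors whose channels are in 0..1
    public static double colorDist(double r1, double g1, double b1,
                                   double r2, double g2, double b2) {
        double dr = r2 - r1, dg = g2 - g1, db = b2 - b1;
        return Math.sqrt(dr * dr + dg * dg + db * db);
    }

    public static void main(String[] args) {
        // black vs. white: the largest possible distance, sqrt(3) ~ 1.732
        System.out.println(colorDist(0, 0, 0, 1, 1, 1));
        System.out.println(Math.sqrt(3.0)); // same value

        // a threshold of 0.1 accepts colors within 10% of that maximum
        double cutoff = 0.1 * Math.sqrt(3.0);
        // a color close to pure red passes the test
        System.out.println(colorDist(1, 0, 0, 0.9, 0.05, 0.0) < cutoff);
    }
}
```

This is exactly what threshold * sqrt(3.0) does in the shader: it scales the 0-1 threshold up to the units of the color-distance space.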


Hi brother! If you don’t mind, would you help me check whether the target color appears within the range x = 500 to width and y = 320 to height? Thank you