Changing webcam-pixels to "nearest" color from color palette array

Hi

Using my webcam and a predefined color palette array, I want to change each pixel from the webcam into the closest matching color from the color palette array.

I am having trouble finding an efficient and stable way to do it.

Any help would be greatly appreciated!

Why was this topic moved to Processing --> libraries @GoToLoop?
Is there a library that does this already?

Best
Andreas


How far is that from what you want?

// https://processing.org/reference/filter_.html
// https://discourse.processing.org/t/changing-webcam-pixels-to-nearest-color-from-color-palette-array/14811

import processing.video.*;
Capture video;

void setup() {
  size(640, 480);
  video = new Capture(this, width, height);
  video.start();
}

void draw() { 
  if (video.available()) {
    video.read();
    video.filter(POSTERIZE, 8);
    image(video, 0, 0);
  }
}

Pretty far, I think. The goal is not a visual effect; it is to accurately figure out which color in the palette is closest to each pixel.

So what would the output be?

Sending an image with the correct colors to Runway to do SPADE-COCO generation, see https://github.com/runwayml/processing-library

So you want to change the colors of the picture?

If you want to quantize the colors, there must be a rule like
color name x = color(hx, sx, bx), with ranges:

  • hue from a to hx to b,
  • saturation from c to sx to d,
  • brightness from e to bx to f.

Do you have that rule or math?

And that calculation needs to run on each pixel's color?

import processing.video.*;
Capture video;

void setup() {
  size(320, 240);
  colorMode(HSB);
  video = new Capture(this, width, height);
  video.start();
}

void draw() { 
  if (video.available()) {
    video.read();
    //video.filter(POSTERIZE, 4);
    //image(video, 0, 0);                 // show org or filtered picture
    video.loadPixels();
    int count = video.width * video.height;
    for (int i = 0; i < count; i++) {
      color c = video.pixels[i];
      float h = hue(c);
      float s = saturation(c);
      float b = brightness(c);
      if ( hprint ) println("i "+i+" h "+nf(h, 0, 1)+" s "+nf(s, 0, 1)+" b "+nf(b, 0, 1) );
      // what is the RULE for a correct color?
      video.pixels[i] = color(h, s, b);
    }
    video.updatePixels();
    hprint = false;

    image(video, 0, 0);                   // show mod picture
  }
}

boolean hprint = false;
void keyPressed() {
  if ( key == 'h' ) hprint = true;
}

An example of the ranging math, here with WEB SAFE COLORS:

// https://processing.org/reference/filter_.html
// https://discourse.processing.org/t/changing-webcam-pixels-to-nearest-color-from-color-palette-array/14811

import processing.video.*;
Capture video;

void setup() {
  size(160, 120);
  //colorMode(HSB, 100);   // with a 0..100 range, the later float-to-int cast already posterizes (reduces color resolution)
  video = new Capture(this, width, height);
  video.start();
}

void draw() { 
  if (video.available()) {
    video.read();
    //video.filter(POSTERIZE, 4);         
    //image(video, 0, 0);                 // show org or filtered picture
    video.loadPixels();
    int count = video.width * video.height;
    for (int i = 0; i < count; i++) {
      color c = video.pixels[i];
      int h  = (int)hue(c);
      int s  = (int)saturation(c);
      int br = (int)brightness(c);
      int r  = (int)red(c);
      int g  = (int)green(c);
      int bl = (int)blue(c);
      // what is the RULE for a correct color?
      //h = h;
      //s = s;
      //br = br;
      // test web safe quantification
      int wr  = websafe(r);
      int wg  = websafe(g);
      int wbl = websafe(bl);
      if ( hsbprint ) println("i "+i+" h "+h+" s "+s+" b "+br+" / r "+r+" g "+g+" b "+bl+" // websafe wr "+wr+" wg "+wg+" wbl "+wbl);
      video.pixels[i] = color(wr, wg, wbl); //color(h, s, b);
    }
    video.updatePixels();
    hsbprint = false;

    image(video, 0, 0);                   // show mod picture
  }
}

int websafe(int c) {
  //https://www.rapidtables.com/web/color/Web_Safe.html
  // map a channel value c (0..255 for r / g / bl) to one of 6 web-safe levels
  int w = 0;
  if       ( c < 25 ) w = 0;
  else if  ( c < 75 ) w = 51;
  else if  ( c <125 ) w = 102;
  else if  ( c <175 ) w = 153;
  else if  ( c <225 ) w = 204;
  else                w = 255;
  return w;
}



boolean hsbprint = false;
void keyPressed() {
  if ( key == 'h' ) hsbprint = true;
}
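For reference, the threshold chain in websafe() is equivalent (up to off-by-one boundary values) to snapping each channel to the nearest multiple of 51. A minimal plain-Java sketch of that arithmetic form; the class name `WebSafe` is hypothetical:

```java
public class WebSafe {
    // Snap a channel value (0..255) to the nearest of the six
    // web-safe levels 0, 51, 102, 153, 204, 255.
    static int websafe(int c) {
        return Math.round(c / 51f) * 51;
    }

    public static void main(String[] args) {
        System.out.println(websafe(60));   // 51
        System.out.println(websafe(130));  // 153
    }
}
```

This replaces five comparisons per channel with one division and one multiplication, which matters once it runs on every pixel of every frame.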

In order to use a webcam, we already need the video library. :video_camera:


Super cool @kll, thanks for the example :smiley:

I do not have a specific color rule yet, but was thinking about using the closest hue. However, I have limited experience with this, so any suggestions for the best rule are very welcome. For the project I am facing a webcam towards perler beads, if that makes sense…

Also, instead of the web-safe colors, what if I were trying to match a predefined color palette like color[] myColors = {#FF0000, #FFC000, #E0FF00};? Any hints on how to achieve that?

What size palette are you talking about? Is this full color or grayscale?

Also, when you say “closest”, what color distance metric? If you want the closest in RGB colorspace then you can compute that efficiently with

d = (dx * dx + dy * dy + dz * dz); // no sqrt required

as you are basically sorting – for related discussion see https://stackoverflow.com/questions/3693514/very-fast-3d-distance-check

Note however that RGB distance is really rough – in some places it won’t mimic intuitive perceptual colorspaces, as it is based on display output, not on the eye.
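That squared-distance comparison could be sketched like this in plain Java; the class and method names (`ClosestColor`, `closestIndex`) are made up, and the palette in main() is the hypothetical one mentioned elsewhere in the thread:

```java
public class ClosestColor {
    // Palette entries as packed 0xRRGGBB ints; returns the index of
    // the nearest entry by squared RGB distance (no sqrt needed when
    // only comparing distances).
    static int closestIndex(int c, int[] palette) {
        int best = 0;
        long bestD = Long.MAX_VALUE;
        for (int i = 0; i < palette.length; i++) {
            int dr = ((c >> 16) & 0xFF) - ((palette[i] >> 16) & 0xFF);
            int dg = ((c >> 8) & 0xFF) - ((palette[i] >> 8) & 0xFF);
            int db = (c & 0xFF) - (palette[i] & 0xFF);
            long d = (long) dr * dr + (long) dg * dg + (long) db * db;
            if (d < bestD) { bestD = d; best = i; }
        }
        return best;
    }

    public static void main(String[] args) {
        int[] pal = {0xFF0000, 0xFFC000, 0xE0FF00};
        System.out.println(closestIndex(0xFA1010, pal));  // 0 (near red)
    }
}
```

Because sqrt is monotonic, comparing squared distances picks the same winner while skipping a square root per palette entry per pixel.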

To give you a frame for editing your own palette, see this mod:

import processing.video.*;
Capture video;

void setup() {
  size(640, 480);
  video = new Capture(this, width, height);
  video.start();
}

void draw() { 
  if (video.available()) {
    video.read();
    if ( !showfilter ) image(video, 0, 0);                 // show org or filtered picture
    video.loadPixels();
    int count = video.width * video.height;
    for (int i = 0; i < count; i++) 
      //video.pixels[i] = color(towebsafe(video.pixels[i]));
      video.pixels[i] = color(toUserPalette(video.pixels[i]));
    video.updatePixels();

    if ( showfilter ) image(video, 0, 0);                   // show mod picture
  }
}

color towebsafe( color in ) {
  return color( websafe((int)red(in)), websafe((int)green(in)), websafe((int)blue(in)) );
}

int websafe(int c) {
  //https://www.rapidtables.com/web/color/Web_Safe.html
  // map a channel value c (0..255 for r / g / bl) to one of 6 web-safe levels
  int w = 0;
  if       ( c < 25 ) w = 0;
  else if  ( c < 75 ) w = 51;
  else if  ( c <125 ) w = 102;
  else if  ( c <175 ) w = 153;
  else if  ( c <225 ) w = 204;
  else                w = 255;
  return w;
}

/*
color[] myColors = {#FF0000, #FFC000, #E0FF00};

                        0,   0, 0    // kll add
                      255,   0, 0
                      255, 192, 0
                      224, 255, 0
*/
int user_r(int c) {
  int w = 0;
  if       ( c <112 ) w = 0;
  else if  ( c <240 ) w = 224;
  else                w = 255;
  return w;
}
int user_g(int c) {
  int w = 0;
  if       ( c < 96 ) w = 0;
  else if  ( c <224 ) w = 192;
  else                w = 255;
  return w;
}
int user_b(int c) {
  return 0;                 // every blue component in this palette is 0
}

color toUserPalette( color in ) {
  return color( user_r((int)red(in)), user_g((int)green(in)), user_b((int)blue(in)) );
}

boolean showfilter = true;

void keyPressed() {
  if ( key == 'f' ) showfilter = ! showfilter;         // toggle show filtered or not
}


Hi @jeremydouglass :smiley:
Yeah, that is also sort of what I am asking. I do not have a specific color distance metric yet, and I’m asking for advice on that topic as well. I managed to integrate closest color in both RGB + HSB colorspace into @kll’s example and it is working decently, but I am wondering if there is a better way to do it? I don’t care about what is closer to the human eye, just as long as the colors of my beads are tracked consistently through the camera + script.

Another way if you need more raw speed might be to move the code into a shader:
https://processing.org/tutorials/pshader/

More details would be helpful. Are you trying to track multiple bead-like objects – e.g. blob detection, OpenCV? Are you trying to track multiple objects by detecting individual colors, or are you tracking blobs and then checking the colors where the blobs are found? Knowing that the whole region occupied by a black bead is “black” depends a lot on things like scale and lighting conditions – it can look gray, have a darker hole in the middle, reflect patches of white glare, and also reflect colors from things around it, et cetera. All of the individual pixels of a bead will almost never match its single color.

@jeremydouglass the beads should not be tracked as objects. What I want to do is to have a webcam be placed looking down on a perler bead and send an image with the correct colors to Runway to do SPADE-COCO generation, see https://github.com/runwayml/processing-library

So essentially what I want to do is change the colors of the picture, so that each bead (as seen through the webcam) gets converted into the nearest color from a predefined color palette. No tracking or anything like that.

This is what I have so far, and it is pretty decent (thanks to @kll for the help), but I am sure there could still be improvements. Speed is not the most important thing; it is more important to make it robust to light / shade / reflections / things around it etc., as you also mention.

import processing.video.*;
Capture video;

boolean applyBlur = true;
boolean findClosestColor = true;

color[] myColors = {
  #B8B1A6, 
  #273E56, 
  #334039, 
  #987113, 
  #5D0F12
};

void setup() {
  size(640, 360, P2D);
  video = new Capture(this, width, height);
  video.start();
}

void draw() {
  if (video.available()) {
    video.read();
  }
  if (applyBlur) video.filter(BLUR, 5.0);
  image(video, 0, 0);

  if (findClosestColor) {
    loadPixels();
    int count = width * height;
    for (int i = 0; i < count; i++) {
      color c = pixels[i];
      pixels[i] = closestColor(c);
    }
    updatePixels();
  } 
  fill(255);
  text("press 'b' to toggle blur and 'c' to toggle closestColor", 10, 10);
}

color closestColor(color c) {
  float recordDistance = Float.MAX_VALUE;
  int index = 0;

  for (int i = 0; i < myColors.length; i++) {
    float d = colorDistanceRGB(c, myColors[i]);   // compute the distance once per palette entry
    if (d < recordDistance) {
      recordDistance = d;
      index = i;
    }
  }
  return myColors[index];
}

float colorDistanceRGB(color a, color b) {
  float redDiff = red(a) - red(b);
  float grnDiff = green(a) - green(b);
  float bluDiff = blue(a) - blue(b);
  // the sqrt() is not needed just for comparing distances, but is kept for clarity
  return sqrt( sq(redDiff) + sq(grnDiff) + sq(bluDiff) );
}

void keyPressed() {
  switch(key) {
  case 'b':
    applyBlur = !applyBlur;
    break;
  case 'c': 
    findClosestColor = !findClosestColor;
    break;
  }
}
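Since speed was mentioned: one possible micro-optimization for a sketch like the one above (my own suggestion, not tested against the actual sketch) is to memoize the nearest-palette lookup per distinct packed color, because a webcam frame contains many repeated pixel values. A plain-Java sketch with hypothetical names, using the palette from the sketch above:

```java
import java.util.HashMap;

public class MemoClosest {
    static final int[] PALETTE = {0xB8B1A6, 0x273E56, 0x334039, 0x987113, 0x5D0F12};
    static final HashMap<Integer, Integer> cache = new HashMap<>();

    // Return the nearest palette color (packed 0xRRGGBB), caching the
    // result per distinct input color so repeats skip the search.
    static int closest(int c) {
        Integer hit = cache.get(c);
        if (hit != null) return hit;
        int best = 0;
        long bestD = Long.MAX_VALUE;
        for (int i = 0; i < PALETTE.length; i++) {
            int dr = ((c >> 16) & 0xFF) - ((PALETTE[i] >> 16) & 0xFF);
            int dg = ((c >> 8) & 0xFF) - ((PALETTE[i] >> 8) & 0xFF);
            int db = (c & 0xFF) - (PALETTE[i] & 0xFF);
            long d = (long) dr * dr + (long) dg * dg + (long) db * db;
            if (d < bestD) { bestD = d; best = i; }
        }
        cache.put(c, PALETTE[best]);
        return PALETTE[best];
    }

    public static void main(String[] args) {
        System.out.println(Integer.toHexString(closest(0x5A1015)));  // 5d0f12
    }
}
```

In Processing the cache key would be the int from `pixels[i]` (masking off the alpha byte); with a BLUR filter applied first the number of distinct colors grows, so the benefit depends on the scene.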

So somebody is adding the beads to a tray / rack as a drawing technique? Are they regularly placed? and are they barrel-up, so that each has a hole in the center, like this?



Yeah, like that!
This is my current lo-fi setup (from the angle of the camera) + my results with the code from my post above:

Got it! Are you planning to have a straight-overhead camera, or should it be robust from multiple angles?

In either case, you might want to consider camera calibration – and/or marking / registering your corners. For example, with BoofCV or OpenCV libraries

https://boofcv.org/index.php?title=Tutorial_Camera_Calibration

Yellow is suffering from glare. However, if you have a calibrated grid then you can just detect which color pixels fall in each grid square, and make each whole square equal to the most frequently occurring color.
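That per-square majority vote could be sketched like this in plain Java (hypothetical names; it assumes each pixel in the square has already been mapped to a palette index):

```java
public class GridMajority {
    // Given the palette indices of the pixels inside one grid square,
    // return the most frequently occurring index.
    static int majorityIndex(int[] indices, int paletteSize) {
        int[] counts = new int[paletteSize];
        for (int idx : indices) counts[idx]++;
        int best = 0;
        for (int i = 1; i < paletteSize; i++)
            if (counts[i] > counts[best]) best = i;
        return best;
    }

    public static void main(String[] args) {
        int[] square = {2, 2, 1, 2, 0, 2, 1};  // e.g. glare pixels vote 1, bead pixels vote 2
        System.out.println(majorityIndex(square, 3));  // 2
    }
}
```

This is what makes the approach robust to glare: a patch of wrong-colored pixels is simply outvoted by the rest of the square.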


If you use HSB instead of RGB, wouldn’t be the “distance” more perceptually correct?

(NB – I don’t really believe in the term “perceptually correct”. I am colorblind and know non-colorblind people suffer an exaggerated perception of some dull color they call “red.”)


In general, I don’t believe (?) that HSB distances in Processing will have less warping than RGB – I haven’t tested this, but I think it is just a different numerical representation of the same non-perceptual linear colorspace. My understanding is that, whatever kind of vision you have, the excitation of whatever rods / cones you have in response to input is curved, not linear – so estimates of “nearness” of two colors, or particularly of which is nearer given three colors, computed on linear arrangements of RGB or HSB values may produce distorted results when comparing changes in different directions through color space. Still, HSB and RGB should often be good enough. Measuring just H distance can be helpful in that it is at least easily ordered – although comparing a triangle of points on the ring can still be tricky.
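One detail of "just H distance" worth making explicit: hue wraps around, so 350° and 10° are 20° apart, not 340°. A minimal plain-Java sketch (hypothetical names; it assumes hue scaled to 0..360, whereas Processing's default HSB range is 0..255, so it would need rescaling or a colorMode() call):

```java
public class HueDistance {
    // Hue lives on a ring, so the distance between two hues is the
    // shorter way around the circle (never more than 180 degrees).
    static float hueDist(float h1, float h2) {
        float d = Math.abs(h1 - h2) % 360f;
        return d > 180f ? 360f - d : d;
    }

    public static void main(String[] args) {
        System.out.println(hueDist(350f, 10f));  // 20.0
        System.out.println(hueDist(10f, 20f));   // 10.0
    }
}
```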

That is fair – it is normative in a way that isn’t justified. There are many different perceptual colorspaces, but they share some properties. Possibly relevant is the Processing library Color Blindness, which contains code for both Daltonization and color blindness simulation.

Previous forum announcement of that library:

Wanted to share some initial results here :slight_smile:

The tracking could be better, might share some debug views and ask for advice on how to improve it later on…
