How to get the positions of moving pixels with the camera

I tried to convert the code at this URL into code that Processing can run.

MotionDetectionTest

In order to capture the positions of multiple moving pixels with the camera, I converted part of the code, but an error occurs on lines 64-66: ArrayIndexOutOfBoundsException. I haven’t been able to solve this problem despite trying for a long time. I really hope to get your help! Thank you very much!

import processing.video.*;

Capture video;
PImage prev;
PImage img;
ArrayList<FlowZone> fZones;

int nb=20;
float UVCutoff = 3; 

void setup() {
  size(640, 480, P2D);
  video = new Capture(this, width, height, 30 );
  video.start();
  prev = createImage(width, height, RGB);
}

void captureEvent(Capture video) {
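  // Keep a copy of the frame we already have in prev, then read the new frame from the camera.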
  prev.copy(video, 0, 0, video.width, video.height, 0, 0, prev.width, prev.height);
  prev.updatePixels();
  video.read();
}

void draw() {
  video.loadPixels(); 
  prev.loadPixels();
  loadPixels();
  image(video, 0, 0);

  ArrayList<FlowZone> fZones = calculateFlow (video.width, video.height);

  for (FlowZone fz : fZones) {
    fz.draw();
  }   
  updatePixels();
}


ArrayList<FlowZone> calculateFlow(int width, int height) {
  ArrayList<FlowZone> zones = new ArrayList<FlowZone>();
  if (prev.pixels == null)
    return zones;  

  int step = 8;
  int winStep = step * 2 + 1;
  int A2, A1B2, B1, C1, C2;
  int u, v;
  int wMax = width - step - 1;
  int hMax = height - step - 1;
  int globalY, globalX, localY, localX;

  for ( globalY = step + 1; globalY < hMax; globalY += winStep) 
  {
    for ( globalX = step + 1; globalX < wMax; globalX += winStep) 
    {
      A2 = A1B2 = B1 = C1 = C2 = 0;

      for ( localY = -step; localY <= step; localY++) 
      {
        for ( localX = -step; localX <= step; localX++) 
        {
          int address = (globalY + localY) * width + globalX + localX;

          float gradX = (video.pixels[(address - 1) * 4]) - (video.pixels[(address + 1) * 4]);
          float gradY = (video.pixels[(address - width) * 4]) - (video.pixels[(address + width) * 4]);
          float gradT = (prev.pixels[address * 4]) - (video.pixels[address * 4]);

          A2 += gradX * gradX;
          A1B2 += gradX * gradY;
          B1 += gradY * gradY;
          C2 += gradX * gradT;
          C1 += gradY * gradT;
        }
      }

      int delta = (A1B2 * A1B2 - A2 * B1);

      if (delta != 0) {
        /* system is not singular - solving by Kramer method */
        int Idelta = step / delta;
        int deltaX = -(C1 * A1B2 - C2 * B1);
        int deltaY = -(A1B2 * C2 - A2 * C1);
        u = deltaX * Idelta;
        v = deltaY * Idelta;
      } else
      {
        /* singular system - find optical flow in gradient direction*/
        int norm = (A1B2 + A2) * (A1B2 + A2) + (B1 + A1B2) * (B1 + A1B2);
        if (norm != 0) 
        {
          int IGradNorm = step / norm;
          int temp = -(C1 + C2) * IGradNorm;
          u = (A1B2 + A2) * temp;
          v = (B1 + A1B2) * temp;
        } else
        {
          u = v = 0;
        }
      }

      if (-winStep < u && u < winStep &&
        -winStep < v && v < winStep) {
        zones.add(new FlowZone(globalX, globalY, u, v));
      }
    }
  }
  return zones;
}


class FlowZone {
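  // Holds one sampling window's center (X, Y) and its estimated flow vector (U, V).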
  public float X, Y, U, V;

  public FlowZone(float x, float  y, float  u, float  v) {
    X = x;
    Y = y;
    U = u;
    V = v;
  }  

  public void draw() {   
    if ((abs(U) + abs(V)) / 2.0 < UVCutoff) return;     
    pushMatrix();
    pushStyle();
    stroke(255);
    fill(0, 0, 0, 80);
    line(X, Y, U*3, V*3);
    popStyle();
    popMatrix();
  }
}

The sketch you linked works for me (Chrome)

Can I ask you to elaborate on the problem? How do you know the error is on lines 64-66?

Yes, the linked sketch works properly in Chrome.
But I want to convert it to run in Processing 3. My main goal is to get the positions of these moving pixels so I can use them to generate other effects. I made a lot of conversions, such as the code I attached above, but Processing 3 reports an error on lines 64-66: ArrayIndexOutOfBoundsException: 307204.

Lines 64-66:

float gradX = (video.pixels[(address - 1) * 4]) - (video.pixels[(address + 1) * 4]);
float gradY = (video.pixels[(address - width) * 4]) - (video.pixels[(address + width) * 4]);
float gradT = (prev.pixels[address * 4]) - (video.pixels[address * 4]);

Since I’m a beginner, I’m not sure if there are bugs elsewhere in the code.
Thank you very much! (sorry for my Google translation)

Ok, I think I understand.

You’re trying to convert p5.js (JavaScript Processing) into Java Processing (JP)?

If that’s the case, I think I know why you’re getting out-of-bounds errors.

p5.js and JP, while similar, actually lay out the pixels array very differently.

While JP’s pixels[] holds one packed color value per pixel, p5.js’s pixels[] has a separate entry for every individual color channel: one for red, one for green, one for blue, and one for alpha. Thus:

  • JP’s pixels array is of size: width * height.
  • p5.js’s pixels array is of size: width * height * 4.

That’s why the p5.js code has a * 4 everywhere, and why index 307204 runs past the end of JP’s 640 * 480 = 307200-entry array.

To fix your problem, try removing the * 4’s.
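Something like this (untested, but the indices will at least stay inside JP’s pixels array):

float gradX = (video.pixels[address - 1]) - (video.pixels[address + 1]);
float gradY = (video.pixels[address - width]) - (video.pixels[address + width]);
float gradT = (prev.pixels[address]) - (video.pixels[address]);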


Thank you for your answer.
After I deleted the 4’s, the JP sketch actually runs. But it still can’t capture the moving pixels the way the linked sketch does. I’ll keep looking into what the problem is.
Thank you very much!

I don’t think I understand what you mean by “links”. Can you elaborate?

Sorry, I mean it can’t be as good as the sketch in the link. The JP code still doesn’t capture the moving pixels properly. This is my latest tweak of the code. I’ve looked at it for a long time but still can’t figure it out, and I suspect the problem is in how I use prev.pixels and video.pixels.

Because I replaced the oldImage and newImage from the p5.js sketch with prev.pixels and video.pixels, I printed both with println() and found that the values were different.

This is my latest code:

import processing.video.*;

Capture video;
PImage prev;
PImage img;
ArrayList<FlowZone> fZones;

int nb=20;
float UVCutoff = 3.2; 

void setup() {
  size(640, 480, P2D);
  video = new Capture(this, width, height, 30);
  video.start();
  prev = createImage(width, height, RGB);
}

void captureEvent(Capture video) {
  prev.copy(video, 0, 0, video.width, video.height, 0, 0, prev.width, prev.height);
  prev.updatePixels();
  video.read();
}

void draw() {
  //background(0);
  video.loadPixels(); 
  prev.loadPixels();
  loadPixels();
  image(video, 0, 0);

  ArrayList<FlowZone> fZones = calculateFlow (video.width, video.height);

  for (FlowZone fz : fZones) {
    fz.draw();
  }   
  updatePixels();
}


ArrayList<FlowZone> calculateFlow(int width, int height) {
  
  //println(prev.pixels,video.pixels);
   
  ArrayList<FlowZone> zones = new ArrayList<FlowZone>();
  if (prev.pixels == null)
    return zones;  

  int step = 8;
  float winStep = step * 2 + 1;
  float A2, A1B2, B1, C1, C2;
  float u, v;
  float wMax = width - step - 1;
  float hMax = height - step - 1;
  int globalY, globalX, localY, localX;

  for ( globalY = step + 1; globalY < hMax; globalY += winStep) 
  {
    for ( globalX = step + 1; globalX < wMax; globalX += winStep) 
    {
      A2 = A1B2 = B1 = C1 = C2 = 0;

      for ( localY = -step; localY <= step; localY++) 
      {
        for ( localX = -step; localX <= step; localX++) 
        {
          int address = (globalY + localY) * width + globalX + localX;

          //float gradX = (video.pixels[(address - 1) * 4]) - (video.pixels[(address + 1) * 4]);
          //float gradY = (video.pixels[(address - width) * 4]) - (video.pixels[(address + width) * 4]);
          //float gradT = (prev.pixels[address * 4]) - (video.pixels[address * 4]);

          float gradX = (video.pixels[address - 1]) - (video.pixels[address + 1]);
          float gradY = (video.pixels[address - width]) - (video.pixels[address + width]);
          float gradT = (prev.pixels[address]) - (video.pixels[address ]);

          A2 += gradX * gradX ;
          A1B2 += gradX * gradY ;
          B1 += gradY * gradY ;
          C2 += gradX * gradT ;
          C1 += gradY * gradT ;
          //println(gradT);
        }
      }

      float delta = (A1B2 * A1B2 - A2 * B1);

      //println(A2, A1B2, B1,C2,C1,delta);
      //println(delta);
      
      if (delta != 0) {
        /* system is not singular - solving by Kramer method */
        float Idelta = step / delta;
        float deltaX = -(C1 * A1B2 - C2 * B1);
        float deltaY = -(A1B2 * C2 - A2 * C1);
        //if (deltaX != 0 && deltaY != 0) {
        u = deltaX * Idelta;
        v = deltaY * Idelta;
        //println(delta, Idelta, deltaX, deltaY, u, v);
      } else {
        /* singular system - find optical flow in gradient direction*/
        float norm = (A1B2 + A2) * (A1B2 + A2) + (B1 + A1B2) * (B1 + A1B2);
        //println(norm);
        if (norm != 0) {
          float IGradNorm = step / norm;
          float temp = -(C1 + C2) * IGradNorm;
          u = (A1B2 + A2) * temp;
          v = (B1 + A1B2) * temp;
          //println(delta, u, v);
        } else {
          u = v = 0;
        }
      }

      if (-winStep < u && u < winStep &&
        -winStep < v && v < winStep
        && u != 0.0 && v != 0.0 ) {
        zones.add(new FlowZone(globalX, globalY, u, v));
      }
    }
  }
  return zones;
}


class FlowZone {
  public float X, Y, U, V;

  public FlowZone(float x, float  y, float  u, float  v) {
    X = x;
    Y = y;
    U = u;
    V = v;
    //println( u, v);
  }  

  public void draw() {   
    if ((abs(U) + abs(V)) / 2.0 < UVCutoff) return;     
    pushMatrix();
    pushStyle();
    stroke(255);
    fill(0, 0, 0, 80);
    translate(X, Y);
    line(0, 0, U*3, V*3);
    popStyle();
    popMatrix();
  }
}

So, I haven’t worked too much with JP’s pixels[], but if I remember correctly each entry is one packed color value.

If you print one out, it shows up as a weird integer. That’s because the four channel bytes are packed into the bits of a single int, and println() just shows that int.
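For example, here is a tiny standalone sketch (mine, just to illustrate the packing; it is not from the linked code):

color c = color(200, 50, 10);
println(c);                          // -3657206: the A, R, G, B bytes packed into one signed int
println(red(c), green(c), blue(c));  // 200.0 50.0 10.0
println((c >> 16) & 0xFF);           // 200: the red channel again, extracted with a bit shift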

Basically, the former p5.js code was only approximating change in a single channel:

pixels[i * 4] is the red channel, pixels[(i * 4) + 1] is the green channel, and so on. It wasn’t measuring change in the color as a whole; just one of the channels.

float gradX = (video.pixels[address - 1]) - (video.pixels[address + 1]);

might be calculating something you don’t want. To get the same number the p5.js code was calculating, you’ll have to grab the color’s red channel every time you access pixels[].

Try this :thinking:

float gradX = red(video.pixels[address - 1]) - red(video.pixels[address + 1]);
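And the same idea for the other two gradients (untested, just applying the same red() trick):

float gradY = red(video.pixels[address - width]) - red(video.pixels[address + width]);
float gradT = red(prev.pixels[address]) - red(video.pixels[address]);

If you would rather react to overall change instead of just the red channel, brightness() works the same way.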

I tried it but it didn’t work, and I think there might be a mistake somewhere else. I can’t find it yet.

However, I tried another approach, using the OpenCV library, which successfully achieved the effect I wanted.

// Dense optical flow
import processing.video.*;
import org.opencv.video.*;
import org.opencv.video.Video;

// Capture size
final int CAPW = 640;
final int CAPH = 480;

Capture cap;
CVImage img;
float scaling;
int w, h;
Mat last;

void setup() {
  size(640, 480);
  System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
  cap = new Capture(this, CAPW, CAPH);
  cap.start();
  scaling = 10;
  w = floor(CAPW/scaling);
  h = floor(CAPH/scaling);
  img = new CVImage(w, h);
  last = new Mat(h, w, CvType.CV_8UC1);
}

void draw() {
  if (!cap.available()) 
    return;
  background(0);
  cap.read();
  img.copy(cap, 0, 0, cap.width, cap.height, 
    0, 0, img.width, img.height);
  img.copyTo();
  Mat grey = img.getGrey();
  Mat flow = new Mat(last.size(), CvType.CV_32FC2);
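  // Farneback dense optical flow from the previous grey frame (last) to the current one (grey).
  // The numeric arguments are: pyramid scale 0.5, 3 pyramid levels, window size 10,
  // 2 iterations, poly_n 7, poly_sigma 1.5, plus the Gaussian-window flag.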
  Video.calcOpticalFlowFarneback(last, grey, flow, 
    0.5, 3, 10, 2, 7, 1.5, Video.OPTFLOW_FARNEBACK_GAUSSIAN);
  grey.copyTo(last);
  image(cap, 0, 0);
  drawFlow(flow);
  grey.release();
  flow.release();
  text(nf(round(frameRate), 2), 10, 20);
}

void drawFlow(Mat f) {
  // Draw the flow data.
  pushStyle();
  noFill();
  for (int y=4; y<f.rows(); y+=3) {
    int py = (int)constrain(y*scaling, 0, cap.height-1);
    for (int x=4; x<f.cols(); x+=3) {
      double [] pt = f.get(y, x);
      float dx = (float)pt[0];
      float dy = (float)pt[1];

      // Skip areas with no flow.
      if (dx == 0 && dy == 0)  continue;

      int px = (int)constrain(x*scaling, 0, cap.width-1);
      //color col = cap.pixels[py*cap.width+px];
      stroke(255);
      dx *= scaling;
      dy *= scaling;
      float ml = 1;
      if ( abs(dx) > ml && abs(dy) > ml) {
        //line(px+cap.width, py, px+cap.width+dx, py+dy);
        line(px, py, px+dx, py+dy);
      }
    }
  }
  popStyle();
}

import org.opencv.core.*;
import org.opencv.imgproc.*;
import java.nio.ByteBuffer;
import java.util.ArrayList;

public class CVImage extends PImage {
  final private MatOfInt BGRA2ARGB = new MatOfInt(0, 3, 1, 2, 2, 1, 3, 0);
  final private MatOfInt ARGB2BGRA = new MatOfInt(0, 3, 1, 2, 2, 1, 3, 0);
  // cvImg - OpenCV Mat in BGRA format
  // pixCnt - number of bytes in the image
  private Mat cvImg;
  private int pixCnt;

  public CVImage(int w, int h) {
    super(w, h, ARGB);
    System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
    pixCnt = w*h*4;
    cvImg = new Mat(new Size(w, h), CvType.CV_8UC4, Scalar.all(0));
  }

  public void copyTo() {
    // Copy from the PImage pixels array to the Mat cvImg
    Mat tmp = new Mat(new Size(this.width, this.height), CvType.CV_8UC4, Scalar.all(0));
    ByteBuffer b = ByteBuffer.allocate(pixCnt);
    b.asIntBuffer().put(this.pixels);
    b.rewind();
    tmp.put(0, 0, b.array());
    cvImg = ARGBToBGRA(tmp);
    tmp.release();
  }

  public void copyTo(PImage i) {
    // Copy from an external PImage to here
    if (i.width != this.width || i.height != this.height) {
      println("Size not identical");
      return;
    }
    PApplet.arrayCopy(i.pixels, this.pixels);
    this.updatePixels();
    copyTo();
  }

  public void copyTo(Mat m) {
    // Copy from an external Mat to both the Mat cvImg and PImage pixels array
    if (m.rows() != this.height || m.cols() != this.width) {
      println("Size not identical");
      return;
    }
    Mat out = new Mat(cvImg.size(), cvImg.type(), Scalar.all(0));
    switch (m.channels()) {
    case 1:
      // Greyscale image
      Imgproc.cvtColor(m, cvImg, Imgproc.COLOR_GRAY2BGRA);
      break;
    case 3:
      // 3 channels colour image BGR
      Imgproc.cvtColor(m, cvImg, Imgproc.COLOR_BGR2BGRA);
      break;
    case 4:
      // 4 channels colour image BGRA
      m.copyTo(cvImg);
      break;
    default:
      println("Invalid number of channels " + m.channels());
      return;
    }
    out = BGRAToARGB(cvImg);
    ByteBuffer b = ByteBuffer.allocate(pixCnt);
    out.get(0, 0, b.array());
    b.rewind();
    b.asIntBuffer().get(this.pixels);
    this.updatePixels();
    out.release();
  }

  private Mat BGRAToARGB(Mat m) {
    Mat tmp = new Mat(m.size(), CvType.CV_8UC4, Scalar.all(0));
    ArrayList<Mat> in = new ArrayList<Mat>();
    ArrayList<Mat> out = new ArrayList<Mat>();
    Core.split(m, in);
    Core.split(tmp, out);
    Core.mixChannels(in, out, BGRA2ARGB);
    Core.merge(out, tmp);
    return tmp;
  }

  private Mat ARGBToBGRA(Mat m) {
    Mat tmp = new Mat(m.size(), CvType.CV_8UC4, Scalar.all(0));
    ArrayList<Mat> in = new ArrayList<Mat>();
    ArrayList<Mat> out = new ArrayList<Mat>();
    Core.split(m, in);
    Core.split(tmp, out);
    Core.mixChannels(in, out, ARGB2BGRA);
    Core.merge(out, tmp);
    return tmp;
  }

  public Mat getBGRA() {
    // Get a copy of the Mat cvImg
    Mat mat = cvImg.clone();
    return mat;
  }

  public Mat getBGR() {
    // Get a 3 channels Mat in BGR
    Mat mat = new Mat(cvImg.size(), CvType.CV_8UC3, Scalar.all(0));
    Imgproc.cvtColor(cvImg, mat, Imgproc.COLOR_BGRA2BGR);
    return mat;
  }

  public Mat getGrey() {
    // Get a greyscale copy of the image
    Mat out = new Mat(cvImg.size(), CvType.CV_8UC1, Scalar.all(0));
    Imgproc.cvtColor(cvImg, out, Imgproc.COLOR_BGRA2GRAY);
    return out;
  }
}

Thank you very much for your help. :coffee::coffee::coffee: I think I’ll find the error in the previous code eventually.

Glad you got this working :blush:

Can I ask what you’re doing with this?