Native Android camera in Processing Android

I think it is!
If you place a timer every time the event triggers, you can check it! I did!
It triggers once every 130-140 milliseconds, so it is indeed approximately 7 fps.

Maybe you are seeing Processing's actual frame rate, which can be really fast!

Could you check with a timer and print every time a capture event triggers, to see if you really get 15 fps? If that is the case, then for sure it is a software (library) bug.

But after all, that shows me it is a software issue (in Ketai's library, where it calculates fps), not hardware!
The library just selects the first available previewRate the camera supports. Unfortunately, as I mentioned to @akenaton, I am not into Java enough to dig into Ketai's library; it would take me ages! :slight_smile:

And here is an interesting article about speed, since I see Noel and Paul are into graphics with Processing!

speed tests!
http://www.magicandlove.com/blog/2014/07/28/processing-performance-test-2/
and
http://www.magicandlove.com/blog/2014/07/28/processing-performance-test-1/

Even pushMatrix/popMatrix make the frame rate significantly lower, so I use code to mirror a PImage (for example video) directly in the pixel array instead; it boosts the frame rate 4-5 times with a bit of code!
I also always use a single loop over the image pixel array and, if I need the coordinates, I calculate them as y = i/width and x = i%width, roughly as in the sketch below.
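
A minimal sketch of that single-loop mirroring (the function name is just for illustration): one pass over the pixel array, recovering x and y from the flat index, instead of pushMatrix/scale(-1, 1).

PImage mirrorX(PImage src) {
  PImage dst = createImage(src.width, src.height, RGB);
  src.loadPixels();
  dst.loadPixels();
  for (int i = 0; i < src.pixels.length; i++) {
    int y = i / src.width;                          // row from the flat index
    int x = i % src.width;                          // column from the flat index
    int mirrored = y * src.width + (src.width - 1 - x);
    dst.pixels[i] = src.pixels[mirrored];           // copy the horizontally mirrored pixel
  }
  dst.updatePixels();
  return dst;
}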

Mr. Chung is a great help to the Processing community, with extremely good articles on his blog! Respect!


I think it does. See the code here. It takes the camera hardware values.
(You can also take this whole class directly into your code, instead of importing it as a library, and make some changes.)


I'll try later to set nearestFPS to 30 manually to check if it works.

Also, I'll check if I can find a more efficient way to decode YUV420SP; a reference version is sketched below.

There must be a solution.
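
For reference, this is the widely used integer-math NV21/YUV420SP-to-ARGB routine from the Android camera samples (I believe KetaiCamera does something very similar internally); the array and parameter names here are just illustrative:

void decodeYUV420SP(int[] rgb, byte[] yuv420sp, int width, int height) {
  // yuv420sp holds the raw NV21 preview bytes; rgb receives packed ARGB pixels
  final int frameSize = width * height;
  for (int j = 0, yp = 0; j < height; j++) {
    int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
    for (int i = 0; i < width; i++, yp++) {
      int y = (0xff & yuv420sp[yp]) - 16;
      if (y < 0) y = 0;
      if ((i & 1) == 0) {                  // each U/V pair is shared by two pixels
        v = (0xff & yuv420sp[uvp++]) - 128;
        u = (0xff & yuv420sp[uvp++]) - 128;
      }
      int y1192 = 1192 * y;                // fixed-point YUV -> RGB conversion
      int r = y1192 + 1634 * v;
      int g = y1192 - 833 * v - 400 * u;
      int b = y1192 + 2066 * u;
      if (r < 0) r = 0; else if (r > 262143) r = 262143;
      if (g < 0) g = 0; else if (g > 262143) g = 262143;
      if (b < 0) b = 0; else if (b > 262143) b = 262143;
      rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
    }
  }
}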

@noel did you check with a timer event to see if your preview fps is actually 15? It would be nice to have that feedback from you or @paulgoux, who gets 15, so one frame every 66-67 milliseconds.

I’ll check later and post back


Not when I choose a bigger cam-frame; then it drops to the mentioned 7 f/s. The draw() frame is not updated until it finishes its for loops, so I still argue that the slowness of the sketch is in its draw() calculations. If you use, for instance, if (frameCount % 5 == 0) { } around the for-loops you get a much better result (see the sketch below). You haven't posted code yet, but I assume you are tracking motion by color comparison, like in my sketch. But if you are going to use a powerful red LED, for instance, you only need to search for a single really red pixel in the pixels array, which will be a lot faster.
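
A minimal sketch of that throttling idea, assuming a KetaiCamera called cam is already set up and started as in the other sketches in this thread: the expensive pixel scan only runs every 5th draw() frame, while rendering keeps its normal rate.

void draw() {
  if (cam != null && cam.isStarted()) {
    image(cam, 0, 0, width, height);      // drawing the preview stays at full rate
    if (frameCount % 5 == 0) {            // heavy work only every 5th frame
      cam.loadPixels();
      for (int i = 0; i < cam.pixels.length; i++) {
        // ...color comparison / motion tracking goes here...
      }
    }
  }
}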


The code is identical to yours and based on Shiffman's blob detection.
I just do a frame differencing before that, so I search for the color only in the area that differs from the previous frame; I calculate the min and max x,y of the area that changed (roughly as in the sketch below). If you want I can post it, of course, but I do not see any significant differences from yours that are worth mentioning.

The problem is that with 7 fps it is not worth trying on the phone; 15 fps might be OK.
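
A rough sketch of that differencing step (the threshold value and helper name are mine, not from the actual sketch): compare the current camera frame with the previous one and keep the bounding box of the changed pixels, so the color search can be restricted to that box.

PImage prevFrame;
int diffThreshold = 40;           // assumed per-channel difference threshold
int minX, minY, maxX, maxY;       // bounding box of the changed area

void updateChangedRegion(PImage cur) {
  if (prevFrame == null) {        // first frame: nothing to compare against yet
    prevFrame = cur.get();
    return;
  }
  minX = cur.width; minY = cur.height; maxX = 0; maxY = 0;
  cur.loadPixels();
  prevFrame.loadPixels();
  for (int i = 0; i < cur.pixels.length; i++) {
    color a = cur.pixels[i];
    color b = prevFrame.pixels[i];
    float d = abs(red(a)-red(b)) + abs(green(a)-green(b)) + abs(blue(a)-blue(b));
    if (d > diffThreshold) {      // this pixel changed since the last frame
      int y = i / cur.width;
      int x = i % cur.width;
      if (x < minX) minX = x;
      if (x > maxX) maxX = x;
      if (y < minY) minY = y;
      if (y > maxY) maxY = y;
    }
  }
  prevFrame = cur.get();          // remember this frame for the next comparison
  // the color search can now be limited to (minX, minY)-(maxX, maxY)
}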

A really dirty sketch I tried on the phone was:


import ketai.camera.*;

boolean penDown = false;

int evaisthisia = 2; // "sensitivity": minimum movement (in pixels) before a new point is drawn
//boolean ease_drawing = false;
KetaiCamera cam;
color trackColor= color(135, 13, 26);

boolean erasing = false;
float prevX=-1;
float prevY=-1;
float easing = 0.45;
float xe;
float ye;
boolean ease_drawing = false;
float mouse_easing = 0.15;
float xem;
float yem;
float line_weight = 2;
int brightnes_thresh=220;
int red_thresh = 245;
int green_thresh=220;
PImage send;
boolean mode_laser = true;
int ww = 640;
int hh = 480;
boolean landscape = true; 
float avgX=0;
float avgY=0;
float pgScaling =1.2;
int how_fast = 1;
PImage ttt;
int m;
void setup() {
  fullScreen();
  orientation(PORTRAIT);


  //imageMode(CENTER);
  textAlign(CENTER, CENTER);
  textSize(displayDensity * 10);
  cam = new KetaiCamera(this, 120, 160, 30);
  ttt = createImage(120, 160, ARGB);
  ttt.loadPixels();
  for (int i=0; i<ttt.pixels.length; i++) ttt.pixels[i]=color(0, 0, 0, 0);
  ttt.updatePixels();

  m = millis();
  cam.start(); // start the preview; onCameraPreviewEvent() will call cam.read()
}


void clearScreen() {
  //pg = createGraphics(width, height);   //clear pggraphics
  //pg.beginDraw();
  //pg.background(0, 0, 0, 0);
  //pg.endDraw();
  ttt = createImage(120, 160, ARGB);
  ttt.loadPixels();
  for (int i=0; i<ttt.pixels.length; i++) ttt.pixels[i]=color(0, 0, 0, 0);
  ttt.updatePixels();
}



void mousePressed()
{
  showcam=!showcam;
  //Toggle Camera on/off
  if (mouseX < width/3 && mouseY < 100)
  {
    if (cam.isStarted())
    {
      cam.stop();
    } else
      cam.start();
  }

  if (mouseX < 2*width/3 && mouseX > width/3 && mouseY < 100)
  {
    if (cam.getNumberOfCameras() > 1)
    {
      cam.setCameraID((cam.getCameraID() + 1 ) % cam.getNumberOfCameras());
    }
  }

  //Toggle Camera Flash
  if (mouseX > 2*width/3 && mouseY < 100)
  {
    //if (cam.isFlashEnabled())
    //  cam.disableFlash();
    //else
    //  cam.enableFlash();
    clearScreen();
    showcam=!showcam;
  }
}

void drawUI()
{
  //pushStyle();
  textAlign(LEFT);
  //fill(0);
  noFill();
  stroke(255);
  rect(0, 0, width/3, 100);
  rect(width/3, 0, width/3, 100);

  rect((width/3)*2, 0, width/3, 100);

  fill(255);
  if (cam.isStarted())
    text("Camera Off", 5, 80); 
  else
    text("Camera On", 5, 80); 

  if (cam.getNumberOfCameras() > 0)
  {
    text("Switch Camera", width/3 + 5, 80);
  }



  text("Clear Scr", width/3*2 + 5, 80);
}

boolean first_pass = true;


void onCameraPreviewEvent()
{
  cam.read();
  int elapsed = millis()-m;

  if (elapsed > 0) println(cam.width +" x "+cam.height+"  "+1000/elapsed+" fps");
  m = millis();
}


void draw() {
  if (cam != null && cam.isStarted()) {

    avgX=0;
    avgY=0;
    int count = 0;

    cam.loadPixels();

    for (int i=0; i<cam.pixels.length; i+=how_fast) {
      int y = i/cam.width;
      int x = i%cam.width;
      color currentColor = cam.pixels[i];
      float r1 = red(currentColor);
      float g1 = green(currentColor);
      float b1 = blue(currentColor);

      if (r1-g1>100 && r1-b1>100) {  

        cam.pixels[i]= color(255, 255, 0); //inkme;


        count++;
        avgX += x;
        avgY += y;
      }
    }
    cam.updatePixels();
    //ttt.updatePixels();

    if (showcam) {
      image(cam, 0, 0, width, height);
      image(ttt, 0, 0, width, height);
      //image(cam, width/2, height/2, width, height);
      stroke(255, 255, 0);
      line(0, height/2, width, height/2);
      line(width/2, 0, width/2, height);
      text(frameRate, width/2, height/2);
    } else background(125);

    if (count > 0) { //limit) { 
      //println("pixels "+count);
      avgX = avgX / count;
      avgY = avgY / count;
      //float scale = width
      ttt.loadPixels();
      int pos = int(avgX)+int(avgY)*cam.width;
      ttt.pixels[pos]= color(255, 0, 0); //inkme;
      ttt.updatePixels();

      if (first_pass) {  
        prevX = avgX;
        prevY = avgY;
        xe = prevX; 
        ye = prevY;
        first_pass = !first_pass;
        penDown = true;
      } else { 
        float distanceFromPrevFrame = dist(prevX, prevY, avgX, avgY); // evaisthisia
        if (distanceFromPrevFrame>=evaisthisia) {
          if (ease_drawing) {
            float targetX = avgX;
            float dx = targetX - xe;
            xe += dx * easing; 
            float targetY = avgY;
            float dy = targetY - ye;
            ye += dy * easing;
            avgX = (xe);
            avgY = (ye);
          }

          // show cursors o or x if erasing or deleting
          if (erasing) {
            rectMode(CENTER);
            noFill();
            rect(avgX, avgY, 10, 10);
            stroke(255);
            line(avgX-5, avgY-5, avgX+5, avgY+5);
            line(avgX-5, avgY+5, avgX+5, avgY-5);
          } else {
            ellipseMode(CENTER);
            noFill();
            ellipse(avgX, avgY, 10, 10);
            //line_weight = (cp5.getController("lineWeight").getValue());
            //inkme=cp5.get(ColorWheel.class, "c").getRGB();
          }
          //noFill();
          fill(0, 0, 255, 150);
          stroke(255);
          // show area recognized

          fill(255, 0, 0);

          if (!erasing) {
            //strokeWeight(line_weight); // value comes from the JS side
            //stroke(inkme);
            //fill(inkme);
            //output.println("specs,"+line_weight+","+inkme);
            //output.println(prevX+","+prevY+","+avgX+","+avgY); // Write the coordinate to the file
            //ellipse(avgX,avgY,4,4);
            //pg.line(round(prevX), round(prevY), round(avgX), round(avgY));
            // pg.line((prevX)*(width/640), (prevY)*(width/640), (avgX)*(width/640), (avgY)*(width/640));
            //if (penDown) points.add(new PVector(avgX, avgY));
            //points.add(new PVector(avgX, avgY));
            //if (points.size() > 200) points.remove(0);
          } else {
          }

          prevX = avgX;
          prevY = avgY;
        }
      }
    } else { //println("not found !");
      first_pass = true;  
      //if (penDown) points.add(new PVector(-1, -1)); // so it gets added only once
      penDown = false; // pen lifted
    }
  } // cam available
  else
  {
    background(128);
    text("Camera is off.", width/2, height/2);
  }
  drawUI();
}
// end draw()

float level=0.5;
boolean showcam=true;
boolean filters=false;
//boolean show_script = false;
color inkme = color(255, 0, 0);
void keyPressed() { //mode_laser
  if (key == '0') clearScreen();
  if (key == 'c') showcam=!showcam;
  if (key == 'e') {
    erasing=!erasing; 
    println("erasing:"+erasing);
  }
  if (key == 'l') {
    mode_laser=!mode_laser; 
    println("laser:"+erasing);
  }
  if (key == 'o') {
    ease_drawing=!ease_drawing; 
    println("easing :"+ease_drawing);
  }
  // if (key == 'a') threshold--;
  if (key == 'r') inkme = color(255, 0, 0);
  if (key == 'g') inkme = color(0, 255, 0);
  if (key == 'b') inkme = color(0, 0, 255);
  //  if (key == 's') threshold++;
  if (key == 'z') level-=0.05;
  if (key == '+' || key =='=') {
    easing+=0.05; 
    println(easing);
  }
  if (key == '-' || key =='_') {
    easing-=0.05; 
    println(easing);
  }
}

void lines() {
  stroke(255); //255,255,255,40);
  strokeWeight(1);
  for (int i=0; i<=height; i+=50) {
    line(0, i, width, i);
  }
}

Any chance shaders could improve the speed? My Sobel edge detection is very fast on the phone.

The main issue is the frame rate: since the capture event triggers at ~7 fps, it is not usable.
Even when I do nothing regarding image processing, just show the frame, I unfortunately get that poor fps.
I also saw that posted as issues on Ketai's GitHub

and

and more

Here I try to understand how that guy got 20 fps.

If you take the cam-frame small enough, you get that.
I experimented a bit with the suggestion I made above, and the frameRate stays at the normal 30 f/s with a 640/480 resolution. I used a small reflective red object; an LED is too bright.
If you want the cam-image displayed as well, just draw the lines on a transparent PGraphics (see the sketch after the code below).

import ketai.camera.*; 

KetaiCamera cam; 
int dx, dy, pdx, pdy; 

void setup() { 
  size(640, 480); 
  background(255); 
  textSize(40); 
  strokeWeight(4); 
  stroke(0, 0, 80); 
  cam = new KetaiCamera(this, 640, 480, 30); 
  cam.setCameraID(0); 
  cam.start();
} 

void onCameraPreviewEvent() { 
  cam.read();
} 

void draw() { 
  cam.loadPixels(); 
  for (int x = 0; x < cam.width; x++ ) { 
    for (int y = 0; y < cam.height; y++ ) { 
      color current = cam.pixels[x+y*cam.width]; 
      float r = red(current); 
      float g = green(current); 
      float b = blue(current); 
      if (r > 220 && g < 40 && b < 40) { 
        dx = x; 
        dy = y;
      }
    }
  } 
  line(dx, dy, pdx, pdy); 
  pdx = dx; 
  pdy = dy; 
  fill(255); 
  rect(0, 0, 280, 60); 
  fill(0); 
  text(str(frameRate), 40, 40);
}
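
A minimal sketch of the transparent-PGraphics idea (the red-pixel search is elided; dx/dy are assumed to be updated as in the sketch above): the lines accumulate on an overlay that keeps its alpha channel, and both the live preview and the overlay are drawn each frame.

import ketai.camera.*;

KetaiCamera cam;
PGraphics overlay;                 // transparent layer that collects the drawn lines
int dx, dy, pdx, pdy;

void setup() {
  size(640, 480);
  overlay = createGraphics(width, height);   // starts fully transparent
  cam = new KetaiCamera(this, 640, 480, 30);
  cam.start();
}

void onCameraPreviewEvent() {
  cam.read();
}

void draw() {
  image(cam, 0, 0);                // live preview underneath
  // ...update dx/dy with the red-pixel search from the sketch above...
  overlay.beginDraw();
  overlay.stroke(255, 255, 0);
  overlay.strokeWeight(4);
  overlay.line(pdx, pdy, dx, dy);
  overlay.endDraw();
  image(overlay, 0, 0);            // lines composited on top, camera still visible
  pdx = dx;
  pdy = dy;
}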

I am using a plain sketch, not even showing the preview image.

import ketai.camera.*;

KetaiCamera cam;
int time;
void setup() {
  orientation(LANDSCAPE);
  imageMode(CENTER);
  cam = new KetaiCamera(this, 176, 144, 30);
  time = millis();
}

void draw() {
  //image(cam, width/2, height/2);
}

void onCameraPreviewEvent()
{
  println(millis()-time);
  time=millis();
  cam.read();
}

// start/stop camera preview by tapping the screen
void mousePressed()
{
  if (cam.isStarted())
  {
    cam.stop();
  }
  else
    cam.start();
}
void keyPressed() {
  if (key == CODED) {
    if (keyCode == MENU) {
      if (cam.isFlashEnabled())
        cam.disableFlash();
      else
        cam.enableFlash();
    }
  }
}

The time between frame captures (see the attached println screenshot) works out to approximately 7 fps (this is on another phone).

It doesn't matter how efficient and elegant the tracking / drawing code is. If the preview runs at 7 fps, you cannot work as you should; you would be processing the same frame many times.