Overlay horizon line in camera and save picture

Here’s a rough sketch of what I want to do, but I don’t really know how. The image is taken with a small device called a Bris Sextant. It makes multiple images of the sun; as the sun rises or sets, different sun images will touch the horizon. Using it requires calibration, and calibrating requires a visible horizon such as a large body of water. I don’t have that, so what I want to do is use a tablet camera and a sensor to draw a horizon line, measure the angle of the sun from the horizon, and then save the picture with the overlay included.
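
For reference, one back-of-the-envelope way to relate a vertical pixel offset on the picture to degrees, assuming the camera’s vertical field of view is known (the FOV value below is a placeholder, not a measured number):

// Rough pixel-to-degrees mapping for an overlay scale: the vertical offset
// from the horizon line, as a fraction of the frame height, times the
// camera's vertical field of view. VERTICAL_FOV_DEG would have to be
// looked up or calibrated for the actual tablet camera.
final float VERTICAL_FOV_DEG = 48.0;  // assumed value

float pixelsToDegrees(float dyPixels, float frameHeightPixels) {
  return dyPixels / frameHeightPixels * VERTICAL_FOV_DEG;
}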

Here’s the code that shows how I want it to look in Android. I have code that will read the orientation from another device over USB. If you try it, save the posted image into the sketch’s data directory.



PImage img;  // The photo that the gauge is drawn over
PFont font8, font9, font12, font15;
int i, j; // loop counters
float horizonInstrSize=68;
int xLevelObj,yLevelObj;
float angy;
float  angyLevelControl;
void setup() {
  size(440, 460);
    font8 = createFont("Arial bold",8,false);
    font9 = createFont("Arial bold",9,false);
    font12 = createFont("Arial bold",12,false);
    font15 = createFont("Arial bold",15,false);
  // The image file must be in the data folder of the current sketch 
  // to load successfully
   // img = loadImage("suns.JPG");  // Load the image into the program
     img = loadImage("suns2.jpg");  // Load the image into the program
}

void draw() {
  background(0);

   
  image(img, width/2 - (img.width/5)/2, height/2 - (img.height/5)/2, img.width/5, img.height/5);
  stroke(0,255,0);
  line(0, height/2, width, height/2);  // reference horizon line across the window
//  scale(-2);
  
    // ---------------------------------------------------------------------------------------------
  // Magnetron Combi Fly Level Control  --- borrowed from Multiwii
  // ---------------------------------------------------------------------------------------------
  angyLevelControl = constrain(angy, -horizonInstrSize, horizonInstrSize);  // clamp pitch to the gauge range
  pushMatrix();

  translate(width/2,height/2);
  scale(2.5);
  noStroke();

  if (angy > 0) {
    fill(124, 73, 31);
  }
  noStroke();
  // triangle(0,0,x,-angyLevelControl,-x,-angyLevelControl);
  // inner lines
  strokeWeight(1);
  for (i = 0; i < 8; i++) {
    j = i * 15;  // 15 screen units per 10 degrees of pitch
    if (angy<=(35-j) && angy>=(-65-j)) {
      stroke(255,255,255); line(-30,-15-j-angy,30,-15-j-angy); // up line
      fill(255,255,255);
      textFont(font9);
      text("+" + (i+1) + "0", 34, -12-j-angy); //  up value
      text("+" + (i+1) + "0", -48, -12-j-angy); //  up value
    }
    if (angy<=(42-j) && angy>=(-58-j)) {
      stroke(167,167,167); line(-20,-7-j-angy,20,-7-j-angy); // up semi-line
    }
    if (angy<=(65+j) && angy>=(-35+j)) {
      stroke(255,255,255); line(-30,15+j-angy,30,15+j-angy); // down line
      fill(255,255,255);
      textFont(font9);
      text("-" + (i+1) + "0", 34, 17+j-angy); //  down value
      text("-" + (i+1) + "0", -48, 17+j-angy); //  down value
    }
    if (angy<=(58+j) && angy>=(-42+j)) {
      stroke(127,127,127); line(-20,7+j-angy,20,7+j-angy); // down semi-line
    }
  }
  strokeWeight(2);
  stroke(255,255,255);
  if (angy<=50 && angy>=-50) {
  //  line(-40,-angy,40,-angy); //center line
    fill(255,255,255);
    textFont(font9);
    text("0", 34, 4-angy); // center
    text("0", -39, 4-angy); // center
    
  }

  // lateral arrows
  strokeWeight(1);
  // down fixed triangle



  fill(0);
  noStroke();
  translate(-width/2,-154);
  rect(0,0,width,100);
  translate(0,105);
  rect(0,100,width,40);
  popMatrix();
}

Here’s a screenshot. It doesn’t do anything yet. I don’t know how to get and save images from the camera, and I want to overlay the scale to somehow show the angle at the time of the shot.

Take a look at PGraphics.save().

You could maybe save the current display into a copy PGraphics, draw what you need on the copy, and then save the drawn-over copy.
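
One way to read that suggestion, as a minimal sketch: drawOverlay() is a hypothetical helper that would hold the gauge-drawing code from the first post, and the file name is just an example.

void saveAnnotatedCopy() {
  // Render the photo and the overlay into an offscreen copy, then save
  // the copy rather than the live display.
  PGraphics pg = createGraphics(width, height);
  pg.beginDraw();
  pg.background(0);
  pg.image(img, 0, 0, width, height);  // current photo / camera frame
  drawOverlay(pg);                     // hypothetical helper: gauge drawn onto pg
  pg.endDraw();
  pg.save(sketchPath("annotated-" + millis() + ".png"));
}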


Oh boy, I bit off a chunk this time. I am able to bring up the native camera app from within the fragment/sketch. What if I turned public void draw() into this?

public void draw(Canvas canvas) {

}

Would I be able to permanently draw my angle gauge on top? Do I need to use OpenCV?

I found this code but it just crashes and says:

Caused by: java.lang.NullPointerException: Attempt to invoke virtual method ‘android.content.res.Resources android.content.Context.getResources()’ on a null object reference

  public void onResume() {
    super.onResume();
    println("onResume()!");
    // Set the orientation here, before Processing really starts, or it can get angry:
    orientation(LANDSCAPE);

    // Create our 'CameraSurfaceView' object, which works the magic:
    gCamSurfView = new CameraSurfaceView();
  }

  public Context getApplicationContext() {
    return this.context;
  }


  class CameraSurfaceView extends SurfaceView implements SurfaceHolder.Callback, Camera.PreviewCallback {
    // Object that accesses the camera, and updates our image data
    // Using ideas pulled from 'Android Wireless Application Development', page 340

    SurfaceHolder mHolder;
    Camera cam = null;
    Camera.Size prevSize;
  //  private Context context;

    // SurfaceView constructor: ---------------------------------------------------
    CameraSurfaceView() {
      super(context);

      // Processing PApplets come with their own SurfaceView object which can be accessed
      // directly via its object name, 'surfaceView', or via the below function:
      // mHolder = surfaceView.getHolder();
      mHolder = getSurfaceHolder();
      // Add this object as a callback:
      mHolder.addCallback(this);
    }



    // SurfaceHolder.Callback stuff: ------------------------------------------------------
    public void surfaceCreated(SurfaceHolder holder) {
      // When the SurfaceHolder is created, create our camera, and register our
      // camera's preview callback, which will fire on each frame of preview:
      cam = Camera.open();
      cam.setPreviewCallback(this);

      Camera.Parameters parameters = cam.getParameters();
      // Find our preview size, and init our global PImage:
      prevSize = parameters.getPreviewSize();
      gBuffer = createImage(prevSize.width, prevSize.height, RGB);
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
      // Start our camera previewing:
      cam.startPreview();
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
      // Give the cam back to the phone:
      cam.stopPreview();
      cam.release();
      cam = null;
    }

    //  Camera.PreviewCallback stuff: ------------------------------------------------------
    public void onPreviewFrame(byte[] data, Camera cam) {
      // This is called every frame of the preview.  Update our global PImage.
      gBuffer.loadPixels();
      // Decode our camera byte data into RGB data:
      decodeYUV420SP(gBuffer.pixels, data, prevSize.width, prevSize.height);
      gBuffer.updatePixels();

      image(gBuffer, 0, 0);
    }

    //  Byte decoder : ---------------------------------------------------------------------
    void decodeYUV420SP(int[] rgb, byte[] yuv420sp, int width, int height) {
      // Pulled directly from:
      // http://ketai.googlecode.com/svn/trunk/ketai/src/edu/uic/ketai/inputService/KetaiCamera.java
      final int frameSize = width * height;

      for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
          int y = (0xff & ((int) yuv420sp[yp])) - 16;
          if (y < 0)
            y = 0;
          if ((i & 1) == 0) {
            v = (0xff & yuv420sp[uvp++]) - 128;
            u = (0xff & yuv420sp[uvp++]) - 128;
          }

          int y1192 = 1192 * y;
          int r = (y1192 + 1634 * v);
          int g = (y1192 - 833 * v - 400 * u);
          int b = (y1192 + 2066 * u);

          if (r < 0)
            r = 0;
          else if (r > 262143)
            r = 262143;
          if (g < 0)
            g = 0;
          else if (g > 262143)
            g = 262143;
          if (b < 0)
            b = 0;
          else if (b > 262143)
            b = 262143;

          rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
        }
      }
    }
  }

@HackinHarry ===

Not sure I understand exactly what you want, but:

  • If you want your cam image full screen, you must use a SurfaceView (or TextureView) as in the code in your last post. The error is easy to solve: you are in a fragment and this code is for an activity, so, e.g., you must write this.getActivity().getApplicationContext() (see the sketch after this list). I have already posted code for that on the old forum.
  • When the image is displayed, it's easy to save it to external storage using save() with a path to that directory; don't forget to add the required permission.
  • As for the overlay, you create a TextView, set its background to transparent, remove it from its layout, get the layout from your cam preview, and add the TextView with addView().

I can be more precise, but first I have to be sure that I understand you well!
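
For the first point, a minimal sketch of the context fix, assuming Processing's Android mode where the sketch's PApplet exposes getActivity() (the save path in the comment is only an example):

import android.content.Context;

// Replace the getApplicationContext() override from the crashing code:
// ask the enclosing Activity for the Context instead of returning a
// context field that is never assigned (which is what triggers the NPE).
public Context getApplicationContext() {
  return this.getActivity().getApplicationContext();
}

// For the second point, once the frame is drawn, something like the line
// below; the path is only an example and needs the WRITE_EXTERNAL_STORAGE
// permission in the manifest:
// save("/sdcard/horizon-shot.png");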

@akenaton Yes, sir, thank you. Any links to your old code?

Basically, I want to record the vertical orientation angle at the time of the picture. I have an angle sensor on USB, and I am able to receive it into the fragment/sketch without the camera. As the pitch angle changes, the scale moves up or down and the angle is measured on the horizon line.

If possible, I want to apply the overlay at the moment the shutter button is pressed.
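
For what it's worth, here is a rough sketch of that capture step in Processing, assuming angy already holds the sensor's pitch when the shutter is pressed (the text placement and file name are only illustrative):

void captureWithOverlay() {
  // Stamp the current angle onto the frame that is already on screen
  // (camera image plus the gauge drawn in draw()).
  textFont(font15);
  fill(255);
  text(nf(angy, 1, 1) + " deg above horizon", 10, height - 10);
  // Grab a copy of the display and save it, overlay included.
  PImage shot = get();
  shot.save(sketchPath("sun-" + millis() + ".png"));
}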

I’ve got an OpenCV Android example that I’m going to tweak to use a Region of Interest (ROI), which is apparently easy to overlay. I’m still reading, but I think this is the way to go: use OpenCV.

This was a worthwhile project that I should have pursued further. It will take some focus to resurrect it, but the result should be a quick way to measure the Sun’s elevation above the horizon and the time of observation, without the water horizon that an expensive marine sextant needs…

@HackinHarry ===

Beautiful project! If I get some free time, I will try to make a test.