Hi all, here is my problem.
I'm using Processing in Android mode for the first time.
I set up an environment with spheres floating in space.
What I want to do is trigger an audio file when I look at a specific sphere.
The problem is that even though the reference lists functions that should help me calculate the intersection between the objects and my line of sight ( https://android.processing.org/reference/vr/object.html ), I'm not good enough at this to figure out how to do it! x) That function should give me a matrix, but I don't know what to do with it at the moment.

One way to do it would be to get the projection of the 3D object's position onto the screen, so I have its x,y position and can check whether or not that object is at the centre of the screen. I can do that in Java mode with PeasyCam, but I cannot figure out how to do the same in this kind of environment.
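For reference, an equivalent way to state the "is it at the centre of the screen" check, without projecting anything, is to compare the view direction with the direction from the eye to the sphere: if the angle between them is small enough, you are looking at it. A minimal sketch of that test in plain Java (all names and values here are illustrative; in a VR sketch the eye position and forward vector would come from the head-tracking matrix):

```java
// Gaze test: is the target within a small cone around the view direction?
public class GazeTest {
    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    // True if the direction from eye to target is within maxAngleRad
    // of the forward (look) direction.
    static boolean isLookingAt(double[] eye, double[] forward,
                               double[] target, double maxAngleRad) {
        double[] toTarget = {target[0] - eye[0],
                             target[1] - eye[1],
                             target[2] - eye[2]};
        double lenF = Math.sqrt(dot(forward, forward));
        double lenT = Math.sqrt(dot(toTarget, toTarget));
        double cos = dot(forward, toTarget) / (lenF * lenT);
        // Clamp to avoid NaN from rounding before taking acos.
        return Math.acos(Math.max(-1.0, Math.min(1.0, cos))) < maxAngleRad;
    }

    public static void main(String[] args) {
        double[] eye = {0, 0, 0};
        double[] forward = {0, 0, -1};  // looking down -Z
        // Sphere straight ahead: inside the gaze cone.
        System.out.println(isLookingAt(eye, forward, new double[]{0, 0, -10}, 0.1));
        // Sphere 45 degrees off to the side: outside the cone.
        System.out.println(isLookingAt(eye, forward, new double[]{10, 0, -10}, 0.1));
    }
}
```

The threshold angle plays the same role as "how close to the centre of the screen" in the projection version.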

Any suggestions or resources I could look at?
Thank you very much for your attention!

You have to use the determinant. I'm not at my computer, though, so I can't pull up a resource. It really comes down to either using a matrix to obtain the determinant, or using the general line-intersection formula, which you can find on Wikipedia; it's the formula specifically for calculating that type of line intersection. If you want to get a feel for this type of program, check out Dan Shiffman's recent Coding Challenge where he made a ray caster. It's the same sort of problem with one dimension less: he casts rays from a point and checks whether each ray intersects another line. Maybe an alternative would be one function to check the x component and another to check the y component, extended to 3D space.
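To make that concrete in 3D: for a sphere the intersection test reduces to the discriminant of a quadratic. A gaze ray o + t*d hits a sphere of centre c and radius r exactly when |o + t*d - c|^2 = r^2 has a real root with t >= 0. A minimal sketch in plain Java (names are illustrative, not from any Processing API):

```java
// Ray-sphere intersection via the quadratic discriminant.
public class RaySphere {
    static double dot(double[] a, double[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

    // o = ray origin, d = ray direction, c = sphere centre, r = radius.
    static boolean hitsSphere(double[] o, double[] d, double[] c, double r) {
        double[] oc = {o[0] - c[0], o[1] - c[1], o[2] - c[2]};
        double a = dot(d, d);
        double b = 2.0 * dot(oc, d);
        double k = dot(oc, oc) - r * r;
        double disc = b * b - 4.0 * a * k;  // negative: the line misses the sphere
        if (disc < 0) return false;
        // The line intersects; require a hit in front of the origin (t >= 0).
        double sq = Math.sqrt(disc);
        return (-b - sq) / (2.0 * a) >= 0 || (-b + sq) / (2.0 * a) >= 0;
    }

    public static void main(String[] args) {
        double[] eye = {0, 0, 0};
        // Sphere straight ahead along -Z: hit.
        System.out.println(hitsSphere(eye, new double[]{0, 0, -1}, new double[]{0, 0, -5}, 1)); // true
        // Same sphere, but looking the other way: miss (roots exist but are negative).
        System.out.println(hitsSphere(eye, new double[]{0, 0, 1}, new double[]{0, 0, -5}, 1));  // false
    }
}
```

In a VR sketch the ray origin and direction would come from the headset's eye position and forward vector, and you would run this test against each sphere's centre and radius to decide which audio file to trigger.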