Mapping mouse position on the user screen to objects in a video stream

I want to synchronize the stream projection of an environment video (a YouTube stream) with some geometrical objects drawn in it (e.g. small non-intersecting circles), in such a way that a user on a smartphone or PC monitor can select an object simply by placing the mouse or a finger on the screen position where that circle is. The system then generates an audiovisual event which is transmitted through the stream. All the geometrical objects are computed and drawn before streaming, so the client-side code initially has no position information about where to find them on the user's screen.

What I mean is: can I reliably arrange it, for all user devices, that the x/y position in the sketch corresponds to the x/y position on the user's screen, so that the system can identify the pixel position of any object, i.e. detect that mouseX/mouseY are inside one of the circles? I would appreciate some experience-based feedback.
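For the hit test itself, the usual approach is to agree on one reference coordinate system. Here is a minimal p5.js sketch of the idea; the names `videoW`, `videoH`, and the `circles` array are my assumptions (in your setup they would come from the Java app or the server). It maps mouseX/mouseY from the displayed canvas size back to the video's native resolution, so the test is independent of the user's screen size.

```javascript
// Assumptions: the video is rendered at a known native resolution
// (videoW x videoH), and the circle positions in that coordinate
// space are known on the client, e.g. sent once by the server.
const videoW = 1280;  // hypothetical native width of the streamed video
const videoH = 720;   // hypothetical native height of the streamed video

// Hypothetical circle data, in video-pixel coordinates.
const circles = [
  { x: 300, y: 200, r: 40 },
  { x: 900, y: 500, r: 40 },
];

function setup() {
  // The canvas can be any size; clicks are mapped back to video pixels.
  createCanvas(windowWidth, windowHeight);
}

function mousePressed() {
  // Convert the mouse position from canvas pixels to video pixels.
  const vx = mouseX * (videoW / width);
  const vy = mouseY * (videoH / height);

  for (const c of circles) {
    // Point-in-circle test via squared distance (avoids a sqrt).
    if ((vx - c.x) ** 2 + (vy - c.y) ** 2 <= c.r * c.r) {
      console.log('hit circle at', c.x, c.y);
      // Here you would send the hit (or the raw vx/vy) to the server.
    }
  }
}
```

The point is that the interaction becomes resolution-independent as soon as both sides agree on the video's native resolution as the shared coordinate system; whether the hit test runs on the client or in the Java app only changes what gets sent over the wire.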

Hi! Sorry, but I don't understand exactly what you are trying to do. You seem to have multiple components, like streaming, a front end (p5.js), and a back end. Can you elaborate a bit more on which tools you are planning to use? Also, can you make some sketches to describe what you want to achieve?

I don't have code examples yet, so let me try to explain the problem in more detail. I have a big Java app which extends a real-time video of a physical object with an interactive AR layer. Until now the interaction has been based on a local Kinect sensor: when participants move their arms and hands, they control target frames like in a game, and if they match the position of some geometrical objects (like circles), events are triggered showing different audiovisual behaviours.

Now I want to make a remote version, replacing the Kinect sensor with simple mouse/touch input. But so far I have no experience with streams, so I would like to check whether my idea for a solution is feasible or whether I should take another approach. Because the computation is heavy, the idea is to render the complete video and graphics layer in the Java app and send the video as a stream to YouTube. I then embed the stream link in the frontend, written with p5.js, with a backend built on Node and Express, e.g. as described in the great tutorials from Dan Shiffman. The control data (mouse/finger positions) are forwarded via the server to the Java app. But detecting that an event has been triggered requires knowing the position of the target frames on the interactive layer, and I want the position of the target frames to be identical to the position of the mouse/finger on the screen. In short, I want users to be able to touch and trigger directly what they are seeing.

I hope this question makes sense. Otherwise I can try to give some example code, but that could take a little while, because I am not yet at the implementation stage but still researching.
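To make the control-data flow concrete, here is a rough sketch of the client side under my own assumptions: a socket.io connection to the Node/Express server, and an event name `pointer` that I made up. It sends normalized (0..1) positions so the Java app can map them to its own resolution.

```javascript
// Client side (p5.js + socket.io client, both assumed to be loaded).
const socket = io();  // connect to the Node/Express server

function mouseMoved() {
  // Normalize to 0..1 so the receiver is independent of canvas size.
  socket.emit('pointer', { x: mouseX / width, y: mouseY / height });
}
```

And a matching relay on the server, again just a sketch; how the data then reaches the Java app (TCP socket, OSC, …) is left open:

```javascript
// Server side (Node + Express + socket.io).
const express = require('express');
const app = express();
const http = require('http').createServer(app);
const io = require('socket.io')(http);

io.on('connection', (socket) => {
  socket.on('pointer', (pos) => {
    // Forward pos.x / pos.y (0..1) to the Java app here.
    console.log('pointer', pos.x, pos.y);
  });
});

http.listen(3000);
```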

Thank you for the explanation, but I still can't see the big picture. If I understand correctly, the server side is responsible for the streaming to YouTube based on data input from clients. Because you want the users to interact with the stream, you somehow need to get the timestamp of the received stream… is that correct?

If so, I feel this is very challenging, because basically what you can embed on the client side is a YouTube iframe, which you don't have access to from your scripts. You may be able to get some information from the YouTube API, for example with getDuration. Nevertheless, when you have frame drops due to the internet connection, it seems the duration is affected too (dropped frames are skipped), so I don't know how reliable it is.
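For completeness, the supported way to query the embedded player from your own scripts is the YouTube IFrame Player API. A minimal sketch, assuming the page loads https://www.youtube.com/iframe_api and contains a `<div id="player">`; `VIDEO_ID` is a placeholder:

```javascript
// Polls the current playback time via the YouTube IFrame Player API.
let player;

// The API script calls this global function once it has loaded.
function onYouTubeIframeAPIReady() {
  player = new YT.Player('player', {
    videoId: 'VIDEO_ID',  // placeholder
    events: { onReady: onPlayerReady },
  });
}

function onPlayerReady() {
  setInterval(() => {
    // For live streams this is only approximate: buffering and
    // frame drops shift the reported time, as noted above.
    console.log('currentTime:', player.getCurrentTime(),
                'duration:', player.getDuration());
  }, 1000);
}
```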