🥽 core-xr


user-d407c1 08 January, 2024, 10:11:34

Hi @user-8bce45! Pupil Capture has no knowledge of the VR environment; even though you can cast the screen to it, it will not record the 3D environment.

So essentially, if you are doing ray-casting as you mentioned above, you can get the "hit point" coordinate in Unity.

Then you will need to store it from within Unity to a CSV file, together with the timestamp.

Since you mentioned that you want to know what has been gazed at, you can use raycasts and colliders on your objects of interest, so that you can detect whenever your gaze vector "hits" an object. https://docs.unity3d.com/2018.3/Documentation/ScriptReference/RaycastHit.html
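The steps above (cast the gaze ray, read the hit point, and log it to a CSV together with a timestamp) could be sketched roughly like this. This is an illustrative sketch, not hmd-eyes API: `GazeHitLogger` and `gazeDirectionLocal` are made-up names, and in a real setup the gaze direction would come from the hmd-eyes gaze data rather than an inspector field.

```csharp
using System.IO;
using UnityEngine;

// Illustrative sketch: raycast along the gaze direction each frame and
// append the world-space hit point (plus a timestamp) to a CSV file.
public class GazeHitLogger : MonoBehaviour
{
    public Transform gazeOrigin;        // e.g. the main/VR camera transform
    public Vector3 gazeDirectionLocal;  // gaze direction from your eye tracker (assumed input)

    private StreamWriter writer;

    void Start()
    {
        // CSV header: timestamp, hit coordinates, and the name of the gazed object.
        var path = Path.Combine(Application.persistentDataPath, "gaze_hits.csv");
        writer = new StreamWriter(path);
        writer.WriteLine("timestamp,x,y,z,object");
    }

    void Update()
    {
        // Transform the local gaze direction into world space and cast the ray.
        Vector3 dir = gazeOrigin.TransformDirection(gazeDirectionLocal);
        if (Physics.Raycast(gazeOrigin.position, dir, out RaycastHit hit))
        {
            Vector3 p = hit.point; // the "hit point" in world coordinates
            writer.WriteLine($"{Time.time},{p.x},{p.y},{p.z},{hit.collider.name}");
        }
    }

    void OnDestroy()
    {
        writer?.Close();
    }
}
```

Objects of interest need a Collider component for `Physics.Raycast` to register them; `hit.collider.name` then tells you which object was gazed at.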

user-c32d43 31 January, 2024, 13:48:32

Hi! I'm currently using the Screencast prefab attached to the Main Camera of my XR Origin to share the view into Pupil Capture. However, I found that the script causes double vision within the headset.

I am unsure how to solve this, as I've followed what was mentioned in the hmd-eyes documentation: "Make sure the camera uses the VR head transform as the origin - normally just by adding it as a child to the main/VR camera - and has position and rotation set to zero."
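For reference, "position and rotation set to zero" means the screencast camera's local transform relative to its parent (the main/VR camera) should be identity. A minimal sketch, assuming hypothetical field names (`vrCamera`, `screencastCamera` are not hmd-eyes identifiers):

```csharp
using UnityEngine;

// Illustrative setup helper: parent the screencast camera under the VR camera
// and zero its local pose so it exactly follows the head transform.
public class ScreencastCameraSetup : MonoBehaviour
{
    public Camera vrCamera;          // the main/VR head camera
    public Camera screencastCamera;  // the camera used by the Screencast prefab

    void Start()
    {
        // worldPositionStays: false keeps the local transform, which we then reset.
        screencastCamera.transform.SetParent(vrCamera.transform, false);
        screencastCamera.transform.localPosition = Vector3.zero;
        screencastCamera.transform.localRotation = Quaternion.identity;
    }
}
```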

edit: I've just read the following point about performance; this might be a related matter.

End of January archive