Hi pupil-labs, just a quick question: do your Core VR inserts for the Vive Pro also fit the Vive Pro 2?
Hi @user-a79410 , ah, no, they do not. What are you looking to do with eye tracking in VR?
Hi Rob, thanks for the answer. We have a possibly damaged Vive Pro that we wanted to use with your Core VR add-on. We need to develop a virtual environment for pupillary reflex stimulation, so we need real-time access to gaze data and the eye rotation centre.
@user-a79410 I see. If you are ever interested in an upgraded way to do eye tracking in VR, I recommend checking out Neon XR. With that, it should be possible to 3D print a mount for your Vive Pro.
If we ever need a replacement, we would upgrade. However, if that does not work out, we would stick with the Vive Pro.
Hi folks, I have a few tracks with a strong center bias: when gaze is towards the periphery, the estimate is not nearly eccentric enough. I am well versed in your old source code and have been digging around for ways to move the estimated eyeball center forwards, towards the screen, but no luck yet. Any advice?
Hi @user-8779ef. What sort of accuracy are you achieving with your post-hoc calibration routine? Also, does the calibration routine include targets presented at the eccentricities you're expecting to measure?
I am processing this data using the HMDMD post-hoc pipeline, which we developed with Pablo back in the day.
I am integrating Pupil Core with Unity, but after restarting my PC, Pupil Capture gives me an error: `[ERROR] network_api.controller.pupil_remote_controller: Could not bind to Socket: tcp://*:50020. Reason: Address in use`
Hi @user-386d26 , are you still encountering this? Can you try restarting Pupil Capture with default settings? Go to the "General settings" tab and click "Restart with default settings".
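That error usually means another process (for example, a stale Pupil Capture instance that did not shut down cleanly) is still holding port 50020. As a minimal sketch, you could check whether the port is free before launching, with a hypothetical helper like this:

```python
import socket

def port_in_use(port: int) -> bool:
    """Return True if something on this machine is already bound to `port`."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # SO_REUSEADDR avoids false positives from sockets lingering in TIME_WAIT.
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind(("", port))
        except OSError:
            return True
    return False

if port_in_use(50020):
    print("Port 50020 is held by another process - close any stale Pupil Capture instances.")
else:
    print("Port 50020 is free; Pupil Remote should be able to bind.")
```

If the port is in use, closing the leftover process (or rebooting) should let Pupil Capture bind again.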
Thank you very much, that is good to know. Maybe you can help me out once more: is there a way to transform gaze data (or even better, the eye center / pupil center) from the head coordinate system to the world coordinate system? Currently the vestibulo-ocular reflex movement (672710-672714) is too fast for the saccade detection (light blue bars) to work, resulting in long windows without any fixation data. A velocity-based saccade detection algorithm (red/green coloration) works better, but it seems to me the most elegant way to solve this would be to add the head movement to the eye movement, so that any eye movement caused by the vestibulo-ocular reflex gets cancelled out. I think I could solve this on my own, but was hoping there might be established code for this :)
Hi @user-d7c9b4 , did you use AprilTags during your experiment? If so, you could give the Head Pose Tracker a try.
Otherwise, since Pupil Core does not have an IMU, did you have another way to measure head pose/rotation during the experiment?
We are using the HTC Vive VR Plugin, so we have IMU data.
Hi @user-d7c9b4 , thanks.
Do I understand correctly that you used hmd-eyes with the Core VR Add-on in Unity?
Hi @user-d7c9b4 , I see. Do you mind if we move this discussion to the 🥽 core-xr channel?