πŸ₯½ core-xr


user-7853dc 06 December, 2024, 02:40:33

Hi everyone! I’ve collected some data, but I noticed that the calibration doesn’t always yield good results for everyone. Even when the participant isn’t wearing makeup or glasses, it sometimes doesn’t work well. Is there a way to improve this? Could it be that the quality of the calibration depends on individual factors, such as the person’s head size or other characteristics? Any insights would be greatly appreciated!

(I'm using Pupil Labs inside a Vive Cosmos Elite)
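One way to narrow this down is to check per-eye pupil detection confidence while the participant is in the headset: low confidence (for example from eye camera positioning inside the HMD or headset slippage) is a common reason calibration quality varies between people. Below is a minimal sketch, not an official diagnostic tool, that subscribes to pupil data via Pupil Capture's network API; it assumes Pupil Remote is running on its default port 50020, and the 0.8 threshold is only an illustrative cutoff.

```python
# Minimal sketch: monitor per-eye pupil detection confidence from Pupil Capture.
# Assumes Pupil Capture is running with Pupil Remote on the default port 50020.
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote for the port of the data subscription socket
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to pupil datums from both eye cameras
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.subscribe("pupil.")

low_confidence = {0: 0, 1: 0}
total = {0: 0, 1: 0}

for _ in range(1000):  # sample a short stretch of pupil data
    topic, payload = subscriber.recv_multipart()
    datum = msgpack.loads(payload, raw=False)
    eye_id = datum["id"]
    total[eye_id] += 1
    if datum["confidence"] < 0.8:  # illustrative threshold, not a fixed rule
        low_confidence[eye_id] += 1

for eye_id in (0, 1):
    if total[eye_id]:
        ratio = low_confidence[eye_id] / total[eye_id]
        print(f"eye{eye_id}: {ratio:.0%} of samples below 0.8 confidence")
```

If one eye consistently reports mostly low-confidence samples, adjusting that eye camera's position or exposure before calibrating often helps more than repeating the calibration itself.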

nmt 06 December, 2024, 02:41:20

Hi @user-290cc0! I've moved your message to the appropriate channel. Could you share an example recording with [email removed] so that we can provide more concrete feedback?

user-d5c9ea 13 December, 2024, 18:13:52

Hi, I recently collected data using the Core XR add-on on my HTC Vive and was exploring Pupil Player until a colleague recommended Pupil Cloud for analysis. I am seeking clarification and support on two things: 1. Can I take the Screen Recording data (which includes a number of .npy files and files in other formats) from Pupil Capture and analyze it on Pupil Cloud? 2. How do I upload these files to Pupil Cloud? I am unable to see an option in the web UI.

nmt 15 December, 2024, 14:26:18

Hi @user-d5c9ea! Recordings made with the Core XR equipment can't be uploaded to Pupil Cloud. Only Neon and Pupil Invisible recordings are compatible with Pupil Cloud. What analysis are you trying to perform in particular?
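Since Core XR recordings stay out of Pupil Cloud, a common workflow is to open the recording folder in Pupil Player, run the Raw Data Exporter, and analyze the exported CSVs locally. Below is a minimal sketch assuming an export at exports/000/ containing gaze_positions.csv; the path and the 0.6 confidence cutoff are only examples.

```python
# Minimal sketch: analyze a Core XR recording locally instead of Pupil Cloud.
# Assumes the recording was opened in Pupil Player and exported with the
# Raw Data Exporter, producing exports/000/gaze_positions.csv.
from pathlib import Path
import pandas as pd

export_dir = Path("my_recording/exports/000")  # hypothetical example path
gaze = pd.read_csv(export_dir / "gaze_positions.csv")

# Keep only reasonably confident gaze samples (0.6 is an illustrative cutoff)
gaze = gaze[gaze["confidence"] >= 0.6]

duration = gaze["gaze_timestamp"].max() - gaze["gaze_timestamp"].min()
print(f"{len(gaze)} gaze samples over {duration:.1f} s")
print("mean normalized gaze position:",
      gaze["norm_pos_x"].mean(), gaze["norm_pos_y"].mean())
```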

End of December archive