🥽 core-xr


user-4c21e5 14 January, 2025, 14:58:23

Hi @user-d5c9ea! Yes, you can consume those binary files programmatically if you wish. And others have indeed done so. Check out this message for reference: https://discord.com/channels/285728493612957698/285728493612957698/757530345842278421
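For reference, a minimal sketch of what consuming those files programmatically can look like, assuming the documented .pldata intermediate format (a msgpack stream of (topic, payload) pairs, with the matching timestamps in a sidecar _timestamps.npy file); the file names here are illustrative:

```python
import msgpack
import numpy as np

# Each .pldata entry is a (topic, payload) pair; the payload is itself
# msgpack-encoded and holds one datum (a dict with e.g. a "confidence" key).
with open("pupil.pldata", "rb") as f:
    unpacker = msgpack.Unpacker(f, raw=False, use_list=False)
    data = [(topic, msgpack.unpackb(payload, raw=False))
            for topic, payload in unpacker]

# The matching timestamps live in a sidecar numpy file.
timestamps = np.load("pupil_timestamps.npy")

for ts, (topic, datum) in zip(timestamps, data[:5]):
    print(ts, topic, datum.get("confidence"))
```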

Are you sure those timestamps were taken from the same clock? It's just that Pupil Capture runs its own clock, which is different from system/world time. Read more about that here: https://docs.pupil-labs.com/core/terminology/#timing

user-d5c9ea 14 January, 2025, 15:23:56

Thanks. I am converting pupil time to system time. I am curious whether that conversion is in UTC, rather than local to the system where the data was recorded?

user-4c21e5 15 January, 2025, 04:14:12

🤔 I'm not quite sure I understand the question. Could you elaborate on what you mean by 'conversion is in UTC and not local to the system'? For context, pupil time is essentially arbitrary, as it has a random start point. System time is just the current datetime of the device that is running the Pupil Core software.
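For reference, one common way to relate the two clocks is to measure their offset once via Pupil Remote and add it to every recorded timestamp. A minimal sketch, assuming Pupil Remote on its default port 50020; note that time.time() returns seconds since the Unix epoch, so the converted timestamps are timezone-independent rather than local time:

```python
import time
import zmq

# Connect to Pupil Remote (default port 50020) and request the current
# "pupil time" with the 't' command.
ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("t")
pupil_time = float(pupil_remote.recv_string())

# Offset between the system clock (seconds since the Unix epoch) and
# pupil time, measured at (approximately) the same instant.
offset = time.time() - pupil_time

def pupil_to_system(pupil_ts: float) -> float:
    """Map a recorded pupil timestamp to Unix epoch time."""
    return pupil_ts + offset
```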

user-a79410 17 January, 2025, 05:23:53

Hi pupil-labs, I am trying to set up your core-xr and have gotten a few things working. That is on Windows 11 Pro, Unity 6.0 xxx, VS 2022 Community, and Pupil Capture 3.5.1. I can see the eye videos in some of the Unity demos and see the calibration method; however, the visuals are all pink. Though I like the colour... there might be just a little thing wrong in the settings.

user-f43a29 17 January, 2025, 09:06:27

Hi @user-a79410, my first guess is that this could be due to the Unity version. Unity 6 is quite new and a number of things changed in that release. Have you tried using a version closer to Unity 2018, such as the Unity 2021 LTS that is available in Unity Hub?

user-a79410 17 January, 2025, 05:25:04

Chat image

user-a79410 17 January, 2025, 05:32:05

And I can see the VR in the HMD Vive Pro well. The second thing: I would need to record the gaze data for post-hoc analyses. We are after the estimated eye rotation centre of the 3D model; then we might be able to map the presented stimuli to retinal locations. Would you know if this is already available to some degree somewhere? Many thanks in advance.

user-f43a29 17 January, 2025, 09:08:20

Yes, this data is provided in the pupil_positions.csv file. You might be interested in the circle_3d_normal_x/y/z and theta/phi variables.
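As an illustration, loading those columns from a Pupil Player export could look like the sketch below (the export path is hypothetical, and pandas is assumed):

```python
import pandas as pd

# Hypothetical export path; adjust to your recording's export folder.
df = pd.read_csv("recording/exports/000/pupil_positions.csv")

# 3D eye-model outputs: the pupil-disc normal and its spherical angles.
cols = [
    "pupil_timestamp",
    "circle_3d_normal_x", "circle_3d_normal_y", "circle_3d_normal_z",
    "theta", "phi",
]
print(df[cols].head())
```

The sphere_center_x/y/z columns in the same file hold the 3D eye model's estimated eyeball center, which may correspond to the rotation centre asked about above.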

user-d407c1 20 January, 2025, 07:59:57

Hi @user-a79410 !

Quickly stepping in for @user-f43a29 here: could you let us know which rendering pipeline you are using?

It looks like the materials might be missing on the assets, which can happen if you're using URP or HDRP in Unity. You can resolve this by either:
- Switching to the built-in rendering pipeline, or
- Converting the assets to [email removed]

For further details, you may want to consult Unity's forums or documentation.

user-a79410 24 January, 2025, 04:30:36

Many thanks @user-f43a29 & @user-d407c1! Switching to the built-in rendering pipeline did the job for me.

user-497e26 23 January, 2025, 02:21:13

Hello, we’re using “Core” in connection with a VR project. Our target distance is up to 2000 mm, and we are checking the gaze distance through the Core open-source code (using self.last_gaze_distance in gazer_headset.py).

Previously, the hard-coded depth was 500 mm, which meant we couldn’t confirm distances up to 2000 mm. After modifying the hard-coded depth to 2000 mm and performing calibration at around 2000 mm, we were able to see more meaningful results than before.

Is there a method or approach we could try that would enable us to measure longer distances (around 2000 mm) more effectively, as we did in this case?
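For context, the modification described above amounts to changing one default value, along the lines of this sketch (the class is illustrative, not the verbatim Pupil source; only the attribute name self.last_gaze_distance is taken from the message, and its exact location varies between Pupil versions):

```python
# Illustrative only, not the verbatim Pupil source. The attribute acts
# as the assumed/most-recent gaze depth (in mm) used by the gaze mapper.
class BinocularGazeMapperSketch:
    def __init__(self):
        # Raised from the original hard-coded 500 mm so that targets
        # around 2000 mm fall within the assumed viewing depth.
        self.last_gaze_distance = 2000.0  # was: 500.0
```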

user-4c21e5 23 January, 2025, 02:26:17

Hi @user-fce73e! I've moved your message to this channel! I think it would be helpful if you could elaborate on a few things so that we can offer constructive feedback. Firstly, can you expand on 'using Core in connection with a VR project' - do you have the standalone Pupil Core headset, or are you using Core cameras mounted in an HMD? Secondly, what is your goal in using gaze distance - are you trying to measure viewing depth based solely on gaze data?

user-a5dd96 23 January, 2025, 04:13:55

Hi, I'm Kentaro, and I'm new here. Our lab is working on data visualization in AR. I'm looking for advice on which Pupil Labs product would be best for our setup.

Our goal is to use a Pupil Series device (Neon or Core) to capture world camera and eye gaze data. We then want to process both data streams (world camera + gaze) and overlay visuals in the user's field of view in real-time.

Our lab currently has XReal Air, Quest 3, and Vision Pro. Given this setup, which Pupil Labs device would be the best choice?

Right now, we’re considering purchasing Pupil Core and using it with XReal Air, but we’re unsure if this combination will work effectively or if there’s a better approach. Any recommendations would be greatly appreciated!

user-f43a29 24 January, 2025, 15:11:56

Hi @user-a5dd96, you'll have an overall smoother experience with our latest eye tracker, Neon. It is calibration-free & modular, making it a great fit for AR and real-world research. It also has an integrated 9-DoF IMU, which can be helpful in such contexts.

With Pupil Core, you have to re-calibrate it whenever it slips, and using it with third-party glasses, while possible, can make the whole process trickier. In comparison, Neon is designed to be mobile & mountable, which leads to a more comfortable experience when wearing it in an AR setup and walking around.

When it comes to using Neon in an XR context, we already offer a mount for the Quest 3 headset (scroll down to the options), as well as a Unity integration. We have also open-sourced the geometry of the module, making it possible to build custom mounts for other devices.

If you'd like, we can demonstrate Neon + Pupil Core in a Demo call, where we can also answer any other questions in more detail. Feel free to book an appointment at your convenience here.

user-fce73e 23 January, 2025, 04:48:17

We are using the standalone Pupil Core headset (the glasses-type model). We are researching VR-optimized displays and lenses, and we’d like to use the human gaze distance (depth) purely as a reference.

For that reason, we want to confirm whether we can measure a person’s gaze distance up to about 2000 mm. Also, we are using only gaze data from the Pupil Labs Core algorithm to measure depth.

user-4c21e5 24 January, 2025, 02:33:54

So if I understand correctly, you wear the Core headset whilst gazing at a physical screen/display mounted at some distance from the subject. And you're using the hmd-eyes repo to calibrate in the context of a VR scene?

user-5bd924 23 January, 2025, 19:56:35

Hi, I'm Michael. Our lab is using the "HTC Vive Binocular Add-on" eye tracker to collect pupillometry data, and we're having a suspected hardware issue that may require re-wiring or similar service. Should I share more details here, or in a direct conversation with someone? Many thanks in advance for your help!

user-4c21e5 24 January, 2025, 03:41:17

Hi @user-5bd924 - I see you've opened a support ticket. We will continue there!

End of January archive