🥽 core-xr


user-f3bc0e 01 December, 2023, 00:54:06

Hi Neil, thanks for the quick response. Correct me if I am wrong, but I assume the reply I should be looking at is to use the screencast feature? Since I used only the frontal camera and HMD of the VIVE headset, I did not use Unity at all. I coded my experiment in python-opencv. To use the screencast feature, I assume I need to set up everything in Unity so I could stream the desired frame to Pupil Capture as the world view? My goal, eventually, is not to track gaze in a VR environment, but rather on a 2D frame captured by the front camera. Thanks again for the help.

nmt 01 December, 2023, 02:02:10

Indeed, Pupil Capture will not pick up the front camera. I provided the link to hmd-eyes because you inquired whether there was an alternate method of calibration, and yes, that would necessitate the use of Unity. There are some conceptual difficulties with what you're proposing about calibrating using a front-mounted scene camera on a VR headset. However, I'm not entirely sure I fully understand your objective. Could you elaborate more on your ultimate goal or research question?

user-aafe26 01 December, 2023, 12:17:56

@nmt Hello, may I know if I can use Pupil Labs hardware with the Meta Quest 3? I'd love to use the pupil diameter from Neon while enjoying the rest of the functions from Meta. I am not an HTC fan. Thank you!

nmt 05 December, 2023, 03:03:56

Hey @user-aafe26 👋. This is definitely a valid use case. We're currently exploring the possibility of a Quest 3 mount, but we do not have a concrete roadmap yet. If you had the inclination and necessary skills to prototype your own mounts, you could use Neon's 'Bare Metal' kit: https://pupil-labs.com/products/neon/shop#bare-metal (designed for prototyping). We have decided to open-source the nest and Neon module .step files to assist users in this process: https://github.com/pupil-labs/neon-geometry

user-2e0f13 08 December, 2023, 04:37:01

Hello, can I attach this Neon to my Quest 2? I want to use it for my research work, in which I need eye tracking.

wrp 08 December, 2023, 04:42:41

Hi @user-2e0f13 👋 We have tested Neon + Meta Quest 2, and I can say that it does work. However, I need to follow up with my colleagues to discuss constraints and get back to you with more details.

Please note that we do not yet have a timeline for frame mounts for XR systems other than the Pico 4.

user-2e0f13 08 December, 2023, 05:00:30

What is the cost of this product?

wrp 08 December, 2023, 05:56:36

5900 EUR for the Neon module, frame mount, and Companion Device. The academic discount brings the cost down to 5200 EUR.

user-3ff946 08 December, 2023, 09:52:12

Will it work on the Valve Index?

wrp 08 December, 2023, 11:50:16

Hi @user-3ff946 👋 We have not yet tested Neon compatibility with Valve Index.

user-886683 08 December, 2023, 12:39:51

@wrp Hi! Can I also get details about Quest 2 + Neon? Thank you

user-8b1248 08 December, 2023, 13:59:29

For those of us who already own a Neon module and Companion Device, is it possible to purchase only the Pico 4 frame mount?

mpk 09 December, 2023, 10:33:17

Yes it is!

wrp 09 December, 2023, 10:30:39

Yes! We haven't released the Pico 4 frame mount yet, but pricing is expected to be similar to Neon frames.

user-8779ef 08 December, 2023, 15:58:29

Some important questions:

  • What is the sampling rate in Unity? Are we throttled to the display's refresh rate?
  • Do you provide Unity timestamps of the time that the eye image was taken? If not, a method to convert between your timestamps and Unity time?
  • Have you measured latency from the time the eye image was taken to the time that the display has been updated?
  • Does Unity have access to monocular gaze vectors, or just binocular?
  • Have you measured the accuracy within Unity? (And if you provide an average number, across what field of view?)

marc 11 December, 2023, 16:04:35

Hi @user-8779ef! Thanks again for the questions! Definitely touching some important points and I had to look up some of the details 😀 Here are the responses:

What is the sampling rate in Unity? Are we throttled to the display's refresh rate?

There are two events exposed. The first one is PupilLabs.GazeDataProvider.gazeDataReady, which is part of the event loop in the main thread of the Unity application. It thus fires at the framerate of the Unity application (rather than the XR display). This will indeed be lower than the 200 Hz of the real-time gaze signal. The callback function can receive either the latest value or the mean of the N latest values (configurable in GazeDataProvider).

Advanced users could use the second event, called PupilLabs.RTSPClient.GazeDataReceived, in a separate thread. This event fires every time data is received from Neon, which should happen at 200 Hz. Synchronization with the main thread needs to be handled manually, though.
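
As an illustration of that second approach, here is a minimal sketch of consuming a background-thread gaze event safely in Unity. The class and method names below are placeholders, and the exact delegate signature and payload type of PupilLabs.RTSPClient.GazeDataReceived are assumptions here, but the pattern (enqueue on the receive thread, drain in Update) is the standard way to hand data to Unity's main thread.

```csharp
using System.Collections.Concurrent;
using UnityEngine;

// Sketch only: register OnGazeDataReceived as a handler for
// PupilLabs.RTSPClient.GazeDataReceived (exact signature assumed here).
public class GazeQueueConsumer : MonoBehaviour
{
    // Thread-safe buffer: filled on the RTSP client's receive thread (~200 Hz),
    // drained on Unity's main thread.
    private readonly ConcurrentQueue<Vector3> _gazeDirections = new ConcurrentQueue<Vector3>();

    // Called from the receive thread -- do not touch Unity objects here,
    // only enqueue the data.
    public void OnGazeDataReceived(Vector3 gazeDirection)
    {
        _gazeDirections.Enqueue(gazeDirection);
    }

    // Main thread: consume everything that arrived since the last frame.
    void Update()
    {
        while (_gazeDirections.TryDequeue(out var direction))
        {
            // e.g. filter, log, or visualize the sample
            Debug.DrawRay(transform.position, transform.TransformDirection(direction));
        }
    }
}
```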

Do you provide Unity timestamps of the time that the eye image was taken? If not, a method to convert between your timestamps and Unity time?

Not yet! We want to provide an example implementation that allows users to sync recordings made in XR with other data post hoc. For example, pupillometry data currently cannot be calculated in real time but has to be exported from Pupil Cloud. Our goal is to make it easy to sync this data with any data generated in Unity.

We are still exploring different approaches to this problem. If you have any wishes or recommendations on how this should be done, we are all ears!

wrp 09 December, 2023, 10:55:44

All good questions. We will respond on Monday.

user-8779ef 09 December, 2023, 13:14:11

Thanks!

user-8779ef 09 December, 2023, 17:04:55

@wrp ...and, once you have all those issues under control, please add "screen-cast" to the customer wish list 🙂. Your previous solution was really efficient. In contrast, the native Unity recorder had horrible issues with dynamic compression levels the last time I checked. I'm about to see if there are other solid solutions, because we need it for a project.

Users like me will need this for post-hoc temporal realignment of gaze location to account for real-time latency.

It is also necessary if the goal is to access the specific frame that corresponded to a frame of gaze data. This is critical for debugging (for comparing numerical records of gaze behavior against a visual indication) and, in my case, for performing operations on the screen image in gaze-centered coordinates.
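
In the meantime, here is a rough do-it-yourself sketch of the bookkeeping this enables (not an official feature, just one possible workaround under stated assumptions): tag each saved frame with Time.frameCount, and have your gaze logger write the same frame index next to each gaze sample, so frames and gaze can be matched post hoc.

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

// Sketch: periodically save a rendered frame named by its frame index.
// A separate gaze logger (not shown) would record Time.frameCount next to
// each gaze sample so the two streams can be realigned post hoc.
public class FrameTagger : MonoBehaviour
{
    public int captureEveryNthFrame = 6;   // capturing every frame is usually too slow

    IEnumerator Start()
    {
        while (true)
        {
            // Capture only after rendering for this frame has finished.
            yield return new WaitForEndOfFrame();
            if (Time.frameCount % captureEveryNthFrame != 0) continue;

            Texture2D frame = ScreenCapture.CaptureScreenshotAsTexture();
            string path = Path.Combine(Application.persistentDataPath,
                                       $"frame_{Time.frameCount:D7}.png");
            File.WriteAllBytes(path, frame.EncodeToPNG());
            Destroy(frame);   // avoid leaking textures
        }
    }
}
```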

marc 11 December, 2023, 16:05:54

Also noted regarding the "screen-cast" recorder. We'll look into that as part of the post hoc synchronization implementation!

user-be56c8 10 December, 2023, 00:58:00

Is there an eye tracker product that would fit the Quest 3?

marc 11 December, 2023, 16:08:21

Hi @user-be56c8! Today, the only mount we have available for pre-order is for the Pico 4. We are currently evaluating additional mounts for other current headsets, and the Quest 3 is certainly one of the candidates we are looking at! I cannot yet cite a concrete date for when a mount might be available, though.

So for the moment I can only point to the option to build your own mount (https://docs.pupil-labs.com/neon/neon-xr/build-your-own-mount/) or to wait a while longer until there is more news. We should be able to announce what the next mounts will be early next year!

marc 11 December, 2023, 16:04:40

Have you measured latency from the time the eye image was taken to the time that the display has been updated?

We have not yet run a proper evaluation on this. Am I interpreting this correctly, that you’d be interested in recording data that is very accurately synced with stimulus presentation?

We know that the latency for image acquisition and processing by the Companion app is pretty low, about 10 ms. But the current setup also includes a WiFi connection between the Companion device and the real-time API client in the Neon XR Core Unity package, which introduces additional latency and potentially some variance depending on the network conditions. We are planning to provide latency estimates for the entire pipeline in the future.

Does Unity have access to monocular gaze vectors, or just binocular?

Unity has real-time access to the gaze signal generated by the Neon Companion app, which is currently only binocular, i.e. a directional vector originating from the (in this case virtual) scene camera.

The 3D eye state data that is available post hoc via Pupil Cloud contains eye location and optical axis data per eye, which can serve as an estimate of monocular gaze direction.

We will add real-time eye state to the Companion app in the near future, and will then be able to stream this to Unity too!

marc 11 December, 2023, 16:04:42

Have you measured the accuracy within Unity? (And if you provide an average number, across what field of view?)

The general gaze accuracy of Neon, i.e. 1.9 deg uncalibrated and 1.3 deg with offset correction in a 60x60 deg FOV, should mostly transfer to XR.

For XR, there is the additional setup step of calibrating the exact positioning of the Neon module in the XR headset, which is required in order to map gaze data into the virtual XR world. We have not yet fully evaluated how much error (if any) is introduced in this step. Improvements to this may also come in the future should there be a noticeable error.

user-11b6e8 13 December, 2023, 11:34:57

Hello @marc, I also had some questions regarding the time management of the data timestamps received from Pupil within a Unity application.

  • In the provided demos, when the network controller connects to Pupil, a time sync function is called, which then makes it possible to convert Pupil timestamps into Unity times. Should this time sync function be called repeatedly and not just after the first connection? My experimental session lasts 40 minutes, and I was wondering whether I should resync the clocks after a certain amount of time (e.g., every X minutes), whether doing it once is enough, or, worse, whether syncing multiple times might hinder the validity of the whole recorded data.
  • We are recording, through internal logs, the pupil and gaze data received by the companion app for post-experiment data analysis, where we wish to find relations between the user's behaviour in the virtual environment and the available eye data. Thus, all the pupil and gaze data are dumped in a specific log file where each event is labelled in time using the PupilToUnityTime function, in order to later relate the eye events to the recorded Unity events. Is this a good practice to achieve the desired result?

marc 13 December, 2023, 12:41:15

Hi @user-11b6e8! Could you clarify which demos you are referring to exactly? Are you using Pupil Core and HMD Eyes, or Neon and the Neon XR unity integration?

user-11b6e8 13 December, 2023, 13:07:17

sorry I am using Pupil Core and HMD Eyes

user-cdcab0 19 December, 2023, 17:02:30

whether doing it once is enough, or, worse, whether syncing multiple times might hinder the validity of the whole recorded data

Once should be enough, but multiple syncs won't hinder anything so long as you aren't bombarding the network/CPUs with sync requests.

all the pupil and gaze data are dumped in a specific log file where each event is labelled in time

This sounds perfectly fine, but there is an alternative approach that you might consider. Rather than streaming everything from Pupil Capture to Unity and logging in Unity, you could stream your Unity events (and screencast) to Pupil Capture and record them there. Then you can load your recordings in Pupil Player where you can visualize the timeline of events and make use of the other tools and plugins there.
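
To make that alternative concrete, the sketch below shows roughly what sending a Unity event to Pupil Capture as a remote annotation could look like when done by hand with NetMQ and MessagePack. hmd-eyes may already provide a helper for this; the port query, resolver option, and plugin requirement below are assumptions based on Pupil Capture's network API, not the official integration. The timestamp must already be in Pupil Time, and the Annotation plugin needs to be enabled in Capture.

```csharp
using System.Collections.Generic;
using MessagePack;
using MessagePack.Resolvers;
using NetMQ;
using NetMQ.Sockets;

// Rough sketch of publishing one annotation to Pupil Capture's IPC backbone.
public static class AnnotationSender
{
    public static void Send(string label, double pupilTimestamp,
                            string host = "127.0.0.1", int remotePort = 50020)
    {
        using (var req = new RequestSocket())
        {
            req.Connect($"tcp://{host}:{remotePort}");
            req.SendFrame("PUB_PORT");                  // ask Pupil Remote for the PUB port
            string pubPort = req.ReceiveFrameString();

            using (var pub = new PublisherSocket())
            {
                pub.Connect($"tcp://{host}:{pubPort}");
                System.Threading.Thread.Sleep(100);     // give the connection a moment

                var annotation = new Dictionary<string, object>
                {
                    { "topic", "annotation" },
                    { "label", label },
                    { "timestamp", pupilTimestamp },    // must be in Pupil Time
                    { "duration", 0.0 },
                };
                byte[] payload = MessagePackSerializer.Serialize(
                    annotation, ContractlessStandardResolver.Options);
                pub.SendMoreFrame("annotation").SendFrame(payload);
            }
        }
    }
}
```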

user-11b6e8 19 December, 2023, 17:03:40

terrific, thanks for the information and suggestion!

user-8779ef 20 December, 2023, 18:07:46

Based on your responses to my questions, it sounds like you are using the gaze direction from the Neon and simply importing that direction into the VR environment, without taking into account the eyes' pose with respect to the VR optics or display, and without any knowledge of what is on the screen. Is that true?

This would be a departure from how you (and most/all companies) have done things in the past, and it opens up the possibility of inaccurate gaze estimation if the eye is not in the location assumed by the 3D geometry implied by the camera settings in the virtual environment. Specifically, the VR camera's focal length / field of view implies an eye position inside the headset that may be inaccurate. If I'm not being clear, keep in mind that if you were to adjust VR headset relief (distance from the face) while the head is still, the gaze direction to an object at a stable location in the VR environment would change by several degrees, even though the object's position with respect to the virtual head has not moved. This is a problem because there is no adjustment of the virtual camera to account for changes in the real-world eye/headset geometry. Typically, this adjustment is made through a calibration sequence.

Is that true? If so, will you be providing a means to account for this possible source of error, for example, by calibrating using gaze targets placed in the virtual environment?

user-8bce45 21 December, 2023, 12:07:59

Hi, I have a question about Pupil Time and Unity. I read that "all timestamps received via hmd-eyes are in Pupil Time. For converting into Unity Time please have a look at the TimeSync component discussed here."

I would like to understand what Pupil Time means, as it could be more relevant for me than Time.realtimeSinceStartup in Unity. My goal is to have, in my dataset, the time passed since I started recording eye-tracking data or since the Unity scene started. Can this be done by using Pupil Time? Or is there a better float I can use in Unity? I would also be happy with a float representing the time passed since the Unity scene started. Is there a float that better reflects this than Time.realtimeSinceStartup?

user-cdcab0 21 December, 2023, 17:18:02

It's, essentially, a clock managed by the Pupil Core software. Timestamps are recorded as (floating-point) seconds since an arbitrary start time. More here: https://docs.pupil-labs.com/core/terminology/#_2-pupil-time

user-8bce45 21 December, 2023, 17:27:47

So each time you click record, it counts from January 1, 1970, 00:00:00 (UTC)?

user-cdcab0 21 December, 2023, 17:34:28

No, you're thinking of the Unix Epoch. Pupil Time uses an arbitrary epoch, meaning it's different each time and not associated with any event
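
In practice that means only the offset between the two clocks matters. hmd-eyes' own TimeSync component (and its PupilToUnityTime conversion) is the intended way to handle this, but the underlying idea is roughly the following sketch (my own illustration, not hmd-eyes code), which also covers the "time since recording started" case from the question above.

```csharp
using UnityEngine;

// Sketch of the clock-offset idea behind a Pupil-Time-to-Unity-time conversion:
// measure the difference between the two clocks once at sync time, then apply
// it to every incoming timestamp.
public class PupilClockOffset
{
    private double _pupilToUnityOffset;
    private double _recordingStartPupilTime;

    // Call once right after connecting, with a Pupil timestamp obtained at
    // (approximately) this moment, e.g. via Pupil Remote's "t" request.
    public void Sync(double pupilTimeNow)
    {
        // realtimeSinceStartupAsDouble needs Unity 2020.2+; older versions can
        // fall back to (double)Time.realtimeSinceStartup.
        _pupilToUnityOffset = Time.realtimeSinceStartupAsDouble - pupilTimeNow;
    }

    // Remember the Pupil timestamp at which the recording was started.
    public void MarkRecordingStart(double pupilTimeAtRecordingStart)
    {
        _recordingStartPupilTime = pupilTimeAtRecordingStart;
    }

    // Convert a pupil/gaze timestamp onto Unity's realtimeSinceStartup scale.
    public double ToUnityTime(double pupilTimestamp)
    {
        return pupilTimestamp + _pupilToUnityOffset;
    }

    // Seconds elapsed since the recording started (the clock offset cancels out).
    public double ToTimeSinceRecordingStart(double pupilTimestamp)
    {
        return pupilTimestamp - _recordingStartPupilTime;
    }
}
```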

user-8bce45 21 December, 2023, 19:15:39

thanks

user-8bce45 21 December, 2023, 19:44:21

I received data from recording in Unity. Now, I want to see where my participant was looking within the Unity world. Is it right to look at gaze_point_3d_y and gaze_point_3d_x from the gaze_positions data file? It seems like my data don't match where the participant was gazing.

nmt 27 December, 2023, 10:07:57

Hi @user-8bce45! Have you considered using ray casting to find the intersections of gaze and the features in your VR environment? You can read about that in the developer docs
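
A minimal ray-casting sketch along those lines, assuming the GazeDirection and GazeDistance fields described in the developer docs and that the gaze direction is given in the local space of the camera it originates from:

```csharp
using UnityEngine;

// Sketch: intersect the gaze ray with scene geometry for each gaze sample.
// gazeDirection/gazeDistance are assumed to come from hmd-eyes' gaze data
// (GazeDirection, GazeDistance) in the camera's local coordinates.
public class GazeRaycaster : MonoBehaviour
{
    public Camera gazeOriginCamera;   // the VR camera the gaze signal is mapped to
    public float maxDistance = 100f;

    public void OnGazeSample(Vector3 gazeDirection, float gazeDistance)
    {
        Vector3 origin = gazeOriginCamera.transform.position;
        // Transform the local gaze direction into world space.
        Vector3 worldDirection = gazeOriginCamera.transform.TransformDirection(gazeDirection);

        if (Physics.Raycast(origin, worldDirection, out RaycastHit hit, maxDistance))
        {
            // hit.collider is the object being looked at; hit.point is the
            // world-space gaze intersection you can log alongside timestamps.
            Debug.Log($"Gaze hit {hit.collider.name} at {hit.point} " +
                      $"(reported gaze distance: {gazeDistance} m)");
        }
    }
}
```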

user-8bce45 08 January, 2024, 09:51:22

Based on the *developer docs* it seems like I should use GazeDirection and GazeDistance rather than gaze_point_3d. However, the exported files from Pupil Capture don't show these data. Is there a way I can export GazeDirection and GazeDistance data?

user-8bce45 03 January, 2024, 14:21:11

Hi Neil, thanks for the help. Yes, I used ray casting, and it seemed like the positions projected in the Unity project correspond to the actual gaze directions. However, I cannot figure out where in the data I can find these positions.

End of December archive