🤿 neon-xr



user-880443 05 May, 2025, 11:12:47

[email removed] van der Laan! Thanks for

user-e3d623 07 May, 2025, 17:47:13

Hello, Pupil Cloud fails to display the fixations and scan paths of the data recorded by the updated Neon Companion app. Any idea?

Chat image

nmt 07 May, 2025, 20:00:44

Hi @user-e3d623! Would it be possible to invite [email removed] to your workspace so we can take a closer look?

user-e3d623 08 May, 2025, 08:20:44

I've sent the invitation. Our workspace is Yao(Marc)'s workspace.

nmt 08 May, 2025, 16:50:41

Thanks, @user-e3d623! We will look into this and get back to you asap

nmt 08 May, 2025, 17:42:04

Hi @user-e3d623! As there’s no scene video in your Neon recordings (used in the context of VR), it isn’t possible to display the fixation scanpath in Cloud. This is because the fixation scanpath relies on optic flow present in the scene video. However, fixations are still computed for the recording and are available in the download.

user-e3d623 08 May, 2025, 17:58:31

Understood. Thank you.

user-5fe304 19 May, 2025, 19:49:50

I want to order longer bolts for mounting the neon in quest 3 frame. Can you confirm the bolt specifications?

user-f43a29 20 May, 2025, 09:48:35

Hi @user-5fe304 , may I first ask what the end goal is? Do you plan on removing the current screws from your Neon module and replacing them?

user-5fe304 20 May, 2025, 10:51:13

Hi Rob, yes that is correct. I’m fairly certain that M2 6 mm bolts will allow for more thread contact when mounting the Neon module in the Quest frame. We have no issue with the original screws mounted into Neon frames, but lack sufficient contact at times when switching to the Quest frame.

user-f43a29 20 May, 2025, 15:24:34

Hi @user-5fe304 , I see. Please note that modifying the module in that way will void the manufacturer’s warranty. Let me confer with the team and update you.

user-f43a29 21 May, 2025, 11:38:08

Hi @user-5fe304 , could you please reach out about this to [email removed] Thanks!

user-8ca45b 21 May, 2025, 13:08:12

One stupid question: what is the range of the eye tracking data in fixations.csv (in pixels), and what FOV does it correspond to? From the website I found that the eye tracking data FOV should be 100° x 80°, is that correct? https://pupil-labs.com/products/neon/specs

user-8ca45b 21 May, 2025, 13:17:06

Does it share the same system as the scene video, i.e., 1600x1200 px resolution with a field of view of 103°x77°?

nmt 22 May, 2025, 12:31:57

Hi @user-8ca45b! The values you've shared for the scene camera resolution and field of view are correct. Technically, though, Neon can measure gaze outside of this area, so fixations can also be computed from these data points. That said, such instances will be rare, as under typical—or at least natural—viewing conditions, gaze tends to be distributed more centrally. You can see the gaze distribution during some day-to-day activities measured by Neon on page 11 of our white paper.
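As a rough illustration of how the scene-camera numbers above relate pixels to visual angle, here is a minimal sketch. It assumes a simple linear pixel-to-degree mapping over the 1600x1200 px / 103°x77° values from this thread; the real camera model includes lens distortion, so treat it as an approximation only.

```python
# Rough linear mapping from scene-camera pixels to visual angle.
# Values taken from this thread; the actual camera intrinsics include
# lens distortion, so this is only an approximation.

RES_X, RES_Y = 1600, 1200   # scene camera resolution (px)
FOV_X, FOV_Y = 103.0, 77.0  # approximate field of view (degrees)

def px_to_deg(x_px, y_px):
    """Map a pixel coordinate to degrees relative to the image center."""
    x_deg = (x_px / RES_X - 0.5) * FOV_X
    y_deg = (y_px / RES_Y - 0.5) * FOV_Y
    return x_deg, y_deg

print(px_to_deg(800, 600))    # image center -> (0.0, 0.0)
print(px_to_deg(1600, 1200))  # far corner -> (51.5, 38.5)
```

As the message notes, gaze (and therefore fixations) can also land outside this range, so values slightly beyond ±51.5°/±38.5° are possible in the exports.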

user-cf5a97 22 May, 2025, 13:51:58

Hi, in January I bought "Better safe than sorry - Neon eye tracking bundle". Unfortunately, the project got discontinued and I would like to sell the glasses with the phone. The glasses have barely been used. I would like to sell it for 5600 Euro. Contact me if you are interested [email removed]

user-c99f82 27 May, 2025, 21:04:04

Hello @user-f43a29 , is there any way I could install the Real-time API in Unity? I am having trouble recording fixation, blink, and IMU values using the Neon XR package.

user-f43a29 27 May, 2025, 21:04:30

Hi @user-6c6c81 , I've moved your post to the 🤿 neon-xr channel. The Neon XR Core Package is actually already a simplified Unity wrapper around the Real-time API. To clarify, the Real-time API itself is not exactly something you install, but rather something you use. It is programming-language agnostic; the Unity and Python packages are two ways to use it.

If you are trying to extend the Real-time API functionality in the Neon XR Core Package, then I recommend referencing this part of the RTSPClientWs.cs file. The API specification would be helpful here. Otherwise, I'll check with the team tomorrow about the 💡 features-requests you already opened (https://discord.com/channels/285728493612957698/1362170952951005344/1362170952951005344).

user-6c6c81 27 May, 2025, 21:05:32

Thank you, Rob.

user-aa6efa 28 May, 2025, 08:56:36

Hi team, I have a question. It may sound silly, but is there any way to create and save events in Unity using the XR package like the Python API does? I'm struggling to use this feature in Unity compared to Python. Thanks!

user-f43a29 28 May, 2025, 09:24:06

Hi @user-aa6efa , yes, this is possible. Please see the last part of this message and let us know if you have more questions: https://discord.com/channels/285728493612957698/1187405314908233768/1288425388824985601
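Independently of the Unity wrapper, events can also be posted to the Companion app over the Real-time API's HTTP interface, and the same POST can be replicated from Unity with UnityWebRequest. A minimal Python sketch follows; the endpoint path and payload shape are my reading of the API specification, so verify them against the spec before relying on this.

```python
# Hedged sketch: post an event annotation to the Neon Companion app over
# HTTP. The /api/event path and JSON shape are assumptions to check
# against the Real-time API specification.
import json
import time
import urllib.request

def make_event_payload(name, timestamp_unix_ns=None):
    """Build the JSON body for an event annotation.

    If no timestamp is given, the Companion app assigns the receive time.
    """
    payload = {"name": name}
    if timestamp_unix_ns is not None:
        payload["timestamp"] = timestamp_unix_ns
    return payload

def send_event(device_ip, name, port=8080):
    """POST the event to the Companion app (endpoint assumed; verify)."""
    body = json.dumps(make_event_payload(name, time.time_ns())).encode()
    req = urllib.request.Request(
        f"http://{device_ip}:{port}/api/event",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)  # raises on HTTP errors
```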

user-6c6c81 28 May, 2025, 18:51:28

Hello @user-f43a29 , any update on my request yet? Thank you.

user-f43a29 28 May, 2025, 18:54:21

Hi @user-6c6c81 , yes, sorry, it fell off my radar to respond promptly.

The udp branch of the Neon XR Core Package integrates with a C++ library that in principle offers those data streams. However, it will require some restructuring of that Unity code to expose IMU data and eye events (e.g., fixations). There is currently no fixed timeline for that.

In the meantime, the code in RTSPClientWs.cs (https://discord.com/channels/285728493612957698/1187405314908233768/1377029724576157796) can be expanded to accommodate what you request. If you are looking for dedicated support with this, then I recommend looking into one of our Support Packages.

user-f43a29 28 May, 2025, 18:55:04

Alternatively, you can run a recording in parallel and post-hoc synchronize the IMU, fixation, etc. data with the Unity data.

user-6c6c81 28 May, 2025, 18:57:32

I am already trying to run Python code in parallel which uses the Real-time API and syncing it with my Unity application. I will also be looking into your Support Packages. Thanks.

user-f43a29 28 May, 2025, 18:58:30

Ah, when I say run a recording in parallel, I mean the white button in the Neon Companion app, not the Real-time API. In this case, it should be a bit easier to post-hoc synchronize that way, but naturally, use whichever is easiest in your situation.
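For the post-hoc synchronization route, once both sides log comparable clocks, the matching step can be as simple as nearest-timestamp alignment. A minimal sketch (the lists and timestamps here are illustrative, not the actual export format):

```python
# Minimal post-hoc sync sketch: match each Unity-side sample to the
# nearest Neon-side sample in time. Assumes both streams carry
# timestamps on a common (or offset-corrected) clock, in the same unit.
import bisect

def match_nearest(unity_ts, neon_ts):
    """For each Unity timestamp, return the index of the closest Neon
    timestamp. Both lists must be sorted ascending."""
    matches = []
    for t in unity_ts:
        i = bisect.bisect_left(neon_ts, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(neon_ts)]
        matches.append(min(candidates, key=lambda j: abs(neon_ts[j] - t)))
    return matches

neon = [0, 100, 200, 300]    # e.g. fixation timestamps
unity = [40, 160, 310]       # e.g. Unity frame timestamps
print(match_nearest(unity, neon))  # [0, 2, 3]
```

In practice you would also correct for any constant offset between the two clocks (e.g. via a shared event) before matching.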

user-6e7a38 30 May, 2025, 11:13:11

Hi, I would like to get the fixations in Unity from the Neon XR package. As far as I understood, the Neon app sends messages to Unity which contain gazes, fixations, and eye state, but fixations are not decoded. I couldn't figure out which bytes in the message correspond to fixations. Thanks!

user-f43a29 30 May, 2025, 13:38:13

Hi @user-6e7a38 , please check these two messages, and the links in them, and let us know if you have any questions:

You may also want to reference the implementation in the Python version of the Real-time API

Please note that if you want to use the x/y coordinates of the fixations in the context of VR, then you will also need to convert them to the VR's coordinate system using the mount calibration values. It is a similar process to what is already done for gaze in the Neon XR Core Package. Let us know if you need guidance on that.
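For a rough idea of what that conversion involves: unproject the fixation's scene-camera pixel coordinate to a 3D ray with a pinhole model, then rotate it into the headset frame. The intrinsics and mount rotation below are placeholders, not the actual calibration values the Neon XR Core Package uses, so treat this purely as a sketch.

```python
# Hedged sketch of mapping a fixation's scene-camera pixel coordinate
# into a headset coordinate frame. FX/FY/CX/CY and R_MOUNT are
# placeholders; the real values come from the camera calibration and
# the mount calibration used by the Neon XR Core Package.
import math

FX = FY = 880.0        # hypothetical focal lengths (px)
CX, CY = 800.0, 600.0  # hypothetical principal point (px)

def pixel_to_ray(x_px, y_px):
    """Unproject a pixel to a unit direction in camera coordinates
    (ignoring lens distortion)."""
    v = [(x_px - CX) / FX, (y_px - CY) / FY, 1.0]
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def rotate(R, v):
    """Apply a 3x3 rotation matrix (row-major) to a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

# Identity mount rotation as a stand-in for the real calibration.
R_MOUNT = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]

ray_cam = pixel_to_ray(800.0, 600.0)  # principal point -> optical axis
ray_hmd = rotate(R_MOUNT, ray_cam)
print(ray_hmd)  # [0.0, 0.0, 1.0]
```

The gaze path in the Neon XR Core Package does the analogous transform with the real mount calibration, so its code is the reference to follow.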

user-6e7a38 30 May, 2025, 13:59:08

Great, thank you

End of May archive