Hi everyone, has anyone tried to view/save the eye data at a specific time after the user interacts with a Unity object?
I think it could be done by triggering Neon XR to send Unity the data from that instant.
Hi @user-a9fbdb , please see my message here, which I think is what you are looking for: https://discord.com/channels/285728493612957698/1420380056475402241/1422889199039811594
Hi, I have a question about Neon XR for the Meta Quest 3. 1) Is it possible to stream scene video data, gaze data, and IMU data from Neon XR? 2) Is it possible to stream those data directly to the desktop, without going through the Companion app?
Hi @user-59eb77 , do you mean you want to stream the VR video with a gaze overlay?
May I ask what your use case is?
Currently, you can stream the gaze + eyestate data via Neon XR. If you want the other data, you can have a Python script connect in parallel and stream the other data.
To answer your other question, you need to use the Companion app to stream to a desktop.
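As a sketch of that parallel Python connection — assuming the pupil-labs-realtime-api package is installed and the Companion device is on the same network as your computer (exact field names may differ slightly between package versions):

```python
def ns_to_s(timestamp_ns: int) -> float:
    """Convert a nanosecond Unix timestamp to seconds."""
    return timestamp_ns / 1e9


def stream_gaze_and_imu(n_samples: int = 100) -> None:
    """Receive gaze and IMU samples from Neon, in parallel with a Unity app."""
    # Imported here so the helper above works without the package installed.
    from pupil_labs.realtime_api.simple import discover_one_device

    device = discover_one_device()  # finds the Companion app via mDNS
    try:
        for _ in range(n_samples):
            gaze = device.receive_gaze_datum()  # gaze (+ eye-state) sample
            imu = device.receive_imu_datum()    # gyro/accelerometer sample
            print(ns_to_s(gaze.timestamp_unix_ns), gaze.x, gaze.y,
                  imu.gyro_data)
    finally:
        device.close()
```

Calling `stream_gaze_and_imu()` while the Companion app is streaming should print timestamped samples; run it alongside your Unity app, since the two connections are independent.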
Hi all, I couldn't find this calibration information in the docs, so I'm asking here: is it essential to enter the correct inter-eye distance (IPD) in the profile in the Companion app? How can that distance be precisely measured? And what impact would a 5 mm error in that measurement have? Thank you!
Hi @user-a9fbdb 👋. Entering the wearer IPD will improve the accuracy of the pupillometry measurements. So if you're doing pupillometry specifically, we'd recommend it. Though, it's not explicitly required. If you don't enter it, the population average (63mm) is the default. There are various ways to measure IPD, the simplest being with a ruler. Measurement error from doing the measurement this way is unlikely to have any significant impact.
Hi, I bought the head strap together with the Neon glasses "Better safe than sorry", and I was expecting that the strap would be extendable, so that it can be loosened and tightened. Unfortunately this is not the case, and I would like to ask how I can use it on different subjects.
Hi @user-3c26e4 👋🏻 The head strap compatible with the Better Safe Than Sorry frame is extendable. Have you tried pulling the tabs on each side to adjust the fit? This should let you easily loosen or tighten it to suit different wearers.
I want to run an AI model in real time on the streamed video and gaze data. But if I use a Python script, would the data be streamed from the Companion app? If I connect Neon XR directly to the desktop, is it still possible to stream to the desktop?
Hi @user-59eb77 , gaze data is streamed from the Neon Companion app, but the VR scene is not streamed that way. May I clarify if you are using a VR or an AR setup?
With respect to streaming to the desktop, you can stream VR video via direct connection using our neon-vr-recorder tool with gaze overlaid.
However, if you need the raw gaze data, then you need to either:
I'm using the neon-vr-recorder tool; I can see the stream from VR, but without gaze data.
Hi @user-af95e6 👋 Are you running it tethered or wirelessly?
Could you double-check that developer mode and USB debugging are enabled, and if on wireless mode, that you used the correct adb connect <IP> command as described in our README?
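If it helps with debugging, a quick way to confirm that adb actually sees the headset is to parse the output of `adb devices` — a small sketch, assuming adb is on your PATH:

```python
import subprocess


def parse_adb_devices(output: str) -> list[str]:
    """Return serials of devices in the 'device' state from `adb devices` output."""
    lines = output.strip().splitlines()[1:]  # skip "List of devices attached"
    return [line.split("\t")[0]
            for line in lines
            if line.strip().endswith("device")]


def connected_headsets() -> list[str]:
    """Run `adb devices` and return the serials adb considers connected."""
    result = subprocess.run(["adb", "devices"], capture_output=True, text=True)
    return parse_adb_devices(result.stdout)
```

If `connected_headsets()` comes back empty (or the entry shows `unauthorized`/`offline` instead of `device`), the recorder has nothing to capture from, regardless of the gaze connection.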
Hello, this is my first time working with eye tracking. The device I’m using is Neon XR. What I want to do is something like what you can see in these examples:
https://www.youtube.com/watch?v=OW5EOqqQ_qs https://x.com/pupil_labs/status/1835946367715528726
In other words, I’d like to record what the user sees in VR and overlay a dot showing where they are looking — for example, like the red dot shown in the video. Specifically, when using VRChat, I want to visualize what objects or areas the user is focusing on, either in real time or by overlaying the gaze point later using the recorded data.
I also plan to use LSL (Lab Streaming Layer) to synchronize the gaze data with other information, so that I can later review what happened at that moment together with other logs.
From what I’ve read, it seems that if the goal is to build my own Unity application, the “MRTK3 Template Project” or “The Neon XR Core Package” mentioned on this page: https://docs.pupil-labs.com/neon/neon-xr/MRTK3-template-project/
might not be suitable for my purpose.
There is also something called the “Real-Time API”, which might be what I should use. In addition, this page: https://docs.pupil-labs.com/neon/neon-xr/
mentions a simpler method that uses a smartphone app to get gaze data.
Could you please tell me which method is appropriate if I want to achieve what is shown in the videos above? Also, I’m not sure about how calibration works. The “MRTK3 Template Project” mentions calibration, but other pages don’t provide that information. For the method you recommend, could you please explain whether calibration is necessary and how to perform it?
Hi @user-740e46 👋 ! Let’s go over how NeonXR works and which setup best fits your goal.
If you plan to use MRTK3 or the NeonXR Unity template, that means you’ll be building your own Unity application. In that case, you can import our packages directly and access gaze data (as well as other metrics like pupil size, eyelid aperture, etc.) inside your app, and interact with your VR setup.
However, if you want to overlay gaze on third-party content, such as VRChat or Microsoft Flight Simulator, where you don’t have access to the source code, the best tool is NeonVR Recorder.
This application records the VR display and overlays gaze data in real time, just like in the videos you shared.
Regarding calibration, NeonXR performs a mount calibration, which aligns the Neon module’s position relative to the VR display. Once that’s done, the system uses a calibration-free gaze estimation, so there’s no need for per-user calibration each session. Also worth noting that by default there is a Meta Quest 3 calibration, so if you use that headset, you might not need to perform it.
A few extra notes:
- Neon always needs to be connected to the Companion Device, which handles power, storage, and computation.
- VR data is not streamed by default to LSL.
- I’d suggest starting with the basics:
1. Try the Real-Time API to stream gaze data.
2. Experiment with Lab Streaming Layer (LSL) for event synchronization.
3. Once comfortable, move to NeonXR in Unity (MRTK3 examples), and finally try the VR Recorder.
That workflow will give you a solid understanding of how each tool fits together and what adjustments might best suit your experimental setup.
For both AR and VR. I want to use the Quest 3 for a VR setup, but also its passthrough mode for AR.
When using neon-vr-recorder tool, do I need to use Neon Companion app, or just directly connect the desktop to Neon XR?
Hi @user-59eb77 👋! You’ll still need to have Neon connected to the Companion Device, that’s where the Companion App runs the inference and sets up the API connection. Connecting directly won't work.
Is there any way we can get data like pupil diameter, blinks, and eyelid closure with the NeonVR Recorder?
Hi @user-af95e6 👋 The NeonVR Recorder is mainly a tool for overlaying gaze on top of the VR display content.
All the underlying data, like pupil size, blinks, and other metrics, is still stored on the Companion Device if you record, or can be streamed through the API for you to use however you need.
But we can pull the gaze data from the Companion device, so why not the other data?
Yes, you can pull the rest of the data through the realtime api https://docs.pupil-labs.com/neon/real-time-api/
But it won't be synchronised with the overlay image from the NeonVR Recorder.
To clarify, neon-vr-recorder is simply a utility that uses the Realtime API together with scrcpy to generate a video with a gaze overlay.
The code is fully open-source and quite minimal. This means you can easily modify it, for example, to store the vr video frame timestamps, apply offset estimation, and create a synced data stream (e.g. vr_video_timestamps.csv) that you can correlate with the rest of the data.
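As a sketch of the pairing logic for that kind of modification — assuming you have collected the VR frame timestamps and the gaze samples yourself (the `vr_video_timestamps.csv` name follows the example above; nearest-sample matching is just one reasonable choice):

```python
import bisect
import csv


def nearest_gaze_index(gaze_ts: list[float], frame_t: float) -> int:
    """Index of the gaze timestamp closest to frame_t (gaze_ts must be sorted)."""
    i = bisect.bisect_left(gaze_ts, frame_t)
    if i == 0:
        return 0
    if i == len(gaze_ts):
        return len(gaze_ts) - 1
    # Pick whichever neighbour is closer in time.
    return i if gaze_ts[i] - frame_t < frame_t - gaze_ts[i - 1] else i - 1


def write_synced_csv(path, frame_ts, gaze_ts, gaze_xy) -> None:
    """Write one row per VR frame, paired with its nearest gaze sample."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame_timestamp", "gaze_timestamp",
                         "gaze_x", "gaze_y"])
        for t in frame_ts:
            i = nearest_gaze_index(gaze_ts, t)
            writer.writerow([t, gaze_ts[i], *gaze_xy[i]])
```

Any constant transmission offset you estimate can simply be added to `frame_ts` before matching.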
If that sounds a bit too complex to set up on your own, you can also consider our consultancy services, we can help tailor it to your workflow.
Thank you @user-d407c1
I tried neon-vr-recorder, but when I ran record.py, I got this error:
Raw gaze data has unexpected length: … KeyError: 89
After I ran pip install --upgrade pupil-labs-realtime-api, that issue was fixed and the gaze circle started appearing.
However, there’s still another problem: the screen is often completely black, or it flickers between the video and black frames. Sometimes it turns dark halfway through the recording.
If that issue didn’t happen, it would basically be working fine. I already tried changing bitrate and fps, but it didn’t help.
Are there any known cases or fixes for this?
Hi @user-740e46 , a tip from the team:
You can additionally try the updated version in the latest commit of the dev branch and let us know how that goes.
Hi @user-740e46 , are you seeing the flicker when in the Meta Horizon Home Screen?
If so, this is a known issue with the latest Meta OS, but there is a solution. Simply start a third party app, like our MRTK3 Template Examples, and then the flickering in neon-vr-recorder should stop. Basically, when you open Meta Horizon menus or you are in the Meta Horizon Home Environment, then the flickering occurs.
Just make sure to pass the “-s” flag to record.py.
Hello, I purchased the NeonXR mount for the Quest 3, and I read in the documentation that it can't record the virtual environment seen through the headset. If I record the virtual scene using Unity, how can I synchronize that video with the data from Pupil Cloud?
Hi @user-8fc524 , we offer the neon-vr-recorder tool for receiving a synced video with gaze overlaid.
Note that the data on Pupil Cloud are not saved in VR coordinates. The transformation to VR coordinates happens in real-time on the headset via the Neon XR Core Package.
So, if you have your own routines in place already, then you can simply save the streamed VR data for each Unity frame. Depending on the level of synchronization you need, you might need to measure the average transmission latency and account for that.
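For the latency point, the Realtime API's simple interface exposes a Time Echo based clock-offset estimate; a minimal sketch, assuming `estimate_time_offset()` is available in your installed version:

```python
def print_clock_offset() -> None:
    """Estimate the clock offset between this computer and the Companion device."""
    from pupil_labs.realtime_api.simple import discover_one_device

    device = discover_one_device()
    try:
        estimate = device.estimate_time_offset()
        if estimate is None:
            # The Time Echo service may be unavailable on older app versions.
            print("Time offset estimation not available")
            return
        print(f"mean clock offset: {estimate.time_offset_ms.mean:.2f} ms")
        print(f"mean roundtrip:    {estimate.roundtrip_duration_ms.mean:.2f} ms")
    finally:
        device.close()
```
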
Let us know if that clears things up.
So I don’t need to use Pupil Cloud. I just have to import the neon-vr-recorder package into my Unity project and start recording. Can I also save a CSV file along with the video? 😊
Hi @user-8fc524 , the neon-vr-recorder tool is not a Unity package, but rather a Python script that you run on a separate computer.
You can save a CSV file with the data in VR coordinates, but you would need to write some Unity code to generate that for you. Or, add some Python code to the recorder tool to generate that. Feel free to post these as 💡 features-requests , so that they can be passed onto the team for review.
In case you are looking for a turnkey solution, you can also check out the software provided by SilicoLabs, which integrates with Neon XR.
Okeey, thank you. I will try that
No problem. Feel free to post your questions as you proceed.
I have a question about NeonXR.
1) Should the neon-vr-recorder tool be installed on the VR headset, or in the Companion app? If it is the headset, does it record video frames from the headset along with gaze data transferred from Neon XR (Neon XR -> headset -> desktop)?
2) does it offer real time streaming or just only recording?
Hi @user-59eb77 👋 ! Those are two completely different utilities 👇
NeonXR is a Unity3D integration that uses our Realtime API. It streams all data, gaze, pupil size, eye openness, etc. directly into Unity, so you can use it live inside your own application with the provided prefabs. We also provide several MRTK3 examples.
NeonVR Recorder, on the other hand, is a separate Python-based tool built with scrcpy. It screen-captures the VR headset display as a single view and overlays the gaze data onto that video. It’s mainly meant for recording, not for real-time rendering inside Unity.
Thank you for your quick response! What I meant by real time streaming is streaming from headset to the desktop like server.
1) So you mean NeonVR recorder should be built on the headset, and it only offers recorded images, not streaming, right?
2) Additionally, if I use passthrough mode in Meta Quest 3 for AR, does NeonVR recorder still work?
So you mean NeonVR recorder should be built on the headset, and it only offers recorded images, not streaming, right?
The neon-vr-recorder tool, or for that matter scrcpy, uses adb to capture the screen and stream it to your computer. It can be used wirelessly or tethered through USB.
Then our Realtime API is used to gather data from Neon, overlay the gaze position on the content streamed from the headset, and record it as a new video.
The advantage is that you can use it with 3rd party content where you do not have access to the code.
Additionally, if I use passthrough mode in Meta Quest 3 for AR, does NeonVR recorder still work?
Yes
Hello! I tried to run neon-vr-recorder, but I’m getting some errors. I don’t understand why this is happening or how to fix it — do you have any idea? 😅
I also have another question: is it possible to save gaze data to know how long the user is looking at an object? I need to activate the video recorder when the user “activates” a button after looking at something for 5 seconds.
If you want to save the gaze data in 3D VR coordinates manually, then you can add a routine to your Unity app that saves the streamed data provided by the Neon XR package (see steps 6 and 7). As a reference, you can see how it is done for the examples in our fork of the MRTK3 project.
Hi @user-8fc524 , can you try explicitly passing Neon's IP address to the command:
python record.py -i IP_ADDRESS
You can find the IP address in the Stream section of the Neon Companion app.
Also, it would be recommended to let the neon-vr-recorder tool run for the whole experiment session. Having it activate for each 5-second segment will be difficult, since it takes a moment for the tool to start every time and the start time is non-deterministic.
To know when they look at objects, you can use Events, although you will need to add some Unity code to handle that. If you would like a turn-key solution for all of this, be sure to check out SilicoLabs' experiment builder software that integrates with Neon.
Okay, I was thinking about handling it with events, but I’m not sure how to create one or store it in the Neon Companion App. I’ll read all the documentation you send me and give it a try. Thank you!!
You are welcome. The links above describe the general process and how you do it with Neon's Real-time API. To do it from Unity, you will want to also check their documentation for sending HTTP Post requests.
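As a sketch of the event side from Python (Unity would send the equivalent HTTP POST): `send_event` is part of the simple Realtime API, and the dwell label below is just an example name, not anything predefined:

```python
import time


def mark_dwell_event(device, label: str = "dwell_5s_button") -> None:
    """Log an event on the Companion device, timestamped on arrival there."""
    device.send_event(label)


def mark_dwell_event_local_time(device, label: str = "dwell_5s_button") -> None:
    """Log an event timestamped on this machine (clocks should be synced)."""
    device.send_event(label, event_timestamp_unix_ns=time.time_ns())
```

Events sent this way land in the recording on the Companion device, so they can be lined up with the gaze data afterwards.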
Hey Pupil Labs, I recently updated the Neon XR Core Package "udp" branch to commit fb5a097d82e0d14d2dd1abd10c767086c0b39491 (Aug 5) and it works fine on my work pc. However, on the experiment pc i get a DLLNotFoundException (see screenshot) when I try to connect with the Neon (using recommended anker ethernet hub and 2nd network card). When I revert back to commit a0ea64cf2a7e8578b856dde614ab57010d7cbe6c (Apr 9), it works again (both work pc and experiment pc). The commit before that works on both pc's and the commit after that works only on my work pc. Both PC's are on Windows 11 (maybe not exact same version) and I'm using Unity 2022.3.62f2 . The companion app is on version 2.9.18-prod on a Moto Edge 40 Pro.
I tried deleting/regenerating the Library folder on the experiment pc and also using the Neon XR Core Package in a new empty unity project, both give me the same error on the experiment pc after that specific commit.
Both pc's are being managed by our system admin. A different user account is used but both have local admin rights.
Any idea how I can solve the error? Let me know if you need more details.
Hi @user-ed9dfb , that message indicates that the pl-rtsp-service DLL is missing or not being found. Can you try reinstalling the plugin fresh on that machine?
Hello all, this question has probably been answered already, but I am unsure whether I understood correctly.
I am using a Quest 3 with the NeonXR mount and planning to do some experiments on Unity using the Neon XR Core Package.
To properly record both the Unity app with the overlaid gaze and the eye data (i.e., gaze coordinates), I should use neon-vr-recorder to record the Unity application screen, and then either the Python script or the Core Package in Unity to record the eye data? If I got it correctly, Neon data is available in Unity in real time, so saving all the eye data from Unity should be doable?
Given all this, for this scenario the Neon Companion App is not strictly needed for recording, but is of course needed for computation and storage, right? Since the scene recording will be happening via neon-vr-recorder (while I may still use the app to record the eyes).
Sorry for the already repetitive questions, but as I'm working on my thesis, I want to avoid huge mistakes and knowing some solutions before hand would be helpful!
Thank you in advance 🙂
Hi @user-7afd77 , no problem and essentially correct. Let me clarify:
The neon-vr-recorder tool allows you to save the Unity scene playback with gaze automatically overlaid for visualisation, review, and qualitative analysis purposes.
If you would like a turnkey solution for all of this, you can also check out SilicoLabs' software, which integrates with Neon.
Thank you, that clarifies a lot!
I am now trying to run a recording using neon-vr-recorder, but it fails to locate a module: ModuleNotFoundError: No module named 'pupil_labs.neon_recording.stream'. I followed the README instructions on the GitHub repository and correctly installed the packages from requirements.txt.
Did I do something wrong? Or is there some discrepancy between the package versions?
Hi @user-7afd77 ! The library pl-neon-recording was updated today, and this tool hasn't been updated yet. Could you run pip install "pupil-labs-neon-recording<2.0"? (Quoting the requirement keeps the shell from treating < as a redirect.)
Sure, I was trying with version 1.4.2, let me try again with that command
My colleague @user-cdcab0 has already updated pl-realtime, so you can simply run pip install -U pupil-labs-realtime-api to get 1.7.3.
Correctly installed with your command, thanks.
I am having trouble: record.py does not recognize my headset's IP (it returns None), and even specifying it manually won't make it work.
Hi @user-7afd77 , do you mean it does not recognize Neon's IP?
Exactly. I even tried with python record.py -i IP_ADDRESS, as you suggested in a previous message. I took the IP Address from the Companion App
Hi, I'm using the Neon Module with the Meta Quest 3, but when I make a recording the scene camera is always grey, and the audio is not recorded even though the option is enabled. How can I solve this?
Hi @user-3e88a5 , the scene camera is not used in Neon XR, because it is blocked anyway when placed inside the VR headset. So, the gray frames there are intentional.
If you want to record the video of the VR scene, then check out our neon-vr-recorder tool.
I will relay the request to record audio in Neon XR scenarios to the team. Feel free to also post it as a 💡 features-requests .
Thank you a lot! I didn't want to use the neon-vr-recorder tool because I'm also using other sensors and have everything connected over Wi-Fi, and I don't want to overload the network. And I can't use it wired, because I'm testing with a subject who is walking with an exoskeleton.
The default is for the VR video to be sent over a Quest cable, not over the network. Only the gaze data would additionally be sent over the network, which should not be much extra load.
Yes yes, but I can't connect the Quest to a PC because I need to let the subject walk freely. Thank you for your answer!
In your Unity application, you can save the person's position and gaze data in the VR scene over time and then re-render the scene post-hoc with that trajectory to produce a video for playback.
Hey @user-f43a29 , I checked on the experiment pc and the DLL is there in the new Unity project with the freshly installed Neon Core XR Package. I saw you earlier wrote (deleted message?) asking if I made a build on the experiment pc, which I probably haven't on the most recent commit, so I'm trying that now. I'm also now testing on a 3rd pc.
Yes, I edited the message. You should not need to build the DLL, as it is provided by default in the udp branch of the Neon XR Core Package. Apologies for any confusion.
Yes, but actually I'm mostly interested in the audio, which is not recorded if the scene camera does not record anything. Instead of recording audio from Pupil, I will add audio recording in the Quest application, no problem.
Since Neon's microphones are in the module, which is encapsulated in the VR headset in this case, you will probably record clearer audio by using Quest's microphones.
Hello again. I was wondering why the neon.local:8080 always shows this even though the module is connected to the phone
Hi @user-7afd77 ! Do you have more than one device? Could it be that another Companion device with the module disconnected is on the same network? If not, could you check on the Companion device that the frame and module are recognized?
If you are referring to the phone, I think there are no other phones that have been connected to the module. I suppose they are recognized, as I have the image of the Quest 3 and I can see the eye cameras working.
Thanks for sharing the additional information. Kindly note that on Neon XR the scene camera is disabled, as it would be blocked anyway, and therefore the Monitor app may not work. May I ask what you are trying to achieve through the Monitor app?
Actually nothing much, I wanted to check all the options available in it. The previous user of the module I have told me that I should see something like "device connected. Streaming data", and I was wondering if what I'm seeing right now is correct or not.
Hello, this might be a really stupid and basic question:
If I want to read data in Unity with Neon XR, is there a library to check names and references? Whenever I start a simple project importing Neon XR, there seems to be no data being logged besides the correct connection with the module and 200 messages processed.
Are they the same as in the Python API?
Thanks
Hi @user-7afd77 , may I ask what you mean by "names and references"? Do you mean a function that gives you the current gaze data?
To clarify, Neon XR's data is not logged automatically. You have to run a function that collects the data and saves it to a file of your choice.
Yup, I'm trying to collect data and use it at runtime (and save it as well). Do I just check the script files in the Neon XR Scripts folder? Is there no documentation for them?
Our fork of the MRTK Template Project acts as examples of how to use Neon XR in the context of that system. The general principles are the same for other SDKs and approaches.
Essentially, you just need to monitor the gazeDataReady Unity event via an OnGazeDataReady method, as listed in Step 7 at that link.
See here for a code example. It takes a GazeDataProvider object, which can be an instance of the NeonGazeDataProvider class from the Neon XR Core Package.
Let us know if you have more questions.
Sorry for the late reply and thank you for the answer. Worked like a charm, thank you.
hi @user-d407c1 Sorry for the delayed reply.
The error we discussed earlier has been resolved on the dev branch. However, there’s still an issue where the screen appears completely black on the first run, even though it works fine from the second run onward.
Hi @user-740e46 ! No worries, would you mind refreshing me, what were you trying to achieve? Were you using neon-vr-recorder?
Heyo! The SDK seems to be using MRTK3 under the hood. Can it also be used with Unity's XRIT's gaze system?
Hi @user-5f2716 , the Neon XR Core Package uses neither MRTK3 nor XRIT. It is completely independent of both.
We simply provide a fork of the MRTK3 Template Project as one example of how to integrate the Core Package with an SDK. You could equally as well integrate with XRIT, Meta's SDKs, etc.
Let us know if that clears things up.
Ah thanks! So no one has made a way to use the XRIT gaze provider with it yet?
Since the MRTK3 is built on and extends the XRIT, one could say that has already been effectively accomplished. Having said that, I do not know of an explicit integration of the Neon XR Core Package with the XRIT Gaze Provider, but the principles would be the same. Please see this message for more details: https://discord.com/channels/285728493612957698/1187405314908233768/1433093616883994685