🤿 neon-xr


user-a9fbdb 01 October, 2025, 10:11:37

Hi everyone, have you ever tried to view/save the eye data at a specific time after the user interacts with a Unity object?

user-a9fbdb 01 October, 2025, 10:14:17

I think it could be done by triggering Neon XR to send Unity the data from that instant.

user-f43a29 01 October, 2025, 10:16:42

Hi @user-a9fbdb , please see my message here, which I think is what you are looking for: https://discord.com/channels/285728493612957698/1420380056475402241/1422889199039811594

user-59eb77 02 October, 2025, 18:54:30

Hi, I have a question about Neon XR for the Meta Quest 3. 1) Is it possible to stream scene video data, gaze data, and IMU data from Neon XR? 2) Is it possible to stream that data directly to a desktop, without going through the Companion app?

user-f43a29 03 October, 2025, 09:38:17

Hi @user-59eb77 , do you mean you want to stream the VR video with a gaze overlay?

May I ask what your use case is?

Currently, you can stream the gaze + eyestate data via Neon XR. If you want the other data, you can have a Python script connect in parallel and stream the other data.

To answer your other question, you need to use the Companion app to stream to a desktop.

user-a9fbdb 03 October, 2025, 06:15:29

Hi all, I couldn't find this calibration information in the docs, so I'm asking here: is it essential to enter the correct inter-eye distance in the wearer profile in the Companion app? How do I precisely measure that distance? And what impact would a 5 mm error in that measurement have? Thank you!

user-4c21e5 03 October, 2025, 09:31:14

Hi @user-a9fbdb 👋. Entering the wearer IPD will improve the accuracy of the pupillometry measurements. So if you're doing pupillometry specifically, we'd recommend it. It's not explicitly required, though. If you don't enter it, the population average (63 mm) is used as the default. There are various ways to measure IPD, the simplest being with a ruler. Measurement error from doing it this way is unlikely to have any significant impact.

user-3c26e4 06 October, 2025, 16:27:42

Hi, I bought the head strap together with the Neon glasses "Better Safe Than Sorry", and I was expecting that the strap would be extendable, so that it can be loosened and tightened. Unfortunately this is not the case, and I would like to ask how I can use it on different subjects.

user-c2d375 07 October, 2025, 06:35:28

Hi @user-3c26e4 👋🏻 The head strap compatible with the Better Safe Than Sorry frame is extendable. Have you tried pulling the tabs on each side to adjust the fit? This should let you easily loosen or tighten it to suit different wearers.

user-59eb77 07 October, 2025, 15:44:03

I want to run an AI model in real time on the streamed video and gaze data. If I use a Python script, would the data be streamed from the Companion app? If I connect the desktop directly to the Neon XR, is it still possible to stream to the desktop?

user-f43a29 07 October, 2025, 16:39:09

Hi @user-59eb77 , gaze data is streamed from the Neon Companion app, but the VR scene is not streamed that way. May I clarify if you are using a VR or an AR setup?

With respect to streaming to the desktop, you can stream VR video via direct connection using our neon-vr-recorder tool with gaze overlaid.

However, if you need the raw gaze data, then you need to either:

  • Stream it from the Neon Companion app via the Real-time API with Python. This can also be done over Ethernet cable to the desktop.
  • Stream it to Neon XR, which automatically transforms that data to VR coordinates, and then stream that data from the headset to the desktop.

user-af95e6 07 October, 2025, 21:16:30

I'm using the neon-vr-recorder tool; I can see the stream from the VR headset, but without gaze data.

Chat image

user-d407c1 08 October, 2025, 06:54:10

Hi @user-af95e6 👋 Are you running it tethered or wirelessly?

Could you double-check that developer mode and USB debugging are enabled, and if on wireless mode, that you used the correct adb connect <IP> command as described in our README?

user-740e46 08 October, 2025, 04:09:27

Hello, this is my first time working with eye tracking. The device I’m using is Neon XR. What I want to do is something like what you can see in these examples:

https://www.youtube.com/watch?v=OW5EOqqQ_qs https://x.com/pupil_labs/status/1835946367715528726

In other words, I’d like to record what the user sees in VR and overlay a dot showing where they are looking — for example, like the red dot shown in the video. Specifically, when using VRChat, I want to visualize what objects or areas the user is focusing on, either in real time or by overlaying the gaze point later using the recorded data.

I also plan to use LSL (Lab Streaming Layer) to synchronize the gaze data with other information, so that I can later review what happened at that moment together with other logs.

From what I’ve read, it seems that if the goal is to build my own Unity application, the “MRTK3 Template Project” or “The Neon XR Core Package” mentioned on this page: https://docs.pupil-labs.com/neon/neon-xr/MRTK3-template-project/

might not be suitable for my purpose.

There is also something called the “Real-Time API”, which might be what I should use. In addition, this page: https://docs.pupil-labs.com/neon/neon-xr/

mentions a simpler method that uses a smartphone app to get gaze data.

Could you please tell me which method is appropriate if I want to achieve what is shown in the videos above? Also, I’m not sure about how calibration works. The “MRTK3 Template Project” mentions calibration, but other pages don’t provide that information. For the method you recommend, could you please explain whether calibration is necessary and how to perform it?

user-d407c1 08 October, 2025, 08:07:01

Hi @user-740e46 👋 ! Let’s go over how NeonXR works and which setup best fits your goal.

If you plan to use MRTK3 or the NeonXR Unity template, that means you’ll be building your own Unity application. In that case, you can import our packages directly and access gaze data (as well as other metrics like pupil size, eyelid aperture, etc.) inside your app, and interact with your VR setup.

However, if you want to overlay gaze on third-party content, such as VRChat or Microsoft Flight Simulator, where you don’t have access to the source code, the best tool is NeonVR Recorder.
This application records the VR display and overlays gaze data in real time, just like in the videos you shared.

Regarding calibration, Neon XR performs a mount calibration, which aligns the Neon module's position relative to the VR display. Once that's done, the system uses calibration-free gaze estimation, so there's no need for per-user calibration each session. Also worth noting that a Meta Quest 3 mount calibration is included by default, so if you use that headset, you might not need to perform it.

A few extra notes:
- Neon always needs to be connected to the Companion Device, which handles power, storage, and computation.
- VR data is not streamed to LSL by default.

I’d suggest starting with the basics:
1. Try the Real-Time API to stream gaze data.
2. Experiment with Lab Streaming Layer (LSL) for event synchronization.
3. Once comfortable, move to NeonXR in Unity (MRTK3 examples), and finally try the VR Recorder.

That workflow will give you a solid understanding of how each tool fits together and what adjustments might best suit your experimental setup.
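
Step 1 of that workflow can be sketched in a few lines of Python. This is a minimal sketch, assuming `pupil-labs-realtime-api` is installed and a Neon Companion device is discoverable on the local network; `gaze_to_row` and `stream_gaze` are illustrative names, not part of the library:

```python
import time


def gaze_to_row(timestamp_s, x, y):
    """Format one gaze sample as a CSV-style row (timestamp in s, gaze in px)."""
    return f"{timestamp_s:.6f},{x:.2f},{y:.2f}"


def stream_gaze(duration_s=5.0):
    """Collect gaze samples for duration_s seconds from the first Neon found.

    Requires the Neon Companion app running on the same network and:
        pip install pupil-labs-realtime-api
    """
    # Imported here so the helper above stays usable without the library.
    from pupil_labs.realtime_api.simple import discover_one_device

    device = discover_one_device()
    rows = []
    t_end = time.time() + duration_s
    while time.time() < t_end:
        g = device.receive_gaze_datum()  # blocks until the next sample arrives
        rows.append(gaze_to_row(g.timestamp_unix_seconds, g.x, g.y))
    device.close()
    return rows
```

From there, the collected rows can be written to a CSV file or pushed into an LSL outlet for step 2.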

user-59eb77 08 October, 2025, 17:10:12

For both AR and VR. I want to use the Quest 3 for a VR setup, but also passthrough mode for AR.

When using neon-vr-recorder tool, do I need to use Neon Companion app, or just directly connect the desktop to Neon XR?

user-d407c1 09 October, 2025, 09:41:03

Hi @user-59eb77 👋! You’ll still need to have Neon connected to the Companion Device, that’s where the Companion App runs the inference and sets up the API connection. Connecting directly won't work.

user-af95e6 09 October, 2025, 09:17:35

Is there any way we can get data like pupil diameter, blinks, and eyelid closure with the NeonVR Recorder?

user-d407c1 09 October, 2025, 09:43:23

Hi @user-af95e6 👋 The NeonVR Recorder is mainly a tool for overlaying gaze on top of the VR display content.

All the underlying data, like pupil size, blinks, and other metrics, is still stored on the Companion Device if you record, or can be streamed through the API for you to use however you need.

user-af95e6 09 October, 2025, 09:47:09

But we can pull the gaze data from the Companion Device, so why not the other data?

user-d407c1 09 October, 2025, 09:51:45

Yes, you can pull the rest of the data through the realtime api https://docs.pupil-labs.com/neon/real-time-api/

user-af95e6 09 October, 2025, 09:54:36

But it won't be synchronised with the overlay image from the NeonVR Recorder.

user-d407c1 09 October, 2025, 10:02:33

To clarify, neon-vr-recorder is simply a utility that uses the Realtime API together with scrcpy to generate a video with a gaze overlay.

The code is fully open-source and quite minimal. This means you can easily modify it, for example, to store the vr video frame timestamps, apply offset estimation, and create a synced data stream (e.g. vr_video_timestamps.csv) that you can correlate with the rest of the data.

If that sounds a bit too complex to set up on your own, you can also consider our consultancy services, we can help tailor it to your workflow.
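
To illustrate the synchronization idea: given the stored VR frame timestamps and timestamped gaze samples on the same clock, each frame can be matched to its nearest gaze sample and written out as a synced table. This is an illustrative sketch, not code from neon-vr-recorder:

```python
import bisect


def match_nearest(frame_ts, gaze):
    """For each frame timestamp, find the gaze sample closest in time.

    frame_ts: sorted list of frame timestamps (seconds)
    gaze: non-empty, sorted list of (timestamp, x, y) tuples on the same clock
    Returns (frame_ts, gaze_ts, x, y) rows, suitable for a
    vr_video_timestamps.csv-style output.
    """
    gaze_ts = [g[0] for g in gaze]
    rows = []
    for t in frame_ts:
        i = bisect.bisect_left(gaze_ts, t)
        # Candidates: the sample just before and the sample at/after t.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(gaze)]
        j = min(candidates, key=lambda k: abs(gaze_ts[k] - t))
        ts, x, y = gaze[j]
        rows.append((t, ts, x, y))
    return rows
```

The resulting rows can then be written out with the `csv` module and correlated with the rest of the recorded data.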

user-740e46 09 October, 2025, 13:59:39

Thank you @user-d407c1

I tried neon-vr-recorder, but when I ran record.py, I got this error: Raw gaze data has unexpected length: … KeyError: 89

After I ran pip install --upgrade pupil-labs-realtime-api, that issue was fixed and the gaze circle started appearing.

However, there’s still another problem: the screen is often completely black, or it flickers between the video and black frames. Sometimes it turns dark halfway through the recording.

If that issue didn’t happen, it would basically be working fine. I already tried changing bitrate and fps, but it didn’t help.

Are there any known cases or fixes for this?

Chat image

user-f43a29 13 October, 2025, 07:50:54

Hi @user-740e46 , a tip from the team:

  • Make sure to wake up the VR headset and wait a few seconds before starting neon-vr-recorder.

You can additionally try the updated version in the latest commit of the dev branch and let us know how that goes.

user-f43a29 09 October, 2025, 15:53:08

Hi @user-740e46 , are you seeing the flicker when in the Meta Horizon Home Screen?

If so, this is a known issue with the latest Meta OS, but there is a solution. Simply start a third party app, like our MRTK3 Template Examples, and then the flickering in neon-vr-recorder should stop. Basically, when you open Meta Horizon menus or you are in the Meta Horizon Home Environment, then the flickering occurs.

Just make sure to pass the “-s” flag to record.py.

user-8fc524 14 October, 2025, 09:49:21

Hello, I purchased the NeonXR mount for the Quest 3, and I read in the documentation that it can't record the virtual environment seen through the headset. If I record the virtual scene using Unity, how can I synchronize that video with the data from Pupil Cloud?

user-f43a29 14 October, 2025, 10:35:16

Hi @user-8fc524 , we offer the neon-vr-recorder tool for receiving a synced video with gaze overlaid.

Note that the data on Pupil Cloud are not saved in VR coordinates. The transformation to VR coordinates happens in real-time on the headset via the Neon XR Core Package.

So, if you have your own routines in place already, then you can simply save the streamed VR data for each Unity frame. Depending on the level of synchronization you need, you might need to measure the average transmission latency and account for that.

Let us know if that clears things up.
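
The latency accounting mentioned above can be as simple as averaging paired send/receive timestamps and shifting the received data back by that amount. A minimal sketch, assuming both sides are already on synchronized clocks (the function names are illustrative):

```python
def estimate_latency(send_ts, recv_ts):
    """Average one-way transmission latency from paired send/receive timestamps.

    Both lists must be in seconds on synchronized clocks.
    """
    if len(send_ts) != len(recv_ts) or not send_ts:
        raise ValueError("need equal-length, non-empty timestamp lists")
    return sum(r - s for s, r in zip(send_ts, recv_ts)) / len(send_ts)


def correct_timestamps(recv_ts, latency):
    """Shift receive-side timestamps back by the estimated latency."""
    return [t - latency for t in recv_ts]
```

For tighter needs, the same pairs also give you the jitter (spread of `r - s`), which tells you how trustworthy the average correction is.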

user-8fc524 14 October, 2025, 10:54:47

So I don’t need to use Pupil Cloud. I just have to import the neon-vr-recorder package into my Unity project and start recording. Can I also save a CSV file along with the video? 😊

user-f43a29 14 October, 2025, 10:56:17

Hi @user-8fc524 , the neon-vr-recorder tool is not a Unity package, but rather a Python script that you run on a separate computer.

You can save a CSV file with the data in VR coordinates, but you would need to write some Unity code to generate that for you. Or, add some Python code to the recorder tool to generate that. Feel free to post these as 💡 features-requests , so that they can be passed onto the team for review.

In case you are looking for a turnkey solution, you can also check out the software provided by SilicoLabs, which integrates with Neon XR.

user-8fc524 14 October, 2025, 11:33:51

Okeey, thank you. I will try that

user-f43a29 14 October, 2025, 11:34:06

No problem. Feel free to post your questions as you proceed.

user-59eb77 16 October, 2025, 14:25:48

I have a question about NeonXR.

1) Should the neon-vr-recorder tool be installed on the VR headset, or on the Companion app? If it's the VR headset, does it record video frames from the headset together with the gaze data transferred from the Neon XR (Neon XR -> headset -> desktop)?

2) Does it offer real-time streaming, or only recording?

user-d407c1 16 October, 2025, 14:52:10

Hi @user-59eb77 👋 ! Those are two completely different utilities 👇

  • NeonXR is a Unity3D integration that uses our Realtime API. It streams all data, gaze, pupil size, eye openness, etc. directly into Unity, so you can use it live inside your own application with the provided prefabs. We also provide several MRTK3 examples.

  • NeonVR Recorder, on the other hand, is a separate Python-based tool built with scrcpy. It screen-captures the VR headset display as a single view and overlays the gaze data onto that video. It's mainly meant for recording, not for real-time rendering inside Unity.

user-59eb77 16 October, 2025, 14:57:24

Thank you for your quick response! What I meant by real-time streaming is streaming from the headset to the desktop, like a server.

1) So you mean NeonVR recorder should be built on the headset, and it only offers recorded images, not streaming, right?

2) Additionally, if I use passthrough mode in Meta Quest 3 for AR, does NeonVR recorder still work?

user-d407c1 16 October, 2025, 15:07:02

So you mean NeonVR recorder should be built on the headset, and it only offers recorded images, not streaming, right?

The neon-vr-recorder tool, or for that matter scrcpy, uses adb to capture the screen and stream it to your computer. It can be used wirelessly or tethered over USB. Our Real-time API is then used to gather data from Neon, overlay the gaze position on the content streamed from the headset, and record it as a new video.

The advantage is that you can use it with 3rd party content where you do not have access to the code.

Additionally, if I use passthrough mode in Meta Quest 3 for AR, does NeonVR recorder still work?

Yes

user-8fc524 21 October, 2025, 08:30:00

Hello! I tried to run neon-vr-recorder, but I’m getting some errors. I don’t understand why this is happening or how to fix it — do you have any idea? 😅

I also have another question: is it possible to save gaze data to know how long the user is looking at an object? I need to activate the video recorder when the user “activates” a button after looking at something for 5 seconds.

Chat image

user-f43a29 21 October, 2025, 08:39:45

If you want to save the gaze data in 3D VR coordinates manually, then you can add a routine to your Unity app that saves the streamed data provided by the Neon XR package (see steps 6 and 7). As a reference, you can see how it is done for the examples in our fork of the MRTK3 project.

user-f43a29 21 October, 2025, 08:37:19

Hi @user-8fc524 , can you try explicitly passing the IP address of Neon to the command:

python record.py -i IP_ADDRESS

You can find the IP address in the Stream section of the Neon Companion app.

Also, we would recommend letting the neon-vr-recorder tool run for the whole experiment session. Having it activate for each 5-second segment will be difficult, since it takes a moment for the tool to start every time and the start time is non-deterministic.

To know when they look at objects, you can use Events, although you will need to add some Unity code to handle that. If you would like a turn-key solution for all of this, be sure to check out SilicoLabs' experiment builder software that integrates with Neon.
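
For reference, the 5-second dwell-activation logic itself is a small state machine. Here it is sketched in Python with illustrative names (in Unity the same structure would live in a script's Update(), fed by gaze raycast hits):

```python
class DwellDetector:
    """Fires once when gaze stays on the same target for dwell_s seconds."""

    def __init__(self, dwell_s=5.0):
        self.dwell_s = dwell_s
        self.target = None   # target currently under gaze (None = nothing)
        self.start = None    # time the gaze first landed on self.target
        self.fired = False   # whether we already fired for this target

    def update(self, target, t):
        """Feed the currently gazed-at target (or None) at time t (seconds).

        Returns the target the first time the dwell threshold is crossed,
        otherwise None.
        """
        if target != self.target:
            # Gaze moved to a new target (or off all targets): restart timer.
            self.target, self.start, self.fired = target, t, False
            return None
        if target is not None and not self.fired and t - self.start >= self.dwell_s:
            self.fired = True
            return target
        return None
```

When `update` returns a target, that is the moment to send an Event (and/or trigger whatever the button activates).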

user-8fc524 21 October, 2025, 08:44:37

Okay, I was thinking about handling it with events, but I’m not sure how to create one or store it in the Neon Companion App. I’ll read all the documentation you send me and give it a try. Thank you!!

user-f43a29 21 October, 2025, 08:46:52

You are welcome. The links above describe the general process and how you do it with Neon's Real-time API. To do it from Unity, you will want to also check their documentation for sending HTTP Post requests.
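
As a reference for the HTTP request itself, here is a minimal Python sketch. It assumes the Companion app's REST endpoint is `http://<phone-ip>:8080/api/event` accepting a JSON body with `name` and an optional `timestamp` (check the Real-time API docs for the current shape); the same POST is easy to reproduce from Unity with UnityWebRequest:

```python
import json
from urllib import request


def event_payload(name, timestamp_unix_ns=None):
    """Build the JSON body for an event; timestamp is optional (the
    Companion app timestamps the event on receipt if omitted)."""
    body = {"name": name}
    if timestamp_unix_ns is not None:
        body["timestamp"] = timestamp_unix_ns
    return json.dumps(body).encode()


def send_event(phone_ip, name):
    """POST an event to the Neon Companion app's REST API (assumed endpoint)."""
    req = request.Request(
        f"http://{phone_ip}:8080/api/event",
        data=event_payload(name),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req, timeout=2) as resp:
        return resp.status
```

Events sent this way are saved alongside the recording on the Companion Device and appear in the exported data.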

user-ed9dfb 22 October, 2025, 12:47:57

Hey Pupil Labs, I recently updated the Neon XR Core Package "udp" branch to commit fb5a097d82e0d14d2dd1abd10c767086c0b39491 (Aug 5) and it works fine on my work PC. However, on the experiment PC I get a DLLNotFoundException (see screenshot) when I try to connect with the Neon (using the recommended Anker Ethernet hub and a 2nd network card). When I revert back to commit a0ea64cf2a7e8578b856dde614ab57010d7cbe6c (Apr 9), it works again (on both the work PC and the experiment PC). The commit before that works on both PCs, and the commit after that works only on my work PC. Both PCs are on Windows 11 (maybe not the exact same version) and I'm using Unity 2022.3.62f2. The Companion app is on version 2.9.18-prod on a Moto Edge 40 Pro.

I tried deleting/regenerating the Library folder on the experiment PC, and also using the Neon XR Core Package in a new, empty Unity project; both give me the same error on the experiment PC after that specific commit.

Both PCs are managed by our system admin. A different user account is used on each, but both have local admin rights.

Any idea how I can solve the error? Let me know if you need more details.

Chat image

user-f43a29 22 October, 2025, 13:51:10

Hi @user-ed9dfb , that message indicates that the pl-rtsp-service DLL is missing or not being found. Can you try reinstalling the plugin fresh on that machine?

user-7afd77 22 October, 2025, 13:26:55

Hello all, this question has probably been answered already, but I am unsure whether I understood correctly.

I am using a Quest 3 with the NeonXR mount and planning to do some experiments on Unity using the Neon XR Core Package.

  • To properly record both the Unity app with the overlaid gaze and the eye data (i.e. axis coordinates), I should use neon-vr-recorder to record the Unity application screen, and then either the Python script or the Core Package in Unity to record the eye data? If I got it correctly, Neon data is available in Unity in real time, so saving all the eye data from Unity should be doable?

  • Given all this, for this scenario the Neon Companion App is not strictly needed for recording, but of course for computing and storing, right? Since the scene recording will happen via neon-vr-recorder (while I may still use the app to record the eyes).

Sorry for the repetitive questions, but as I'm working on my thesis, I want to avoid huge mistakes, and knowing some solutions beforehand would be helpful!

Thank you in advance 🙂

user-f43a29 22 October, 2025, 13:45:01

Hi @user-7afd77 , no problem and essentially correct. Let me clarify:

  • Neon connects to the Neon Companion app, running on the provided Android phone, where the gaze estimation pipeline runs.
  • The Neon Companion app then streams this data over the network to the Unity application via the Neon XR package.
  • The Neon XR package automatically converts the streamed gaze data to VR coordinates upon receipt.
  • You can then add a Unity routine to save the real-time VR gaze data to disk.
  • Running a recording on the phone would rather save the data in coordinates of Neon's scene camera, which can be useful to have as a backup.
  • Similarly, saving the real-time data via a separate Python script provides the data in scene camera coordinates by default, not VR coordinates.

The neon-vr-recorder tool allows you to save the Unity scene playback with gaze automatically overlaid for visualisation, review, and qualitative analysis purposes.

If you would like a turnkey solution for all of this, you can also check out SilicoLabs' software, which integrates with Neon.
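
The "Unity routine to save the real-time VR gaze data" can follow a simple pattern: the streaming callback stores the newest sample, and a per-frame hook writes one row per rendered frame (decoupling the ~200 Hz gaze rate from the frame rate). Sketched here in Python with illustrative names; in Unity this would be an OnGazeDataReady handler plus Update():

```python
import csv
import io


class PerFrameGazeLogger:
    """Keep the most recent streamed gaze sample and log one row per frame."""

    def __init__(self, fileobj):
        self.latest = None
        self.writer = csv.writer(fileobj)
        self.writer.writerow(["frame", "gaze_ts", "x", "y", "z"])

    def on_gaze(self, ts, x, y, z):
        # Called from the streaming callback whenever a sample arrives.
        self.latest = (ts, x, y, z)

    def on_frame(self, frame_idx):
        # Called once per rendered frame; logs the newest sample seen so far.
        if self.latest is not None:
            self.writer.writerow([frame_idx, *self.latest])
```

Buffering only the latest sample keeps the per-frame cost constant; if you need every gaze sample rather than one per frame, log in `on_gaze` instead.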

user-7afd77 22 October, 2025, 13:56:35

Thank you, that clarifies a lot!

I am now trying to run a recording using neon-vr-recorder, but it fails to locate a module: ModuleNotFoundError: No module named 'pupil_labs.neon_recording.stream'. I followed the README instructions on the GitHub repository and correctly installed the packages from requirements.txt.

Did I do something wrong? Or is there some discrepancy between the package versions?

user-d407c1 22 October, 2025, 13:59:18

Hi @user-7afd77 ! The library pl-neon-recording was updated today, and this tool hasn't yet been updated. Could you run pip install "pupil-labs-neon-recording<2.0" (quoted so the shell doesn't treat < as a redirect)?

user-7afd77 22 October, 2025, 13:59:42

Sure, I was trying with version 1.4.2, let me try again with that command

user-d407c1 22 October, 2025, 14:09:48

My colleague @user-cdcab0 has already updated pl-realtime, so you can simply pip install -U pupil-labs-realtime-api to get 1.7.3.

user-7afd77 22 October, 2025, 14:19:42

Correctly installed with your command, thanks.

I am having trouble: record.py does not recognize my headset IP (it comes back as None), and even specifying it manually won't make it work.

user-f43a29 22 October, 2025, 14:21:59

Hi @user-7afd77 , do you mean it does not recognize Neon's IP?

user-7afd77 22 October, 2025, 14:22:36

Exactly. I even tried with python record.py -i IP_ADDRESS, as you suggested in a previous message. I took the IP Address from the Companion App

user-3e88a5 22 October, 2025, 15:24:51

Hi, I'm using the Neon Module with the Meta Quest 3, but when I make a recording the scene camera is always grey, and the audio is not recorded even though the option to record it is enabled. How can I solve this?

user-f43a29 22 October, 2025, 15:27:45

Hi @user-3e88a5 , the scene camera is not used in Neon XR, because it is blocked anyway when placed inside the VR headset. So, the gray frames there are intentional.

If you want to record the video of the VR scene, then check out our neon-vr-recorder tool.

I will relay the request to record audio in Neon XR scenarios to the team. Feel free to also post it as a 💡 features-requests .

user-3e88a5 22 October, 2025, 15:35:59

Thank you a lot! I didn't want to use the neon-vr-recorder tool because I'm also using other sensors, I have everything connected over WiFi, and I don't want to overload the network. And I can't use it wired, because I'm doing tests with a subject who is walking with an exoskeleton.

user-f43a29 22 October, 2025, 15:39:02

The default is for the VR video to be sent over a Quest cable, not over the network. Only the gaze data would additionally be sent over the network, which should not be much extra load.

user-3e88a5 22 October, 2025, 15:43:08

Yes yes, but I can't connect the Quest to a PC because I need to let the subject walk freely. Thank you for your answer!

user-f43a29 22 October, 2025, 15:47:56

In your Unity application, you can save the person's position and gaze data in the VR scene over time and then re-render the scene post-hoc with that trajectory to produce a video for playback.

user-ed9dfb 22 October, 2025, 15:48:01

Hey @user-f43a29 , I checked on the experiment PC and the DLL is there in the new Unity project with the freshly installed Neon Core XR Package. I saw you earlier wrote (deleted message?) asking if I had made a build on the experiment PC, which I probably haven't on the most recent commit, so I'm trying that now. I'm also now testing on a 3rd PC.

Chat image

user-f43a29 22 October, 2025, 15:49:18

Yes, I edited the message. You should not need to build the DLL, as it is provided by default in the udp branch of the Neon XR Core Package. Apologies for any confusion.

user-3e88a5 22 October, 2025, 15:55:54

Yes, but actually I'm mostly interested in the audio, which is not recorded if the scene camera does not record anything. Instead of recording audio from the Pupil device, I will add audio recording in the Quest application, no problem.

user-f43a29 23 October, 2025, 07:40:26

Since Neon's microphones are in the module, which is encapsulated in the VR headset in this case, you will probably record clearer audio by using Quest's microphones.

user-7afd77 27 October, 2025, 13:52:34

Hello again. I was wondering why neon.local:8080 always shows this, even though the module is connected to the phone.

user-d407c1 27 October, 2025, 13:56:38

Hi @user-7afd77 ! Do you have more than one device? Could it be that another Companion device with the module disconnected is on the same network? If not, could you check on the Companion device that the frame and module are recognized?

user-7afd77 27 October, 2025, 13:52:53

Chat image

user-7afd77 27 October, 2025, 14:01:07

If you are referring to the phone, I think no other phones have been connected to the module. I suppose they are recognized, as I have the image of the Quest 3 and I can see the eye cameras working.

user-d407c1 27 October, 2025, 14:03:49

Thanks for sharing the additional information. Kindly note that on Neon XR the scene camera is disabled, as it would be blocked anyway, and therefore the Monitor app may not work. May I ask what you are trying to achieve through the Monitor app?

user-7afd77 27 October, 2025, 14:05:33

Actually nothing much; I wanted to check all the options available in it. The previous user of this module told me that I should see something like "device connected. Streaming data", and I was wondering if what I'm seeing right now is correct or not.

user-7afd77 29 October, 2025, 13:43:37

Hello, this might be a really stupid and basic question:

If I want to read data in Unity with Neon XR, is there a library to check names and references? Whenever I start a simple project importing Neon XR, there seems to be no data being logged besides the correct connection with the module and 200 messages processed.

Are they the same from the python API?

Thanks

user-f43a29 29 October, 2025, 13:47:19

Hi @user-7afd77 , may I ask what you mean by "names and references"? Do you mean a function that gives you the current gaze data?

To clarify, Neon XR's data is not logged automatically. You have to run a function that collects the data and save it to a file of your choice.

user-7afd77 29 October, 2025, 13:51:02

Yup, I'm trying to collect data and use it at runtime (and save it as well). Do I just check the script files in the Neon XR Scripts folder? Is there no "documentation" for them?

user-f43a29 29 October, 2025, 14:02:23

Our fork of the MRTK Template Project acts as examples of how to use Neon XR in the context of that system. The general principles are the same for other SDKs and approaches.

Essentially, you just need to monitor the gazeDataReady Unity event via an OnGazeDataReady method, as listed in Step 7 at that link.

See here for a code example. It takes a GazeDataProvider object, which can be an instance of the NeonGazeDataProvider class from the Neon XR Core Package.

Let us know if you have more questions.

user-7afd77 29 October, 2025, 14:36:22

Sorry for the late reply and thank you for the answer. Worked like a charm, thank you.

user-740e46 30 October, 2025, 09:28:29

hi @user-d407c1 Sorry for the delayed reply.

The error we discussed earlier has been resolved on the dev branch. However, there’s still an issue where the screen appears completely black on the first run, even though it works fine from the second run onward.

user-d407c1 30 October, 2025, 10:38:07

Hi @user-740e46 ! No worries, would you mind refreshing me, what were you trying to achieve? Were you using neon-vr-recorder?

user-5f2716 30 October, 2025, 18:40:33

Heyo! The SDK seems to be using MRTK3 under the hood. Can it also be used with Unity's XRIT's gaze system?

user-f43a29 31 October, 2025, 08:28:43

Hi @user-5f2716 , the Neon XR Core Package uses neither MRTK3 nor XRIT. It is completely independent of both.

We simply provide a fork of the MRTK3 Template Project as one example of how to integrate the Core Package with an SDK. You could equally as well integrate with XRIT, Meta's SDKs, etc.

Let us know if that clears things up.

user-5f2716 31 October, 2025, 09:40:01

Ah thanks! So no one made a way yet to use the XRIT gaze provider with it yet?

user-f43a29 31 October, 2025, 10:16:46

Since the MRTK3 is built on and extends the XRIT, one could say that has already been effectively accomplished. Having said that, I do not know of an explicit integration of the Neon XR Core Package with the XRIT Gaze Provider, but the principles would be the same. Please see this message for more details: https://discord.com/channels/285728493612957698/1187405314908233768/1433093616883994685

End of October archive