🤿 neon-xr



user-3e88a5 01 November, 2024, 12:25:17

Hello, I'm working with the Pupil Neon and the Meta Quest 3. I bought the Neon before the adapter for the Meta Quest 3 was released. Now I'm trying to run the examples in the MRTK template project, but they're not working. Are there any particular settings I have to set in Unity? In particular, I'm trying to run the PL_Calibration example.

user-f43a29 04 November, 2024, 09:54:06

Hi @user-3e88a5 , may I first confirm the following:

  • Are you using the neon-pico branch of the MRTK repository?
  • What kind of network connection are you using?

May I also ask for more clarification about what exactly is not working? Do you receive an error?

nmt 01 November, 2024, 17:44:47

Hi @user-3e88a5 and @user-386d26! Apologies for the delayed response. Our Product Specialists, who have more experience with Neon XR, will be able to assist you with your setup next week. Thanks for your patience!

user-3e88a5 04 November, 2024, 08:42:39

thank you!

user-f43a29 04 November, 2024, 09:51:49

Hi @user-386d26 , no need to apologize!

  • Participant Calibration: Neon is calibration-free, so there is no calibration step that you need to do at the start of each experimental session. It is deep-learning powered, providing accurate gaze data from the moment you put it on. I recommend checking out our Neon accuracy whitepaper for more information.
  • Mount Calibration: However, whenever you put the Neon module into a new VR headset, you do need to run a one-time mount calibration. This is not the same as an eye-tracking calibration. Rather, it lets the system know the relationship between Neon's coordinate system and the VR coordinate system.
  • Neon data: I recommend taking a look through the Data Collection section of the documentation. When you run a recording, data is stored on the Neon Companion Device (i.e., the phone), and the data can be simultaneously streamed in real time to your Unity computer.
  • Storing data locally: As mentioned, data from all streams are saved on the Companion Device, but gaze will not be saved in VR-space coordinates. Rather, you would collect that data in real time via Neon XR and save it to a file. You want to monitor the gazeDataReady event. Our fork of the MRTK3 Template Project contains examples of monitoring this event (see also the logging sketch after this list).
  • Data on Pupil Cloud: By default, the Neon Companion app also uploads your recordings to Pupil Cloud, which serves as a data backup and analysis platform. Please note that the VR camera feed is not uploaded to Pupil Cloud.
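
As an aside on local logging: outside of Unity, you could also write the same real-time stream to disk with our Python real-time API. The following is only an untested sketch, and it saves gaze in Neon scene-camera coordinates, not VR coordinates; for VR-space gaze you would monitor gazeDataReady in Unity as described above.

```python
# Minimal sketch: log streamed gaze to a CSV file outside of Unity.
# Note: this gaze is in Neon scene-camera coordinates, not VR coordinates.
# Requires: pip install pupil-labs-realtime-api
import csv

from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device()  # find a Neon Companion device on the network

with open("gaze_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp_unix_seconds", "x_px", "y_px"])
    try:
        while True:
            gaze = device.receive_gaze_datum()  # blocks until the next sample
            writer.writerow([gaze.timestamp_unix_seconds, gaze.x, gaze.y])
    except KeyboardInterrupt:
        pass
    finally:
        device.close()
```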

Please note that streaming with Neon XR is meant for interactive use. If you need more thorough data analysis, may I ask what you plan to do exactly?

user-386d26 06 November, 2024, 21:35:05

Thanks for the detailed response! I believe that what I'm looking for now is just storing real-time gaze data locally. I set up the MRTK3 sample project on my machine, but where can I find examples of monitoring the gazeDataReady event? I'm trying to search through the scenes to find any that use the NeonXR object as directed in the NeonXR Package documentation, but I can't seem to make sense of this. Please let me know where I can find examples of this to mimic. Thanks again!

user-3e88a5 04 November, 2024, 10:05:51

I'm using the neon branch of the repository. I tried to start from scratch using just the Neon XR Core package, and in that case I managed to see the ray from the eye in my application, but it wasn't calibrated. So I tried to use the MRTK sample for the calibration; there were some settings that I fixed in Unity in order to make the application build and run on the Meta Quest 3. I can run the application, but something isn't working, since if I pinch while looking at some markers, nothing happens. I'm using my personal network connection.

user-f43a29 04 November, 2024, 11:59:14

Hi @user-3e88a5 , a new version of our fork of the MRTK3 Toolkit was just pushed to the neon branch.

Could you update to that and let us know how it works out?

Also, you might need to hold your hand a bit further in front of your face when doing the pinch gestures.

user-f43a29 04 November, 2024, 10:31:38

I see. Are you running this with the Quest Link or with SteamVR running at the same time?

user-3e88a5 04 November, 2024, 10:06:41

In a few minutes I'll send you some screenshots of the project settings in Unity

user-3e88a5 04 November, 2024, 10:31:34

I disabled Google ARCore, otherwise I cannot use Vulkan as the graphics API, and that's required by Meta Quest support

Chat image

user-3e88a5 04 November, 2024, 10:32:00

that's the message if I enable Google ARCore

Chat image

user-3e88a5 04 November, 2024, 10:32:23

No, I'm not using Quest Link; I'm deploying the app as a standalone app on the Meta Quest 3

user-f43a29 04 November, 2024, 10:43:43

Hi @user-3e88a5 , thanks for all of this info. Let me confer with the team and get back to you.

user-3e88a5 04 November, 2024, 10:33:37

Also, in the calibration example I can't see anything that references the Neon gaze data provider. Is that normal?

user-3e88a5 04 November, 2024, 10:44:56

ok, thank you a lot for your time!

user-3e88a5 04 November, 2024, 12:57:14

now I receive this error

Chat image

user-f43a29 04 November, 2024, 15:25:46

Hi @user-3e88a5 , this could indicate that your system ran out of RAM or hard-drive space during the build. Could you check that and then try again from a clean, fresh state?

user-3e88a5 04 November, 2024, 12:57:58

that's the message in the console

Chat image

user-3e88a5 04 November, 2024, 15:55:05

OK, I'll try tomorrow, but this only happens if I build the project with Unity 2021. If I migrate to Unity 2022 it doesn't happen, but then I have the same problem I told you about before

user-f43a29 04 November, 2024, 16:06:36

Ok. You had mentioned that you had changed other settings in Unity. Could you list what they were?

user-3e88a5 05 November, 2024, 08:19:56

Well, I'm referring to the project settings, specifically regarding Google ARCore. The thing is, if I disable Google ARCore and enable Vulkan graphics (required by the Optimize Buffer Discards setting), the application starts, but then if I pinch, only the center marker explodes and I can't continue. If I keep Google ARCore with the OpenGL ES 3 graphics API, I still have some warnings in the project settings; the program builds and deploys to the Meta Quest 3, but then it doesn't run

user-f43a29 05 November, 2024, 08:21:09

@user-3e88a5 Ok, could you then open a ticket in 🛟 troubleshooting about this?

user-3e88a5 05 November, 2024, 08:21:41

These are the warnings. I know how to solve the one that refers to the speech interactor, but I don't think it's needed. One refers to the deprecation of the Android hand tracking used by OpenXR, and the other one is about the Vulkan graphics thing.

Chat image

user-386d26 06 November, 2024, 22:02:20

Update on this: I am looking through the PL_EyeTrackingBasicSetupExample scene, but I don't see any references to the NeonXR object mentioned in the docs, nor do I see any gazeDataReady event. I think I'm just not looking in the right places.

user-f43a29 07 November, 2024, 10:58:35

Hi @user-386d26 , apologies for being unclear. Let me expand on that:

Note that it is important to do the mount calibration for your Quest headset first if you want your gaze data to be in sensible units for your VR experiment. Also, using only the Neon XR Core Package by itself is meant more for users who plan to integrate with other SDKs or systems.

user-006317 12 November, 2024, 11:21:43

Hi Pupil Labs, I have a time-urgent question, if you can help me. I am planning to use Neon XR with any of the HMDs that fit. I have three questions on this:

  1. How does the data stream work? Does the eye-tracking data stream from the Companion app or from the VR side? (I am using iMotions, so I'm wondering if the data streaming will work just as it does when using the Neon glasses themselves with iMotions.)
  2. How accurate is the eye-tracking data, given that there is an additional screen from the HMD itself?
  3. Can your Neon XR work with the Varjo XR series, like the XR4?

I'd appreciate a quick response, thanks!!

user-f43a29 12 November, 2024, 11:54:41

Hi @user-006317 , we also received your email and replied there. Thanks!

user-f43a29 18 November, 2024, 07:56:40

Hi @user-cad8c8 , yes, it is in principle possible. You will need to receive the real-time streamed data (see here for full details) in your code. The Neon XR code contains an implementation for getting the gaze data.

Essentially, you connect to the RTSP server over WebSockets @ ws://neon-ip-address:8686 (you can also use neon.local), set up the session, and decode the received data. WebSockets will probably be necessary here, when working in the browser.

You could use that as a reference to start. If you need increased accuracy in temporal synchronization, then you may want to add code for collecting & responding to RTCP packets.
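
To make that handshake concrete, here is a rough, untested Python sketch of the first step; the exact RTSP request path is an assumption and may need adjusting (in the browser you would do the equivalent with the JavaScript WebSocket API):

```python
# Rough sketch: open the WebSocket fronting Neon's RTSP server and send an
# RTSP DESCRIBE request. The port comes from the message above; the
# "?camera=gaze" path is an assumption.
# Requires: pip install websockets
import asyncio

import websockets


async def describe(neon_address: str = "neon.local") -> None:
    uri = f"ws://{neon_address}:8686"
    async with websockets.connect(uri) as ws:
        request = (
            f"DESCRIBE rtsp://{neon_address}:8686/?camera=gaze RTSP/1.0\r\n"
            "CSeq: 1\r\n"
            "\r\n"
        )
        await ws.send(request)
        reply = await ws.recv()  # should contain an SDP session description
        print(reply)


asyncio.run(describe())
```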

user-cad8c8 18 November, 2024, 06:30:22

Thanks for this! I am aware of it being available in Unity and able to be exported to WebXR. But I would like to have access to the Pupil Labs gaze data in (a custom?) XR-compatible browser for the Quest 3, so that I can run any WebXR website in this browser and make use of the Pupil Labs eye tracking in browsers.

user-cad8c8 18 November, 2024, 06:31:39

Will this be possible? Some suggestions for development in this area could be really helpful!

user-cad8c8 18 November, 2024, 08:00:07

Great! Thanks for the reference! I'll look into it!

user-f43a29 18 November, 2024, 08:00:34

@user-cad8c8 No problem, but please note that this does not account for mount calibration. This will provide you with Neon's raw gaze signal in scene camera coordinates.

You could run the mount calibration in Neon XR and then re-use the provided values in your code.

user-cad8c8 18 November, 2024, 08:11:57

@user-f43a29 Do you have suggestions regarding translating/mapping scene-camera coordinates to immersive-ar boundary coordinates?

user-f43a29 18 November, 2024, 08:52:53

If you know the coordinate system of your AR simulation and you can easily cast rays in that environment, then you could follow a similar calibration process as used in Neon XR. The end result is a transformation from Neon scene camera coordinates to the coordinate system of your simulator.

If you would rather not extract and modify that code, it might be easier to determine the relationship between the coordinate system of your AR environment and the coordinate system of Unity for the PL_Calibration scene. It'll most likely be a composite scale, translation, and rotation. Then, after you've run our mount calibration, you get the gaze data, convert it temporarily to Unity's system, and then apply the transformation from Unity to your AR system.

It's just an idea and has not been thoroughly tested, so you'll need to test it to see which approach is easiest. And please note that the process might be a bit more complicated with AR.
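
To sketch the composite-transform idea with 4x4 homogeneous matrices (using numpy; every numeric value below is a placeholder you would determine for your own setup):

```python
# Sketch: chain the transforms described above. T_unity_from_neon would come
# from the mount calibration; T_ar_from_unity is the composite scale,
# rotation, and translation you measure between Unity and your AR environment.
import numpy as np

T_unity_from_neon = np.eye(4)  # placeholder: mount calibration result

# Placeholder Unity -> AR transform: uniform scale, rotation about y, translation
s = 1.0
theta = np.deg2rad(90.0)
R = np.array([
    [np.cos(theta),  0.0, np.sin(theta)],
    [0.0,            1.0, 0.0],
    [-np.sin(theta), 0.0, np.cos(theta)],
])
t = np.array([0.0, 1.6, 0.0])

T_ar_from_unity = np.eye(4)
T_ar_from_unity[:3, :3] = s * R
T_ar_from_unity[:3, 3] = t

# Full chain: Neon scene-camera coords -> Unity coords -> AR coords
T_ar_from_neon = T_ar_from_unity @ T_unity_from_neon

gaze_point_neon = np.array([0.1, -0.05, 1.0, 1.0])  # homogeneous, placeholder
print((T_ar_from_neon @ gaze_point_neon)[:3])
```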

user-cad8c8 18 November, 2024, 08:54:12

Thanks for this!

user-9b48d1 18 November, 2024, 14:03:13

Hello there! I would like to include eye tracking in an existing Unity project for the Quest 3. I find the MRTKDevTemplate a bit confusing though, and I have already followed the instructions posted here (https://discord.com/channels/285728493612957698/1187405314908233768/1273433381622911037), except for the last part about the listener, which I still need to implement but do not quite understand how. Do I need to implement an event listener on the XR Rig (the player), which then holds the OnGazeReady function?

user-f43a29 20 November, 2024, 15:25:54

Hi @user-9b48d1 , if you are using the MRTK3 Template Project, then you do not necessarily need to write code to check gazeDataReady yourself. The GazeDataProvider class and its sub-class, NeonGazeDataProvider, from the underlying Neon XR Core package automatically check for this event in a background process. If you are using MRTK3, then it is already using those classes; take a look at the MRTKInputNeon class for an implementation and see this message for more details: https://discord.com/channels/285728493612957698/1187405314908233768/1304037280390053928

Regarding Quest 3, we would recommend first getting your experiment working in Unity and testing with the MRTK3 Input Simulator in the Editor, without the VR device interface. Then, every now and then, build it and load it onto the headset for a more authentic test. May I ask, do you plan to also use Quest Link and SteamVR?

Regarding the Neon XR Core Package, using that directly is meant more for users who want to integrate with other SDKs or systems. In its default state, it will provide you with gaze data in Neon Scene Camera coordinates, but typically you would want gaze data in the coordinate system of your VR simulation. The PL_Calibration scene of the MRTK3 Template Project shows how to get a mount calibration file for this. You will want to do this at least once for your Quest 3.

The MRTK3 Template already bundles the Neon XR Core Package, so you can remove the standalone package from that project if you installed it separately. After following the instructions for Other Platforms, the scenes can be found under Assets/PupilLabs/Scenes, as shown in the attached screenshot. When you double-click on PL_Calibration, the scene should load, as shown in the second screenshot.

Chat image Chat image

user-c9caf9 19 November, 2024, 02:01:42

Hi, I'm trying to calibrate the Neon eye tracker with a Goovis headset; this headset is just an external monitor. I downloaded the neon branch and tried to modify the project so I can run the calibration. Before changing anything in the project, there are two errors in the project: it seems the Gaze Data Provider and Storage are None in the script. How am I supposed to fix this, and how should I connect the Neon to Unity?

Chat image Chat image

user-f43a29 19 November, 2024, 08:52:05

Hi @user-c9caf9 , may I first confirm that you followed the steps here? Thanks!

user-c9caf9 19 November, 2024, 20:55:47

Yes, since I'm trying to do the calibration with another platform, I followed the 'Other Platforms' setup and did all six steps. The second picture shows the current Addressables groups and settings. Could you take a look and tell me if there is anything wrong? Thank you!

Chat image Chat image

user-f43a29 20 November, 2024, 11:13:25

@user-c9caf9 Let me take a look here and check with the team and update you.

user-c9caf9 20 November, 2024, 00:15:46

Hi, I also noticed that I could connect the device to the computer via Ethernet? https://docs.pupil-labs.com/neon/hardware/using-a-usb-hub/ I bought the exact same Anker hub and connected it to the computer, but my computer doesn't identify it. Is there any DHCP or network setting that needs to be done on the phone or computer? I'm using Windows 11, and the phone is the OnePlus 10 that came with the Neon tracker.

user-f43a29 20 November, 2024, 11:12:50

Hi @user-c9caf9 , yes, this is possible. If you are using Windows 11, then check these instructions if you want to directly connect to the Ethernet port of your computer: https://discord.com/channels/285728493612957698/1047111711230009405/1272483345137139732

It will be easier though to connect your Neon and your computer via Ethernet to the same router. Then, you have to do less configuration and you will still have essentially the same low transmission latencies.

user-11c6b0 20 November, 2024, 09:15:25

Hello! I just got the neon package with the quest 3 attachment, is there any guide online on how to put the tracker itself in the glasses? I can't seem to find the leaflet guide

user-f43a29 20 November, 2024, 16:08:54

Hi @user-11c6b0 , are you referring to how to mount the Neon module in the Quest 3 mount, or how to place the mount itself in the Quest 3 headset?

user-c9caf9 20 November, 2024, 11:14:16

Thanks, we have figured out the Ethernet connection, but the Unity problem still exists.

user-f43a29 20 November, 2024, 16:12:12

You can also try going to Window->Asset Management->Addressables->Groups and clicking Update a Previous Build. If you receive an error message, then it is advised to delete that folder and try from step 1 again.

user-f43a29 20 November, 2024, 16:11:02

Hi @user-c9caf9 , you may want to delete the AddressableAssetsData folder and try those steps again. You can find that folder as shown in this screenshot.

Chat image

user-c9caf9 20 November, 2024, 16:14:09

I tried and got a successful build, but when I run the PL_Calibration scene, it still has the same issue

user-f43a29 20 November, 2024, 16:18:51

Are you trying to run the scene via a cable link to the headset? Is Goovis an Android-based headset that can run untethered?

user-f43a29 20 November, 2024, 16:14:58

Are you testing this standalone, or can I just confirm that your Neon is powered up and connected to the same network as that computer?

user-c9caf9 20 November, 2024, 16:14:58

Do I need to change the IP address in the config files so Unity can find the Neon device?

user-f43a29 20 November, 2024, 16:17:09

By default, no, as the autoIP option should be enabled. However, if your network is not supporting automatic device discovery for some reason, then yes, you can try entering the direct IP address, as shown in the Stream section of the app.

user-c9caf9 20 November, 2024, 16:16:00

The device is connected to the OnePlus 10 phone, and the Neon Companion app is running

user-c9caf9 20 November, 2024, 16:19:20

No, that headset is just an external monitor

user-c9caf9 20 November, 2024, 16:20:49

I think the Neon can be found by my computer; I just ran the simple Python code using discover_one_device() and it returns gaze information. For now, I didn't change the IP address in the config file
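
Roughly, the check I ran looks like this (the direct-IP variant is commented out and untested; the address is a placeholder that would come from the Stream section of the app):

```python
# Quick connectivity check with the Python real-time API.
# Requires: pip install pupil-labs-realtime-api
from pupil_labs.realtime_api.simple import Device, discover_one_device

# Automatic discovery (needs a network that allows device discovery / mDNS)
device = discover_one_device(max_search_duration_seconds=10)

# Untested alternative if discovery fails: connect by IP, as shown in the
# Stream section of the Companion app (placeholder address)
# device = Device(address="192.168.1.42", port="8080")

print(device.phone_name, device.phone_ip)
print(device.receive_gaze_datum())  # one gaze sample, scene-camera coordinates
device.close()
```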

user-f43a29 20 November, 2024, 20:50:06

Hi @user-c9caf9 , I have an update from the team:

  • If you want to directly start the PL_Calibration scene by itself, then you should manually add an instance of the MRTK NeonXR Variant prefab to the scene.
  • For our fork of the MRTK3 Template, the typical workflow is rather to build the application and run it, beginning with the PL_HandInteractionExamples scene, which already has an instance of this prefab. This particular instance is by default not destroyed when you load new scenes.
  • Then, you navigate to other scenes from there, such as to the PL_Calibration scene.

Please also see the attached images.

Chat image Chat image

user-f43a29 20 November, 2024, 16:23:51

Ok, I am checking with the team and will update you when I know more.

user-f43a29 20 November, 2024, 20:53:03

@user-386d26 and @user-9b48d1 I believe the above info will also be useful to both of you.

@user-386d26, you may also want to check this message: https://discord.com/channels/285728493612957698/1187405314908233768/1308815595030052985

user-c9caf9 20 November, 2024, 20:53:24

Thank you so much!!! It works

user-1391e7 22 November, 2024, 09:37:51

Hello hello! We have recently bought two mounts for the Meta Quest and with the eye tracking sensor attached, we've noticed that the sensor itself causes a noticeable amount of discomfort. Do you have any experience with ways to alleviate this?

nmt 25 November, 2024, 04:35:55

Hi @user-1391e7! Can you elaborate on what you mean by discomfort?

user-d086cf 25 November, 2024, 15:59:12

Hi guys, is there any way to automate the process of loading a recording in Pupil Player and generating the fixation/saccade data for export? We are a bit limited in that we can't upload our data to the cloud (silly IRB rules), but I'd be interested to know whether we can do it with or without Pupil Cloud.

user-1391e7 26 November, 2024, 07:55:42

the sensor presses into the bridge of the nose and the forehead, and part of the weight of the HMD now also rests on the nose

nmt 27 November, 2024, 14:24:42

Thanks for sharing. Have you tried extending the face shield maximally? It should also be possible to get a thicker pad - that should alleviate the issue.

user-f43a29 26 November, 2024, 09:32:51

Hi @user-d086cf , if I remember correctly, you have a Neon, yes?

If so, you can check out:

  • pl-rec-export -> A command-line tool that you can run in a loop to batch export. It exports fixations & saccades (see the sketch after this list).
  • pl-neon-recording -> A Python library for reading Neon's Native Recording Data. You can use this to build your own custom batch-export script.
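
For example, here is an untested sketch of a batch run, assuming pl-rec-export is installed and takes a recording folder as its argument (the paths are placeholders):

```python
# Sketch: batch-export fixations/saccades from all recordings in a folder by
# invoking the pl-rec-export CLI once per recording directory.
# Assumes `pip install pl-rec-export` and that the tool accepts a recording
# folder as its argument; paths are placeholders.
import subprocess
from pathlib import Path

recordings_root = Path("/data/neon_recordings")  # placeholder

for recording_dir in sorted(p for p in recordings_root.iterdir() if p.is_dir()):
    print(f"Exporting {recording_dir.name} ...")
    subprocess.run(["pl-rec-export", str(recording_dir)], check=True)
```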

Would those work?

user-d086cf 26 November, 2024, 13:38:18

This is exactly what I was looking for! Thanks!

user-c9caf9 26 November, 2024, 22:47:47

Hi, I just tried to do the calibration in the PL_Calibration scene. Do those red dots mean the calibration is very bad? The second picture shows my current setup

Chat image

user-c9caf9 26 November, 2024, 22:49:00

Chat image

user-6d7fe6 27 November, 2024, 09:33:54

Hi, I couldn't find information on the weight of the Neon XR frames. How heavy would, for example, the frame for the Quest 3 be? Is the frame just attached to the VR headset, or is it swapped with some other part of the headset?

user-f43a29 27 November, 2024, 15:09:15

Hi @user-6d7fe6 , the weights of the Neon XR Frame mounts that we provide are:

  • Meta Quest 3 Frame mount: 94g (without Neon module)
  • Pico 4 Frame mount: 36g (without Neon module)

With respect to the Meta Quest 3 Frame mount, you would remove the default face shield and replace it with the one that we include in the package.

user-f43a29 27 November, 2024, 15:02:34

Hi @user-c9caf9 , that calibration could be significantly improved and probably should not be used.

I've attached an example of a calibration result that is desirable. You can see that the green and red dots come close to lying on top of each other.

Considering your setup, you had mentioned that you use the Goovis headset as if it were an extended/external monitor? This could potentially lead to problems, as it breaks some of the assumptions that are made in VR/AR contexts.

The relevant team asks if you also have a reduced Field of View (FoV) with your setup. Were you able to easily see and look at all the calibration targets in the periphery or were they located at the edges, potentially even cut off at the edge of the screen?

Chat image

user-c9caf9 27 November, 2024, 20:45:11

I'm just wondering why the algorithm can't calculate the pos?

Chat image

user-c9caf9 27 November, 2024, 17:33:37

I meant to say that my calibration result is very bad, not the algorithm. I can see all the targets in the Goovis, and I was able to see the edges of the screen.

user-f43a29 28 November, 2024, 08:47:25

Hi @user-c9caf9 , the algorithm is actively estimating pos(ition), but the algorithm works under the assumption that you are using your headset as a 3D stereoscopic display; in other words, as a full VR device.

Using the device as if it were a flat extended monitor breaks some of these assumptions, which relate to the positioning of the Neon Scene camera in relation to the VR camera.

I could imagine that this plays a role.

Is it possible to do a brief test where you use the headset as a VR display, rather than as a flat display?

user-c9caf9 03 December, 2024, 19:42:50

I think in default the headset will run as a flat display, left and right view will look same. I make a little change in the game scene, put two camera and generate a side by side view. Red dot is the visualized gaze point for testing.

Chat image

End of November archive