🤿 neon-xr


user-386d26 01 October, 2024, 21:16:30

Hey, thanks for the response, but it looks like there is no screwdriver included; I’m fairly sure I didn’t receive one. Unless I’m missing it, can you let me know where it was supposed to be?

user-f43a29 02 October, 2024, 09:18:20

Hi @user-18c6b6, no problem.

And my apologies if you or anyone else saw the original message that I had here. I had confused it with something else for a moment.

So, technically, there is some room for glasses and people do use VR headsets that way, but irrespective of whether it fits comfortably, the problem with the standard Neon XR is that the frames of the glasses will partly block the eye cameras. This will disturb the gaze signal. Also, the reflective/refractive properties of the lenses in the glasses will introduce additional undesirable distortions of the eye images. This is generally to be avoided in order to maintain good data quality.

As you suggest, you could also try wearing Neon in one of our glasses frames inside the VR headset. You would no longer have the problem that the glasses block the eye cameras, but then you need to test it on a case-by-case basis, as whether it will fit comfortably will partly depend on the participant’s face shape. Aside from comfort, you would also lose a bit of the convenient mapping of gaze to the VR virtual space that is provided by the Neon XR code. You would need to do the mount calibration process for each participant with a glasses-mounted Neon and if the headset or glasses slip, then you would need to redo that calibration. (Note that this is not the same as eye tracking calibration of a participant’s gaze).

So, it is not impossible, but it would require some piloting to find the best solution in your case.

user-0a9f16 03 October, 2024, 19:38:37

Thanks again for the information.

user-0a9f16 06 January, 2025, 23:04:00

Hi @user-f43a29, we got the Ready Set Go frame and are planning to use it inside the Quest Pro. I was able to get the eye state visualization working in my Unity project (built on OpenXR) using the Neon XR Core package, but it is not calibrated properly. So I followed the mount calibration instructions (https://docs.pupil-labs.com/neon/neon-xr/build-your-own-mount/#calibrating-the-mount). I cloned your fork of the MRTK project, checked out the neon branch, and created the Addressables settings, but I can’t get any data in any of the scenes. I get the errors shown in the image below in the PL_calibration scene.

Another query: how should I incorporate this calibration into my application? Please correct me if I am wrong: we get the config file by performing the calibration in the MRTK sample scene, and then use this config file in my Unity application in the Addressables group. Do we do this for each person, or whenever the sensor position changes?

Thank you

Chat image

user-1ec436 03 October, 2024, 09:48:50

We’ve fixed some minor bugs in our program, but the OnGazeDataReady event is still being triggered at only 25Hz. Ideally, we’d like to record at 100Hz or higher. Do you know of any way to achieve this?

  • The companion device is set to 200Hz.
  • The UnityEditor is running at around 400FPS.
  • The data on Pupil Cloud is recorded at around 200Hz (though this might be irrelevant).

The contents of the attached files are as follows:

  • gaze.csv: CSV file downloaded from Pupil Cloud
  • NewFile_metrics_Neon175917.csv: CSV file created by appending a row each time the OnGazeDataReady event is called
  • CSVWriter_Metrics: script that writes metrics obtained from NeonXR and the Leap Motion Controller to CSV
  • Metrics_Neon: script for handling metrics obtained from NeonXR

NewFile_metrics_Neon175917.csv gaze.csv CSVWriter_Metrics.cs Metrics_Neon.cs

user-d407c1 07 October, 2024, 09:00:34

Hi @user-1ec436 👋! Let’s break the problem down into smaller parts, and I’ll start with a few clarifications and questions:

  • Cloud reprocesses recordings to ensure 200Hz data, including fixations and blinks, so this aspect shouldn’t be a concern.
  • Regarding the Unity Editor, if you’re using a VR headset, I doubt the framerate is actually 400 FPS. If your code relies on Update(), it will be called once per frame. Which headset are you using?
  • For the Companion Device, could you confirm which one you’re using? Is it the Moto Edge 40 Pro? Older companion devices may struggle to stream 200Hz gaze data with eye state (which is why it’s disabled by default). However, with the latest Companion Device (Moto), it should run consistently at 200Hz.
  • One potential bottleneck could be how you’re streaming the data. For example, saturated Wi-Fi networks can cause dropped frames, but this wouldn’t explain a drop from 200Hz to 25Hz.

IMHO, there may be a bottleneck in the code, either in when the code is executed or in how the data is consumed. I’d suggest running our demos and checking the sampling rate there to help isolate the issue.
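
For reference, one way to check the effective sampling rate is a simple per-second counter. A minimal sketch, assuming a gaze provider whose OnGazeDataReady event calls CountSample() (the event name is taken from the messages above; the actual NeonXR API may differ):

```csharp
using UnityEngine;

// Minimal sampling-rate probe: counts how often a callback fires and logs
// the rate once per second. Attach to any GameObject and call CountSample()
// from your gaze callback (e.g. OnGazeDataReady).
public class SampleRateProbe : MonoBehaviour
{
    private int _count;
    private float _windowStart;

    private void Start()
    {
        _windowStart = Time.realtimeSinceStartup;
    }

    // Call this once per received gaze sample.
    public void CountSample()
    {
        _count++;
    }

    private void Update()
    {
        float elapsed = Time.realtimeSinceStartup - _windowStart;
        if (elapsed >= 1f)
        {
            Debug.Log($"Gaze event rate: {_count / elapsed:F1} Hz");
            _count = 0;
            _windowStart = Time.realtimeSinceStartup;
        }
    }
}
```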

user-1ec436 15 October, 2024, 14:04:15

@user-d407c1 , thank you for the support, and I apologize for the late reply.

"Which headset are you using?" We are using the Quest 3 with Quest Link. The refresh rate setting for Quest Link was 72Hz (the default value).

"If your code relies on Update()..." No, our code relies on OnGazeDataReady(). Additionally, we made the simplest possible scene to check the interval at which OnGazeDataReady() is called and found it was approximately 0.04s (25Hz). The interval for the Update function was about 0.0016 seconds (roughly 600Hz).

"For the Companion Device, could you confirm which one you’re using?" The device is a Motorola Edge 40 Pro.

We are using this USB hub to connect the companion device to Neon XR and the router. Could the low 25Hz rate be caused by the USB hub’s limited specifications? The link below is for the newer version of this hub; the exact model we are using is no longer available. https://www.amazon.co.jp/-/en/Converter-Adapter-Docking-Ethernet-Multi-Display/dp/B07X61ZVFV

Thank you for your support.

user-f43a29 17 October, 2024, 11:29:37

Hi @user-1ec436, first, thanks for your assistance and patience throughout the debugging process.

We wanted to send you an update. The relevant team has taken a look into this and was able to reproduce the issue. It is related to the use of gazeDataReady. It should not run at 25Hz in this case, so the team will implement a fix for that, but it is still worth noting that the gazeDataReady event is sent after synchronization with the main thread. This means the main thread throttles the rate at which gaze data is received.

The gazeDataReady event is designed for interactive use cases, where this is acceptable, and it is an easy way to get up and running.

If you intend to collect data for analysis when using Unity, then I can recommend two options:

  1. Run a recording simultaneously on the Neon Companion device. This is a standard & recommended way, as you will have the data from all sensors and the data can be uploaded to Pupil Cloud for backup and analysis. The recordings can then also be used with the other analysis tools in the Neon ecosystem. To clarify, Neon can record & stream data at the same time.
  2. Alternatively, you can write code that uses the GazeDataReceived property of the internal RTSPClient specification. You would need to write your own class that implements it, and the NeonGazeDataProvider class can be used as a reference for how to do this. Note that your custom code should run in a separate thread and you will need to coordinate synchronization with other threads (see the sketch below).
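
A minimal sketch of option 2's thread-safe hand-off, assuming a callback invoked on the network thread. RawGazeSample and OnRawGazeData are hypothetical placeholders; the real interface lives in the NeonXR Core package (NeonGazeDataProvider is the reference, per the message above):

```csharp
using System.Collections.Concurrent;
using UnityEngine;

// Sketch: consume gaze samples on a background (network) thread and hand
// them to the main thread without throttling the receive rate.
// RawGazeSample and OnRawGazeData are placeholders; the real types live
// in the NeonXR Core package (see NeonGazeDataProvider).
public struct RawGazeSample
{
    public long TimestampNs;
    public Vector3 Direction;
}

public class UnthrottledGazeRecorder : MonoBehaviour
{
    // Thread-safe queue: producer is the network thread, consumer is Update().
    private readonly ConcurrentQueue<RawGazeSample> _queue =
        new ConcurrentQueue<RawGazeSample>();

    // Call this from the network thread for every incoming sample.
    // It must not touch the Unity API, which is not thread-safe.
    public void OnRawGazeData(RawGazeSample sample)
    {
        _queue.Enqueue(sample);
    }

    private void Update()
    {
        // Drain everything that arrived since the last frame, so no samples
        // are lost even when the frame rate is far below the gaze rate.
        while (_queue.TryDequeue(out RawGazeSample sample))
        {
            // e.g. append to an in-memory buffer for later CSV export
        }
    }
}
```
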
user-1ec436 15 October, 2024, 14:06:29

Chat image

user-f43a29 16 October, 2024, 08:18:16

Hi @user-1ec436, I am stepping in for @user-d407c1 again:

  • Are you writing data to the CSV files on every iteration? That is, every time OnGazeDataReady() is called, you write data to the hard drive? (If so, see the buffering sketch below.)
  • I cannot say for certain if the USB hub is causing the issue. Do you experience the same reduction in data streaming framerate when you use our Python-based Real-time API or when you connect over a WiFi router/hotspot, instead of the USB hub?
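
If per-sample disk writes turn out to be the bottleneck, buffering rows in memory and flushing once is one way around it. A minimal sketch (the file name and row format are placeholders):

```csharp
using System.Collections.Generic;
using System.IO;
using UnityEngine;

// Sketch: buffer CSV rows in memory instead of touching the disk on every
// gaze sample; flush once on teardown (or periodically, if sessions are long).
public class BufferedCsvWriter : MonoBehaviour
{
    private readonly List<string> _rows = new List<string> { "timestamp_ns,x,y,z" };

    // Call from the gaze callback; appending to a list is cheap.
    public void AddRow(long timestampNs, Vector3 dir)
    {
        _rows.Add($"{timestampNs},{dir.x},{dir.y},{dir.z}");
    }

    private void OnDestroy()
    {
        string path = Path.Combine(Application.persistentDataPath, "gaze_buffered.csv");
        File.WriteAllLines(path, _rows);
        Debug.Log($"Wrote {_rows.Count - 1} samples to {path}");
    }
}
```
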
user-2e4b9c 30 October, 2024, 15:59:15

Hey Rob. The issue turned out to be related to a config file generated by the scripts in the Neon Unity package (in C:/Users/Username/AppData/LocalLow/UnityCompanyName/ProjectName/config.json). There is an IP field in this file. Once I changed it manually to the current IP of the Neon, the connection to Unity was working. This is a bit frustrating, since the IP is dynamic, but so far it seems to be working, and the apparent interaction with the IG program is resolved too.
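
Since that path is Unity's Application.persistentDataPath, one option is to patch the IP field from a startup script instead of editing it by hand. A rough sketch, assuming a top-level "ip" field as described above (the real schema may differ, and JsonUtility only round-trips the fields declared on the helper class):

```csharp
using System.IO;
using UnityEngine;

// Sketch: rewrite the "ip" field of the generated config.json at startup
// so a changed Neon IP doesn't require manual edits. The schema is assumed
// from the message above. Caveat: JsonUtility only preserves fields declared
// on NeonConfig, so any other fields in the file would be dropped.
public class NeonConfigPatcher : MonoBehaviour
{
    [System.Serializable]
    private class NeonConfig
    {
        public string ip; // assumed field name
    }

    [SerializeField] private string neonIp = "192.168.1.50"; // set per session

    private void Awake()
    {
        // Application.persistentDataPath maps to
        // AppData/LocalLow/<Company>/<Project> on Windows.
        string path = Path.Combine(Application.persistentDataPath, "config.json");
        if (!File.Exists(path)) return;

        var config = JsonUtility.FromJson<NeonConfig>(File.ReadAllText(path));
        config.ip = neonIp;
        File.WriteAllText(path, JsonUtility.ToJson(config));
    }
}
```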

Now that we are able to read/record the gaze origin and direction values in the Unity scene, we'd like to establish a calibration that maps these gaze values into screen coordinates. For our use case, the user will be wearing the Neon in a chin rest facing a large monitor that shows the stimuli. I scripted up a four-corner calibration sequence that calculates the geometric median of the incoming localGazeDirection values for each corner. I then attempt to map these directions into something akin to screen-space coordinates, but I am having trouble getting a good solution. Are there any example projects with a similar use case that I could borrow some calibration code from? Or is there another approach that would work better?
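
In case it helps with this kind of fixed-head setup: one simple first pass is to convert each gaze direction to yaw/pitch angles and interpolate between the four corner medians. A rough sketch (the linear angle-to-screen assumption is a simplification; real screen geometry may call for a homography instead):

```csharp
using UnityEngine;

// Sketch: map a local gaze direction to normalized screen coordinates in
// [0,1]^2 from four calibrated corner directions (geometric medians of
// top-left, top-right, bottom-left, bottom-right). Assumes a fixed head
// (chin rest) and an approximately linear angle-to-screen relationship.
public static class FourCornerGazeMapper
{
    // Convert a gaze direction into (yaw, pitch) angles in radians.
    private static Vector2 ToAngles(Vector3 dir)
    {
        dir.Normalize();
        return new Vector2(Mathf.Atan2(dir.x, dir.z), Mathf.Asin(dir.y));
    }

    public static Vector2 MapToScreen(
        Vector3 gaze,
        Vector3 topLeft, Vector3 topRight,
        Vector3 bottomLeft, Vector3 bottomRight)
    {
        Vector2 g = ToAngles(gaze);

        // Average opposite edges to get one horizontal and one vertical
        // angular extent; fine as a first pass.
        float left   = (ToAngles(topLeft).x + ToAngles(bottomLeft).x) * 0.5f;
        float right  = (ToAngles(topRight).x + ToAngles(bottomRight).x) * 0.5f;
        float top    = (ToAngles(topLeft).y + ToAngles(topRight).y) * 0.5f;
        float bottom = (ToAngles(bottomLeft).y + ToAngles(bottomRight).y) * 0.5f;

        // InverseLerp handles reversed ranges, so y = 0 at the top corners
        // and grows downward, matching typical screen coordinates.
        return new Vector2(Mathf.InverseLerp(left, right, g.x),
                           Mathf.InverseLerp(top, bottom, g.y));
    }
}
```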

user-de323d 30 October, 2024, 16:02:23

Digging into the scripts in the NeonXR Core package, I see that there are a few in a calibration folder. Is there any documentation on how to use these calibration scripts within Unity?

user-d407c1 30 October, 2024, 16:20:14

Hi @user-d086cf! I moved your messages to the 🤿 neon-xr channel.
Here is the documentation about calibrating the mount in Neon XR.

That said, given the context you are sharing, I am not 100% sure you need the whole NeonXR package; something simpler like the real-time screen gaze package could get you the coordinates on the screen, plus something to transfer those screen coordinates to Unity. For example, using the gaze-controlled cursor demo and feeding the cursor position into Unity might already satisfy your needs without using NeonXR outside an XR environment (a rough sketch of the transfer step is below).
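
For the "transfer those screen coordinates to Unity" step, a plain UDP socket is often enough. A minimal receiving sketch, assuming the sender emits "x,y" in normalized screen coordinates (the port and message format are assumptions, not part of any Pupil Labs package):

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Sketch: receive normalized "x,y" screen-gaze coordinates over UDP (e.g.
// forwarded by a small script on the sending side) and move a UI cursor on
// a screen-space canvas. The port and message format are assumptions.
public class UdpGazeCursor : MonoBehaviour
{
    [SerializeField] private RectTransform cursor; // anchored to bottom-left
    [SerializeField] private int port = 9000;

    private UdpClient _client;

    private void Start()
    {
        _client = new UdpClient(port);
    }

    private void Update()
    {
        // Drain all pending datagrams without blocking; the last one wins.
        while (_client.Available > 0)
        {
            IPEndPoint remote = null;
            string msg = Encoding.ASCII.GetString(_client.Receive(ref remote));
            string[] parts = msg.Split(',');
            if (parts.Length == 2 &&
                float.TryParse(parts[0], out float x) &&
                float.TryParse(parts[1], out float y))
            {
                // Sender uses y = 0 at the top; Unity UI has y = 0 at the bottom.
                cursor.anchoredPosition = new Vector2(x * Screen.width,
                                                      (1f - y) * Screen.height);
            }
        }
    }

    private void OnDestroy()
    {
        _client?.Close();
    }
}
```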

user-d086cf 30 October, 2024, 16:21:51

Awesome, thanks Miguel! I'll check out this other package

user-d086cf 31 October, 2024, 13:47:48

Regarding the AprilTags, do they need to be active on the display the entire time, or just during the initial calibration when the surface is being defined?

user-d407c1 31 October, 2024, 13:56:56

The package I linked is designed with the tags displayed at all times, since you might move your head (and the module with it).

If the relationship between the screen and the module is fixed by the chin rest, you could estimate the translation vector just once, but you would need to modify the code.

user-d086cf 31 October, 2024, 16:24:33

Hi Miguel, sorry to keep bugging you on this. Is it possible to connect the Neon directly to the computer running the Python API code, or do we need to stream from the companion device?

user-d407c1 31 October, 2024, 16:32:57

You would need to stream from the companion device as this is where the magic happens (where the neural network runs).

user-d086cf 31 October, 2024, 16:33:16

gotcha, thanks!

user-386d26 31 October, 2024, 19:03:47

Hi, I'm very new to the Neon XR package and eye tracking in general. My research organization has a Unity project set up where we collect participants' gaze and pupil data using Pupil Core and the Pupil Capture application. However, we just received a Neon and want to integrate it into the Unity project using the Meta Quest 3 and Quest Link, where the scene plays in the Quest by setting the OpenXR runtime to the Quest instead of the HTC Vive we were using previously.

I don't really know where to begin in replicating our app with the new Neon XR package and the Neon. I want to figure out what data is stored by the Neon and where, and how I can modify the existing code to store the recorded data locally on the PC where Unity is running. Additionally, I don't know how to handle calibration for the Neon; previously we controlled when calibration started via a key press, so how can I mimic the same process with the Neon?

I know these are relatively vague questions and I apologize for my inexperience, but if someone wouldn't mind giving me a brief high-level overview of how to get the Neon working with my Unity project, I would really appreciate it.

End of October archive