Hey, thanks for the response, but it looks like no screwdriver was included; I’m fairly sure I didn’t receive one. Unless I’m missing something, can you let me know where it was supposed to be?
Hi @user-18c6b6 , no problem.
And my apologies if you or anyone else saw the original message that I had here. I had confused it with something else for a moment.
So, technically there is some room for glasses, and some people do use VR headsets that way. However, irrespective of whether glasses fit comfortably with the standard Neon XR mount, the problem is that the frames of the glasses will partly block the eye cameras, which will disturb the gaze signal. The reflective/refractive properties of the spectacle lenses will also introduce additional, undesirable distortions in the eye images. This is generally to be avoided in order to maintain good data quality.
As you suggest, you could also try wearing Neon in one of our glasses frames inside the VR headset. You would no longer have the problem of the glasses blocking the eye cameras, but you would need to test it case by case, since comfort will partly depend on the participant’s face shape. Aside from comfort, you would also lose some of the convenient mapping of gaze into the VR virtual space provided by the Neon XR code: you would need to run the mount calibration process for each participant wearing a glasses-mounted Neon, and if the headset or glasses slip, you would need to redo that calibration. (Note that this is not the same as eye-tracking calibration of a participant’s gaze.)
So, it is not impossible, but it would require some piloting to find the best solution in your case.
Thanks again for the information.
Hi @user-f43a29, we got the Ready Set Go frame and are planning to use it inside a Quest Pro. I was able to get the eye state visualization working in my Unity project (built on OpenXR) using the Neon XR Core package, but it is not calibrated properly. So I followed the mount calibration instructions (https://docs.pupil-labs.com/neon/neon-xr/build-your-own-mount/#calibrating-the-mount). I cloned your fork of the MRTK project, checked out the neon branch, and created the Addressables settings, but I can’t get any data in any of the scenes. I get the errors shown in the attached image in the PL_calibration scene.
Another query: how should I incorporate this calibration into my application? Please correct me if I am wrong: we get the config file by performing the calibration in the MRTK sample scene, then use that config file in my Unity application in the Addressables group, and redo it for each person or whenever the sensor position changes.
Thank you
We’ve fixed some minor bugs in our program, but the OnGazeDataReady event is still being triggered at 25 Hz. Ideally, we’d like to record at 100 Hz or higher. Do you know of any way to achieve that?
The contents of the attached file are as follows:
- gaze.csv: the CSV file downloaded from Pupil Cloud
- NewFile_metrics_Neon175917.csv: a CSV file built by appending a row each time OnGazeDataReady is called
- CSVWriter_Metrics: a script that writes the metrics obtained from Neon XR and the Leap Motion Controller to CSV (a simplified sketch of the pattern is below)
- Metrics_Neon: a script for handling the metrics obtained from Neon XR
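In case it helps, the logging in CSVWriter_Metrics boils down to this pattern (a simplified sketch, not our exact code; the sample fields and how OnGazeDataReady is wired up are placeholders):

```csharp
using System.IO;
using UnityEngine;

// Simplified sketch of the per-callback CSV logger. One row is written
// per gaze callback, so the row count directly reflects the effective
// sampling rate we are measuring.
public class GazeCsvLogger : MonoBehaviour
{
    private StreamWriter writer;

    void Start()
    {
        writer = new StreamWriter(
            Path.Combine(Application.persistentDataPath, "gaze_log.csv"));
        writer.WriteLine("unity_time,origin_x,origin_y,origin_z,dir_x,dir_y,dir_z");
    }

    // Subscribe this to the gaze data event of the provider.
    public void OnGazeDataReady(Vector3 origin, Vector3 direction)
    {
        writer.WriteLine(
            $"{Time.realtimeSinceStartup},{origin.x},{origin.y},{origin.z}," +
            $"{direction.x},{direction.y},{direction.z}");
    }

    void OnDestroy() => writer?.Dispose();
}
```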
Hi @user-1ec436 👋! Let’s break the problem down into smaller parts, and I’ll start with a few clarifications and questions:
IMHO, there may be a bottleneck in your code, either in when the code is executed or in how the data is consumed. I’d suggest running our demos and checking the sampling rate there to help isolate the issue.
@user-d407c1 , thank you for the support, and I apologize for the late reply.
“Which headset are you using?” We are using a Quest 3 with Quest Link. The refresh rate setting for Quest Link was 72 Hz (the default value).
“If your code relies on Update()…” No, our code relies on OnGazeDataReady(). Additionally, after making the simplest possible scene to check the interval at which OnGazeDataReady() is called, we found it was approximately 0.04 s (25 Hz), while the interval for Update() was about 0.0016 s (600 Hz). (A simplified version of that check is sketched below.)
“For the Companion Device, could you confirm which one you’re using?” The device is a Motorola Edge 40 Pro.
We are using this USB hub to connect the Companion Device with Neon XR and the router. Could the low 25 Hz rate be caused by the USB hub’s limited specifications? The link below is for a newer version of the hub; the exact model we are using is no longer available. https://www.amazon.co.jp/-/en/Converter-Adapter-Docking-Ethernet-Multi-Display/dp/B07X61ZVFV
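For reference, the “simplest possible scene” check mentioned above was essentially this (a simplified sketch; how RecordGazeTick is subscribed to the provider’s gaze event is omitted):

```csharp
using UnityEngine;

// Logs the interval between gaze callbacks and between Update() calls
// so the two rates can be compared directly.
public class RateProbe : MonoBehaviour
{
    private float lastGazeTime;
    private float lastUpdateTime;

    // Subscribe this to the gaze data event (OnGazeDataReady in our project).
    public void RecordGazeTick()
    {
        float now = Time.realtimeSinceStartup;
        Debug.Log($"gaze interval: {now - lastGazeTime:F4} s");     // ~0.04 s observed (25 Hz)
        lastGazeTime = now;
    }

    void Update()
    {
        float now = Time.realtimeSinceStartup;
        Debug.Log($"update interval: {now - lastUpdateTime:F4} s"); // ~0.0016 s observed (600 Hz)
        lastUpdateTime = now;
    }
}
```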
Thank you for your support.
Hi @user-1ec436 , first, thanks for your assistance and patience throughout the debugging process.
We wanted to send you an update. The relevant team has taken a look into this and was able to reproduce the issue. It is related to the use of gazeDataReady. It should not run at 25 Hz in this case, so the team will implement a fix for that, but it is still worth noting that the gazeDataReady event is sent after synchronization with the main thread. This means the main thread throttles the rate at which gaze data is received. The gazeDataReady event is designed for interactive use cases, where this is acceptable, and it is an easy way to get up and running.
Hi @user-1ec436 , I am stepping in for @user-d407c1 again. If you intend to collect data for analysis when using Unity, then I can recommend two options:
- Use the GazeDataReceived property of the internal RTSPClient specification. You would need to write your own class that implements it; the NeonGazeDataProvider class can be used as a reference to understand how to do that. Note that your custom code should run in a separate thread, and you will need to coordinate synchronization with other threads.
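To illustrate the threading pattern only (the actual RTSPClient/GazeDataReceived wiring is omitted, GazeDatum is a placeholder type, and NeonGazeDataProvider remains the real reference):

```csharp
using System.Collections.Concurrent;
using System.Threading;
using UnityEngine;

// Sketch: receive gaze samples on a dedicated thread, unthrottled by the
// frame rate, and hand them to the main thread via a thread-safe queue.
public class ThreadedGazeCollector : MonoBehaviour
{
    public struct GazeDatum { public double Timestamp; public Vector3 Direction; }

    private readonly ConcurrentQueue<GazeDatum> queue = new ConcurrentQueue<GazeDatum>();
    private Thread receiveThread;
    private volatile bool running;

    void Start()
    {
        running = true;
        receiveThread = new Thread(ReceiveLoop) { IsBackground = true };
        receiveThread.Start();
    }

    // Runs off the main thread, so samples arrive at the full rate.
    private void ReceiveLoop()
    {
        while (running)
        {
            // ...receive a sample from your client here, then:
            // queue.Enqueue(datum);
            Thread.Sleep(1);
        }
    }

    void Update()
    {
        // Drain everything that arrived since the last frame; nothing is
        // lost even though Update() runs slower than the sample rate.
        while (queue.TryDequeue(out var datum))
        {
            // write to disk / process the sample
        }
    }

    void OnDestroy()
    {
        running = false;
        receiveThread?.Join();
    }
}
```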
Hey Rob. The issue turned out to be related to a config file generated by the scripts in the Neon Unity package (in C:/Users/Username/AppData/LocalLow/UnityCompanyName/ProjectName/config.json). There is an IP field in this file; once I changed it manually to the current IP of the Neon, the connection to Unity worked. This is a bit frustrating, since the IP is dynamic, but so far it seems to be working, and the apparent interaction with the IG program is resolved too.
Now that we are able to read/record the gaze origin and direction values in the Unity scene, we’d like to establish a calibration that maps these gaze values into screen coordinates. For our use case, the user will be wearing the Neon in a chin rest facing a large monitor that shows the stimuli. I scripted up a four-corner calibration sequence that computes the geometric median of the incoming localGazeDirection values for each corner. I then attempt to map these directions into something akin to screen-space coordinates, but I am having trouble getting a good solution (a simplified sketch of the mapping I’m attempting is below). Are there any example projects with a similar use case that I could borrow some calibration code from? Or is there another approach that would work better?
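For reference, the mapping step I am attempting boils down to something like this (simplified; it assumes the screen is roughly fronto-parallel to the wearer, which may be exactly where it breaks down):

```csharp
using UnityEngine;

// Maps a gaze direction to normalized screen coordinates using the four
// per-corner geometric medians collected during calibration.
public static class FourCornerMapper
{
    // Decompose a direction into (yaw, pitch) in degrees.
    static Vector2 Angles(Vector3 d) =>
        new Vector2(Mathf.Atan2(d.x, d.z), Mathf.Atan2(d.y, d.z)) * Mathf.Rad2Deg;

    // Corner arguments are the median localGazeDirection per corner.
    // Returns coordinates in [0, 1] (InverseLerp clamps to the screen).
    public static Vector2 Map(Vector3 gaze,
        Vector3 topLeft, Vector3 topRight, Vector3 bottomLeft, Vector3 bottomRight)
    {
        Vector2 g = Angles(gaze);
        float left   = (Angles(topLeft).x + Angles(bottomLeft).x) * 0.5f;
        float right  = (Angles(topRight).x + Angles(bottomRight).x) * 0.5f;
        float top    = (Angles(topLeft).y + Angles(topRight).y) * 0.5f;
        float bottom = (Angles(bottomLeft).y + Angles(bottomRight).y) * 0.5f;

        return new Vector2(
            Mathf.InverseLerp(left, right, g.x),
            Mathf.InverseLerp(top, bottom, g.y));
    }
}
```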
Digging into the scripts in the Neon XR Core package, I see that there are a few in a calibration folder. Is there any documentation on how to use these calibration scripts within Unity?
Hi @user-d086cf ! I moved your messages to the 🤿 neon-xr channel.
Here is the documentation about calibrating the mount in Neon XR.
That said, given the context you are sharing, I am not 100% sure you need the whole Neon XR package. You might be better served by something simpler, like the real-time screen gaze package, to get the coordinates on the screen, plus something to transfer those screen coordinates to Unity. For example, using the gaze-controlled cursor demo and feeding the cursor position into Unity as input might already satisfy your needs, without using Neon XR outside an XR environment.
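For the transfer step, a small UDP listener on the Unity side would be enough. This is only a sketch under the assumption that you forward each gaze point from the Python side as comma-separated text over UDP; the transport is entirely up to you and is not part of the package:

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

// Sketch: receive "x,y" screen coordinates sent over UDP and expose the
// latest point to the rest of the scene.
public class ScreenGazeReceiver : MonoBehaviour
{
    public Vector2 latestScreenPoint; // whatever units the sender uses

    private UdpClient client;

    void Start()
    {
        client = new UdpClient(5005); // port chosen arbitrarily
        client.BeginReceive(OnReceive, null);
    }

    // Note: this callback runs on a thread-pool thread; for production,
    // marshal the value to the main thread instead of assigning directly.
    void OnReceive(System.IAsyncResult result)
    {
        IPEndPoint source = null;
        byte[] data = client.EndReceive(result, ref source);
        string[] parts = Encoding.UTF8.GetString(data).Split(',');
        latestScreenPoint = new Vector2(
            float.Parse(parts[0], System.Globalization.CultureInfo.InvariantCulture),
            float.Parse(parts[1], System.Globalization.CultureInfo.InvariantCulture));
        client.BeginReceive(OnReceive, null); // keep listening
    }

    void OnDestroy() => client?.Close();
}
```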
Awesome, thanks Miguel! I'll check out this other package
Regarding the AprilTags, do they need to be active on the display the entire time, or just during the initial calibration when the surface is being defined?
The package I linked is designed to have the tags displayed at all times, since you might move your head, and the module with it.
If you fix the relationship between the screen and the module (e.g., with the chin rest), you could estimate the translation vector just once, but you would need to modify the code.
Hi Miguel, sorry to keep bugging you about this. Is it possible to connect the Neon directly to the computer running the Python API code, or do we need to stream from the Companion Device?
You would need to stream from the Companion Device, as that is where the magic happens (i.e., where the neural network runs).
gotcha, thanks!
Hi, I'm very new to the Neon XR package and eye tracking in general. My research organization has a Unity project set up where we collect participants' gaze and pupil data using Pupil Core and the Pupil Capture application. However, we just received a Neon and want to integrate it into the Unity project using the Meta Quest 3 and Quest Link, with the scene playing in the Quest by setting the OpenXR runtime to the Quest instead of the HTC Vive we were using previously.

I don't really know where to begin in replicating our app with the new Neon XR package and the Neon. I want to figure out what data is stored by the Neon, where it is stored, and how I can modify the existing code to store the recorded data locally on the PC running Unity. I also don't know how to handle calibration for the Neon; previously we controlled when calibration started via a key press, so how can I mimic that process with the Neon?

I know these are relatively vague questions, and I apologize for my inexperience, but if someone wouldn't mind giving me a brief high-level overview of how to get the Neon working with my Unity project, I would really appreciate it.