Hi, when the calibration is done, the Unity app saves the matrix to the configs.json file. Does this matrix use Unity's left-handed coordinate system or OpenCV's right-handed one? I have the same question for the coordinates/vectors that appear in Player.log and Player-prev.log.
I'm asking because I implemented the same algorithm in Python and use the data from the log files for testing. The matrix I got from it already makes the eye tracker work correctly (at least it looks correct when I visualize the gaze point), but I want to do some further validation and make sure all the steps are correct.
Hi @user-c9caf9! I'll step in briefly here for Rob. It's the Unity coordinate system!
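For anyone validating the saved matrix in Python, here is a minimal sketch (not from the Pupil Labs codebase) of converting a 4x4 pose matrix between Unity's left-handed convention (x right, y up, z forward) and OpenCV's right-handed one (x right, y down, z forward). The change of basis chosen here flips the y axis; which axis to flip is an assumption, so verify it against your own calibration data.

```python
def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# S flips the y axis. It is its own inverse, so the same function
# converts in both directions: T' = S @ T @ S.
FLIP_Y = [[1, 0, 0, 0],
          [0, -1, 0, 0],
          [0, 0, 1, 0],
          [0, 0, 0, 1]]

def change_handedness(T):
    """Re-express a 4x4 rigid transform in the other-handed basis."""
    return mat_mul(mat_mul(FLIP_Y, T), FLIP_Y)

if __name__ == "__main__":
    # A translation of +0.5 along Unity's y (up) becomes -0.5 in OpenCV's y (down).
    T = [[1, 0, 0, 0],
         [0, 1, 0, 0.5],
         [0, 0, 1, 0],
         [0, 0, 0, 1]]
    T_cv = change_handedness(T)
    print(T_cv[1][3])                    # -0.5
    assert change_handedness(T_cv) == T  # round trip recovers the original
```

A useful sanity check when validating against the log files: rotations survive this conversion with determinant +1, and a pure translation only changes sign on the flipped axis.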
Hi there, could I get a step-by-step guide for running the mount calibration on a Meta Quest 3? I am using the Quest 3 add-on module from Pupil Labs, so if there is a config.json that works for everyone using that setup, that would also be great.
But I'm asking since I've been trying to get the MRTK3 calibration scene to run both over Quest Link and as an APK I sideloaded onto the Quest. Is there a recommended procedure for someone using a Quest? I typically integrate my existing Unity app with the Quest via OpenXR, so I'm unfamiliar with what it takes to run an MRTK3 setup on the Quest.
Hey @user-386d26! Can I just confirm, have you been able to successfully deploy the sample project or not?
Hi, what do "tl x [px]", "tr x [px]", "br x [px]", and "bl x [px]" mean in surface_positions.csv? What is the maximum of these pixel values? And is the position relative to the cropped 2D surface?
Hi @user-a97d77! 👋 These parameters refer to the surface position, specifically the positions of the top-left, top-right, bottom-right, and bottom-left corners in scene camera image coordinates. So their maximum corresponds to the scene camera's resolution in pixels, not to the cropped 2D surface.
You can find more detailed information in the documentation.
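As an illustrative sketch (not official Pupil Labs code), this is one way to read those corner columns in Python. The column names are the ones quoted above; the inline sample row and its values are entirely hypothetical.

```python
import csv
import io

# Hypothetical sample data standing in for a real surface_positions.csv.
SAMPLE = """world_index,tl x [px],tl y [px],tr x [px],tr y [px],br x [px],br y [px],bl x [px],bl y [px]
0,100.0,80.0,900.0,85.0,910.0,650.0,95.0,640.0
"""

def read_corners(fileobj):
    """Yield the four surface corners per row as (x, y) tuples in scene-camera pixels."""
    for row in csv.DictReader(fileobj):
        yield {
            key: (float(row[f"{key} x [px]"]), float(row[f"{key} y [px]"]))
            for key in ("tl", "tr", "br", "bl")
        }

if __name__ == "__main__":
    # With a real export, pass open("surface_positions.csv", newline="") instead.
    corners = list(read_corners(io.StringIO(SAMPLE)))
    print(corners[0]["tl"])  # (100.0, 80.0)
```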
Hi, quick question! How does Pupil Neon's accuracy/data output compare to Pupil Core?
I've worked with Core in the past, and I'm looking at Neon for AR/VR data that's more transparent/accurate than the Meta Quest Pro's internal eye tracker.
Hi @user-8bd1e2! Pupil Core is a calibrated system that requires a controlled environment, like a lab (e.g., with constant lighting and no headset slippage), to achieve the best results. Using the 3d detection pipeline (more info on the gaze mapping pipeline can be found here), you can achieve an accuracy of ~1°–2°.
For a more versatile solution, we'd recommend NeonXR (our VR solution for Neon). I'd also recommend exploring the NeonXR docs.
Neon employs deep-learning-based gaze estimation, which is calibration-free and highly robust in dynamic environments. It connects to a smartphone via USB, with the phone serving as both the power source and recording unit, ensuring full portability. In terms of accuracy, you can expect 1.8° uncalibrated (and 1.3° with offset-correction). See also the full list of Neon's specs here: https://pupil-labs.com/products/neon/specs
You can also find a high-level description as well as a thorough evaluation of the accuracy and robustness of the algorithm in our white paper: https://zenodo.org/records/10420388
this is really useful, thank you!
have there been any publications demonstrating how Neon-XR can be used realtime with Unity?
I'm ideating on gaze-dependent experimental manipulations
Hi again @user-8bd1e2 - You can find publications using our products in our publications list. Please note that, as NeonXR was launched just a year ago, there may currently be few publications specifically using NeonXR.
We maintain a Neon XR package for Unity. It is a real-time client implementation in Unity and an example implementation of transforming the real-time eye tracking data into the virtual coordinate system. The package provides examples showing how easily you can utilize eye tracking data within Unity. You can find more details here: https://docs.pupil-labs.com/neon/neon-xr/neon-xr-core-package/
also, is there any validation work you could show me comparing NeonXR to the Meta Quest Pro's internal eye tracker?
To the best of my knowledge, there's no validation work comparing NeonXR to the Meta Quest Pro's internal eye tracker.
Please note that NeonXR uses the same technology as Neon, that is a deep learning approach which provides calibration-free and slippage-invariant gaze data in any environment. You can refer to the specs and Neon's accuracy test report for more details (see my earlier message: https://discord.com/channels/285728493612957698/1187405314908233768/1316750822264148039)
Hi folks. I understand that the Neon produces only a cyclopean gaze vector. Are monocular gaze vectors on the roadmap?
...and, if so, in 2025? 🙂
Let me add some detail:
Hey @user-8779ef! I've moved your message here since you're interested in VR.
Regarding gaze, you're right. Neon provides a 2D gaze estimate, which is also expressed as a cyclopean gaze ray (elevation and azimuth [deg]). Depending on the selected gaze mode in-app, Neon's gaze can be generated either from binocular eye image pairs or monocularly from the left or right eye image.
Currently, you can't obtain both left and right eye monocular gaze estimates simultaneously. You might want to suggest this in 💡 features-requests.
Of note: We also generate eye state data with Neon, allowing you to access the optical axes of both eyes. It would probably be feasible for you to implement a custom calibration to transform these into visual axes within a VR environment.
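As a side note for anyone working with the elevation/azimuth representation mentioned above, here is a hedged sketch (not the official pipeline) of turning a gaze ray given as azimuth/elevation in degrees into a 3D unit direction vector. The axis convention below (x right, y up, z forward, azimuth positive to the right, elevation positive upward) is an assumption; check it against your coordinate setup before relying on it.

```python
import math

def gaze_ray_to_vector(azimuth_deg, elevation_deg):
    """Convert an azimuth/elevation gaze ray [deg] into a unit 3D direction."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (math.cos(el) * math.sin(az),   # x: right
            math.sin(el),                  # y: up
            math.cos(el) * math.cos(az))   # z: forward

if __name__ == "__main__":
    # Looking straight ahead yields the forward axis.
    print(gaze_ray_to_vector(0.0, 0.0))  # (0.0, 0.0, 1.0)
```

The result is always unit length, so it can be used directly as a ray direction in a VR scene (e.g., for a raycast from the eye/camera origin).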
Thank you, @nmt. I did submit that request. Is it possible to run each monocular mode post-hoc, in serial, thereby generating the two monocular tracks for the same VR session?
No problem. No, this isn't possible with the current pipeline. The monocular gaze is only computed in real-time on the Companion device.
That said, do you think the eye state measurements could be useful for you?
I'm curious how I can purchase an add-on eye tracking module for the Vive Pro? I can't seem to find where to buy that accessory.
Hi @user-0c6521 👋! The Pupil Core VR Add-ons, including the HTC Vive Add-on, have been deprecated in favor of NeonXR.
Neon is a much more flexible solution: it's modular, so you can use it with various headsets or even in a regular frame. Plus, its deep learning approach makes it more robust and easier to use.
We don't have an off-the-shelf frame for the HTC Vive with NeonXR, but we've made the module geometries, PCB board, and a frame design open source, so you can easily build your own mount.
That said, if you really want the old HTC Vive mounts based on Pupil Core, feel free to reach out to us at sales@pupil-labs.com to inquire.