🤿 neon-xr


marc 02 January, 2024, 08:50:14

Hi @user-ff2367! Yes exactly, every mount geometry needs to be calibrated once to determine the position it places the module in within the headset. The resulting calibration data needs to be placed within the config.json file of the app. No further calibration should be necessary on a per user or per session basis. The calibration process is documented here: https://docs.pupil-labs.com/neon/neon-xr/build-your-own-mount/
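For orientation, a mount calibration entry in config.json might look something like the fragment below. Every field name here is a purely illustrative assumption, not the actual schema; the linked documentation has the authoritative format and values.

```json
{
  "mount": {
    "name": "my-custom-mount",
    "translation_mm": [0.0, -32.5, 41.0],
    "rotation_deg": [12.0, 0.0, 0.0]
  }
}
```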

user-ff2367 08 January, 2024, 10:35:46

Hello @marc! Thank you very much for the information

user-435838 17 January, 2024, 18:37:14

Hello! I'm currently exploring the Neon XR core package with the XR Interaction Toolkit in Unity, and I'm curious about its compatibility, specifically regarding the XRGazeInteractor and its support for OpenXR bindings. Does Neon XR support OpenXR bindings, or can it be bridged to the XRGazeInteractor?

I would appreciate any experiences or advice with using NeonXR with XRGazeInteractor/XR Interaction Toolkit!

user-d407c1 23 January, 2024, 10:09:09

Hi @user-435838! Our developer has prepared a sample for this use case here, along with the instructions below:

Based on the Unity3D docs, it is possible to use the XRGazeInteractor even without OpenXR bindings:

If a device that supports eye tracking does not support these bindings, you will need to retrieve the gaze Pose data from the device and update the XRGazeInteractor Transform with an additional provider.

So, basically you need to:

  1. Create a new project.
  2. Import the XR Interaction Toolkit with the Starter Assets and XR Device Simulator.
  3. Import Neon XR as mentioned here.
  4. Open DemoScene.unity.
  5. Add the NeonXR prefab to the scene.
  6. Add the XR Gaze Interactor Bridge component to the NeonXR/PupilLabs gameObject and set the Gaze Interactor as its target.
  7. Disable input tracking on the XR Controller component attached to the Gaze Interactor gameObject.
  8. Add a callback for the Gaze Data Ready event, namely the OnGazeDataReady method from XRGazeInteractorBridge.cs (XRGazeInteractorBridge is an implementation of the "additional provider" mentioned above).
  9. Add the XR Device Simulator prefab to the scene.
  10. Save the modified scene as SampleScene.unity, and then you can run it.

Note: The scene has some simulators you may want to disable when building your application.

user-057596 22 January, 2024, 15:16:26

When using Neon XR, are the gaze overlay and the VR scene automatically recorded together, as depicted in the Microsoft Flight Simulator demonstration? This wasn't the case with the Core device, which required a rather convoluted method of capturing both the VR scene and the gaze overlay.

user-d407c1 22 January, 2024, 15:55:12

Hi @user-057596! The video demo requires a couple of things:

  1. Make a screen recording (you can use OBS for that).
  2. Map the gaze onto the recording.

Essentially, it is similar to what is done here.

In this case, Flight Simulator allows you to show both views (HMD and mirrored PC window), so recording the latter is easier and usually less distorted. But recording directly from SteamVR should also be easy, and there is even a plugin for that.

Then you need to translate the gaze coordinates. Neon outputs them in the scene camera space, so you just need to find the relationship to the content you are seeing.

If you look at some targets and you have the gaze position and the pixel coordinate of each target, it is a simple correlation, kinda like a traditional calibration in eye tracking. But since the relationship between the scene camera and the content is not going to change, and NeonNet takes care of the relationship between your eyes and the module, you only need to do it once.
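The one-time correlation described above can be sketched as a small least-squares fit, assuming you have collected a few gaze samples (in scene-camera pixels) alongside the screen-pixel positions of the targets being fixated. The numbers below are fabricated for illustration; a real setup might call for a homography or a lens-distortion correction rather than a plain affine map.

```python
import numpy as np

# Illustrative (made-up) data: gaze positions in scene-camera pixel
# space, and the screen-pixel positions of the fixated targets.
gaze = np.array([[200.0, 150.0], [900.0, 160.0], [880.0, 700.0], [210.0, 690.0]])
screen = np.array([[350.0, 340.0], [1750.0, 360.0], [1710.0, 1440.0], [370.0, 1420.0]])

# Fit a 2D affine map  screen ~ [gx, gy, 1] @ A  by least squares.
X = np.hstack([gaze, np.ones((len(gaze), 1))])   # (N, 3) design matrix
A, *_ = np.linalg.lstsq(X, screen, rcond=None)   # (3, 2) affine parameters

def map_gaze(gx, gy):
    """Map one gaze sample from scene-camera space to screen pixels."""
    return np.array([gx, gy, 1.0]) @ A
```

Once fitted, every subsequent gaze sample can be projected onto the recording with `map_gaze`, which is why the calibration only needs to be done once.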

PS. If you want the best temporal sync, have a look here
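For the temporal sync, the linked resource is the authoritative reference; as a rough illustration only, once the gaze stream and the screen recording share a clock, aligning them reduces to a nearest-neighbor lookup over timestamps. The sampling rates and timestamps below are made up.

```python
import numpy as np

# Made-up timestamps on a shared clock (seconds): a 30 fps screen
# recording and 200 Hz gaze samples.
frame_ts = np.arange(0.0, 2.0, 1 / 30)
gaze_ts = np.arange(0.0, 1.95, 1 / 200)

# For each gaze sample, pick the index of the nearest frame by
# comparing the two candidate frames around its insertion point.
idx = np.clip(np.searchsorted(frame_ts, gaze_ts), 1, len(frame_ts) - 1)
left = np.abs(gaze_ts - frame_ts[idx - 1])
right = np.abs(gaze_ts - frame_ts[idx])
nearest = np.where(left <= right, idx - 1, idx)

# Each gaze sample is now matched to a frame within half a frame interval.
```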

user-057596 22 January, 2024, 15:59:11

Thanks @user-d407c1 that was really helpful 💪🏻

user-435838 23 January, 2024, 16:15:21

Hello Miguel, thank you so much!! I'll try it out today.

End of January archive