🤿 neon-xr



user-fcc31d 04 June, 2024, 11:43:19

Hello, which headsets (or devices) is Neon XR compatible with, not considering the mount (the mount is not a problem; I can design and 3D print one)? Or does it work with any system (Android, Windows, Linux, etc.), anything that can run MRTK or OpenXR? In my day-to-day work, I mainly use the Quest 2/3, HoloLens, and soon the Vision Pro, as well as Android and iOS AR, standalone XR, and WebXR.

user-d407c1 04 June, 2024, 11:59:22

Hi @user-fcc31d 👋 🤿! There are two main considerations when thinking about implementing it on a headset.

1. Mount: Since you mention you are able to 3D print your own frame, you could place it on any headset of your choice; the only limitations are whether it physically fits inside and whether it is comfortable to wear. For some headsets it can be a tight fit.

Luckily for you, we have open-sourced the geometries of a frame, module, and nest, so if you have a 3D model of the headset, you can check how it would fit without buying or printing anything.

2. Software: Currently, we maintain a Unity 3D package and a set of demos built on MRTK3 (which should work for the Quest, HoloLens, and Vision Pro). In the future, we might add a package/library for Unreal Engine 5. But this is only the software offered by default: the real-time API under the hood uses HTTP REST, WebSockets, and RTSP, so if there is a library supporting those protocols in your preferred language, you should be able to replicate the Neon XR Unity implementation. For example, in C++, Live555 supports RTSP.

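To make the "any language with those protocols" point concrete, here is a minimal Python sketch of the REST side: query the companion device over HTTP and pull out the connected components. The device address, port, path, and payload shape here are illustrative assumptions, not the documented schema; check the real-time API documentation for the actual endpoint.

```python
import json
from urllib.request import urlopen

DEVICE_IP = "192.168.1.42"  # hypothetical companion-device address
STATUS_URL = f"http://{DEVICE_IP}:8080/api/status"  # assumed REST endpoint


def parse_status(payload: str) -> list:
    """Extract the component model names from a status payload."""
    data = json.loads(payload)
    return [item["model"] for item in data if "model" in item]


# Against a real device you would fetch the payload like this:
#     payload = urlopen(STATUS_URL, timeout=2).read().decode()
# Here we parse a made-up payload of the assumed shape instead.
sample = '[{"model": "Neon"}, {"model": "Phone"}]'
print(parse_status(sample))  # ['Neon', 'Phone']
```

The same idea carries over to the WebSocket and RTSP streams: any client library for those protocols can stand in for the Unity implementation.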
user-fcc31d 04 June, 2024, 12:07:04

So, in short, if I understand correctly: I can hook into the API using any library that supports REST and RTSP and connect to/use Neon with any device. Also, out of the box you support MRTK3 in Unity for easy development on any headset that can run MRTK3.

user-d407c1 04 June, 2024, 12:29:51

Correct! Also, the Unity package is standalone; MRTK3 is only used for the demos. Besides hooking into the real-time API, you will also need to perform a one-time calibration to determine the transformation from the scene camera to the virtual environment.
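To give a sense of what that one-time calibration produces: conceptually it yields a rigid transform (a rotation plus a translation) from the scene-camera frame into the headset frame, which is then applied to every incoming gaze sample. A minimal sketch with made-up numbers follows; the actual Unity implementation stores and applies this differently.

```python
import math


def apply_rigid_transform(R, t, point):
    """Map a 3D point from scene-camera space into headset space: p' = R @ p + t."""
    return [
        sum(R[i][j] * point[j] for j in range(3)) + t[i]
        for i in range(3)
    ]


# Hypothetical calibration result: a 90-degree yaw plus a small vertical offset.
theta = math.pi / 2
R = [[math.cos(theta), 0.0, math.sin(theta)],
     [0.0,             1.0, 0.0],
     [-math.sin(theta), 0.0, math.cos(theta)]]
t = [0.0, 0.02, 0.0]  # camera sits 2 cm below the headset origin (invented)

gaze_point_cam = [0.0, 0.0, 1.0]  # a point 1 m straight ahead of the scene camera
print(apply_rigid_transform(R, t, gaze_point_cam))
```

Once the transform is known, it stays fixed for the session, which is why the calibration only needs to be done once per mount.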

user-683a24 04 June, 2024, 14:17:18

Does the Neon XR implementation also have the same capabilities as Pupil Core? I'm interested in pupil diameter specifically, as well as logging instances of rapid eye movement. I'm planning on possibly using this with the Meta Quest 3 or Pro in AR.

user-d407c1 04 June, 2024, 14:24:19

Hi @user-683a24! Neon, and by extension its XR integration, offers pupillometry and gaze at 200 Hz in a more robust way than Pupil Core or the Pupil Core VR add-ons previously could.

Additionally, it opens new avenues: you can measure eye relief or headset slippage thanks to the eye state, which gives you the x, y, z position of the center of the eyeball relative to the scene camera.

If you're interested, please contact us at info@pupil-labs.com to schedule a demo and Q&A session where you can ask all your questions.
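As an illustration of the slippage idea mentioned above (not an official metric): if you record the eyeball-center position when the headset is first fitted and compare later eye-state samples against it, the Euclidean distance gives a rough slippage estimate. The coordinate values below are invented.

```python
def slippage_mm(baseline, current):
    """Euclidean distance (mm) between two eyeball-center positions."""
    return sum((b - c) ** 2 for b, c in zip(baseline, current)) ** 0.5


baseline_center = [31.0, 12.0, -35.0]  # hypothetical x, y, z in mm, scene-camera frame
later_center = [31.5, 13.0, -35.2]     # same eye, a few minutes later

print(f"estimated slippage: {slippage_mm(baseline_center, later_center):.2f} mm")
```

A per-axis comparison (rather than a single distance) would additionally tell you in which direction the headset has shifted.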

user-683a24 04 June, 2024, 14:25:45

Understood, will do!

user-1aa180 07 June, 2024, 09:10:33

Hello. My team is interested in purchasing a Neon device to use both in real-life scenarios with a Ready Set Go! frame and with a custom frame that fits inside a VR headset. If we opted to purchase the Ready Set Go! bundle, would we be able to detach the Neon module from the frame and attach it to our own frame? Or is the device permanently attached to the frame? Thanks for your help!

user-07e923 07 June, 2024, 09:17:55

Hi @user-1aa180, thanks for getting in touch! I'm happy that you're interested in our products. 🙂 Neon is completely modular. You can move the module between many different frames by removing the two screws with the provided screwdriver.

For making your own frame, I recommend checking out this guide to build your own VR headset mount.

user-1aa180 07 June, 2024, 09:35:30

Thank you, this is great to hear! I'll pass the information on to the rest of my team 🙂

user-ed9dfb 17 June, 2024, 12:16:50

Hello Pupil Labs, I'm trying to work on our Neon Unity implementation again, but for some reason I don't get any gaze data inside my running Unity app, although this has worked before.

- I can start Neon Monitor from my browser and view both the video feed and the gaze pointer from the same PC
- It looks like I get a connection (with the correct IP address) to the Neon inside Unity (see screenshot), but OnGazeDataReceived from NeonGazeDataProvider.cs is never called, and rawGazeDir and rawGazePoint always stay at their default values
- When connected through Unity, this also shows in the Neon Android app as 1 device connected under Connected Devices
- Simulation is off
- I allowed both private and public networks to communicate with the Unity Editor in the firewall settings
- Recently, the phone updated itself to Android 14; maybe this is related?

Any tips/advice?

Chat image

user-cdcab0 19 June, 2024, 07:39:25

Hi, @user-ed9dfb - we recently added real-time eye state support to the Neon Companion app, which requires an updated version of the neon-xr Unity package. I'd recommend making sure you have the latest versions of both the Neon Companion app and the neon-xr Unity package.

user-ed9dfb 19 June, 2024, 08:10:52

Hey @user-cdcab0, I'll try that, thanks!

user-ed9dfb 19 June, 2024, 15:02:50

@user-cdcab0 Updating the neon-xr Unity package solved the issue, thanks for the help!

user-ed9dfb 26 June, 2024, 15:01:54

Hey Pupil Labs, is there any update on real-time blink detection in the neon-xr core package?

nmt 28 June, 2024, 03:23:45

Hi @user-ed9dfb! There's no concrete timeline for real-time blinks being computed in-app. But you can run the blink detector programmatically; this Alpha Lab article shows you how.

user-ed9dfb 27 June, 2024, 10:44:19

Additionally, is there any data available on the eye-tracking latency when using the neon-xr core package (i.e., from eye movement to gaze data arriving in the Unity application)? If not, I would be interested in ideas on how this could be measured.

With our current SMI eye-tracking glasses, we measured this by shining a bright infrared light into the eye-tracking camera and measuring the delay until tracking was reported lost. Perhaps we can use a similar approach for Neon.

nmt 29 June, 2024, 02:52:03

Hey @user-ed9dfb! To respond to your second question: the total latency from eye-image acquisition to gaze data being ready in your Unity application is difficult to predict exactly, mainly due to network latency, which depends on several factors beyond our control. But we do know that using an Ethernet cable with a USB Type-C hub can lead to lower-latency communication.

Your approach of using an IR strobe that appears in the eye camera video is valid and can yield very accurate measurements, provided you know the exact timing of when the strobe was activated.
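The strobe measurement described above boils down to timestamp bookkeeping: record the host time each strobe fires, record the host arrival time of the first gaze or eye-state sample that reflects the event, and take the difference. A minimal sketch of that calculation, with invented timestamps:

```python
def end_to_end_latency_ms(strobe_times, detection_times):
    """Per-event latency: arrival of the first affected sample minus strobe onset.

    Both lists hold host-clock timestamps in seconds, paired by event.
    """
    return [(d - s) * 1000.0 for s, d in zip(strobe_times, detection_times)]


# Hypothetical host-clock timestamps in seconds.
strobe_on = [10.000, 11.000, 12.000]
detected_at = [10.042, 11.038, 12.045]  # when the event showed up in Unity

latencies = end_to_end_latency_ms(strobe_on, detected_at)
print([round(x, 1) for x in latencies])  # [42.0, 38.0, 45.0]
```

Averaging over many strobe events smooths out network jitter; the one requirement, as noted above, is that the strobe onset is logged on the same clock used to timestamp arrivals.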

End of June archive