🤿 neon-xr



user-7df65a 03 September, 2024, 07:52:26

Hi, Pupil Labs! I have some questions regarding app development using the Neon XR Core Package and Unity.

・Is it possible to display the Neon scene video along with an overlaid gaze circle on a Unity scene using the Neon XR Core Package (or MRTK)?

・If it is possible, what kind of customization is required within the Unity scene? So far, following the instructions on this page, the console output appears as seen in the screenshot, but the gaze is not displayed in the scene. The Unity app is counted as a connected device in the "Stream to Pupil Monitor" section of the Neon Companion App.

・Additionally, can the gaze information (and scene video) transmitted from the Neon Companion App be received by Unity apps built on multiple different devices?

Chat image

user-f43a29 03 September, 2024, 14:45:33

Hi @user-7df65a , to be sure I understand:

- A person will be wearing Neon and walking around a room or outside and you want to display the scene video in real-time within a Unity scene?
- Can you clarify a bit further what you mean by "can gaze information be received by apps built on multiple different devices?" Do you mean if the data from Neon will be different if the Unity app is built on Linux or Windows?

user-7df65a 04 September, 2024, 00:43:49

Thank you for your reply! @user-f43a29 The app I want to develop is as follows:

・A user wearing Neon (streamer) and multiple other users with smartphones (viewers) walk around outside.
・The streamer's scene video and gaze circle are streamed in real-time to the viewers' smartphones via a local network.
・The receiving app on the viewers' smartphones is built with Unity (iOS and Android).

All the data received on each smartphone is the same (scene video and gaze circle). As the first step in development, I am trying to receive the scene video and gaze circle in Unity's game view before building the app for smartphones. While it is possible to achieve this by accessing http://neon.local:8080/ on each smartphone, I want to develop it in Unity, considering future expansion to AR glasses or HMDs.

Chat image

user-8619fb 03 September, 2024, 15:08:12

Thanks @user-f43a29 ! Can you point to any specific basic one in particular which just gets the gaze positions from the camera? I can't find exact documentation on it that can help me get started. I'm sorry but I'm having a hard time connecting it to any template once I hit play.

user-f43a29 04 September, 2024, 14:06:42

Hi @user-8619fb , the PL_EyeTrackingBasicSetupExample might be easiest to start with. The examples in the MRTK3 Template Project make use of the GazeDataProvider class.

There is also the base Neon XR Core Package. If you reach steps 6 and 7 of the Neon XR Core Package installation instructions, then you will see a more minimal example of obtaining gaze data, by checking for the gazeDataReady event.
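For reference, below is a minimal sketch of subscribing to that gazeDataReady event from a script. The namespace, the UnityEvent-style subscription, and the payload-less handler are assumptions based on this thread, not verified package API; check the GazeDataProvider source in your installed version for the exact signatures.

using UnityEngine;
using PupilLabs; // assumed namespace of the Neon XR Core Package

// Minimal sketch: react to each gaze sample delivered by a GazeDataProvider in the scene.
public class GazeListener : MonoBehaviour
{
    [SerializeField] private GazeDataProvider gazeDataProvider; // assign in the Inspector

    private void OnEnable()
    {
        // Assumed UnityEvent-style subscription; use `+=` instead if gazeDataReady is a plain C# event.
        gazeDataProvider.gazeDataReady.AddListener(HandleGaze);
    }

    private void OnDisable()
    {
        gazeDataProvider.gazeDataReady.RemoveListener(HandleGaze);
    }

    private void HandleGaze()
    {
        // Read the latest gaze ray/direction from the provider here and, e.g., move a gaze cursor.
        Debug.Log("Gaze sample received");
    }
}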

user-f43a29 04 September, 2024, 10:18:30

Hi @user-7df65a , thanks for the explanation. I understand now.

So, Neon XR is designed for mapping gaze to VR/AR scenes, with the idea being that one person is wearing a single VR/AR headset.

If you would rather stream data to a Unity app running on a phone(s), then you might be better off using your own implementation of the real-time API, since Neon XR makes assumptions related to VR/AR and does not include functionality for streaming the scene camera (e.g., the scene camera is not relevant in VR).

The real-time API can be used from many languages and you can use code from Neon XR to get a headstart. You should actually need less code overall for what you are looking to do, since you do not need to map gaze to VR/AR.

You might also want to use the Python implementation of the Real-time API as a reference.
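For reference, here is a minimal sketch of querying the Real-Time API's REST interface from a Unity script. The base address and the /api/status path mirror how the Python client appears to talk to the Companion app; treat both as assumptions and verify them against the Real-Time API documentation.

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch: fetch the device status JSON from the Neon Companion app and log it.
public class NeonStatusQuery : MonoBehaviour
{
    [SerializeField] private string baseUrl = "http://neon.local:8080"; // or the phone's IP address

    private IEnumerator Start()
    {
        using (var request = UnityWebRequest.Get($"{baseUrl}/api/status"))
        {
            yield return request.SendWebRequest();

            if (request.result == UnityWebRequest.Result.Success)
                Debug.Log(request.downloadHandler.text); // JSON describing the device and its streams
            else
                Debug.LogWarning($"Status request failed: {request.error}");
        }
    }
}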

user-7df65a 04 September, 2024, 11:57:05

Thank you for your reply! I understand that it's not possible to receive scene video with Neon XR. I will consider implementing it using the Real-Time API.

user-ed9dfb 09 September, 2024, 09:35:01

Hello Pupil Labs, I have worked with both the normal Neon and the Neon as part of the Pico 4 bundle. With the Pico4 bundle Neon, I can see a VR headset image in the Neon Companion App. Are there differences between the Pico4 neon / bare metal and the normal Neon / bare metal? Are they fully interchangeable?

user-d407c1 09 September, 2024, 10:31:37

Hi @user-ed9dfb ! The modules are essentially the same, with the main difference being the frame. The module and app remain unchanged, so you can swap the module between them.

user-ed9dfb 09 September, 2024, 10:38:48

thanks!

user-ed9dfb 09 September, 2024, 13:54:39

I'm trying to interact with the MRTK demo scenes with the Pico 4 controller instead of the finger pinch gesture (since I also haven't figured out yet how to make hand tracking work with the Pico 4 in the Editor). Do you have any tips for an MRTK and new Input System novice? There are so many places to adjust inputs/actions/controls that I'm now doing trial and error, but perhaps you can give me a quick tip on how to interact with any of the interactables through a button press on the Pico 4 controller?

user-f43a29 11 September, 2024, 21:13:47

Hi @user-ed9dfb , I'm briefly stepping in for @user-d407c1 .

Can you check if the scene named HandInteractionExamples works with controllers for you? Then, we will know if the Pico Live Preview plugin is set up correctly.

Then, try the PL_HandInteractionExamples scene and enable "Far rays" in the Editor. To do so, the checkbox shown in the attached image has to be enabled on both objects highlighted in the hierarchy. Afterwards, controllers should work fine.

Please let us know how it works out.

Chat image

user-ed9dfb 09 September, 2024, 13:55:39

For example, I tried to add a "joystick clicked" action to the Activate Action Value in the MRTK RightHand Controller, but this didn't work.

Chat image

user-ed9dfb 09 September, 2024, 13:59:10

When I play the scene with Pico 4 controllers, I can see the controllers act as virtual/simulated hands and I can somewhat interact with the buttons. With the piano buttons, for example, I can half-press them, but they are never fully pressed/activated (e.g., they don't make a sound).

user-ed9dfb 09 September, 2024, 14:28:29

I also tried adding primary and secondary button mappings to Select Action, but this doesn't work either

Chat image

user-f43a29 12 September, 2024, 07:41:32

Hi @user-ed9dfb , the team wanted to pass along some additional info:

If you only want to emulate the pinch via button press and still use the gaze ray, that will be more difficult, because there are additional hand checks in MRTK3 that would not pass while using the controller.

Generally, the preferred way of development should be using the MRTK3 Input Simulator in the Editor without the VR device and not using the Pico Live Preview plugin. Then, you can load the Unity app onto the VR device from time to time to see how it feels with the headset on.

user-1ec436 11 September, 2024, 11:21:08

@Rob Thanks for the great support in July. https://discord.com/channels/285728493612957698/1187405314908233768/1261254975586177064

We successfully got Neon to send data to Unity in real time using the USB hub. The OnGazeDataReady event is updating variables in the Unity script, but we are having trouble with two things.

  1. The frequency at which the OnGazeDataReady event is called is about 25 Hz, which is too low. The refresh rate on the Companion Device is set to 200 Hz and the data on Pupil Cloud updates at 200 Hz. The Unity Editor runs at a frame rate of about 200 Hz.

  2. The measured latency is too large. Comparing 3d_eye_state.csv stored in Pupil Cloud with the CSV created in Unity, we determined the latency from the pupil diameter change graphs and found it was about 500 ms. The attached image shows that the graph shapes matched after shifting the pupil diameter graph by about 500 ms.

Code to get the timestamp on the Unity side:

// Current UTC time on the PC, converted to a Unix timestamp in milliseconds
DateTime utcNow = DateTime.UtcNow;
long unixTimestampMillis = ((DateTimeOffset)utcNow).ToUnixTimeMilliseconds();

Both the ping between the PC and the router and the ping between the Companion Device and the router are less than 1 ms. Considering the possibility that the clocks of each device are off, we checked the following sites to see if the clocks deviate. The computer running the Unity Editor was 0.1 seconds slow and the Companion Device was 0.8 seconds fast (apparently not exact values). What methods do you use to verify latency? How do you deal with the clock deviations?

Thank you in advance !

Chat image Chat image

user-f43a29 11 September, 2024, 13:14:07

Hi @user-1ec436 , that's great to hear! May I ask for a clarification first and then we can proceed to the OnGazeDataReady code?

  • What USB hub is that? I cannot make any guarantees about devices other than the tested USB hub.

Regarding the transmission latency and clock offsets, you can reference the TimeOffsetEstimator code of the Real-time API. It is a reliable way to measure the transmission latency, provided that the network is stable (that could depend on the quality of the USB hub).
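As a rough illustration of what such an estimator computes, here is the standard round-trip arithmetic in C#. This is only the core idea; the actual TimeOffsetEstimator averages many echo samples, and the helper names below are hypothetical, not Pupil Labs code.

// Sketch of an NTP-style offset estimate from one echo exchange.
public static class ClockOffsetSketch
{
    // clientSendMs / clientReceiveMs: local Unix-ms times taken just before sending
    // and just after receiving one echo. deviceEchoMs: the Companion device's
    // Unix-ms timestamp carried in that echo reply.
    public static (long roundtripMs, long offsetMs) Estimate(long clientSendMs, long deviceEchoMs, long clientReceiveMs)
    {
        long roundtrip = clientReceiveMs - clientSendMs;
        // Assume the device stamped its reply roughly halfway through the round trip.
        long offset = deviceEchoMs - (clientSendMs + roundtrip / 2);
        return (roundtrip, offset);
    }

    // Convert a local Unix-ms timestamp into the Companion device's clock.
    public static long ToDeviceClock(long localUnixMs, long offsetMs) => localUnixMs + offsetMs;
}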

You can also try our other sync tips.

Regarding the pupil diameter data, is there a reason that you prefer the data collected via Unity as opposed to the data recorded on the phone (i.e., the data that gets uploaded to Pupil Cloud)?

user-1ec436 18 September, 2024, 13:05:10

Hi @user-f43a29 , thank you again! We used the TimeOffsetEstimator and it showed that the transmission latency was incredibly small!

- Mean roundtrip duration: 0.4 ms
- Mean time offset: -941 ms

Now that it is easy to measure the transmission latency, subsequent development will be even more efficient!

We have no reason to prefer the data collected via Unity as opposed to the data recorded on the phone; we just tried to measure the transmission latency by comparing changes in pupil diameter.

user-d29312 16 September, 2024, 02:36:03

Hi, is it possible to use Neon with the HTC Vive? If yes, how would I go about setting it up?

nmt 16 September, 2024, 02:42:10

Hi @user-3f7dc7! Conceptually speaking, yes. Neon can be mounted into various HMDs, as long as there is space to accommodate the module. We don't have a pre-built mount for the HTC Vive, though, so a mount would need to be prototyped to fit the geometry and constraints of the headset.

I also recommend heading over to this section of the docs to read about how to get started with integrating Neon into your XR environment. Let us know if you have any specific questions!

user-3f7dc7 16 September, 2024, 11:32:55

Thank you!

user-ed9dfb 16 September, 2024, 13:36:27

Thanks! I finally managed to do an eye calibration using the PL_Calibration scene.

After importing the Pico Live Preview Package and enabling it in the Windows XR Plug-in Management and enabling the Far Ray gameobjects, I had working controllers with aim and select functionality.

Then, I added the MRTK NeonXR Variant prefab to the PL_Calibration scene. I also changed the IP address in the config.json file (in the AppData folder).

For the calibration to work, I had to gaze towards the targets and simultaneously aim and select at the targets with the controller. This was a bit cumbersome but it worked in the end.

I do feel that using the MRTK framework cost me a lot of extra time troubleshooting, and it seems hard to add controller functionality. Is there any benefit, to you or to me, compared to using just the XRI toolkit?

Also, since I'm on an institute Wi-Fi, I have some extra difficulties connecting devices, which is why I preferred to do the calibration through PC VR / Unity Editor, as I had previously figured out connection issues between the Neon Companion App and my office PC with the network admins (devices needed to be whitelisted and proxy cache servers needed to be disabled).

The next step is to implement the Neon Core Package in our VR experiment, which should be easier as it doesn't rely on MRTK. Thanks for the support.

user-f43a29 17 September, 2024, 15:57:00

Hi @user-ed9dfb , I'm glad it is working out for you!

The MRTK3 builds on the XRI Toolkit, so it is also possible to use that, if preferred.

We have found the MRTK3 to be an easy way to get up and running with eye tracking and hand gestures and wanted others to benefit from that, as it is more "full featured". With respect to issues with the MRTK3 and controller functionality, the developers of that project are better positioned to potentially simplify that aspect.

Is it possible to obtain and use a local, dedicated WiFi router? That should simplify things and will improve the signal, potentially allowing you to avoid the need for PCVR and Pico Live Preview. Using institution/work WiFi can cause issues, as you've experienced.

user-ed9dfb 18 September, 2024, 14:01:40

Thanks for the clarification! I personally found it hard to adjust the calibration scene to my needs, as I couldn't figure out a way to replace the finger pinch gesture with a controller button press (or even a keyboard button press). I feel I would need to study MRTK and the calibration scene more to figure this out.

I hope that we are able to buy the Anker USB Ethernet hub soon, so we can circumvent Wi-Fi issues altogether.

user-f43a29 19 September, 2024, 06:59:01

@user-ed9dfb No problem. Happy to help!

Just to clarify, there is no restriction that you must use MRTK3. You can use the example code provided there as a reference when integrating with other frameworks or SDKs.

user-f43a29 19 September, 2024, 06:56:37

@user-1ec436 , great to hear!

As you've found, when you compare the data streams from Unity and the Neon Companion App, it is important to account for latency and clock offsets.

With respect to the 25 Hz sample rate in Unity, can you share the log file that is generated? You can share it via DM or send it via email to [email removed] if you prefer.

user-1ec436 19 September, 2024, 23:58:19

Thank you for your support! I will contact you again after investigating a problem we found with the output CSV data.

user-f43a29 20 September, 2024, 08:11:03

@user-1ec436 You are welcome. Feel free to post your problem here. Maybe we can save you some time!

user-0a9f16 23 September, 2024, 22:03:40

Hi @user-f43a29 , we want to use the Neon module along with the Quest 3 to measure pupil diameter. We will be triggering some events within the VR scene and would like to track the pupil diameter values from that moment along with the timestamps. Will it be possible for the timestamps of the events in the Unity scene and the pupil data timestamps to be recorded with the same clock, or to be synchronized?

user-f43a29 24 September, 2024, 09:47:20

Hi @user-0a9f16 , yes, this is possible. What level of accuracy are you looking for?

user-0a9f16 24 September, 2024, 21:58:07

@user-f43a29 , we are looking for a maximum error of 20 ms.

I have tried going through the documentation. As we will be running the experiment in Unity, one approach would be using the Unity Neon XR package. We can extract the real-time gaze data using this, but not the pupil data, because I couldn't find any reference for that in the Unity package. Please correct me if I'm wrong. But if I can get the pupil data in real time from Unity, I have thought of mapping the required Unity event timestamps to the Neon module sensor data directly on the host computer.

Another approach would be just collecting the sensor data independently from the Unity events' timestamps. We would follow this: https://docs.pupil-labs.com/neon/data-collection/time-synchronization/ for syncing our host system's clock with UTC before the experiment, collect the Unity event timestamps in UTC as well, and then map them later. (Could you explain how the TimeOffsetEstimator works and how to apply it in my case?)

Which approach would be better or result in more accuracy?

If I decide to write my own client for the Real-Time API in C#, are there methods to get the pupil data?

user-f43a29 25 September, 2024, 09:02:30

Hi @user-0a9f16 , thanks for the clarification.

You can send events with custom timestamps (see part 1 of my colleague, @user-cdcab0 's, post here (https://discord.com/channels/285728493612957698/1047111711230009405/1288404861649424404) for additional details). As long as you account for the clock offsets, then you can theoretically reduce error to the point that it is avoided (i.e., ~0ms error).

If you want to write your own implementation of the Real-Time API in C# for Unity, then Neon XR would be a good resource to start from, but aside from events, I think it already provides what you need. It seems to me that it might be easier to add event sending via HTTP POST requests to your own fork of Neon XR, rather than re-writing parts of it from scratch.
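As a sketch of what that could look like, the component below POSTs an event to the Companion app from Unity. The /api/event path and the JSON field names ("name", "timestamp" in Unix nanoseconds) follow how the Python client appears to send events and are assumptions to verify against the Real-Time API docs; NeonEventSender is a hypothetical name.

using System.Collections;
using System.Text;
using UnityEngine;
using UnityEngine.Networking;

// Sketch: send a named event (optionally with a custom Unix-nanosecond timestamp)
// to the Neon Companion app over HTTP.
public class NeonEventSender : MonoBehaviour
{
    [SerializeField] private string companionAddress = "http://neon.local:8080"; // or the phone's IP address

    public void SendEvent(string eventName, long? unixTimestampNs = null)
    {
        StartCoroutine(PostEvent(eventName, unixTimestampNs));
    }

    private IEnumerator PostEvent(string eventName, long? unixTimestampNs)
    {
        // Without a timestamp, the Companion app presumably stamps the event on arrival.
        string json = unixTimestampNs.HasValue
            ? $"{{\"name\":\"{eventName}\",\"timestamp\":{unixTimestampNs.Value}}}"
            : $"{{\"name\":\"{eventName}\"}}";

        using (var request = new UnityWebRequest($"{companionAddress}/api/event", UnityWebRequest.kHttpVerbPOST))
        {
            request.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(json));
            request.downloadHandler = new DownloadHandlerBuffer();
            request.SetRequestHeader("Content-Type", "application/json");
            yield return request.SendWebRequest();

            if (request.result != UnityWebRequest.Result.Success)
                Debug.LogWarning($"Event POST failed: {request.error}");
        }
    }
}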

You can also make a 💡 features-requests or inquire about a custom consultancy, if you'd like us to add this feature.

user-0a9f16 26 September, 2024, 17:52:26

Thank you for the clarification. So this is the workflow I need to implement:

- Sync the Companion app's clock and the host system's clock to the same NTP server before the experiment.
- Estimate the time offset once before the experiment.
- Send the events from the host, accounting for the offset (converting into Companion clock timestamps).
- For sending the events from Unity, I just have to create a POST request, as the Neon XR package does not have this function yet.
- Analyze the collected pupil data, which will also contain the timestamps of the custom events I have sent.

Thank you for your help and please correct me If I am incorrect or I can use any better approach.
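A hypothetical glue snippet for steps 3 and 4 of this workflow, reusing the illustrative ClockOffsetSketch and NeonEventSender sketches from earlier in this thread (none of these names are package API; the nanosecond unit for the event timestamp is also an assumption):

using UnityEngine;

// Sketch: convert a local Unity timestamp to the Companion clock and send it as an event.
public class ExperimentEvents : MonoBehaviour
{
    public NeonEventSender eventSender; // assign in the Inspector
    public long offsetMs = -941;        // from the offset estimate (step 2); example value from this thread

    public void MarkStimulusOnset()
    {
        long localMs = System.DateTimeOffset.UtcNow.ToUnixTimeMilliseconds();
        // Apply the estimated offset, then convert milliseconds to nanoseconds.
        long deviceNs = ClockOffsetSketch.ToDeviceClock(localMs, offsetMs) * 1_000_000;
        eventSender.SendEvent("stimulus_onset", deviceNs);
    }
}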

user-f43a29 27 September, 2024, 06:47:06

Hi @user-0a9f16 , yes, that's a valid workflow. Just note that the pupil data and events are contained in two separate files.

Based on the events, you can filter the pupil data stream or align it to some other data stream.

You can of course check and update the time offset periodically (as often as you need), in case rigorously controlling for any clock drift is essential to your experiment.

user-18c6b6 02 October, 2024, 02:30:20

@user-f43a29 , thanks for the confirmation.

I have two queries regarding the frames:

1) If we use the XR frame for the Quest 3, can users still wear their glasses?

2) Or, if we get any of the non-XR frames/glasses, can they be used inside a VR headset?

user-ba852f 30 September, 2024, 07:42:50

Hi, any idea if the 3D model for the Quest 3 mount will be published?

user-d407c1 30 September, 2024, 07:52:46

Hi @user-36c24b 👋! I’ve moved your message to the appropriate channel, 🤿 neon-xr .

I guess you’re referring to the mount available here? Please note that the model isn’t just a 3D-printed piece but a custom facial interface with cushioning.

You can purchase it directly here

user-36c24b 30 September, 2024, 15:04:56

Oh I thought it was just a small printed mount. Thanks for the clarification

user-386d26 30 September, 2024, 21:27:55

Hey everyone, sorry for the long message:

I just received my neon and meta quest 3 mount/interface, but I am not sure how to install the neon into the interface. It appears as if I need some sort of screwdriver which did not come with the device, despite the website citing tool-free installation. Please let me know how to install the device, and if I am just not understanding how the mount is set up.

I know there is supposed to be an onboarding meeting but unfortunately I was not the one to make the purchase, I am working under a research scientist who handled the purchasing but also is unavailable for the next month. So I don't have any information about the purchase or any correspondences with Pupil Labs, just the device.

Please let me know how to install the device into the mount. Hopefully this is not a stupid question and I haven't missed the intuitive way to attach it 😭.

user-d407c1 01 October, 2024, 08:04:52

Hi @user-386d26 👋 ! Swapping frames is quite easy: you will need to remove the two little screws, as you mention, and re-attach them to the new frame. Have a look at our docs, where we have a video demonstrating the process. The only tool you need is a 1.5 mm hex screwdriver, which we include with the frame.

End of September archive