Hi team, I downloaded the MRTK3 Template in Unity with neon-xr on a Quest 3. However, it only does head-tracking, showing a heatmap when I move my head, but not eye-tracking when I move my eyes. Could you let me know how to solve this problem?
Hi @user-9c4649 , are the Quest 3 headset and Neon connected to the same network, such as a local WiFi router or phone hotspot?
Hi, this query is related to Neon XR for Quest 3. Based on one of the existing posts in this channel, I recorded the VR scene with OBS while recording the eye tracking data on the phone. I wanted to overlay the eye tracking data on this OBS recording using Reference Image Mapper. I first uploaded a video, and later took a screen grab and uploaded just an image, but the enrichment process always ended with an error: "The enrichment could not be computed based on the selected scanning video and reference image. Please refer to the setup instructions and try again with a different video or image." Please suggest what to do.
Hi! Do you think it is possible, in the calibration example, to change the interaction when I look at the target from "pinching" to "press a button on the Meta Quest 3 controller"? For me the hand-pinching interaction doesn't work very well.
Hi @user-3e88a5 , yes, this is in principle possible.
You may want to check this message: https://discord.com/channels/285728493612957698/1187405314908233768/1283535993269391371 Note also that the tip there is not specific to Pico headsets.
If you experience any difficulty, then let us know, and also check Unity's Input [email removed] as referenced by MRTK3's Input docs.
Otherwise, feel free to post it as a 💡 features-requests and I'll pass the message onto the team.
Hi @user-d5c9ea , the purpose of the Reference Image Mapper is to build a 3D model of an arbitrary scene and map gaze onto the Reference Image. It is very powerful for real-world scenes, where it can be difficult to get access to the ground truth.
To derive the 3D model, Reference Image Mapper requires a Scanning Recording. Since Neon's scene camera is typically blocked in a VR context, the result would be a black video feed, which is not usable for the purposes of Reference Image Mapper. I believe this is what the error message is referring to.
Aside from that, in the case of VR, you already have access to the ground truth 3D model and the wearer's position in that model, so Reference Image Mapper is not really needed in such a case.
Overall, it sounds like you want to map gaze onto a VR video recording. To that end, we now offer an open-source tool, neon-vr-recorder, that can do this. If you have already completed data collection, then you could use parts of that code to do the video overlay post-hoc. It might just require some manual time sync.
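The manual time sync mentioned above can be sketched in a few lines. This is a minimal illustration, assuming you know the wall-clock time at which the OBS video started; all names here (`gaze_samples`, `video_start_s`, `fps`) are illustrative, not part of the neon-vr-recorder API:

```python
# Sketch: assign Neon gaze samples to frames of an OBS screen recording
# post-hoc. Assumes gaze samples are (timestamp_s, x, y) tuples on the same
# clock as video_start_s -- in practice you would first align the clocks.

def gaze_to_frame_index(gaze_timestamp_s, video_start_s, fps):
    """Return the video frame index on screen at gaze_timestamp_s,
    or None if the sample falls before the video started."""
    dt = gaze_timestamp_s - video_start_s
    if dt < 0:
        return None
    return int(dt * fps)

def build_overlay_index(gaze_samples, video_start_s, fps):
    """Group gaze samples by video frame: {frame_index: [(x, y), ...]}."""
    frames = {}
    for t, x, y in gaze_samples:
        idx = gaze_to_frame_index(t, video_start_s, fps)
        if idx is None:
            continue
        frames.setdefault(idx, []).append((x, y))
    return frames

samples = [(100.00, 0.5, 0.5), (100.02, 0.52, 0.5), (100.50, 0.9, 0.1)]
index = build_overlay_index(samples, video_start_s=100.0, fps=30)
```

With such an index in hand, drawing the overlay is just a matter of painting each frame's gaze points onto it while re-encoding the video.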
Thanks @user-f43a29 for the detailed insight; this makes sense to me. I'll give neon-vr-recorder a try. I also wanted to ask: can I still use Pupil Cloud for any kind of analysis after the video overlay process?
@user-d5c9ea While there are currently no XR specific analysis tools or video overlay features in Pupil Cloud, it still offers benefits for such recordings:
- Gaze data rate: recordings uploaded to Pupil Cloud are re-processed at the full 200 Hz gaze rate, even if the rate is lowered in the Neon Companion app settings or when using a OnePlus 8 device with Neon.

Note that the above-mentioned gaze & fixation data will be in the coordinate system of Neon's scene camera. To convert them to the VR coordinate system, as is done in the Neon XR Core Package, you can follow the tips here: https://discord.com/channels/285728493612957698/1187405314908233768/1339268128722518229
You can then synchronize these data post-hoc.
Thanks, @user-f43a29, for this info. To be honest, coming from Pupil Core and its XR integration, the Neon tools feel a little underdeveloped to me. Anyway, I hope more support is extended to Neon XR, especially for analysis, in the near future. On the flip side, I was looking at the neon-vr-recorder code and wasn't sure how to stop the recording. Also, can you elaborate a bit more on where the data is saved, in what format, and any tips on the post-hoc sync? Thanks!
Hi @user-d5c9ea , always feel free to share what you would like to see in Neon XR! We are always open to feedback. You can also open a 💡 features-requests at any time. The Neon XR ecosystem is under active development and we are happy to incorporate users' insights.
With respect to neon-vr-recorder, currently, the tool opens a window that displays the gaze overlaid on the VR feed. A call could be added to save the gaze overlay to a video file. I'll pass this request along to the team.
Just note that the tool uses a slightly different approach to mount calibration when overlaying the gaze, and you may want to tweak the values for your setup. To be clear, you still want a mount calibration from the `PL_Calibration` scene in your Neon XR Unity app, but you do not need to re-create the `PL_Calibration` scene for the `neon-vr-recorder` tool. A simple validation target or two should be enough.
When you ask where the data is saved and in what format, are you referring to gaze data saved in CSV format?
With respect to post-hoc sync, you could use Events for that. Please see this message here: https://discord.com/channels/285728493612957698/1187405314908233768/1288425388824985601 and let me know if you have more questions about that.
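One way to sketch the Events-based post-hoc sync: if the same physical moments were marked both as Neon Events and in your VR application's log, the per-pair differences estimate the constant offset between the two clocks. Function and variable names here are illustrative, not part of any Pupil Labs API:

```python
# Sketch: estimate a constant clock offset from matched event pairs and use
# it to map Neon timestamps onto the VR application's clock.

def estimate_offset(neon_event_times_s, vr_event_times_s):
    """Median offset (vr - neon) over matched event pairs; the median is
    robust to a single badly-timed event."""
    offsets = sorted(v - n for n, v in zip(neon_event_times_s, vr_event_times_s))
    mid = len(offsets) // 2
    if len(offsets) % 2:
        return offsets[mid]
    return 0.5 * (offsets[mid - 1] + offsets[mid])

def neon_to_vr_time(t_neon_s, offset_s):
    return t_neon_s + offset_s

# Three shared events, seen on each clock:
offset = estimate_offset([10.0, 20.0, 30.0], [12.1, 22.0, 32.05])
```

This assumes the offset is approximately constant over the recording; for long sessions you may also want to estimate drift.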
Thanks @user-f43a29 , I was able to make neon-vr-recorder work. As feedback: the code tries to automatically discover the Neon IP, which is failing. I manually set args.ip to the IP address of the Neon device. However, the stream is binocular; if I want to record a monocular stream, how can I do that? Any insight would be helpful. Thanks!
Hi @user-d5c9ea , glad to hear it is working for you!
So, automatically discovering Neon on the network is the default mode of the Real-time API. That can be blocked by a firewall configuration or a network type. May I ask what kind of network you are using? That might explain it.
Otherwise, if you pass Neon's IP address to the `-i` command-line parameter, then you should not need to edit the code.
If you want to stream monocular gaze data, then you can change the `Gaze Mode` setting in the Neon Companion app to use either the left, right, or both eyes for gaze estimation.
Please let me know if I misunderstood.
Hi @user-f43a29 , we were doing it over WiFi and even turned the firewall off, but the IP printed as None. Also, the setting change in the Companion app is not reflected in the streaming frame of neon-vr-recorder.
Are you changing the setting while the VR recorder is running? Be sure to unplug and re-plug Neon after changing the setting.
May I also ask how you confirmed that it is not changing? To be clear, you will see only one gaze circle in all modes. The data source for gaze is what changes.
I did exactly the same as suggested and it still streams as binocular.
Hi all! I am considering purchasing a Neon XR bundle and was hoping someone might be open to sharing their experiences, to get a sense of what time investment and skill sets are needed. I am building solutions that use XR for market research purposes across a number of use cases.
For my initial (most basic) use case, I would actually just want to show a (very large) web browser in a VR or MR environment and get analytics on what people are looking at, like those offered in Pupil Cloud (gaze tracking, heatmaps, AOIs, time to first fixation, etc.). Does anyone have a similar use case?
I am planning to use the Meta Quest 3 initially, but I also have an Apple Vision Pro, so if anyone is using that already, I would love to hear too!
(PS I am based in Amsterdam, any more Dutchies here by any chance?)
Hi @user-9b48d1 , how do you connect the headset and Neon?
Otherwise, would you be able to provide the contents of Unity's console when you run the app while the headset is connected via cable to your computer?
If you have `adb` installed, then if possible, could you also provide the contents of:
`adb logcat -d > out.txt`
Run it after you start up your app. Thanks!
Hi Rob!
I rely on the automatic connection via the API connecting to the device registered on the network. It seems to work when launching on PC, as I receive this message on the console:
[DnsDiscovery] received response from: PI monitor:Neon Companion:7a5c7e332d3cf8a5 UnityEngine.Debug:Log (object) PupilLabs.DnsDiscovery/<DiscoverOneDevice>d__5:MoveNext () (at [email removed] UnityEngine.UnitySynchronizationContext:ExecuteTasks ()
I would have to check about adb, and I will also try to build the application again. That's all for now 🙂
Hi @user-9b48d1 , thanks for the update. Is your network running on a local, dedicated WiFi router?
Otherwise, could you try the following:
If that does not resolve it, then please send the full contents of the Unity console, after the app has been started. You can send it via DM, if you prefer.
Hi Rob, yes, I have a separate, local access point dedicated to this project. I will look into your other suggestions after I've checked my script, which I currently suspect to be the culprit. 🙂
Hello all, I predominantly work with the Meta Quest Pro. I see that there is a mount for the Meta Quest 3, but do we have a custom mount for the Meta Quest Pro as well? Can anybody please help me out? Thanks!
Hi @user-6a81c4! We don't have a mount specifically for the Quest Pro. That said, Neon XR works in principle with any VR headset, so long as the module can be mounted comfortably within the device. If so, it might be possible to develop your own mount. Further details here: https://docs.pupil-labs.com/neon/neon-xr/build-your-own-mount/#build-your-own-mount-for-neon
Hi Rob, so I have been tinkering for a while now, with seemingly no effect. I have done as you suggested, no change yet. However, now it seems I cannot even connect to the Neon Companion device through Unity, as I get a connection error saying the Neon device is not found. I already attempted to change the rtsp settings in the config file to the IP address displayed by the Neon app, no luck yet. I will try again tomorrow and attach the console readout.
Hi @user-9b48d1 , once you can send the logs, then we can look into what may have gone wrong. Can you establish a connection when running the MRTK3 Example APK?
AId4Children Support :)
Hi
We were wondering if it would be possible to wire the Neon eye tracker straight to the Quest 3 headset and read the eye tracking natively on the headset? We would prefer to eliminate the phone part.
Hi @user-53a389 , I've moved your message to the 🤿 neon-xr channel. To answer your question, this is currently not possible.
May I ask, how has your experience with Neon XR been so far?
We recently got the eye tracker and mount and are just seeing what our options are. Running it natively on the headset would be the simplest approach.
What makes it not possible?
Hi @user-53a389 , Neon is currently supported on a selected set of well-tested Companion Devices to ensure maximal performance, robustness, and reliability. At the moment, NeonNet, the technology that powers the eye tracking, pupil size estimation, and other sensor capabilities, is fine-tuned to specific SoCs, such as those found on those devices. As such, we do not support connecting Neon directly to the Meta Quest 3 in that way.
When connecting the Neon module to the phone, the Neon Companion app crashes after a few seconds. Any idea what the issue could be?
Hi @user-53a389 , could you open a Support Ticket in 🛟 troubleshooting ?
Hi, I'm trying to build the MRTK Neon example for Quest 3. I followed all of the steps, but it shows this error.
How should I switch to other scenes from the PL_HandInteractionExamples scene? I successfully installed the example on the Quest 3, but it seems the PL_HandInteractionExamples scene won't visualize the gaze point.
Hi @user-c9caf9 , glad to hear you got it running! To open the navigation menu, hold up only one of your hands, with your palm facing you. In that menu, you will also find an option to activate gaze point visualization.
Thanks, but I noticed that the gaze dot stays at the center of the view; it's not moving to where I'm actually looking. I already connected the Pupil Neon to the phone, and the phone and the Quest 3 are connected to the same WiFi.
Is it a dedicated, local WiFi router or a University/work WiFi access point?
It works after I re-plugged the eye tracker. Thanks!
That's great! You are welcome and let us know how it goes
Hi 🤿 neon-xr , I'm looking for more information on a Neon/Core setup, as I would like the quickest way to integrate eye-tracking into my custom AR glasses (custom Android ROM). The device needs to run standalone, so may I know if the eye-tracking can run locally at all times? And since I intend to integrate the hardware (cameras, LEDs, etc.) into my custom device for compactness, I'm curious if this is even possible later on.
So far, I have only seen videos of Neon/Core being used in AR/VR headsets while tethered to a companion phone, hence my doubts.
I would appreciate anyone sharing their experiences integrating Neon/Core into any headset 🙏
Hi @user-12e49c , it sounds like you might be interested in checking out our new Integrators page. Please let us know if that clears things up.
hi there!
I'm looking to get my lab set up with Neon and I had some questions:
Is it easy to switch between using Neon in glasses and in an HMD? I've got experiments where I'd like to use one or the other.
I've got a Meta Quest Pro; can Neon integrate with an MQP? By integrate, I mean: can I use Neon eye tracking in a Unity scene with an MQP instead of the MQP's eye tracking?
I know this was asked by a colleague of mine earlier, but I need a fresher answer: I'm going to need gaze data for each individual eye in real time. Is it possible to configure Neon to provide gaze vectors for each eye, like you'd get from Pupil Core? If so, is this something the Pupil team could help me implement, or would my lab be on our own?
Thanks!
Hi @user-8bd1e2 👋 ! See below the answer to your questions.
2a. Gaze in Unity: Yes. The Neon XR Core package facilitates getting gaze coordinates in your Unity environment.
2b. MetaQuest Pro: For the Meta Quest 3 we offer a mount/frame, but for the MetaQuest Pro note that you would need to develop your own mount. See more here.
3. Dual monocular: As of today, gaze is either single monocular or binocular, but not dual monocular. While we are working towards that, there is no current ETA. That said, optical axis vectors are reported independently for each eye.
Let me know if you have any other questions.
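To illustrate how the per-eye optical axis vectors can be combined, a common generic approach (a geometric sketch, not a Pupil Labs API) is to intersect the two eye rays, taking the midpoint of the shortest segment between them, since real rays rarely intersect exactly:

```python
# Sketch: combine two per-eye rays (origin + direction) into a single 3D
# point via the midpoint of their shortest connecting segment.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def closest_point_between_rays(p1, d1, p2, d2):
    """Midpoint of the shortest segment between rays p1 + s*d1 and p2 + t*d2,
    or None if the rays are (almost) parallel."""
    w0 = [a - b for a, b in zip(p1, p2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        return None
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    q1 = [p + s * x for p, x in zip(p1, d1)]
    q2 = [p + t * x for p, x in zip(p2, d2)]
    return [0.5 * (u + v) for u, v in zip(q1, q2)]

# Illustration: eye origins ~6 cm apart, both axes aimed at a point 1 m ahead.
point = closest_point_between_rays(
    [-0.03, 0.0, 0.0], [0.03, 0.0, 1.0],
    [0.03, 0.0, 0.0], [-0.03, 0.0, 1.0],
)
```

The parallel-ray guard matters in practice: when both eyes look far into the distance, the rays become nearly parallel and the computed depth is unreliable.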
This is useful! Thank you.
One follow-up: is there an example out there of someone using Neon to build a gaze-contingent display? An example could be using the gaze data to create a retinally persistent (new) blind spot.
Sure! As far as I know, there isn't a demo that does exactly that, but this guide provides an example of a gaze-contingent paradigm using Neon to control the cursor on a PC.
In NeonXR, we also have some MRTK3 demo scenes that showcase gaze interaction, including object collision, heatmap generation, and more.
As for generating a scotoma, this can be achieved using a shader that updates based on gaze position. You might find this article relevant.
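To illustrate the shader idea, here is the per-pixel test a gaze-contingent scotoma would perform, shown on the CPU in pure Python for clarity only; in a real Unity/XR app this would run in a fragment shader with the gaze position updated every frame:

```python
# Sketch: mask out ("scotomize") all pixels within a radius of the current
# gaze point. Image is a list of rows of grayscale values; gaze_xy is
# (column, row) in pixel coordinates.

def apply_scotoma(image, gaze_xy, radius):
    gx, gy = gaze_xy
    out = []
    for y, row in enumerate(image):
        out.append([
            0 if (x - gx) ** 2 + (y - gy) ** 2 <= radius ** 2 else v
            for x, v in enumerate(row)
        ])
    return out

image = [[255] * 5 for _ in range(5)]               # uniform white 5x5 image
masked = apply_scotoma(image, gaze_xy=(2, 2), radius=1)
```

The shader version is the same test per fragment, with the gaze point passed in as a uniform, as in the article linked above.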
Hello @user-f43a29 Can you help me with setting up the neon xr with unity?
Hi @user-6c6c81 , I've moved your message to the 🤿 neon-xr channel.
We can of course help you get set up! Have you already worked through the Neon XR Documentation? At what point did you encounter trouble?
Also, if you have not yet had your free 30-minute Onboarding for Neon, then just send an email to info@pupil-labs.com with the original Order ID and we can schedule it.
Hello @user-f43a29 , the pl.calibration, pl.config, and pl.config.defaults files are missing. They are not even available in the Neon XR package I installed. I do have the normal JSON files, but the pl.xxx files are missing. Please help me with this.
Hi @user-6c6c81 , those items, like `pl.calibration`, are known as Addressables in Unity. They are similar to URLs in your browser and should point to the `.json` files that you mention.
Could you try the following?
- Delete the `AddressableAssetsData` folder from Unity's Project browser
- Re-do the Addressables steps from the Neon XR Documentation (steps 2 to 4)
Let us know if that resolves it.
Hey @user-f43a29 , I followed those steps again, but I'm still facing the issue.