Hi @user-d407c1 Yes, I'm using neon-vr-recorder. I've been testing it by displaying the Quest 3's video feed on my PC using scrcpy, and then overlaying a circular marker on that screen.
Previously, it stopped midway due to an error, but that issue has been resolved. However, on the first run, the window remains completely black. If I stop it and run it again, it works fine from the second run onward.
Also, I'm running it with the -s option, but it only shows the left-eye view. Is it possible to display the views from both eyes instead?
Hi @user-740e46 , I'm briefly hopping in for @user-d407c1 .
Are you starting the neon-vr-recorder tool while the Quest 3 is in sleep mode? If so, then you will see the black screen. You should first wake the headset, start your app, and then start the neon-vr-recorder tool.
With respect to showing the view of only one VR display, the main goal of neon-vr-recorder is to show you "what the participant is seeing". The participant ultimately sees one unified view of the VR scene. In addition, there is usually little difference between the two displays; just enough to induce stereopsis. Unless you are perhaps doing an explicit binocular cue conflict experiment?
Aside from that, showing the view of both VR displays would require mapping gaze twice, and hence two different calibration files. This introduces complexity and could also give the user the wrong impression that dual monocular gaze is being computed.
For these reasons, we just show the view of one VR display.
This is what I get when I run record.py. Maybe a GPU problem?
Hi @user-2eefe1 , this is not related to the GPU but happens when you are in the Meta Home Screen. Once you start an app, that should go away.
Thanks Rob. When I maximize, the video only fills half of the window. Is this correct?
We can add dedicated resize handling code.
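For the curious, such resize handling usually boils down to aspect-ratio-preserving letterboxing. Here is a minimal, hypothetical sketch of the geometry involved; `fit_frame` is an illustrative helper, not part of neon-vr-recorder:

```python
def fit_frame(frame_w, frame_h, win_w, win_h):
    """Scale a frame to fit a window while preserving its aspect ratio.

    Returns the scaled frame size plus the (x, y) offset that centers it,
    leaving letterbox/pillarbox padding in the remaining space.
    """
    scale = min(win_w / frame_w, win_h / frame_h)
    new_w, new_h = int(frame_w * scale), int(frame_h * scale)
    off_x = (win_w - new_w) // 2
    off_y = (win_h - new_h) // 2
    return new_w, new_h, off_x, off_y
```

On every window-resize event, the display code would recompute these values and blit the scaled frame at the offset, instead of stretching it to the full window.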
Thanks again. How can I record the video, and how can I stop the program? The only way I found is killing the code window.
We started adding code last week to save the video to a file. That should be released in the coming days.
Currently, yes, you end the program with Ctrl-C in the terminal where the Python code was started (i.e., there is no need to close the code window).
Sometimes I get an out.mp4 file in the folder, but not always, and Ctrl-C doesn't work for me.
Hi. Is there any official South Korea distributor? And do you have any plans to develop Neon XR in Unreal?
Hi @user-63c8a8 👋 ! We do ship worldwide, but you can find our list of authorized resellers here.
If you'd like tailored guidance on purchasing or logistics, feel free to reach out to [email removed]
Regarding Unreal Engine, while we don't currently offer ready-made prefabs like in NeonXR for Unity, we do provide a C++ version of the Realtime API.
It's less plug-and-play, but it gives you full access to our data streams (gaze, pupil, and event data) in real time, directly within Unreal Engine.
Could I know how this video was made: https://www.youtube.com/watch?v=OW5EOqqQ_qs
Hi @user-2eefe1 ! You can find here what it entails. We now have the neon-vr-recorder tool, which lets you do pretty much the same.
While recording scene video with Neon on Quest 3 using the Companion App, the phone becomes very hot. After some time, the phone vibrates and the Neon device's LED flashes red. The Companion App shows the message:
"Sensor failure - The Companion App has stopped recording Scene Video data."
However, when checking the recordings later, the scene video and data were still saved correctly. Since the phone must remain powered on to continue LSL streaming, I am not turning the screen off. I tried enabling Silent Watchdog in the Companion Settings, and after enabling it, the issue has not reappeared so far. I suspect the problem is related to thermal throttling of the smartphone during high-load recording.
If there is any additional information about this condition or recommended settings to help prevent it, I would greatly appreciate your advice.
Hi @user-740e46 👋 Could you please create a ticket in the troubleshooting channel and share a few more details about your setup, like the dongle you seem to be using with 100W PD?
While the S25 is a powerful Companion Device, running at 200 Hz with LSL, eye state, and fixations enabled in real time can be quite demanding.
In our experience, though, this model should handle it without overheating.
However, you should always turn off the screen; keeping it on offers no benefit and is very likely the main cause of overheating.
Additionally, could you share which version of the Companion App you are using?
Hello, I'm still working on my thesis with Neon XR.
I am facing (a lot of) accuracy issues. I cannot seem to properly gaze at items at distances of 1.5 m and beyond. To debug, I enabled the eye raycast (the one enabled from the Neon XR prefab), which shows that the raycast doesn't hit the spheres most of the time at such distances.
Helpful information:
- Mount calibration has been redone (just to be sure) and is properly being imported by the application.
My setup: I spawn spheres relative to the player camera using spherical coordinates. Each sphere has a distance from the camera (0.5, 1, 1.5, 2, 2.5 m) and is positioned at 9 polar angles (top/middle/bottom × left/center/right, e.g., 135°, 90°, 45°, etc.); 20° is the vertical tilt from the forward axis. Then I convert these to 3D Cartesian (x, y, z) positions for Unity. This way, the objects are placed in 3D space around the user, and I can also compute the visual angle for size/distance control.
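A setup like the one described above can be sketched as follows. This is a minimal Python illustration, not the actual Unity/C# project code, and the azimuth/elevation convention is an assumption (it differs from the 135°/90°/45° polar convention in the message):

```python
import math

def spherical_to_cartesian(distance, azimuth_deg, elevation_deg):
    """Convert (r, azimuth, elevation) to Unity-style (x, y, z).

    azimuth: degrees right(+)/left(-) of forward; elevation: degrees
    above(+)/below(-) the horizon. Unity axes: x=right, y=up, z=forward.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance * math.cos(el) * math.sin(az)
    y = distance * math.sin(el)
    z = distance * math.cos(el) * math.cos(az)
    return x, y, z

def diameter_for_visual_angle(distance, angle_deg):
    """Sphere diameter that subtends a given visual angle at a distance."""
    return 2.0 * distance * math.tan(math.radians(angle_deg) / 2.0)
```

Scaling each sphere with `diameter_for_visual_angle` keeps its apparent size constant across the 0.5 m to 2.5 m distances, so only retinal eccentricity and depth vary between conditions.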
Without touching any code, just spawning the items, enabling the raycast on the XR prefab, and gazing at them, the gaze just wouldn't align well with items further away (1.5 m and beyond).
Do you have any suggestions to fix this? Maybe I'm missing something?
Hi @user-7afd77 , can you provide an image or a screen recording of what you are describing? Then, we can communicate it better to the Neon XR team for you to get advice.
Just to clarify, redoing the Gaze Offset Correction many times for different objects at different distances does not have a cumulative effect. Each time you do it, it simply overwrites the previous setting. You only need to do it once for a clear target, near central vision, at one reasonable distance.
Ok that's nice to know.
Sure, let me record a simple video
Sorry, couldn't upload on Discord. Here's the wetransfer link: https://we.tl/t-wCwwvDeQD7
Please note that:
- For each sphere, I'm looking at and focusing on the fixation (red) points on the sphere.
- At the end, I'm looking at and focusing on the upper leaf of the left plant.
You can see that the debug raycast from the Neon prefab misses the majority of the distant items, and when it does hit, it doesn't stay on them for long. The smoothing effect has not been applied, so this is raw gaze data, and we don't think it would improve the current situation; can it?
(Please ignore the bare-bones testing environment; it was put together in a couple of minutes for the sake of the video.)
Thank you!
Hi @user-7afd77 , no problem. Thanks for sending.
I see now what you mean. Smoothing would not make much of a difference here, I think, but you can try it.
Would you be willing to re-do the mount calibration scene and send a snapshot of the final screen that shows the green and red dots, when it offers to save the results? Then, I will pass both of those onto the team for review.
Hi, thank you for answering.
Sure, I can try both options, but I will do it tomorrow at the office since I don't have access to the headset now.
To be fair, I was tinkering a bit, logging some data to get an estimate of accuracy. For close items (0.5 m, where I have no problem gazing at them), the average angular error is around 1.5°, but it grows rapidly, up to 8° at 2.5 m, since the gaze doesn't even land on the cubes most of the time.
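In case it helps, the error estimate was computed roughly like this (a simplified Python sketch of the idea; `angular_error_deg` is just an illustrative name, the actual logging runs inside Unity):

```python
import math

def angular_error_deg(gaze_dir, target_dir):
    """Angle in degrees between the gaze ray and the camera-to-target ray.

    Both arguments are 3D direction vectors; they need not be normalized.
    """
    dot = sum(g * t for g, t in zip(gaze_dir, target_dir))
    norm_g = math.sqrt(sum(g * g for g in gaze_dir))
    norm_t = math.sqrt(sum(t * t for t in target_dir))
    # Clamp to [-1, 1] to guard against floating-point drift in acos.
    cos_a = max(-1.0, min(1.0, dot / (norm_g * norm_t)))
    return math.degrees(math.acos(cos_a))
```

Averaging this per target distance is how I got the 1.5° (at 0.5 m) versus 8° (at 2.5 m) figures.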
I would like to fix this issue ASAP, as my thesis deadline is close and I would like to get back to focusing on vergence, but gaze accuracy is tightly linked to it.
It will be helpful then to see how the red+green dots align for you. The mount calibration scene presents targets near and far, for example.
Hello, here's the mount calibration