Hi Bea! I've been working recently with this as well:
- Those warnings that show up don't seem to affect any functionality, but I haven't tried updating the script in question to replace the obsolete code yet.
- .NET 4.x is actually the .NET Framework option in that drop-down (it was apparently renamed in more recent Unity versions, which can be confusing).
Thanks so much Teresa! I hope your project is going well
Good morning! I've been using Pupil Labs for a few weeks now, mainly with the GazeRays script to focus on objects in a scene. I've encountered an issue when using the Gaze Ray. I'm not sure how to explain it, but it doesn't seem to be working properly, and I think I might have set it up incorrectly. Could it be a calibration problem?
In this case, it's more visible.
Hi @user-224f5a! It sounds like you're using the HTC Vive Add-on with the HMD-eyes package. To help troubleshoot your issue, could you first confirm whether you've reviewed the developer documentation? Additionally, does the error also appear in the GazeRay Demo Scene?
For further analysis, please share a recording from Pupil Capture, where you wear the headset, roll the eyes to fit the model and perform the calibration. Sometimes, issues like jitter can stem from a poorly configured 3D model or pupil occlusions.
Good morning, I'm sorry I didn't specify, but I confirm that I'm using the HTC Vive Add-on with the HMD-eyes package. I have consulted the developer documentation, and the issue also appears in the GazeRay demo. I will soon conduct a more thorough analysis and share the recording. Thank you.
I have a question. I recorded the roll eye movement and calibration with Pupil Capture. What do I share with you? The single eye videos? Since I'm using the HTC Vive Add-on with the HMD-eyes package in Unity, I don't have a video of the World because it's all gray.
I believe Miguel was referring to the recording that you can create with the Recorder. You can check where these are saved under Pupil Capture > Recorder > Path to Recordings. There you will find folders for your recordings containing several files (including the videos); you should send one of these folders that shows your problem!
@user-224f5a Exactly what @user-13d491 mentioned, thank you! Zip the recording folder and share it with us at data@pupil-labs.com
Thank you everyone! I've sent everything!
Hello, I am trying out the GazeDemo, and in Unity my eyes appear upside down although they are right side up in Pupil Capture (photo 1). Despite that, the calibration seemed successful. However, it seems like I am only able to look at half the scene after calibration (photo 2), and the yellow cursor that marks my gaze only travels in the top half. Has anyone else run into this? How can I fix it?
Hi @user-e6afe3! I'm not so sure about the half scene scenario. Perhaps what Teresa mentioned, about headset tracking, is to blame. But I can confirm that having the eye videos rendered upside down will not affect your eye tracking calibration.
Just a quick question first: Do I understand correctly then that in this message (https://discord.com/channels/285728493612957698/285728635267186688/1237871207726977034), you were presenting the Unity stimuli on a computer monitor and wearing a Pupil Core, but no VR headset was involved?
Hi Bea, not sure what it could be but I noticed that your position is lower than I'd expect (seems like floor level). Could the headset tracking be off? Is it tracking your head movements?
The calibration went well within Pupil Capture and the monitor is at eye level, if not a little higher. The only thing I can think of is I was wearing the headset while trying to run the calibration commands and trying to get screenshots for troubleshooting, so I might try to get a colleague to wear the headset while I troubleshoot. Thanks for responding!
Has anyone gotten these errors when trying to add the hmd-eyes plugin into their own project? Will it be resolved if I enable "allow unsafe code" in Player Settings?
Hi @user-e6afe3 👋! Yes, the MessagePack module in the hmd-eyes plugin uses "unsafe" code, and because of this the Starter Project enables unsafe code, so you will also need to do that for your own project. This is not code that will harm your computer, especially not in this case. It is a way for programmers to flag sections of code that manipulate memory directly; that is sometimes necessary, but it requires care to be used correctly, so code needs to "opt in" to it. It is not anything that you need to be concerned with as a user, as the plugin internally uses the "unsafe" code appropriately, but if you would like to see what I mean, one of the lines in question is here.
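If you're curious what that looks like in practice, here is a purely illustrative sketch of an "unsafe" block in C# (not the plugin's actual code, just the general pattern that the "Allow unsafe code" setting permits):

```csharp
// Purely illustrative -- not the actual MessagePack/hmd-eyes code.
class UnsafeExample
{
    // With "Allow unsafe code" enabled, a method can opt in to
    // direct memory access like this:
    static unsafe void ZeroBuffer(byte[] data)
    {
        fixed (byte* ptr = data)      // pin the managed array in memory
        {
            for (int i = 0; i < data.Length; i++)
                ptr[i] = 0;           // write through the raw pointer
        }
    }
}
```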
Hello again, thanks to everyone who has responded to my questions. I feel like this might be a basic one, but I am new to my lab and the Pupil set up. The documentation tells me "The software has been tested for both Oculus and OpenVR SDKs. Please make sure to select the correct SDK for your headset." How do I know which SDK is the correct one for my headset?
Hi @user-e6afe3 , which headset do you have?
i have the pupil core headset
Sorry, I mean which VR headset.
i see, i only have the pupil core headset and nothing else. i didn't realize i needed another headset to work with hmd-eyes. can you provide me with more information on the VR headset options?
yes that would be correct.
If not, you can just focus on using Pupil Core & Pupil Capture the standard way, by starting a recording and running your Unity experiment. You would just need to place AprilTags at the four corners of the monitor to get gaze mapped into its coordinate system.
Ok, then you can make things a bit easier.
hmd-eyes is needed when using Pupil Core XR within head-mounted displays (HMDs), i.e., VR headsets.
If you only want to use Unity to display stimuli on a computer monitor while wearing Pupil Core, then you can skip the hmd-eyes part. It could even alter your data, since hmd-eyes makes assumptions related to VR headsets.
But, may I ask if you are planning to integrate a VR headset at some point?
the reason i thought i would need to use hmd-eyes is because i am using a new stimulus design where the user navigates through a 3D maze; in previous projects that i was not involved with, my lab used 2D stimuli. however, i don't think we intend to integrate a VR headset, so the pupil capture recorder feature might be the direction to go. thanks for your response, i am glad i checked in before i got any further with my misunderstanding. do you have any insight into any documentation that could guide me on how to relate gaze data in the screen coordinate system of pupil capture to the unity game/world coordinate system?
Hi @user-e6afe3 , I see.
This is in principle possible. I assume you want to know which object the observer is looking at in the Unity scene?
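As a rough sketch of the general idea, assuming you use Pupil Capture's Surface Tracker (with the AprilTags mentioned above) to obtain gaze as normalized coordinates on the monitor surface: you can treat those coordinates as Unity viewport coordinates, cast a ray from the camera, and see what it hits. The class and method names below are just placeholders, not part of any Pupil Labs API.

```csharp
using UnityEngine;

public class SurfaceGazeToObject : MonoBehaviour
{
    // surfaceGaze: normalized gaze on the monitor surface from Pupil Capture's
    // Surface Tracker, (0,0) = bottom-left, (1,1) = top-right. How you receive
    // it (e.g. over the network API) is up to your own integration.
    public GameObject FindGazedObject(Vector2 surfaceGaze)
    {
        // Unity viewport coordinates use the same normalized, bottom-left
        // convention, so the surface gaze can be used directly here.
        Ray ray = Camera.main.ViewportPointToRay(
            new Vector3(surfaceGaze.x, surfaceGaze.y, 0f));

        // Cast into the scene and return the first collider hit, if any.
        if (Physics.Raycast(ray, out RaycastHit hit))
            return hit.collider.gameObject;

        return null;
    }
}
```

This assumes the Unity camera view fills exactly the screen area defined by your AprilTag surface, and it will only report objects that have colliders.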
Is it possible to continue this discussion in 💻 software-dev, since it is no longer directly related to Pupil Core XR?