Hello all! I just started using an HTC Vive and Unity to set up an eye tracking experiment. My problem is that I haven't been able to get a good, satisfying calibration yet. I use the 3D gaze demo made with Unity to calibrate. The calibration process works, but afterwards the gaze is always inaccurate, deviating by about one to three centimeters (I know it's actually wrong to describe that distance in centimeters, but I don't know how to describe it otherwise). Are there any best practices somebody figured out to get a good calibration? Or any hints and tips? Any help would be appreciated very much! Thanks for your time!
@mpk I did, and the demo seems to work fine. Then the question is: can I use the demo to calibrate the eye tracker first, exit it after it reports a successful calibration, and then load my simulation? Will the eye tracker still record calibrated gaze data when testing my simulation(s) after such a calibration?
@user-15acf4 Yes, this should work as long as the headset did not move too much.
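In your own scene you would then subscribe to the gaze stream as usual, e.g. (a minimal sketch, assuming the GazeController component from the hmd-eyes demo scenes; names may differ in your version):
using PupilLabs;
using UnityEngine;

public class GazeLogger : MonoBehaviour
{
    public GazeController gazeController; // assign in the Inspector

    void OnEnable()
    {
        gazeController.OnReceive3dGaze += ReceiveGaze;
    }

    void OnDisable()
    {
        gazeController.OnReceive3dGaze -= ReceiveGaze;
    }

    void ReceiveGaze(GazeData gazeData)
    {
        // Skip low-confidence samples.
        if (gazeData.Confidence < 0.6f) { return; }
        Debug.Log($"Gaze direction: {gazeData.GazeDirection}");
    }
}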
Hey folks, any plans to allow for recording of eye videos in VR for offline processing?
As you know, the 3D model adds a lot of load to the GPU.
@user-b14f98 Pupil Capture already records the eye videos by default. You can use post-hoc pupil detection on them. Is this what you are looking for?
No, and that's because I was not very clear.
What I would like is to reduce the runtime load on the GPU.
Right now, HMD eyes requires 3D pupil fitting and gaze mapping, right?
Last time I checked, I could not turn it off.
I would like a minimal mode that just records the videos for offline processing.
To be clear, I sometimes run GPU-intensive experiments in Unity which suffer from dropped frames when also recording eye movements. The analysis of the eye movements, however, could take place offline.
@user-b14f98 You can turn off real-time pupil detection in the world window's general settings. You need it for real-time calibration, though. Alternatively, you can use post-hoc calibration, but you would have to annotate the calibration targets manually.
Any way to do it through the SDK?
In C#?
@user-b14f98 You would have to call RequestController.Send() with a notification of this structure:
requestController.Send(new Dictionary<string, object> {
    {"subject", "set_pupil_detection_enabled"},
    {"value", false},
});
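A minimal sketch of how that could look wired up in a Unity script (assuming the hmd-eyes RequestController component is in your scene; deferring the call via its OnConnected event until the connection to Pupil Capture is up is my assumption, adapt to your setup):
using System.Collections.Generic;
using PupilLabs;
using UnityEngine;

public class DisablePupilDetection : MonoBehaviour
{
    public RequestController requestController; // assign in the Inspector

    void OnEnable()
    {
        requestController.OnConnected += DisableDetection;
    }

    void OnDisable()
    {
        requestController.OnConnected -= DisableDetection;
    }

    void DisableDetection()
    {
        // Tell Pupil Capture to stop real-time pupil detection.
        requestController.Send(new Dictionary<string, object> {
            {"subject", "set_pupil_detection_enabled"},
            {"value", false},
        });
    }
}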
OK! Thanks, @papr.
Thanks.
BTW, I implemented this this week, and it worked like a charm.
Thanks again 🙂
hello everyone :)
Vive Pro, HTC Vive Add-on (200Hz), Unity 2020.1; trying to establish tracking within Unity using https://github.com/pupil-labs/hmd-eyes/releases/latest
I run Pupil Service v2.6.19, which seems to hang each time (although the connection is still detected from within Unity), together with Pupil Capture v2.6.19.
Each scene I try lacks head movement tracking - is that intentional?
It detects the connection, I can see the camera previews outside Unity, and within Unity I run the calibration and then wait ages for the results (there is actually GUI text about waiting for the results).
What am I doing wrong?
We want to establish functionality within Unity and output the tracking data. Please help 🙂
How can I see the accuracy and precision after my calibration? It is not shown in Capture when I stream to it, and I can't find it in Unity either.
@user-525b2f Please check the Unity logs for more specific information and share it with us.
@user-6e3d0f Accuracy should be calculated by the accuracy visualizer plugin in Capture.
That's what I thought, but it didn't show anything. I need to look at it again.
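@user-6e3d0f If the plugin is not running at all, you could also try starting it via the same notification mechanism as above (a sketch, assuming a connected hmd-eyes RequestController as in the earlier example; Accuracy_Visualizer is the plugin's class name in Capture):
// Ask Pupil Capture to load the Accuracy_Visualizer plugin.
requestController.Send(new Dictionary<string, object> {
    {"subject", "start_plugin"},
    {"name", "Accuracy_Visualizer"},
});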