🥽 core-xr


user-1176c9 03 November, 2020, 18:08:36

Hello all! I just started using the HTC Vive and Unity to set up an eye tracking experiment. My problem is that I haven't been able to get a good, satisfying calibration yet. I use the 3D gaze demo made with Unity to calibrate. The calibration process works, but afterwards the gaze is always inaccurate, deviating by about one to three centimeters (I know it's actually wrong to describe that distance in centimeters, but I don't know how else to describe it). Are there any best practices somebody has figured out to get a good calibration? Or any hints and tips? Any help would be appreciated very much! Thanks for your time!

user-15acf4 09 November, 2020, 12:43:55

@mpk I did, and the demo seems to work fine. Then the question is, can I use the demo to calibrate the eye tracker first, exit it after it says the calibration was successful, and then load my simulation? Will the eye tracker still record calibrated gaze data when testing my simulation(s) after such a calibration?

mpk 11 November, 2020, 16:10:06

@user-15acf4 Yes, this should work as long as the headset did not move too much.

user-b14f98 12 November, 2020, 14:21:36

Hey folks, any plans to allow for recording of eye videos in VR for offline processing?

user-b14f98 12 November, 2020, 14:22:02

As you know, the 3D model adds a lot of load to the GPU.

papr 12 November, 2020, 14:23:11

@user-b14f98 Pupil Capture already records the eye videos by default. You can use post-hoc pupil detection on them. Is this what you are looking for?

user-b14f98 12 November, 2020, 14:23:30

No, and that's because I was not very clear.

user-b14f98 12 November, 2020, 14:23:42

What I would like is to reduce the runtime load on the GPU.

user-b14f98 12 November, 2020, 14:24:03

Right now, HMD eyes requires 3D pupil fitting and gaze mapping, right?

user-b14f98 12 November, 2020, 14:24:09

Last time I checked, I could not turn it off.

user-b14f98 12 November, 2020, 14:24:28

I would like a minimal mode that just records the videos for offline processing.

user-b14f98 12 November, 2020, 15:23:19

To be clear, I sometimes run GPU-intensive experiments in Unity which suffer from dropped frames when also recording eye movements. The analysis of eye movements, however, could take place offline.

papr 12 November, 2020, 15:26:56

@user-b14f98 You can turn off real-time pupil detection in the general settings of the world window. You need it for real-time calibration, though. Alternatively, you can use post-hoc calibration, but you would have to annotate the calibration targets manually.

user-b14f98 12 November, 2020, 15:33:05

Any way to do it through the SDK?

user-b14f98 12 November, 2020, 15:33:22

In C#?

papr 12 November, 2020, 15:42:26

@user-b14f98 You would have to use RequestController.send() to send a notification of this structure:

// Notification payload that disables real-time pupil detection in Capture
new Dictionary<string, object> {
  {"subject", "set_pupil_detection_enabled"},
  {"value", false},
};
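
For reference, here is a minimal sketch of how such a notification could be sent from a Unity script, assuming the PupilLabs.RequestController component from hmd-eyes and a Send(Dictionary<string, object>) overload as referenced above; class and method names may differ between hmd-eyes versions:

using System.Collections.Generic;
using UnityEngine;

public class PupilDetectionToggle : MonoBehaviour
{
    // Reference to the hmd-eyes RequestController in the scene (assumed API).
    public PupilLabs.RequestController requestController;

    // Disable real-time pupil detection, e.g. right after a successful calibration,
    // to reduce runtime load while the experiment is running.
    public void DisableRealtimeDetection()
    {
        requestController.Send(new Dictionary<string, object>
        {
            { "subject", "set_pupil_detection_enabled" },
            { "value", false },
        });
    }

    // Re-enable it before the next calibration, which requires real-time detection.
    public void EnableRealtimeDetection()
    {
        requestController.Send(new Dictionary<string, object>
        {
            { "subject", "set_pupil_detection_enabled" },
            { "value", true },
        });
    }
}
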
user-b14f98 12 November, 2020, 15:42:44

OK! Thanks, @papr.

user-b14f98 20 November, 2020, 23:11:29

BTW, I implemented this this week, and it worked like a charm.

user-b14f98 20 November, 2020, 23:15:36

Thanks again 🙂

user-525b2f 22 November, 2020, 12:42:12

hello everyone :)

Vive Pro, HTC Vive Add-on (200 Hz), Unity 2020.1. I am trying to establish tracking within Unity using https://github.com/pupil-labs/hmd-eyes/releases/latest

I run Pupil Service v2.6.19 (which seems to hang each time, but connection detection from within Unity seems to work) with Pupil Capture v2.6.19.

Each scene I try lacks head movement tracking - is that intentional?

The connection is detected, and I can see the camera previews outside of Unity. Within Unity I run the calibration and then wait ages for the results (there is actually GUI text saying it is waiting for the results).

What am I doing wrong?

We want to establish functionality within Unity and output the tracking data. Please help 🙂

user-6e3d0f 27 November, 2020, 12:05:06

How can I see the accuracy and precision after my calibration? It is not shown in Capture when I stream to it, and I can't find it in Unity either.

papr 27 November, 2020, 13:44:27

@user-525b2f Please check the Unity logs for more specific information and share them with us.

papr 27 November, 2020, 13:44:53

@user-6e3d0f Accuracy should be calculated by the Accuracy Visualizer plugin in Capture.
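
If the plugin is not running in Capture, it could also be started remotely from Unity, reusing the RequestController notification mechanism shown earlier. A minimal sketch: the "start_plugin" subject is part of Pupil's notification API, while the plugin name "Accuracy_Visualizer" is an assumption that should be checked against your Capture version.

// Inside a script holding a PupilLabs.RequestController reference (see the sketch above):
// ask Pupil Capture to (re)start the accuracy visualizer plugin.
public void StartAccuracyVisualizer()
{
    requestController.Send(new Dictionary<string, object>
    {
        { "subject", "start_plugin" },
        { "name", "Accuracy_Visualizer" },
    });
}
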

user-6e3d0f 27 November, 2020, 13:45:44

That's what I thought, but it didn't show anything. I need to look at it again.

End of November archive