🥽 core-xr


user-43b7f1 01 August, 2019, 10:31:28

Hey guys, I'm having constant issues with 3D eye calibration since August 2018.....

Pupil recently changed the calibration process to 3D-only, but 3D calibration repeatedly fails for me, whereas 2D works. Testing with 3D calibration consistently gives me horrible eye tracking, while the old Unity package version with 2D eye tracking works just fine.

It has to do with my smaller eye openings (I'm Asian and my pupils are partially covered by my eyelids...), and I've also found that people with bigger eyes (thus more exposed pupils) tend to get better calibration when testing applications that use Pupil services.

As someone who uses eye trackers for surveys, supporting only 3D calibration means I won't be able to gather credible data from lots of people (who are also mostly Asian...) with the 3D calibration method alone. I would have to turn away most of the participants in my surveys, as 3D calibration simply doesn't work for people with narrower eyes, which is quite common in Asian regions..... (and also for glasses, as infrared cannot pass through them). Can there be any solution to remedy this, like bringing back the 2D calibration option???

user-52f2e6 01 August, 2019, 13:37:55

@user-644a65 We have noticed something similar. In particular, the center is skewed to the right and there seems to be a drifting effect: in the GazeDemoScene for instance, if the user actively looks at the yellow dot in the scene, they will end up consistently drifting to the right side. This seems to be consist across several users, so probably something to do with the calibration?

user-141bcd 01 August, 2019, 14:57:48

@fxlange thanks for your response (07/25/2019). I'm showing all calibration targets at the same distance as I show my stimuli. This was either 1 or 10 m, and obviously I scale everything into the right proportions when doing that. I used concentric circles at eccentricities ranging from 2° to 16° (2-4-8-16). The outermost targets (16° eccentricity), however, were mostly blurry due to presentation outside of the Vive's sweet spot. I have removed them now. Do you have any recommendations for fine-tuning the other calibration parameters (Secs per target, Ignore Init. Secs, Samples per target)? I tried longer intervals, but that rather decreased calibration quality.
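For anyone reproducing this setup: the lateral offset of a target at a given eccentricity scales with the presentation distance, which is presumably how the stimuli above are kept in proportion at 1 m vs. 10 m. A minimal sketch (the function name is mine, not from hmd-eyes):

```python
import math

def target_offset(distance_m, eccentricity_deg):
    """Lateral offset (in meters) of a calibration target placed at
    the stimulus distance, at the given visual eccentricity."""
    return distance_m * math.tan(math.radians(eccentricity_deg))

# Offsets for the 2-4-8-16 degree rings at 1 m:
for ecc in (2, 4, 8, 16):
    print(f"{ecc:2d} deg -> {target_offset(1.0, ecc):.3f} m")
```

At 1 m the 16° ring sits roughly 0.29 m off-axis, which is well outside the Vive's optical sweet spot and consistent with the blur reported above.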

user-141bcd 01 August, 2019, 15:01:13

Also thanks for the link to the fine-tuning section for improving the pupil tracking. I was aware of that, but I'm still trying to find the best settings. Any recommendations for the "Intensity Range"? It's obviously a trade-off, but is it better to err on the side of marking too much as the pupil, or too little? Would you manually "reset the 3d model" (via the Capture GUI) before instructing the participant to look around, in order to build a proper 3D model?

user-a3b085 01 August, 2019, 23:19:38

Hello, is there a way to disable eye tracking from within Unity? Do I just run the Connect coroutine inside of RequestController with retry set to false?

user-a3b085 01 August, 2019, 23:21:30

And then, to try to connect, just manually run that coroutine at a later time?

user-43b7f1 02 August, 2019, 05:10:41

@user-52f2e6 For me the case is the opposite. With my dominant eye being the left, I experience skewing to the left in the gaze demo scene. (My right eye is short-sighted, so I wear glasses unless I have to put on the HMD.) As for the offset in the center calibration marker, I think I felt such a thing too, but I think this really depends on which of your eyes is more dominant at close range, so that the marker seems to present itself to your 'strong' eye. Perhaps your eyesight depends on your right eye more often than your left, so that at closer ranges an offset to the right is detected?

fxlange 02 August, 2019, 07:52:39

@user-41353f you could calculate the relative distance between both eyes in 3d based on pupil or gaze data.

fxlange 02 August, 2019, 08:15:17

@user-a3b085 you can always disable the corresponding gameobjects or components in the inspector and enable them by code later at runtime.

fxlange 02 August, 2019, 08:31:13

@user-b3a50f We discontinued 2D in hmd-eyes 1.0. The setting you mentioned is no longer fully supported and will probably be removed before the final release.

fxlange 02 August, 2019, 09:32:55

@user-ee433b There is no easy way to combine Pupil Capture with third-party VR content. You would need to know the VR camera intrinsics to map pupil/gaze data onto a screen capture.
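To illustrate why the intrinsics matter: mapping a 3D gaze point onto a screen capture is essentially a pinhole projection, which requires the rendering camera's focal lengths and principal point. A sketch under assumed intrinsics (the function name and all values are hypothetical, not from Pupil):

```python
def project_to_pixels(point_3d, fx, fy, cx, cy):
    """Project a 3D point given in camera coordinates (x right, y down,
    z forward) onto the image plane with a simple pinhole model."""
    x, y, z = point_3d
    if z <= 0:
        raise ValueError("point must be in front of the camera")
    return (fx * x / z + cx, fy * y / z + cy)

# With made-up intrinsics for a 1080x1200 per-eye render:
u, v = project_to_pixels((0.1, 0.0, 1.0), fx=950.0, fy=950.0, cx=540.0, cy=600.0)
```

A point on the optical axis lands exactly at the principal point (cx, cy); without knowing fx/fy/cx/cy for the game's camera, there is no way to line gaze up with the captured frame.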

user-ee433b 02 August, 2019, 09:40:19

Hi, eye tracking is not needed. The pupil positions + diameters and blink durations are enough for my application, but I need to synchronize them with what happens in the headset.

user-ee433b 02 August, 2019, 09:41:23

*To rephrase: the point-to-point mapping is not needed, just raw data.

user-43b7f1 04 August, 2019, 05:36:54

@fxlange Guess I'll have to live without 2D calibration, as you mentioned.

On the calibration side, it would be extremely useful to pre-check my Vive sweet spot before going on with the calibration.

Something like a concentric circle mark in the background would be valuable for checking that I'm indeed putting on my HMD with my eyes properly centered. (Just like the original concentric background mark on the old calibration screen.)

user-ee433b 06 August, 2019, 10:48:40

@fxlange Hi, I think I was not clear. I don't need the eye tracking parts, I just need these 3 pieces of data: blink, pupil position (in the eye-cam reference frame), and pupil images. I don't think any of these require calibration, so I don't need the intrinsic parameters of the VR cam. It seems all 3 are recorded by Capture, but my problem is synchronizing the data collected by Capture with the VR videos. Have you done this before? Do you have any leads?

papr 06 August, 2019, 11:18:12

@user-ee433b In this case you can subscribe to the corresponding topics (blink, pupil, frame.pupil). Each message sent has its own timestamp in Pupil time. You can use hmd-eyes to convert from Pupil time to Unity time.
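A minimal Python sketch of that workflow. The SUB_PORT request and topic subscription follow the Pupil network API; the helper names are mine, and the clock offset is assumed to have been measured beforehand (e.g. via hmd-eyes' time sync):

```python
def pupil_to_unity_time(pupil_ts, offset):
    """Convert a Pupil timestamp to Unity time, given a previously
    measured clock offset (offset = unity_now - pupil_now)."""
    return pupil_ts + offset

def subscribe_raw_data(host="127.0.0.1", remote_port=50020):
    """Sketch: subscribe to blink, pupil and eye-frame topics via
    Pupil Remote. Requires pyzmq and msgpack (imported lazily so the
    sketch loads without them installed)."""
    import zmq
    import msgpack

    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect(f"tcp://{host}:{remote_port}")
    req.send_string("SUB_PORT")        # ask Pupil Remote for the SUB port
    sub_port = req.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{host}:{sub_port}")
    for topic in ("blink", "pupil", "frame.pupil"):
        sub.subscribe(topic)

    while True:
        # frame topics carry extra message parts (the image buffer);
        # the first two parts are always topic and msgpack payload
        topic, payload = sub.recv_multipart()[:2]
        msg = msgpack.unpackb(payload)
        print(topic, msg["timestamp"])  # timestamp is in Pupil time
```

Logging these timestamps alongside Unity's clock (plus the measured offset) is what lets the raw blink/pupil data be lined up with the VR video afterwards.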

user-ee433b 07 August, 2019, 08:18:35

Hi, thx for the answer. I plan to use an HTC "game", so I don't have access to the "game" sources to implement the synchronization and the data collection. What would be perfect is if I could feed the video from the game into Capture's "world" view. That way, the image from the game would be synchronized with pupils/blink/frame and I could analyze it with Pupil Player. Is there a way to "feed" something into the "world" view?

papr 07 August, 2019, 08:20:30

@user-ee433b That does not really work unfortunately, since we need to know the game camera's intrinsics to properly perform gaze mapping

user-ee433b 07 August, 2019, 08:21:43

But I don't need the gaze, just raw data

user-ee433b 07 August, 2019, 08:23:50

synchronized with the images of the game

papr 07 August, 2019, 08:40:48

I see

papr 07 August, 2019, 08:42:04

We will be releasing a plugin to stream simple rgb buffers to Capture. But this feature is still under construction.

user-ee433b 07 August, 2019, 09:11:32

Is there a way to select a monitor as input for the "world" view?

papr 07 August, 2019, 09:31:48

@user-ee433b No, that is not being planned. You would need to implement that functionality yourself.

user-ee433b 07 August, 2019, 09:46:55

ok, I'll do that

user-ee433b 07 August, 2019, 09:52:42

(I'm trying to install ZeroMQ C++ on Windows 10)

user-41353f 14 August, 2019, 17:57:02

Not sure if this is the right channel for this question, but: are you guys planning on creating hardware to support newer headsets (e.g. StarVR)?

user-360afa 15 August, 2019, 11:43:59

@mods I'm also wondering this! Will be important for dev decisions...

user-d230c0 19 August, 2019, 20:10:06

Can somebody explain the difference between 2D and 3D calibration? Also, in the calibration settings script there is a PositionKey function that returns "norm_pos" for 2D and "mm_pos" for 3D. What are they supposed to mean?

fxlange 22 August, 2019, 14:57:07

Hi @user-d230c0 - which version of hmd-eyes are you referring to? From hmd-eyes v1.0 onward, only binocular 3D mapping is supported. In the current beta we still have references to 2D calibration (dual monocular), but these will be cleaned up before the v1.0 release.
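For context on those two keys, hedged as my reading of the conventions rather than official documentation: "norm_pos" is a 2D position in normalized image coordinates (origin at the bottom-left, both axes in 0..1), while "mm_pos" was a 3D position in millimeters used by the legacy calibration. Converting a norm_pos to top-left-origin pixel coordinates would look like this (function name is mine):

```python
def norm_pos_to_pixels(norm_pos, width, height):
    """Map a normalized position with bottom-left origin (Pupil's
    norm_pos convention) to pixel coordinates with top-left origin.
    The y axis is flipped in the process."""
    nx, ny = norm_pos
    return (nx * width, (1.0 - ny) * height)

# Center of a hypothetical 1080x1200 per-eye viewport:
center_px = norm_pos_to_pixels((0.5, 0.5), 1080, 1200)
```

The y flip is the part that most often trips people up when overlaying gaze on images.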

fxlange 22 August, 2019, 15:23:18

@user-141bcd
- The default calibration settings are already on the safe side; I often even use settings with shorter intervals and fewer targets without any issues.
- Do you already have the 200Hz add-on? Due to the camera FOV and fewer reflections (fewer LEDs built in), the tracking is more robust (I'm even using it with my glasses on).
- Manually resetting the 3D model for new participants is recommended, yes.
- The intensity range may especially need fine-tuning for each participant individually; try to minimize the "leaking" of the blue indicator in the algorithm view.
- For targets close to the viewer (~1 m and less) it is especially important to take the gaze mapping context (monocular vs. binocular) into account for the gaze projection into the scene.

user-52f2e6 27 August, 2019, 18:06:31

Hi, when looking at data in gaze_positions.csv (recorded with Capture and exported from Pupil Player), how do the gaze_point_3d x, y, and z coordinates relate to Unity coordinates?

user-c5fb8b 28 August, 2019, 08:35:25

Hi @user-52f2e6, we export in the OpenCV coordinate system: positive x: right, positive y: down, positive z: forward.

To convert to Unity, I think you will just have to flip your y coordinate (multiply by -1).
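As a small sketch of that conversion, treating gaze_point_3d as a plain (x, y, z) tuple (the function name is mine):

```python
def opencv_to_unity(point):
    """Convert a gaze_point_3d from the exported OpenCV convention
    (x right, y down, z forward) to Unity's convention
    (x right, y up, z forward) by flipping the y axis."""
    x, y, z = point
    return (x, -y, z)

# A point below the camera axis in OpenCV coordinates
# ends up below the origin (negative y) in Unity as well:
unity_point = opencv_to_unity((0.02, 0.05, 0.6))
```

Note that both conventions here keep z pointing forward, so only y needs to change sign.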

fxlange 29 August, 2019, 07:49:50

@user-41353f @user-360afa We are currently evaluating the Oculus Quest and Valve Index but have not yet determined whether we will be making add-ons for these HMDs.

We do not have any concrete plans at this time to make/sell an add-on for the StarVR. However, we can offer to supply cameras and IR illuminators so that you can prototype an integration for this HMD. If interested, please contact [email removed]

user-360afa 29 August, 2019, 11:04:19

Thanks for your response!

End of August archive