Hey guys, I've been having constant issues with 3D eye calibration since August 2018...
Pupil recently changed the calibration process to 3D calibration only, but 3D calibration repeatedly fails for me, whereas 2D works. Every test with 3D calibration gives horrible eye tracking for my eyes, while the old Unity package version with 2D eye tracking works just fine.
It probably has something to do with my smaller eye openings (I'm Asian and my pupils are partially covered by my eyelids...). I've also found that people with bigger eyes (and thus more exposed pupils) tend to get better calibration when testing applications that use Pupil services.
As someone who uses eye trackers for surveys, supporting only 3D calibration means I won't be able to gather credible data from many people (who are also mostly Asian...) with the 3D calibration method alone. I would have to turn most of my survey participants away, as 3D calibration simply doesn't work for people with narrower eyes, which is very common in Asian regions... (and also for glasses, as infrared cannot pass through glasses). Can there be any solution to remedy this issue, like reverting to the 2D calibration option???
@user-644a65 We have noticed something similar. In particular, the center is skewed to the right and there seems to be a drifting effect: in the GazeDemoScene, for instance, if the user actively looks at the yellow dot in the scene, they will end up consistently drifting to the right side. This seems to be consistent across several users, so it's probably something to do with the calibration?
@fxlange thanks for your response (07/25/2019). I'm showing all calibration targets at the same distance as my stimuli. This was either 1 or 10 m, and obviously I scale everything to the right proportions when doing so. I used concentric circles ranging from 2° to 16° radius (2-4-8-16). The outermost targets (16° eccentricity), however, were mostly blurry due to being presented outside the Vive's sweet spot, so I have removed them now. Do you have any recommendations for fine-tuning the other calibration parameters (Secs per target, Ignore Init. Secs, Samples per target)? I tried longer intervals, but that rather decreased calibration quality.
Also, thanks for the link to the fine-tuning section on improving the pupil tracking. I was aware of that but I'm still trying to find the best settings. Any recommendations for the "Intensity Range"? It's obviously a trade-off, but is it better to err on the side of marking too much as pupil, or too little? Would you manually "reset the 3d model" (via the Capture GUI) before instructing the participant to look around in order to build a proper 3d model?
Hello, is there a way to disable eye tracking from within Unity? Do I just run the Connect coroutine inside of RequestController with retry set to false? And then, to try to connect later, just manually run that coroutine?
@user-52f2e6 For me it's the opposite. My dominant eye is the left, and I experience skewing to the left in the gaze demo scene. (My right eye is short-sighted, so I wear glasses unless I have to put on the HMD.) As for the offset at the center calibration marker, I think I've noticed something similar too, but this may really depend on which of your eyes is more dominant at close range, since the marker seems to present itself to your 'strong' eye. Perhaps your eyesight relies on your right eye more often than your left, so that at closer ranges an offset to the right is detected?
@user-41353f you could calculate the relative distance between both eyes in 3d based on pupil or gaze data.
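A minimal sketch of what that could look like, assuming the GazeController/GazeData API from the hmd-eyes beta (the OnReceive3dGaze event and the EyeCenter0/EyeCenter1 fields; names may differ in your version):

```csharp
using PupilLabs;
using UnityEngine;

public class EyeDistanceLogger : MonoBehaviour
{
    public GazeController gazeController; // assign in the inspector

    void OnEnable() { gazeController.OnReceive3dGaze += OnGaze; }
    void OnDisable() { gazeController.OnReceive3dGaze -= OnGaze; }

    void OnGaze(GazeData gaze)
    {
        // For binocular data both eye centers are given in local VR camera
        // space (meters), so their distance is the inter-eye distance.
        float distMm = Vector3.Distance(gaze.EyeCenter0, gaze.EyeCenter1) * 1000f;
        Debug.Log($"inter-eye distance: {distMm:F1} mm");
    }
}
```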
@user-a3b085 you can always disable the corresponding gameobjects or components in the inspector and enable them by code later at runtime.
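For example (a rough sketch; "requestController" here stands for whatever RequestController instance drives the connection in your scene):

```csharp
using PupilLabs;
using UnityEngine;

public class EyeTrackingToggle : MonoBehaviour
{
    public RequestController requestController; // assign in the inspector

    // Deactivating the GameObject stops the eye tracking components;
    // activating it again later re-enables them at runtime.
    public void SetEyeTrackingEnabled(bool on)
    {
        requestController.gameObject.SetActive(on);
    }
}
```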
@user-b3a50f we discontinued 2D in hmd-eyes 1.0. the setting you mentioned is not fully supported anymore and will probably be removed before the final release.
@user-ee433b there is no easy way to combine Pupil Capture with third-party VR content. You would need to know the VR camera intrinsics to map pupil/gaze data onto a screen capture.
Hi, eye tracking is not needed. The pupils' positions + diameters and blink duration are enough for my application, but I need to synchronize them with what happens in the headset
*Let me rephrase: point-to-point mapping is not needed, just raw data
@fxlange Guess I would have to live without 2D calibration, as you mentioned.
On the calibration side, it would be extremely useful to pre-check my Vive sweet spot before going on with the calibration.
Something like a concentric circle mark on the background would be valuable to check whether I'm indeed putting on my HMD with my eyes properly centered. (Just like the original concentric background mark on the old calibration screen.)
@fxlange Hi, I think I was not clear. I don't need the eye tracking parts, I just need these 3 kinds of data:
- Blink
- Pupil position (in the eye-cam frame of reference)
- Pupil images
I don't think any of these need calibration, so I don't need the intrinsic parameters of the VR cam. It seems all 3 are recorded by Capture, but my problem is synchronizing the data collected by Capture with the VR videos. Have you done this before? Do you have any leads?
@user-ee433b In this case you can subscribe to the corresponding topics (blink, pupil, frame.pupil). Each message sent has its own timestamp in Pupil time. You can use hmd-eyes to convert from Pupil to Unity time
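A rough sketch of that flow, assuming the SubscriptionsController and TimeSync helpers from the hmd-eyes beta (method names may differ in your version; see the demo scenes):

```csharp
using System.Collections.Generic;
using PupilLabs;
using UnityEngine;

public class RawDataListener : MonoBehaviour
{
    public SubscriptionsController subsCtrl; // assign in the inspector
    public TimeSync timeSync;                // assign in the inspector

    void OnEnable()
    {
        // subscribe to the raw data topics mentioned above
        subsCtrl.SubscribeTo("blink", ReceiveData);
        subsCtrl.SubscribeTo("pupil", ReceiveData);
    }

    void ReceiveData(string topic, Dictionary<string, object> data, byte[] thirdFrame = null)
    {
        // every message carries its own timestamp in Pupil time;
        // convert it to Unity time for synchronization
        double pupilTs = System.Convert.ToDouble(data["timestamp"]);
        var unityTime = timeSync.ConvertToUnityTime(pupilTs);
        Debug.Log($"{topic} at {unityTime} (Unity time)");
    }
}
```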
Hi, thx for the answer. I plan to use an HTC "game", so I don't have access to the game's sources to implement the synchronization and data collection. What would be perfect is if I could feed the video from the game into Capture's "world" view. That way, the game image would be synchronized with pupils/blink/frame and I could analyze it with Pupil Player. Is there a way to "feed" something into the "world" view?
@user-ee433b That does not really work unfortunately, since we need to know the game camera's intrinsics to properly perform gaze mapping
But I don't need the gaze, just raw data
synchronized with the images of the game
I see
We will be releasing a plugin to stream simple rgb buffers to Capture. But this feature is still under construction.
Is there a way to select a monitor as input for "world"?
@user-ee433b No, that is not being planned. It is on you to implement that functionality.
ok, I'll do that
(I'm trying to install zmq c++ on windows 10)
Not sure if this is the right channel for this question, but: are you guys planning on creating hardware to support newer headsets (e.g. StarVR)?
@mods I'm also wondering this! Will be important for dev decisions...
can somebody explain what's the difference between 2d and 3d calibration? also, in the calibration settings script there is a PositionKey function that returns "norm_pos" for 2d and "mm_pos" for 3d - what are they supposed to mean?
hi @user-d230c0 - which version of hmd-eyes are you referring to? from hmd-eyes v1.0 onward only binocular 3d mapping is supported. in the current beta we still have references to 2d calibration (dual monocular) but these will be cleaned up before the v1.0 release.
@user-141bcd
- the default calibration settings are already on the safe side - I often use settings with shorter intervals and fewer targets without any issues (see the sketch below)
- do you already have the 200Hz add-on? due to the camera FOV and fewer reflections (fewer LEDs built in) the tracking is more robust (I'm even using it with my glasses on)
- manually resetting the 3d model for new participants is recommended, yes
- the intensity range might especially need fine-tuning for each participant individually - try to minimize the "leaking" of the blue indicator in the algorithm view
- for targets close to the viewer (~1m and less) it is especially important to take the gaze mapping context (monocular vs binocular) into account for the gaze projection into the scene
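A purely hypothetical sketch of the first point - the field names below are guessed from the inspector labels mentioned earlier ("Secs per target", "Ignore Init. Secs", "Samples per target"), and the settings reference is assumed, so check your CalibrationController/CalibrationSettings for the actual API:

```csharp
using PupilLabs;
using UnityEngine;

public class CalibrationTuning : MonoBehaviour
{
    public CalibrationController calibrationController; // assign in the inspector

    void Awake()
    {
        // NOTE: hypothetical field names, guessed from the inspector labels -
        // verify against your CalibrationSettings asset.
        var settings = calibrationController.settings;
        settings.secondsPerTarget = 1.0f;     // shorter dwell time per target
        settings.ignoreInitialSeconds = 0.2f; // drop samples while the eye is still moving
        settings.samplesPerTarget = 40;       // fewer samples per target, faster calibration
    }
}
```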
Hi, when looking at data in gaze_positions.csv (recorded with Capture and exported from Pupil Player), how do the gaze_point_3d x, y, and z coordinates relate to Unity coordinates?
Hi @user-52f2e6 we are exporting in the OpenCV coordinate system:
positive x: right
positive y: down
positive z: forward
In order to convert to Unity, I think you will just have to flip your y coordinate (multiply by -1)
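For instance, a tiny helper for that conversion (nothing beyond the axis conventions above is assumed):

```csharp
using UnityEngine;

public static class PupilCoords
{
    // Pupil exports gaze_point_3d in an OpenCV-style camera frame
    // (x right, y down, z forward); Unity uses x right, y up, z forward,
    // so flipping y is the only change needed.
    public static Vector3 ToUnity(Vector3 pupilPoint)
    {
        return new Vector3(pupilPoint.x, -pupilPoint.y, pupilPoint.z);
    }
}
```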
@user-41353f @user-360afa We are currently evaluating Oculus Quest and Valve Index but have not yet determined if we will be making add-ons for these HMDs.
We do not have any concrete plans at this time to make/sell an add-on for the StarVR. However, we can offer to supply cameras and IR illuminators so that you can prototype an integration for this HMD. If interested, please contact [email removed]
Thanks for your response!