@user-29e10a we are currently running a battery of tests on Pupil Labs (all major eye movement types are compared against a simultaneously recorded EyeLink 1000). If you need experimental code for accuracy, smooth pursuit, blinks, microsaccades, head motion, or free viewing (all in Psychtoolbox/MATLAB), feel free to go to http://github.com/behinger/etcomp and/or contact me. Unfortunately, this is not in VR.
@user-af87c8 we are interested in transitioning from the EyeLink 1000 to Pupil Labs. Do you have Psychtoolbox + Pupil Labs code in your repository?
@user-af87c8 thank you very much, I will look into this
Good afternoon. Has anybody conducted a field study in a store with the help of Pupil Labs?
@user-1efa49 yes. For experimental use there are several things to do:
- We display the Pupil Labs surface markers using a modified flip-screen function [1].
- We connect to Pupil Labs via zmq [2]; see [3] for how we had to compile the mex file under Ubuntu (the available zmq interfaces for MATLAB are buggy).
- We send messages to Pupil Labs and start/stop recordings with our own plugin [4] (see the sketch below). I recommend replacing the special strings, e.g. "r" for start recording, with more sensible commands such as "recording start"; otherwise strings like "remoteTrial" (due to the initial "r") would trigger a recording start as well. This script is adapted from Pupil Labs.
- Note this line [5]: here we fetch the most recent Pupil timestamp. This is necessary because in 30-40% of cases we observed large drifts between the camera clocks and the recording computer's clock (GitHub issue forthcoming), and therefore also between the message/trigger timestamp and the actual recording.
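Since the MATLAB/mex setup is hard to show compactly, here is a minimal Python sketch of the same request/reply pattern against the stock Pupil Remote plugin. Assumptions: Pupil Capture is running locally with Pupil Remote on its default port 50020; our own plugin [4] uses different command strings, so treat this only as an illustration of the pattern.
```python
# Minimal sketch (Python instead of our MATLAB/mex setup): talk to the stock
# Pupil Remote plugin over zmq. Assumes Pupil Capture is running locally with
# Pupil Remote on its default port 50020.
import zmq

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")

# Query Pupil Capture's current clock. Doing this around your triggers lets you
# map experiment time onto Pupil time, which matters given the clock drift
# between the cameras and the recording computer mentioned above.
remote.send_string("t")
pupil_time = float(remote.recv_string())
print("current pupil timestamp:", pupil_time)

# Start a recording; with the stock commands "R" starts and "r" stops
# (our plugin replaces these single-letter commands with explicit strings).
remote.send_string("R my_session")
print(remote.recv_string())  # acknowledgement from Pupil Remote

# ... run the trial ...

remote.send_string("r")
print(remote.recv_string())

remote.close()
ctx.term()
```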
The experimental code is collaborative work with Inga Ibs
----- Analysis ----- Here it gets more interesting. We want a fully reproducible analysis pipeline without manual steps (we don't want things like "now start Pupil Player and detect surfaces"). Therefore we interface the calibration and surface-detection functions directly from the Pupil Labs source, which unfortunately requires many of the Pupil Labs libraries to be compiled. Skip this if you have already compiled the Pupil Labs software. The good news: there is a make file and we do not need sudo rights for most things; the bad news: one or two libraries need to be installed using sudo [6].
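For illustration, this is roughly what calling the Pupil Labs source directly looks like. The clone path and helper names here are assumptions based on the layout of the pupil repository; check et_import.py [7] for the calls we actually use.
```python
# Sketch only: make the compiled Pupil Labs source importable and load a
# recording with its own helpers. Paths and names are assumptions; see
# et_import.py [7] for what we actually call.
import os
import sys

# Point Python at the shared modules of your compiled pupil clone.
PUPIL_SRC = os.path.expanduser("~/pupil/pupil_src/shared_modules")
sys.path.append(PUPIL_SRC)

import file_methods  # Pupil Labs' own (de)serialization helpers

recording = "/path/to/recording/000"
pupil_data = file_methods.load_object(os.path.join(recording, "pupil_data"))
print(pupil_data.keys())  # e.g. pupil_positions, gaze_positions, notifications
```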
If you need a detailed analysis pipeline, I think it's best to look into the scripts [7] and [8]. There is quite a bit of overhead because we need to analyse EyeLink and Pupil Labs data at the same time, but I think the code is factored OK 😃
[1] https://github.com/behinger/etcomp/blob/master/experiment/flip_screen.m
[2] https://github.com/behinger/etcomp/blob/master/experiment/ETcomp.m#L60
[3] https://github.com/behinger/etcomp/blob/master/experiment/zmq/compile.m
[4] https://github.com/behinger/etcomp/blob/master/experiment/plugins/nbp_pupil_remote.py
[5] https://github.com/behinger/etcomp/blob/master/experiment/plugins/nbp_pupil_remote.py#L237
[6] https://github.com/behinger/etcomp#new-installation
[7] https://github.com/behinger/etcomp/blob/master/code/functions/et_import.py#L85 <-- Pupil Labs import + some fixes
[8] https://github.com/behinger/etcomp/blob/master/code/functions/et_preprocess.py <-- main analysis pipeline
Wow! Thanks so much for the detailed description and code links.
I am implementing the paper "Hybrid eye center localization using cascaded regression and hand-crafted model fitting". Please help me with the implementation in OpenCV.
@user-11fa54 are you referencing this paper: https://www.sciencedirect.com/science/article/pii/S0262885618300040 ?
yes this is the paper
@user-11fa54 this looks like it falls within the domain of remote eye tracking rather than head-mounted/wearable eye tracking. The Pupil community is primarily focused on head-mounted/wearable eye tracking, but perhaps someone in the community can provide you with some pointers. However, if you want concrete feedback, I would suggest sharing some code that you are already working on (e.g. a GitHub link to your repository).
ok
Hello everyone, I'm trying to replicate a study similar to White et al.'s "Usability Comparison of Conventional Direct Control Versus Pattern Recognition Control of Transradial Prostheses", which measures cognitive load as a function of the pupillary increases per second (NPI). They cite the Index of Cognitive Activity (ICA), which is patented. Basically, both measure a rate of pupillary increases while performing a task. I'm hoping to perform a similar study with the eye tracker and have recorded some preliminary data, but I'm having trouble finding a standard method of filtering the pupil diameter data. I'm in touch with the author, who has been helpful, but I just wanted to see if anyone in this group may have insights we don't!
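Not a standard method, but in case it helps as a starting point, here is a rough sketch of a common cleaning approach for the Pupil Labs diameter export: drop low-confidence samples, interpolate the gaps, and low-pass filter before counting dilation events. All column names, thresholds, and cutoffs below are my own assumptions, not a validated pipeline.
```python
# Rough sketch of a common pupil-diameter cleaning approach; column names,
# thresholds, and cutoffs are assumptions, not a validated standard.
import numpy as np
import pandas as pd
from scipy.signal import butter, filtfilt

# pupil_positions.csv as exported by Pupil Player; assumed to contain
# 'confidence' and 'diameter' columns.
df = pd.read_csv("pupil_positions.csv")

# 1) Discard low-confidence samples (blinks, detector failures).
df.loc[df["confidence"] < 0.6, "diameter"] = np.nan

# 2) Interpolate the resulting gaps.
diameter = df["diameter"].interpolate(limit_direction="both")

# 3) Low-pass filter to suppress measurement noise before looking at dilations.
fs = 120.0    # eye-camera sampling rate in Hz (adjust to your recording)
cutoff = 4.0  # Hz; pupil responses are slow, so a low cutoff is typical
b, a = butter(3, cutoff / (fs / 2), btype="low")
diameter_smooth = filtfilt(b, a, diameter.to_numpy())

# diameter_smooth can now be used to detect/count dilation events per second.
```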