🔬 research-publications


user-af87c8 05 August, 2018, 06:34:56

@user-29e10a we are currently running a battery of tests on Pupil Labs (all major eye movement types are compared against a simultaneously recorded EyeLink 1000). If you need experimental code for accuracy, smooth pursuit, blinks, microsaccades, head motion, or free viewing (all in Psychtoolbox/MATLAB), feel free to go to http://github.com/behinger/etcomp and/or contact me. This is unfortunately not in VR.

user-1efa49 05 August, 2018, 19:36:58

@user-af87c8 we are interested in transitioning from the EyeLink 1000 to Pupil Labs - do you have Psychtoolbox + Pupil Labs code in your repository?

user-29e10a 06 August, 2018, 06:58:25

@user-af87c8 thank you very much, I will look into this

user-d9bb5a 07 August, 2018, 08:13:07

Good afternoon. Has anybody conducted an in-store field study with the help of Pupil Labs?

user-af87c8 07 August, 2018, 17:22:02

@user-1efa49 yes. For experimental use there are several things to do:
- We display Pupil Labs surface markers using a modified flip screen [1].
- We connect to Pupil Labs using zmq [2]; see [3] for how we had to compile the mex under Ubuntu - the available zmq interfaces for MATLAB are buggy.
- We send messages to Pupil Labs and record with our own plugin [4]. I recommend exchanging the special strings, e.g. "r" for start recording, for more sensible commands like "recording start" - otherwise strings such as "remoteTrial" would trigger a recording start as well (because of the initial "r"). This script is adapted from Pupil Labs.
- Note this line [5]: here we fetch the timestamp of the most recent sample. This is necessary because sometimes (in 30-40% of cases) we observed large drifts between the camera clocks and the recording computer's clock (GitHub issue forthcoming), and therefore also between the message/trigger timestamp and the actual recording. A minimal sketch of this setup follows below.

The experimental code is collaborative work with Inga Ibs.

----- Analysis ----- Here it gets more interesting. We want a fully reproducible analysis pipeline without manual steps (we don't want things like "start Pupil Player now and detect surfaces"). Therefore we interface the calibration functions and surface-detection functions directly from Pupil Labs. This unfortunately requires many of Pupil Labs' libraries to be compiled. Skip this if you have already compiled the Pupil Labs software. The good news: there is a Makefile, and we do not need sudo rights for most things. The bad news: there are one or two libraries that need to be installed using sudo [6].

If you need a detailed analysis pipeline, I think it's best to look into the scripts [7] and [8]. There is quite a bit of overhead because we need to analyse EyeLink + Pupil Labs at the same time. But I think the code is factored OK 😃

user-1efa49 09 August, 2018, 02:44:47

Wow! Thanks so much for the detailed description and code links.

user-11fa54 10 August, 2018, 09:44:23

I am implementing the paper "Hybrid eye center localization using cascaded regression and hand-crafted model fitting". Please help me with the implementation in OpenCV.

wrp 10 August, 2018, 09:44:37

@user-11fa54 are you referencing this paper: https://www.sciencedirect.com/science/article/pii/S0262885618300040 ?

user-11fa54 10 August, 2018, 09:44:58

yes this is the paper

wrp 10 August, 2018, 09:46:26

@user-11fa54 this looks like it falls within the domain of remote eye tracking, not head-mounted/wearable eye tracking. The Pupil community is primarily focused on head-mounted/wearable eye tracking, but perhaps someone in the community can respond and provide you with some pointers. If you want concrete feedback, I would suggest sharing some code you are already working on (e.g. a GitHub link to your repository).

user-11fa54 10 August, 2018, 09:46:54

ok

user-6d6c44 22 August, 2018, 20:45:37

Hello everyone, I'm trying to replicate a study similar to White et al., "Usability Comparison of Conventional Direct Control Versus Pattern Recognition Control of Transradial Prostheses", which measures cognitive load as a function of the pupillary increases per second (NPI). They cite the Index of Cognitive Activity (ICA), which is patented. Basically, both measure a rate of pupillary increases while performing a task. I'm hoping to perform a similar study with the eye tracker and have recorded some preliminary data, but I'm having trouble finding a standard method of filtering the pupil-diameter data. I'm in touch with the author, who has been helpful, but I just wanted to see if anyone in this group may have insights we don't!

End of August archive