πŸ”¬ research-publications


user-f68ceb 02 March, 2019, 12:47:10

Hello to the community, I was wondering if someone could help explain how to create a gaze plot from Pupil data? Something that looks like this:

user-f68ceb 02 March, 2019, 12:47:20

Chat image

wrp 02 March, 2019, 13:13:54

Hi @user-f68ceb I can make a demo of this for you on Monday as an extension of this tutorial: 02_load_exported_surfaces_and_visualize_aggregate_heatmap.ipynb

user-f68ceb 02 March, 2019, 22:10:16

This would be fantastic, @wrp – thanks in advance.

user-1bcd3e 03 March, 2019, 09:30:39

@wrp actually I'd be interested too....

user-1bcd3e 03 March, 2019, 09:31:55

Can you tag me when you'll be publishing? @wrp Thank you!

wrp 03 March, 2019, 10:54:28

Sure thing

user-1bcd3e 03 March, 2019, 18:59:49

@wrp thanks

wrp 04 March, 2019, 04:43:14

@user-f68ceb and @user-1bcd3e here is a quick example: https://github.com/pupil-labs/pupil-tutorials/blob/master/03_visualize_scan_path_on_surface.ipynb

wrp 04 March, 2019, 04:44:14

This could surely be improved/extended, but I just wanted to show you how it could be done quickly. The tutorial assumes that you have already used Pupil Player with the surface tracking plugin and the offline fixation detector plugin.
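
A minimal sketch of the gaze-plot idea from the linked tutorial: fixations on a surface drawn as numbered circles, scaled by duration and connected in temporal order. The export path and column names (fixations_on_surface_Surface1.csv, norm_pos_x, norm_pos_y, duration, on_srf) are assumptions based on a typical Pupil Player surface export and may differ between Pupil versions.

```python
# Sketch: gaze plot (scan path) from a Pupil Player surface export.
# File and column names are assumptions and may differ by Pupil version.
import pandas as pd
import matplotlib.pyplot as plt

fixations = pd.read_csv("exports/000/surfaces/fixations_on_surface_Surface1.csv")
# keep only fixations that landed on the surface, in temporal order
fixations = fixations[fixations["on_srf"].astype(str) == "True"]
fixations = fixations.sort_values("start_timestamp")

x = fixations["norm_pos_x"]          # normalized surface coordinates, 0..1
y = fixations["norm_pos_y"]
sizes = fixations["duration"] * 2.0  # circle area ~ fixation duration (factor chosen for visibility)

fig, ax = plt.subplots(figsize=(8, 6))
ax.plot(x, y, "-", color="gray", alpha=0.6, zorder=1)        # scan path
ax.scatter(x, y, s=sizes, alpha=0.5, zorder=2)               # fixation circles
for i, (xi, yi) in enumerate(zip(x, y), start=1):
    ax.annotate(str(i), (xi, yi), ha="center", va="center")  # fixation order
ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
ax.set_title("Fixation scan path on surface")
plt.show()
```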

user-f68ceb 04 March, 2019, 15:49:51

This is brilliant! Thanks so much for this, @wrp – I'll try it out on the weekend.

wrp 04 March, 2019, 23:45:38

You're welcome @user-f68ceb

user-1bcd3e 05 March, 2019, 00:52:23

@wrp thank you! But I'll never manage it )) .... I'll wait for the Pupil developers to release a plugin I can use ((

user-d9bb5a 06 March, 2019, 12:53:12

I'll try, it's very cool!

user-e45dc9 08 March, 2019, 21:41:21

Hi everyone, we are piloting our new Pupil Labs system, and I'm wondering if anyone has used Pupil Labs to track ocular torsion responses?

user-0a2ebc 09 March, 2019, 02:24:06

Why do we need to calibrate? What is the purpose of it? ...thanks

user-af87c8 12 March, 2019, 15:55:09

@user-e45dc9 If I remember correctly, Annemiek Barsingerhorn is doing that, but I might be misremembering.

user-e45dc9 14 March, 2019, 18:27:28

Thanks @user-af87c8 - I will reach out.

user-7890df 17 March, 2019, 18:39:18

@user-af87c8 Great comparison paper of Pupil Labs with the EyeLink! It appears that you believe it is fairly accurate, good for medium-to-large saccades (NOT microsaccades), and that the default algorithm for detecting blinks is poor.

What do you speculate is the reason for calibration degradation after 3-4 minutes? What do you think will be the solution for having it maintain a high static frame rate (240 Hz)? For researchers like us, what do you think would be the best solution for timing accuracy (aligning stimulus display time with eye movement time)?

Thanks and great work!

user-af87c8 21 March, 2019, 22:56:43

hi @user-7890df. I'm happy that you like the paper :)

  • Calibration degradation: Could be slippage. As we state in the paper, the 3D algorithm might have overall worse, but more stable, performance. But I haven't had the time to look at the data yet (now in a different lab, different topic, etc.)
  • To get a high static frame rate: That is a very good question that Pupil Labs should probably answer. In principle you could try to restart the eye tracker, measure the time lag between the cameras, and repeat until you are happy.
  • Timing accuracy: I liked the Pupil Labs ZMQ interface, so I send a message whenever an experimental event happens and later use those messages to synchronize. We found the temporal lag of sending those messages to be <1 ms (see the sketch after this message).

Hope that helps! Btw. I don't check the chat here very regularly, you got lucky πŸ˜‰
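
A minimal sketch of that event-marking approach, assuming Pupil Capture is running with Pupil Remote on its default port (50020); the annotation fields follow the pupil-helpers examples and may differ between Pupil versions.

```python
# Sketch: marking experimental events over the Pupil ZMQ interface.
# Assumes Pupil Capture with Pupil Remote on the default port 50020;
# annotation fields follow the pupil-helpers examples (may vary by version).
import time
import msgpack
import zmq

ctx = zmq.Context()

# Pupil Remote (REQ/REP): query the Pupil clock and the IPC PUB port.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

pupil_remote.send_string("t")          # current Pupil time
pupil_time = float(pupil_remote.recv_string())

pupil_remote.send_string("PUB_PORT")   # port for publishing messages
pub_port = pupil_remote.recv_string()

pub = ctx.socket(zmq.PUB)
pub.connect(f"tcp://127.0.0.1:{pub_port}")
time.sleep(0.1)                        # let the PUB socket finish connecting

# Publish an annotation for a (hypothetical) experimental event; it is
# saved with the recording and can be used for synchronization later.
label = "stimulus_onset"
annotation = {
    "topic": f"annotation.{label}",
    "label": label,
    "timestamp": pupil_time,
    "duration": 0.0,
}
pub.send_multipart(
    [annotation["topic"].encode(), msgpack.packb(annotation, use_bin_type=True)]
)
```

Because the annotation carries a Pupil timestamp, stimulus events can later be aligned with gaze data on the same clock.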

user-71df6f 23 March, 2019, 18:55:07

Hi Pupil Labs community! Just got on here. I am a researcher with Columbia University and we use the EyeLink 1000. Just curious here, but does anyone have experience using VR eye tracking for research purposes? We are exploring that option, looking at Pupil Labs and the option Tobii Pro has for VR. Any views/info/resources you can point me to? Thanks πŸ˜ƒ

user-8779ef 26 March, 2019, 18:35:43

@user-71df6f Yes, i have experience. Can you describe the use case?

user-71df6f 28 March, 2019, 03:24:39

@user-8779ef Thanks! The main thing we are trying to do with the 3D eye tracking is a convergence test. We want to simulate an object coming towards the subject and be able to measure how well (or to what extent) they were able to converge. I have only ever worked with the 2D EyeLink 1000 eye tracker, so I am not even sure what kind of raw data Pupil Labs or other VR eye trackers output. Any insight would be super helpful. Cheers!

user-8779ef 28 March, 2019, 11:37:57

@user-71df6f I'm also a vision scientist. I'm not confident that this eye tracker has the precision for that kind of work. You may be able to get a delta signal (rate of convergence), but it will be noisy.

user-8779ef 28 March, 2019, 11:39:34

The Tobii is no better. If you want some insight into why, calculate the convergence angle to a point of regard 1.5 m away from the head. Then, calculate how much the point of regard would shift (in depth) if there were, oh ... 0.5˚ of change in convergence angle. You'll find that there will be a large change (a quick worked example below). This is the relationship that defines the precision required of your eye tracker for this kind of research. And this calculation does not account for any biological noise, which will only add to the imprecision.
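
A back-of-the-envelope version of that calculation, assuming a typical interpupillary distance of 63 mm (an assumed value, not a figure from the discussion):

```python
# How far does the binocular point of regard shift in depth when the
# convergence angle changes by 0.5 degrees? Simple symmetric-fixation
# geometry; IPD is an assumed typical value.
import math

IPD = 0.063  # interpupillary distance in meters (assumed typical value)

def vergence_angle(depth_m):
    """Convergence angle (degrees) for a target at depth_m, straight ahead."""
    return 2.0 * math.degrees(math.atan((IPD / 2.0) / depth_m))

def depth_from_vergence(angle_deg):
    """Fixation depth (meters) implied by a convergence angle in degrees."""
    return (IPD / 2.0) / math.tan(math.radians(angle_deg / 2.0))

angle = vergence_angle(1.5)  # ~2.41 deg at 1.5 m
print(f"vergence at 1.5 m: {angle:.2f} deg")
print(f"depth at {angle - 0.5:.2f} deg: {depth_from_vergence(angle - 0.5):.2f} m")  # ~1.89 m
print(f"depth at {angle + 0.5:.2f} deg: {depth_from_vergence(angle + 0.5):.2f} m")  # ~1.24 m
```

At 1.5 m the convergence angle is only about 2.4˚, so a 0.5˚ error moves the implied fixation depth to roughly 1.24 m or 1.89 m, i.e. by 0.25-0.4 m.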

user-8779ef 28 March, 2019, 11:42:32

Finally, if you haven't yet, you should maybe do some reading on human sensitivity to changes in convergence angle. I have a colleague who might be able to suggest a bit of reading on the matter. I'll confer with him today and see if he has a source for that.

user-8779ef 28 March, 2019, 11:44:02

In any case, I can assure you that neither the Tobii nor the Pupil Labs, nor any mobile eye tracker / VR integration will be very accurate in recovering the binocular point of regard in depth at distances beyond ... oh, 2 meters. Total guess, but I'm fairly confident about that.

user-8779ef 28 March, 2019, 11:45:44

Finally, make sure you're familiar with the vergence / accommodation conflict in VR, which will certainly influence vergence kinematics.

user-8779ef 28 March, 2019, 11:46:24

There's a paper by Hoffman, Banks et al. in ... Journal of Vision, I think, that you will want to read.

user-71df6f 29 March, 2019, 19:10:22

@user-8779ef Thanks so much for the response. I'll definitely read the Banks paper. Please do refer me to any literature you think would help us understand a convergence test, as well as any vision science literature. We are thinking of a convergence test where you would take a small object, move it toward the participant's nose, and see to what extent their eyes are able to follow it, then make intergroup comparisons.

user-71df6f 29 March, 2019, 19:11:03

With regard to the tracking device, do you think the EyeLink II wearable might work? I am very new to eye-tracking research, so I really do appreciate the help.

user-8b1528 29 March, 2019, 19:40:41

@user-8779ef Do you have a link to the Hoffman, Banks et al. paper you are referring to? Thanks!

End of March archive