Hello to the community! I was wondering if someone could help explain how to create a gaze plot from Pupil data? Something that looks like this:
Hi @user-f68ceb I can make a demo of this for you on Monday as an extension of this tutorial: 02_load_exported_surfaces_and_visualize_aggregate_heatmap.ipynb
This would be fantastic, @wrp. Thanks in advance.
@wrp actually I'll be interested too... )
Can you tag me when you publish it? @wrp thank you!
Sure thing
@wrp thanks
@user-f68ceb and @user-1bcd3e here is a quick example: https://github.com/pupil-labs/pupil-tutorials/blob/master/03_visualize_scan_path_on_surface.ipynb
This could surely be improved/extended, but I just wanted to show you how it could be done quickly. The tutorial assumes that you have already used Pupil Player with the surface tracking plugin and the offline fixation detector plugin.
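In case it's useful as a preview of the notebook: the core of it is just reading the fixations-on-surface CSV that Pupil Player exports and drawing lines between consecutive fixations. Here's a minimal sketch along those lines; the export path, the surface name, and the exact column names are my assumptions from a typical export, so check your own exports folder (and the notebook) rather than trusting these verbatim:
```python
import pandas as pd
import matplotlib.pyplot as plt

# This export has one row per world frame per fixation (assumed layout),
# so de-duplicate on fixation_id and keep only on-surface fixations.
fix = pd.read_csv("exports/000/surfaces/fixations_on_surface_unnamed.csv")
fix = (fix[fix["on_surf"] == True]
       .drop_duplicates(subset="fixation_id")
       .sort_values("start_timestamp"))

x, y = fix["norm_pos_x"], fix["norm_pos_y"]
plt.plot(x, y, color="gray", alpha=0.5)            # scan path
plt.scatter(x, y, s=fix["duration"], alpha=0.7)    # marker size ~ dwell time
for order, (xi, yi) in enumerate(zip(x, y), start=1):
    plt.annotate(str(order), (xi, yi))             # fixation order
plt.xlim(0, 1)
plt.ylim(0, 1)
plt.title("Scan path on surface (normalized surface coordinates)")
plt.show()
```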
This is brilliant! Thanks so much for this, @wrp. I'll try it out on the weekend.
You're welcome @user-f68ceb
@wrp thank you! But I'll never manage it )) ... I'll wait for a Pupil developer to make a plugin I can use ((
I'll try it, it's very cool!
Hi everyone, we are piloting our new Pupil Labs system, and I'm wondering if anyone has used Pupil Labs to track ocular torsion responses?
Why do we need to calibrate? What is the purpose of it? ...thanks
@user-e45dc9 if I remember correctly, Annemiek Barsingerhorn is doing that, but I might be misremembering.
Thanks @user-af87c8 - I will reach out.
@user-af87c8 great comparison paper of Pupil Labs with the EyeLink! It appears you found it to be fairly accurate, good for medium-to-large saccades (NOT microsaccades), and that the default blink-detection algorithm is poor.
What do you speculate is the reason for calibration degradation after 3-4 minutes? What do you think the solution would be for maintaining a high, stable frame rate (240 Hz?)? For researchers like us, what do you think would be the best solution for timing accuracy (synchronizing stimulus display time with eye movement time)?
Thanks and great work!
hi @user-7890df. I'm happy that you like the paper :)
Hope that helps! Btw, I don't check the chat here very regularly, so you got lucky :)
Hi Pupil Labs Community! Just got on here. I am a researcher at Columbia University and we use the EyeLink 1000. Just curious here, but does anyone have experience using VR eye tracking for research purposes? We are exploring that option, looking at Pupil Labs and the option Tobii Pro has for VR. Any views/info/resources you can point me to? Thanks!
@user-71df6f Yes, I have experience. Can you describe the use case?
@user-8779ef Thanks! The main thing we are trying to do with the 3D eye tracking is a convergence test. We want to simulate an object coming towards the subject and be able to measure how well (or to what extent) they were able to converge. I have only ever worked with the 2D EyeLink 1000 tracker, so I am not even sure what kind of raw data Pupil Labs or other VR eye trackers output. Any insight would be super helpful. Cheers!
@user-71df6f I'm also a vision scientist. I'm not confident that this eye tracker has the precision for that kind of work. You may be able to get a delta signal (rate of convergence), but it will be noisy.
The Tobii is no better. If you want some insight into why, calculate the convergence angle to a point of regard 1.5 m away from the head. Then, calculate how much the point of regard would shift (in depth) if there were, oh... 0.5° of change in convergence angle. You'll find that there is a large change. This is the relationship that defines the precision required of your eye tracker for this kind of research, and it does not account for any biological noise, which will only add to the imprecision.
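For anyone who wants to see the numbers, here is that back-of-the-envelope calculation as a short Python sketch. The 63 mm inter-pupillary distance and the point being straight ahead are my assumptions for the sake of the example:
```python
import math

IPD_M = 0.063  # assumed inter-pupillary distance (63 mm)

def vergence_deg(depth_m, ipd_m=IPD_M):
    # Convergence angle for a point straight ahead at depth_m:
    # each eye rotates atan((ipd/2) / depth) inward.
    return 2.0 * math.degrees(math.atan((ipd_m / 2.0) / depth_m))

def depth_from_vergence(angle_deg, ipd_m=IPD_M):
    # Inverse: the depth implied by a measured convergence angle.
    return (ipd_m / 2.0) / math.tan(math.radians(angle_deg / 2.0))

theta = vergence_deg(1.5)  # ~2.4 deg for a point 1.5 m away
for err in (-0.5, +0.5):   # a half-degree tracker error, either direction
    d = depth_from_vergence(theta + err)
    print(f"{err:+.1f} deg -> depth estimate {d:.2f} m (shift {d - 1.5:+.2f} m)")
# -0.5 deg reads as ~1.89 m, +0.5 deg as ~1.24 m: a 0.25-0.4 m depth
# swing from half a degree of angular error, before any biological noise.
```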
Finally, if you haven't yet, you should maybe do some reading on the human sensitivity to changes in convergence angle. I have a colleague who might be able to suggest a bit of reading on the matter. I'll confer with him today and see if he has a source for that.
In any case, I can assure you that neither the Tobii nor the Pupil Labs, nor any mobile eye tracker / VR integration will be very accurate in recovering the binocular point of regard in depth at distances beyond ... oh, 2 meters. Total guess, but I'm fairly confident about that.
Finally, make sure you're familiar with the vergence / accommodation conflict in VR, which will certainly influence vergence kinematics.
There's a paper by Hoffman, Banks et al. in... Journal of Vision, I think, that you will want to read.
@user-8779ef Thanks so much for the response. I'll definitely read the Banks paper. Please do refer me to any literature you think would help us understand a convergence test, and any vision science literature. We are thinking of a convergence test where you would take a small object, move it towards the participant's nose, and see to what extent their eyes are able to follow it, then make intergroup comparisons.
With regard to the tracking device, do you think the EyeLink II wearable might work? I am very new to eye-tracking research, so I really do appreciate the help.
@user-8779ef Do you have a link to the Hoffman, Banks et al. paper you are referring to? Thanks!