💻 software-dev


user-fd8504 05 January, 2022, 12:34:14

Hi. I have a question - does the Pupil Labs binocular add-on for eye tracking give access to the camera image?

nmt 05 January, 2022, 13:10:59

I have responded in the vr-ar channel: https://discord.com/channels/285728493612957698/285728635267186688/928273538132086804

user-cb599e 20 January, 2022, 20:19:27

Hi, I have a question that I have not been able to find an answer to online: how is the 3D debug window supposed to look when it's working? Here is a screenshot of what I get in an example recording. Is that normal, or should the 3D eye model intersect the 2D frame plane where the pupil is detected? We have a custom camera, so that might be the source of my problems. Could this be explained by not having properly estimated the camera intrinsics?

Chat image

user-b91cdf 25 January, 2022, 10:33:17

Hi everyone, is it possible to get the results of the Accuracy Visualizer plugin via ZMQ? It seems like they are not published in the validation data.
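One way to check what actually arrives over the IPC is to subscribe to all notifications via the Network API and watch what is published around a validation; whether the Accuracy Visualizer's result shows up there is exactly the open question. A minimal sketch using the documented ZMQ/msgpack interface (IP and port are the defaults, adjust as needed):

```python
# Minimal sketch: subscribe to Pupil Capture notifications over ZMQ to see
# which calibration/validation messages are actually published.
import zmq
import msgpack

ctx = zmq.Context()

# Pupil Remote (REQ socket) tells us the subscription port.
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # default Pupil Remote port
req.send_string("SUB_PORT")
sub_port = req.recv_string()

# Subscribe to all notification topics on the IPC backbone.
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "notify.")

while True:
    topic, payload = sub.recv_multipart()
    notification = msgpack.unpackb(payload, raw=False)
    print(topic.decode(), notification)
```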

papr 25 January, 2022, 21:02:20

Hi, apologies for the late reply.

> Could this be explained by not having properly estimated the camera intrinsics?

Yes, that could indeed be the case. The crucial part of the intrinsics for the eye model is the focal length.
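For reference, with a custom eye camera the intrinsics are typically estimated from checkerboard images, either with Pupil Capture's Camera Intrinsics Estimation plugin or directly with OpenCV. A rough sketch of the OpenCV route, with board size and image paths as placeholders:

```python
# Hypothetical sketch: estimating intrinsics for a custom eye camera from
# checkerboard images with OpenCV, then reading off the focal length in pixels.
import glob
import cv2
import numpy as np

board = (9, 6)  # inner corners per row/column of the checkerboard (placeholder)
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
img_size = None
for path in glob.glob("eye_cam_calib/*.png"):  # placeholder folder of calibration images
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, img_size, None, None)
print("reprojection RMS:", rms)
print("focal length (px): fx =", K[0, 0], ", fy =", K[1, 1])
```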

user-cb599e 25 January, 2022, 21:17:44

So you confirm that the eye model should intersect the frame plane at the pupil?

papr 26 January, 2022, 10:10:03

This is how the model looks for me using the default Pupil Labs eye cam intrinsics

Chat image

user-cb599e 25 January, 2022, 21:21:43

Also, the focal length (as stored in the camera matrix) is measured in pixels, if I am correct?

papr 26 January, 2022, 10:10:08

correct
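For reference, the pinhole camera matrix stores the focal length in pixel units, i.e. the physical focal length divided by the sensor's pixel pitch; a purely illustrative example:

```python
# Purely illustrative numbers: converting a physical focal length to the
# pixel-unit value stored in the camera matrix K.
f_mm = 3.6               # physical focal length in mm (example value)
pixel_pitch_mm = 0.003   # sensor pixel size in mm (example value)
f_px = f_mm / pixel_pitch_mm
print(f_px)              # 1200.0 -> this is what goes into K[0, 0] and K[1, 1]
```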

user-51f490 28 January, 2022, 13:53:40

Hello everyone, I am new here and new to the whole world of Pupil Labs. I would like some information, assistance, or even basic instructions to follow in order to use the device (Pupil Labs) in my future experimental work. My situation at the moment: I have two PCs available. The first, a Windows 10 x64 machine, has Pupil Capture/Player installed and is where I can use the device. The second, running Ubuntu 18.04, has MATLAB with Psychtoolbox installed; I created the experimental task there, so it is dedicated to stimulus presentation. I chose Ubuntu based on the information about the libraries you suggest (ZMQ & msgpack), in order to use the scripts that would let me send event codes from MATLAB to the Pupil device during the recording and the experiment, so as to have a complete record of when a given event/stimulus occurs through the Network API.

I don't know if I have installed something wrong, but I can't use the MATLAB scripts to activate Pupil or to send anything into the recording. It would be a great help if someone who has already done this could send me some instructions or tips to follow in order to deal with this problem. I thank anyone in advance.
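For what the scripts need to do, the message flow is the remote-annotation pattern from the public pupil-helpers examples: ask Pupil Remote (default port 50020) for the PUB port, sync clocks, then publish msgpack-encoded annotations. A hedged Python sketch of that flow follows (the same ZMQ/msgpack calls translate to a MATLAB ZMQ binding); IP, labels, and timestamps are placeholders, and the plugin/field names follow the helper scripts:

```python
# Hedged sketch of sending a timestamped "event code" (annotation) to Pupil
# Capture over the Network API. IP, port, and labels are placeholders.
import time
import zmq
import msgpack

ctx = zmq.Context()

# Pupil Remote: REQ socket for simple string commands (default port 50020).
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://192.168.1.10:50020")  # IP of the PC running Capture

# Discover the PUB port and estimate the offset between the local clock and
# Pupil time, so annotation timestamps land on the recording's timeline.
pupil_remote.send_string("PUB_PORT")
pub_port = pupil_remote.recv_string()
pupil_remote.send_string("t")  # "t" returns the current Pupil time
pupil_time = float(pupil_remote.recv_string())
clock_offset = pupil_time - time.monotonic()

# Make sure the annotation plugin is running in Capture so annotations are
# stored with the recording (plugin name as used in the helper scripts).
notification = {"subject": "start_plugin", "name": "Annotation_Capture", "args": {}}
pupil_remote.send_string("notify." + notification["subject"], flags=zmq.SNDMORE)
pupil_remote.send(msgpack.packb(notification, use_bin_type=True))
pupil_remote.recv_string()

# PUB socket used to publish the annotations themselves.
pub = ctx.socket(zmq.PUB)
pub.connect(f"tcp://192.168.1.10:{pub_port}")
time.sleep(1.0)  # give the subscription a moment to register

def send_annotation(label, duration=0.0):
    annotation = {
        "topic": "annotation",
        "label": label,
        "timestamp": time.monotonic() + clock_offset,
        "duration": duration,
    }
    pub.send_string(annotation["topic"], flags=zmq.SNDMORE)
    pub.send(msgpack.packb(annotation, use_bin_type=True))

send_annotation("stimulus_onset")  # call this whenever a stimulus/event occurs
```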

user-cd3e5b 31 January, 2022, 21:15:37

Good afternoon, I'm working with the API to access Pupil Invisible gaze data in real time. I have Python 3.10, but according to https://github.com/pupil-labs/pyndsi/issues/58 it seems I might need Python 3.6 for the wheel file to pip install ndsi. Note that Python 3.6 is no longer supported (https://www.python.org/downloads/release/python-3615/). Is there an updated version of the .whl file that you could point me to?

papr 31 January, 2022, 21:17:42

Hi, while we do not offer wheels for 3.10 yet, we do so for Python 3.9. Is that a viable alternative for you?

user-cd3e5b 31 January, 2022, 21:18:26

Hi @papr, yes, that is a viable alternative for me! Thank you.
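For reference, receiving Pupil Invisible gaze in real time with ndsi (installed on Python 3.9 via pip install ndsi) follows the pattern of the published pyndsi examples; a hedged sketch, since attribute names may differ between versions:

```python
# Hedged sketch: discover Pupil Invisible gaze sensors on the local network
# with ndsi and print incoming gaze data. Loosely follows the pyndsi examples.
import time
import ndsi

SENSOR_TYPES = ["gaze"]
sensors = {}  # sensor uuid -> sensor object


def on_network_event(network, event):
    # Attach to newly announced gaze sensors, detach when they disappear.
    if event["subject"] == "attach" and event["sensor_type"] in SENSOR_TYPES:
        sensors[event["sensor_uuid"]] = network.sensor(event["sensor_uuid"])
    elif event["subject"] == "detach":
        sensors.pop(event["sensor_uuid"], None)


network = ndsi.Network(formats={ndsi.DataFormat.V4}, callbacks=(on_network_event,))
network.start()

try:
    while True:
        if network.has_events:
            network.handle_event()
        for sensor in sensors.values():
            while sensor.has_notifications:
                sensor.handle_notification()
            for gaze in sensor.fetch_data():
                print(gaze)  # gaze datum from the connected device
        time.sleep(0.01)
finally:
    network.stop()
```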

End of January archive