core


user-75df7c 01 November, 2022, 14:10:51

any idea why the surfaces folder in my export is coming up empty?

papr 01 November, 2022, 14:11:38

No files at all?

user-7daa32 01 November, 2022, 16:42:45

I have been away for too long. I am sure I missed a lot of interesting stuff and updates. Right now I am searching for a way to plot a heat map. Anyone with ideas? I simply have average Saccade length data

nmt 02 November, 2022, 11:42:50

Hey @user-7daa32, welcome back! Is Saccade length data really all you have? I'm not sure how a heatmap would work in that context, since presumably the length data has no directionality? Maybe a histogram/descriptive statistics would make more sense.

user-1ed146 01 November, 2022, 20:23:13

Hey there, please help me out. One of the eye cameras is suddenly not working. Like showing nothing but all black. But the other eye camera works just fine. Does anybody have any ideas on this?

papr 01 November, 2022, 20:41:11

Hey, black or gray?

user-1ed146 01 November, 2022, 20:45:52

It’s black. I have one screenshot if it helps.

papr 01 November, 2022, 20:46:19

If it is fully black try restarting with default settings in the general settings

user-7daa32 02 November, 2022, 12:16:35

Thank you

user-7daa32 02 November, 2022, 12:20:42

Not the Saccade length; it is a derived average using the Saccade lengths. What about using the gaze data? I was able to create a heat map in Player, but the visual stimulus shape is distorted.

user-f5523e 02 November, 2022, 16:04:20

hi, can I ask an ELI5-type question as I haven't used the software yet - I have skimmed through the docs and couldn't quite figure it out. So if you wanted to define an AOI post-recording (i.e. without markers during recording) - I see you can add a surface with the surface tracker plugin - but how does this work with continuous video with a dynamic world and/or head - would you need to modify the surface for each frame of the video, assuming a shift of the AOI in the camera image?

papr 02 November, 2022, 16:22:43

> without markers during recording

That does not work 🙂 You can detect the markers post-recording but they need to be visible in the scene video.

user-f5523e 02 November, 2022, 16:23:58

ok - let's say I wanted to have an AOI as a car that moves across someone's view - would this be possible in post-processing analysis (using your software)?

papr 02 November, 2022, 16:34:54

If you placed a huge marker on it, maybe 😄 But this is probably not what you are looking for. In your case, I would run a third-party object detector on the scene video and check whether the exported raw gaze data falls onto any of the detected objects.
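
For illustration, a rough sketch of that second step; the detector output format, file name, and frame size below are assumptions, not part of any Pupil Labs API:

```
# Sketch: check whether exported gaze samples fall inside bounding boxes produced
# by some third-party detector. Box values and resolution are placeholders.
import pandas as pd

FRAME_W, FRAME_H = 1280, 720  # assumed scene camera resolution

# hypothetical detector output: world frame index -> list of (label, x0, y0, x1, y1) in pixels
detections = {
    120: [("car", 400, 300, 700, 500)],
    121: [("car", 410, 305, 710, 505)],
}

gaze = pd.read_csv("gaze_positions.csv")  # Pupil Player raw data export

for _, row in gaze.iterrows():
    frame = int(row["world_index"])
    x = row["norm_pos_x"] * FRAME_W
    y = (1.0 - row["norm_pos_y"]) * FRAME_H  # norm_pos origin is bottom-left
    for label, x0, y0, x1, y1 in detections.get(frame, []):
        if x0 <= x <= x1 and y0 <= y <= y1:
            print(f"frame {frame}: gaze on {label}")
```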

user-f5523e 02 November, 2022, 16:36:45

ok thanks - is there software that you would recommend that I could have a look at - ideally free?

papr 02 November, 2022, 16:38:35

There are many neural networks that have been published over the years. I am not up-to-date in this regard. The choice would also heavily depend on what you need, e.g. if you only need a rough outline (a rectangle) or an accurate per-pixel map.

user-f5523e 02 November, 2022, 16:41:48

yeah, there has been a lot going on with automatic scene recognition - ok, so I think I get it - the analysis is very much geared around the markers - cheers

user-66797d 02 November, 2022, 20:59:54

I have a question about the demo workspace. I tried playing around with the videos and things but realized that most of the templates and videos there were locked. Is the demo workspace designed to be locked to edits, or is there a way around this to allow for edits?

user-bbd687 03 November, 2022, 04:01:45

How does the natural calibration algorithm work? How does the code match the gaze point with the red point in the natural scene? What algorithms are used to deal with it?

user-bbd687 03 November, 2022, 04:02:51

Could you please explain the running logic of the natural calibration algorithm? Use pictures 😀 👍

user-f6a634 03 November, 2022, 04:52:49

Hello, I've been using the LSL Relay plugin to stream data to LSL while also using the lsl_inlet.py example in pupil helpers to export the data to a .csv file. How would I go about modifying the lsl_inlet.py example in order to display the computer's local clock for the timestamp, using datetime for example?

user-bbd687 03 November, 2022, 10:26:52

@papr (Pupil Labs) hi

user-a47841 03 November, 2022, 10:45:20

Hi there. Just a quick question as I am writing a paper - when one does the eye tracking calibration in Pupil Capture, an error message can appear when the calibration is considered not sufficient (i.e. low data confidence, notably, I imagine?). What threshold / error of measure does Pupil Capture automatically use? In other words, what data quality is considered acceptable for the software to consider the calibration sufficient? Thank you so much for your answer.

papr 03 November, 2022, 10:46:25

Hi 🙂 2d or 3d calibration?

user-741073 03 November, 2022, 11:17:23

Hi, I'm facing some issues with one of the pupil cameras

user-741073 03 November, 2022, 11:17:41

Chat image

papr 03 November, 2022, 11:32:09

Please contact info@pupil-labs.com in this regard

user-741073 03 November, 2022, 11:32:15

Chat image

user-741073 03 November, 2022, 11:32:16

okay

papr 03 November, 2022, 11:49:36

In that case it will only fail for sure if there was no pupil or no reference data collected. Otherwise it will attempt to run the calibration, which is a complex optimization function. There is no clear-cut value that tells you beforehand if it will converge or not. Even if it converges, it might be very inaccurate. This is why we have the accuracy visualizer, which applies the estimated calibration to the recorded Pupil data and compares it to the reference data. This tells you how good the fit is in visual angle error. Depending on your use case, you can decide to repeat the calibration or proceed.

user-a47841 03 November, 2022, 11:58:58

Thank you! 🙂

user-a47841 03 November, 2022, 11:59:12

And what about the 2D calibration ? Does that have a threshold ?

user-bbd687 06 November, 2022, 12:28:54

hi @papr (Pupil Labs) When I run this code, I get an error and can't install it

Chat image Chat image

user-bbd687 06 November, 2022, 12:30:06

👋

user-bbd687 06 November, 2022, 12:31:51

Before I run the code, I have downloaded the 'requirements.txt' file

user-bbd687 06 November, 2022, 12:32:35

@user-d407c1

user-0777d6 07 November, 2022, 22:42:00

Hi, I compiled pupil core on my nvidia xavier nx (arm based). If I try to run pupil capture I get the error glfw.GLFWError: (65542) b'GLX: No GLXFBConfigs returned'

papr 08 November, 2022, 09:14:23

This looks like an issue with one of our dependencies. See https://www.glfw.org/

user-0777d6 07 November, 2022, 22:44:19

i can run only the service app, but the fps rate is very slow (3-5 fps...)

papr 08 November, 2022, 09:13:38

Pupil Capture requires a fair amount of CPU. It is possible that this device is not able to deliver it. Note, Pupil Core software does not make explicit use of the GPU.

user-f590a4 08 November, 2022, 09:22:12

hi, i have a question regarding exporting multiple recordings at once in Pupil Player. I automated Pupil Capture to trigger recording if certain events happened. Now i have about 200 single recordings and i need to export the 3d pupil diameter with Pupil Player... Is there any faster way than dragging and dropping every single folder into it and pressing e?

papr 08 November, 2022, 09:26:02

Hi! Are you only interested in the recorded pupil data, without re-processing the eye videos?

papr 08 November, 2022, 09:26:45

If that is the case, check out https://gist.github.com/papr/743784a4510a95d6f462970bd1c23972

user-f590a4 08 November, 2022, 09:29:46

Yes, perfect, thank you very much! That's exactly what I need! 🙂

user-0777d6 08 November, 2022, 09:37:04

i have another problem, if i capture the world camera with uv4l the image is distorted, like a fisheye but only on the left part

user-0777d6 08 November, 2022, 09:46:30

solved the fisheye problem 🙂 But Linux detects the 2 eye cameras as video devices. If I try to capture video from the world camera it works, but if I try to get video from the 2 eye cameras it doesn't. Maybe this is why the fps is so slow (pretty much blocked...)?

papr 08 November, 2022, 09:49:26

Just to clarify, you selected the world camera in the world process and the eye cameras in the eye processes, correct?

user-0777d6 08 November, 2022, 09:50:21

the 2 eye cameras start, display a single image, then freeze

papr 08 November, 2022, 09:51:58

Right! If you close one of the eye windows, does it work?

user-0777d6 08 November, 2022, 09:52:15

no, everything is frozen, i need to kill the process

user-0777d6 08 November, 2022, 09:52:18

i get this error

user-0777d6 08 November, 2022, 09:52:31

Assertion failed: ok (src/mailbox.cpp:99)

papr 08 November, 2022, 09:53:14

Are you running from source? Which branch are you running from?

user-0777d6 08 November, 2022, 09:52:39

and a lot of Cython error logs...

user-0777d6 08 November, 2022, 09:53:47

i cloned the master branch, and i recompiled everything from source

user-98789c 08 November, 2022, 10:52:33

Hi all, is there a post hoc kind of way to know with what sampling frequency I recorded pupil size using the Pupil Core?

papr 08 November, 2022, 10:57:15

You can simply subtract neighbouring pupil timestamps (grouped by eye id) to get the inter-sample duration. 1 / inter-sample duration gives you a per-sample estimate of the sampling frequency. I recommend averaging the values over time.
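
As a minimal sketch on the Player export (assuming the pupil_timestamp and eye_id column names of the export):

```
# Sketch: estimate the sampling frequency from an exported pupil_positions.csv.
import pandas as pd

pupil = pd.read_csv("pupil_positions.csv")

for eye_id, group in pupil.groupby("eye_id"):
    inter_sample = group["pupil_timestamp"].sort_values().diff().dropna()  # seconds
    freq = 1.0 / inter_sample  # per-sample frequency estimate in Hz
    print(f"eye {eye_id}: ~{freq.mean():.1f} Hz on average")
```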

user-75df7c 08 November, 2022, 11:57:38

Any recommendations as to what confidence level to use as a cut-off point? I want to filter out noisy data and keep only the samples where gaze is good, but I don't know how much is enough. Thanks in advance!

user-c2d375 08 November, 2022, 12:15:33

Hi @user-75df7c 👋
I recommend you discard data points with a confidence level lower than 0.6

user-beb3db 18 November, 2022, 14:17:09

Hi, did you use only the confidence level as a filter, or did you apply other thresholds, for example on diameter, smoothing, and more? How did you filter the signal?

user-0e5193 09 November, 2022, 10:39:42

Hello, I have a question about the 3D gaze origin. I understand the origin of the x and y axes but not the origin of the z axis. What does gaze_normal z mean? If it were 1, where should I think a person is looking?

papr 09 November, 2022, 10:40:45

Hi, the z-axis points forward 🙂

user-0e5193 09 November, 2022, 10:51:56

Oh, thank you. I did misunderstand. So, the z value means the world camera's vectors right?

papr 09 November, 2022, 10:56:51

The z value alone is nearly meaningless.

Imagine two 3d lines. One per eye. Each goes through the center of the eye ball and the center of the pupil. These lines point towards what you are looking at. Each line is defined by a point (eye_center0/1) and a direction (gaze_normal0/1).

We try to find the intersection of those lines, which corresponds to gaze_point_3d.

Is this clearer?
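
To make the geometry concrete, here is a small sketch of finding the point where two such lines come closest; all numbers are made-up placeholders, not real export values:

```
# Sketch: each eye defines a 3D line (eye_center + t * gaze_normal); gaze_point_3d
# is (approximately) where the two lines come closest to each other.
import numpy as np

def nearest_point_between_lines(p0, d0, p1, d1):
    """Midpoint of the shortest segment connecting two non-parallel 3D lines."""
    d0 = d0 / np.linalg.norm(d0)
    d1 = d1 / np.linalg.norm(d1)
    n = np.cross(d0, d1)
    # Solve t0*d0 - t1*d1 + s*n = p1 - p0 for (t0, t1, s)
    A = np.stack([d0, -d1, n], axis=1)
    t0, t1, _ = np.linalg.solve(A, p1 - p0)
    return ((p0 + t0 * d0) + (p1 + t1 * d1)) / 2

eye_center0 = np.array([-30.0, 10.0, -20.0])   # mm, placeholder
eye_center1 = np.array([ 25.0, 10.0, -20.0])
gaze_normal0 = np.array([ 0.10, -0.05, 1.0])   # direction, placeholder
gaze_normal1 = np.array([-0.08, -0.05, 1.0])

print(nearest_point_between_lines(eye_center0, gaze_normal0,
                                  eye_center1, gaze_normal1))
```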

user-0e5193 09 November, 2022, 10:53:13

Then the z value is the difference from the gaze determined by x and y?

user-0e5193 09 November, 2022, 11:04:32

Until I asked, I thought z was the distance to the object.

user-0e5193 09 November, 2022, 11:11:35

Thank you very much. Then, how can I interpret z with x and y?

papr 09 November, 2022, 11:14:12

As mentioned above, x/y/z describe a direction. Specifically, the direction towards which one eye ball is rotated. The direction alone is not so useful, unless you want to know by how much the eye ball rotated in comparison to a second eye ball direction.

user-bf2b1e 09 November, 2022, 11:14:36

Hello, I am having troubles connecting to pi.local:8080

user-0e5193 09 November, 2022, 11:42:01

Then, is the z value distance to the object?

papr 09 November, 2022, 11:42:46

Not z alone. But the length of the whole vector is.
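
For example (placeholder values; units are mm in scene camera coordinates):

```
# Sketch: viewing distance is the length of the gaze_point_3d vector,
# i.e. the distance from the scene camera to the estimated gaze point.
import numpy as np

gaze_point_3d = np.array([-35.2, 60.1, 412.7])  # placeholder x/y/z in mm
distance_mm = np.linalg.norm(gaze_point_3d)
print(f"~{distance_mm / 10:.1f} cm in front of the scene camera")
```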

user-0e5193 09 November, 2022, 11:43:30

I see! thank you. Have a nice day.

user-736bf7 09 November, 2022, 15:16:57

Hi everyone, I would like to use my Pupil Labs Core to determine how often a gaze direction change occurs between three monitors, measure fixations, and make a heat plot. I now have a setup that allows me to do robust tracking. Despite the good documentation, I still have some questions: In which format are the timestamps? For example, what does "9245888951" mean under "world timestamp"? pupil_positions.csv also has the variable "diameter", which is shown to me as e.g. "2292571258544920". How is this number to be interpreted? Is there more detailed documentation for these beginner questions? Thank you!

papr 09 November, 2022, 15:17:50

Hey, are you looking at the exported csv in Excel?

papr 09 November, 2022, 15:20:49

Both world timestamp and diameter should be decimal point values. The first in seconds, the second in pixels.

user-53a74a 09 November, 2022, 19:14:50

Hello folks, I'm planning to buy a new PC with a new CPU for my experiments, as the current CPU cannot keep recording even at 30Hz. I found this old post saying that the Pupil Labs bundle does not support Xeon processors. Is this still an issue? Should I get i7 or i9 processors? https://discord.com/channels/285728493612957698/285728493612957698/578844966638452749

papr 09 November, 2022, 19:15:34

That is still correct

user-0e5193 10 November, 2022, 01:33:24

Hi, can I ask a question? What is the difference between the norm_pos_x/y values in pupil_positions and gaze_positions?

papr 10 November, 2022, 07:57:51

The frame of reference is the difference. pupil norm_pos refers to a point in the corresponding eye video camera. gaze norm_pos refers to a point in the scene camera.
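
A tiny sketch of how such a normalized position maps into image pixels (the resolutions are assumptions; use your actual camera settings):

```
# Sketch: map a normalized position (origin bottom-left, range 0..1) to pixel
# coordinates of the corresponding video frame (origin top-left).
def norm_to_pixels(norm_x, norm_y, frame_width, frame_height):
    px = norm_x * frame_width
    py = (1.0 - norm_y) * frame_height  # flip y: norm_pos y points up
    return px, py

print(norm_to_pixels(0.5, 0.5, 1280, 720))  # gaze norm_pos in the scene camera
print(norm_to_pixels(0.5, 0.5, 400, 400))   # pupil norm_pos in an eye camera
```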

user-2196e3 10 November, 2022, 12:39:19

Hi, I need to use pupil diameter, but blinks affect pupil diameter. Do you have any reference code or resources to remove the blink points and interpolate new values?

user-0777d6 10 November, 2022, 14:43:01

hi, can I copy the calibration data from one PC to another? I can launch the Capture app on my PC and calibrate the glasses, but I need the Service app on my SBC with Linux. The problem is the SBC can only use the Service app, and the calibration doesn't work, but I need IPC gaze messages (no messages without calibration...)

papr 10 November, 2022, 14:46:10

If you are ok without the scene video, what is it that you need the gaze data for? In other words, what kind of data are you interested in in particular?

user-0777d6 10 November, 2022, 14:47:04

i need gaze data, i need to capture the world camera and then store where users are looking

user-0777d6 10 November, 2022, 14:47:29

on the SBC i use the service app, but without a calibration it doesn't send me data

papr 10 November, 2022, 14:47:56

Let's continue in the thread of the other day

user-1ed146 10 November, 2022, 19:33:57

Hello folks, can anyone help here? One of the eye cameras stopped functioning, showing 0 fps. And the eye camera window is grey. How can I fix it??

papr 10 November, 2022, 19:38:18

please contact info@pupil-labs.com in this regard

user-1ed146 10 November, 2022, 19:34:52

Any answers are appreciated.

user-c521af 11 November, 2022, 10:59:09

does Pupil Invisible work with iOS???

papr 11 November, 2022, 10:59:51

Hi, no it does not. 🙂 It only works on OnePlus 6 and OnePlus 8/8T with Android

user-72f9ba 11 November, 2022, 16:26:47

Hello!

user-72f9ba 11 November, 2022, 16:26:56

I just got the glasses and companion device

user-72f9ba 11 November, 2022, 16:27:04

The glasses do not connect to the companion device

papr 11 November, 2022, 16:35:06

Hi! The first steps will always be to connect the glasses via the included USB cable to the Companion device / phone. Have you done that already?

user-72f9ba 11 November, 2022, 16:27:08

How can i connect them?

user-72f9ba 11 November, 2022, 16:39:25

Does it need to be connected the whole time?

papr 11 November, 2022, 16:42:33

When you want to use it, yes! It gets its power from the phone and the gaze estimation happens in the phone, not the glasses themselves 🙂

user-72f9ba 11 November, 2022, 16:42:46

That's not what is advertised. This is a major issue

papr 11 November, 2022, 16:43:12

Can you point me to the specific advertisement that you are referring to?

user-72f9ba 11 November, 2022, 16:44:19

https://pupil-labs.com/products/invisible/ Can you show me where it is written?

papr 11 November, 2022, 16:45:48

In all the pictures where people are wearing the glasses, you can see the cable. We never advertise that the glasses can be used wirelessly 🙂

nmt 11 November, 2022, 17:01:16

Hey @user-72f9ba 👋. Does your use-case strictly exclude USB connectivity? Is the issue cable management? You can also reach out to info@pupil-labs.com to discuss product fit and options 👍

user-7b683e 12 November, 2022, 10:11:59

Hello Dear Pupil Labs Team,

We want to mention an issue I have encountered in Pupil Capture using the Core product.

I use a new computer with a Ryzen 7 and a 3000-series Nvidia GPU. On this device, the frequency of the eye cameras is fairly suitable for my purpose, 120+ Hz. However, the world camera gives nearly 15 FPS, with a huge latency, and sometimes freezes. I guess due to some reason related to the world camera process, I sometimes got a Blue Screen error on Windows 10. When the error didn't occur, the gaze became really noisy.

I would like to hear your suggestions about the relevant jumping data. I know that the issue is not clear, but maybe these aspects are enough to narrow it down.

Have a good day!

user-c7cfe7 13 November, 2022, 08:01:14

Hello developers. I am a university student in Japan. I am using Pupil Core in my research and I have done the following work to enable LSL: 1. copy the pylsl and pupil_capture_lsl_relay directories to pupil_capture_settings/plugins, 2. run Capture with sudo

However, when I try to activate LSL, I get the following error which I cannot resolve.

world - [WARNING] plugin: Failed to load 'pupil_capture_lsl_relay'. Reason: 'liblsl library '/Users/tappun/pupil_capture_settings/plugins/pylsl/lib/liblsl.dylib' found but could not be loaded - possible platform/architecture mismatch.

You can install the LSL library with conda: conda install -c conda-forge liblsl or with homebrew: brew install labstreaminglayer/tap/lsl or otherwise download it from the liblsl releases page assets: https://github.com/sccn/liblsl/releases On modern MacOS (>= 10.15) it is further necessary to set the DYLD_LIBRARY_PATH environment variable. e.g. >DYLD_LIBRARY_PATH=/opt/homebrew/lib python path/to/my_lsl_script.py'' world - [WARNING] plugin: Failed to load 'pylsl'. Reason: 'liblsl library '/Users/tappun/pupil_capture_settings/plugins/pylsl/lib/liblsl.dylib' found but could not be loaded - possible platform/architecture mismatch.

I ran the following, but it made no difference.

brew install labstreaminglayer/tap/lsl

I don't have a good understanding of DYLD_LIBRARY_PATH, so I tried specifying it as follows.

DYLD_LIBRARY_PATH="/opt/homebrew/lib/python3.9/site-packages/pylsl/pylsl.py" It would be helpful to know if there are any mistakes.

How can I do this? OS used is macOS 13.0 (22A380), Python 3.9.0.

papr 14 November, 2022, 07:53:59

The bundle ships the Intel x86_64 Python that is emulated on M1 Macs. You installed the native M1 pylsl, which is why you get the architecture mismatch error.

user-c7cfe7 13 November, 2022, 09:07:09

Regarding the above problem, we considered the possibility that liblsl does not support M1 Mac, and by using a macbookpro (ver. 11.6) equipped with an intel CPU, the error did not occur. Therefore, we will use that machine for development for a while. If you know how to run it on the latest macOS and M1 Mac, I would appreciate it if you could let me know.

Thanks.

papr 14 November, 2022, 07:54:37

I recommend doing so on the develop branch. See the special note about creating an intel-x86 virtual environment: https://github.com/pupil-labs/pupil/tree/develop#installing-dependencies-and-code

user-beb3db 14 November, 2022, 10:09:00

Analyze the data

user-beb3db 14 November, 2022, 10:30:28

Hi all, does anyone have a detailed analysis of how to analyze the data exported from the system? With the codes and parameters that can be obtained? I am new here and I wanted to understand how they work. Also I wanted to know if the tests can be done directly from the software you download or if you need to implement them with Python or Matlab codes, thanks!!!

nmt 14 November, 2022, 11:32:50

Hi @user-beb3db 👋. Welcome to the community! You can find a detailed overview of analysis and visualisation plugins available in Pupil Player (our free desktop analysis software) here: https://docs.pupil-labs.com/core/software/pupil-player/#pupil-player. You don't need to use coding to work with these tools. That said, you can export the results into .csv files, which can of course be processed with Python or Matlab, if that's your interest. If you can share details of your use-case, we can try to follow up with more concrete guidance/examples!

user-beb3db 14 November, 2022, 10:37:29

and how does it analyze the data? Python, Matlab, C++?

user-b3b1d3 14 November, 2022, 15:40:25

Hello everybody! I was wondering if there is a 32bit version of pupil_core software for linux

papr 14 November, 2022, 15:42:02

Hi, what you specifically need for that are arm-based versions of the dependencies. Check out the #1039477832440631366 thread history.

user-b3b1d3 14 November, 2022, 15:40:46

To try to run it in a raspberry pi

user-b3b1d3 14 November, 2022, 15:53:12

On the other hand, I guess i will have to use a similar approach if I need to run it on Windows 11, right?

papr 14 November, 2022, 15:54:04

Do you mean Windows 11 on the rpi?

user-b3b1d3 14 November, 2022, 15:54:16

no, on another machine

papr 14 November, 2022, 15:54:59

We offer a pre-built bundle for Windows on Intel/AMD 64 systems

user-b3b1d3 14 November, 2022, 16:56:37

OK, thanks its up and runnning now on windows 11. I will try to build in rpi tomorrow. Thank you for your time.

papr 15 November, 2022, 10:49:23

Maybe an additional note: Pupil Core requires a fair amount of CPU. It is likely that the rpi won't be able to provide sufficient speed to run the pupil detection at higher frame rates. You can use it to record the video and run the detection post-hoc though.

user-946395 15 November, 2022, 17:42:08

Hi there! I am discussing a research study that used Pupil Labs CORE glasses. This is my students' first conversation about eye tracking details in my psychology of music class. I am looking for a post-hoc video that might illustrate fixations, saccades, etc. There are several videos on the Pupil Labs channel of post-hoc analysis and rendering, but there was no audio in the videos. Thanks in advance! I downloaded the demo videos from this link, https://pupil-labs.com/products/core/tech-specs/, but I cannot get them to load in any video apps that I have. Any help is appreciated!

papr 15 November, 2022, 17:43:30

Please open the demo recordings in Pupil Player and export them. The export will include the raw data as csv and the scene video with overlayed gaze. Check out our documentation for instructions.

user-946395 15 November, 2022, 17:52:55

are the demo recordings titled world.mp4, eye1.mp4, eye0.mp4? Thanks for the help. Looking to do work in eye tracking with Pupil SOON!

papr 15 November, 2022, 18:00:05

These are the raw camera recordings. You can play them back one by one using e.g. VLC media player.

user-bbd687 16 November, 2022, 08:36:07

@papr I've finished the recording. How do I export the (x, y) coordinates of the pupil marker in the world cam?

user-bbd687 16 November, 2022, 08:36:41

The (x,y) coordinates of the red point

papr 16 November, 2022, 08:37:45

Open the recording folder in Pupil Player and press the export button. This will save the points as csv

user-bbd687 16 November, 2022, 08:38:37

I have finished

user-bbd687 16 November, 2022, 08:38:45

😃

user-bbd687 16 November, 2022, 08:39:16

What is the name of the (x,y) coordinate of the red dot?

user-bbd687 16 November, 2022, 08:39:31

Variable name (x,y)

papr 16 November, 2022, 08:39:44

norm_pos

user-bbd687 16 November, 2022, 08:39:55

Chat image

user-bbd687 16 November, 2022, 08:40:01

Chat image

user-bbd687 16 November, 2022, 08:40:22

Excuse me, are these two picture?

papr 16 November, 2022, 08:41:12

Norm_pos_x/y in the gaze position csv file

user-bbd687 16 November, 2022, 08:43:42

Chat image Chat image Chat image

user-bbd687 16 November, 2022, 08:44:16

These are the coordinates of Norm_pos_x/y

user-bbd687 16 November, 2022, 08:44:34

The y value is too small

user-bbd687 16 November, 2022, 08:45:45

My pupil line of sight moves within the frame of the screen

user-bbd687 16 November, 2022, 08:46:21

Chat image

user-bbd687 16 November, 2022, 08:46:44

Just like the red circle

user-bbd687 16 November, 2022, 08:46:53

@papr

papr 16 November, 2022, 08:46:54

from which file did you load the data?

user-bbd687 16 November, 2022, 08:47:30

pupil_positions.csv

papr 16 November, 2022, 08:49:13

pupil != gaze data. What you are looking at is the pupil position within the eye video, not the scene video. You need to look at norm_pos in gaze_positions.csv

user-bbd687 16 November, 2022, 09:02:41

Thank you. I found it!!!

user-bbd687 16 November, 2022, 09:26:17

I want to use the heat map function

Chat image

papr 16 November, 2022, 09:28:02

You can read about the setup in our documentation https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking

user-bbd687 16 November, 2022, 09:26:21

@papr

user-bbd687 16 November, 2022, 09:27:04

Chat image

user-bbd687 16 November, 2022, 09:27:30

How do I do that

user-bbd687 16 November, 2022, 09:32:12

Excuse me, how do I turn off this highlight effect?

Chat image

papr 16 November, 2022, 09:33:53

You can turn off visualizations via the plugin manager (for unique visualizations) or in their corresponding menu (for non-unique ones).

user-bbd687 16 November, 2022, 09:40:26

ok,Thank you. I did it successfully

user-bbd687 16 November, 2022, 09:41:33

I can't get a heat map

Chat image

papr 16 November, 2022, 09:41:50

Because you have not setup the markers and not defined a surface yet

user-bbd687 16 November, 2022, 09:41:48

I've turned on the heat map

user-bbd687 16 November, 2022, 09:42:07

What should I do

papr 16 November, 2022, 09:42:21

Please read the documentation that I linked above

user-bbd687 16 November, 2022, 09:43:04

Can natural calibration generate heat map

papr 16 November, 2022, 09:44:09

Calibration and surface tracking are two different steps. You need to calibrate first. This gives you gaze in scene camera coordinates. Then you need surface tracking to map the gaze onto the surface. Afterward, you can generate the heatmap.

user-bbd687 16 November, 2022, 09:43:21

Or is the heat map generated only by screen calibration?

user-bbd687 16 November, 2022, 09:45:13

I've done the natural calibration

papr 16 November, 2022, 09:45:42

Then, you only need to setup the AprilTag markers and define a surface as described in the documentation.

user-bbd687 16 November, 2022, 09:45:30

Then I recorded the video and the data

user-bbd687 16 November, 2022, 09:46:07

What is surface?

papr 16 November, 2022, 09:47:16

The surface is the area of interest onto which you want to generate the heatmap. Pupil Capture and Player need the markers in order to track the surface. for example your computer screen.

user-bbd687 16 November, 2022, 09:47:47

Chat image

papr 16 November, 2022, 09:49:54

As the error message says, you need to setup markers first. The markers are displayed and linked in the documentation. https://docs.pupil-labs.com/core/software/pupil-capture/#markers

user-bbd687 16 November, 2022, 09:48:30

Right, I need to calibrate the heat map using the computer screen

user-bbd687 16 November, 2022, 09:51:33

May I ask, do I need to print marker?

papr 16 November, 2022, 09:52:41

You can either display them digitally on your screen, or print them and attach them to your screen, yes.

https://raw.githubusercontent.com/wiki/pupil-labs/pupil/media/images/pc-sample-experiment.jpg

user-bbd687 16 November, 2022, 09:51:48

Chat image

user-bbd687 16 November, 2022, 09:53:30

Chat image

papr 16 November, 2022, 09:54:47

The only requirement is that you do not use the same marker more than once. Otherwise, you can choose any number (tag id) that you like

user-bbd687 16 November, 2022, 09:53:56

Chat image

user-bbd687 16 November, 2022, 09:54:05

Is there any requirement for the number?

papr 16 November, 2022, 09:55:21

Make sure to include sufficient white border when displaying or cutting out the markers!

user-bbd687 16 November, 2022, 09:55:27

ok

user-bbd687 16 November, 2022, 09:56:13

I'll do the experiment now

user-bbd687 16 November, 2022, 09:58:35
  1. I paste marker on the four corners of the screen
  2. I conducted the test normally
  3. I record video and data
  4. I marked AOI in the player
  5. Generate heat maps
papr 16 November, 2022, 10:00:17

after step 1, you can check if the markers are detected in Pupil Capture in realtime. Enable the surface tracker plugin. The markers should be marked in color. Then add a surface. You can edit the surface to fit your screen. https://docs.pupil-labs.com/core/software/pupil-capture/#preparing-your-environment

user-bbd687 16 November, 2022, 09:58:55

Is this step correct?

user-bbd687 16 November, 2022, 09:59:04

@papr

user-bbd687 16 November, 2022, 10:02:03

Is the surface tracker plugin enabled in capture exe?

papr 16 November, 2022, 10:02:23

not by default. you can turn it on in the plugin manager

user-bbd687 16 November, 2022, 10:04:38

OK

user-bbd687 16 November, 2022, 10:15:54

Chat image

user-bbd687 16 November, 2022, 10:16:15

Is this all right?

user-bbd687 16 November, 2022, 10:16:18

@papr

papr 16 November, 2022, 10:17:01

Looks good to me. Please confirm that the detection works by running Pupil Capture and pointing the scene camera to the screen.

user-bbd687 16 November, 2022, 10:17:42

How should I marker the colors?

papr 16 November, 2022, 10:18:27

Pupil Capture will overlay color in the video preview on the markers if they are detected. (Surface tracker needs to be running)

user-bbd687 16 November, 2022, 10:18:07

"The markers should be marked in color."

user-bbd687 16 November, 2022, 10:18:56

ok

user-bbd687 16 November, 2022, 10:30:13

Chat image

user-bbd687 16 November, 2022, 10:30:30

In capture exe, the heat map is in motion

user-bbd687 16 November, 2022, 10:30:55

But in record exe, the heat map is not moving

papr 16 November, 2022, 10:32:03

I am not sure what you mean by "record exe". Do you mean Pupil Capture? Also, what type of movement do you expect?

user-bbd687 16 November, 2022, 10:32:27

It's the player

papr 16 November, 2022, 10:33:57

In Capture, the heatmap is calculated based on the last x seconds. In Player, the heatmap is calculated based on the data within the trim marks. The trim marks are the small controls on the left and right of the timeline.

user-bbd687 16 November, 2022, 10:33:07

In the player, the heat map is always in place

user-bbd687 16 November, 2022, 10:33:28

Chat image

user-bbd687 16 November, 2022, 10:33:51

It doesn't follow the eye movement

user-bbd687 16 November, 2022, 10:34:14

But in capture exe, the heat map follows eye movements

papr 16 November, 2022, 10:35:28

yes, because in player, the heatmap is calculated on all recorded gaze data.

user-bbd687 16 November, 2022, 10:47:36

okk

user-bbd687 16 November, 2022, 10:47:39

👋

user-bbd687 16 November, 2022, 10:47:43

👍

user-bbd687 16 November, 2022, 11:08:01

Excuse me, I want to export the x, y, and fixation values of the heat map data, and then draw the heat map of the points in MATLAB.

papr 16 November, 2022, 11:09:52

If you have the surface setup in Player, hit export. There is a surfaces folder. Look at the fixations_on_surface_...csv file. Note that there is a difference between fixation and gaze data. The heatmap in Player is based on gaze data.

user-bbd687 16 November, 2022, 11:08:36

Which variables should I use?

user-bbd687 16 November, 2022, 11:08:50

Which file is it in?

user-bbd687 16 November, 2022, 11:08:54

😀

user-bbd687 16 November, 2022, 11:13:10

i have the surfaces folder.

user-bbd687 16 November, 2022, 11:13:31

but the "fixations_on_surface_...csv" is only 1 KB in size

user-bbd687 16 November, 2022, 11:14:08

it's blank, only the column titles

papr 16 November, 2022, 11:15:14

That means that you have not run the fixation detector yet. Look at the gaze_on_surface...csv data instead

user-bbd687 16 November, 2022, 11:16:51

Chat image

user-bbd687 16 November, 2022, 11:16:57

The table has data

user-bbd687 16 November, 2022, 11:19:11

Can I use these two: x: x_norm; y: y_norm?

papr 16 November, 2022, 11:20:19

yes

user-bbd687 16 November, 2022, 11:20:04

Where should I look for gaze data?

user-bbd687 16 November, 2022, 11:20:09

@papr

user-bbd687 16 November, 2022, 11:20:26

en en

user-bbd687 16 November, 2022, 11:20:34

👍

user-bbd687 16 November, 2022, 11:21:11

Where should I look for gaze data? 😀

papr 16 November, 2022, 11:21:41

you are looking at it already

user-bbd687 16 November, 2022, 11:21:41

What should the name of the file be?

user-bbd687 16 November, 2022, 11:22:18

Chat image

papr 16 November, 2022, 11:22:42

x_norm/y_norm

papr 16 November, 2022, 11:27:42

Load the data in Matlab and use this https://de.mathworks.com/matlabcentral/fileexchange/66629-2-d-histogram-plot to generate the heatmap.

user-bbd687 16 November, 2022, 11:22:20

Is it confidence?

user-bbd687 16 November, 2022, 11:22:46

okk

user-bbd687 16 November, 2022, 11:23:40

I need one more variable. This variable is used to draw the color of the heat map. The degree of this variable determines the color.

papr 16 November, 2022, 11:25:11

That variable depends on how you aggregate the data. matlab should be doing this for you

user-bbd687 16 November, 2022, 11:24:30

The value of this variable determines the color of the heat map.

user-bbd687 16 November, 2022, 11:26:10

I want to use this variable : "gaze data"

papr 16 November, 2022, 11:26:36

gaze data are x/y locations on the surface. this is the x/y_norm data.

user-bbd687 16 November, 2022, 11:27:27

Can the software export gaze counts?

papr 16 November, 2022, 11:28:04

you can simply count the rows

user-bbd687 16 November, 2022, 11:28:59

ok

user-bbd687 16 November, 2022, 11:30:34

I'd like to ask you again. What variable did you use for the color of the heat map you generated?

papr 16 November, 2022, 11:34:46

the variable is not exported directly. This is how we calculate the heatmap https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/surface_tracker/surface.py#L616-L640
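
In the same spirit (but not the exact Player code), a rough sketch of such a computation on the exported data; the surface name in the file name is an assumption:

```
# Sketch: bin gaze-on-surface points into a 2D histogram and blur it; the bin
# counts are what drive the heatmap colors.
import numpy as np
import pandas as pd
from scipy.ndimage import gaussian_filter
import matplotlib.pyplot as plt

gaze = pd.read_csv("gaze_positions_on_surface_Screen.csv")  # assumed surface name
gaze = gaze[gaze["on_surf"].astype(str) == "True"]

hist, _, _ = np.histogram2d(
    gaze["y_norm"], gaze["x_norm"], bins=(36, 64), range=[[0, 1], [0, 1]]
)
heatmap = gaussian_filter(hist, sigma=2)

plt.imshow(heatmap, origin="lower", cmap="jet")
plt.show()
```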

user-bbd687 16 November, 2022, 11:31:01

I now want to verify the accuracy in matlab.

user-bbd687 16 November, 2022, 11:31:31

I want to use the same variables to generate the same heat map as you do

user-bbd687 16 November, 2022, 11:31:49

😀 👍

user-bbd687 16 November, 2022, 11:35:21

Thank you very much

user-e8fbd0 16 November, 2022, 13:50:10

Hi! Just wondering if there is a way to change the background colour of the on-screen calibration and test routines in Capture? The current white is far too bright compared to our stimuli.

papr 16 November, 2022, 13:52:02

Hi, unfortunately, no. You might want to use the natural features calibration instead.

papr 16 November, 2022, 13:52:52

That said, why do you need the calibration routine to be of the same brightness as your stimulus?

user-e8fbd0 16 November, 2022, 13:56:29

Natural features would indeed work, but I was rather fond of the automated nature of the on-screen markers.

user-e8fbd0 16 November, 2022, 14:02:13

Hm, thinking about natural features I might have gotten an idea. We can embed the markers in our radar display using some custom logic and then run the on-screen marker calibration from Capture in parallel, while not actually displaying its markers on the screen.

papr 16 November, 2022, 14:03:57

In that case, I recommend single marker calibration in physical mode. It will expect that another source displays the marker and only performs detection (without displaying a marker of its own).

user-d61193 17 November, 2022, 06:53:21

Hello Pupil Labs,

We have the Pupil Core Child Version. We will have active child participants in a live playroom environment. Can we download the software to a phone or tablet, rather than a laptop, to run Pupil Capture and Player?

user-d407c1 17 November, 2022, 08:06:02

Pupil Core cannot be run from a phone. You can use a small form-factor, tablet-style PC running Pupil Capture to make Pupil Core more portable. If the specifications of such a device are low-end, you can record the experiment with real-time pupil detection disabled to help ensure a high sampling rate. Pupil detection and calibration can then be performed in a post-hoc context, but it is important that you establish a good eye model fit prior to recording. Alternatively, you can connect the laptop to the network, place it in a backpack, and control Pupil Capture from a separate machine using the network API

user-80123a 17 November, 2022, 07:56:41

Hello everyone, has anyone ever tried to install the Pupil software from source on a Raspberry Pi? I got several errors after typing this line of instruction: pip install -r requirements.txt. So I decided to install each package individually. But I am stuck with the package cysignals; I always receive the error message: "failed building wheel for cysignals". Any ideas? Thanks in advance for your help.

papr 17 November, 2022, 08:00:00

Let's continue the discussion in #1039477832440631366

user-2196e3 17 November, 2022, 10:05:16

@papr Thank you! That's very helpful!

user-d6c401 17 November, 2022, 11:20:00

When I started my two devices (pupil invisible) they went into a sync mode and I cannot access the cloud or any of the menu items on the app. It did allow me to take eye-tracking data on a previously entered wearer name but both apps on each device are now stuck and I have recordings on each device that will not upload to my cloud account. Please help..

papr 17 November, 2022, 11:20:51

Could you please try logging out and back in again?

user-d6c401 17 November, 2022, 11:21:02

Will do, standy

user-d6c401 17 November, 2022, 11:23:09

That did it, thank you

user-c8ad1f 17 November, 2022, 14:15:14

hello! I am working on Reference Image mapper and I am wondering how to add a sample to be mapped. I used a reference picture and video but I now want to apply that enrichment to recordings. How can I do that? Thanks

papr 17 November, 2022, 14:18:50

Add all the recordings that you want to have mapped to the project in which the enrichment is defined.

user-beb3db 17 November, 2022, 16:11:23

Hi all, is there anyone who applied a filter after exporting the data? Can you share the code if possible? I would like to get feedback; for now I have read an article that was recommended to me above, but it is a bit confusing! I would like to compare notes with someone who is processing the exported data! Thank you

user-beb3db 17 November, 2022, 18:11:58

and also, does anyone have photos of the setup during the experiment?

user-2196e3 17 November, 2022, 22:34:10

Hi, I have 2 questions. 1. where and how can we download the recording folder. I found we only have recording and pupil data, but we don't have recording folder. 2. If we don't have recording folder, so we don't have info.player.json. Anyway we could convert the pupil data timestamp to real time correctly? Thank you!

papr 18 November, 2022, 07:40:32

Could you share a Screenshot of the file list?

user-2196e3 18 November, 2022, 09:29:52

We are still using the Pupil Core device to do more experiments. Is it possible to get raw data for the new experiments? How do we get access to the raw data?

papr 18 November, 2022, 09:54:05

Are those files from a recent recording? Pupil Capture creates a folder that you would open in Pupil Player. That is the raw data. You might be using a very old Pupil Capture version.

user-2196e3 18 November, 2022, 10:00:09

By the way, which version do you recommend to use?

user-beb3db 18 November, 2022, 14:04:17

Can someone explain to me what timestamp is? It's in seconds but what does it mean? Is it seen as time for each sample?

papr 18 November, 2022, 14:04:58

> Is it seen as time for each sample?

It is 🙂

papr 18 November, 2022, 14:06:53

Maybe this tutorial can give you a more concrete idea of how "Pupil time" relates to the normal system clock https://github.com/pupil-labs/pupil-tutorials/blob/master/08_post_hoc_time_sync.ipynb

papr 18 November, 2022, 14:05:45

More specifically, the time at which the sample was generated. Not its duration.

user-80123a 18 November, 2022, 14:30:41

Hi all, I want to use a Raspberry pi to stream video to a computer. Where the computer runs pupil capture. I found and tried to install using this link https://github.com/Lifestohack/pupil-video-backend

user-80123a 18 November, 2022, 14:31:39

But I got an error message "ModuleNotFoundError: .... cv2"

papr 18 November, 2022, 14:32:58

pip install opencv-python

user-80123a 18 November, 2022, 14:32:42

When I start to run python main.py

user-80123a 18 November, 2022, 14:37:52

sudo apt install -y python3-opencv

papr 18 November, 2022, 14:40:02

This installs Opencv in the global environment for the default python.

user-80123a 18 November, 2022, 14:38:06

pip install opencv-python

papr 18 November, 2022, 14:40:42

This installs in the current environment

user-bbd687 19 November, 2022, 04:59:25

@papr I modified the English alphabet instead of the native language. But the output is a box

Chat image

papr 19 November, 2022, 07:58:43

I think the included font only has English characters. It might be possible to load a custom font but I would need to look that up next week.

user-a3e405 19 November, 2022, 07:25:20

Hello team, We are currently using pupil core to do realtime eye tracking (gaze on surface) via the network API. We would like to get surface data at a higher frequency (at least 120 Hz) to detect saccades. We noticed the surface datum is 30 Hz, so we subscribed to both surface & gaze, and tried to map the gaze ourselves using img_to_surf_trans (which updates at 30 Hz). However, applying the homographic transformation to gaze data (norm_pos in gaze) does not give us the same surface data. I saw from a previous post that it is actually more complicated since you map gaze in undistorted camera space. I'm still not sure how to apply img_to_surf_trans in undistorted camera space to get surface data at a higher frequency. I would very much appreciate any suggestions or pointers to existing examples. Thank you in advance!

papr 19 November, 2022, 08:02:35

Do I understand it correctly that you would prefer mapping gaze using an older surface location instead of waiting for the newest one?

The surface datum should contain transformation matrices for both, distorted and undistorted, spaces.
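
For illustration, the core operation is a homography applied in homogeneous coordinates; a minimal sketch (the identity matrix below is a placeholder for the matrix taken from the surface datum):

```
# Sketch: apply a 3x3 surface homography to a 2D point. As noted above, the
# matrix must be applied in the matching (distorted vs. undistorted) space.
import numpy as np

def apply_homography(H, point_xy):
    x, y = point_xy
    p = H @ np.array([x, y, 1.0])
    return p[:2] / p[2]  # back from homogeneous to 2D coordinates

img_to_surf_trans = np.eye(3)  # placeholder; take this from the surface datum
print(apply_homography(img_to_surf_trans, (0.42, 0.63)))
```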

user-bbd687 19 November, 2022, 09:12:07

@papr Thank you very much!!

user-3aea81 21 November, 2022, 01:31:54

hello

user-3aea81 21 November, 2022, 01:32:36

i'm using pupil core but i have Q

user-3aea81 21 November, 2022, 01:33:08

my pupil is so sensitive

user-3aea81 21 November, 2022, 01:33:42

it's using harder

user-3aea81 21 November, 2022, 01:34:26

can i handle in software?

user-bbd687 21 November, 2022, 03:43:46

@papr Looking forward to your reply!!

user-bbd687 21 November, 2022, 03:43:54

😀

user-887da7 21 November, 2022, 04:16:21

Hello, where can I get the format/description of the data that can be collected via pupil labs on HTC VIVE?

papr 21 November, 2022, 08:48:42

It is the same as for the Pupil Core format. https://docs.pupil-labs.com/core/software/recording-format/

You can download an example raw recording here https://drive.google.com/file/d/11J3ZvpH7ujZONzjknzUXLCjUKIzDwrSp/view?usp=sharing

Using Pupil Player, you can export the data to csv https://docs.pupil-labs.com/core/software/pupil-player/#export

user-80123a 21 November, 2022, 14:27:20

Hello all, I used the pupil-video-backend on a Raspberry Pi and a PC. The Raspberry Pi captures the video stream and the PC uses it to compute the pupil detection. Everything looks right on the Raspberry Pi side.

Chat image

user-80123a 21 November, 2022, 14:27:45

However, I cannot see the video on the PC

user-80123a 21 November, 2022, 14:28:45

I just saw an "HMD Streaming" option in the video source settings

user-80123a 21 November, 2022, 14:29:47

Chat image

user-d407c1 21 November, 2022, 14:37:51

Have you tried toggling on the "enable manual camera selection" on that panel? the icon will become green and you will be able to select other sources

user-80123a 21 November, 2022, 14:33:00

When I expanded the list under "Active Device", I only saw "Local USB"

user-80123a 21 November, 2022, 14:33:56

I use Windows PC

user-80123a 21 November, 2022, 14:42:38

Chat image

user-80123a 21 November, 2022, 14:44:59

Actually, I cannot select the new source

user-d407c1 21 November, 2022, 14:46:37

Are you able to access the streaming (eye camera) from other apps (e.g. OBS, Meet,...)?

user-80123a 21 November, 2022, 14:47:48

I never tried, how can I do that?

user-f590a4 21 November, 2022, 17:43:56

hi i have a question, is there a fast way to extract blink count data from about 500 recordings? Without using PupilPlayer?

user-c78bd2 21 November, 2022, 19:41:11

hi @&288503824266690561 - I am looking at your DIY core solution. Is the published BOM (google sheets: Pupil Headset BOM : Sheet 1) the most up-to-date version? I see that the base compatibility requirements are UVC 1.1 - are there any additional limitations with the DIY solution and the available github pupil labs software?

papr 22 November, 2022, 08:35:32

Please see the first 5 points https://gist.github.com/papr/b258e0e944604375752eae502b4ad3d5 (i.e. the camera needs to support mjpeg, not h264)

user-908b50 21 November, 2022, 22:49:28

Hi, just a general question for people using the surface tracker: what do you do if participants moved a lot? This led to the surface moving quite a lot in our case and I am not sure what to do. Please help! Also, for videos we had trouble getting the plugin to work. Support appreciated!

papr 22 November, 2022, 08:31:21

Is the issue that the surface markers are no longer being recognized due to motion blur?

user-3aea81 22 November, 2022, 13:41:11

Hello

user-3aea81 22 November, 2022, 13:41:20

I have 1 Q

user-3aea81 22 November, 2022, 13:42:21

When I do calibration after that

user-3aea81 22 November, 2022, 13:42:54

Can see a square box in monitor

papr 22 November, 2022, 13:43:27

Which color has it? Is it green?

user-3aea81 22 November, 2022, 13:43:17

What is that??

user-3aea81 22 November, 2022, 13:44:35

I don't have photo.... but hmm...

user-3aea81 22 November, 2022, 13:44:46

Can see in capture

user-3aea81 22 November, 2022, 13:56:18

Like brown

papr 22 November, 2022, 13:57:07

Could you please share a screenshot of what you are referring to?

user-3aea81 22 November, 2022, 13:57:59

I will check now

user-3aea81 22 November, 2022, 14:00:50

Chat image

user-3aea81 22 November, 2022, 14:01:02

Can you see that box??

papr 22 November, 2022, 14:01:31

Yes, that is the box that I had in mind. This is the calibration area. The area in which the gaze estimation will be most accurate.

user-3aea81 22 November, 2022, 14:03:05

Aha...okay

user-3aea81 22 November, 2022, 14:07:19

Chat image

user-3aea81 22 November, 2022, 14:08:19

And please a few Description that graph???

user-3aea81 22 November, 2022, 14:09:24

What can I do when I see that...

papr 22 November, 2022, 14:10:19

I am not sure what you are referring to. This sounds like you are seeing an error in the picture. But in the picture, the software looks normal to me.

user-3aea81 22 November, 2022, 14:11:14

Yes it's correct.

user-3aea81 22 November, 2022, 14:11:52

I need to describe to customer

user-3aea81 22 November, 2022, 14:12:09

About software

user-3aea81 22 November, 2022, 14:12:44

So nowadays I do operations

user-3aea81 23 November, 2022, 07:58:08

Hello!!~~

user-80123a 23 November, 2022, 10:01:28

Hi all, I currently use an old eye tracker with only 2 cameras: left eye and right eye. So, I would like to know if it is possible to have the old algorithm that you used to calculate the direction of the gaze, when you used only 2 cameras? Thanks in advance.

papr 23 November, 2022, 10:07:14

The new eye tracker also uses two eye cameras. If you are referring to the old 3d pupil detection algorithm, you should use Pupil Core software version 2. Note, the 3d pupil detection works per eye camera, i.e. it is agnostic to the number of cameras

user-a9c776 23 November, 2022, 11:54:39

Hello, I have a question about the core eyetracker. Is there any analysis software from pupil labs that can automatically evaluate the recordings made? The recordings are around 2 hours long and an automatic evaluation would be a significant advance over doing it by hand. Or are there any recommendations for such a project? Thanks for an answer

papr 23 November, 2022, 11:57:14

Hi, what kind of metrics/evaluations are you looking for?

user-a9c776 23 November, 2022, 11:59:14

We want to measure the time the user is looking at a certain area. Like for example monitors (up to 6) and then per monitor a time as output value.

papr 23 November, 2022, 11:59:57

Have you been using the surface tracker in realtime/Pupil Capture?

user-a9c776 23 November, 2022, 12:00:13

Yes

papr 23 November, 2022, 12:01:16

Using this script, you can extract the realtime-surface-mapped gaze https://gist.github.com/N-M-T/b7221ace2e7acf0c0c836773a3b4cf7c and automate your analysis
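
A rough post-hoc alternative on the Player export, sketched below (the export paths, column names, and the 0.5 s gap threshold are assumptions):

```
# Sketch: estimate how long gaze dwelled on each surface (monitor) from the
# exported gaze_positions_on_surface_<name>.csv files.
import glob
import pandas as pd

for path in glob.glob("exports/000/surfaces/gaze_positions_on_surface_*.csv"):
    df = pd.read_csv(path)
    df = df[df["on_surf"].astype(str) == "True"].sort_values("gaze_timestamp")
    gaps = df["gaze_timestamp"].diff().dropna()
    dwell = gaps[gaps < 0.5].sum()  # ignore gaps where gaze left the surface
    print(f"{path}: looked at for ~{dwell:.1f} s")
```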

user-a9c776 23 November, 2022, 12:01:56

I'll try that, thanks 👍

user-4bc7ca 23 November, 2022, 20:34:01

Hello, I am using the Pupil Core glasses with Unity, and have used scripts from this resource: https://github.com/pupil-labs/hmd-eyes

I am aware this works for the HMD, but I was wondering if it works directly with the Pupil Core glasses?

user-040866 24 November, 2022, 06:35:44

Hello. I'm using the pupil core headset. I have a question about the gaze point 3D values in the gaze recordings. What is the datum (origin) of the 3D coordinates, since there are negative values? And what are their units? Thanks!

user-d407c1 24 November, 2022, 08:19:07

Hi @user-040866! Please check out our 3D camera space coordinates https://docs.pupil-labs.com/core/terminology/#coordinate-system, which is the coordinate system that gaze_point_3d follows. I also recommend you take a look at the reference system here too https://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html. As you can see, (x, y) can be negative depending on whether you look right/left or up/down. The z-axis should always be positive.

The units are in mm

user-3aea81 24 November, 2022, 08:04:13

hello. The client asked for an explanation of the software menu. Do you have an explanation for the menu? Not the data on the homepage.

papr 24 November, 2022, 08:08:59

Hey, the documentation should explain the general menus. Then, each plugin has its own menu. We don't have a description of every single UI element, but the documentation has a section on each plugin. That helps with understanding the menus.

user-beb3db 24 November, 2022, 10:45:11

Hi, i have a question: is it possible to use eye trackers while wearing glasses or is it better to take them off?

user-c2d375 24 November, 2022, 10:49:53

Hi @user-beb3db 👋 Sometimes it's possible to put the Core headset on first and then the eyeglasses, ensuring that the eye cameras capture the eyes from below the glasses frame. Note that it is not an ideal condition, but it does work for some people, depending on eye physiology and eyeglasses shape/dimensions.

user-6586ca 24 November, 2022, 12:35:25

Hello everyone! I have a question. Is it possible, using Pupil Core, to get the head positions (coordinates of the head or head movements)? Thank you in advance and have a nice day!

user-d407c1 24 November, 2022, 12:39:56

Hi @user-6586ca! Check out the head-pose-tracking plugin https://docs.pupil-labs.com/core/software/pupil-player/#head-pose-tracking

user-78c370 24 November, 2022, 13:04:37

Hello, I have some concerns about the eye safety of the IR LED. I saw in the BOM sheet that Core uses the SFH 4050 with 4 mW/sr radiant intensity. Currently, I can't find this LED at my local store. They have other alternatives. Can I consider any IR LED around 4 mW/sr - 14 mW/sr safe?

nmt 24 November, 2022, 14:46:15

Hi @user-78c370 👋. The IR LED listed in the DIY bill of materials should be available from online vendors. Note that Core conforms with the EU IEC-62471 standard for eye safety. We wouldn't be able to comment on the safety of third-party IR LEDs I'm afraid.

user-78c370 24 November, 2022, 16:01:26

Found an interesting document about eye safety.

REN_an1737_APN_20040119.pdf

user-89d824 24 November, 2022, 18:28:33

Hi,

I've noticed that for some of my participants, the edge of the pupil and iris is blurred in the eye camera. At first I thought it's a camera issue but I've swapped cameras and the problem persists.

I also used the same eyetracker/cameras on myself in the same lighting conditions and the problem wasn't reproduced.

My participants' eyes look normal to me, so it wasn't a result of some eye defect. I noticed that the two participants with the same problem had green irises, but not sure about the others who didn't have the problem. I've dark brown eyes, but I doubt everyone else I used the eye-tracker on successfully has dark brown eyes.

(I notice that this person has residual mascara on, but I don't think it would cause what I described -- but I could be wrong)

May I know what's causing this and how to fix it, please? Thanks! 🙂

Chat image

user-d407c1 25 November, 2022, 08:46:29

Hi @user-89d824 !

Mascara and eye cosmetics can influence the lipidic layer of the tear film. This can be why you might see this blurred area as the tear film becomes oily. You can find more about how eye cosmetics affect the tear film here https://www.dovepress.com/investigating-the-effect-of-eye-cosmetics-on-the-tear-film-current-ins-peer-reviewed-fulltext-article-OPTO



If you have physiological serum at hand, one way could be to "clean" the tear film before measuring.



That said, here are some steps you can take to improve pupil detection:
1. Position the eye camera to minimise 'bright pupil' and/or glare. Specifically, try to reduce the bright spot you can see near the pupil's edge.
2. Manually change the exposure settings to optimise the contrast between the pupil and the iris: https://drive.google.com/file/d/1SPwxL8iGRPJe8BFDBfzWWtvzA8UdqM6E/view?usp=sharing
3. Set the ROI to only include the eye region, excluding the dark corners of the image. Note that it is important not to set it too small (watch to the end): https://drive.google.com/file/d/1NRozA9i0SDMe_uQdjC2jIr000iPjqqVH/view?usp=sharing
4. Modify the 2D detector settings: https://docs.pupil-labs.com/core/software/pupil-capture/#pupil-detector-2d-settings
5. Adjust gain, brightness, and contrast: in the eye window, click Video Source > Image Post Processing.

Let us know how it goes!

user-6e1219 25 November, 2022, 04:51:19

Can you suggest some automatic pipeline method or code to get Saccades from Pupil Core recorded data.

user-beb3db 25 November, 2022, 10:29:15

Hello, how can I extract the surface identified by the markers in order to overlay the heatmap?

user-c2d375 25 November, 2022, 11:04:19

Hi @user-beb3db 👋 You can use our Surface Tracker plugin to obtain gaze data relative to the surface and generate the heatmap. Please take a look here for further details https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking

user-515175 25 November, 2022, 12:05:38

Is there a size of Pupil Core glasses for children aged 4?

nmt 25 November, 2022, 15:00:53

Hi @user-515175 👋. We offer a child-sized frame (not listed on the store) suitable for 3-9 year olds at the same cost as the adult-sized headset. If you would like to get a quote or place an order for a child-sized frame, just leave a note in the order form or reach out to [email removed]

user-1cd4d9 27 November, 2022, 01:19:08

Hi, I have a question regarding the message timestamps. I am using the Network API to read the pupil messages in real-time, but I have observed that the pupil messages I receive have timestamps which are behind the current pupil time. Here's the code I am using:

```
while True:
    socket.send_string('t')
    curr_time = float(socket.recv())
    logging.info(f'Before : Current time {curr_time}')

    topic, payload = subscriber.recv_multipart()
    msg = msgpack.loads(payload, raw=False)
    timestamp = msg['timestamp']
    logging.info(f'Msg timestamp {timestamp}')

    socket.send_string('t')
    curr_time = float(socket.recv())
    logging.info(f'After : Current time {curr_time}')
```

And an example of the output I observe:

```
INFO: Before : Current time 8087.21867836
INFO: Msg timestamp 8087.211949
INFO: After : Current time 8087.227443357
```

Is this expected? I'd expect the message timestamp to be in between the `Before` and `After` current time in this case. Maybe I am doing something wrong here?

papr 27 November, 2022, 12:55:14

Pupil data inherits their timestamps from the eye images which are in turn timestamped at their exposure. The time difference between now and the pupil datum corresponds to its processing time, including the transfer over the network.

Note that your way of estimating the current pupil time is not as accurate as it could be. See https://github.com/pupil-labs/pupil-helpers/blob/master/python/simple_realtime_time_sync.py
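
Roughly, the idea in that helper is to bracket the time request with local clock readings and use the midpoint; a minimal sketch (socket setup and address assumed, not a verbatim copy of the helper):

```
# Sketch: estimate the offset between the local clock and Pupil time by sampling
# the local clock before and after requesting the time from Pupil Remote.
import time
import zmq

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")  # assumed Pupil Remote address

def estimate_clock_offset():
    local_before = time.monotonic()
    remote.send_string("t")
    pupil_time = float(remote.recv_string())
    local_after = time.monotonic()
    local_mid = (local_before + local_after) / 2
    return pupil_time - local_mid  # pupil_time ~= local_time + offset

print(f"clock offset: {estimate_clock_offset():.6f} s")
```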

user-a3e405 28 November, 2022, 11:27:16

Hello Pupil team, this is Amber again. I recently asked about using the nearest older transformation to get surface data at a higher sampling rate, and you mentioned that you would write up an example. I'm not sure if I missed the example because I couldn't find it on Discord anymore 😥. Any pointers will be very much appreciated!

papr 28 November, 2022, 12:29:07

Hey, sorry for the delay. I will try to get this done today

papr 28 November, 2022, 17:13:32

Here you go! https://gist.github.com/papr/da7b17c165ccbfa6a6835e261607c51e

user-746d07 28 November, 2022, 14:53:19

Hello Pupil team, I am trying to do an experiment using pupil core to measure people's gaze while they are doing their daily activities. By daily activities, I mean for example, cooking, watching TV, etc. What exactly do I need to do when conducting such an experiment? For example, do we need to use printed calibration points instead of using the computer screen for calibration?

papr 28 November, 2022, 15:05:45

Hi, is there a reason for using Pupil Core? Pupil Invisible might be much better suited for this use case. That said, you can use printed markers if you want, but you can similarly use the screen marker calibration. That is because Pupil Core calibrates gaze relative to the scene camera, not the environment.

user-1cd4d9 28 November, 2022, 16:38:24

To estimate the correct delay you need

user-a3e405 28 November, 2022, 17:48:22

Thank you so much!! I will work on it and post here if I have follow up questions.

user-bdb6e6 28 November, 2022, 21:14:53

Hello Pupil team, my colleagues and I want to measure aspects of the vestibulo-ocular reflex with the Pupil Core system; specifically, ocular torsion. I was wondering if that is possible, and if so, which output measure we should look at?

user-d407c1 29 November, 2022, 09:04:31

Hi @user-bdb6e6 Pupil Core does not provide cyclotorsion measurements. To collect these, you would need to capture an iris pattern in the eye camera and check subsequent frames for rotations.

While you can change the eye camera resolution to 400 by 400 px, it might not be enough to detect features of the iris, especially if the subject's iris does not have many salient features to match (like freckles or distinctive collagen patterns). Potentially, one could use a cosmetic contact lens with an irregular pattern, as it may have more salient patterns to match, but that contact lens needs to be fitted by a professional, and you would have to account for movement of the contact lens. Astigmatic (toric) contact lenses have a specific marking that optometrists use to know whether the lens is correctly placed, yet those markings would most probably not be visible to the eye camera, as they are quite subtle.

End of November archive