πŸ‘ core


user-75df7c 01 November, 2022, 14:10:51

any idea why the surfaces folder in my export is coming up empty?

user-e3f20f 01 November, 2022, 14:11:38

No files at all?

user-75df7c 01 November, 2022, 14:12:24

nope :(

user-75df7c 01 November, 2022, 14:13:29

now I'm half remembering some problem with onedrive? was it when exporting?

user-75df7c 01 November, 2022, 14:14:11

but annotations are exporting fine, just surfaces aren't. folder is there but nothing in it

user-e3f20f 01 November, 2022, 14:14:23

Yeah, there are known issues with it. Mostly related to reading files. But I recommend moving the recording out of there and trying again

user-75df7c 01 November, 2022, 14:15:34

scratch that, files are being created very slowly! since they didn't appear when the process seemed to be finished, i assumed it hadn't worked and deleted the folder. thanks!

user-75df7c 01 November, 2022, 14:16:02

it's hard to know when it's finished though

user-e3f20f 01 November, 2022, 14:17:27

I hear you.

user-7daa32 01 November, 2022, 16:42:45

I have been away for too long. I am sure I missed a lot of interesting stuff and updates. Right now I am searching for a way to plot a heat map. Anyone with ideas? I simply have average saccade length data

user-1ed146 01 November, 2022, 20:23:13

Hey there, please help me out. One of the eye cameras is suddenly not working. Like showing nothing but all black. But another eye camera just works fine. Anybody has any ideas on this?

user-e3f20f 01 November, 2022, 20:41:11

Hey, black or gray?

user-1ed146 01 November, 2022, 20:45:52

It’s black. I have one screenshot if it helps.

user-e3f20f 01 November, 2022, 20:46:19

If it is fully black try restarting with default settings in the general settings

user-1ed146 01 November, 2022, 22:31:33

It worked. Thanks!

user-1ed146 01 November, 2022, 20:48:47

Alright. I will try out. Thank you.

user-4c21e5 02 November, 2022, 11:42:50

Hey @user-7daa32, welcome back! Is saccade length data really all you have? I'm not sure how a heatmap would work in that context, since presumably the length data has no directionality? Maybe a histogram/descriptive statistics would make more sense.

user-7daa32 03 November, 2022, 19:51:44

Heatmap created in player .... Thanks

user-7daa32 02 November, 2022, 12:16:35

Thank you

user-7daa32 02 November, 2022, 12:20:42

Not the Saccade length. It is a derived average using the Saccade lengths. What about using the gaze data, I was able to create a heat map in the player but the visual stimulus shape is distorted.

user-f5523e 02 November, 2022, 16:04:20

hi, can I ask an ELI5-type question as I haven't used the software yet - I have skimmed through the docs and couldn't quite figure it out. So if you wanted to define an AOI post-recording (i.e. without markers during recording) - I see you can add a surface with the surface tracker plugin - but how does this work with continuous video with a dynamic world and/or head? Would you need to modify the surface for each frame of the video, assuming a shift of the AOI in the camera image?

user-e3f20f 02 November, 2022, 16:22:43

"without markers during recording" - that does not work 🙂 You can detect the markers post-recording but they need to be visible in the scene video.

user-f5523e 02 November, 2022, 16:23:58

ok - lets say I wanted to have an AOI as a car that moves across someone's view - would this be possible in post-processing analysis (using your software)?

user-e3f20f 02 November, 2022, 16:34:54

If you placed a huge marker on it, maybe 😄 But this is probably not what you are looking for. In your case, I would run a third-party object detector on the scene video and check whether the exported raw gaze data falls onto any of the detected objects.
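For the last step, a minimal sketch of the gaze-vs-detections check (assuming the detector returns axis-aligned bounding boxes in scene-camera pixel coordinates, and that gaze has already been converted to pixels):

```python
def gaze_in_box(gaze_px, box):
    """Return True if a gaze point (x, y) in scene-camera pixels
    falls inside a bounding box (x_min, y_min, x_max, y_max)."""
    x, y = gaze_px
    x_min, y_min, x_max, y_max = box
    return x_min <= x <= x_max and y_min <= y <= y_max


def hit_objects(gaze_px, detections):
    """Labels of all detected objects a gaze point falls onto.
    detections is a list of (label, box) tuples from the detector."""
    return [label for label, box in detections if gaze_in_box(gaze_px, box)]
```

For example, `hit_objects((320, 240), [("car", (300, 200, 400, 300))])` returns `["car"]`. The function names and the detection format are illustrative, not part of any Pupil Labs API.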

user-f5523e 02 November, 2022, 16:36:45

ok thanks - is there software that you would recommend that I could have a look at - ideally free?

user-e3f20f 02 November, 2022, 16:38:35

There are many neural networks that have been published over the years. I am not up-to-date in this regard. The choice would also heavily depend on what you need, e.g. if you only need a rough outline (a rectangle) or an accurate per-pixel map.

user-908b50 02 November, 2022, 16:39:52

This is it! I am not sure if I would need to re-export the data again. I am using version 2.5.0. https://pupil-labs.com/releases/core/v3.3/

user-e3f20f 02 November, 2022, 16:41:41

You are not affected. The bug only applies to 3.0-3.2

user-f5523e 02 November, 2022, 16:41:48

yeah there has been a lot going on with automatic scene recognition - ok so I think I get it - the analysis is very geared up around the markers - cheers

user-66797d 02 November, 2022, 20:59:54

I have a question about the demo workspace. I tried playing around with the videos and things but realized that most of the templates and videos there were locked. Is the demo workspace designed to be locked to edits, or is there a way around this to allow for edits?

user-bbd687 03 November, 2022, 04:01:45

How does the natural calibration algorithm work? How does the code match the gaze point with the red point in the natural scene? What algorithms are used to deal with it?

user-bbd687 03 November, 2022, 04:02:51

Could you please explain the running logic of the natural calibration algorithm? Use pictures 😀 👍

user-f6a634 03 November, 2022, 04:52:49

Hello, I've been using the LSL Relay plugin to stream data to LSL while also using the lsl_inlet.py example in pupil helpers to export the data to a .csv file. How would I go about modifying the lsl_inlet.py example in order to display the computer's local clock for the timestamp, using datetime for example?
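One possible approach (an untested sketch, not taken from the actual lsl_inlet.py example: it assumes LSL timestamps come from pylsl.local_clock(), so you can estimate a constant offset to the wall clock once and apply it to every sample):

```python
import time
from datetime import datetime, timezone


def lsl_to_datetime(lsl_timestamp, lsl_now, wall_now=None):
    """Map an LSL sample timestamp onto the computer's wall clock.

    lsl_now should be pylsl.local_clock() sampled at roughly the same
    moment as wall_now (time.time()); their difference is the constant
    offset between the two clocks, applied to every sample timestamp.
    """
    if wall_now is None:
        wall_now = time.time()
    offset = wall_now - lsl_now
    return datetime.fromtimestamp(lsl_timestamp + offset, tz=timezone.utc)
```

In the inlet loop you would call this once per pulled sample, passing `pylsl.local_clock()` as `lsl_now`, and write the resulting datetime into the csv instead of the raw timestamp.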

user-e3f20f 03 November, 2022, 06:34:15

Ah, you need to run the commands in the command prompt, not within python 🙂 the commands will start their own python instance to run the code in isolation.

user-d90133 03 November, 2022, 13:03:01

I'm still having issues when entering the script in different ways. Would you be able to list the steps I need (e.g. packages I need to download, code I need to enter beforehand) to make this script work? Thanks!

Chat image

user-bbd687 03 November, 2022, 10:26:52

@user-e3f20f (Pupil Labs) hi

user-e91538 03 November, 2022, 10:45:20

Hi there. Just a quick question as I am writing a paper - when one does the eye tracking calibration in Pupil Capture, an error message can appear when the calibration is considered not sufficient (i.e. low data confidence, notably, I imagine?). What threshold / error of measure does Pupil Capture automatically use? In other words, what data quality is considered acceptable for the software to consider the calibration sufficient? Thank you so much for your answer.

user-e3f20f 03 November, 2022, 10:46:25

Hi 🙂 2d or 3d calibration?

user-e91538 03 November, 2022, 11:42:03

Hi there ! 3D calibration please !

user-741073 03 November, 2022, 11:17:23

Hi, I'm facing some issues with one of the pupil cameras

user-741073 03 November, 2022, 11:17:41

Chat image

user-e3f20f 03 November, 2022, 11:32:09

Please contact info@pupil-labs.com in this regard

user-741073 03 November, 2022, 11:32:15

Chat image

user-741073 03 November, 2022, 11:32:16

okay

user-e3f20f 03 November, 2022, 11:49:36

In that case it will only fail for sure if there was no pupil or no reference data collected. Otherwise it will attempt to run the calibration, which is a complex optimization function. There is no clear-cut value that tells you beforehand if it will converge or not. Even if it converges, it might be very inaccurate. This is why we have the accuracy visualizer, which applies the estimated calibration to the recorded pupil data and compares it to the reference data. This tells you how good the fit is in visual angle error. Depending on your use case, you can decide to repeat the calibration or proceed.

user-e91538 03 November, 2022, 11:58:58

Thank you! 🙂

user-e91538 03 November, 2022, 11:59:12

And what about the 2D calibration? Does that have a threshold?

user-e3f20f 03 November, 2022, 13:20:07

You are running both commands as one, which confuses the command line. Enter each line one by one.

user-d90133 03 November, 2022, 13:39:10

I'm still getting errors

Chat image

user-d407c1 03 November, 2022, 13:43:38

Hi @user-d90133! You are missing git https://git-scm.com/. Install it, make sure it is in your PATH, and run the command again. Check out https://www.activestate.com/resources/quick-reads/pip-install-git/

user-d90133 03 November, 2022, 15:39:24

Hi, so I installed Git, and attempted to run the lines again. Are the errors due to an incorrectly set up PATH?

Chat image

user-bbd687 06 November, 2022, 12:28:54

hi @user-e3f20f (Pupil Labs) When I run this code, I get an error and can't install it

Chat image Chat image

user-bbd687 06 November, 2022, 12:30:06

👋

user-bbd687 06 November, 2022, 12:31:51

Before I run the code, I have downloaded the 'requirements.txt' file

user-bbd687 06 November, 2022, 12:32:35

@user-d407c1

user-0777d6 07 November, 2022, 22:42:00

Hi, I compiled pupil core on my nvidia xavier nx (arm based), if try to run pupil capture i get the error glfw.GLFWError: (65542) b'GLX: No GLXFBConfigs returned'

user-e3f20f 08 November, 2022, 09:14:23

This looks like an issue with one of our dependencies. See https://www.glfw.org/

user-0777d6 07 November, 2022, 22:44:19

i can run only the service app, but the fps rate is very slow (3-5 fps...)

user-e3f20f 08 November, 2022, 09:13:38

Pupil Capture requires a fair amount of CPU. It is possible that this device is not able to deliver it. Note, Pupil Core software does not make explicit use of the GPU.

user-f590a4 08 November, 2022, 09:22:12

hi, i have a question regarding exporting multiple recordings at once in Pupil Player. I automated Pupil Capture to trigger recording if certain events happened. Now i have about 200 single recordings and i need to export the 3d pupil diameter with Pupil Player... Is there any faster way than dragging and dropping every single folder into it and pressing e?

user-e3f20f 08 November, 2022, 09:26:02

Hi! Are you only interested in the recorded pupil data, without re-processing the eye videos?

user-e3f20f 08 November, 2022, 09:26:45

If that is the case, check out https://gist.github.com/papr/743784a4510a95d6f462970bd1c23972

user-f590a4 08 November, 2022, 10:36:10

is there any explanation on how to use this right? I didn't seem to get it working... 🤔

user-f590a4 08 November, 2022, 09:29:46

Yes, perfect, thank you very much! That's exactly what i need! 🙂

user-0777d6 08 November, 2022, 09:37:04

i have another problem: if i capture the world camera with uv4l, the image is distorted, like a fisheye but only on the left part

user-0777d6 08 November, 2022, 09:46:30

solved the fisheye problem 🙂 But Linux detects the 2 eye cameras as video devices; if i try to capture video from the world camera it works, but from the 2 eye cameras it doesn't. Maybe this is the reason why the fps is so slow (pretty blocked...)?

user-e3f20f 08 November, 2022, 09:49:26

Just to clarify, you selected the world camera in the world process and the eye cameras in the eye processes, correct?

user-0777d6 08 November, 2022, 09:50:05

I launch sudo python3 main.py service

user-0777d6 08 November, 2022, 09:50:21

the 2 eye cameras start, display a single image, then freeze

user-e3f20f 08 November, 2022, 09:51:58

Right! If you close one of the eye windows, does it work?

user-0777d6 08 November, 2022, 09:52:15

no, everything is frozen, i need to kill the process

user-0777d6 08 November, 2022, 09:52:18

i get this error

user-0777d6 08 November, 2022, 09:52:31

Assertion failed: ok (src/mailbox.cpp:99)

user-e3f20f 08 November, 2022, 09:53:14

Are you running from source? Which branch are you running from?

user-0777d6 08 November, 2022, 09:52:39

and a lot of cython error logs...

user-0777d6 08 November, 2022, 09:53:47

i cloned the master branch and recompiled everything from source

user-98789c 08 November, 2022, 10:52:33

Hi all, is there a post hoc kind of way to know with what sampling frequency I recorded pupil size using the Pupil Core?

user-e3f20f 08 November, 2022, 10:57:15

You can simply subtract neighbouring pupil timestamps (grouped by eye id) to get the inter-sample duration. 1 / inter-sample duration gives you a per-sample estimate of the sampling frequency. I recommend averaging the values over time.
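As a small sketch of that recipe (assuming the timestamps of a single eye from a pupil_positions export, in seconds):

```python
import numpy as np


def mean_sampling_rate(timestamps):
    """Per-sample sampling rates from neighbouring timestamp
    differences, averaged over the whole recording (Hz)."""
    ts = np.sort(np.asarray(timestamps, dtype=float))
    per_sample_hz = 1.0 / np.diff(ts)  # 1 / inter-sample duration
    return float(per_sample_hz.mean())
```

For a 200 Hz eye camera you would expect values close to 200, with local dips wherever frames were dropped.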

user-98789c 08 November, 2022, 11:08:29

Ah yes, thanks Pablo, I just remembered that I actually had to write a little script to do this, because the missing data were messing with my annotations. But if I have not changed any settings on Pupil Capture while recording, the sampling frequency should be around 200Hz, right?

user-e3f20f 08 November, 2022, 11:09:04

The default setting is 400x400 resolution at 120 Hz

user-98789c 08 November, 2022, 11:10:11

perfect, thank you 🙂

user-75df7c 08 November, 2022, 11:57:38

Any recommendations as to what confidence level to use as a cut off point? I want to filter out noisy data and keep only the samples where gaze is good, but I don't know how much is enough. Thanks in advance!

user-e91538 18 November, 2022, 14:17:09

Hi, did you use only the confidence level as a filter or did you enter other thresholds for example on diameter, smooth and more? How did you filter the signal?

user-c2d375 08 November, 2022, 12:15:33

Hi @user-75df7c 👋
I recommend you discard data points with a confidence level lower than 0.6
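In pandas terms, that filter is a one-liner (a sketch; it assumes an export with a confidence column, e.g. gaze_positions.csv):

```python
import pandas as pd


def drop_low_confidence(df, threshold=0.6):
    """Keep only samples whose confidence reaches the threshold."""
    return df[df["confidence"] >= threshold].reset_index(drop=True)
```

Usage would be `drop_low_confidence(pd.read_csv("gaze_positions.csv"))` before any further analysis.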

user-75df7c 08 November, 2022, 12:21:33

Thanks! I was using 0.8 until now and I was worried I was not being strict enough! Any reason for this particular number?

user-e3f20f 08 November, 2022, 12:28:08

It is difficult to evaluate this number quantitatively. It's more of a qualitative estimate of what is accurate "enough" and what is not. In the end, this threshold is a trade-off between "how good do you want your data to be" and "how much data are you ready to discard". If you have clean data, you can afford to increase the threshold. If your data is noisy, you might need to reduce the threshold to have any data left. Ideally, you check that pupil detection works well before you start recording. But one does not always do their own data collection...

user-75df7c 08 November, 2022, 12:29:14

Absolutely. Ultimately I just wondered if there was an official recommendation from you guys so I can defend whatever number I choose with my supervisors haha

user-e3f20f 08 November, 2022, 12:29:50

hehe, I get that. 0.6 is that number.

user-75df7c 08 November, 2022, 12:30:19

Thank you!

user-0e5193 09 November, 2022, 10:39:42

Hello, I have a question about the 3D gaze origin. I understand the origin of the x and y axes but not the origin of the z axis. What does the z value of gaze_normal mean? If it were 1, where should I think a person is looking?

user-e3f20f 09 November, 2022, 10:40:45

Hi, the z-axis points forward 🙂

user-0e5193 09 November, 2022, 10:45:20

Thank you for your fast answer. But I want to know the origin of the z value. I think if it is 0, a person sees something close. But when it is 1, I don't know how far away they are looking.

user-e3f20f 09 November, 2022, 10:47:55

The origin of the gaze coordinate system is the scene camera, i.e. (0, 0, 0) is the center of the scene camera sensor.

The gaze_normal0/1 vectors are only directional vectors. They originate at the eye_center0/1 points. Together, they define a line of sight for each eye

user-0e5193 09 November, 2022, 10:51:56

Oh, thank you. I did misunderstand. So, the z value means the world camera's vectors right?

user-e3f20f 09 November, 2022, 10:56:51

The z value alone is nearly meaningless.

Imagine two 3d lines. One per eye. Each goes through the center of the eye ball and the center of the pupil. These lines point towards what you are looking at. Each line is defined by a point (eye_center0/1) and a direction (gaze_normal0/1).

We try to find the intersection of those lines, which corresponds to gaze_point_3d.

Is this clearer?
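The "intersection" of the two lines of sight can be sketched geometrically like this (not the exact implementation Pupil uses; since two 3D lines rarely intersect exactly, this computes the midpoint of the shortest segment between them, the usual stand-in for gaze_point_3d):

```python
import numpy as np


def nearest_point_between_lines(p0, d0, p1, d1):
    """Midpoint of the shortest segment between two 3D lines.

    Each line is given by a point p (eye_center0/1) and a direction
    d (gaze_normal0/1). For perfectly intersecting lines this is the
    intersection itself.
    """
    p0, d0, p1, d1 = (np.asarray(v, dtype=float) for v in (p0, d0, p1, d1))
    w = p0 - p1
    a, b, c = d0 @ d0, d0 @ d1, d1 @ d1
    d, e = d0 @ w, d1 @ w
    denom = a * c - b * b
    if abs(denom) < 1e-12:  # lines are (nearly) parallel
        s, t = 0.0, e / c   # project p0 onto the second line
    else:
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
    # closest point on each line, then the midpoint between them
    return (p0 + s * d0 + p1 + t * d1) / 2.0
```

For example, two eyes at (-30, 0, 0) and (30, 0, 0) mm whose directions both point at (0, 0, 400) yield a gaze point of (0, 0, 400).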

user-0e5193 09 November, 2022, 10:53:13

Then the z value is the difference from the gaze determined by x and y?

user-0e5193 09 November, 2022, 11:03:06

Sorry. I became more confused. Is gaze_normal0/1 not enough to know the wearer's gaze? This question originated from this reference picture. I think I can know direction only with x and y, but I don't know what z means.

Chat image

user-e3f20f 09 November, 2022, 11:10:37

You can't interpret z independently of x and y in this case.

user-e3f20f 09 November, 2022, 11:09:08

Note, gaze_normal0/1 are normalised. That means that the length of this vector will always be 1. It describes the rotation of one eye ball.

"Then the z value is the difference from the gaze determined by x and y?" If you so want, z depends on the values of x/y, yes.

user-0e5193 09 November, 2022, 11:04:32

Until I asked, I thought z was the distance to the object.

user-0e5193 09 November, 2022, 11:11:35

Thank you very much. Then, how can I interpret z with x and y?

user-e3f20f 09 November, 2022, 11:14:12

As mentioned above, x/y/z describe a direction. Specifically, the direction one eye ball is rotated towards. The direction alone is not so useful, unless you want to know by how much the eye ball rotated in comparison to a second eye ball direction.

user-0e5193 09 November, 2022, 11:23:37

Thank you very much. Looking back on your answer, I think I was confused with gaze_point. Now it is clear.

user-bf2b1e 09 November, 2022, 11:14:36

Hello, I am having troubles connecting to pi.local:8080

user-0e5193 09 November, 2022, 11:40:20

Can I ask you one last question? I want to know the unit of each gaze_point value.

user-e3f20f 09 November, 2022, 11:42:03

Note, the length (the distance to the gazed-at object) of the gaze_point_3d vector is known to be inaccurate for distances >70-100 cm.

user-e3f20f 09 November, 2022, 11:40:39

The unit for that coordinate system is millimeters

user-0e5193 09 November, 2022, 11:42:01

Then, is the z value distance to the object?

user-e3f20f 09 November, 2022, 11:42:46

Not z alone. But the length of the whole vector is.

user-0e5193 09 November, 2022, 11:43:30

I see! thank you. Have a nice day.

user-e91538 09 November, 2022, 15:16:57

Hi everyone, I would like to use my Pupil Labs Core to determine how often a gaze direction change occurs between three monitors, measure fixations and make a heat plot. I now have a setup that allows me to do robust tracking. Despite the good documentation, I still have some questions: In which format are the timestamps? For example, what does "9245888951" mean under "world timestamp"? Under pupil_positions.csv there is also the variable "diameter". This is shown to me as e.g. "2292571258544920". How is this number to be interpreted? Is there more detailed documentation for beginner questions? Thank you!

user-e3f20f 09 November, 2022, 15:20:49

Both world timestamp and diameter should be decimal point values. The first in seconds, the second in pixels.

user-e3f20f 09 November, 2022, 15:17:50

Hey, are you looking at the exported csv in Excel?

user-e91538 09 November, 2022, 15:18:02

Hi! Yep

user-e3f20f 09 November, 2022, 15:22:29

See https://stackoverflow.com/questions/11421260/csv-decimal-dot-in-excel

user-e3f20f 09 November, 2022, 15:19:51

Excel is known to misinterpret the exported data, depending on the language settings. e.g. 2.5 (2,5 in German notation) is interpreted as 2500 because the . in German is the separator for large numbers.

user-e91538 09 November, 2022, 15:29:54

Thanks for your quick and good explanation, but could you please explain it to me again? When I click in the "diameter" field in Excel, it shows me "2292571258544920". Now I understand that it is pixels, right? How many pixels would that be and can I convert that to mm in a reasonable way?

user-e3f20f 09 November, 2022, 15:32:12

2) Check out https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter If you want the diameter in mm, use the diameter_3d column (which is likely affected by issue (1), too)

user-e3f20f 09 November, 2022, 15:31:28

1) Excel is displaying the value incorrectly. The value is likely 22.92571258544920 in the original csv file. You will need to reimport the CSV with the correct language settings s.t. these values are displayed correctly.
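If Excel keeps fighting you, reading the export with pandas sidesteps the locale issue entirely, since read_csv always treats "." as the decimal separator. A minimal sketch (the column names follow the pupil_positions.csv export; the inline csv text just stands in for the real file):

```python
import io

import pandas as pd

# In practice: df = pd.read_csv("pupil_positions.csv")
csv_text = "pupil_timestamp,diameter\n9245.888951,22.92571258544920\n"
df = pd.read_csv(io.StringIO(csv_text))

# The diameter arrives as a proper float, not as 2292571258544920
print(df["diameter"].iloc[0])
```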

user-e91538 09 November, 2022, 15:32:53

Great. Now I got it. Thanks!

user-53a74a 09 November, 2022, 19:14:50

Hello folks, I'm planning to buy a new PC with a new CPU for my experiments, as the current CPU cannot keep recording even at 30Hz. I found this old post saying that the Pupil Labs bundle does not support Xeon processors. Is this still an issue? Should I get i7 or i9 processors? https://discord.com/channels/285728493612957698/285728493612957698/578844966638452749

user-e3f20f 09 November, 2022, 19:15:34

That is still correct

user-53a74a 09 November, 2022, 19:15:51

Thank you so much!

user-0e5193 10 November, 2022, 01:33:24

Hi, can I ask a question? What is the difference between the norm_pos_x/y values in pupil_positions and gaze_positions?

user-e3f20f 10 November, 2022, 07:57:51

The frame of reference is the difference. pupil norm_pos refers to a point in the corresponding eye video camera. gaze norm_pos refers to a point in the scene camera.
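For reference, a sketch of mapping norm_pos into pixel coordinates (this assumes the documented Pupil convention that the normalized space has its origin at the bottom-left with y pointing up, while image pixels have their origin at the top-left):

```python
def norm_pos_to_pixels(norm_pos, frame_size):
    """Convert a norm_pos (x, y) in [0, 1], origin bottom-left,
    to pixel coordinates with origin top-left. frame_size is the
    (width, height) of the eye or scene camera frame."""
    x, y = norm_pos
    w, h = frame_size
    return x * w, (1.0 - y) * h
```

So a gaze norm_pos of (0.5, 0.5) lands at the center of the scene frame, e.g. (640.0, 360.0) for a 1280x720 video.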

user-0e5193 10 November, 2022, 13:00:39

Thank you so much!

user-2196e3 10 November, 2022, 12:39:19

Hi, I need to use pupil diameter. But blinks affect pupil diameter. Do you have any reference code or resources to get rid of the blink points and interpolate new values?
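There is no single canonical recipe, but a common preprocessing sketch looks like this (assumptions: a pupil_positions-style DataFrame, blinks showing up as low-confidence stretches, padding of ~100 ms around them, and linear interpolation being acceptable for the analysis):

```python
import numpy as np
import pandas as pd


def deblink_diameter(df, conf_threshold=0.6, pad_s=0.1):
    """Mask diameter samples during and around low-confidence
    stretches (typically blinks), then linearly interpolate the gaps."""
    out = df.sort_values("pupil_timestamp").reset_index(drop=True)
    bad = (out["confidence"] < conf_threshold).to_numpy()
    # widen the mask: also discard samples within pad_s of a bad sample,
    # since pupil size estimates degrade around partial lid closure
    t = out["pupil_timestamp"].to_numpy()
    bad_t = t[bad]
    if bad_t.size:
        bad |= (np.abs(t[:, None] - bad_t[None, :]) <= pad_s).any(axis=1)
    out.loc[bad, "diameter_3d"] = np.nan
    out["diameter_3d"] = out["diameter_3d"].interpolate()
    return out
```

The threshold, padding, and interpolation method are all tunable; for publication-grade pupillometry, dedicated pipelines (e.g. PyPlR) are worth a look.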

user-0777d6 10 November, 2022, 14:43:01

hi, can I copy the calibration data from one PC to another? I can launch the capture app on my PC and calibrate the glasses, but i need the service app on my SBC with linux. The problem is the SBC can only use the service app, and the calibration doesn't work, but i need IPC gaze messages (no messages without calibration...)

user-e3f20f 10 November, 2022, 14:46:10

If you are ok without the scene video, what is it that you need the gaze data for? In other words, what kind of data are you interested in in particular?

user-0777d6 10 November, 2022, 14:47:04

i need gaze data, i need to capture the world camera and then store where the user is looking

user-0777d6 10 November, 2022, 14:47:29

on the SBC i use the service app, but without a calibration it doesn't send me data

user-e3f20f 10 November, 2022, 14:47:56

Let's continue in the thread of the other day

user-1ed146 10 November, 2022, 19:33:57

Hello folks, can anyone help here? One of the eye cameras stopped functioning, showing 0 fps. And the eye camera window is grey. How can I fix it??

user-1ed146 10 November, 2022, 19:34:52

Any answer is appreciated.

user-e3f20f 10 November, 2022, 19:38:18

please contact info@pupil-labs.com in this regard

user-1ed146 10 November, 2022, 19:39:55

Okay. Thanks. Will shoot them an email then.

user-e91538 11 November, 2022, 10:59:09

does Pupil Invisible work with iOS???

user-e3f20f 11 November, 2022, 10:59:51

Hi, no it does not. 🙂 It only works on OnePlus 6 and OnePlus 8/8T with Android

user-72f9ba 11 November, 2022, 16:26:47

Hello!

user-72f9ba 11 November, 2022, 16:26:56

I just got the glasses and companion device

user-72f9ba 11 November, 2022, 16:27:04

The glasses do not connect to the companion device

user-e3f20f 11 November, 2022, 16:35:06

Hi! The first steps will always be to connect the glasses via the included USB cable to the Companion device / phone. Have you done that already?

user-72f9ba 11 November, 2022, 16:27:08

How can i connect them?

user-72f9ba 11 November, 2022, 16:39:25

Does it need to be connected the whole time?

user-e3f20f 11 November, 2022, 16:42:33

When you want to use it, yes! It gets its power from the phone and the gaze estimation happens in the phone, not the glasses themselves 🙂

user-72f9ba 11 November, 2022, 16:42:46

That's not what is advertised. This is a major issue

user-e3f20f 11 November, 2022, 16:43:12

Can you point me to the specific advertisement that you are referring to?

user-72f9ba 11 November, 2022, 16:44:19

https://pupil-labs.com/products/invisible/ Can you show me where it is written?

user-e3f20f 11 November, 2022, 16:45:48

In all the pictures where people are wearing the glasses, you can see the cable. We never advertise that the glasses can be used wirelessly 🙂

user-4c21e5 11 November, 2022, 17:01:16

Hey @user-72f9ba 👋. Does your use-case strictly exclude usb connectivity? Is the issue cable management? You can also reach out to info@pupil-labs.com to discuss product fit and options 👍

user-7b683e 12 November, 2022, 10:11:59

Hello Dear Pupil Labs Team,

We want to mention an issue I have encountered in Pupil Capture using the Core product.

I use a new computer with a Ryzen 7 and a 3000-series Nvidia GPU. On this device, the frequency of the eye cameras is fairly suitable for my new purpose, 120+ Hz. However, the world camera gives nearly 15 FPS, with huge latency, and sometimes freezes. I guess due to some reasons related to the world camera process, I sometimes got a Blue Screen error on Windows 10. When the error didn't occur, gaze became really noisy.

I would like to hear your suggestions about the relevant jumping data. I know the issue is not clear, but maybe some aspects can be enough to predict it.

Have a good day!

user-c7cfe7 13 November, 2022, 08:01:14

Hello developers. I am a university student in Japan. I am using Pupil Core in my research and I have done the following to enable LSL: 1. copy the pylsl and pupil_capture_lsl_relay directories to pupil_capture_settings/plugins, 2. run Capture with sudo

However, when I try to activate LSL, I get the following error which I cannot resolve.

world - [WARNING] plugin: Failed to load 'pupil_capture_lsl_relay'. Reason: 'liblsl library '/Users/tappun/pupil_capture_settings/plugins/pylsl/lib/liblsl.dylib' found but could not be loaded - possible platform/architecture mismatch.

You can install the LSL library with conda: conda install -c conda-forge liblsl or with homebrew: brew install labstreaminglayer/tap/lsl or otherwise download it from the liblsl releases page assets: https://github.com/sccn/liblsl/releases On modern MacOS (>= 10.15) it is further necessary to set the DYLD_LIBRARY_PATH environment variable. e.g. >DYLD_LIBRARY_PATH=/opt/homebrew/lib python path/to/my_lsl_script.py'' world - [WARNING] plugin: Failed to load 'pylsl'. Reason: 'liblsl library '/Users/tappun/pupil_capture_settings/plugins/pylsl/lib/liblsl.dylib' found but could not be loaded - possible platform/architecture mismatch.

I ran the following, but it made no difference.

brew install labstreaminglayer/tap/lsl

I don't have a good understanding of DYLD_LIBRARY_PATH, so I tried specifying it as follows.

DYLD_LIBRARY_PATH="/opt/homebrew/lib/python3.9/site-packages/pylsl/pylsl.py" It would be helpful to know if there are any mistakes.

How can I do this? The OS used is macOS 13.0 (22A380), Python 3.9.

user-e3f20f 14 November, 2022, 07:53:59

The bundle ships the intel-x86_64 python that is emulated on m1 macs. You installed the native M1 pylsl which is why you get the emulation error.

user-c7cfe7 13 November, 2022, 09:07:09

Regarding the above problem, we considered the possibility that liblsl does not support M1 Mac, and by using a macbookpro (ver. 11.6) equipped with an intel CPU, the error did not occur. Therefore, we will use that machine for development for a while. If you know how to run it on the latest macOS and M1 Mac, I would appreciate it if you could let me know.

Thanks.

user-e3f20f 14 November, 2022, 07:54:37

I recommend doing so on the develop branch. See the special note about creating an intel-x86 virtual environment https://github.com/pupil-labs/pupil/tree/develop#installing-dependencies-and-code

user-e91538 14 November, 2022, 10:09:00

Analyze the data

user-e91538 14 November, 2022, 10:30:28

Hi all, does anyone have a detailed explanation of how to analyze the data exported from the system, with the codes and parameters that can be obtained? I am new here and I wanted to understand how they work. Also I wanted to know if the analyses can be done directly from the software you download or if you need to implement them with Python or Matlab code, thanks!!!

user-4c21e5 14 November, 2022, 11:32:50

Hi @user-e91538 πŸ‘‹. Welcome to the community! You can find a detailed overview of analysis and visualisation plugins available in Pupil Player (our free desktop analysis software) here: https://docs.pupil-labs.com/core/software/pupil-player/#pupil-player. You don't need to use coding to work with these tools. That said, you can export the results into .csv files, which can of course be processed with Python or Matlab, if that's your interest. If you can share details of your use-case, we can try to follow up with more concrete guidance/examples!

user-e91538 14 November, 2022, 10:37:29

and how does one analyze the data? Python, Matlab, C++?

user-e91538 14 November, 2022, 12:33:59

For example, if you wanted to graph how the diameter of the pupil changes as you change the difficulty of the action taking place? Can it be done directly from the software or do you need to export the data and create special code? Is there perhaps already some code ready to use? Sorry for the several questions

user-4c21e5 14 November, 2022, 14:14:22

That's no problem at all! You can see a very basic trace of pupil size data over time in our Software. But to extract insights about changes due to cognitive load, you'd need to export the data and do some further processing. Here are a few resources: 1. Best practices for doing pupillometry with Pupil Core: https://docs.pupil-labs.com/core/best-practices/#pupillometry 2. Third-party Python tools for evaluating pupil size changes (should be compatible with Core recordings): https://pyplr.github.io/cvd_pupillometry/index.html 3. Pre-processing pupil size data: https://discord.com/channels/285728493612957698/285728493612957698/854346611076235274 I hope this helps!

user-b3b1d3 14 November, 2022, 15:40:25

Hello everybody! I was wondering if there is a 32-bit version of the pupil_core software for linux

user-b3b1d3 14 November, 2022, 15:40:46

To try to run it in a raspberry pi

user-e3f20f 14 November, 2022, 15:42:02

Hi, what you specifically need for that are arm-based versions of the dependencies. Check out the #1039477832440631366 thread history.

user-b3b1d3 14 November, 2022, 15:45:44

Ok, thank you. I will take a look.

user-b3b1d3 14 November, 2022, 15:53:12

On the other hand, I guess i will have to use a similar approach if I need to run it on windows 11, right?

user-e3f20f 14 November, 2022, 15:54:04

Do you mean Windows 11 on the rpi?

user-b3b1d3 14 November, 2022, 15:54:16

no, on another machine

user-e3f20f 14 November, 2022, 15:54:59

We offer a pre-built bundle for Windows on Intel/AMD 64 systems

user-b3b1d3 14 November, 2022, 16:56:37

OK, thanks, it's up and running now on windows 11. I will try to build on the rpi tomorrow. Thank you for your time.

user-e3f20f 15 November, 2022, 10:49:23

Maybe an additional note: Pupil Core requires a fair amount of CPU. It is likely that the rpi won't be able to provide sufficient speed to run the pupil detection at higher frame rates. You can use it to record the video and run the detection post-hoc though.

user-e91538 15 November, 2022, 10:49:20

I have another question: does the Pupil Player software export the data already filtered and smoothed, or do we then need to process it through a filter eliminating the sources of noise?

user-e3f20f 15 November, 2022, 10:50:09

Hi! Pupil Player does not filter or smooth the data. We recommend discarding samples with a confidence of 0.6 or lower before preprocessing the exported data any further (see the third point listed in Neil's message)

user-946395 15 November, 2022, 17:42:08

Hi there! I am discussing a research study that used Pupil Labs CORE glasses. This is my students' first conversation about eye tracking details in my psychology of music class. I am looking for a post-hoc video that might illustrate fixations, saccades, etc. There are several videos on the Pupil Labs channel of post-hoc analysis and rendering, but there was no audio in the videos. Thanks in advance! I downloaded the demo videos from this link, https://pupil-labs.com/products/core/tech-specs/, but I cannot get them to load in any video apps that I have. Any help is appreciated!

user-e3f20f 15 November, 2022, 17:43:30

Please open the demo recordings in Pupil Player and export them. The export will include the raw data as csv and the scene video with overlayed gaze. Check out our documentation for instructions.

user-946395 15 November, 2022, 17:52:55

are the demo recordings titled world.mp4, eye1.mp4, eye0.mp4? Thanks for the help. Looking to do work in eye tracking with Pupil SOON!

user-e3f20f 15 November, 2022, 18:00:05

These are the raw camera recordings. You can play them back one by one using e.g. VLC media player.

user-bbd687 16 November, 2022, 08:36:07

@user-e3f20f I've finished the recording. How do I export the (x, y) coordinates of the gaze point in the world camera view?

user-bbd687 16 November, 2022, 08:36:41

The (x,y) coordinates of the red point

user-e3f20f 16 November, 2022, 08:37:45

Open the recording folder in Pupil Player and press the export button. This will save the points as csv

user-bbd687 16 November, 2022, 08:38:37

I have finished

user-bbd687 16 November, 2022, 08:38:45

πŸ˜ƒ

user-bbd687 16 November, 2022, 08:39:16

What is the name of the (x,y) coordinate of the red dot?

user-bbd687 16 November, 2022, 08:39:31

Variable name (x,y)

user-e3f20f 16 November, 2022, 08:39:44

norm_pos

user-bbd687 16 November, 2022, 08:39:55

Chat image

user-bbd687 16 November, 2022, 08:40:01

Chat image

user-bbd687 16 November, 2022, 08:40:22

Excuse me, is it in these two pictures?

user-e3f20f 16 November, 2022, 08:41:12

Norm_pos_x/y in the gaze position csv file

user-bbd687 16 November, 2022, 08:43:42

Chat image Chat image Chat image

user-bbd687 16 November, 2022, 08:44:16

These are the coordinates of Norm_pos_x/y

user-bbd687 16 November, 2022, 08:44:34

The y value is too small

user-bbd687 16 November, 2022, 08:45:45

My pupil line of sight moves within the frame of the screen

user-bbd687 16 November, 2022, 08:46:21

Chat image

user-bbd687 16 November, 2022, 08:46:44

Just like the red circle

user-bbd687 16 November, 2022, 08:46:53

@user-e3f20f

user-e3f20f 16 November, 2022, 08:46:54

from which file did you load the data?

user-bbd687 16 November, 2022, 08:47:30

pupil_positions.csv

user-e3f20f 16 November, 2022, 08:49:13

pupil != gaze data. What you are looking at is the pupil position within the eye video, not the scene video. You need to look at norm_pos in gaze_positions.csv

user-bbd687 16 November, 2022, 09:02:41

Thank you. I found it!!!

user-bbd687 16 November, 2022, 09:26:17

I want to use the heat map function

Chat image

user-e3f20f 16 November, 2022, 09:28:02

You can read about the setup in our documentation https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking

user-bbd687 16 November, 2022, 09:26:21

@user-e3f20f

user-bbd687 16 November, 2022, 09:27:04

Chat image

user-bbd687 16 November, 2022, 09:27:30

How do I do that

user-bbd687 16 November, 2022, 09:32:12

Excuse me, how to turn off the effect of this highlight?

Chat image

user-e3f20f 16 November, 2022, 09:33:53

You can turn off visualizations via the plugin manager (for unique visualizations) or in their corresponding menu (for non-unique ones).

user-bbd687 16 November, 2022, 09:40:26

OK, thank you. I did it successfully

user-bbd687 16 November, 2022, 09:41:33

I can't get a heat map

Chat image

user-e3f20f 16 November, 2022, 09:41:50

Because you have not setup the markers and not defined a surface yet

user-bbd687 16 November, 2022, 09:41:48

I've turned on the heat map

user-bbd687 16 November, 2022, 09:42:07

What should I do

user-e3f20f 16 November, 2022, 09:42:21

Please read the documentation that I linked above

user-bbd687 16 November, 2022, 09:43:04

Can natural features calibration generate a heatmap?

user-e3f20f 16 November, 2022, 09:44:09

Calibration and surface tracking are two different steps. You need to calibrate first. This gives you gaze in scene camera coordinates. Then you need surface tracking to map the gaze onto the surface. Afterward, you can generate the heatmap.

user-bbd687 16 November, 2022, 09:43:21

Or is the heat map generated only by screen calibration?

user-bbd687 16 November, 2022, 09:45:13

I've done the natural calibration

user-e3f20f 16 November, 2022, 09:45:42

Then, you only need to setup the AprilTag markers and define a surface as described in the documentation.

user-bbd687 16 November, 2022, 09:45:30

Then I recorded the video and the data

user-bbd687 16 November, 2022, 09:46:07

What is surface?

user-e3f20f 16 November, 2022, 09:47:16

The surface is the area of interest onto which you want to generate the heatmap, for example your computer screen. Pupil Capture and Player need the markers in order to track the surface.

user-bbd687 16 November, 2022, 09:47:47

Chat image

user-e3f20f 16 November, 2022, 09:49:54

As the error message says, you need to set up markers first. The markers are displayed and linked in the documentation. https://docs.pupil-labs.com/core/software/pupil-capture/#markers

user-bbd687 16 November, 2022, 09:48:30

Right, I need to calibrate the heat map using the computer screen

user-bbd687 16 November, 2022, 09:51:33

May I ask, do I need to print marker?

user-e3f20f 16 November, 2022, 09:52:41

You can either display them digitally on your screen, or print them and attach them to your screen, yes.

https://raw.githubusercontent.com/wiki/pupil-labs/pupil/media/images/pc-sample-experiment.jpg

user-bbd687 16 November, 2022, 09:51:48

Chat image

user-bbd687 16 November, 2022, 09:53:30

Chat image

user-e3f20f 16 November, 2022, 09:54:47

The only requirement is that you do not use the same marker more than once. Otherwise, you can choose any number (tag id) that you like

user-bbd687 16 November, 2022, 09:53:56

Chat image

user-bbd687 16 November, 2022, 09:54:05

Is there any requirement for the number?

user-e3f20f 16 November, 2022, 09:55:21

Make sure to include sufficient white border when displaying or cutting out the markers!

user-bbd687 16 November, 2022, 09:55:27

ok

user-bbd687 16 November, 2022, 09:56:13

I'll do the experiment now

user-bbd687 16 November, 2022, 09:58:35
  1. I paste marker on the four corners of the screen
  2. I conducted the test normally
  3. I record video and data
  4. I marked AOI in the player
  5. Generate heat maps
user-e3f20f 16 November, 2022, 10:00:17

After step 1, you can check whether the markers are detected in Pupil Capture in realtime. Enable the surface tracker plugin; the markers should be marked in color. Then add a surface. You can edit the surface to fit your screen. https://docs.pupil-labs.com/core/software/pupil-capture/#preparing-your-environment

user-bbd687 16 November, 2022, 09:58:55

Is this step correct?

user-bbd687 16 November, 2022, 09:59:04

@user-e3f20f

user-bbd687 16 November, 2022, 10:02:03

Is the surface tracker plugin enabled in capture exe?

user-e3f20f 16 November, 2022, 10:02:23

not by default. you can turn it on in the plugin manager

user-bbd687 16 November, 2022, 10:04:38

OK

user-bbd687 16 November, 2022, 10:15:54

Chat image

user-bbd687 16 November, 2022, 10:16:15

Is this all right?

user-bbd687 16 November, 2022, 10:16:18

@user-e3f20f

user-e3f20f 16 November, 2022, 10:17:01

Looks good to me. Please confirm that the detection works by running Pupil Capture and pointing the scene camera to the screen.

user-bbd687 16 November, 2022, 10:17:42

How do I check that the markers are marked in color?

user-e3f20f 16 November, 2022, 10:18:27

Pupil Capture will overlay color in the video preview on the markers if they are detected. (Surface tracker needs to be running)

user-bbd687 16 November, 2022, 10:18:07

"The markers should be marked in color."

user-bbd687 16 November, 2022, 10:18:56

ok

user-bbd687 16 November, 2022, 10:30:13

Chat image

user-bbd687 16 November, 2022, 10:30:30

In capture exe, the heat map is in motion

user-bbd687 16 November, 2022, 10:30:55

But in record exe, the heat map is not moving

user-e3f20f 16 November, 2022, 10:32:03

I am not sure what you mean by "record exe". Do you mean Pupil Capture? Also, what type of movement do you expect?

user-bbd687 16 November, 2022, 10:32:27

It's the player

user-e3f20f 16 November, 2022, 10:33:57

In Capture, the heatmap is calculated based on the last x seconds. In Player, the heatmap is calculated based on the data within the trim marks. The trim marks are the small controls on the left and right of the timeline.

user-bbd687 16 November, 2022, 10:33:07

In the player, the heat map is always in place

user-bbd687 16 November, 2022, 10:33:28

Chat image

user-bbd687 16 November, 2022, 10:33:51

It doesn't follow the eye movement

user-bbd687 16 November, 2022, 10:34:14

But in capture exe, the heat map follows eye movements

user-e3f20f 16 November, 2022, 10:35:28

yes, because in player, the heatmap is calculated on all recorded gaze data.

user-bbd687 16 November, 2022, 10:47:36

okk

user-bbd687 16 November, 2022, 10:47:39

πŸ‘‹

user-bbd687 16 November, 2022, 10:47:43

πŸ‘

user-bbd687 16 November, 2022, 11:08:01

Excuse me, I want to export the x, y, and fixation values from the heatmap data, and then draw the heatmap of the points in MATLAB.

user-bbd687 16 November, 2022, 11:08:36

Which variables should I use?

user-bbd687 16 November, 2022, 11:08:50

Which file is it in?

user-bbd687 16 November, 2022, 11:08:54

πŸ˜€

user-e3f20f 16 November, 2022, 11:09:52

If you have the surface set up in Player, hit export. There is a surfaces folder. Look at the fixations_on_surface_...csv file. Note that there is a difference between fixation and gaze data. The heatmap in Player is based on gaze data.

user-bbd687 16 November, 2022, 11:25:43

"The heatmap in Player is based on gaze data"

user-bbd687 16 November, 2022, 11:13:10

i have the surfaces folder.

user-bbd687 16 November, 2022, 11:13:31

but the "fixations_on_surface_...csv" file is only 1 KB

user-bbd687 16 November, 2022, 11:14:08

It's blank, only the column titles

user-e3f20f 16 November, 2022, 11:15:14

That means that you have not run the fixation detector yet. Look at the gaze_on_surface...csv data instead

user-bbd687 16 November, 2022, 11:16:51

Chat image

user-bbd687 16 November, 2022, 11:16:57

The table has data

user-bbd687 16 November, 2022, 11:19:11

Can I use these two data. x :x_ norm ; y : y_norm

user-e3f20f 16 November, 2022, 11:20:19

yes

user-bbd687 16 November, 2022, 11:20:04

Where should I look for gaze data?

user-bbd687 16 November, 2022, 11:20:09

@user-e3f20f

user-bbd687 16 November, 2022, 11:20:26

en en

user-bbd687 16 November, 2022, 11:20:34

πŸ‘

user-bbd687 16 November, 2022, 11:21:11

Where should I look for gaze data?πŸ˜€

user-e3f20f 16 November, 2022, 11:21:41

you are looking at it already

user-bbd687 16 November, 2022, 11:21:41

What should the name of the file be?

user-bbd687 16 November, 2022, 11:22:18

Chat image

user-e3f20f 16 November, 2022, 11:27:42

Load the data in Matlab and use this https://de.mathworks.com/matlabcentral/fileexchange/66629-2-d-histogram-plot to generate the heatmap.
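
If Python is an option, the same 2-D histogram can be sketched with NumPy (the coordinates below are randomly generated stand-ins for the exported x_norm/y_norm columns):

```python
import numpy as np

# Fabricated surface-normalized gaze coordinates in [0, 1]
rng = np.random.default_rng(0)
x_norm = rng.uniform(0, 1, 500)
y_norm = rng.uniform(0, 1, 500)

# Bin gaze positions into a grid; each cell counts gaze samples
heatmap, x_edges, y_edges = np.histogram2d(
    x_norm, y_norm, bins=32, range=[[0, 1], [0, 1]]
)
print(heatmap.shape)  # (32, 32)
```

The per-cell counts are the variable that drives the heatmap color.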

user-e3f20f 16 November, 2022, 11:22:42

x_norm/y_norm

user-bbd687 16 November, 2022, 11:22:20

Is it confidence?

user-bbd687 16 November, 2022, 11:22:46

okk

user-bbd687 16 November, 2022, 11:23:40

I need one more variable. This variable is used to draw the color of the heat map. The degree of this variable determines the color.

user-e3f20f 16 November, 2022, 11:25:11

That variable depends on how you aggregate the data. matlab should be doing this for you

user-bbd687 16 November, 2022, 11:24:30

The value of this variable determines the color of the heat map.

user-bbd687 16 November, 2022, 11:26:10

I want to use this variable : "gaze data"

user-e3f20f 16 November, 2022, 11:26:36

gaze data are x/y locations on the surface. this is the x/y_norm data.

user-bbd687 16 November, 2022, 11:27:27

Can the software export gaze counts?

user-e3f20f 16 November, 2022, 11:28:04

you can simply count the rows

user-bbd687 16 November, 2022, 11:28:59

ok

user-bbd687 16 November, 2022, 11:30:34

I'd like to ask you again. What variable did you use for the color of the heat map you generated?

user-e3f20f 16 November, 2022, 11:34:46

the variable is not exported directly. This is how we calculate the heatmap https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/surface_tracker/surface.py#L616-L640
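
In essence, the linked code bins gaze positions into a grid and applies a Gaussian blur (Pupil's implementation uses OpenCV). A rough NumPy-only sketch of that idea:

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.5):
    """1-D Gaussian kernel, normalized to sum to 1."""
    x = np.arange(size) - size // 2
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def smooth_heatmap(counts, size=5, sigma=1.5):
    """Separable Gaussian blur of a 2-D gaze-count histogram."""
    k = gaussian_kernel(size, sigma)
    # Convolve rows first, then columns (a separable filter)
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, counts)
    blurred = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blurred)
    return blurred

# Toy 2-D histogram with a single hot cell in the middle
counts = np.zeros((9, 9))
counts[4, 4] = 1.0
heat = smooth_heatmap(counts)
```

The blurred counts are then mapped to a color scale; the blur parameters here are illustrative, not the exact values Pupil uses.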

user-bbd687 16 November, 2022, 11:31:01

I now want to verify the accuracy in matlab.

user-bbd687 16 November, 2022, 11:31:31

I want to use the same variables to generate the same heat map as you do

user-bbd687 16 November, 2022, 11:31:49

πŸ˜€ πŸ‘

user-bbd687 16 November, 2022, 11:35:21

Thank you very much

user-e8fbd0 16 November, 2022, 13:50:10

Hi! Just wondering if there is a way to change the background colour of the on-screen calibration and test routines in Capture? The current white is far too bright compared to our stimuli.

user-e3f20f 16 November, 2022, 13:52:02

Hi, unfortunately, no. You might want to use the natural features calibration instead.

user-e3f20f 16 November, 2022, 13:52:52

That said, why do you need the calibration routine to be of the same brightness as your stimulus?

user-e8fbd0 16 November, 2022, 13:54:44

We don't need it per se, but because our stimulus is a rather dark (air traffic control) radar display, the bright calibration in between blocks is less pleasant on the eyes

user-e8fbd0 16 November, 2022, 13:56:29

Natural features would indeed work, but I was rather fond of the automated nature of the on-screen markers.

user-e8fbd0 16 November, 2022, 14:02:13

Hm, thinking about natural features I might have gotten an idea. We can embed the markers in our radar display using some custom logic and then run the on-screen marker calibration from Capture in parallel, while not actually displaying its markers on the screen.

user-e3f20f 16 November, 2022, 14:03:57

In that case, I recommend single marker calibration in physical mode. It will expect that another source displays the marker and only performs detection (without displaying a marker on its own)

user-e8fbd0 16 November, 2022, 14:10:16

I'll run some tests with that as well. Thanks for the swift replies!

user-d61193 17 November, 2022, 06:53:21

Hello Pupil Labs,

We have the Pupil Core Child Version. We will have active child participants in a live playroom environment. Can we download the software to a phone or tablet, rather than a laptop, to run Pupil Capture and Player?

user-80123a 17 November, 2022, 07:56:41

Hello everyone, has anyone ever tried to install the Pupil software from source on a Raspberry Pi? I got several errors after typing this line of instruction: pip install -r requirements.txt So I decided to install each package individually. But I am stuck with the package cysignals. I always receive the error log message: "failed building wheel for cysignals". Any ideas? Thanks in advance for your help.

user-e3f20f 17 November, 2022, 08:00:00

Let's continue the discussion in #1039477832440631366

user-d407c1 17 November, 2022, 08:06:02

Pupil Core cannot be run from a phone. You can use a small form-factor tablet-style PC running Pupil Capture to make Pupil Core more portable. If the specifications of such a device are low-end, you can record the experiment with real-time pupil detection disabled to help ensure a high sampling rate. Pupil detection and calibration can then be performed in a post-hoc context. But it is important that you establish a good eye model fit prior to recording. Alternatively, you can connect the laptop through the internet, place it in a backpack, and control Pupil Capture from a separate machine using the network API

user-74c615 26 July, 2023, 08:46:47

hi everyone! Does this mean Pupil Core must always be connected to a computer?

user-d61193 17 November, 2022, 13:28:51

Thank you very much!

user-2196e3 17 November, 2022, 10:05:16

@user-e3f20f Thank you! That's very helpful!

user-d6c401 17 November, 2022, 11:20:00

When I started my two devices (pupil invisible) they went into a sync mode and I cannot access the cloud or any of the menu items on the app. It did allow me to take eye-tracking data on a previously entered wearer name but both apps on each device are now stuck and I have recordings on each device that will not upload to my cloud account. Please help..

user-e3f20f 17 November, 2022, 11:20:51

Could you please try logging out and back in again?

user-d6c401 17 November, 2022, 11:21:02

Will do, standy

user-d6c401 17 November, 2022, 11:23:09

That did it, thank you

user-e91538 17 November, 2022, 14:15:14

hello! I am working on Reference Image mapper and I am wondering how to add a sample to be mapped. I used a reference picture and video but I now want to apply that enrichment to recordings. How can I do that? Thanks

user-e3f20f 17 November, 2022, 14:18:50

Add all the recordings that you want to have mapped to the project in which the enrichment is defined.

user-e91538 18 November, 2022, 08:09:26

great! Thanks

user-e91538 17 November, 2022, 16:11:23

Hi all, is there anyone who applied a filter after exporting the data? Can you share the code if possible? I would like to get feedback. For now I have read an article that was recommended to me above, but it is a bit confusing! I would like to compare notes with someone who is processing the exported data! Thank you

user-e91538 17 November, 2022, 18:11:58

and also, does anyone have photos of the setup during the experiment?

user-2196e3 17 November, 2022, 22:34:10

Hi, I have 2 questions. 1. Where and how can we download the recording folder? I found we only have the recording and pupil data, but we don't have the recording folder. 2. If we don't have the recording folder, we don't have info.player.json. Is there any way we could convert the pupil data timestamps to real time correctly? Thank you!

user-e3f20f 18 November, 2022, 07:40:32

Could you share a Screenshot of the file list?

user-2196e3 18 November, 2022, 09:25:56

And .MP4 video. That's all

user-e3f20f 18 November, 2022, 09:27:12

It looks like you are working with exported data only, without access to the raw data. I fear that there is no possibility for you to reconstruct the date times of the recording.

user-2196e3 18 November, 2022, 09:29:52

We are still using the Pupil Core device to do more experiments. Is it possible to get raw data for new experiments? How do we get access to the raw data?

user-e3f20f 18 November, 2022, 09:54:05

Are those files from a recent recording? Pupil Capture creates a folder that you would open in Pupil Player. That is the raw data. You might be using a very old Pupil Capture version.

user-2196e3 18 November, 2022, 09:56:28

Yes, these data are from recent recordings. They were collected within the last 3 months.

user-e3f20f 18 November, 2022, 09:57:37

Just to clarify, you are using Pupil Capture to record the data, correct? Which version are you using?

user-2196e3 18 November, 2022, 09:59:39

I will ask the person who did that. This is not done by me.

user-e3f20f 18 November, 2022, 10:00:18

The latest one, 3.5.

user-2196e3 18 November, 2022, 10:00:09

By the way, which version do you recommend to use?

user-e91538 18 November, 2022, 14:04:17

Can someone explain to me what the timestamp is? It's in seconds, but what does it mean? Is it seen as time for each sample?

user-e3f20f 18 November, 2022, 14:04:58

"Is it seen as time for each sample?" It is πŸ™‚

user-e3f20f 18 November, 2022, 14:05:45

More specifically, the time at which the sample was generated. Not its duration.

user-e3f20f 18 November, 2022, 14:06:53

Maybe this tutorial can give you a more concrete idea of how "Pupil time" relates to the normal system clock https://github.com/pupil-labs/pupil-tutorials/blob/master/08_post_hoc_time_sync.ipynb
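
The conversion itself is a single constant offset, as sketched below (the key names follow info.player.json; the values here are made up):

```python
# Offset between the wall clock and Pupil time, both logged at
# recording start in info.player.json
start_time_system = 1668776400.0   # "start_time_system_s" (Unix epoch)
start_time_synced = 2105.3         # "start_time_synced_s" (Pupil clock)
offset = start_time_system - start_time_synced

# Any exported pupil/gaze timestamp can then be shifted to Unix time
pupil_timestamp = 2110.8
unix_time = pupil_timestamp + offset
print(unix_time)
```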

user-e91538 18 November, 2022, 14:29:45

So I have to convert it anyway, for example if I want to see the trend of the diameter with respect to time?

user-80123a 18 November, 2022, 14:30:41

Hi all, I want to use a Raspberry Pi to stream video to a computer, where the computer runs Pupil Capture. I found and tried to install it using this link https://github.com/Lifestohack/pupil-video-backend

user-80123a 18 November, 2022, 14:31:39

But I got an error message "ModuleNotFoundError: .... cv2"

user-80123a 18 November, 2022, 14:32:42

When I start to run python main.py

user-e3f20f 18 November, 2022, 14:32:58

pip install opencv-python

user-80123a 18 November, 2022, 14:37:06

Thank you, it works. I can imagine the problem is about OpenCV. But the instructions on the page already ask to run the command: sudo apt install -y python-opencv. So is it a different OpenCV?

user-80123a 18 November, 2022, 14:37:52

sudo apt install -y python3-opencv

user-e3f20f 18 November, 2022, 14:40:02

This installs OpenCV in the global environment for the default Python.

user-80123a 18 November, 2022, 14:38:06

pip install opencv-python

user-e3f20f 18 November, 2022, 14:40:42

This installs in the current environment

user-bbd687 19 November, 2022, 04:59:25

@user-e3f20f I modified it to use my native language instead of the English alphabet. But the output is a box

Chat image

user-e3f20f 19 November, 2022, 07:58:43

I think the included font only has English characters. It might be possible to load a custom font but I would need to look that up next week.

user-a3e405 19 November, 2022, 07:25:20

Hello team, We are currently using Pupil Core to do realtime eye tracking (gaze on surface) via the network API. We would like to get surface data at a higher frequency (at least 120Hz) to detect saccades. We noticed the surface datum is 30Hz, so we subscribed to both surface & gaze, and tried to map the gaze ourselves using img_to_surf_trans (which updates at 30Hz). However, applying the homographic transformation to gaze data (norm_pos in gaze) does not give us the same surface data. I saw from a previous post that it is actually more complicated since you map gaze in undistorted camera space. I'm still not sure how to apply img_to_surf_trans in undistorted camera space to get surface data at a higher frequency. I would very much appreciate any suggestions or pointers to existing examples. Thank you in advance!

user-e3f20f 19 November, 2022, 08:02:35

Do I understand it correctly that you would prefer mapping gaze using an older surface location instead of waiting for the newest one?

The surface datum should contain transformation matrices for both, distorted and undistorted, spaces.

user-a3e405 21 November, 2022, 22:44:08

I think that is correct. To get the surface datum at 120Hz, we need to map gaze using the nearest older transformation. If we wait for the newest data, we won't be able to collect mapped gaze data in realtime at 120Hz. I screenshotted the conversation from your channel before, and I am encountering the same issue as this person did. However, I'm still not sure how to solve this issue yet. Any suggestions are appreciated!

Chat image
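
For what it's worth, once a gaze point is expressed in the same (undistorted) camera space as the transform, applying the 3x3 homography is the easy part; the hard part is the undistortion step using the scene camera intrinsics. A sketch of just the homography application, with a placeholder matrix:

```python
import numpy as np

def apply_homography(trans, point_xy):
    """Map a 2-D point through a 3x3 homography (e.g. img_to_surf_trans)."""
    x, y = point_xy
    p = trans @ np.array([x, y, 1.0])
    # Divide by w to leave homogeneous coordinates
    return p[:2] / p[2]

# Identity stands in for a real surface transform: it maps a point onto itself
trans = np.eye(3)
print(apply_homography(trans, (0.4, 0.6)))  # [0.4 0.6]
```

In practice you would cache the most recent transform received from the surface topic and apply it to each incoming (undistorted) gaze point.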

user-bbd687 19 November, 2022, 09:12:07

@user-e3f20f Thank you very much!!

user-3aea81 21 November, 2022, 01:31:54

hello

user-3aea81 21 November, 2022, 01:32:36

I'm using Pupil Core but I have a question

user-3aea81 21 November, 2022, 01:33:08

my pupil detection is very sensitive

user-3aea81 21 November, 2022, 01:33:42

it makes it harder to use

user-3aea81 21 November, 2022, 01:34:26

can I handle that in the software?

user-bbd687 21 November, 2022, 03:43:46

@user-e3f20f Looking forward to your reply!!

user-bbd687 21 November, 2022, 03:43:54

πŸ˜€

user-887da7 21 November, 2022, 04:16:21

Hello, where can I get the format/description of the data that can be collected via pupil labs on HTC VIVE?

user-e3f20f 21 November, 2022, 08:48:42

It is the same as for the Pupil Core format. https://docs.pupil-labs.com/core/software/recording-format/

You can download an example raw recording here https://drive.google.com/file/d/11J3ZvpH7ujZONzjknzUXLCjUKIzDwrSp/view?usp=sharing

Using Pupil Player, you can export the data to csv https://docs.pupil-labs.com/core/software/pupil-player/#export

user-887da7 21 November, 2022, 19:38:32

Thank you for your reply. This is really helpfulπŸ˜€ I have looked up the materials you provided. I am now processing a gaze dataset that was collected via Pupil Labs. The gaze data and its description are in Figure 1, and I print the data of the first five timestamps in Figure 2. I have a few questions about the dataset and hope you could explain them for me.

  1. It looks like this dataset not only contains the center/normal data of the left and right eye, but also has "3D Gaze Position in relation to HMD/world (X,Y,Z)". I don't find any relationship between this data and the left/right data. Moreover, it looks like in the documentation, Pupil Labs collects the gaze data from the left and right eyes separately. So I don't know how this "integrated 3D gaze position" data is obtained or what it means. Could you explain it for me?

  2. Based on the documentation, it looks like the gaze data collected via Pupil Labs is in world coordinates. But in the dataset, it looks like most of the data is collected in HMD coordinates. I was wondering, can Pupil Labs also collect the data in HMD coordinates, and how can I convert it to world coordinates?

Chat image Chat image

user-80123a 21 November, 2022, 14:27:20

Hello all, I used the pupil-video-backend on a Raspberry Pi and a PC. The Raspberry Pi captures the video stream and the PC uses the video stream to compute the pupil detection. Everything looks right on the Raspberry Pi side.

Chat image

user-80123a 21 November, 2022, 14:27:45

However, I cannot see the video on the PC

user-80123a 21 November, 2022, 14:28:45

I just saw a "HMD Streaming" on the video source/settings

user-80123a 21 November, 2022, 14:29:47

Chat image

user-80123a 21 November, 2022, 14:33:00

When I expanded the "Active Device" list, I only saw "Local USB"

user-80123a 21 November, 2022, 14:33:56

I use Windows PC

user-d407c1 21 November, 2022, 14:37:51

Have you tried toggling on the "enable manual camera selection" on that panel? the icon will become green and you will be able to select other sources

user-80123a 21 November, 2022, 14:42:04

I already tried enabling the button, and another source appears on the list. But when I select the new source, nothing happens

user-80123a 21 November, 2022, 14:42:38

Chat image

user-80123a 21 November, 2022, 14:44:59

Actually, I cannot select the new source

user-d407c1 21 November, 2022, 14:46:37

Are you able to access the streaming (eye camera) from other apps (e.g. OBS, Meet,...)?

user-80123a 21 November, 2022, 14:47:48

I never tried, how can I do that?

user-f590a4 21 November, 2022, 17:43:56

hi, I have a question: is there a fast way to extract blink count data from about 500 recordings, without using Pupil Player?

user-c78bd2 21 November, 2022, 19:41:11

hi @&288503824266690561 - I am looking at your DIY core solution. Is the published BOM (google sheets: Pupil Headset BOM : Sheet 1) the most up-to-date version? I see that the base compatibility requirements are UVC 1.1 - are there any additional limitations with the DIY solution and the available github pupil labs software?

user-e3f20f 22 November, 2022, 08:35:32

Please see the first 5 points https://gist.github.com/papr/b258e0e944604375752eae502b4ad3d5 (i.e. the camera needs to support mjpeg, not h264)

user-908b50 21 November, 2022, 22:49:28

Hi, just a general question for people using the surface tracker: what do you do if participants moved a lot? This led to the surface moving quite a lot in our case and I am not sure what to do. Please help! Also, for some videos we had trouble getting the plugin to work. Support appreciated!

user-e3f20f 22 November, 2022, 08:31:21

Is the issue that the surface markers are no longer being recognized due to motion blur?

user-908b50 23 November, 2022, 20:12:27

Yes, for some videos, the surface seems to be unstable and varies because of movement. In other cases, the videos don't load, or the markers aren't recognized. I am trying to salvage the "lost data". I am also wondering what the best steps are for using/not using data where there might have been clear issues with gaze reliability during calibration on certain parts of the screen. Do we use that data with a caveat?

user-e3f20f 22 November, 2022, 08:46:01

The issue should be solvable. Let me write up an example.

user-a3e405 24 November, 2022, 08:10:00

Thank you for the prompt response! Please let me know when the example is ready. I really appreciate your help!

user-3aea81 22 November, 2022, 13:41:11

Hello

user-3aea81 22 November, 2022, 13:41:20

I have 1 Q

user-3aea81 22 November, 2022, 13:42:21

When I do calibration after that

user-3aea81 22 November, 2022, 13:42:54

I can see a square box on the monitor

user-e3f20f 22 November, 2022, 13:43:27

Which color has it? Is it green?

user-3aea81 22 November, 2022, 13:43:17

What is that??

user-3aea81 22 November, 2022, 13:44:35

I don't have photo.... but hmm...

user-3aea81 22 November, 2022, 13:44:46

I can see it in Capture

user-3aea81 22 November, 2022, 13:56:18

Like brown

user-e3f20f 22 November, 2022, 13:57:07

Could you please share a screenshot of what you are referring to?

user-3aea81 22 November, 2022, 13:57:59

I will check now

user-3aea81 22 November, 2022, 14:00:50

Chat image

user-3aea81 22 November, 2022, 14:01:02

Can you see that box??

user-e3f20f 22 November, 2022, 14:01:31

Yes, that is the box that I had in mind. This is the calibration area. The area in which the gaze estimation will be most accurate.

user-3aea81 22 November, 2022, 14:03:05

Aha...okay

user-3aea81 22 November, 2022, 14:07:19

Chat image

user-3aea81 22 November, 2022, 14:08:19

And could you please give a short description of that graph?

user-3aea81 22 November, 2022, 14:09:24

What can I do when I see that...

user-e3f20f 22 November, 2022, 14:10:19

I am not sure what you are referring to. This sounds like you are seeing an error in the picture. But in the picture, the software looks normal to me.

user-3aea81 22 November, 2022, 14:11:14

Yes it's correct.

user-3aea81 22 November, 2022, 14:11:52

I need to describe it to a customer

user-3aea81 22 November, 2022, 14:12:09

About software

user-3aea81 22 November, 2022, 14:12:44

So nowadays I do operations

user-3aea81 23 November, 2022, 07:58:08

Hello!!~~

user-80123a 23 November, 2022, 10:01:28

Hi all, I currently use an old eye tracker with only 2 cameras: left eye and right eye. So, I would like to know if it is possible to have the old algorithm that you used to calculate the direction of the gaze, when you used only 2 cameras? Thanks in advance.

user-e3f20f 23 November, 2022, 10:07:14

The new eye tracker also uses two eye cameras. If you are referring to the old 3d pupil detection algorithm, you should use Pupil Core software version 2. Note, the 3d pupil detection works per eye camera, i.e. it is agnostic to the number of cameras

user-80123a 23 November, 2022, 10:09:21

I forgot to mention: the eye tracker is without the world camera. Thank you, I will look at version 2 then.

user-e3f20f 23 November, 2022, 10:17:42

Gaze mapping can happen without the world camera. You just need a way to provide calibration targets, e.g. virtual targets. Typically, this corresponds to the use case we have in VR. Pupil Core glasses without a scene camera are mostly used for pupillometry-only research.

user-80123a 23 November, 2022, 10:12:23

But where I can find an older version of the pupil software?

user-d407c1 23 November, 2022, 10:15:02

https://github.com/pupil-labs/pupil/releases

user-80123a 23 November, 2022, 10:28:41

So is it possible to provide the calibration targets with one eye tracker, then apply that calibration to another eye tracker?

user-e3f20f 23 November, 2022, 10:29:55

That would only work if both eye trackers had the same exact camera positions. So likely, no

user-a9c776 23 November, 2022, 11:54:39

Hello, I have a question about the core eyetracker. Is there any analysis software from pupil labs that can automatically evaluate the recordings made? The recordings are around 2 hours long and an automatic evaluation would be a significant advance over doing it by hand. Or are there any recommendations for such a project? Thanks for an answer

user-e3f20f 23 November, 2022, 11:57:14

Hi, what kind of metrics/evaluations are you looking for?

user-a9c776 23 November, 2022, 11:59:14

We want to measure the time the user is looking at a certain area. Like for example monitors (up to 6) and then per monitor a time as output value.

user-e3f20f 23 November, 2022, 11:59:57

Have you been using the surface tracker in realtime/Pupil Capture?

user-a9c776 23 November, 2022, 12:00:13

Yes

user-e3f20f 23 November, 2022, 12:01:16

Using this script, you can extract the realtime-surface-mapped gaze https://gist.github.com/N-M-T/b7221ace2e7acf0c0c836773a3b4cf7c and automate your analysis
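A minimal sketch of the same idea — subscribing to surface-mapped gaze over the Network API — assuming Pupil Capture is running locally with Pupil Remote on its default port 50020 (the parsing helper and function names here are illustrative, not from the linked gist):

```python
def extract_surface_gaze(message):
    """Pull (surface_name, norm_pos, on_surf) tuples out of one surface datum."""
    return [
        (message["name"], tuple(g["norm_pos"]), g["on_surf"])
        for g in message.get("gaze_on_surfaces", [])
    ]


def stream_surface_gaze(host="127.0.0.1", port=50020):
    # Imported lazily so the parsing helper above works without pyzmq installed
    import zmq
    import msgpack

    ctx = zmq.Context()

    # Ask Pupil Remote for the SUB port
    pupil_remote = ctx.socket(zmq.REQ)
    pupil_remote.connect(f"tcp://{host}:{port}")
    pupil_remote.send_string("SUB_PORT")
    sub_port = pupil_remote.recv_string()

    # Surface datums are published on topics starting with "surfaces"
    subscriber = ctx.socket(zmq.SUB)
    subscriber.connect(f"tcp://{host}:{sub_port}")
    subscriber.subscribe("surfaces")

    while True:
        topic, payload = subscriber.recv_multipart()
        message = msgpack.unpackb(payload, raw=False)
        for name, norm_pos, on_surf in extract_surface_gaze(message):
            print(name, norm_pos, on_surf)


# stream_surface_gaze()  # run against a live Pupil Capture instance
```

Each surface datum carries `gaze_on_surfaces`, where `norm_pos` is in surface-normalized coordinates and `on_surf` flags whether the gaze landed inside the surface.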

user-a9c776 23 November, 2022, 12:01:56

I'll try that, thanksπŸ‘

user-908b50 23 November, 2022, 20:14:08

What does this do that Pupil Player doesn't already?

user-e3f20f 23 November, 2022, 20:28:40

It extracts the surface-mapped gaze recorded in real time. Player redetects all markers and surfaces and remaps all gaze.

user-e3f20f 23 November, 2022, 20:29:13

If you want, share one of those with data@pupil-labs.com and we will try to help you recover as much as possible

user-908b50 30 November, 2022, 20:22:25

Just done! I have only 3 so far because of space limitations. If the solution tried for this works with the other files also, then I won't need to share those. Thank you!

user-4bc7ca 23 November, 2022, 20:34:01

Hello, I am using the Pupil Core glasses with Unity, and have used scripts from this resource: https://github.com/pupil-labs/hmd-eyes

I am aware this works for the HMD, but I was wondering if it works directly with the Pupil Core glasses?

user-908b50 23 November, 2022, 20:37:03

Okay, we have also been using the surface tracker and looking at fixations in certain regions. So I am wondering if there is any advantage to re-mapping the gaze. Does it give better surface tracking?

user-e3f20f 23 November, 2022, 20:52:46

Unless you want to use the Post-hoc pupil detection and calibration, there is no advantage

user-040866 24 November, 2022, 06:35:44

Hello. I'm using the pupil core headset. I have a question about gaze point 3D values in the gaze recordings. What is the datum of the 3d coordinates, since there are negative values? and what's their units? Thanks!

user-d407c1 24 November, 2022, 08:19:07

Hi @user-040866! Please check out our 3D camera space coordinates https://docs.pupil-labs.com/core/terminology/#coordinate-system, which is the coordinate system gaze_point_3d follows. I also recommend taking a look at the reference system here: https://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html. As you can see, (x, y) can be negative depending on whether you look right/left or up/down. The z-axis should always be positive.

The units are mm

user-3aea81 24 November, 2022, 08:04:13

Hello. The client asked for an explanation of the software menus. Do you have an explanation of the menus? Not the data on the homepage.

user-e3f20f 24 November, 2022, 08:08:59

Hey, the documentation should explain the general menus. Then, each plugin has its own menu. We don't have a description of every single UI element, but the documentation has a section on each plugin, which helps with understanding the menus.

user-3aea81 24 November, 2022, 08:11:16

I got it. Thank you for always responding kindly and quickly.

user-040866 24 November, 2022, 09:18:52

Thank you. Another question, can 3d gaze points and 2d gaze be converted to each other? How? Many thanks!

user-d407c1 24 November, 2022, 10:27:18

Do you mean to the scene camera coordinates? You already have this value, normalised, as norm_pos_x and norm_pos_y: https://docs.pupil-labs.com/core/software/pupil-player/#gaze-positions-csv
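If you ever need to go from gaze_point_3d to a 2D image position yourself, the relationship is a standard pinhole projection into the scene camera, followed by normalizing to Pupil's bottom-left-origin image coordinates. A minimal sketch that ignores lens distortion and uses hypothetical intrinsics (the real camera matrix comes from the scene camera's calibration):

```python
def project_to_norm_pos(point_3d, camera_matrix, width, height):
    """Project a 3D point in scene-camera coordinates (mm) to Pupil's
    normalized image coordinates (origin bottom-left), ignoring distortion."""
    x, y, z = point_3d
    fx, fy = camera_matrix[0][0], camera_matrix[1][1]
    cx, cy = camera_matrix[0][2], camera_matrix[1][2]
    # Pinhole projection into pixel coordinates (origin top-left)
    u = fx * x / z + cx
    v = fy * y / z + cy
    # Pupil's norm_pos flips the y axis so the origin is bottom-left
    return u / width, 1.0 - v / height


# Hypothetical intrinsics for a 1280x720 scene camera
K = [[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]]
print(project_to_norm_pos((0.0, 0.0, 500.0), K, 1280, 720))  # a point straight ahead -> (0.5, 0.5)
```

Going the other way (2D to 3D) is underdetermined: a 2D position only gives a ray, so you need a depth estimate to recover a unique 3D point.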

user-e91538 24 November, 2022, 10:45:11

Hi, i have a question: is it possible to use eye trackers while wearing glasses or is it better to take them off?

user-c2d375 24 November, 2022, 10:49:53

Hi @user-e91538 πŸ‘‹ Sometimes it's possible to put the Core headset first and then the eyeglasses, ensuring that eye cameras capture eyes from below the glasses frame. Note that it is not an ideal condition but does work for some people, depending on eye physiology and eyeglasses shape/dimensions.

user-6586ca 24 November, 2022, 12:35:25

Hello everyone ! I have a question. Is it possible using Pupil Core to get the head positions (coordinates of the head or head mouvements) ? Thank you in advance and have a nice day !

user-d407c1 24 November, 2022, 12:39:56

Hi @user-6586ca! Check out the head-pose-tracking plugin https://docs.pupil-labs.com/core/software/pupil-player/#head-pose-tracking

user-6586ca 25 November, 2022, 09:02:48

Thank you very much πŸ™‚

user-78c370 24 November, 2022, 13:04:37

Hello, I have some concerns about the eye safety of the IR LED. I saw in the BOM sheet that Core uses the SFH 4050 with 4 mW/sr radiant intensity. Currently, I can't find this LED at a local store. They have other alternatives; can I consider any IR LED around 4 mW/sr - 14 mW/sr to be safe?

user-78c370 24 November, 2022, 16:01:26

Found an interesting document about eye safety.

REN_an1737_APN_20040119.pdf

user-4c21e5 24 November, 2022, 14:46:15

Hi @user-78c370 πŸ‘‹. The IR LED listed in the DIY bill of materials should be available from online vendors. Note that Core conforms with EU IEC-62471 standard for eye safety. We wouldn't be able to comment on the safety of third-party IR LEDs I'm afraid.

user-78c370 24 November, 2022, 14:48:48

Got it. Thanks for answering!

user-89d824 24 November, 2022, 18:28:33

Hi,

I've noticed that for some of my participants, the edge of the pupil and iris is blurred in the eye camera. At first I thought it was a camera issue, but I've swapped cameras and the problem persists.

I also used the same eyetracker/cameras on myself in the same lighting conditions and the problem wasn't reproduced.

My participants' eyes look normal to me, so it wasn't a result of some eye defect. I noticed that the two participants with the same problem had green irises, but not sure about the others who didn't have the problem. I've dark brown eyes, but I doubt everyone else I used the eye-tracker on successfully has dark brown eyes.

(I notice that this person has residual mascara on, but I don't think it would cause what I described -- but I could be wrong)

May I know what's causing this and how to fix it, please? Thanks! πŸ™‚

Chat image

user-6e1219 25 November, 2022, 04:51:19

Can you suggest an automatic pipeline, method, or code to get saccades from Pupil Core recorded data?

user-d407c1 25 November, 2022, 08:46:29

Hi @user-89d824 !

Mascara and eye cosmetics can influence the lipidic layer of the tear film. This can be why you might see this blurred area as the tear film becomes oily. You can find more about how eye cosmetics affect the tear film here https://www.dovepress.com/investigating-the-effect-of-eye-cosmetics-on-the-tear-film-current-ins-peer-reviewed-fulltext-article-OPTO



If you have physiological serum at hand, one way could be to "clean" the tear film before measuring.



That said, here are some steps you can take to improve pupil detection:

1. Position the eye camera to minimise 'bright pupil' and/or glare. Specifically, try to reduce the bright spot you can see near the pupil's edge.
2. Manually change the exposure settings to optimise the contrast between the pupil and the iris: https://drive.google.com/file/d/1SPwxL8iGRPJe8BFDBfzWWtvzA8UdqM6E/view?usp=sharing
3. Set the ROI to only include the eye region, excluding the dark corners of the image. Note that it is important not to set it too small (watch to the end): https://drive.google.com/file/d/1NRozA9i0SDMe_uQdjC2jIr000iPjqqVH/view?usp=sharing
4. Modify the 2D detector settings: https://docs.pupil-labs.com/core/software/pupil-capture/#pupil-detector-2d-settings
5. Adjust gain, brightness, and contrast: in the eye window, click Video Source > Image Post Processing.

Let us know how it goes!

user-89d824 02 February, 2023, 18:21:49

Hi again,

I've resumed data collection this week and 3 out of 6 of my participants have the same problem (blurred border between the pupil and iris). One of them had some residual mascara on her lashes, the other had 'permanent mascara' (see photo) that they cannot remove, and the last one did not seem to have residual makeup on her eyes but applies makeup daily otherwise.

Previously I have discussed with my supervisor about getting physiological serum but they weren't very keen on that because the ethical approval process might take too long.

I have tried the steps you proposed last time but none of them worked. That said, it's possible that I haven't done them correctly, especially Step 1. I have tried adjusting the cameras in all sorts of directions but the pupil still remains undetected. I don't think I know how to remove the 'bright pupil' problem 😦

I've also tried Steps 2 to 5 that didn't help.

My sample consists of mostly female participants and I believe a majority of them wear some form of eye make-up daily. It doesn't seem feasible to only recruit those who don't wear makeup (regularly) -- I would have a difficult time finding participants.

Here's a video recording of another one of my participants with this problem: https://youtu.be/unmY9iWN8ks

I'm guessing that this problem has to be quite common given how many women wear makeup?

Chat image

user-89d824 01 December, 2022, 12:57:40

Hi,

I had another participant with the same problem but I didn't manage to improve pupil detection based on the instructions given -- maybe I need more practice with that.

Nevertheless, I might want to try getting some physiological serum. How do you 'clean' the tear film with that? Do you soak a cotton ball with the serum and gently dab the eyeball? Or do you just pour it directly into the eyes and dab it dry?

user-89d824 25 November, 2022, 14:21:23

Thanks a lot! I'll implement these steps if and when I get another participant with the same issue πŸ™‚

user-e91538 25 November, 2022, 10:29:15

Hello, how can I extract the surface identified by the markers in order to overlay the heatmap?

user-c2d375 25 November, 2022, 11:04:19

Hi @user-e91538 πŸ‘‹ You can use our Surface Tracker plugin to obtain gaze data relative to the surface and generate the heatmap. Please take a look here for further details https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking

user-e91538 25 November, 2022, 12:05:38

Is there a size of core pupil glasses for children age 4?

user-4c21e5 25 November, 2022, 15:00:53

Hi @user-e91538 πŸ‘‹. We offer a child-sized frame (not listed on the store) suitable for 3-9 year olds at the same cost as the adult-sized headset. If you would like to get a quote or place an order for a child-sized frame, just leave a note in the order form or reach out to [email removed]

user-e91538 25 November, 2022, 12:41:32

Yes thank you, I had already seen this but I would like a heatmap on the surface itself, overlapping the two

user-c2d375 25 November, 2022, 13:53:32

It's possible to generate a heatmap overlapping the surface both in real-time and post-hoc. In Pupil Capture, enable the "Show Heatmap" toggle in the Surface Tracker plugin to visualize a real-time heatmap of gaze over the surface. In Pupil Player, we do offer a post-hoc version of the Surface Tracker plugin to create and export heatmaps within the defined surface. More details here: https://docs.pupil-labs.com/core/software/pupil-player/#surface-tracker

user-e91538 25 November, 2022, 14:45:43

Thank you very much. I have one last question: is it possible to use the surface tracker while the user is on the move? I tried to use it and the heatmap is deformed and not very precise. I would need it for a walking task.

user-c2d375 25 November, 2022, 15:52:08

Unfortunately, motion blur induced by head movements may lead to failures in marker detection, resulting in a deformed surface. To improve detection, please ensure a sufficiently large white border around the markers, make sure the scene is sufficiently illuminated, and try to avoid abrupt movements that could cause motion blur.

user-1cd4d9 27 November, 2022, 01:19:08

Hi, I have a question regarding the message timestamps. I am using the Network API to read the pupil messages in real time, but I have observed that the pupil messages I receive have timestamps which are behind the current pupil time. Here's the code I am using:

```python
while True:
    socket.send_string('t')
    curr_time = float(socket.recv())
    logging.info(f'Before : Current time {curr_time}')
    topic, payload = subscriber.recv_multipart()
    msg = msgpack.loads(payload, raw=False)
    timestamp = msg['timestamp']
    logging.info(f'Msg timestamp {timestamp}')
    socket.send_string('t')
    curr_time = float(socket.recv())
    logging.info(f'After : Current time {curr_time}')
```

And an example of the output I observe:

```
INFO: Before : Current time 8087.21867836
INFO: Msg timestamp 8087.211949
INFO: After : Current time 8087.227443357
```

Is this expected? I'd expect the message timestamp to be in between the `Before` and `After` current times in this case. Maybe I am doing something wrong here?

user-e3f20f 27 November, 2022, 12:55:14

Pupil data inherits their timestamps from the eye images which are in turn timestamped at their exposure. The time difference between now and the pupil datum corresponds to its processing time, including the transfer over the network.

Note that your way of estimating the current pupil time is not as accurate as it could be. See https://github.com/pupil-labs/pupil-helpers/blob/master/python/simple_realtime_time_sync.py
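The core idea of that linked script — estimating the clock offset by assuming the remote clock was read halfway through each round trip, over several trips — can be sketched roughly like this (the function names and median choice here are illustrative, not taken from the script):

```python
def measure_offset(request_clock, local_clock, n=10):
    """Estimate (remote_time - local_time), assuming symmetric network delay.

    request_clock() performs one round trip (e.g. a 't' request to Pupil
    Remote) and returns the remote clock reading; local_clock() returns
    the local monotonic time in seconds.
    """
    offsets = []
    for _ in range(n):
        t_before = local_clock()
        remote_time = request_clock()
        t_after = local_clock()
        # Assume the remote clock was read halfway through the round trip
        midpoint = (t_before + t_after) / 2.0
        offsets.append(remote_time - midpoint)
    # The median is robust against occasional slow round trips
    offsets.sort()
    return offsets[len(offsets) // 2]
```

With the offset in hand, "pupil time now" is simply `local_clock() + offset`, which lets you compute the true age of each received datum without an extra `t` round trip per message.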

user-a3e405 28 November, 2022, 11:27:16

Hello Pupil team, this is Amber again. I recently asked about using the nearest older transformation to get surface data at a higher sampling rate, and you mentioned that you will write up an example. I'm not sure if I missed the example because I couldn't find it on Discord anymore πŸ˜₯ . Any pointers will be very much appreciated!

user-e3f20f 28 November, 2022, 17:13:32

Here you go! https://gist.github.com/papr/da7b17c165ccbfa6a6835e261607c51e
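The general idea — matching each high-rate gaze sample to the nearest *older* surface transform — can also be sketched with `pandas.merge_asof` (the column names and toy data here are hypothetical, not from the gist):

```python
import pandas as pd

# Hypothetical data: low-rate surface detections and high-rate gaze samples
surfaces = pd.DataFrame({
    "timestamp": [0.00, 0.10, 0.20],
    "homography_id": [0, 1, 2],  # stand-in for the full 3x3 transform
})
gaze = pd.DataFrame({
    "timestamp": [0.01, 0.05, 0.11, 0.19, 0.25],
    "norm_x": [0.1, 0.2, 0.3, 0.4, 0.5],
})

# For each gaze sample, take the nearest *older* surface transform
merged = pd.merge_asof(gaze, surfaces, on="timestamp", direction="backward")
print(merged["homography_id"].tolist())  # -> [0, 0, 1, 1, 2]
```

Each gaze sample would then be mapped with the homography it got matched to, giving surface-relative gaze at the full gaze sampling rate.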

user-e3f20f 28 November, 2022, 12:29:07

Hey, sorry for the delay. I will try to get this done today

user-746d07 28 November, 2022, 14:53:19

Hello Pupil team, I am trying to do an experiment using pupil core to measure people's gaze while they are doing their daily activities. By daily activities, I mean for example, cooking, watching TV, etc. What exactly do I need to do when conducting such an experiment? For example, do we need to use printed calibration points instead of using the computer screen for calibration?

user-e3f20f 28 November, 2022, 15:05:45

Hi, is there a reason for using Pupil Core? Pupil Invisible might be much better suited for this use case. That said, you can use printed markers if you want, but you can similarly use the screen marker calibration. That is because Pupil Core calibrates gaze relative to the scene camera, not the environment.

user-746d07 29 November, 2022, 18:42:26

I understand about the calibration. Thank you. Due to budget and the experiment schedule, it is difficult to buy Pupil Invisible now. If the examinee moves around somewhat, is it OK to use a longer USB cable instead of the included one? I also have a question about syncing with external sensors. I understand that the timestamp in gaze_positions.csv represents UNIX time. The attached picture shows the timestamps of a recording made on 2022/09/13 04:22 (UTC), but these numbers do not seem to represent the correct time. Could you please give me your opinion on this?

Chat image
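Worth noting on the timestamp question: Pupil Core exports use "Pupil time", not Unix time, which would explain numbers that look wrong as wall-clock times. A recording's info.player.json stores readings of both clocks at recording start (start_time_system_s in Unix time, start_time_synced_s in Pupil time), so a rough conversion sketch looks like this (the sample values below are made up):

```python
def pupil_to_unix(pupil_timestamps, info):
    """Convert Pupil-time timestamps to Unix time using the clock offset
    recorded in a recording's info.player.json (loaded here as a dict,
    e.g. via json.load(open(path)))."""
    offset = info["start_time_system_s"] - info["start_time_synced_s"]
    return [t + offset for t in pupil_timestamps]


# Made-up example values for illustration
info = {"start_time_system_s": 1663042920.0, "start_time_synced_s": 8000.0}
print(pupil_to_unix([8000.0, 8001.5], info))  # -> [1663042920.0, 1663042921.5]
```

The converted values can then be fed to e.g. `datetime.fromtimestamp()` to cross-check against the recording's wall-clock start time.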

user-1cd4d9 28 November, 2022, 16:18:20

Hi, thanks for your reply. I did follow the script you linked, but, correct me if I am wrong, isn't that just for synchronizing clocks? I am just concerned about the processing time at this point, as my RTT over the network seems significantly less than that (RTT ~ 1e-5 s). Does it usually take ~0.01 s for the processing?

user-e3f20f 28 November, 2022, 16:21:30

To estimate the correct delay, you need to know the pupil time at arrival of the datum in question. This is what the linked script is all about. The t request is not instant and includes processing/transfer time as well.

10 ms does not sound that unlikely. Camera latency alone is 8.5 ms.

user-1cd4d9 28 November, 2022, 16:38:24

To estimate the correct delay you need

user-a3e405 28 November, 2022, 17:48:22

Thank you so much!! I will work on it and post here if I have follow up questions.

user-bdb6e6 28 November, 2022, 21:14:53

Hello Pupil team, my colleagues and I want to measure aspects of the vestibulo-ocular reflex with the Pupil Core system; specifically, ocular torsion. I was wondering if that is possible, and if so, which output measure should we look at?

user-d407c1 29 November, 2022, 09:04:31

Hi @user-bdb6e6! Pupil Core does not provide cyclotorsion measurements. To collect these measurements, you will need to capture an iris pattern in the eye camera and check the following frames for rotations.

While you can change the eye camera resolution to 400 by 400 px, it might not be enough to detect features of the iris, especially if the subject's iris does not have many salient features to match (like freckles or distinctive collagen patterns). Potentially, one could use a cosmetic contact lens with an irregular pattern, as it may have more salient patterns to match, but such a lens needs to be fitted by a professional, and you would have to account for movement of the lens. Astigmatic (toric) contact lenses have a specific marking that optometrists use to check whether the lens is correctly placed, yet those markings would most probably not be visible to the eye camera, as they are quite subtle.

user-bdb6e6 29 November, 2022, 19:50:33

Thank you, I appreciate the help!

End of November archive