πŸ‘ core


user-4a1dfb 06 October, 2025, 22:05:08

Hello, I have a question about a weird gaze offset: where I am actually looking does not match where the Pupil Core DIY thinks I am looking.

For example, in this picture I am looking at the cursor, but the gaze dot is always a bit offset; this was a problem throughout the session. I'm not sure exactly what the problem is (world camera focus, how far from the screen I should be, etc.), but I would like some general pointers on how to debug this.

Chat image

user-f43a29 07 October, 2025, 09:57:49

Hi @user-4a1dfb , is this happening for all recordings or just this one?

user-e6fb99 08 October, 2025, 14:22:57

Hello, is it possible to get saccade information in Core? If yes, how do I set it up?

user-4c21e5 09 October, 2025, 08:23:04

Hi @user-e6fb99. The Core system does not output saccades by default, but there are some community-contributed tools that compute saccades. You can find more in the community repo.
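
A minimal sketch of one common community approach, a simple velocity threshold over a Pupil Player export (the gaze_positions.csv column names are the standard export fields; the threshold value is only a placeholder and must be tuned for your own setup):

```python
# Velocity-threshold saccade candidates from a Pupil Player gaze export.
import numpy as np
import pandas as pd

gaze = pd.read_csv("exports/000/gaze_positions.csv")
gaze = gaze[gaze["confidence"] > 0.8].reset_index(drop=True)

t = gaze["gaze_timestamp"].to_numpy()
x = gaze["norm_pos_x"].to_numpy()
y = gaze["norm_pos_y"].to_numpy()

# Sample-to-sample speed in normalized screen units per second.
speed = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)

THRESHOLD = 2.0  # placeholder; tune on your own data
is_saccade = speed > THRESHOLD
print(f"{is_saccade.sum()} candidate saccade samples out of {len(speed)}")
```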

user-4a1dfb 10 October, 2025, 01:44:02

[email removed] , is this happening for all

user-74b1c6 13 October, 2025, 08:14:33

Hi everyone, I'm analyzing Pupil Core recordings offline and need guidance on two points:

  1. Time to First Fixation (TTFF): what's the recommended way to compute TTFF on a surface/AOI using the Pupil exports?

  2. Aggregated heatmap across participants: what's the best practice for generating a grand heatmap on a static stimulus?

user-f43a29 13 October, 2025, 08:20:33

Hi @user-74b1c6 👋

  • Do you want to compute TTFF with respect to the start of the recording or with respect to start of a specific event, like the start of a stimulus presentation?
  • While we don't exactly have a Best Practice for heatmap generation, as there is flexbility there, you could reference this tutorial.
user-74b1c6 13 October, 2025, 08:24:50

@user-f43a29 Thanks, Rob! 🙏 Yes, I want to compute TTFF with respect to the start of the stimulus presentation, not the recording start.

user-f43a29 13 October, 2025, 08:27:35

Ok! Do you have Annotations already saved in your recording for those moments?

user-74b1c6 13 October, 2025, 08:32:33

I didn't use Annotations during recording, but I designed the task in PsychoPy, so I have the exact stimulus onset timestamps for each trial.
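
One way to compute TTFF from those onsets is sketched below. Note that PsychoPy and Pupil Capture run on different clocks, so the onset times first need to be converted into Pupil time (e.g. via a measured clock offset or annotations sent over the network). The file path, surface name, and column names here are assumptions; check them against your own export:

```python
# TTFF per trial from a fixations-on-surface export, given stimulus onsets
# that have already been converted into Pupil time.
import pandas as pd

fix = pd.read_csv("exports/000/surfaces/fixations_on_surface_Screen.csv")
fix = fix[fix["on_surf"].astype(str) == "True"].sort_values("start_timestamp")

stimulus_onsets_pupil_time = [1024.31, 1031.87, 1039.02]  # hypothetical values

for trial, onset in enumerate(stimulus_onsets_pupil_time):
    later = fix[fix["start_timestamp"] >= onset]
    if later.empty:
        print(f"trial {trial}: no fixation on the surface after onset")
    else:
        ttff = later["start_timestamp"].iloc[0] - onset
        print(f"trial {trial}: TTFF = {ttff:.3f} s")
```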

user-74b1c6 14 October, 2025, 08:53:19

Hi everyone, I defined a Surface manually in one recording, and I'd like to use exactly the same surface coordinates and size in all my other recordings (same monitor setup, same AprilTags). What should I do?

user-f43a29 14 October, 2025, 08:58:54

Hi @user-74b1c6 , please try the tip from my colleague @user-4c21e5 here and let us know how it goes: https://discord.com/channels/285728493612957698/285728493612957698/1305841351916523561

user-74b1c6 20 October, 2025, 05:59:25

Hi Rob, it was helpful, thanks a lot. I have another question. As shown in the picture, sometimes the software doesn't detect any fixations, even though everything seems correct (surface tracking, calibration, etc.). Why might that be?

Chat image

user-4a1dfb 17 October, 2025, 22:31:12

Hello, for the Pupil Core eye cameras, are there any "ideal" settings that should be used, or any best practices on what settings to change if certain things happen? For example, if the model is not sticking well to the pupil, should you increase the brightness of the IR lights? And what if someone is wearing glasses while using the device? Thank you.

user-f43a29 20 October, 2025, 09:24:52

Hi @user-4a1dfb , Pupil Capture does not provide a setting to control the intensity of the IR illuminators. As for glasses: while others have made it work, using Pupil Core with glasses can be tricky.

Otherwise, the default settings are usually sufficient for most use cases. With respect to the model not sticking well, be sure to first have the pupils well-centered in the eye camera images.

user-f43a29 20 October, 2025, 09:23:18

Hi @user-74b1c6 , you are welcome. Would you be open to sharing the recording with us for closer inspection? If so, you can share it with data@pupil-labs.com via Google Drive, for example.

user-74b1c6 20 October, 2025, 09:29:25

OK

user-45048d 20 October, 2025, 23:44:19

Hi everyone. I have connected the eye tracker to my laptop and am using Pupil Capture. The Eye0 and Eye1 cameras are working, but I am not able to see the world camera view. I unplugged and reconnected it and restarted many times, but it's not working. What should I do?

user-d407c1 21 October, 2025, 06:47:01

Hi @user-45048d 👋 ! Is it the first time that you connect Pupil Core to that laptop? Could you share whether you are using Windows, Mac or Linux?

user-b02f36 21 October, 2025, 09:24:51

Hello! We now want to use four eye cameras in Pupil Capture to obtain pupil positions and record videos. Is this possible? I only found a demo for capturing video frames in the GitHub repository. Many thanks!

user-f43a29 21 October, 2025, 11:22:00

Hi @user-b02f36 , you could write a Pupil Capture plugin for that, or use Annotations to synchronise the different cameras. Lab Streaming Layer would also be an option. See here for more details on some of these options.
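
For reference, a minimal user-plugin skeleton is sketched below; it only logs incoming pupil datums, and extending it to additional cameras is left out. It assumes Pupil Capture's plugin mechanism, where user plugins placed in ~/pupil_capture_settings/plugins/ subclass Plugin and receive data via recent_events:

```python
# Minimal Pupil Capture user-plugin sketch: drop into
# ~/pupil_capture_settings/plugins/ and enable it in the Plugin Manager.
# `plugin` is Pupil Capture's own module, available when running inside the app.
import logging

from plugin import Plugin

logger = logging.getLogger(__name__)


class PupilDatumLogger(Plugin):
    """Logs incoming pupil datums; extend this to handle additional cameras."""

    uniqueness = "by_class"
    order = 0.9  # run late so upstream plugins have already filled `events`

    def recent_events(self, events):
        # Each pupil datum carries fields such as 'id', 'timestamp',
        # 'norm_pos' and 'confidence'.
        for datum in events.get("pupil", []):
            logger.info(f"eye {datum['id']} @ {datum['timestamp']}: {datum['norm_pos']}")
```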

user-b02f36 24 October, 2025, 10:05:11

Hi, Rob. I tried to write a Pupil Capture plugin for using four eye cameras, but something goes wrong when starting it. At line 386 of capture.log there is an error: 2025-10-24 16:59:50,247 - world - [ERROR] launchables.world: Attempt to load unknown plugin: 2. I also found that this error is related to ~/pupil_capture_settings/user_settings_world. In addition, according to capture.log, the plugin seems to be scanned successfully. Is there any possible way to solve this problem? I am attaching my capture.log and plugin code. I use Pupil Capture 3.5.1, and I wrote the plugin with Python 3.11.14 in miniconda. Many thanks!

capture.log user_settings_world MonitorEyesSelector.py

user-b02f36 22 October, 2025, 03:10:07

I see. Thank you, Rob! By the way, is it possible to run Pupil Capture from source on Windows? I only tried running it from source on Ubuntu 20.04.

user-4c21e5 22 October, 2025, 03:14:15

Yes, running from source on Windows, Linux, and Mac is possible.

user-b02f36 22 October, 2025, 04:18:11

I see. Is it possible to run from source using miniconda? Thank you, Neil!

user-4c21e5 22 October, 2025, 05:40:41

I believe so. Best to give it a try!

user-d9be4a 24 October, 2025, 05:53:37

Hello, what operations should be performed for dynamic areas of interest (tracking moving pedestrians)? We have built a scenario in Unity where pedestrians are crossing the road, and participants are required to wear eye-tracking devices to see whether they can notice these pedestrians.

user-f43a29 24 October, 2025, 10:01:06

Hi @user-d9be4a , are you using Unity for VR stimuli or are you simply presenting stimuli on a computer monitor?

user-d9be4a 24 October, 2025, 12:39:19

Hi, Rob. Stimuli are presented on the computer monitor, such as the pedestrians crossing the street in the figure below. Thank you!

Chat image

user-f43a29 24 October, 2025, 10:21:55

Hi @user-b02f36 , I am not able to do code reviews or provide detailed support for adding custom DIY cameras to Pupil Core. Upon a quick inspection of the code, I do not immediately see anything erroneous. Perhaps try trimming down the Plugin or progressively commenting out sections until you find the troublesome bit of code.

Also, do you mean you are running Pupil Capture from within a miniconda installation? You may want to rather just use plain Python, installed from the Python Org website, with a fresh virtual environment. Sometimes conda can inadvertently introduce issues.

Otherwise, perhaps members of the community who have tried similar can share their experiences.

user-b02f36 24 October, 2025, 12:36:35

OK Rob, I will try to simplify my code first. If I still have questions or find a solution, I will reply to you in the channel. Thanks sincerely! 🙂

user-f43a29 24 October, 2025, 23:00:40

Hi @user-d9be4a , did you also use AprilTags with Pupil Core's Surface Tracking capabilities?

user-d9be4a 25 October, 2025, 05:16:29

Yes, before using Pupil Core, I set up an AprilTag in the corner of the computer monitor. Since we need to observe the entire process of pedestrians crossing the street, I'm not entirely sure if this was the correct approach.

user-0b1050 27 October, 2025, 09:02:11

Hi there, I'm performing research data collection using Pupil Core, but some of my participants have prescription glasses. I saw another thread where it was explained that this will be challenging, but I was wondering if there is a way to still process the eye video data offline to get better gaze detection results despite the prescription glasses. I could not put the eye cameras under the person's glasses, so I kept them on top. Can I use the egocentric video and eye videos to generate the gaze tracking post hoc? Thank you in advance!

user-f43a29 27 October, 2025, 09:10:40

Hi @user-0b1050 , you can simply start by trying the standard procedure of loading the recordings into Pupil Player and seeing how that looks.

user-f43a29 27 October, 2025, 09:13:17

Hi @user-d9be4a , using AprilTags is correct, but if you only used one tag, then the Surface Tracking plugin will not work correctly or robustly. As an alternative, if you only need to know whether gaze was on or off of the pedestrians, you could try an object detection routine, such as those provided with YOLO, to detect the pedestrians in Pupil Core's world camera video and then check whether the gaze data overlap with the detections.
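
A rough sketch of that idea, assuming the exported world video and gaze_positions.csv from Pupil Player and the third-party ultralytics package; the paths, model file, and coordinate handling are assumptions to adapt:

```python
# Check whether gaze lands on YOLO-detected people in the world video.
import cv2
import pandas as pd
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # COCO model; class 0 is "person"
gaze = pd.read_csv("exports/000/gaze_positions.csv")
cap = cv2.VideoCapture("exports/000/world.mp4")

frame_idx = 0  # assumes exported frames line up with the world_index column
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]
    boxes = model(frame, verbose=False)[0].boxes
    persons = boxes.xyxy[boxes.cls == 0].cpu().numpy()

    # Pupil norm coordinates have their origin at the bottom-left, so flip y.
    for _, g in gaze[gaze["world_index"] == frame_idx].iterrows():
        gx, gy = g["norm_pos_x"] * w, (1 - g["norm_pos_y"]) * h
        on_person = any(x1 <= gx <= x2 and y1 <= gy <= y2 for x1, y1, x2, y2 in persons)
        if on_person:
            print(f"frame {frame_idx}: gaze on a detected pedestrian")
    frame_idx += 1
```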

user-d9be4a 27 October, 2025, 11:48:35

Thank you for your answer. Is it feasible to define areas of interest post hoc with Pupil Player and make the AOIs follow the moving pedestrians?

user-0b1050 27 October, 2025, 09:19:39

Thanks @user-f43a29 , I forgot to mention that I am not making the recordings with Pupil Capture. I am using a custom system that integrates PupilLabs.PupilFacade for online gaze detection, and the saved data is not in the format that Pupil Player expects. I also don't have the calibration data saved. Would it still be possible to do any post-processing of the videos with this type of approach?

user-f43a29 27 October, 2025, 10:07:24

@user-0b1050 may I first ask what PupilLabs.PupilFacade is?

user-f43a29 27 October, 2025, 12:58:55

Yes. Defining AOIs with Pupil Player requires use of the Surface Tracking plugin, which works best with at least three AprilTags, preferably more.

user-d9be4a 27 October, 2025, 15:16:35

I will try placing a few more AprilTags. Thanks.

user-ee0a9a 28 October, 2025, 17:28:36

Hello, my goal is to calculate pupil angular velocity and detect microsaccades using Pupil Core. May I ask the following questions:

  1. Which columns should I use? Should I use theta and phi from pupil_positions.csv, or compute theta and phi from gaze_positions.csv, or use other fields?

  2. Is a sampling rate ≥ 200 Hz required? How do I set that sampling rate? If the actual sampling rate is lower than 200 Hz, is upsampling (interpolation) acceptable?

Thanks for any guidance!

user-f43a29 29 October, 2025, 09:17:22

Hi @user-ee0a9a , you can of course ask questions.

  • If you want pupil angular velocity, which is essentially the speed at which the eyeball is rotating, then you want to work with pupil_positions.csv. However, if your interest is microsaccades, then you would typically look at gaze_positions.csv. It comes down to whether you are interested in the optical axes or the visual axes.

  • Pupil Core does not record faster than 200 Hz.

user-ee61dc 29 October, 2025, 10:46:46

Will a computer with this 13-inch model run properly when using Pupil Core? https://www.microsoft.com/ja-jp/surface/devices/surface-pro#tech-specs-uidee13

user-d407c1 29 October, 2025, 11:02:02

Hi @user-ee61dc ! You may find some compatibility issues with that ARM processor; you may want to look for an x86 chipset instead.

user-ee0a9a 30 October, 2025, 06:53:51

Thank you so much Miguel! Should microsaccade events be limited to the time periods in fixations.csv, since in theory they occur only during fixations?

user-f43a29 30 October, 2025, 08:07:59

Hi @user-ee0a9a , that sounds like a reasonable first approach.
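
If it helps, a small sketch of that filtering step, assuming fixations.csv has start_timestamp in seconds and duration in milliseconds (check the units in your own export):

```python
# Keep only candidate microsaccade samples that fall inside fixation intervals.
import numpy as np
import pandas as pd

fixations = pd.read_csv("exports/000/fixations.csv")
starts = fixations["start_timestamp"].to_numpy()
ends = starts + fixations["duration"].to_numpy() / 1000.0  # ms -> s

def in_fixation(timestamps):
    """Boolean mask: True where a timestamp falls inside any fixation interval."""
    ts = np.asarray(timestamps)[:, None]
    return ((ts >= starts) & (ts <= ends)).any(axis=1)

# Example usage with hypothetical candidate timestamps:
# candidate_times = candidate_times[in_fixation(candidate_times)]
```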

End of October archive