👁 core


user-357ae8 01 July, 2025, 17:41:46

Hi, I'm trying to get a good calibration, but I can't get the angular accuracy below 3 degrees, and when I test it, it is too inaccurate. I'm on default settings and have just been using the online calibration in Pupil Capture. Could someone with a good calibration technique please help?

user-f43a29 01 July, 2025, 18:27:14

Hi @user-357ae8 , if you could share a recording of the calibration process with [email removed] then we can give direct feedback. You can put it on Google Drive, for example.

user-ad361c 02 July, 2025, 16:19:26

Hello everyone,

Question Regarding Eye Tracking Core and Surface Plugin

In my experiment, I have defined two surfaces (two screens), each marked with four fully visible QR codes. The screens are positioned side by side.

However, I consistently encounter an issue where the surfaces are not stable: they frequently lose tracking, and only the surface boundaries appear (highlighted in pink). I've tried using the “Frozen Screen” option in the Surface Plugin multiple times, but it doesn’t seem to help reliably. I'm unsure if this is expected behavior or if there’s something wrong with my setup.

Additionally, I have a second question: On one of the screens, is it possible to accurately determine whether a person is looking at a specific widget or region within the screen? If so, what’s the recommended way to achieve that?

Thank you!

user-f43a29 02 July, 2025, 17:08:29

Hi @user-ad361c , if you could share a Pupil Capture recording of the Surface Tracking with [email removed] then we can give direct feedback. You can put it on Google Drive, for example.

user-dd61bd 03 July, 2025, 09:00:22

Hello everyone. I use Pupil Player to export the eye-tracking data. During the process, the message says "failed: not enough markers with the defined origin marker id were collected", so I want to ask what this means. Did I miss a step or do something wrong?

user-f43a29 04 July, 2025, 08:31:53

With respect to your question about the recording, are you trying to use the Surface Tracking plugin in Pupil Player? In other words, do you want to know where they look on the tablet screen?

user-f43a29 03 July, 2025, 12:08:36

Hi @user-dd61bd , this could indicate that the AprilTags for your Surface were not detected or perhaps there was some motion blur. If you could share the recording with us [email removed] then we can take a look and provide more direct feedback. You can put it on Google Drive, for example.

user-dd61bd 04 July, 2025, 00:54:28

@user-f43a29 Thank you for your responses. I am providing the Google Drive link that contains the original eye recording files, including the exported files. Please check it out. Thanks again for your help.

user-f43a29 04 July, 2025, 08:29:55

Hi @user-dd61bd , please note that you have shared this recording publicly. If this recording must stay private, then I'd recommend removing the link and sharing it with us at data@pupil-labs.com

user-dd61bd 04 July, 2025, 09:02:06

@user-f43a29 Thank you for your reminder. I will send the video link to [email removed]. Additionally, the Discord message has been deleted. The Surface Tracking plugin is currently disabled during the export process. I need to know the position of the subject's fixation on the tablet screen. Based on this, should I activate the Surface Tracking plugin? Thank you.

user-f43a29 04 July, 2025, 09:30:02

If you want to know where they are looking on the tablet screen, then using the Surface Tracking plugin with AprilTag markers is the standard approach.

user-dd61bd 06 July, 2025, 12:13:08

@user-f43a29 Thank you for your suggestion. I will give it a try.

user-bd106e 06 July, 2025, 14:39:22

I have a Pupil Core device, and I want to know the detailed intrinsic parameters of the eye cameras. How can I find these parameters?

user-f43a29 07 July, 2025, 09:57:46

Hi @user-bd106e , do you mean the intrinsics of the eye cameras? If so, then as documented here:

When a recording is started in Pupil Capture, the application saves the active camera intrinsics to the world.intrinsics, eye0.intrinsics, and eye1.intrinsics files within the recording.
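
As a rough, unofficial sketch of reading one of those files in Python (assuming the msgpack-based .intrinsics format written by recent Pupil Capture versions; the exact key layout, such as the "(width, height)" resolution key, may differ between versions):

import msgpack

# Read the recorded eye camera intrinsics (msgpack-encoded)
with open("eye0.intrinsics", "rb") as f:
    data = msgpack.unpack(f, raw=False)

print(data.keys())  # e.g. "version" plus one entry per recorded resolution

entry = data["(192, 192)"]     # hypothetical resolution key; use one printed above
print(entry["camera_matrix"])  # 3x3 intrinsic matrix
print(entry["dist_coefs"])     # distortion coefficients
print(entry["cam_type"])       # camera/distortion model, e.g. "radial"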

user-d407c1 08 July, 2025, 08:28:50

Hi @user-e8726c ! We have also replied by email: the license of pye3d has been changed to LGPLv3; you can find it in the repo.

user-e8726c 07 July, 2025, 11:40:31

Can anyone answer this question: can pye3d be used for non-profit academic purposes? I can't find any clear license on its GitHub.

user-e8726c 07 July, 2025, 12:44:52

More specifically, I wanted to ask:

  • Can I use pye3d in academic publications?

  • May I include it in a public GitHub repository?

user-84387e 07 July, 2025, 13:30:48

Oh nice! Convenient, we come from Mainz/Wiesbaden! 😮 Thanks for the heads-up!

user-84387e 07 July, 2025, 16:13:27

Oh, it's 500 bucks?

user-f43a29 07 July, 2025, 16:17:55

Yes, depending on when one registers, that is roughly the price of registration. The organizing committee of ECVP is better positioned to provide info about pricing, though.

user-84387e 08 July, 2025, 13:47:28

I asked by email: as a visitor, it seems to be free. So we will see each other 🙌

user-84387e 08 July, 2025, 12:41:35

Hm. I'm wondering if I even need to pay anything when I'm not planning to present anything. I will find that out.

user-84387e 08 July, 2025, 10:00:05

NOOOUUU

user-f43a29 08 July, 2025, 08:41:35

@user-84387e This ☝️ is also of relevance to you, if I remember correctly (https://discord.com/channels/285728493612957698/446977689690177536/1326152860131524618).

user-84387e 08 July, 2025, 09:59:49

YEEEEAASS

user-5c9c29 08 July, 2025, 12:52:49

Hello all, I used the equipment, and when I export the CSV, not all the timestamps have the diameter for both eyes; sometimes there is only a value for the left or for the right. Any reason for this happening?

user-f43a29 08 July, 2025, 13:10:10

Hi @user-5c9c29 , can you share an example recording with [email removed]? Then we can provide more direct feedback. You can share it via Google Drive, for example.

It could be that the pupils were sometimes simply not detected.

user-5c9c29 08 July, 2025, 13:13:09

So sometimes we have an intercalated detection of the left and right eyes for about 15 timestamps, and then it comes back to normal (normal being one timestamp row with two diameters, one for the right and one for the left eye). I would like to know the possible causes, so I can understand whether I need to discard these parts of the data. Thank you for the answer.

user-f43a29 08 July, 2025, 13:20:18

Hi @user-5c9c29 , may I ask for more clarification about "intercalated detection"?

An effective way to determine what was going on is for us to see a screenshot or a video of how Pupil Core was imaging the eyes during those moments in the experiment. If you cannot provide the original recording, that is okay; you can also send us a screenshot or video via DM.

If that is also not possible, then you may want to check pupil detection confidence and blinks at those timepoints.

user-5c9c29 08 July, 2025, 13:23:36

Intercalated detection means that for the 1st timestamp I have the diameter for the left eye (and not for the right eye); for the 2nd timestamp I have the diameter for the right eye (and not for the left eye), and so on. This sometimes happens for about 15 timestamps and then it comes back to normal.

user-f43a29 08 July, 2025, 13:36:36

Are you looking at the diameter_3d column of pupil_positions.csv and only using the 3d c++ method rows?

user-5c9c29 08 July, 2025, 13:44:15

I'm looking at pupil_positions.csv, specifically the pupil_timestamp and eye_id columns, and at both methods that I have.

user-f43a29 08 July, 2025, 13:48:48

Ok, I see.

As mentioned, if you can share a recording with us or send a photo of what the eye images from Pupil Core looked like during those moments, then we can better assist you.

Otherwise, you will want to check pupil detection confidence and blinks at those timepoints.
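
For the confidence check, a small sketch with pandas (assuming the exported pupil_positions.csv; 0.6 is just a commonly used cutoff, adjust as needed):

import pandas as pd

pupil = pd.read_csv("pupil_positions.csv")

# Flag low-confidence pupil samples, which often coincide with blinks
low_conf = pupil[pupil["confidence"] < 0.6]

print(low_conf.groupby("eye_id").size())       # low-confidence samples per eye
print(low_conf["pupil_timestamp"].describe())  # when in the recording they occur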

user-e571f5 09 July, 2025, 03:20:40

Pupil Camera Not Detected via cv2.VideoCapture (Index 0–100 All Failed)

Hi everyone, I'm using a single Pupil Labs eye camera connected to my Windows system. The device appears in Device Manager under "libusbK USB Devices", so I believe it's recognized by the system. However, when I try to access the camera using OpenCV's cv2.VideoCapture, none of the indices from 0 to 100 work. I looped through all of them, and none returned a valid frame or even opened successfully. In Pupil Capture, the camera appears but doesn't stream any video.

Is this because the Pupil camera doesn't expose itself as a standard UVC device?
Do I need a specific SDK or driver to access it outside of Pupil software?

Any advice would be greatly appreciated.

Thanks!

user-d407c1 09 July, 2025, 07:00:53

Hi @user-e571f5 👋 ! Pupil Core cameras are UVC-compliant, so you should be able to access them via pyuvc.

That said, before diving into that, I’d recommend first checking that everything works correctly within Pupil Capture.

Could you try the troubleshooting steps outlined here?
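
If you do later want to access the cameras outside of Pupil software, a rough pyuvc sketch looks like this (unofficial; API details can differ between pyuvc versions):

import uvc  # pyuvc, from pupil-labs/pyuvc

devices = uvc.device_list()              # list of dicts with "name" and "uid"
print(devices)

cap = uvc.Capture(devices[0]["uid"])     # open the first camera found
cap.frame_mode = cap.available_modes[0]  # pick any supported (width, height, fps) mode

for _ in range(10):
    frame = cap.get_frame_robust()       # blocks until a frame is delivered
    print(frame.img.shape)               # image as a numpy array

cap.close()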

user-764356 09 July, 2025, 19:12:15

Hi! I'm using the Network API in Python to receive data in real time (I adapted the example code from its documentation). However, I noticed that if I'm not requesting data at all times, the data gets "buffered", and the next time I request data (let's say 10 s later), it sends over old data first, from 10 s ago. But I would like to get the latest eye data at the moment I request it. Is it possible to set this up?

Thanks!

user-cdcab0 10 July, 2025, 08:00:48

Hi, @user-764356 - with the Network API, you need to consume messages as quickly as they are produced or you risk losing data. You may choose to do nothing with messages if you are not ready for some or do not need them, but you do need to pull every message out of the delivery queue
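
A minimal sketch of that pattern (assuming Pupil Capture is running with the Network API on its default port 50020): drain the subscription socket with non-blocking receives and keep only the newest gaze datum.

import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Capture's IPC backbone for the SUB port
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

def latest_gaze():
    """Empty the delivery queue and return the most recent gaze datum (or None)."""
    newest = None
    while True:
        try:
            parts = subscriber.recv_multipart(flags=zmq.NOBLOCK)
        except zmq.Again:
            return newest
        newest = msgpack.loads(parts[1], raw=False)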

user-764356 10 July, 2025, 22:18:34

Gotcha, thanks so much for the info!

user-f3c898 11 July, 2025, 20:55:29

Hi, for a DIY Pupil Core the world camera is not recognized by the software. I uninstalled the driver and went through the troubleshooting process, but it still does not work.

user-d407c1 16 July, 2025, 06:44:53

Hi @user-f3c898 ! Is the camera you are using UVC compliant?

user-bd5142 16 July, 2025, 05:15:48

Hi, where can I find the API that lets me directly request the current user's view and the position of the gaze point in that image? Thanks

user-d407c1 16 July, 2025, 06:44:19

Hi @user-bd5142 👋 ! Just to confirm, are you looking to stream the scene camera and overlay gaze data using Pupil Core?

Pupil Core’s Network API uses ZeroMQ for communication. You can find an example here that shows how to stream the scene camera and visualize it.

To overlay gaze, you’ll also need to subscribe to the gaze stream. Here’s a basic example of how to filter and handle messages.

Once subscribed, you can simply plot the gaze data on top of the video stream.
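
For the overlay step, a minimal sketch (assuming you already have the scene frame as a BGR numpy array, e.g. from the Frame Publisher plugin, and a gaze datum from the subscription above; note that norm_pos has its origin at the bottom-left of the scene image):

import cv2

def draw_gaze(frame_bgr, gaze_datum, min_confidence=0.6):
    """Draw one gaze point onto a scene camera frame."""
    if gaze_datum.get("confidence", 0.0) < min_confidence:
        return frame_bgr
    height, width = frame_bgr.shape[:2]
    x_norm, y_norm = gaze_datum["norm_pos"]
    x_px = int(x_norm * width)
    y_px = int((1.0 - y_norm) * height)  # flip y: norm_pos origin is bottom-left
    cv2.circle(frame_bgr, (x_px, y_px), 20, (0, 0, 255), 2)
    return frame_bgr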

user-bd5142 16 July, 2025, 09:07:58

Thanks

user-f266fb 18 July, 2025, 02:18:55

Thanks! Additional question: how do you calculate the FoV angle? When I run an experiment using Pupil Core, how do I get the angle? I believe there is no such value in the exported files. Is this calculated based on the real distance to the observed objects?

nmt 18 July, 2025, 02:19:48

Hi, @user-45f4b0. I've moved your message to the appropriate channel for Core 🙂. The FoV can change depending on which resolution and lens you use. You can read the specific values in this section of the docs.

user-c6d54b 18 July, 2025, 07:37:04

Hello, I'm working with a group that is trying to use the Pupil Core headset in an experiment that includes the use of mirrors, and was wondering if there is any information regarding working with mirrors, or if there could be possible discrepancies in the tracking.

user-f43a29 18 July, 2025, 09:14:10

Hi @user-c6d54b , could you describe a bit more what you mean by "use of mirrors" or share a photo? I'm not sure I understand the end goal.

user-c6d54b 18 July, 2025, 12:32:18

We were wondering if the eye tracker would have any issues reporting gaze when a participant is looking at a mirror from an angle.

user-f43a29 18 July, 2025, 13:07:32

Hi @user-c6d54b , sure, you can still use Pupil Core’s Surface Tracker to determine where they look on a mirror. Just put a few AprilTags around the edges.

user-c6d54b 18 July, 2025, 13:52:18

thank you 🙏

user-d90d1b 22 July, 2025, 00:36:12

Hi, may I ask where the origin of 'gaze_point_3d_x', 'gaze_point_3d_y', and 'gaze_point_3d_z' in gaze_positions.csv is located? Is the unit in meters?

nmt 22 July, 2025, 03:00:54

Hi @user-d90d1b 👋. The values are given in 3D scene camera space. The origin is the centre of the scene camera. Their units are millimetres. You can read more about Core's coordinate systems here and data made available here.
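
As a small illustration of that coordinate system (using the gaze_point_3d_* columns from gaze_positions.csv), you can, for example, compute each gaze point's angular eccentricity from the scene camera's optical axis (+z):

import numpy as np
import pandas as pd

gaze = pd.read_csv("gaze_positions.csv")
x = gaze["gaze_point_3d_x"]
y = gaze["gaze_point_3d_y"]
z = gaze["gaze_point_3d_z"]  # all in millimetres, origin at the scene camera

# Angle between the gaze point direction and the camera's optical axis
eccentricity_deg = pd.Series(np.degrees(np.arctan2(np.sqrt(x**2 + y**2), z)))
print(eccentricity_deg.describe())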

user-d90d1b 22 July, 2025, 09:42:29

Thank you so much.

user-e6fb99 22 July, 2025, 14:20:49

Hello, @nmt I have successfully collected my data from Pupil Core. I have an annotation file that marks one type of event, along with the timestamps for those events. I want to find the same time points in my other files, like the fixation and gaze position files. I have a few questions regarding that.

What do the timestamps mean?

If I want to look at fixations 300 ms before and after the annotated event, how do I do that?

user-6c5c32 22 July, 2025, 22:21:14

^ I'm in the same lab as her and was wondering what the time points represent, given that they don't start from zero, and also how to find out the sample rate of the device.

user-cdcab0 22 July, 2025, 22:30:10

Hi, @user-6c5c32 and @user-e6fb99 - timestamps in Core use what's known as "Pupil Time" - seconds measured on an independent clock with an arbitrary start, guaranteed to be monotonically increasing even if the system clock changes mid-recording.

The sampling rate is actually up to you - before you start a recording you can adjust the refresh rate of the eye cameras up to 200 Hz.

user-e6fb99 22 July, 2025, 22:37:12

Can we check the sampling frequency post-recording? If yes, what would be the steps?

user-cdcab0 22 July, 2025, 22:48:14

You can compute it

number_of_samples / (last_timestamp - first_timestamp)
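
For example, with pandas on the exported pupil_positions.csv (restricting to one eye and one detection method so each row is one sample; column names as in the standard export):

import pandas as pd

df = pd.read_csv("pupil_positions.csv")

# One eye, 3d detector rows only, so each row corresponds to one camera frame
eye0 = df[(df["eye_id"] == 0) & (df["method"].str.contains("3d"))]

duration_s = eye0["pupil_timestamp"].iloc[-1] - eye0["pupil_timestamp"].iloc[0]
rate_hz = len(eye0) / duration_s
print(f"eye0 effective sampling rate: {rate_hz:.1f} Hz")
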
user-6c5c32 23 July, 2025, 01:05:41

what is the independent clock

user-cdcab0 23 July, 2025, 02:09:09

Independent as in "separate from the system clock" - meaning it's not affected by changes to the system clock (like automatic NTP synchronization or manual user adjustments)

> and is there any way to make it start at zero or is that always going to happen?

You could simply subtract the first timestamp from all the others. This computed value will be measured in seconds starting at 0. Alternatively, you could convert to system time
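
A small sketch of both options (the system-time conversion assumes the info.player.json file found in recent Pupil Capture recordings, with its start_time_synced_s and start_time_system_s fields; older recording formats store this differently):

import json
import pandas as pd

gaze = pd.read_csv("gaze_positions.csv")

# Option 1: make timestamps start at zero
gaze["t_zero"] = gaze["gaze_timestamp"] - gaze["gaze_timestamp"].iloc[0]

# Option 2: convert Pupil Time to system (Unix) time
with open("info.player.json") as f:
    info = json.load(f)
offset = info["start_time_system_s"] - info["start_time_synced_s"]
gaze["t_unix"] = gaze["gaze_timestamp"] + offset
gaze["t_datetime"] = pd.to_datetime(gaze["t_unix"], unit="s")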

user-6c5c32 23 July, 2025, 01:05:52

and is there any way to make it start at zero or is that always going to happen?

user-6c5c32 23 July, 2025, 02:53:14

Perfect thank you

user-e6fb99 23 July, 2025, 13:10:06

Hello, when I look at the different exported files, the number of data points differs across files. gaze_positions.csv has a different number of data points than pupil_positions.csv. Which one of these should I use for calculating the sampling frequency? And why are they different? Thanks

user-e6fb99 23 July, 2025, 13:31:50

Also, in the gaze_timestamp column there are sometimes the same values for more than one row. What does that mean?

user-d407c1 23 July, 2025, 13:33:26

Hi @user-e6fb99 , note that the pupil_positions.csv contains rows for each eye as described here https://discord.com/channels/285728493612957698/285728493612957698/1176712871892226170 and for different methods (2d and 3d).

user-e6fb99 23 July, 2025, 13:34:23

Okay. Thank you

user-e6fb99 23 July, 2025, 13:34:30

Also, in the gaze_timestamp column there are sometimes the same values for more than one row. What does that mean?

user-e6fb99 23 July, 2025, 13:35:38

Also the link is broken

user-d407c1 23 July, 2025, 13:37:18

Sorry! I just realised this was moved. The link above works fine; it explains how gaze data are formed and how timestamps are matched.

user-e6fb99 23 July, 2025, 14:01:19

Thank you. I have one more question. After processing the surfaces in Pupil Player, it generated a folder called surfaces, which has gaze positions on surfaces 1, 2, and 3. In these files, how should I interpret the world_timestamps? In this file, the world timestamps sometimes repeat 45 times.

user-d407c1 23 July, 2025, 14:08:18

It depends on the sampling rates you’ve set for the scene and eye cameras, for example 30 Hz for the scene camera vs. 120 Hz for the eye cameras. When the eye cameras run at a higher rate, you end up with multiple gaze points for the same scene camera frame.

In the gaze_positions_on_surface_XXX.csv file, you’ll note there is also a gaze_timestamp column that reflects this.
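
As a quick sanity check, you can count the gaze samples per scene camera frame in one of the surface files (the file name below is just an example):

import pandas as pd

surf = pd.read_csv("surfaces/gaze_positions_on_surface_Surface 1.csv")

# Number of distinct gaze samples mapped to each scene camera frame
per_frame = surf.groupby("world_timestamp")["gaze_timestamp"].nunique()
print(per_frame.describe())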

user-e6fb99 23 July, 2025, 14:01:45

45 was just an example number, but the timestamps do repeat more than once. How should I then interpret each time point?

user-e6fb99 23 July, 2025, 14:09:19

Yes, there is a gaze timestamp, which differs from the world timestamp by a few decimals.

user-e6fb99 23 July, 2025, 14:10:13

Is it best to first convert to system time? I am totally confused as to what each data point means, and it’s crucial for me because I want to know when the participant left one surface and looked at the other.

user-d407c1 23 July, 2025, 14:30:15

The surf_positions_XXX.csv file indicates when a surface is detected in the scene camera. It provides the world_timestamp and the corresponding frame index where the surface was found.

The gaze_positions_on_surface_XXX.csv file contains gaze points — as reported in gaze_positions.csv — but mapped to the surface coordinate space. These points only appear when the surface was detected in the scene.

To analyze whether a gaze point falls on a surface, you’ll want to:

  • Merge each gaze_positions_on_surface_XXX.csv file with gaze_positions.csv using the gaze_timestamp.

  • For each file, add a new column named XXX to gaze_positions.csv.

  • If gaze_timestamp is present in the surface file and the on_surf column is True, mark that row as True in the new column.

This will give you a per-surface boolean flag indicating whether the gaze point was on that surface at that timestamp in the gaze file.

If using python, something like this:

import pandas as pd
import glob

# Load main gaze data
gaze_df = pd.read_csv("gaze_positions.csv")

# Iterate through all gaze_position_on_surface_*.csv files
for surf_file in glob.glob("surfaces/gaze_positions_on_surface_*.csv"):
    surface_name = surf_file.split("gaze_positions_on_surface_")[-1].split(".csv")[0]
    surf_df = pd.read_csv(surf_file)

    # Create a new column with default False
    gaze_df[surface_name] = False

    # Merge on 'gaze_timestamp' where 'on_surf' is True
    valid_gaze = surf_df[surf_df["on_surf"] == True]
    valid_timestamps = set(valid_gaze["gaze_timestamp"])

    # Mark those timestamps as True in the new column
    gaze_df.loc[gaze_df["gaze_timestamp"].isin(valid_timestamps), surface_name] = True

    # Report how many gaze samples landed on this surface
    count_true = gaze_df[surface_name].sum()
    print(f"{surface_name}: {count_true} gaze samples on surface")

# Save updated gaze file
gaze_df.to_csv("gaze_positions_with_surface_flags.csv", index=False)

user-e6fb99 23 July, 2025, 14:36:43

Wow, that makes sense. So gaze_positions.csv would have all the time points corresponding to the sampling frequency, and I’d know when each data point was generated, or the time between data points, which would be 5 ms if the system was sampling at 200 Hz. Is my understanding correct?

user-d407c1 24 July, 2025, 07:15:53

That's correct. The sampling rate depends on the one set in the settings as well as the computational capacity of your PC; if it can't keep up, you might see frame drops.

Also note that the timestamp matching used to form the gaze signal might make its rate differ from the camera sampling rate, as noted in the link above.

user-e6fb99 23 July, 2025, 14:37:04

If it is, then that solves the gaze position issue. But how do I do the same for the fixation positions?

user-131620 25 July, 2025, 00:00:51

Hi folks! I am wondering if it is possible to toggle the gaze mapper setting under calibration through code? It is usually done through the GUI, but it would be wonderful if we could automate it. Can it be done via the IPC backbone?

user-f43a29 25 July, 2025, 10:09:59

Hi @user-131620 , this is not implemented in Pupil Core's Network API.

user-131620 25 July, 2025, 10:20:13

@user-f43a29 Thanks, I assume this control is not exposed and can only be controlled via the GUI?

user-f43a29 25 July, 2025, 10:23:07

@user-131620 Correct, but if you absolutely need it, you can modify the source code.

user-412dbc 25 July, 2025, 14:42:43

Hello, a quick question. I made a recording using both eye0 and eye1. What I am interested in is eye fixations within AOIs. Post hoc, after I opened the recording, I found out that eye0 was problematic. Is there any way to export the recording using only info from Eye1 and ignore Eye0?

user-f43a29 25 July, 2025, 16:10:14

Hi @user-412dbc , sure. First, may I ask in what way is eye0 problematic?

Otherwise, did you record everything, including the calibration choreography?

user-d9be4a 26 July, 2025, 05:56:15

Hello, there is an issue with the connection cable of our Pupil Core device, and the camera feed is not displaying. Is there anyone who can provide a repair service or assistance?

user-d407c1 28 July, 2025, 08:20:45

Hi! We have followed up on the ticket

user-13e552 28 July, 2025, 17:24:08

Hi everyone! I have a question regarding the Pupil Labs eye tracker. I’m currently using it to track, in real time, which words on the screen the user is looking at in the World view. However, I’ve noticed that the gaze or heat map often has a lot of jitter/noise: it moves left, right, up, and down rapidly, making it hard to keep it on a single word for a longer time. Are there recommended settings to reduce this jitter (e.g., smoothing or filtering in the default configuration)? Or is it possible to implement custom code to make the gaze data more stable/accurate? Thanks for any advice!

user-f43a29 28 July, 2025, 18:41:12

Hi @user-13e552 , may I ask for more clarification about "making it hard to stay focused on a single word"? Would you be able to provide a screen capture/video of what you mean? You can share it via DM or email [email removed] if you prefer.

user-764356 29 July, 2025, 15:41:25

Hi! I had a session where the two eye cameras went out of sync while running. Since then, there has been a slight delay between the recordings of the two eyes, and the world view always shows two gaze points, connected by a red line. Are accurate gaze positions still recoverable? Thanks!

user-f43a29 29 July, 2025, 16:02:34

Hi @user-764356 , would you be able to share this recording with us at [email removed]? Then we can take a closer look.

End of July archive