πŸ‘ core


user-45f4b0 01 April, 2025, 02:38:10

Hello! I found there is a small error in gaze calibration. It seems like accuracy at the center is better than at the edge. Is there a standard way to offset this error? Thanks!

Chat image

nmt 02 April, 2025, 02:29:51

Hi @user-45f4b0! This is already looking quite good. A linear offset wouldn't help in this case. Rather, it would be worth checking pupil detection confidence when the wearer is gazing in the top left. Are you still using a 31 point calibration?

user-9f4dff 01 April, 2025, 10:15:05

Hello, I'm Priyanshu, from Decision Lab, IIT Kanpur, India. We have a Pupil Core device. We have been facing trouble with "noise"-related issues (power line noise) coupling from the Pupil device into other nearby devices (i.e. EEG). We are using a LiveAmp EEG simultaneously with the Pupil device to record. However, we see that the electrodes that lie close to the Pupil headset face tremendous noise issues. Therefore, I wanted to know the following things:

  1. What parts of the Pupil Core device generate, or are an active source of, this power line noise?
  2. How can we shield this power line noise?
  3. Is there a way to cover any active regions so that this noise does not show up in the EEG channels?
  4. Are there other labs who have been using EEG with a Pupil device? If yes, can you please connect us to them?

nmt 02 April, 2025, 02:30:41

Hi @user-9f4dff! Can you share a photo of your setup?

user-b31f13 02 April, 2025, 01:27:18

Hi @user-f43a29, 2 quick questions: I am looking to study pupil dilation (size) changes over time.

  1. I don't particularly see the need for using the diameter3d data, as I think 2d data on pupil size per timestamp should be enough for my purposes, and I sort of think that the 2d data would be more natural (i.e. untouched by further processing that could affect the data in unintended or unaccounted-for ways), unlike the 3d data. What are your thoughts on this?
  2. In trying to get the 2d diameter, there is of course the "diameter" data column in pupil_positions.csv, but I could also add the ellipse_axis_a and ellipse_axis_b columns and divide by two (which would give me a smaller number). Which of these would you say is a more accurate estimation of the participant's pupil diameter at each timestamp?
user-b31f13 02 April, 2025, 02:55:06

@nmt Please could you help me with my question?

nmt 02 April, 2025, 03:08:42

Hi @user-b31f13! For studying pupil size, we strongly recommend using diameter_3d. This is a measure of pupil size in millimetres, provided by our 3D eye model, as opposed to the apparent pupil size in pixels, as observed in the 2d eye image. The latter is affected by pupil foreshortening, meaning the camera perspective can significantly impact the measured pupil size. You can read more about this in the following section of the documentation: https://docs.pupil-labs.com/core/best-practices/#pupillometry

user-b31f13 02 April, 2025, 03:17:31

Hi @nmt Thanks for your response. If I am not sure that the model was fitted well throughout the study, should I then filter out data based on model confidence?

nmt 02 April, 2025, 03:35:52

Do you have the recordings? You can play them back in Pupil Player and check the fit. You can also do post-hoc pupil detection and model freezing.

user-b31f13 02 April, 2025, 03:40:34

@nmt I would like to keep the process efficient and standardised as I am working with many participants. Is excluding frames based on model confidence a bad way to go?

nmt 02 April, 2025, 03:50:02

Model confidence is a worthwhile measure. But best practice for pupillometry would be to ensure the model was frozen at the point when the model was well fitted for each participant. If this didn't happen at the time of recording, it can be done post-hoc in Pupil Player, assuming there was no headset slippage. In sum, the goal would be to ensure a good fit that's specific for each participant. This might ultimately require some manual inspection of the recordings.
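As a rough illustration of the confidence-based filtering discussed here, below is a minimal Python sketch over a Pupil Player pupil_positions.csv export. The column names (model_confidence, diameter_3d) follow the standard export format, but verify them against your own export, and the 0.9 threshold is an arbitrary example, not an official recommendation:

```python
import csv
import io


def filter_by_model_confidence(rows, threshold=0.9):
    """Keep only pupil samples whose 3D eye-model confidence meets the
    threshold. `rows` is an iterable of dicts, e.g. from csv.DictReader
    over a Pupil Player pupil_positions.csv export.
    """
    kept = []
    for row in rows:
        try:
            conf = float(row["model_confidence"])
        except (KeyError, ValueError):
            continue  # skip rows without a parsable model confidence
        if conf >= threshold:
            kept.append(row)
    return kept


# Tiny in-memory CSV standing in for a real export.
sample = io.StringIO(
    "pupil_timestamp,diameter_3d,model_confidence\n"
    "1.00,3.1,0.95\n"
    "1.01,5.8,0.40\n"
    "1.02,3.2,0.98\n"
)
good = filter_by_model_confidence(csv.DictReader(sample))
print([r["diameter_3d"] for r in good])  # -> ['3.1', '3.2']
```

Note that this kind of filtering complements, rather than replaces, the per-participant model freezing described above.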

user-b31f13 02 April, 2025, 04:00:10

@nmt When you say "if this didn't happen at the time of recording"... this happens automatically during recording, right?

nmt 02 April, 2025, 04:43:48

The model will tend to fit over time, yes. But freezing the model is a manual step.

user-b31f13 02 April, 2025, 05:13:56

Ok. Thanks

user-e0a306 02 April, 2025, 09:57:57

Hi, I am working with Pupil Core, sharing the setup with other coworkers. Since we work on different projects, there is the issue of recalibration after each person. Is there a possibility to save individual calibrations?

nmt 03 April, 2025, 02:41:11

Hi @user-e0a306! Pupil Core does need to be calibrated each time it is put onto a new wearer. Depending on the requirements of your study, you might also want to think about re-calibrating during each block of testing. You can read more about that in our best practices: https://docs.pupil-labs.com/core/best-practices/#best-practices. As an aside, our newest eye tracker, Neon, is calibration free and avoids this need entirely.

user-c39646 02 April, 2025, 13:50:10

hello, i am new and i am using Pupil Capture. i see the plugins in Network API, but when i look in Program Files i do not see the plugin folders. can you tell me where i can download the plugins please? i need Network API and Remote Recorder. thanks a lot

user-999a6c 04 April, 2025, 07:20:35

Hello! I really need support . I already created a ticket.

user-d407c1 07 April, 2025, 06:44:03

Hi @user-999a6c ! Please check out the replies on the ticket.

user-6a6d64 04 April, 2025, 07:38:37

Hi! I am having a problem with the Pupil Core on Mac. I run the Capture program with the corresponding command: sudo /Applications/Pupil\ Capture.app/Contents/MacOS/pupil_capture, but at some point the Capture application closes and I receive the following message in Terminal: Unhandled SIGSEGV: A segmentation fault occurred. This probably occurred because a compiled module has a bug in it and is not properly wrapped with sig_on(), sig_off(). Python will now terminate.

user-f43a29 04 April, 2025, 12:46:13

Hi @user-6a6d64 , could you provide a copy of the logs in /Users/<username>/pupil_capture_settings after the crash happens? Thanks!

user-c39646 07 April, 2025, 07:35:14

hi, how do i create a ticket to look for help please?

user-d407c1 07 April, 2025, 07:39:48

Hi @user-c39646 πŸ‘‹ ! You can do so on the πŸ›Ÿ troubleshooting channel, but kindly note that this is mostly meant for hardware issues.

user-d407c1 07 April, 2025, 07:51:55

You can simply ask here any question you may have πŸ˜‰

user-c39646 07 April, 2025, 07:59:46

oh great, i am new and i am using Pupil Capture. i see the plugins ("Network API") but they do not appear to work. indeed, when i look into the program folder i do not see the plugins. can you tell me where i can download the plugins please? i need "Network API" and "Remote Recorder".

user-d407c1 07 April, 2025, 08:15:57

@user-c39646 These plugins come pre-installed with the software, so there's no need to install anything extra.

Just open the Plugins Manager from the sidebar and enable them (see screenshot).

You can find more details in the documentation.

Chat image

user-c39646 07 April, 2025, 08:20:51

ok, so why do i not see them in the program folder?

user-d407c1 07 April, 2025, 08:47:54

They would not show in the program folder, because they are bundled together with the code. If you would like to check the code, it is open source and available in the repository.

May I ask how you are trying to connect? Is it from one computer to another computer?

user-c39646 07 April, 2025, 08:23:01

because they do not work... i am not able to connect to the IP address i see in the Network API

user-c39646 07 April, 2025, 08:23:11

local IP

user-c39646 07 April, 2025, 08:56:10

it's via Python on the same computer. i want to send a command saying when it has to start and stop calibration, along with other commands. here is a screenshot of the error and the plugins installed

Chat image

user-d407c1 07 April, 2025, 09:01:41

I think there might be a small misconception β€” you don’t need the Remote Recorder plugin for this. That plugin was mainly used with Pupil Mobile, which is now deprecated.

What you’re actually looking for is this:
πŸ‘‰ https://docs.pupil-labs.com/core/developer/network-api/#pupil-remote

Just make sure to update the IP address to match the one shown in the Network API plugin in Pupil Capture.
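For reference, the Pupil Remote commands in the docs linked above can be sent with a short pyzmq sketch along these lines. This assumes a default Pupil Capture install listening on port 50020; adjust host and port to what the Network API plugin actually shows:

```python
# Single-letter Pupil Remote commands, per the Network API documentation.
PUPIL_REMOTE_COMMANDS = {
    "start_calibration": "C",
    "stop_calibration": "c",
    "start_recording": "R",
    "stop_recording": "r",
}


def send_pupil_remote(action, host="127.0.0.1", port=50020, timeout_ms=2000):
    """Send one Pupil Remote command and return Pupil Capture's reply.

    Requires pyzmq (pip install pyzmq) and a running Pupil Capture
    instance with the Network API plugin enabled.
    """
    import zmq  # imported lazily so the command table is usable without pyzmq

    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.REQ)
    sock.setsockopt(zmq.RCVTIMEO, timeout_ms)  # fail fast if Capture is down
    sock.connect(f"tcp://{host}:{port}")
    sock.send_string(PUPIL_REMOTE_COMMANDS[action])
    return sock.recv_string()


# Usage with Pupil Capture running (not executed here):
#   send_pupil_remote("start_calibration")  # Capture acknowledges the request
#   ... wait for the choreography to finish ...
#   send_pupil_remote("stop_calibration")
```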

user-c39646 07 April, 2025, 09:10:18

ok, but i do need Remote Recorder if i want to connect and send commands via a second computer (which i will need to do afterwards), right? i do not see the error in my code compared to what you told me... i am trying to do a calibration in my code but i do not manage to do it, and i am using the local IP

pupil_vr_calibration.py

user-bd5142 07 April, 2025, 09:05:14

Hi I am using Pupil Core to collect data and preparing to draw a dynamic area of interest. Do you have any guidance on this? Thank you

user-d407c1 07 April, 2025, 09:10:26

Hi @user-bd5142 πŸ‘‹ !

Pupil Core doesn’t support dynamic areas of interest out of the box.

However, if you’re open to using fiducial markers (also known as Apriltags), you can take advantage of the Surface Tracker plugin in Pupil Player.

Alternatively, you might find this community project helpfulβ€”it explores a similar use case and could be a good starting point.

user-bd5142 07 April, 2025, 09:12:33

user-c39646 07 April, 2025, 09:26:01

sure, here is the screenshot; it cannot find Pupil Capture, but it is actually running

Chat image

user-d407c1 07 April, 2025, 09:37:05

As in, the routine is executed? May I ask, is Pupil Core actually connected? Mainly asking because the scene camera is not visible. Or is it a Core VR Add-on? πŸ₯½ core-xr

user-c39646 07 April, 2025, 09:43:39

the cameras are installed in a VR headset, exactly. I restarted the computer and it is now working; however, it tells me that calibration needs a world capture video input. i do not need that. can't i do calibration using just the eye0 and eye1 cameras?

user-d407c1 07 April, 2025, 09:45:03

Okay since this is a πŸ₯½ core-xr question, let's move the conversation to that channel.

user-c39646 07 April, 2025, 09:49:14

ok

user-c39646 07 April, 2025, 14:00:31

hi again, how do i do a five-point calibration? do you have any script in your library?

user-d407c1 07 April, 2025, 14:47:56

Hi @user-c39646 πŸ‘‹ !

The 5-point calibration is the default method used for screen-based choreographies:
https://docs.pupil-labs.com/core/software/pupil-capture/#screen-marker-calibration-choreography

That said, based on our previous conversation, I'm a bit confused about your current approach, particularly your renewed interest in performing a calibration and how you plan to perform it without a scene camera.

user-c39646 07 April, 2025, 14:00:41

i cannot find it, only the one-point calibration

user-c39646 07 April, 2025, 14:52:12

hi Miguel, you are right. i am actually testing several approaches, trying to understand how to work with and without calibration. i have to test the one you just suggested, but in the previous one, how does the calibration stop? and why do i have to click five times? thank you very much for your help

user-f43a29 08 April, 2025, 08:29:35

Hi @user-c39646 , I'm briefly stepping in for my colleague, @user-d407c1 .

I would recommend first working through the Getting Started guide.

To briefly answer your questions:

  • Calibration stops automatically when it has collected enough data points.
  • You do not need to click when doing a standard calibration; you actually do not want to click. Rather, look at each target until the next one appears. The calibration process will automatically handle the rest.

As @user-d407c1 mentioned, calibration is not necessary to measure pupil size, and the standard Pupil Capture calibration routine will not be usable in your VR setup without some modifications. For pupil size, you want a well-fit 3D eye model.

user-c39646 08 April, 2025, 09:42:14

hi, thank you very much. is there any Python script in your library that i can modify to adjust it to my task? also, is there a script in your library that simply tells it to start and stop recording? thank you very much in advance

user-f43a29 08 April, 2025, 14:22:04

Hi @user-c39646 , there is not really a single Python script that can be modified. The calibration process has several interlocking mechanisms and multiple parts of the Pupil Capture code play a role. If you are looking for a method that saves the most time, then using hmd-eyes in Unity would be easiest, especially if these concepts are new. In other words, with hmd-eyes, the hard work of adjusting calibration for your task has already been done.

A script for starting/stopping a Pupil Core recording can be found here:

More details can be found in this part of the Documentation.
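As a hedged sketch of what such a start/stop script boils down to: per the Network API documentation, Pupil Remote accepts "R" (optionally "R <session name>") to start a recording and "r" to stop it, sent over a ZMQ REQ socket to the Network API port (50020 by default). The helper below only builds the command strings; the commented portion shows how they would be sent with pyzmq to a running Pupil Capture:

```python
def recording_commands(session_name=None):
    """Return the (start, stop) Pupil Remote command strings.

    "R" starts a recording, "R <name>" starts it with a session name,
    and "r" stops it, per the Network API documentation.
    """
    start = "R" if session_name is None else f"R {session_name}"
    return start, "r"


# With pyzmq installed and Pupil Capture running (Network API on 50020):
#   import time, zmq
#   sock = zmq.Context.instance().socket(zmq.REQ)
#   sock.connect("tcp://127.0.0.1:50020")
#   start, stop = recording_commands("pilot_01")
#   sock.send_string(start); print(sock.recv_string())
#   time.sleep(5)  # record for five seconds
#   sock.send_string(stop); print(sock.recv_string())
```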

user-3f4a1b 08 April, 2025, 20:04:33

Hey there. I was looking into the DIY headset setup. Is there any way to get access to the 3d printing files for the components so that I can modify them for my setup?

user-f43a29 08 April, 2025, 22:23:55

Hi @user-3f4a1b , we have open-sourced the CAD files for the camera mounts. They can be found here.

user-3f4a1b 08 April, 2025, 22:24:57

Gotcha, so the mounts are open source but the eyeglass frame isn't, correct?

user-f43a29 08 April, 2025, 22:26:06

@user-3f4a1b That is correct. The frame is not open-source. You can, however, order just the frame here.

user-3f4a1b 08 April, 2025, 22:28:10

Will do! I am starting to design an experiment and wanted to have a few different hardware setup options to find the best fit. I found the Pupil system while doing the research.

user-f43a29 08 April, 2025, 22:29:29

Ok! If you'd also like to see how our eyetrackers work in real-time, then feel free to schedule a Demo and Q&A call.

user-3f4a1b 08 April, 2025, 22:30:56

Thank you

user-f43a29 09 April, 2025, 09:39:31

pupil-helpers/python/remote_annotations....

user-69c9af 09 April, 2025, 13:28:22

Hello, I was searching for a way to open Pupil Player without the interface (with Python, for example), as there is some lag when I load the file. Has someone done that before? Also, I wanted to know if it is possible to have triggers from another software in the annotations file? I am currently using LSL but struggling with that part. If I am in the wrong channel for this question, please tell me and I can ask it in the right one.

user-f43a29 09 April, 2025, 22:05:31

Hi @user-69c9af , you are in the correct channel.

So, using Pupil Player's routines to export data without the GUI is not supported. Are you potentially loading long recordings? And, what Operating System are you using?

And, yes, you can send Annotations from other software to Pupil Capture during a recording. If it has a Python interface, then it is easiest to follow our documentation on Pupil Core's Network API, but if not, then any ZMQ interface will do.

While our LSL plugin does not record Annotation data, you can send standard Annotations to Pupil Capture and then post-hoc synchronize them to the LSL timeline.

Or, are you rather referring to transforming the triggers in the data file of a different device to Pupil Core's Annotation format?
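For the real-time Annotation route, a minimal sketch modeled on the pupil-helpers remote_annotations.py example follows. The field names come from that script, but verify them against the current helper before relying on them:

```python
def make_annotation(label, timestamp, duration=0.0, **extra):
    """Build an annotation payload in the shape used by the pupil-helpers
    remote_annotations.py example. `timestamp` must be in Pupil time
    (query it with the "t" Pupil Remote command), not wall-clock time.
    """
    annotation = {
        "topic": "annotation",
        "label": label,
        "timestamp": timestamp,
        "duration": duration,
    }
    annotation.update(extra)  # custom fields become extra columns in the export
    return annotation


note = make_annotation("stimulus_onset", 12.5, trial=3)

# Sending it requires pyzmq + msgpack and a running Pupil Capture:
#   import zmq, msgpack
#   ctx = zmq.Context.instance()
#   remote = ctx.socket(zmq.REQ)
#   remote.connect("tcp://127.0.0.1:50020")
#   remote.send_string("PUB_PORT"); pub_port = remote.recv_string()
#   pub = ctx.socket(zmq.PUB)
#   pub.connect(f"tcp://127.0.0.1:{pub_port}")
#   pub.send_string(note["topic"], flags=zmq.SNDMORE)
#   pub.send(msgpack.packb(note, use_bin_type=True))
```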

user-6a6d64 10 April, 2025, 13:23:03

Sorry, you are right; I used it a second time and everything worked fine. If this happens again, I will send you the log file. Thx!

user-f43a29 10 April, 2025, 13:25:28

No problem. If you encounter the issue again, just send the logs here!

user-6a6d64 30 April, 2025, 07:28:25

@user-f43a29 I had the problem again on my Mac; the main pupil player closed and the message I got was: world - [WARNING] uvc: Turbojpeg jpeg2yuv: b'Invalid JPEG file structure: two SOI markers' eye0 - [WARNING] uvc: Turbojpeg jpeg2yuv: b'Invalid JPEG file structure: two SOI markers' eye1 - [WARNING] video_capture.uvc_backend: Camera disconnected. Reconnecting...


(no backtrace available)

Unhandled SIGSEGV: A segmentation fault occurred. This probably occurred because a compiled module has a bug in it and is not properly wrapped with sig_on(), sig_off(). Python will now terminate.


eye0 - [WARNING] video_capture.uvc_backend: Camera disconnected. Reconnecting... (no backtrace available)

Unhandled SIGSEGV: A segmentation fault occurred. This probably occurred because a compiled module has a bug in it and is not properly wrapped with sig_on(), sig_off(). Python will now terminate.


world - [WARNING] uvc: Turbojpeg jpeg2yuv: b'Corrupt JPEG data: 658 extraneous bytes before marker 0xd7'

user-e0a306 14 April, 2025, 10:52:06

Hi, if anyone has already worked with AprilTag detection with Core, are there any code examples available?

user-d407c1 14 April, 2025, 11:26:16

Hi @user-e0a306 πŸ‘‹ ! Do you mean besides the Surface Tracker?

user-e0a306 14 April, 2025, 11:34:22

Hi, yeah, I have not seen this one exactly, but I would like to get acquainted with some diverse usage examples

user-c6248a 21 April, 2025, 07:51:21

Hello! Thank you for the great work on Pupil Core. I’m currently conducting an experiment using Pupil Core that involves real-time gaze position tracking. Since the participant’s head is fixed in place during the experiment, the world camera data introduces unnecessary noise. Is there a way to perform calibration and measurement without using the world camera?

user-f43a29 22 April, 2025, 11:49:51

Hi @user-c6248a , could you clarify the noise that you are seeing? Then we can look into ways to resolve it. The world camera of Pupil Core is indeed necessary for Pupil Capture's calibration pipeline and also for Surface Tracking.

user-22630f 22 April, 2025, 22:53:24

Hi, I am trying to use Pupil Core on MS Surface Pro 11, but it cannot see the ET or world-facing cameras. Do you have any suggestions to get this working?

user-f43a29 23 April, 2025, 09:49:47

Hi @user-22630f , we also received your email and responded there.

To clarify, Pupil Capture is not supported on ARM architectures, such as that used in the Microsoft Surface Pro 11.

If you intend to explicitly use Pupil Core with that device, then you could also try this third-party package to stream Pupil Core's camera feeds to a remote computer that is running the Pupil Capture software.

user-cc5ee8 23 April, 2025, 17:34:59

hi

user-cc5ee8 23 April, 2025, 17:35:53

i want to integrate this device with jsPsych. any idea how to do it? fully confused

user-f43a29 24 April, 2025, 07:22:02

Hi @user-cc5ee8 , what is the ultimate goal of your research?

I ask because communication with Pupil Core happens over ZMQ. With jsPsych, being that it runs in a browser, you would typically use the jszmq package, but it requires using ZMQ over WebSockets, which is not explicitly supported by Pupil Core's Network API. It might work, but I cannot guarantee it. Otherwise, at the moment, I have not found an alternative solution for ZMQ in the browser/JavaScript. In any event, it would also require some coding effort to write a compatible jsPsych plugin.

If you are not doing a real-time, gaze-contingent task, then you could probably just run a standard Pupil Capture recording with the Surface Tracker. If you need to know when trials start and end, then the easiest solution might be to have jsPsych send a notification over a WebSocket to a listening Python script, which then forwards an Annotation to Pupil Capture via our standard Network API commands.
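Such a bridge could be sketched as follows. The jsPsych-side field names ("event", "trial") are purely hypothetical here, and the third-party websockets / pyzmq plumbing is left as comments:

```python
import json


def trial_event_to_annotation(message, pupil_time):
    """Translate a jsPsych trial event (a JSON string arriving over the
    WebSocket) into a Pupil annotation dict. The 'event' and 'trial'
    field names are hypothetical; use whatever your jsPsych code sends.
    """
    event = json.loads(message)
    return {
        "topic": "annotation",
        "label": f"{event['event']}_trial_{event['trial']}",
        "timestamp": pupil_time,  # must be converted to Pupil time
        "duration": 0.0,
    }


note = trial_event_to_annotation('{"event": "trial_start", "trial": 1}', 12.5)

# The listening bridge itself could look like this (third-party
# `websockets` package; send_to_pupil_capture / current_pupil_time are
# placeholders for the Network API plumbing):
#   import asyncio, websockets
#
#   async def handler(ws):
#       async for message in ws:
#           send_to_pupil_capture(
#               trial_event_to_annotation(message, current_pupil_time()))
#
#   async def main():
#       async with websockets.serve(handler, "localhost", 8765):
#           await asyncio.Future()  # serve forever
#
#   asyncio.run(main())
```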

user-51e172 28 April, 2025, 08:46:12

Hi there, We're suddenly unable to calibrate our pupil core. The red dots flicker and jump around on the screen wildly. Any ideas what could have caused that?

user-d407c1 28 April, 2025, 08:57:01

Hi @user-51e172 πŸ‘‹ ! When you refer to the red dot, do you mean the 3D pupil detection on the eye cameras, the gaze point, or something else?

And by unable to calibrate, do you mean the calibration routine fails?

To better assist you, can you share some video showcasing this behavior, or even better, a short recording made in Pupil Capture? You can share it with data@pupil-labs.com

user-e6fb99 28 April, 2025, 20:02:09

Hello all,

I am Oindrila from Georgia Tech. I am currently trying to use the Pupil Labs Core. I have an Arduino-based system (used for a motor learning task in humans). I want the Pupil Labs data to have timestamps of events (e.g. the person picked up the disc; I have access to this timestamp in MATLAB, as I use it to control the Arduino system). What would be the best way to add these annotations to the Pupil Labs data?

user-0cb368 29 April, 2025, 03:28:37

Hi, I'm trying to load a recording directory into Pupil Player on macOS, but when I drag and drop the folder onto the Player, it either says Pupil Player cannot open files in the folder format or it just closes the app. I'm very new to this and I'm not sure what's wrong; is there a way to fix this?

user-d407c1 29 April, 2025, 13:56:02

Hi @user-0cb368 ! We have replied to you on the ticket.

user-d407c1 29 April, 2025, 14:04:20

Hi @user-e6fb99 πŸ‘‹ !

Pupil Core's Network API uses the ZeroMQ protocol.

If you are able to use zmq and msgpack as described here, you can follow this example.

Alternatively, on newer versions of MATLAB you can use Python directly, see here.

This would allow you to use pyzmq and follow this example.

user-e6fb99 30 April, 2025, 10:25:45

Both of the example links take me to the MathWorks page. Do you have a GitHub example?

user-e6fb99 29 April, 2025, 14:10:14

I will try this out. Would it work if the MATLAB/Python is on the same computer as the Pupil Labs software?

user-d407c1 29 April, 2025, 14:10:34

sure

user-e6fb99 29 April, 2025, 14:12:10

Thanks a lot. I will let you know if it works and if I have further questions.

user-e6fb99 29 April, 2025, 18:32:54

Using this, I get an error stating that zmq.core.ctx_new() is a script but is being called as a function. Should I make any changes?

user-e6fb99 29 April, 2025, 18:33:16

I am running this in MATLAB 2017b

user-e6fb99 29 April, 2025, 18:53:23

I understand that we need to install zmq for this process to work. Can you help with how to do that? Thanks again.

user-d407c1 30 April, 2025, 07:49:17

Hi @user-e6fb99 ! Do you already have the matlab-zmq and matlab-msgpack packages installed and working?

user-e6fb99 30 April, 2025, 09:45:14

I have downloaded the packages from GitHub and have them on my MATLAB path. When I open configure.m from one of the folders, there are comment lines that ask me to install zmq. How can I install it? Please help.

user-d407c1 30 April, 2025, 07:50:16

Hi @user-6a6d64 ! Could you kindly create a ticket on πŸ›Ÿ troubleshooting ?

user-6a6d64 05 May, 2025, 09:49:33

Sorry, I am not able to create the ticket. I get the message that I don't have the right permission

user-d407c1 30 April, 2025, 09:51:54

You would need to install zmq via one of these ways: https://zeromq.org/download/

If that approach is not working, I suggest going through the Python interface in MATLAB.

user-e6fb99 30 April, 2025, 09:53:40

I tried downloading it from there. The issue is, I couldn't find the appropriate package for Windows.

user-e6fb99 30 April, 2025, 09:54:05

Can you guide me to the appropriate Windows package?

user-d407c1 30 April, 2025, 10:14:40

You can find the binaries here; use the ZIP for Windows.

That said, as noted in the README, the code using matlab-zmq has only been tested on Ubuntu, and some users have reported issues on Windows.

If you're on Windows, you might find it easier to use MATLAB’s Python interface with pyzmq instead as suggested earlier.

user-e6fb99 30 April, 2025, 10:25:01

Okay sounds good. I will try the python based solution.

user-d407c1 30 April, 2025, 10:26:46

Sure, it seems I copy-pasted the same URL twice πŸ˜… , here you have it: https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py

user-e6fb99 30 April, 2025, 14:21:04

Thank you. I will let you know once I install the latest Matlab and try it out.

user-a7bee8 30 April, 2025, 13:11:31

Hi there. Is world_index an available data element on the gaze data when using the Network API in Pupil Capture? Trying to associate outputs of the gaze tracking to the world video frames. Thanks! πŸ™‚

user-f43a29 13 May, 2025, 15:19:35

Hi @user-a7bee8 , to clarify, you are trying to do the association in real-time? If so, the incoming video frame and gaze data are timestamped, so you can simply use all gaze data that are newer than the current world frame. You can use this tutorial as a reference.
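The matching logic described here amounts to a simple timestamp comparison. A minimal sketch, assuming gaze samples arrive sorted by timestamp (the data structures here are illustrative, not the actual Network API message format):

```python
def gaze_for_frame(gaze_samples, frame_ts, next_frame_ts=None):
    """Return the gaze data belonging to the world frame at frame_ts.

    gaze_samples: iterable of (timestamp, datum) pairs, sorted by time.
    A gaze datum belongs to a frame if it is not older than that frame
    and, when the next frame's timestamp is known, older than that one.
    """
    matched = []
    for ts, datum in gaze_samples:
        if ts < frame_ts:
            continue
        if next_frame_ts is not None and ts >= next_frame_ts:
            break
        matched.append(datum)
    return matched


gaze = [(0.01, "g0"), (0.02, "g1"), (0.05, "g2"), (0.07, "g3")]
print(gaze_for_frame(gaze, frame_ts=0.02, next_frame_ts=0.06))  # -> ['g1', 'g2']
```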

End of April archive