Hello, I wanted to know if wearing contact lenses is compatible with the use of Pupil Core? I know that wearing glasses is not possible.
Hi @user-69c9af! Yes, wearing contact lenses is absolutely fine when collecting data with Pupil Core.
Thank you for answering!
Hello, As part of an academic study, I am interested in examining the impact of eye movements on piano performance while reading sheet music, using the Pupil Core eye-tracking system. However, I am encountering a problem during the calibration process using the Calibration Marker (v0.4 marker design) — the results are inaccurate, even when using both the 2D and 3D pupil detection settings. I have ensured that the system is set to "single marker" mode, and prior to calibration, I perform a region of interest (ROI) detection for the participant's pupils. During calibration, I present the participant with 9 to 11 different points and ask them to fixate on the center of each marker for approximately 3 seconds. Nevertheless, after calibration, when I ask the participant to look at a specific point on the sheet music, the system indicates that they are looking elsewhere — sometimes significantly far from the actual point. I would greatly appreciate any help or guidance you can provide. Thank you very much in advance.
Hi @user-8d541b! I think it would be helpful if you could share a recording with us that contains the calibration choreography, such that we can provide concrete feedback. Please follow the steps in this message: https://discord.com/channels/285728493612957698/1340009042021712002/1341232607895224323
I’ll send an email with the recording later today. Thank you!
Hi @user-8d541b! Thanks for sharing the recording – it’s very helpful. Here are my feedback points:
1. Pupil detection and 3D model – These look great. One minor optimisation would be to ensure the pupils are always visible during model building (e.g. when rolling your eyes). An easy way to do this is to keep your eyes fixated on one point (such as the pupils in the eye overlay window) and then move your head in a spiral pattern. You’ll get the same result as keeping your head still and rolling your eyes, but you can check that the pupils stay clearly visible.
2. Calibration – Here’s where you’re going wrong. You’re currently holding a physical marker, which is actually a bit too big, but aside from that, you move the marker and then move your whole head to gaze at it. This isn’t really achieving much. The goal is to capture different gaze angles, but right now your eyes are neutral and aligned with the direction of your head as you move it.
I’d strongly suggest using the default screen-based calibration routine to start with. Initiate that, keep your head completely still, and follow the target with your eyes. That should already give you much better accuracy. Give it a try and let me know how you get on!
@nmt we are trying to access the Pupil Core glasses via the USB, but using Docker, and it works on Linux and Mac but not on Windows; is there a way we could do it?
Running through docker on any platform isn't really a supported workflow. We distribute builds for Linux, Mac, and Windows that are meant to be installed directly on your host OS
Ok, thanks @user-cdcab0
Hello, I am conducting academic research in which I am trying to control a collaborative robot using gaze from the Pupil Core eye tracker. Unfortunately, I've come across some points of confusion that I haven't been able to clarify using the documentation. I would greatly appreciate it if you could help me by answering the following questions: 1) Is the gaze_point_3d value returned by the eye tracker expressed in centimeters, meters, or pixels? 2) As I understand it, gaze_point_3d returns coordinates in the form [x, y, z], where z represents the optical axis. Can this z coordinate be interpreted as the distance to the object the person is looking at?
Hi @user-7be139! Gaze point 3d's unit is mm. It represents the nearest intersection of the left and right eyes' visual axes. In terms of viewing depth, I've written about this before, so you'll want to check out those messages: https://discord.com/channels/285728493612957698/285728493612957698/1140575743609409586
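For anyone following along, here's a minimal sketch (not official sample code) of reading gaze_point_3d over the Network API, assuming Pupil Capture is running locally with the Pupil Remote plugin on its default port 50020:
```python
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote (default port 50020) for the SUB port
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to gaze data
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = sub.recv_multipart()
    gaze = msgpack.loads(payload)
    # gaze_point_3d is in mm, in the world camera coordinate system,
    # and is only present when the 3D gaze mapping pipeline is used
    point_3d = gaze.get("gaze_point_3d")
    if point_3d is not None:
        x, y, z = point_3d
        print(f"gaze_point_3d (mm): x={x:.1f}, y={y:.1f}, z={z:.1f}")
```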
Thank you very much!
Hello everyone! I ran into some problems when building Pupil Capture on a Raspberry Pi 5. I set up a virtual environment and could not build the pupil_detectors package. P1 shows the output of test_api.py. I tried changing the Cython and numpy versions and rebuilding the package; P2–P4 show the output when building pupil_detectors from source. I also tried changing the Cython file ···/src/pupil_detectors/detector_2d/detector_2d.pyx from 'from numpy.math cimport PI' to 'from libc.math cimport M_PI as PI', but that didn't work either. Is there anything wrong with my build process? Note that I created this virtual environment with the '--system-site-packages' flag. P5 shows the full list of packages in this environment, and I also tried building another environment without that flag, but it did not work either. Many thanks for the help and assistance!
Hi @user-b02f36! The Core software stack is not natively supported on ARM. You might want to check out this message for further information/alternatives: https://discord.com/channels/285728493612957698/285728493612957698/1354350692688592947
Thank you for your reply, Neil. I'll try other methods like Docker. BTW, is it possible for the Raspberry Pi to receive pupil messages using the Network API? According to the Network API docs, it seems possible when the Pi and a PC running Pupil Capture are on the same WLAN.
Streaming via the network API should be possible. Although I don't think we've explicitly tested it. So you might have to give it a try!
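For what it's worth, a rough sketch of what that could look like on the Pi, assuming Pupil Capture runs on a PC reachable at, say, 192.168.1.50 (hypothetical address) with Pupil Remote on its default port:
```python
import zmq
import msgpack

PC_IP = "192.168.1.50"  # hypothetical IP of the PC running Pupil Capture

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect(f"tcp://{PC_IP}:50020")  # Pupil Remote default port
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://{PC_IP}:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "pupil.")  # pupil data only

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.loads(payload)
    print(topic.decode(), datum.get("diameter_3d"))
```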
Hi, Neil. It works: the Pi can subscribe to the ZeroMQ messages and print the pupil data when it's connected to the same WLAN as the PC. Again, many thanks for your help! 🙂
Great to hear! Glad it's working for you 🙂
Hi! Every now and then the recording does not work, and the phone vibrates. It's annoying because the user doesn't know why it's vibrating, so they never tell me, and the recording doesn't work. How can I minimize how often that happens?
@user-d4e38a In the meantime, please also open a Support Ticket in the 🛟 troubleshooting channel.
Hi @user-d4e38a , may I first ask which eyetracker you are using?
Hello everyone! I have a question regarding the use of markers for a usability test. I need to test a university website using Pupil Labs eye-tracking glasses. I’ve tried using manual markers (e.g., AprilTags), but I noticed that unless I use fairly large markers—which end up covering part of the screen—I can’t consistently generate heatmaps or track gaze data properly.
Additionally, the website involves a lot of scrolling, which causes the markers to go out of view and disrupts AOI (Area of Interest) tracking.
My goal is to extract relevant usability data from five participants. I would like to ask:
• What would you recommend in order to still generate useful heatmaps and use AOIs, given the heavy scrolling?
• Is it better to avoid using markers in this case?
• Also, do I need to recalibrate the eye-tracking glasses for each participant?
From what I understand, calibration is required for each person to ensure accurate gaze data, since the glasses fit differently and eye geometry varies.
Any advice would be appreciated. Thank you!
Hi @user-999a6c! Thanks for your question. From your description of the markers taking up too much space and going out of view, it sounds like you might have displayed them digitally on your screen. Did you know it’s also possible to print them out and attach them to the corners of your screen? That should help resolve those issues. Regarding calibration, yes, it’s necessary to calibrate the glasses for each wearer. You can read more about best practices for this in our documentation.
I already created a ticket
Hello! I use physical markers stuck to the corners of the screen, but they take away from my visibility, and I have another problem on the site: I have to constantly scroll.
Could you please help me?
Hi @user-999a6c , I'll briefly hop in for Neil; he's currently at a conference and we typically take some time off on the weekends. Although, always feel free to post questions!
If you could share a picture of your monitor with the AprilTags, then I can provide brief feedback on how to improve their positioning.
With respect to the need to constantly scroll, it will require some custom code to develop a heatmap for the whole page. You can see this Alpha Lab article for how we did it for Neon. Many of the principles are the same, and much of the code could be used unchanged.
Hello @user-f43a29 Is it possible to discuss this in the chat that was created for my ticket?
Hi @user-999a6c , we typically keep general questions about usage here in the public channel, so that other users can also benefit from the information. I can start a thread instead, if you'd like.
Sure
Hello,
I’m using a Pupil Core headset with 3 cameras (World, Eye 0 – right, Eye 1 – left).
I previously had occasional connection issues with Eye 0 (right), where touching the cable would restore the image. But now, Eye 1 (left) stopped working suddenly while the headset was sitting still on my desk. It briefly came back for a few seconds and then failed completely.
The camera itself is not broken – it works fine if plugged into another port. I tested it on two computers, with drivers properly installed (libusbK via Zadig). Eye 1 is detected in Device Manager, receives power (gets warm), but always shows 0 FPS and no video.
This seems to be a hardware failure in the Eye 1 connection (either port or internal cable). Can you advise whether this can be repaired or replaced, and what my options are?
Thank you in advance.
Hi @user-999a6c , I've posted a debugging step in the Support Ticket you had opened.
Hello! I have been using Pupil Core and Pupil Capture for a year now and it works very well! I record BrainVision EEG and pupillometry from Pupil Core with Lab Streaming Layer (LSL) in order to line them up in time. Recently, Pupil Capture has been taking up A LOT of my CPU. I don't need the world camera, so I thought to use Pupil Service. However, I can't seem to send Pupil Service pupillometry data to LSL. Do you have any recommendations for reducing Pupil Core/Pupil Capture's strain on the CPU, while still being able to line up with EEG markers through LSL?
Hi @user-44f055! Pupil Service is mainly intended to be used with VR/AR applications that don't require scene video. If memory serves, it doesn't have an LSL plugin like Pupil Capture does. To confirm, you're only streaming pupillometry data via LSL, not gaze data, correct?
Yes Neil, I am not concerned with gaze. And Pupil Capture sends three streams to LSL: "Gaze only", "Pupil Capture", and "Pupillometry only". I record the "Pupillometry only" stream via Lab Recorder.
Thanks for confirming. Are you calibrating Core and obtaining gaze data during your streaming?
(even if you're only using the pupillometry only LSL stream)
When I open up the pupillometry-only file, the data contained is: confidence, norm_pos_x, norm_pos_y, right diameter 2D, left diameter 2D, right diameter 3D, left diameter 3D. So I guess it is collecting X,Y,Z data. I am not using that in my analysis. I only use 3D diameter with their gaze fixed.
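For reference, a hedged sketch of pulling that stream directly with pylsl instead of Lab Recorder; the stream name and channel order below are taken from the description above, so verify them against what your own setup actually advertises:
```python
from pylsl import StreamInlet, resolve_byprop

# Resolve the pupillometry stream by name (assumption: the name matches
# what Lab Recorder shows, e.g. "Pupillometry only")
streams = resolve_byprop("name", "Pupillometry only", timeout=5)
if not streams:
    raise RuntimeError("Pupillometry stream not found on the network")

inlet = StreamInlet(streams[0])
while True:
    sample, timestamp = inlet.pull_sample()
    # Assumed channel order: confidence, norm_pos_x, norm_pos_y,
    # diameter 2D right/left, diameter 3D right/left
    confidence = sample[0]
    diameter_3d_right, diameter_3d_left = sample[5], sample[6]
    print(timestamp, confidence, diameter_3d_right, diameter_3d_left)
```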
Hi Maggie. To re-phrase my question, do you perform a calibration before collecting data? You can read about the calibration process in this section of the docs. I ask because the calibration process and subsequent gaze data being generated can significantly increase CPU load.
Oh thanks, no I do not perform calibration. I do reset and fix the pupil model, but to the best of my knowledge, calibration does not apply to me
How do I stop gaze data from being generated? Even if I don't perform calibration, there is still the red dot in the world view in Pupil Capture that gives an approximation. I tried, but I don't see an option besides Pupil Service to only have 3D pupil data collected
Understood. Using just the pupil data should not really use much CPU. If the gaze circle is indeed still present, then it will be using a previous calibration. I think you can try restarting Capture with default settings. You can find that in the main settings plugin (cog icon). That should clear the previous calibration and potentially reduce workload. Let me know if that works!
Thanks! Resetting to default settings does indeed get rid of the red circle. I will follow up if unresolved and I have more questions... Thanks!
Hello,
I would like to get some guidance regarding a question that I have. I got the error: "An unexpectedly large amount of pupil data (>20%) were dismissed due to low..." Yet, the accuracy of the calibration was 0.28 and the precision was 0.08. Is it meaningful to keep the data although I got the error above? I've attached a short video of it.
Thank you in advance, Panos
Hi @user-6c9087! On the face of it, that calibration sequence and resulting accuracy seem great. Are you sure that's the same recording that generated the dismissed pupil data message?
Hey Neil! Thanks for the response. Yes, I am sure. Also, it happens quite often. Whenever I get this error, I look at the precision and accuracy scores. If it's below, let's say, 0.4, then I keep going. So, based on my understanding, even if this error arises, if the accuracy and precision are low, the data remain meaningful. If the errors are higher (e.g., more than 0.8), I restart the experiment and try to fix the cameras.
Also, do you know where in the recorded dataset I can find the calibration results (e.g., precision & accuracy) and the confidence results (which relate to the error I get) for each participant?
Okay. To be honest, 0.4 degrees of accuracy is already very good. Of course, it depends on your experimental requirements, but if you’re regularly achieving this level of accuracy and your data looks good, that’s definitely a positive.
The accuracy and precision results are saved in the accuracy visualiser menu and in the capture log file. It’s easiest to note these down at the time of calibration or validation, as they’re not saved with each recording. It's possible to make an arbitrary number of recordings using the same calibration, and since calibration accuracy can degrade over time, e.g. due to slippage, it's recommended to perform regular calibrations and record the values if they're important to you.
Confidence values, however, are stored with each recording and are saved in the pupil_positions.csv file.
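As an illustration, a small sketch of checking how much pupil data falls below a confidence cutoff in a Pupil Player export (the path and the 0.6 threshold are just examples):
```python
import pandas as pd

# Hypothetical export path; adjust to your recording's export folder
df = pd.read_csv("exports/000/pupil_positions.csv")

threshold = 0.6  # illustrative cutoff; choose what suits your analysis
low_fraction = (df["confidence"] < threshold).mean()
print(f"{low_fraction:.1%} of pupil samples fall below confidence {threshold}")
```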
Hey Neil, thanks for your answer.
So these numbers (accuracy & precision) are only shown during the recording and not saved elsewhere? Where can I find the accuracy visualiser menu and the capture log file (sorry for my ignorance)?
Also, when the confidence values are low, does the eye tracker still record fixations? Or do I need to manually filter the fixation datasets based on the confidence values prior to running the analyses for the fixations?
Thank you in advance for your help. Panos
No problem. Correct. The accuracy visualiser is a plugin in Capture (see the screenshot). The log file is in the pupil_capture_settings directory. You can search for it on your computer.
Only high-confidence data are included in the fixation detection. There is a column with specific confidence values in the fixations.csv export.
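And similarly, a small sketch of filtering the fixations export by that confidence column (the path and threshold are again illustrative):
```python
import pandas as pd

fixations = pd.read_csv("exports/000/fixations.csv")  # hypothetical export path

# Keep only fixations whose confidence exceeds an illustrative threshold
good = fixations[fixations["confidence"] >= 0.8]
print(f"kept {len(good)} of {len(fixations)} fixations")
```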
Hey there, for my master's thesis I wanted to use the eye-tracking data from Pupil Labs (with the Core eye-tracking device) to control an interface in Max/MSP. I programmed a script in Python with ChatGPT to make the udpreceive object work with the API from your docs. The only problem is that the data I'm getting doesn't make sense. I double-checked everything with a friend who is also a programmer, and he recommended (after we didn't find any mistake in the code) that I ask support. The data jumps and is not related to where my eye is moving, etc. Can someone help me with this?
Hi @user-28d52d , would you be able to provide a screen capture of the eye camera feed and the calibration process in Pupil Capture? Alternatively, you can make a Pupil Capture recording of the setup & calibration phase and upload to Google Drive. You can then share it with [email removed]
On wednesday I will be back at the laboratory and then I will do that, Thanks!
Hello, I will be using PsychoPy to display a gaze-contingent visual search paradigm and will be recording eye data using Pupil Core. I would appreciate some help in setting up the eye tracker in PsychoPy. I am now using a skeleton calibration+validation routine (in Builder) only to check the integration with Pupil Labs. I keep getting an error in which validation.run() currently fails with:
File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\iohub\client\eyetracker\validation\procedure.py", line 1141, in _processMessageEvents self.sample_type = self.saved_pos_samples[0][0].type IndexError: list index out of range
This indicates that no gaze samples were received from Pupil Labs to Psychopy. Why is this happening?
Also, is it possible to run the calibration/validation routines in Pupil Capture and then get eye-data into Psychopy when the task starts? I was displaying Apriltag markers using the Psychopy code, but that would mean they would only appear when the task runs on Psychopy and cannot be configured with Pupil Capture software.
Thank you for your help!
Hi @user-92a86b , we responded via email. To briefly answer some of your questions here:
Hi @user-92a86b , we also received your email and will be following up there. Thanks!
Hello, I am currently using hmd-eyes with Unity, building a framework that uses 4 tags for visual experiments, and I was wondering whether Pupil Capture sends surface data (number of tags detected, surface confidence/stability, etc.), so that I could manage the gaze data accordingly, knowing whether or not it is trustworthy.
Thank you in advance!
Hi @user-e16f11 , you want to subscribe to the surface topic and there you can get some of this info. I recommend reading through this thread for tips on how to do it in Unity: https://discord.com/channels/285728493612957698/1248580630430875661/1263883422695424051
Let us know if you have any questions about that.
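To make that concrete from the Python side (the same topic filter applies when subscribing from Unity/hmd-eyes), here's a minimal sketch that subscribes to surface events and prints the fields each datum carries, which is the easiest way to see what information is available; the "surfaces." prefix is my assumption based on the Capture Network API, so double check it against the docs:
```python
import zmq
import msgpack

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "surfaces.")  # one message per detected surface

while True:
    topic, payload = sub.recv_multipart()
    surface = msgpack.loads(payload)
    # Inspect the available keys (e.g. gaze mapped onto the surface) to decide
    # which fields you want to forward into your Unity framework
    print(topic.decode(), sorted(surface.keys()))
```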
Hello, I would like to inquire whether subjects using Pupil Core can wear myopia glasses or contact lenses. When we measure while the subject is wearing glasses, there is a deviation in the line of sight. Thank you in advance!
Hi @user-c4e867 , you can use Pupil Core with contact lenses. Otherwise, there are users who have made it work with glasses, but it can be a bit tricky because of the limited nose bridge real estate and the frames of the glasses potentially blocking the eye cameras.
To that end, our latest eyetracker, Neon, has a frame with swappable magnetic lenses to resolve this issue for people who must wear a prescription. It is called "I can see clearly now".
Thanks, taking a look. Also, when you have this all working, it would be great to share the end result in the 📸 show-and-tell section!
Hi @user-28d52d . Thanks, but would it be possible to do a screen capture of just the Pupil Capture software while you are wearing the eyetracking headset and doing the 3D eye model fit and calibration routine?
@user-f43a29 is this what you need?
here is the video
yes will do 🙂
@user-f43a29 I gave you the access
Hi @user-28d52d , thanks. At first glance, the gaze data & calibration results seem reasonable, but is it possible to also include the eye camera windows in the screen capture? This will help determine if everything is set up as expected.
Thanks for including the eye videos. So, the pupil detection, the 3D eye model, and the calibration process all look good. You might want to double check that automatic exposure is enabled on the eye cameras, as they look a bit overexposed, but as mentioned, the gaze data after you do the calibration also look reasonable.
If you use this setup with your Max/MSP program, then do you still see the jumping? It might be helpful to now share what the jumping looks like in your program.
yes I still see the jumping and this is what makes no sense
Are you trying to map gaze to the screen with Pupil Core's Surface Tracker plugin?
also the numbers are unreasonable if I try to translate them, they never make sense
nope, I just wanted to use the gaze data to control x & y axis in max msp
Ok, that is fine. Is the jumping looking like sudden step changes in x/y? As if the eye has suddenly shifted to a new position and then hovers there a bit?
with the eyes, so this should be possible right?
no the jumping even appears when the eye is not moving at all
What is the magnitude of the jumping? Can you share a screen capture of it? Is that what is shown in the histograms in the bottom right of this video (https://discord.com/channels/285728493612957698/285728493612957698/1377278345938600031)?
yes
sometimes there's even radically more jumping
To be certain I understand, do you mean the bigger jump for arrow 1 or the smaller jump for arrow 2?
1
Ok, that potentially looks like what is called a saccade, when your eye has moved to fixate a new point.
It could be useful in this case to run a Pupil Capture recording while looking at the Max/MSP interface and then play it back in Pupil Player to see if the jumps in the histograms correspond to actual moments when the eye shifts to a new position/fixation.
Have you already done that? If so, you can share the folder for that Pupil Capture recording.
I included a new video to show how the data goes wild when I'm not moving my eyes at all; I only looked at one point
I see now what you mean. It looks like those histograms are plotting the output of the 2D pupil diameter pipeline, so pupil diameter in units of pixels? If so, that is different from gaze x/y.
even if so, why do they jump? why are they moving when I'm not moving my eyes?
Hi @user-28d52d , I gave your Python script a try here. I get similar results as you and they are reasonable. I tested by making the main Pupil Capture window smaller and placing it just above the latest console output in the terminal. The fluctuations in the gaze point qualitatively match with the fluctuations in the displayed values. Please note the following:
Even when trying their utmost to hold their eyes still, a person's eyes will still exhibit some drift, as well as small shifts. However, without knowing the scales on those histograms and sliders, I cannot say for certain how Max/MSP is interpreting the values. Also, you may want to double check the "byte order" (i.e., endianness) that Max/MSP expects when receiving data via udpreceive. If it is opposite from Pupil Capture's conventions, then you could be accidentally displaying incorrect data.
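To illustrate the byte-order point, here's a hedged sketch of sender-side packing with explicit endianness; the host, port, and values are hypothetical, and whether udpreceive expects raw floats at all (rather than, say, an OSC-formatted packet) is something to confirm on the Max/MSP side:
```python
import socket
import struct

gaze_x, gaze_y = 0.44, 0.07             # example values
MAX_HOST, MAX_PORT = "127.0.0.1", 7400  # hypothetical destination

# '>' packs big-endian (network order), '<' packs little-endian.
# If the receiver assumes the opposite order, the floats decode as garbage.
payload_be = struct.pack(">ff", gaze_x, gaze_y)
payload_le = struct.pack("<ff", gaze_x, gaze_y)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(payload_be, (MAX_HOST, MAX_PORT))  # send whichever order Max expects
```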
In all measurement processes, there is some noise. There will simply be some jitter in the raw gaze signal as the model successively makes new estimates. This is expected, but if it is undesirable for your application, you could apply a smoothing filter. The 1 Euro filter, as it is called, is applicable for real-time applications (see the sketch below).
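Here's a minimal, self-contained sketch of such a filter; the parameter values are illustrative and would need tuning for your data:
```python
import math


class OneEuroFilter:
    """Minimal 1 Euro filter for one signal dimension (e.g. gaze x)."""

    def __init__(self, min_cutoff=1.0, beta=0.01, d_cutoff=1.0):
        self.min_cutoff = min_cutoff  # lower -> smoother at low speeds
        self.beta = beta              # higher -> less lag during fast movements
        self.d_cutoff = d_cutoff
        self.x_prev = None
        self.dx_prev = 0.0
        self.t_prev = None

    @staticmethod
    def _alpha(cutoff, dt):
        tau = 1.0 / (2.0 * math.pi * cutoff)
        return 1.0 / (1.0 + tau / dt)

    def __call__(self, t, x):
        if self.x_prev is None:
            self.x_prev, self.t_prev = x, t
            return x
        dt = t - self.t_prev
        if dt <= 0:
            return self.x_prev
        # Smooth the derivative, then adapt the cutoff to the current speed
        dx = (x - self.x_prev) / dt
        a_d = self._alpha(self.d_cutoff, dt)
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = self._alpha(cutoff, dt)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev, self.t_prev = x_hat, dx_hat, t
        return x_hat


# Usage: one filter instance per coordinate, fed with (timestamp, value) pairs,
# e.g. from a gaze datum's "timestamp" and "norm_pos" fields:
# fx, fy = OneEuroFilter(), OneEuroFilter()
# smooth_x = fx(gaze["timestamp"], gaze["norm_pos"][0])
# smooth_y = fy(gaze["timestamp"], gaze["norm_pos"][1])
```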
They are not eye movements exactly, but pupil diameter. Your pupil size fluctuates due to several factors, including changes in cognitive state, even when your eyes are "not moving".
Also, the 2D detector gives the values in pixels, which have a different scale and can fluctuate. Those jumps are potentially noise in the 2D detection, although without knowing the scale of the histograms, it is hard to say exactly.
If you want pupil diameter to also play a role, it would instead be recommended to use the output of the 3D detection pipeline, i.e. diameter_3d, which is physiologically based, in millimeters, and stable in comparison.
However, not being familiar with Max/MSP, I cannot be sure if the data have been unpacked (cf. the unpack f f f f commands) in a way that the data are exactly what they should be. Running a Pupil Capture session in parallel and comparing to the outputs in the CSV exports from Pupil Player would be one way to debug the code.
@user-28d52d For comparison, your sliders and graph on the left look like they should be reflecting gaze x/y correctly, at first glance. Although, as mentioned, I do not know exactly how udpreceive and unpack work.
unfortunately the sliders on the left have the same issue
Received gaze data: [0.44030276413635827, 0.07320716391009996], Pupil size: None Fixation detected, sending data...
this is what is being sent by the script, which I wrote with the API from your website
gaze data = pupil diameter?
or maybe we can have a call and I'll show you, or I can come by the office and we can solve this together. At this point I would do anything to make it work