πŸ‘ core


user-c3a255 01 March, 2023, 00:04:28

is anyone here?

nmt 01 March, 2023, 08:12:28

Hi @user-c3a255 πŸ‘‹

user-9f7f1b 01 March, 2023, 03:05:21

Hi @nmt First, I saved a binocular video of the process by displaying nine dots in sequence on the screen while my eyes focused on the nine dots and my head was held still on a tripod. Then, EllSeg was used to calculate the location of the pupil center. As in Pupil Capture, the pupil was calibrated with five points. I manually selected the pupil information corresponding to each calibration point and used Gazer2D() for calibration. Finally, all mapped pupil positions were displayed on the screen. The result is shown below:

Chat image

user-9f7f1b 01 March, 2023, 03:08:20

@nmt After I removed (norm_x * norm_y, norm_x_squared, norm_y_squared, norm_x_squared * norm_y_squared) from _polynomial_features, things got better

Chat image
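
For reference, the reduced feature set described above can be sketched as a plain least-squares fit. This is a simplified stand-in for Pupil's Gazer2D, not its actual source; the function names here are illustrative:

```python
import numpy as np

def polynomial_features(norm_x, norm_y):
    """Reduced feature set: constant, x, y (the cross and squared
    terms mentioned above have been removed)."""
    return np.column_stack([np.ones_like(norm_x), norm_x, norm_y])

def fit_gaze_mapper(pupil_xy, target_xy):
    """Least-squares fit from pupil features to on-screen targets."""
    X = polynomial_features(pupil_xy[:, 0], pupil_xy[:, 1])
    coefs, *_ = np.linalg.lstsq(X, target_xy, rcond=None)
    return coefs

def map_gaze(coefs, pupil_xy):
    """Map pupil positions to screen positions with the fitted coefficients."""
    X = polynomial_features(pupil_xy[:, 0], pupil_xy[:, 1])
    return X @ coefs
```

With only [1, x, y] as features the mapper is affine, which is exactly why it becomes less prone to the wild extrapolation that higher-order terms can produce outside the calibrated region.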

nmt 01 March, 2023, 08:15:05

Interesting approach. Of course, it assumes a fixed relationship between the wearer and screen - hence the 'tripod'. Any relative movements between the wearer and the computer screen would introduce errors into the pipeline.

user-5ba46b 02 March, 2023, 00:45:23

Hi there! My project involves using the Pupil Core (this old model https://hackaday.com/2013/02/12/build-an-eye-tracking-headset-for-90/) and a portable EEG while the participant runs a task on the computer. I plan to use LSL to synchronize all three data streams (stimuli, glasses, and EEG). I'm a bit confused regarding this process. Considering that everything is compatible with LSL, I'll have to download the Pupil LSL Relay plugin, LabRecorder (does it work as a hub?), and add some plugin/code in MATLAB. Is that correct? Can I get the data separately later, or does LabRecorder save everything as just one file?

user-5ba46b 02 March, 2023, 01:51:23

One other thing related to this version of the Core. I was able to run the software on it; however, the image quality of the eye camera is really weird and it's not even detecting the pupil. The image is really dark (if I change some settings it gets red, but still really dark), but I'm not sure if it's an issue with the camera or some configuration in the software. Do you have any recommendations for this?

nmt 02 March, 2023, 17:55:07

What happens when you change the exposure time? Bear in mind you have an older DIY project built with webcams - there's every chance the camera is faulty πŸ˜…

user-80123a 02 March, 2023, 08:57:05

Hello, I would like to ask a question about post-hoc gaze calibration. These are my steps: I record my data on a Raspberry Pi, then process the data post hoc on a desktop PC. I calibrate the eye tracker on the Raspberry Pi and uncheck pupil detection during the recording. Then, on the desktop side, I select post-hoc pupil detection. My situation is: when I only select Post-Hoc Gaze Calibration, I get no data on gaze direction. I only get gaze direction data when I press the Detect References button (see figure below). My question is: is this the right way to obtain gaze direction data, or do I need other steps? Thank you in advance for your help, have a nice day.

Chat image

nmt 02 March, 2023, 17:54:13

Hi @user-80123a πŸ‘‹. Please take a deep dive into these videos: https://docs.pupil-labs.com/core/software/pupil-player/#pupil-data-and-post-hoc-detection They can explain everything better than I could via text πŸ˜„

user-ebda6c 02 March, 2023, 12:29:47

hi team! I am very new to the software. In which file formats do recordings need to be in order to be loaded into Pupil Player?

user-d407c1 02 March, 2023, 12:45:08

Hi @user-ebda6c ! Are you using Pupil Core? The recording files generated with Capture are stored in a folder named recordings; each folder inside is named after the date it was generated. If you navigate into one of them, you will find more folders, one per recording made on that date, named 000, 001, etc. Just drag one of these recordings onto Pupil Player.

Check out our getting started guide for more information https://docs.pupil-labs.com/core/#_6-locate-saved-recording

user-ebda6c 02 March, 2023, 13:28:01

can I use recordings that have not been generated with Capture?

nmt 02 March, 2023, 17:56:37

Hey! Player only supports Capture recordings

user-154fea 02 March, 2023, 15:37:36

Hi everyone, I am currently using the Pupil Core ET with older adults and I am struggling to record good data because they usually wear thick glasses, which throws off the calibration or the recording session. I was wondering if there is a way around this. Thanks

nmt 02 March, 2023, 22:00:03

Hi @user-154fea! External glasses sometimes work, but it can be hit and miss. The goal is to capture unobstructed images of the eyes - we recommend trying to put the Core headset on first and the external glasses on top. This is not an ideal condition but does work for some people

user-3efcd2 02 March, 2023, 16:55:48

hello, is there a tutorial for acquiring gaze position from Pupil Labs in Unity?

nmt 02 March, 2023, 17:57:24

Check out our hmd-eyes repository: https://github.com/pupil-labs/hmd-eyes

user-d1795e 02 March, 2023, 19:19:38

Hello! My lab is planning to record data from two headsets simultaneously and we were trying to figure out if there's a way to run two headsets simultaneously on the same PC or if we will need to use separate PCs?

nmt 02 March, 2023, 22:02:12

Whilst technically possible, we recommend running each Pupil Capture instance on a separate computer. To keep them in sync, use the Pupil Groups Plugin: https://docs.pupil-labs.com/core/software/pupil-capture/#pupil-groups

user-66797d 02 March, 2023, 21:25:43

Hello, I have a question about downloading the timeseries + scene video. For one video, I've tried downloading the timeseries + scene video from different computers, but the download continuously fails. The download works just fine with the other videos. Is there a way around this issue at all?

nmt 02 March, 2023, 22:03:04

Please reach out to info@pupil-labs.com in this regard

user-3578ef 02 March, 2023, 21:53:41

@nmt What should happen when Core is operated in a completely dark room, where the eyes are fully dilated and there is nothing to fix the gaze on? For example, imagine operating Core at depth in a murky void.

nmt 02 March, 2023, 22:05:05

The eye cameras of Core have IR illuminators, so you should be able to capture the pupils. I've used Core down to about 3 lux ambient illumination. If you calibrate prior to turning the lights off, you'll still get a gaze estimate as gaze is provided in scene camera coordinates.

nmt 02 March, 2023, 22:05:51

@user-3578ef, what are you planning to do in such a dark environment?

user-3578ef 02 March, 2023, 23:11:14

We've only just done some experiments with a diver in a darkened tub with a monitor, but I noticed that when there is some illumination from the monitor reducing the pupil size, the initial tracking might be more robust than when the tub is dark and the pupil is initially fully dilated. In fact, the initial tracking of pupil diameter is rather bizarre, but there could be plenty of reasons in our setup for that. Only first experiments for now...

nmt 03 March, 2023, 08:23:11

For extremely dilated pupils you might need to increase the maximum pupil size under 2d detector settings. Additionally, we recommend checking out our pupillometry best practices (if you haven't already): https://docs.pupil-labs.com/core/best-practices/#pupillometry

user-3578ef 03 March, 2023, 12:42:08

Indeed that seems to be the case. But then that might create tracking problems when tasking requires looking at the bright display. It seems it would be better to have the minimum diameter auto-regulated somehow to have a seamless response behavior so that the 3D eyeball tracking can maintain valid pupil diameter estimates.

nmt 03 March, 2023, 13:22:14

Apologies, I meant to say maximum expected pupil size. I've edited the original message.

user-3578ef 03 March, 2023, 13:30:10

I also edited my message. But the open question is: does this large maximum degrade performance in the normal use case where the gaze is turned to the bright screen? The normal pupil sizes are in the range of 2-4.5 mm as recommended min/max settings, but due to scaling issues related to our eye camera position, these numbers are inherently larger, maybe 2x to begin with, and might degrade the pye3d tracker. I suspect at some point my numbers become unworkable and the pupil diameter dropouts occur.

user-3578ef 03 March, 2023, 13:38:57

Neil, notice the large dead zones, and that I do get diameter tracking about midway through the episode, where I can see some small, reasonable diameter changes. But then it drops again, etc. And I can see from the erratic FPS that the tracker is struggling during the dead zones.

Chat image

user-d407c1 03 March, 2023, 14:14:57

Hi @user-3578ef Are you running some other programs that may have increased the CPU usage during these phases of the recording?
Kindly note that you can reduce the CPU usage by running some of the plugins, like the surface tracker, post hoc.

user-3578ef 03 March, 2023, 14:21:36

Thanks Miguel, I'll check about that with the lab - but I don't think so. And the only time I see this is in "dark" condition and it is consistent across multiple sessions.

nmt 03 March, 2023, 14:50:07

Note that you can run pupil detection post-hoc and tweak the 2d detector settings.

nmt 03 March, 2023, 14:51:35

It's also worth checking that the eye camera exposure time was appropriate for the dark condition. Exposure time can be set to automatic or manual

user-3578ef 03 March, 2023, 15:37:05

Can I read the min max pupil size data from the recording folder to check it?

nmt 06 March, 2023, 16:34:59

This isn't stored with the recording as far as I remember

user-17ca63 06 March, 2023, 09:31:10

Hi, I have a question about the data obtained from Pupil Core. With the 'pye3d 0.3.0 real-time' method, the left and right pupil diameters are similar in pixels, while the diameter_3d values in millimeters are quite different; the value for the right eye is even twice that of the left eye. Can anyone explain why?

Chat image

user-d407c1 06 March, 2023, 09:58:34

Hi @user-17ca63! The diameter in pixels does not take into account the 3D model, the position of the pupil, or the corneal refraction. Pye3d provides the 3D diameter by accounting for the eyeball model, the location of the pupil (which accounts for perspective), and the corneal refraction. There are several possible explanations for what you observe, such as variability of the 3D model between eyes, the camera-to-eye relationship, or a misalignment in the estimation. When performing pupillometry, it is recommended to get a proper 3D model of both eyes, a good calibration, freeze the model, and compare relative changes rather than absolute values. https://docs.pupil-labs.com/core/best-practices/#pupillometry

user-2798d6 06 March, 2023, 16:27:40

Hello, I am not getting audio in my core recordings. I see two options in the audio mode in Capture: Sound Only and Silent. I have it set to Sound Only, but my world recordings have no sound. Is there an adjustment or fix for this? Thank you so much!

nmt 06 March, 2023, 16:37:13

Hi @user-2798d6 πŸ‘‹. Please see this message for reference

user-89d824 06 March, 2023, 16:35:57

Hi,

I did a short recording to test whether the calibration markers on a curved screen were detected.

I then ran post hoc calibration - accuracy was 1.7 degrees.

However, the visualisation was a bit odd as you can see in the first screenshot - the two green dots were always connected by the pink line during the calibration process as my eyes moved around the screen. Most of the time it's in a straight line, but sometimes there's a lot of zigzagging like in the 2nd screenshot. Do you know what's going on here please?

Re: the FPS, I don't know why it's so low in the screenshot but in the recording everything is smooth and FPS averages around 23.

Chat image Chat image

nmt 06 March, 2023, 17:04:10

I'd firstly recommend updating your Pupil Core software to the latest version, make a test recording that contains the calibration choreography, and share it with data@pupil-labs.com for feedback

user-4ba9c4 07 March, 2023, 18:07:15

Hello, I am setting up an experiment. I need to measure saccades; however, the LSL data recorded is only at around 30 FPS. My question is: can I increase the frame rate of the LSL broadcast from Pupil Capture to 200 Hz? If so, how can it be done?
Thanks,

nmt 08 March, 2023, 12:30:11

Hi @user-4ba9c4! Can you confirm what sampling rate you're getting with gaze data when not using LSL, i.e. directly in Pupil Capture?

user-fa2527 08 March, 2023, 15:09:35

Hello, Does Pupil Core work well for doing screen-based eye tracking research?

user-d407c1 08 March, 2023, 15:26:59

Yes, it can be used in conjunction with the Surface Tracker plugin https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking

user-fa2527 08 March, 2023, 15:24:47

For example, using a chin rest?

user-fa2527 08 March, 2023, 15:33:42

Thank you. Will that yield more than just AOI, ie scanpath?

user-d407c1 08 March, 2023, 15:45:10

It will also give you heatmaps, for scanpaths you will need to follow this tutorial https://github.com/pupil-labs/pupil-tutorials/blob/master/03_visualize_scan_path_on_surface.ipynb

user-fa2527 08 March, 2023, 15:56:58

Thanks again! My other question: can you record input like mouse events so they are already synchronized with eye tracking data or do you have to synchronize them in post processing?

user-d407c1 08 March, 2023, 16:00:13

You can use Lab Streaming Layer for example to do that, using https://github.com/labstreaminglayer/App-Input and https://github.com/labstreaminglayer/App-PupilLabs/blob/master/pupil_capture/README.md

user-d407c1 08 March, 2023, 16:02:51

or you can log mouse events yourself and send the corresponding event annotations
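
If you log mouse events yourself, the key step is expressing their timestamps on the eye tracker's clock. One common approach is to estimate the offset between your local clock and the tracker's clock (Pupil Remote can report its current time over the network). A minimal stdlib-only sketch, with the remote clock injected as a callable; the function names are hypothetical:

```python
def clock_offset(local_clock, remote_clock):
    """Estimate the (remote - local) clock offset. The remote clock is
    read between two local reads, and the reply is assumed to reflect
    the midpoint of the round trip (i.e. symmetric latency). In practice
    local_clock could be time.monotonic and remote_clock a function that
    queries the tracker's current time and parses the reply."""
    t0 = local_clock()
    remote = remote_clock()
    t1 = local_clock()
    return remote - (t0 + t1) / 2.0

def to_remote_time(local_timestamp, offset):
    """Express a locally timestamped event (e.g. a mouse click) on the
    remote clock so it aligns with the gaze data."""
    return local_timestamp + offset
```

Re-measuring the offset periodically guards against clock drift over long sessions.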

user-908b50 08 March, 2023, 20:58:11

From what I remember, the sampling frequency could be either 60 Hz or 120 Hz. So I am not sure what this means. I want to interpolate gaze data to calculate saccades, so I need the sampling frequency.

nmt 09 March, 2023, 07:41:17

This is quite low and suggests insufficient computational resources. What are your computer specifications?

user-908b50 08 March, 2023, 20:59:35

If anyone here has experience calculating saccades then please reach out! I am trying to implement someone's code and I have some questions on how to do that in PL. Thanks!
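
For anyone implementing saccade detection from exported gaze data, a minimal velocity-threshold (I-VT style) sketch is below. It assumes gaze has already been converted to degrees of visual angle; the 100 deg/s threshold is a common choice but should be tuned to your data and sampling rate:

```python
import numpy as np

def detect_saccades(x_deg, y_deg, t_s, velocity_threshold=100.0):
    """Flag samples whose angular velocity (deg/s) exceeds the
    threshold as saccadic. np.gradient handles uneven sampling
    intervals, which matters for Core's variable frame timing."""
    vx = np.gradient(x_deg, t_s)
    vy = np.gradient(y_deg, t_s)
    speed = np.hypot(vx, vy)
    return speed > velocity_threshold
```

Consecutive flagged samples can then be grouped into saccade events, and very short groups discarded as noise.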

user-660f48 09 March, 2023, 11:12:06

Hi @nmt, many thanks for your reply. It is reassuring to know that the coordinate system can be adapted to the image. I have a new question. In Pupil Player, the frame number does not match the frame number of the world video exported from Pupil Player. Why is this happening?

nmt 09 March, 2023, 17:01:16

I'm not 100% sure I understand what you mean here. Can you share a screenshot for illustration purposes?

user-6cf287 09 March, 2023, 11:23:57

Hi Pupil Core team, I am currently analyzing the data recorded from the eye tracker, loading it from the exported CSV file. I noticed that the diameter_3d column has some missing values in between some intervals (I am not sure if the duration of these intervals is constant). I just wanted to know if this is normal and expected. Secondly, is it possible to reduce the sampling rate of the recorded data? Currently it records 200 data points per second, and if we would like to sync it with other devices with different sampling rates, do you have any tips on how best to handle this? Thank you!

nmt 09 March, 2023, 17:09:35

diameter_3d is an estimate of pupil size in mm generated by our pye3d eye model. In cases of low-confidence pupil detections, e.g. during blinks, squinting etc. you can expect missing values. As regards synchronisation, it's not really possible to match Pupil Core's sampling frequency with external equipment. Instead, I'd recommend using our network api to send/receive trigger events, or preferably, use our Lab Streaming Layer integration: https://github.com/labstreaminglayer/App-PupilLabs/tree/master/pupil_capture#pupil-capture-lsl-plugins
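
As a starting point for the resampling side of the question, a sketch that masks low-confidence samples (e.g. blinks) and linearly interpolates the remainder onto a regular time base matching another device's rate. The 0.6 confidence threshold is illustrative, not an official recommendation:

```python
import numpy as np

def resample_signal(t, values, confidence, new_rate_hz, min_confidence=0.6):
    """Drop low-confidence samples, then linearly interpolate the rest
    onto a regular time base at new_rate_hz."""
    keep = confidence >= min_confidence
    t_kept, v_kept = t[keep], values[keep]
    t_new = np.arange(t_kept[0], t_kept[-1], 1.0 / new_rate_hz)
    return t_new, np.interp(t_new, t_kept, v_kept)
```

Note that interpolating across long low-confidence gaps (full blinks) fabricates data; for pupillometry it is safer to leave such gaps as NaN or exclude them from analysis.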

user-6cf287 09 March, 2023, 11:25:17

Sorry, I just noticed you answered the sampling rate question above; I will try that. So just the first question is unclear to me now.

nmt 09 March, 2023, 17:25:51

Hey @user-f1866e! The search bar is up in the right corner πŸ™‚

user-f1866e 09 March, 2023, 17:31:31

I just found a significant delay in the real-time data: topic, payload = Globs.pupil_subscriber.recv_multipart()

user-f1866e 09 March, 2023, 17:32:20

it is 2-3 seconds of delay

user-f1866e 09 March, 2023, 17:33:36

I tried to monitor normalized pupil direction in pupil group. Any idea???

user-908b50 09 March, 2023, 19:40:09

I doubt it's the computer.

user-f1866e 09 March, 2023, 20:29:05

Thanks! Delay happens in my tkinter routine.
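
For future readers hitting the same lag: a common cause is the subscriber's receive queue backing up when the GUI loop polls more slowly than Capture publishes, so each read returns a stale message. A hedged sketch (assumes pyzmq) that drains the backlog and keeps only the newest message:

```python
import zmq

def latest_message(sub_socket):
    """Block for at least one multipart message, then drain the
    subscriber queue and return only the newest one, discarding the
    backlog that otherwise accumulates into multi-second delay."""
    msg = sub_socket.recv_multipart()
    while True:
        try:
            msg = sub_socket.recv_multipart(flags=zmq.NOBLOCK)
        except zmq.Again:
            break
    return msg
```

Alternatively, reading the socket from a dedicated thread (rather than inside the tkinter event loop) avoids the backlog building up in the first place.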

user-57e536 09 March, 2023, 21:31:32

oops, wrong channel. sorry!

nmt 10 March, 2023, 08:04:41

Frame rate

user-17ca63 10 March, 2023, 09:31:45

Hi PupilLabs, I tried to use Pupil Player to do the offline detection, but it only saved the offline_pupil data and missed the blink data and so on. How can I get the other data?

nmt 10 March, 2023, 17:28:04

Hi @user-17ca63! You'll need to enable the blink detector plugin to see classified blinks in Pupil Player

user-154fea 10 March, 2023, 14:35:13

Hi PupilLabs community, I have a question regarding baseline correction -- if the baseline was not recorded during the experiment, can we create a baseline from the period before the stimulus was presented, while the participants were reading a 'Welcome' text? Thanks

nmt 10 March, 2023, 17:30:39

Hi @user-154fea πŸ‘‹. Are you doing pupillometry? If so, a baseline should be recorded in the same session as the experiment, otherwise confounds will creep into your data.
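
For context, the correction itself is straightforward once a baseline window from the same session is chosen; the hard part is choosing a window free of confounds (luminance changes, arousal). A minimal subtractive sketch, with illustrative function and parameter names:

```python
import numpy as np

def baseline_correct(diameter_mm, timestamps, baseline_start, baseline_end):
    """Subtractive baseline correction: subtract the mean pupil
    diameter measured during a pre-stimulus window from the whole
    trace, so later values express change relative to baseline."""
    in_window = (timestamps >= baseline_start) & (timestamps < baseline_end)
    baseline = np.nanmean(diameter_mm[in_window])
    return diameter_mm - baseline
```

A divisive variant (percent change, i.e. dividing by the baseline mean) is also common; which to use depends on your analysis conventions.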

nmt 10 March, 2023, 20:20:12

Frame indices

user-eb6164 11 March, 2023, 23:44:54

Hello, I want to ask: is there any way we can extract the size of the AOI just to record it? The width and height of each AOI?

user-daaa64 13 March, 2023, 12:17:51

Hi everyone! The Pupil Player says β€œno fixation detected” but it genuinely showed me heat maps and fixation points last time I opened it. What’s wrong? What should I do to detect the fixations again? Thanks in advance!

nmt 13 March, 2023, 12:19:58

Hi @user-daaa64. You'll need to enable the Fixation Detector Plugin https://docs.pupil-labs.com/core/software/pupil-player/#fixation-detector

user-eb6164 13 March, 2023, 19:25:59

hi @nmt can we determine the size per AOI? Is there any way we can record the size in pixels (height and width)?

user-eb6164 14 March, 2023, 15:25:28

can someone answer my question please 😁

user-e684ed 14 March, 2023, 07:44:16

Hello, I am wondering how I can use the face blurring function that is described in the Face Mapper enrichment. I cannot find anything about it in the documentation, because there it says it will be coming soon.

nmt 14 March, 2023, 09:23:35

Hi @user-e684ed, we have received your email and will respond there

user-535417 14 March, 2023, 09:33:56

Hi, we have the Pupil eye tracker glasses (Pupil Invisible) and we are trying to connect the glasses to our LSL. According to the instructions we downloaded the Pupil Capture software (on Windows 10), but although the glasses were connected to the computer we get the attached errors [and see a gray screen]:

we also tried to follow the instructions under the windows section here: https://docs.pupil-labs.com/core/software/pupil-capture/

but nothing works 😦 what can we do?

Chat image

nmt 14 March, 2023, 09:43:12

Hi @user-535417! Pupil Capture software is not compatible with the Pupil Invisible system. Pupil Invisible must connect to your Companion smartphone device for operation.

The Pupil Invisible LSL documentation can be found here: https://pupil-invisible-lsl-relay.readthedocs.io/en/stable/

user-6cf287 14 March, 2023, 09:41:57

Hi team, do you have any other documentation that shows how to generate and print the AprilTags? I tried to follow this page but the links are not working: https://pupil-labs.com/releases/core/036-marker-tracking/

nmt 14 March, 2023, 09:59:09

@user-6cf287, you can use the markers found here: https://docs.pupil-labs.com/core/software/pupil-capture/#markers

user-1436e5 14 March, 2023, 09:43:23

Hello there

user-1436e5 14 March, 2023, 09:43:29

I'm Johan Lara

user-1436e5 14 March, 2023, 09:44:06

I'm about to buy the Pupil Core model for sport shooting purposes

user-1436e5 14 March, 2023, 09:45:36

I wanted to know if using the software + Pupil Core requires any specific knowledge in terms of code writing?

user-1436e5 14 March, 2023, 09:46:14

or if I can connect the Pupil Core to my Mac and start using it as a plug-and-play system

user-1436e5 14 March, 2023, 09:46:45

(im totally new, sorry if its a dumb question !!)

nmt 14 March, 2023, 10:08:42

Hi @user-1436e5 πŸ‘‹. Our software can certainly be used by no-code users. You tether the Core headset to a computer, in your case Mac is supported, and load the software. Then it's a case of calibrating the system for the wearer and starting a recording. However, there are quite a few things to consider when it comes to using Core in sports shooting. We can offer a demo & Q&A via video call to ensure it's a good fit for your needs. Feel free to reach out to info@pupil-labs.com

user-535417 14 March, 2023, 10:06:42

I tried, but I get the following error:

Chat image

user-5d8c4f 15 March, 2023, 13:27:38

Hi, somehow the video I recorded through the phone cannot be uploaded to the cloud. Do you have any idea why?

nmt 15 March, 2023, 16:10:31

Hi @user-5d8c4f πŸ‘‹. Can I just confirm, which Pupil Labs system are you using?

user-9f7f1b 15 March, 2023, 16:32:39

Hi, team! I have read the code for calculating accuracy. The final accuracy is calculated as the average angular offset (distance, in degrees of visual angle, considering only values greater than cos(5 deg)) between fixation locations and the corresponding locations of the fixation targets. I have a question about the final accuracy: if the average error at one or two fixation targets is 1.8 deg, but the remaining fixation targets have small errors, the final accuracy can still be small, such as 0.5 deg. Should we consider this calibration a failure?
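
One way to surface the issue described (a large error at one target hidden by the overall mean) is to report accuracy per target alongside the mean. An illustrative sketch, not the actual Pupil accuracy code:

```python
import numpy as np

def accuracy_report(errors_deg, target_ids):
    """Mean angular offset overall and per calibration target, so a
    single poorly calibrated region is visible rather than averaged
    away by well-calibrated targets."""
    overall = float(np.mean(errors_deg))
    per_target = {int(t): float(np.mean(errors_deg[target_ids == t]))
                  for t in np.unique(target_ids)}
    return overall, per_target
```

Whether 1.8 deg at one target is acceptable then becomes an explicit, per-region decision rather than one hidden inside a single summary number.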

nmt 16 March, 2023, 07:17:54

Hi @user-9f7f1b. I'm not sure I really understand your question. Perhaps you could rephrase it 😸? Maybe I have a question for you - why would a small accuracy value (e.g. 0.5 deg) be considered a failure?

user-cdb45b 15 March, 2023, 17:02:09

Hello rcd 5213 mgg 6348 nmt 1123 I m

user-5ed24e 16 March, 2023, 05:52:14

We bought a Pupil Labs Neon mobile eye tracker. We wanted to see the fixations during the recording of the video, but unfortunately when we watch the video there is a message saying no fixations found. Could you help us set up the eye tracker to resolve the problem?

user-480f4c 16 March, 2023, 07:33:56

Hi @user-5ed24e, I have just replied to your email!

user-5ed24e 16 March, 2023, 07:48:08

Hi @user-480f4c Yes, I have already received the email. Thank you!

user-9f7f1b 16 March, 2023, 10:05:56

Unfortunately, I use a DIY headset and 2D calibration code. My teacher said that this result is not good enough. I decided to test the algorithm by making some simulation data as ground truth.

nmt 16 March, 2023, 10:18:03

Honestly, those results are very impressive for a DIY headset

nmt 16 March, 2023, 10:19:24

I recommend, if you can, implementing a custom 2D calibration choreography where the marker moves around and covers all areas of the screen - that should improve region-specific accuracy values.

nmt 16 March, 2023, 17:35:18

That should trigger a reupload

user-345aa1 17 March, 2023, 18:47:12

Hi, I'm trying to run main.py from this link (https://github.com/pupil-labs/pupil), but after following all the steps I keep getting this error. Can anyone help me?

Chat image

user-6b1efe 18 March, 2023, 03:43:00

Hi, we pressed "C" after setting up the communication between Unity and Pupil Capture, but we did not see the calibration marker on the head-worn display, and the calibration marker on the computer screen did not change either. Can anyone help me?

user-6b1efe 18 March, 2023, 11:53:50

@nmt Could you help me figure out how to solve this problem? I would be very grateful if you could reply to my message soon.

user-5bef55 20 March, 2023, 11:42:27

Hi, I have the Pupil Core and can't seem to get it to work. Would anyone be able to help with the "world - [ERROR] calibration_choreography.screen_marker_plugin: Calibration requiers world capture video input." issue?

user-c2d375 20 March, 2023, 13:12:09

Hi @user-5bef55 πŸ‘‹ May I ask which software version you are using, and whether you can see the real-time camera preview in all three windows (world, eye0, eye1)?

user-5bef55 20 March, 2023, 13:13:27

The main view with controls shows nothing

user-761b3e 20 March, 2023, 14:51:33

hi, we have the Pupil Labs Core and just track instruments on a screen. We noticed a difference between the normal lens and the fisheye lens. Do we have to account for a deviation in the coordinates with the fisheye lens, or is this already compensated for?

user-6b1efe 21 March, 2023, 03:48:43

Hi @user-c2d375. After the calibration is completed, the gaze visualizer is still displayed in the scene. How do I hide the gaze visualizer?

user-c2d375 21 March, 2023, 10:03:36

Hi @user-6b1efe πŸ‘‹ You can enable/disable the accuracy visualizer by accessing the accuracy visualizer plugin located in the menu on the right side of the main window in Pupil Capture.

Chat image

user-5ed24e 21 March, 2023, 07:20:34

Hi @nmt I need help on how to upload our recording from the local drive to pupil cloud

user-d407c1 21 March, 2023, 07:56:38

Hi @user-5ed24e ! Pupil Core recordings cannot be uploaded to Pupil Cloud. If you are using Pupil Invisible, please repost your question at πŸ•Ά invisible

But from the video you shared, it seems like you are using Pupil Core. For fixations with Pupil Core, there are two approaches, depending on whether you are using Pupil Player (post-hoc detection) or Pupil Capture (in real time). Please check our documentation https://docs.pupil-labs.com/core/terminology/#fixations for more info.

Either of these can be activated in the Plugins section in the sidebar https://docs.pupil-labs.com/core/software/pupil-player/#player-window

Edit: Sorry, I saw that you already have the plugin enabled. In the plugin panel you should see a button at the end saying "Show fixations"

user-5ed24e 21 March, 2023, 07:21:35

What we want to see are the fixations in the player so we can see the location where the user looks

user-5ed24e 21 March, 2023, 08:00:14

@user-d407c1 we are using Pupil Core. We want to visualize the fixations when we play the recording using Pupil Player, but we can't. The link you shared just shows the terminology about fixations. Is there a way for us to visualize the fixations?

user-d407c1 21 March, 2023, 08:02:31

Sorry, I saw later that you had the plugin enabled; see the edit to my answer. Also, above that toggle you will see the number of fixations detected.

user-3c4ff0 21 March, 2023, 14:40:01

Hi, I have a pupil core device for an eye-tracking experiment. I am wondering if there is any limitation of the eye tracking video recording length, like no more than 40 minutes?

user-d407c1 21 March, 2023, 14:42:12

Hi! There is no specific time limit for recording with the Pupil Core device. However, it is recommended to split your experiment into smaller chunks instead of recording for a long period of time. This is because the main video data can be heavy and there is a risk of crashing the PC if it's not particularly powerful. Therefore, it's better to split your experiment into several blocks rather than just one or two, if your experiment allows. It's also worth doing some pilot testing to get a feel for the kinds of data you'll get, how accurate the calibration is over time, etc. That'll help you decide on an appropriate plan. If you haven't already, check out the best practices page for more information: https://docs.pupil-labs.com/core/best-practices/#split-your-experiment-into-blocks

user-3c4ff0 21 March, 2023, 14:43:42

Will do the splitting, thank you very much!

user-0aca39 21 March, 2023, 18:08:13

Hi, I have the Pupil Labs add-on for the HTC Vive. I wonder what camera sensor you are using? They seem very cool for a lot of DIY projects I have in mind. Could you share? Because I don't want to take them apart - quite expensive gear πŸ˜„

nmt 22 March, 2023, 17:03:52

This isn't publicly available, I'm afraid @user-0aca39

user-6b1efe 22 March, 2023, 09:19:41

Thank you for your reply @user-c2d375, but the gaze visualizer I mentioned is in hmd-eyes, as shown in the following figure. After the calibration is completed, it is still displayed in the scene, which affects the experience of the virtual scene. I would be grateful if you could help me.

Chat image

nmt 22 March, 2023, 17:04:45

Please try disabling the Gaze Visualizer script: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#default-gaze-visualizer

user-eb6164 22 March, 2023, 15:28:39

hi, I am having an issue with the Pupil Labs head-mounted eye tracker v3.5.1. One of the eye video captures is not working (eye1); it just gives a grey image. I tried to uninstall and reinstall the drivers, which didn't work. I unplugged the camera and plugged it in again; still not working. We already purchased a new camera for this eye, and it was working until today, when the issue occurred again. It is giving me the error: video_capture.uvc_backend: Could not connect to device. No images will be supplied. No camera intrinsics available. Consider selecting a different resolution.

nmt 22 March, 2023, 17:05:03

Please reach out to info@pupil-labs.com in this regard

user-eb6164 22 March, 2023, 15:29:05

I tried a different PC; it's not working either

user-91a92d 22 March, 2023, 16:01:47

Hi, I am trying to read all the topics from ZMQ with the code in the helper (https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_messages.py) to check whether the surfaces are sent correctly over the interface. I have uncommented the line to receive all the topics. Some topics are displayed correctly, but I quickly run into an error related to UTF-8, always after the topic "frame.eye.X": "UnicodeDecodeError: 'utf-8' codec can't decode byte 0x80 in position 33: invalid start byte". Any idea how to deal with that? Thanks for the help
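
For future readers: the frame.* messages carry raw image bytes, which are not valid UTF-8 text, so only the topic frame should be str-decoded. A dependency-free sketch of the assumed multipart layout (verify against the frame publisher documentation for your software version):

```python
def split_frame_message(parts):
    """A frame message is assumed to arrive as
    [topic, metadata, raw image bytes, ...]. Only the topic is plain
    UTF-8 text; the metadata frame is msgpack bytes (decode with
    msgpack.loads, not .decode), and the remaining frames are raw
    pixel buffers that must stay as bytes."""
    topic = parts[0].decode("utf-8")
    metadata = parts[1]   # msgpack bytes - hand to msgpack.loads in real code
    extra = parts[2:]     # raw image buffers
    return topic, metadata, extra
```

In practice this means subscribing to "frame." topics requires a separate code path from the gaze/pupil/surface topics that filter_messages.py handles.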

nmt 22 March, 2023, 17:08:43

Can you please share more of the terminal output over in the software-dev channel?

nmt 23 March, 2023, 14:58:12

Please post your output over in the software-dev channel πŸ™‚

user-91a92d 23 March, 2023, 14:59:10

Sorry I am still unsure what goes where

user-736bf7 24 March, 2023, 08:14:54

Unfortunately, I have a defect in eye camera 0. A replacement part is on its way. Since our experiment is currently at a standstill and we are absolutely pressed for time, I will temporarily use a single-camera setup.

Did I understand the Swirski and Dodgson paper mentioned in previous threads correctly, that no user calibration is required? Is there anything else I need to consider when setting up Pupil Capture for the single-camera setup? Are there any other tips or important papers I should read for the single-camera setup? I understand that contralateral eye movement may not be captured robustly. Thank you. πŸ™‚

https://www.researchgate.net/profile/Lech-Swirski-2/publication/264658852_A_fully-automatic_temporal_approach_to_single_camera_glint-free_3D_eye_model_fitting/links/53ea3dbf0cf28f342f418dfe/A-fully-automatic-temporal-approach-to-single-camera-glint-free-3D-eye-model-fitting.pdf

nmt 24 March, 2023, 16:39:46

Hi @user-736bf7 πŸ‘‹. Go ahead and use the eye tracker like you would with both eye cameras, i.e. set the camera to get good pupil detection, fit the model, and do a calibration choreography, as per the getting started guide: https://docs.pupil-labs.com/core/#_3-check-pupil-detection

The system will default to a monocular calibration if only one eye camera is connected.

user-1ba94f 24 March, 2023, 13:57:08

Hi, I'm trying to write a simple C++ program to open the camera and capture some images using the Core. I can open the camera and get images using the Pupil Labs version of libuvc, however it seems like the IR emitters are not turning on. When I run Pupil Capture, or use guvcview, the camera displays the eye clearly. Through some testing, it appears that with my code the IR LEDs aren't turning on. I tried looking through the Pupil documentation, but I couldn't find anything about controlling them. This problem exists on both my VM running Ubuntu 22.04.2 and the current Raspbian release on a Pi 4. Any help is appreciated. This may also pertain to the software-dev channel, so I can move it if need be. I have some images of the camera outputs in each situation if that would be helpful.

nmt 24 March, 2023, 16:43:34

Hi @user-1ba94f. The cameras are UVC compliant, and the IR illuminators receive power when connected to USB. It sounds like you might need to adjust the brightness setting in UVC settings, which is exposed in libuvc.

user-956845 24 March, 2023, 17:20:03

Hi, I am trying to use the gaze position to plot a heatmap in Python. I understand that the origin of the position should be the center of the marked surface, but I am confused about the unit of the position. For example, I had a gaze position of (1.2, 0.7); what is the unit of this "1.2"? Thank you.
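For what it's worth, assuming the data came from Pupil Core's surface tracker, surface-mapped gaze is normalized to the surface itself (unitless, 0 to 1 inside the surface), so a value like 1.2 lies beyond the surface edge. A small sketch for converting such coordinates to image pixels for a heatmap (the function name and the top-left image convention are illustrative assumptions):

```python
def surface_norm_to_pixels(nx, ny, width_px, height_px):
    """Map surface-normalized coords (origin bottom-left) to image
    pixel coords (origin top-left), e.g. for indexing a heatmap array."""
    x = nx * width_px
    y = (1.0 - ny) * height_px   # flip the y-axis
    return x, y

# A point at the centre of a 1920x1080 surface image:
center = surface_norm_to_pixels(0.5, 0.5, 1920, 1080)  # (960.0, 540.0)
```

Points outside the [0, 1] range map outside the image and can simply be discarded before binning.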

nmt 27 March, 2023, 11:35:29

Hi @user-956845 πŸ‘‹. Did you obtain surface mapped gaze coordinates through Pupil Core software, or through Pupil Cloud?

user-1ba94f 24 March, 2023, 17:20:03

Thanks for the follow-up. I have tried using the uvc_set and uvc_get brightness functions, but I get segmentation faults or pipe errors, respectively. I have been trying to understand what a pipe error is in this context. I know there is the detach_kernel_driver flag; I'm not sure what value it should be set to though.

user-eeecc7 25 March, 2023, 02:19:43

Hi @user-53a8c4, How would we calculate angular accuracy in the 2D calibration case? I have apriltag markers because of which I have depth info for each world frame as well.
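Not an official method, but one way to sketch the computation yourself: if the AprilTag depth information lets you reconstruct 3D direction vectors from the eye to the estimated gaze point and to the known target, the angular accuracy for a sample is just the angle between those two vectors (function name hypothetical):

```python
import math

def angular_error_deg(gaze_vec, target_vec):
    """Angle in degrees between two 3D direction vectors,
    e.g. eye-to-gaze-point and eye-to-target."""
    dot = sum(g * t for g, t in zip(gaze_vec, target_vec))
    norm = (math.sqrt(sum(g * g for g in gaze_vec))
            * math.sqrt(sum(t * t for t in target_vec)))
    # Clamp for numerical safety before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```

Averaging this over fixation samples on validation targets gives an accuracy estimate in degrees of visual angle, which is also what Pupil Capture's validation plugin reports.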

nmt 27 March, 2023, 11:38:07

Hi @user-eeecc7! Recommend using our validation plugin in Pupil Capture. It provides accuracy in degrees of visual angle even when a 2d calibration was performed.

user-99bb6b 25 March, 2023, 17:28:24

Has anyone tried to replace the world camera with a Go pro?

user-53a74a 26 March, 2023, 14:53:40

Hello, we have been experiencing an issue with Pupil Core + Pupil Capture where it won't lock onto participants' pupils; instead, the pupil ROI keeps jumping across the eye camera window. Resetting Pupil Capture settings did not help at all. This issue happened only when the participants had light iris colors, such as light blue. Has anyone faced similar problems before? Does Pupil Capture have difficulty detecting pupils with certain iris colors?

user-c2d375 27 March, 2023, 07:49:44

Hi @user-53a74a πŸ‘‹ Please try to manually change the exposure settings to improve the contrast between the pupil and the iris (https://drive.google.com/file/d/1SPwxL8iGRPJe8BFDBfzWWtvzA8UdqM6E/view) and set the ROI to only include the eye region, excluding any dark areas that are not your pupil (https://drive.google.com/file/d/1tr1KQ7QFmFUZQjN9aYtSzpMcaybRnuqi/view)

user-df10a5 27 March, 2023, 07:07:15

Hello, where can I get information about NeonNet?

user-66797d 27 March, 2023, 15:02:25

I'm curious, what are some of the best eye tracking analysis software packages out there for Pupil Labs data that you all are using?

user-ae76c9 27 March, 2023, 16:28:08

Hi team, we are using live calibration with the Core, however sometimes one eye has pretty good pupil detection while the other has poor detection, and this throws off the gaze estimation. Is it possible to only take the pupil detection for one eye into account when calculating gaze estimation? Thank you!

user-ae76c9 30 March, 2023, 17:09:47

Bumping this, thanks!

user-6e1219 27 March, 2023, 18:06:49

Hello, I have designed an experiment where I integrated Pupil Core with PsychoPy and recorded monocular data. Now, when analyzing the HDF5 file, I saw that there are no zero values for blinks in the pupil column, but zeros are present in the Pupil Labs generated file. Is there any reason for this?

nmt 28 March, 2023, 06:46:27

Hi @user-6e1219. Blinks aren't currently streamed over the LSL relay: https://github.com/labstreaminglayer/App-PupilLabs/tree/master/pupil_capture#lsl-outlet

user-a11557 28 March, 2023, 00:12:14

Hi, I'm trying to implement this code for a 9-point calibration system, but it relies on some plugins I can't find where to download. Are any of you familiar with them? They are the following:

from calibration_choreography.screen_marker_plugin import ScreenMarkerChoreographyPlugin
from calibration_choreography.base_plugin import ChoreographyMode

I found them in this link: https://gist.github.com/papr/339dcb08caef45d3798a68aa4e619269

nmt 28 March, 2023, 06:48:36

Note that this is a Plugin. It's designed to be copy-pasted into Pupil Capture's plugin folder: Home directory -> pupil_capture_settings -> plugins. The modules that it tries to import are a part of Pupil Core software.

user-8179ec 29 March, 2023, 02:45:43

Hi. Can Pupil Core calibrate for nystagmus?

user-6cf287 29 March, 2023, 15:39:42

Hi team, after recording, I noticed that sometimes the offline_data folder is either not created or only contains a few token files. Will this cause an issue when replaying the files in Pupil Player? I tried loading a recording directory without offline_data at all and it works, but when I loaded a recording of about 3 GB with some token files in the offline_data folder, Pupil Player stopped responding.

user-d407c1 30 March, 2023, 13:41:36

Hi @user-6cf287 ! The offline_data folder contains data that is used to speed up the loading of the recording in Pupil Player. If the folder is not created or only has limited token files, it may take longer for the recording to load in Pupil Player, but it should not cause any issues with playback. However, if you are experiencing issues with Pupil Player not responding when loading a recording with a large offline_data folder, it may be due to the size of the folder or the number of token files. You can try deleting the offline_data folder and see if that resolves the issue.
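The "delete the offline_data folder" step could be scripted like this (a minimal sketch; the function name is mine, and the demo directory just stands in for a real recording folder):

```python
import shutil
import tempfile
from pathlib import Path

def clear_offline_data(recording_dir):
    """Remove the offline_data cache from a Pupil recording directory
    so Pupil Player regenerates it on the next load."""
    cache = Path(recording_dir) / "offline_data"
    if cache.is_dir():
        shutil.rmtree(cache)
        return True
    return False

# Demo on a throwaway directory standing in for a recording folder:
demo_dir = Path(tempfile.mkdtemp())
(demo_dir / "offline_data").mkdir()
removed = clear_offline_data(demo_dir)
```

Only the cache is touched; the raw recording files (video, pupil and gaze data) stay intact, so this is safe to run before reopening the recording in Pupil Player.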

user-d407c1 30 March, 2023, 13:35:42

@user-a11557 if you place it in the folder, it should be available to you in the calibration menu. Unlike other plugins, this one does not need to be enabled in the plugin menu.

Chat image

user-4e60e1 30 March, 2023, 22:07:13

Hello, I am using Pupil Labs Core with PsychoPy. I have done everything according to the instructions, but PsychoPy keeps crashing after the validation step and shows me this error

user-480f4c 03 April, 2023, 08:56:40

Hi @user-4e60e1 πŸ‘‹ . It seems that some of the errors are not directly related to Pupil Core (e.g., ERROR Support for the `sounddevice` audio backend is not available this session. Please install `psychopy-sounddevice` and restart the session to enable support). What version of Pupil Capture are you running? Could you try connecting the Core headset to your computer and opening Pupil Capture without running your PsychoPy pipeline? This will help us understand if there are errors related to the Core pipeline per se or if the problem appears when running it with PsychoPy.

user-4e60e1 30 March, 2023, 22:08:47

my psychopy trial structure is shown below

user-4e60e1 30 March, 2023, 22:09:13

Chat image

user-774b94 31 March, 2023, 08:17:34

Hello, could you please assist me with an inquiry? I have recently attempted to use a DIY eye tracker with Microsoft HD-6000 and Logitech C525 cameras. Unfortunately, when attempting to run the pupil-3.5, I encountered issues as it was unable to detect the cameras. Despite trying to run the program with administrator privileges, the problem persisted. I can confirm that both cameras are displayed in the system's camera devices under the camera section, and my computer is running on the Windows operating system. Would you be able to provide me with any suggestions on how I can rectify this situation and enable the pupil-3.5 to detect the cameras? Thank you in advance for your assistance.

user-b75056 24 April, 2023, 10:22:59

Hi, have you solved your problem? I encountered the same problem as you πŸ™‚

user-774b94 31 March, 2023, 08:37:32

Chat image

user-c5ca5f 31 March, 2023, 23:29:10

Hi, I am using the Pupil Mobile app version 0.37.0 on Android 11 on a Google Pixel 2 phone. I am trying to find a compatible version of Pupil Capture. What would you recommend? Also, what are the steps to connect the app to Pupil Capture?

user-d407c1 03 April, 2023, 09:05:12

Hi @user-c5ca5f, the latest version of Pupil Capture still works with Pupil Mobile, although kindly note that the app is no longer maintained and is not available on the Google Play Store.

Existing Pupil Mobile bundles will still work, but we are no longer developing or maintaining the Pupil Mobile application or selling the Pupil Mobile bundle (we will not be able to assist if there are issues with that app).

We recommend a small form-factor tablet-style PC to make Pupil Core more portable.

Regarding the steps: you only need to connect Pupil Core to your phone and start the application while connected to the same local network as your laptop.

In your Pupil Capture instance, enable the Network API plugin if it is not already enabled, go to camera sources, enable manual selection, and the camera feeds should show up there.

End of March archive