💻 software-dev


user-fce73e 01 September, 2025, 05:43:40

Hello, I am trying to run the Pupil Labs Neon on a desktop to eliminate the delay when transferring data via the API from a smartphone. When I run the software from the Pupil GitHub repository, the Scene camera connects successfully, but the Eye cameras do not. When I click on the 'Neon sensor module' directly, an error message appears, stating it is "already used or blocked" (as shown in the attached picture). How can I solve this issue? Thank you.

Chat image

nmt 01 September, 2025, 09:12:00

Hi @user-fce73e. Neon is not compatible with the Capture software. It is possible to run it from source on Linux and Mac, although this is more experimental and you cannot use Neon's native gaze pipeline. If you need low-latency API access, you can connect via a USB hub and Ethernet: https://docs.pupil-labs.com/neon/hardware/using-a-usb-hub/#using-a-usb-hub
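
For reference, here is a minimal sketch of pulling gaze over the network with the pupil-labs-realtime-api package's simple interface. The IP address and port below are placeholders for the phone's address on the Ethernet link; adjust them for your setup.

```python
# Minimal sketch, assuming the pupil-labs-realtime-api package
# (pip install pupil-labs-realtime-api) and that the phone is reachable at a
# known IP over the Ethernet link -- the address below is a placeholder.
from pupil_labs.realtime_api.simple import Device

device = Device(address="192.168.1.20", port="8080")  # placeholder address
print(f"Connected to {device.phone_name}")

try:
    while True:
        gaze = device.receive_gaze_datum()  # blocks until the next gaze sample arrives
        print(gaze.x, gaze.y, gaze.timestamp_unix_seconds)
except KeyboardInterrupt:
    device.close()
```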

user-23177e 01 September, 2025, 12:55:27

great, thank you for the update.

user-ffc425 04 September, 2025, 17:50:04

Hi, I had a question about post-hoc eye tracking. I just made a recording with the Pupil Core camera in the 120 FPS (400, 400) resolution mode. Before I made this recording, I checked the lighting in Pupil Capture and noticed the pupil tracking there was perfect. After making the recording and reconstructing the video as an .avi file, I fed it into a Python script that I have written, using the Pupil Labs pupil-detectors library to extract the pupil from this playable video. Here, I found the performance much worse than what I saw live. I was wondering if anyone had any thoughts on how to improve this? I'm also open to improving the conditions of my image as well. Here is a sample image with poor pupil detection.

Chat image

user-f43a29 08 September, 2025, 14:59:43

Hi @user-ffc425 , if you were using the script that was previously linked (https://discord.com/channels/285728493612957698/446977689690177536/1408369985331662878), you might need to modify the parameters when you apply the 3D pupil detector. I would recommend referencing Pupil Capture's detector plugin code to see how it is done there.
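
For illustration, here is a rough sketch of running the 2D detector over an exported eye video with OpenCV and the pupil-detectors package. The property names are the ones used by Pupil Capture's detector plugins, but the values are only starting points to tune for your footage; the 3D stage (pye3d) then consumes these 2D results.

```python
# Hedged sketch: post-hoc 2D pupil detection on an exported eye video using
# OpenCV and the pupil-detectors package (pip install pupil-detectors).
import cv2
from pupil_detectors import Detector2D

detector = Detector2D()
# Example property tweaks -- run detector.get_properties() first to see the
# full set of keys and their defaults; the values here are just starting points.
detector.update_properties({
    "pupil_size_min": 10,
    "pupil_size_max": 150,
    "intensity_range": 23,
})

cap = cv2.VideoCapture("eye_recording.avi")  # placeholder path to your .avi
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # detector expects an 8-bit gray image
    result = detector.detect(gray)
    # result is a dict with keys like "ellipse" (center, axes, angle) and "confidence"
    if result["confidence"] > 0.6:
        cx, cy = result["ellipse"]["center"]
        print(f"pupil at ({cx:.1f}, {cy:.1f}), confidence {result['confidence']:.2f}")
cap.release()
```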

user-3bcb3f 09 September, 2025, 05:53:06

Hi @user-cdcab0 Can Pupil Player process additional LSL streams that we recorded alongside Pupil Capture? I have an extra event marker stream coming from MATLAB that I recorded simultaneously with the pupil and gaze data. I want to plot it over the gaze data and use it to parse the gaze data into trials or epochs. We would also like to process this data along with EEG we have collected. Can the Pupil recordings be integrated into Brainstorm or EEGLAB?

nmt 09 September, 2025, 09:37:54

Pupil Player does not have built-in functionality to process the LSL streams. However, it does have an open-source plugin API, so in principle you could write a plugin that generates the plots you describe: https://docs.pupil-labs.com/neon/neon-player/plugin-api/#plugin-api. Pupil recordings are not compatible with Brainstorm or EEGLAB.
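
As an offline alternative to writing a plugin: if the gaze stream (via the Pupil LSL relay) and the MATLAB marker stream were recorded into a single XDF file, you can epoch the gaze around the markers with pyxdf. A rough sketch, with stream names and channel indices that are assumptions you would need to check against your own file:

```python
# Hedged sketch: offline epoching of LSL-recorded gaze around MATLAB event
# markers with pyxdf (pip install pyxdf). The stream names ("pupil_capture",
# "MatlabMarkers") and the gaze channel indices are assumptions -- check them
# against your own XDF file's stream/channel metadata.
import numpy as np
import pyxdf

streams, _ = pyxdf.load_xdf("recording.xdf")  # placeholder path

by_name = {s["info"]["name"][0]: s for s in streams}
gaze = by_name["pupil_capture"]       # gaze stream from the Pupil LSL relay
markers = by_name["MatlabMarkers"]    # event marker stream from MATLAB

gaze_t = np.asarray(gaze["time_stamps"])           # LSL clock, seconds
gaze_xy = np.asarray(gaze["time_series"])[:, 1:3]  # assumed: norm_pos_x, norm_pos_y channels

# Cut a window around each marker (here -0.2 s to +1.0 s) to form trials.
epochs = []
for t0, label in zip(markers["time_stamps"], markers["time_series"]):
    mask = (gaze_t >= t0 - 0.2) & (gaze_t <= t0 + 1.0)
    epochs.append({"label": label[0], "t": gaze_t[mask] - t0, "xy": gaze_xy[mask]})

print(f"built {len(epochs)} epochs")
```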

user-3bcb3f 16 September, 2025, 15:24:12

Hi @nmt @user-cdcab0 We wanted to reduce the size of the pink dot with the yellow border that shows fixations and eye position on the Pupil Capture screen, so that we can see more precisely where the participant is gazing. How can we do that?

nmt 18 September, 2025, 01:51:30

You can refer to this section of the docs for instructions on modifying the overlay visual properties: https://docs.pupil-labs.com/core/software/pupil-player/#visualization-plugins

user-a4aa71 18 September, 2025, 09:10:31

Hi everyone, I ran a test using the real-time-blink-detector with a Neon recording (https://github.com/pupil-labs/real-time-blink-detection) and noticed that, besides taking quite a long time to analyze the recording, it also detected many more blinks than Neon Player reported. My question is: is the blink detection algorithm used in Neon Player the same as the one used in the real-time-blink-detector? Thanks a lot!

user-cdcab0 18 September, 2025, 09:14:03

No, they're different, and neither of those is what we recommend for blink detection with Neon. Instead, the Neon Companion app can now detect blinks in real time and save them within the recording.
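
If you go the Companion-app route, the detected blinks end up in the recording's blinks.csv export (Pupil Cloud / Neon Player Timeseries data). A small sketch of summarizing them with pandas; the column name is taken from the documented export format, so verify it against your file.

```python
# Hedged sketch: summarizing Companion-app blink detections from an exported
# blinks.csv (Pupil Cloud / Neon Player Timeseries export). Column names are
# assumed from the documented export format -- verify them against your file.
import pandas as pd

blinks = pd.read_csv("blinks.csv")  # placeholder path

print(f"{len(blinks)} blinks, mean duration {blinks['duration [ms]'].mean():.0f} ms")
```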

user-a4aa71 23 September, 2025, 09:06:27

Hello! I have another question about this. In the gaze.csv file, the eighth column is the blinkID (which corresponds to the one in the blinks.csv file), but at those indices there are also gaze_x and gaze_y values in pixels. Are those values generated through some kind of interpolation? Maybe something done by the AI algorithm? With the Pupil Core, I found it useful to have the confidence value. I had considered using the eyelid aperture measurement, but in all my recordings this information is NaN. Many thanks for your help!

user-cdcab0 23 September, 2025, 09:12:46

The gaze estimator will always produce a gaze point when provided with an image, regardless of the state of the eye, the eyelid, or even the presence of an eye in the eye image. So yes, even during a blink, the gaze estimator will still make its best estimate of a gaze point. Depending on your research, you may want to simply filter these gaze points out.
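
For example, a minimal filtering sketch using the "blink id" column of the exported gaze.csv (empty outside blinks), matching the column you described:

```python
# Hedged sketch: dropping gaze samples that fall inside detected blinks using
# the exported gaze.csv's "blink id" column (empty/NaN outside of blinks).
# Column name assumed from the export format -- verify against your file.
import pandas as pd

gaze = pd.read_csv("gaze.csv")  # placeholder path

clean = gaze[gaze["blink id"].isna()].copy()  # keep only samples outside blinks
print(f"kept {len(clean)} of {len(gaze)} gaze samples")
```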

End of September archive