πŸ’» software-dev

user-fce73e 01 September, 2025, 05:43:40

Hello, I am trying to run the Pupil Labs Neon on a desktop to eliminate the delay when transferring data via the API from a smartphone. When I run the software from the Pupil GitHub repository, the Scene camera connects successfully, but the Eye cameras do not. When I click on the 'Neon sensor module' directly, an error message appears, stating it is "already used or blocked" (as shown in the attached picture). How can I solve this issue? Thank you.

Chat image

user-4c21e5 01 September, 2025, 09:12:00

Hi @user-fce73e. Neon is not compatible with the Capture software. It is possible to run from source on Linux and Mac, although this is more experimental and you cannot use Neon's native gaze pipeline. If you need low-latency API access, you can use a USB hub with Ethernet: https://docs.pupil-labs.com/neon/hardware/using-a-usb-hub/#using-a-usb-hub

user-23177e 01 September, 2025, 12:55:27

great, thank you for the update.

user-ffc425 04 September, 2025, 17:50:04

Hi, I had a question about post-hoc eye tracking. I just made a recording with the Pupil Core camera in the 120 FPS (400, 400) resolution mode. Before making this recording, I checked the lighting in Pupil Capture and noticed the pupil tracking there was perfect. After making the recording and reconstructing the video as an .avi file, I fed it into a Python script I wrote that uses the Pupil Labs detectors library to extract the pupil from this playable video. Here, I found the performance much worse than what I saw live. I was wondering if anyone had any thoughts on how to improve this? I'm also open to improving the conditions of my image. Here is a sample image with poor pupil detection:

Chat image

user-f43a29 08 September, 2025, 14:59:43

Hi @user-ffc425 , if you were using the script that was previously linked (https://discord.com/channels/285728493612957698/446977689690177536/1408369985331662878), you might need to modify the parameters when you apply the 3D pupil detector. I would recommend referencing Pupil Capture's detector plugin code to see how it is done there.

user-3bcb3f 09 September, 2025, 05:53:06

Hi @user-cdcab0 Can we process, in Pupil Player, additional LSL streams we have recorded using Pupil Capture? I have an extra event-marker stream coming from MATLAB that I recorded simultaneously with the pupil and gaze data. I want to plot it over the gaze data and use it to parse the gaze data into trials or epochs. Additionally, we would also like to process this data along with the EEG we have collected. Can the Pupil recordings be integrated into Brainstorm or EEGLAB?

user-4c21e5 09 September, 2025, 09:37:54

Pupil Player does not have built-in functionality to process the LSL streams. However, we do have an open-source plugin API, so in principle you could write a plugin that generates the plots you describe: https://docs.pupil-labs.com/neon/neon-player/plugin-api/#plugin-api Pupil recordings are not compatible with Brainstorm or EEGLAB.

user-3bcb3f 10 September, 2025, 07:26:34

Hi, I am not looking for real-time access to the stream; we already get that through the lsl_rec plugin. I would like to sync it offline with the other default Pupil Capture streams (gaze_positions, pupil_positions) after recording, but I am not able to match the timepoints exactly. I first export gaze positions using Pupil Player and then compare the two streams. I am sharing here the exported gaze position .csv (https://1drv.ms/f/c/FB1DFB83EAA92E53/Ep3zdOxuS19MuNvu9ocNeh8BYFJkdRoRy-d-75M8IeAv8A?e=a0oPcL) and my task marker stream .csv (https://1drv.ms/x/c/FB1DFB83EAA92E53/Eb0DZ0eTdbpIp-NIEFfFyyYBRE8inqkOI-61MWg1psYnNA?e=hKd2uN)

user-4c21e5 10 September, 2025, 08:48:24

Indeed, I was referring to post-hoc analysis; Pupil Player has no real-time capabilities. In the case of the Pupil Capture LSL recorder plugin which you have used, the saved timestamps belonging to other sensors are already synced and converted to Pupil time. However, there are unlikely to be one-to-one matches across sensors, because the sensors do not sample exactly in step with each other. Thus, you would simply need to find the closest match for each timestamp.
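The closest-match step above can be sketched with a binary search over the sorted gaze timestamps. A minimal standard-library sketch, assuming both streams are already in Pupil time (as the plugin guarantees) and that you have loaded the timestamp columns from your two CSVs into plain lists:

```python
import bisect

def nearest_indices(gaze_ts, marker_ts):
    """For each marker timestamp, return the index of the closest
    gaze timestamp. gaze_ts must be sorted in ascending order."""
    out = []
    for t in marker_ts:
        i = bisect.bisect_left(gaze_ts, t)
        if i == 0:
            out.append(0)
        elif i == len(gaze_ts):
            out.append(len(gaze_ts) - 1)
        else:
            # candidates: the sample just before and just after t
            before, after = gaze_ts[i - 1], gaze_ts[i]
            out.append(i if after - t < t - before else i - 1)
    return out

# Toy example: gaze sampled every 8 ms, two event markers.
gaze_ts = [0.0, 0.008, 0.016, 0.024, 0.032]
marker_ts = [0.009, 0.031]
print(nearest_indices(gaze_ts, marker_ts))  # -> [1, 4]
```

You can then use the returned indices to slice the gaze export into epochs around each marker.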

user-3bcb3f 16 September, 2025, 15:24:12

Hi @user-4c21e5 @user-cdcab0 We want to reduce the size of the pink dot with the yellow border that shows fixations and eye position on the Pupil Capture screen, so that we can see more precisely where the participant is gazing. How can we do that?

user-4c21e5 18 September, 2025, 01:51:30

You can refer to this section of the docs for instructions on modifying the overlay visual properties: https://docs.pupil-labs.com/core/software/pupil-player/#visualization-plugins

user-3bcb3f 26 September, 2025, 07:55:10

Hi @user-4c21e5 The link you provided is for changing the visual properties in Pupil Player. We are looking to change them in Pupil Capture itself, while recording. Is that possible?

user-a4aa71 18 September, 2025, 09:10:31

Hi everyone, I ran a test using the real-time-blink-detector with a Neon recording (https://github.com/pupil-labs/real-time-blink-detection) and noticed that, besides taking quite a long time to finish analyzing the recording, it also detected many more blinks than what Neon Player reported. My question is: is the blink detection algorithm used in Neon Player the same as the one used in the real-time-blink-detector? Thanks a lot!

user-cdcab0 18 September, 2025, 09:14:03

They are indeed different, but neither of those is what we recommend for blink detection with Neon. Instead, the Neon Companion app can now detect blinks in real time and save them within the recording.

user-a4aa71 18 September, 2025, 09:31:45

Ah, I see! Thanks a lot for the clarification! So, is it necessary to upgrade the Neon Companion app on the smartphone? It doesn’t show up as available for me… Or should I instead upgrade the Neon Player? Thanks!

user-cdcab0 18 September, 2025, 10:43:48

If you're connected to the internet and the Google Play store doesn't show any app updates, then you're already on the latest. You'll want to make sure that "Compute eye state" is enabled within the app settings.

You'll then want to load these recordings with the latest version of Neon Player

user-a4aa71 23 September, 2025, 09:06:27

Hello! I have another question about this. In the gaze.csv file, the eighth column is the blinkID (which corresponds to the one in the blinks.csv file), but at those indices there are also gaze_x and gaze_y values in pixels. Are those values generated through some kind of interpolation? Maybe something done by the AI algorithm? With the Pupil Core, I found it useful to have the confidence value. I had considered using the eyelid aperture measurement, but in all my recordings this information is NaN. Many thanks for your help!

user-cdcab0 23 September, 2025, 09:12:46

The gaze estimator will always produce a gaze point when provided with an image, regardless of the state of the eye, the eyelid, or even the presence of an eye in the eye image. So yes, even during a blink, the gaze estimator will still make its best estimate of a gaze point. Depending on your research, you may want to simply filter these gaze points out.
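One way to do that filtering is to drop every gaze row whose blink-ID column is populated. A minimal standard-library sketch; the column name `"blink id"` follows the gaze.csv export described above, but verify it against the header of your own file:

```python
import csv

def drop_blink_samples(rows, blink_col="blink id"):
    """Keep only gaze rows recorded outside of blinks, i.e. rows
    where the blink ID column is empty."""
    return [r for r in rows if not r.get(blink_col, "").strip()]

# Usage with an exported gaze.csv:
# with open("gaze.csv", newline="") as f:
#     clean = drop_blink_samples(list(csv.DictReader(f)))

# Toy example: the middle sample falls inside blink 7.
rows = [
    {"timestamp [ns]": "1", "gaze x [px]": "512.3", "blink id": ""},
    {"timestamp [ns]": "2", "gaze x [px]": "509.9", "blink id": "7"},
    {"timestamp [ns]": "3", "gaze x [px]": "511.1", "blink id": ""},
]
clean = drop_blink_samples(rows)
print(len(clean))  # -> 2
```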

user-a4aa71 23 September, 2025, 09:41:21

Ok, got it, thanks a lot! Do you think there’s any reason why the 3d_eye_states.csv file has the last six columns (eyelid angle top left [rad], eyelid angle bottom left [rad], eyelid aperture left [mm], eyelid angle top right [rad], eyelid angle bottom right [rad], eyelid aperture right [mm]) all set to NaN? I downloaded the data from Pupil Cloud

user-cdcab0 23 September, 2025, 09:59:50

That sounds like you may want to open a ticket in πŸ›Ÿ troubleshooting

user-d407c1 26 September, 2025, 08:10:20

Hi @user-3bcb3f ! For gaze visualisation you can drop this into the plugins folder, enable it, and choose how you want the circle to be displayed.

custom_display_recent_gaze.py

user-3bcb3f 26 September, 2025, 08:30:12

Hi, I put this .py file into the plugins folder, but the plugin manager does not show any options to manipulate the circle in Pupil Capture.

user-d407c1 26 September, 2025, 08:43:18

If you placed it correctly, restart Pupil Capture. On the plugin side panel you should see a new entry called Display Recent Gaze.

Enable it, and at the bottom you’ll get a new plugin panel with options to adjust color, size, and alpha value of the visualised gaze.

user-3bcb3f 26 September, 2025, 08:52:54

Thank you very much. Got it working πŸ™‚

End of September archive