πŸ“― announcements


wrp 07 October, 2025, 10:19:58

@everyone Updates for Pupil Cloud

We’ve rolled out a big update to the Video Renderer and introduced a new drawing tool in the AOI Editor.

You can now:
- Customize gaze (circle or crosshair) and fixation visualization styles
- Add synchronized eye video overlays (adjust position and transparency of the overlay)
- Draw rectangular AOIs

Check out the changes in Cloud & read the full release notes.
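Conceptually, a rectangular AOI is just a bounding box in scene-camera pixel coordinates. A minimal sketch of checking gaze samples against one (illustrative only, not Cloud's implementation; the class and field names are made up):

```python
from dataclasses import dataclass

@dataclass
class RectAOI:
    """A rectangular Area of Interest in scene-camera pixel coordinates."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, gaze_x: float, gaze_y: float) -> bool:
        # A gaze sample is "on" the AOI if it falls inside the box (inclusive).
        return self.x_min <= gaze_x <= self.x_max and self.y_min <= gaze_y <= self.y_max

# Count how many gaze samples land inside the AOI
aoi = RectAOI(100, 100, 400, 300)
gaze_samples = [(150, 200), (500, 250), (399, 299)]
hits = sum(aoi.contains(x, y) for x, y in gaze_samples)
print(hits)  # 2
```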

wrp 21 October, 2025, 04:18:55

@everyone New Research Digest πŸ“°

What does the expert eye see during ultrasound-guided embryo transfer?

Researchers at the Laboratoire Traitement du Signal et de l'Image (LTSI) at the University of Rennes (France) combined Neon eye tracking glasses with a high-fidelity simulator to capture how specialists visually navigate this delicate procedure.

Their findings reveal distinct gaze patterns and cognitive strategies that differentiate experts from novices β€” a breakthrough that could transform medical training and improve IVF success rates.

Discover the full study and how eye tracking is reshaping reproductive medicine: https://pupil-labs.com/blog/unveiling-the-expert-eye-how-eye-tracking-is-revolutionizing-ultrasound-guided-embryo-transfer

Video credit: Josselin Gautier

wrp 24 October, 2025, 08:43:48

@everyone Real-time Python Client Update

Now supports streaming live audio from Neon! Play synced gaze + video + audio, analyze, or transcribe (STT) in real-time.

Start listening. Update now: pip install -U pupil-labs-realtime-api

See more: https://pupil-labs.github.io/pl-realtime-api/dev/methods/simple/streaming/audio/

wrp 29 October, 2025, 07:17:39

@everyone New Alpha Lab

What if you could automatically track gaze on any moving object?

Our new Alpha Lab tutorial makes it possible, introducing a powerful workflow that combines our Neon eye tracker with the Segment Anything Model 2 (SAM2).

The process is simple: Click. Segment. Track.

Define a dynamic Area of Interest with a single click, and the tool automatically follows it throughout your recording, mapping your gaze to the moving object. No more tedious manual coding or being limited by predefined categories.

It’s a faster, more flexible way to study gaze in sports, classrooms, or any dynamic, real-world environment.

All powered by a user-friendly Gradio app in Google Colab. No coding required!

Follow along here: https://docs.pupil-labs.com/alpha-lab/dynamic-aoi-sam2/
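Under the hood, mapping gaze onto a segmented object reduces to a per-frame mask lookup: given the object's segmentation mask for a frame, check whether the gaze pixel falls inside it. A minimal sketch (illustrative only, not the tutorial's code), assuming one boolean mask per frame:

```python
import numpy as np

def gaze_on_mask(mask: np.ndarray, gaze_xy: tuple[int, int]) -> bool:
    """Return True if the gaze point falls on the segmented object.

    mask: boolean array of shape (height, width), True where the object is.
    gaze_xy: gaze position in pixel coordinates (x, y).
    """
    x, y = gaze_xy
    h, w = mask.shape
    if not (0 <= x < w and 0 <= y < h):
        return False  # gaze landed outside the frame
    return bool(mask[y, x])  # row-major indexing: mask[row, col]

# Toy 6x8 mask with a 2x3 "object" in the middle
mask = np.zeros((6, 8), dtype=bool)
mask[2:4, 3:6] = True
print(gaze_on_mask(mask, (4, 2)))  # True: gaze is on the object
print(gaze_on_mask(mask, (0, 0)))  # False: gaze is on the background
```

Repeating this lookup for every frame's mask and gaze sample yields a per-frame "on AOI" signal for the moving object.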

wrp 31 October, 2025, 02:54:54

@everyone Updates for Pupil Cloud

Scene video brightness
Scene video too dark or too bright? You can now adjust playback brightness in Cloud. Settings are saved per recording. The Video Renderer visualization uses the same brightness setting, so rendered videos will match your playback adjustments.
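Brightness adjustment of this kind typically amounts to scaling pixel intensities and clipping to the valid range. A simple gain-and-clip sketch (not Cloud's actual algorithm):

```python
import numpy as np

def adjust_brightness(frame: np.ndarray, gain: float) -> np.ndarray:
    """Scale pixel intensities by `gain` and clip to the valid 8-bit range.

    frame: uint8 image array; gain > 1 brightens, gain < 1 darkens.
    """
    out = frame.astype(np.float32) * gain
    return np.clip(out, 0, 255).astype(np.uint8)

dark = np.full((2, 2), 60, dtype=np.uint8)
print(adjust_brightness(dark, 1.5))  # every pixel becomes 90
```

Applying the same gain at playback time and at render time is what keeps the rendered video consistent with what you see in the player.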

Additional recording info
You can now also find Frame, Frame name, and Device Serial information in the recording information modal. Tip: press [i] with a recording selected to open the recording info modal.

End of October archive