πŸ“― announcements



wrp 01 November, 2024, 04:26:08

@everyone πŸ“£ Neon + PsychoPy πŸ‘€πŸ§ 

Integrating Neon with PsychoPy is now more streamlined and user-friendly, thanks to our PsychoPy Plugin!

The plugin, easily installed through PsychoPy Builder, enables the use of high-quality gaze and pupillometry data in your PsychoPy experiments.

Features

Gaze-Contingency: Map and stream gaze to screen-based coordinates in real time 🎏

Pupillometry: Record and stream pupil size in mm and eye state metrics; build cognitive load experiments πŸ‘οΈβ€πŸ—¨οΈ

PsychoPy and Neon-native Recordings: Save data in both PsychoPy’s HDF5 format and Neon’s native format; benefit from Pupil Labs' data logistics, including secure cloud backups πŸ’½β˜οΈ

Check the docs for more: https://docs.pupil-labs.com/neon/data-collection/psychopy
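If you want to sanity-check the live gaze stream outside of Builder, the real-time API Python client can be used directly. A minimal sketch, assuming the pupil-labs-realtime-api package and a Neon Companion device on the same network (attribute names may differ slightly between client versions):

```python
# Minimal sketch: stream live gaze from Neon with the Pupil Labs real-time API
# Python client (pip install pupil-labs-realtime-api). The PsychoPy plugin wraps
# this kind of access inside Builder; this standalone loop is just for illustration.
from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device()  # find a Neon Companion device on the local network
try:
    for _ in range(100):
        gaze = device.receive_gaze_datum()  # blocks until the next gaze sample arrives
        print(f"gaze at scene-camera pixel ({gaze.x:.1f}, {gaze.y:.1f})")
finally:
    device.close()
```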

wrp 18 November, 2024, 07:29:00

@everyone πŸ“£ New Alpha Lab: Neon + OpenAI πŸ“£

What if you could automatically pinpoint key behaviors and gaze interactions within hours of eye tracking recordings?

Our proof of concept uses OpenAI’s GPT-4o model for advanced scene understanding and streams events directly to recordings in Pupil Cloud via the API. Our explorations show great promise - imagine applying this to 10+ (or 100+) hours of eye tracking recordings.

Check it out: https://docs.pupil-labs.com/alpha-lab/event-automation-gpt/
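To give a sense of the mechanics, here is a rough sketch of the core step: sending a single scene-camera frame to GPT-4o and asking for a behavior label. The prompt, frame path, and the event-posting step the article adds on top are placeholders here, not the guide's exact pipeline:

```python
# Rough sketch: ask GPT-4o to label what the wearer is doing in one scene-camera frame.
import base64
from openai import OpenAI  # pip install openai

client = OpenAI()  # expects OPENAI_API_KEY in the environment

with open("scene_frame.jpg", "rb") as f:  # hypothetical path to an exported frame
    frame_b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "In one short phrase, what is the wearer looking at or doing?"},
            {"type": "image_url", "image_url": {"url": f"data:image/jpeg;base64,{frame_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)  # e.g. a label you could then post as a Pupil Cloud event
```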

nmt 20 November, 2024, 09:38:41

πŸ“£ We’re hiring! πŸ“£

We’re currently seeking candidates for two new positions on the Product Specialist Team!

1. Product Specialist and Research Consultant

Democratize Pupil Labs eye tracking technology, empowering users to achieve their research/industry goals at an ever-accelerating rate; work with diverse users; drive innovation from the front lines: https://pupil-labs.com/careers/product-specialist-and-research-consultant

2. Technical Support Engineer

Empower users by resolving their technical challenges with expertise, creativity, and empathy; deploy custom code solutions; strive to continuously enhance our products and processes: https://pupil-labs.com/careers/technical-support-engineer

Remote positions (UTC+0 to UTC+8 timezones)

Feel like this is a good fit? Get in touch! Know someone who would be a good fit? Share share share!

(video from Product Specialist project: Tag Aligner - combine Neon eye tracking and head pose data with third-party 3D models)

wrp 21 November, 2024, 11:10:35

@everyone πŸ“£ Updates to Neon Companion App πŸ“£ (Make sure to update to the latest version on the Play Store)

Binocular and Monocular Gaze Modes: You can configure Neon to generate binocular or monocular gaze data by changing the Gaze Mode in the Neon Companion App settings.

Binocular Mode: In Binocular mode, gaze data is generated using images from both the left and right eyes. This is the default setting and is recommended for most users.

Monocular Mode: Some specialist applications, like ophthalmic testing, require gaze data to be generated from just one eye. This can be achieved by switching to a Monocular gaze mode. Monocular Left generates gaze data using only images of the left eye. Monocular Right uses only images of the right eye.

Gaze Data: The gaze data retains the same format in both modes, so downstream libraries and components require no updates. The gaze mode used in a recording is listed in info.json. More info in the docs: https://docs.pupil-labs.com/neon/data-collection/gaze-mode/
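For scripted pipelines, a quick way to branch on a recording's gaze mode is to read it from info.json. A minimal sketch; the exact field name used below ("gaze_mode") is an assumption, so check your own info.json or the docs for the authoritative key:

```python
# Check which gaze mode a Neon recording was made with by reading its info.json.
# The "gaze_mode" key and the "/path/to/neon_recording" path are illustrative.
import json
from pathlib import Path

def gaze_mode(recording_dir: str) -> str:
    info = json.loads((Path(recording_dir) / "info.json").read_text())
    return info.get("gaze_mode", "binocular")  # assume binocular default if the field is absent

print(gaze_mode("/path/to/neon_recording"))
```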

New Mode for Front-Facing LED: We added a mode to turn on the front-facing LED whenever the scene camera is active. This applies to both recording and streaming.

Stability Improvements:
- Fixed a rare startup delay of sensors when there are many recordings on the Companion Device.
- Fixed gaze display in recording playback. There were instances where the gaze circle would no longer update or would get stuck after seeking.

wrp 27 November, 2024, 04:13:42

@everyone πŸ“£ New Alpha Lab Article πŸ“£

Do you have a GoPro, Insta360, or DJI action camera? Now you can integrate footage from a third-party camera with gaze data from Neon eye tracking glasses. Capture an even wider field of view, get higher frame rates for fast-moving objects, and enjoy the scene in HDR.

Our latest Alpha Lab guide shows you how to map gaze onto your egocentric camera footage.

Check it out: https://docs.pupil-labs.com/alpha-lab/egocentric-video-mapper/

Here is an image of the setup we used for an Insta360 GO 3 + Neon. We built a custom mount for the action cam, but it's not a requirement. Stay tuned for a follow-up on this project; we've got something in the pipeline πŸ˜‰

(image: the custom action cam mount with Neon, as described above)
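For readers curious what "mapping gaze onto egocentric footage" involves under the hood, here is a conceptual sketch of one way to transfer a gaze point between time-synced frames using a feature-based homography. This is illustrative only, not the Alpha Lab tool's actual pipeline, which also handles temporal alignment and lens distortion:

```python
# Conceptual sketch: map a gaze point from a Neon scene-camera frame onto a
# time-synced action-cam frame via an ORB-feature homography with OpenCV.
import cv2
import numpy as np

def map_gaze(neon_frame, action_frame, gaze_xy):
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(cv2.cvtColor(neon_frame, cv2.COLOR_BGR2GRAY), None)
    k2, d2 = orb.detectAndCompute(cv2.cvtColor(action_frame, cv2.COLOR_BGR2GRAY), None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # scene -> action-cam mapping
    pt = cv2.perspectiveTransform(np.float32([[gaze_xy]]), H)
    return tuple(pt[0, 0])  # gaze location in action-cam pixel coordinates
```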

wrp 29 November, 2024, 02:59:10

@everyone Last week we were at the Psychonomic Society Conference in NYC. Here’s a clip we took in Times Square on the way to the conference venue (recorded with Neon). So much to see! πŸ‘€

Thanks to everyone who stopped by to say hi πŸ‘‹ , discuss your research, and get a hands-on demo with Neon.

End of November archive