@everybody 🔧 Pupil Cloud will be undergoing scheduled maintenance today (Friday 01 December 2023) at midnight UTC+0. We expect Pupil Cloud (web interface and API) to experience about 40 minutes of service disruption. All processes will resume without the need for user action after the scheduled maintenance is completed.
@everyone 🎉📣 We're pleased to announce Neon XR 📣🎉
⚡ Add eye tracking to your XR research and applications with Neon. Collect research-grade eye tracking data in virtual and augmented reality environments.
All the advantages of Neon still apply in XR! No calibration. Invariance to slippage. Robust to outdoor lighting conditions (for AR this really does make a difference).
Hardware 🚀 We're launching with a mount for the Pico 4 VR headset. You can pre-order the Neon Pico 4 bundle today. We expect to ship around the end of January 2024. Mounts for other common headsets are already in the pipeline!
Software
- Neon XR Unity package: Receive real-time eye tracking data in your Unity project.
- Mixed Reality Toolkit 3 (MRTK3) Template: Kickstart your implementation with our template project. Demo scenes for gaze-based interaction and real-time heatmap aggregation in VR!
Connect & Build 🤝 What's on your wishlist for XR integrations? Let us know in the 🥽 core-xr channel.
🛠️ Building a custom XR prototype? Get the Neon module with Nest PCB - "Bare Metal" - and follow the instructions in the docs to build an integration. We'd also be happy to get in touch to discuss. Send us an email - [email removed]
Website: https://pupil-labs.com/products/vr-ar Docs: https://docs.pupil-labs.com/neon/neon-xr/
[email removed] ⚗️ New Alpha Lab Content! Detect Eye Blinks in Real-Time with Neon
We have recently made our blink detector algorithm open-source, and we wanted to show you a few things you can do with it. Not only can you run the detector offline, but it is fast enough to detect blinks almost in real-time! ⚡👁️‍🗨️
Want to measure your blink rate, control recordings with just a blink pattern, or trigger commands? We've got you covered!
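To make the blink-rate and blink-pattern ideas concrete, here is a minimal, hedged sketch in plain Python. The `Blink` event shape, function names, and the 0.6 s double-blink threshold are our own illustration, not the API of the open-source detector; in practice the timestamps would come from the detector's output stream.

```python
from dataclasses import dataclass

@dataclass
class Blink:
    # Hypothetical event shape: blink start/end timestamps in seconds.
    start_ts: float
    end_ts: float

def blink_rate_per_minute(blinks, window_s):
    """Blinks per minute over an observation window of `window_s` seconds."""
    if window_s <= 0:
        raise ValueError("window must be positive")
    return len(blinks) * 60.0 / window_s

def is_double_blink(blinks, max_gap_s=0.6):
    """True if the two most recent blinks occurred within `max_gap_s` of
    each other - a simple pattern one could map to a command, e.g.
    starting or stopping a recording."""
    if len(blinks) < 2:
        return False
    return blinks[-1].start_ts - blinks[-2].end_ts <= max_gap_s

# Example: four blinks over 60 s, the last two in quick succession.
events = [Blink(1.0, 1.2), Blink(20.0, 20.2), Blink(50.0, 50.2), Blink(50.5, 50.7)]
print(blink_rate_per_minute(events, 60))  # → 4.0
print(is_double_blink(events))            # → True
```

A real-time loop would append each detected blink to `events` and check `is_double_blink` on every update; the threshold is something to tune per wearer.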
👉 Learn more about it here
What's the first thing you'll do with our blink detector? Let us know in our 👓 neon channel!
[email removed] ⚗️ New Alpha Lab Content: an AI Vision Assistant!
Imagine an AI assistant that not only talks but also "sees" for you. Welcome to the era of Large Multimodal Models!
We recently got the opportunity to test GPT-4V, the newest model from OpenAI, which can interpret images 🖼️. Naturally, we decided to integrate it with our Neon/Invisible real-time APIs to build a gaze-contingent vision assistant, capable of providing near-real-time auditory feedback 🗣️
What can it do? Tell you what the wearer is gazing at, guess the wearer's intentions, flag potential hazards, and more… 👀
Want to tailor it to your needs? Modify the prompts and experiment with your own assistive scene understanding applications!
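As one illustration of tailoring the pipeline, a gaze-contingent assistant needs to turn a normalized gaze point into pixel coordinates on the scene frame, and often crops a region around the gaze before sending it to the model. This is a minimal sketch under our own assumptions (function names, a 1600x1200 frame, and the 160 px half-width are illustrative, not the Alpha Lab code):

```python
def gaze_to_pixels(gaze_norm_x, gaze_norm_y, width, height):
    """Map a normalized gaze point in [0, 1] to pixel coordinates,
    clamped so it always lands inside the frame."""
    x = min(max(int(gaze_norm_x * width), 0), width - 1)
    y = min(max(int(gaze_norm_y * height), 0), height - 1)
    return x, y

def crop_box(x, y, width, height, half=160):
    """Axis-aligned crop of up to (2*half) px per side, centered on the
    gaze point and clipped to the frame - the region one might send to a
    multimodal model instead of the full scene image."""
    return (max(x - half, 0), max(y - half, 0),
            min(x + half, width), min(y + half, height))

x, y = gaze_to_pixels(0.5, 0.5, 1600, 1200)
print((x, y))                      # → (800, 600)
print(crop_box(x, y, 1600, 1200))  # → (640, 440, 960, 760)
```

Cropping around the gaze both reduces tokens sent to the model and biases its answer toward what the wearer is actually attending to; the prompt text is the other knob to experiment with.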
👉 Learn more about it here