Hello, we are looking to implement the following scenario:
When the user looks at text on paper, it is automatically translated and displayed on the glasses. When the user then looks at a specific phrase, that phrase is read aloud through a speaker. With a hand swipe gesture, the content of that phrase is transferred to a smartphone.
Would it be possible to implement this with the Neon Pico 4 Add-on product (https://pupil-labs.com/products/vr-ar)? Both eye tracking and pass-through are needed. Or would it be better to use the Meta Quest Pro?
Hi @user-a27091 👋. Thanks for your interest in the Neon-XR Add-on! What you suggest is technically feasible. We have a couple of different ways to implement eye tracking with the Pico 4. We recommend heading over to our documentation, where you can find more details about how to work with the system. For example, we have an MRTK3 template and Unity3D libraries to get you started (links in the docs).
Hi! I am interested in using the Neon XR for our application. I am primarily working in Unreal Engine. I saw that Neon XR does not support UE yet. 1) But I am thinking of an alternate pipeline in which I will read the frames from the VR headset (as recorded by the camera on the Neon module) and the synchronized gaze tracks and transfer these to an external Python script using the Real-time API. Then whatever data I need to send back to the UE environment, I plan to send using a UDP port connection between the script and UE. Is that possible? 2) Additionally, is it possible to read real-time blink rate and pupil size data from Neon XR using the real-time API? Thanks a lot for the help!
Hi @user-a8f53c! We already responded by email, but following up here so others can see. 1. What you're describing is essentially using our real-time API (Python client) to receive gaze data from Neon, and subsequently sending that to Unreal via a UDP port. Technically, that is possible. The limitations would be an additional layer of complexity, e.g. a relay computer to receive the data and send via UDP, and potentially more latency depending on your set-up. 2. Real-time blink and pupil size are not currently available via our real-time API - they are not computed on the Companion phone. Instead, recordings are uploaded to Pupil Cloud to obtain blinks and pupil size. Although we do plan to provide these in real-time, I don't have a concrete release date just yet. That said, if you wanted to run the blink detector in near real-time programmatically, e.g. on a separate computer, you could do so. Head over to this Alpha Lab article which shows you what you can do.
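For reference, a minimal sketch of the relay described in point 1 could look something like the following, assuming the `pupil-labs-realtime-api` Python package (simple client). The host, port, and JSON payload format are placeholders to adapt to whatever UDP listener you set up on the Unreal side.

```python
# Sketch: receive gaze from Neon via the real-time API (Python client) and
# forward each sample to Unreal over UDP as JSON.
import json
import socket

from pupil_labs.realtime_api.simple import discover_one_device

# Placeholder address of the UDP listener inside Unreal Engine.
UE_HOST, UE_PORT = "127.0.0.1", 5005

def main():
    device = discover_one_device()  # finds the Neon Companion app on the local network
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        while True:
            gaze = device.receive_gaze_datum()  # blocks until the next gaze sample
            payload = json.dumps({
                "x": gaze.x,                        # gaze position in scene-camera pixels
                "y": gaze.y,
                "ts": gaze.timestamp_unix_seconds,  # device timestamp (Unix seconds)
            })
            sock.sendto(payload.encode("utf-8"), (UE_HOST, UE_PORT))
    finally:
        device.close()
        sock.close()

if __name__ == "__main__":
    main()
```

Note that the relay machine adds a hop, so you would want to measure end-to-end latency with your own network setup before committing to this pipeline.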