Hi, I'm a university professor running a VR research project. Since the Vive Pro Eye is discontinued, I'm looking for a VR headset that supports eye tracking, and I'm considering purchasing the Pupil Labs eye-tracking add-on. Two questions: 1) Does it work with the HTC Vive Pro 2? Your documentation only mentions the HTC Vive Pro. 2) How much does it cost?
Hey @user-037678 👋 ! Thank you for your interest in our products 🙂 The price of the HTC Vive Pro add-on is €1,400. However, our existing Vive add-on is not compatible with the Vive Pro 2. We can offer a de-cased version of our add-on, with cameras and cable tree, for custom prototyping. If you are interested, please send an email to [email removed]
Hi, I am looking for eye tracking in VR for a research project and trying to compare different options. What are the benefits of Pupil Labs compared to the HTC Vive Pro Eye or the Focus 3 Eye? I assume it's access to raw data (video streams) and more flexibility when running eye tracking in the background with existing applications. Is that correct? Is there anything else?
Hi @user-2d1b3f! Our VR Add-on only fits the Vive, Vive Pro, and Vive Cosmos. It's not directly compatible with the HMDs you mention, although some customers have retrofitted their Add-ons into different HMDs by prototyping mounts. Indeed, things like access to raw data, customisability, etc. are all motivating factors.
Hello!
I am trying to record the raw eye and world (VR view) images through the Pupil Labs plugin for Unity. At the moment, when I initiate the recording from Unity, Pupil Capture records the eye and world images along with the gaze and pupil data. I would like to turn OFF the recording of the gaze and pupil data, because I want to record the raw images at the maximum frame rate and I assume generating this additional data would negatively affect it. I would then generate the pupil and gaze data post hoc.
Could someone please point me in the right direction for how to do this properly? I believe I would have to modify the source code, as I could not find an option to turn these off in the Pupil Capture settings.
Hello everybody. I would like to use the Neon module in a VR system with the Meta Quest 2. I'm concerned about the two coordinate systems and how to correctly map the gaze data to the VR scene. Can anyone give me some tips on how to get started?
Hey @user-40d3d5 👋. No changes to the source code are needed. You can disable real-time pupil detection in Capture's settings. Note that you'd then need to do post-hoc pupil detection and calibration. Post-hoc calibration for our VR Add-on uses manual selection of natural features, and this has ultimately proven less accurate than the default calibration, which uses the reference locations in Unity3D. Are you running Pupil Capture on a separate computer from hmd-eyes/Unity3D? This is also recommended to ensure adequate computational resources and thus the achieved sampling rate.
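As an aside, if you'd rather script recordings than click in the UI, here's a minimal Python sketch using Pupil Capture's Pupil Remote interface (this assumes the default port 50020; swap in the Capture machine's IP if it runs on a separate computer):

```python
import zmq

# Connect to Pupil Remote (enabled by default in Pupil Capture).
ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")  # use the Capture machine's IP if remote

# Start a recording; Capture saves the eye/world videos
# plus whichever data streams are enabled.
pupil_remote.send_string("R")
print(pupil_remote.recv_string())

# ... run your experiment ...

# Stop the recording.
pupil_remote.send_string("r")
print(pupil_remote.recv_string())
```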
Thank you!
Hey @user-ff2367! We're working on the building blocks for integrating Neon in Unity3D. We hope to have a basic release sometime in the coming weeks, so look out for an update 🙂
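In the meantime, the core of the mapping is a rigid transform from the eye tracker's camera frame into world coordinates. Here's a rough, illustrative numpy sketch, assuming you already have the HMD pose from your VR SDK and a unit gaze direction in the tracker's camera frame; it ignores handedness differences and the fixed module-to-HMD offset, both of which you'd need to account for in practice:

```python
import numpy as np

def gaze_to_world(R_hmd: np.ndarray, t_hmd: np.ndarray,
                  d_cam: np.ndarray, distance: float = 2.0) -> np.ndarray:
    """Project a camera-frame gaze direction to a point in world space.

    R_hmd:  3x3 rotation of the HMD in world coordinates (from your VR SDK).
    t_hmd:  HMD position in world coordinates.
    d_cam:  unit gaze direction in the eye tracker's camera frame.
    """
    d_world = R_hmd @ d_cam            # rotate the gaze direction into the world frame
    return t_hmd + distance * d_world  # pick a point along the gaze ray
```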
Hey @nmt thanks! Looking forward to this update 😀
Hello Pupil Labs! I am using the HTC Vive Pro eye tracker with SteamVR and Unity to present 2D stimuli to the participant on a virtual 'screen'. I need to detect precisely where participants are looking at my picture, and to do this I need to create a surface. I know this could easily be done with AprilTags, but in my case the quality of the world view is so bad that the tags are not recognized (the original images have a resolution of 1000×750 and are sharp both on the screen and in VR). How can I adjust the screencasting quality? Is there an option in Pupil Capture, or do I need to do this in Unity? Alternatively, can I somehow manually show Pupil Capture where my tags are? Many thanks in advance! Below is a screenshot of my world view.
Hi @user-185d95! With so many unknowns, it's difficult to say why the image is low-quality. That said, have you considered raycasting to find the intersection of gaze with your stimuli, thereby negating the need for virtual AprilTags? You can read about that here: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#default-gaze-visualizer
Thank you! I use my own scripts to analyze the data, not the default gaze visualizer, so I had never used raycasting before, but I will definitely take a look. Anyway, it turned out the issue was in the Unity script, and I was able to use AprilTags after fixing it.
Hi everybody. I think our lab's Vive Pro Pupil Add-On is broken. I suspect a loose contact in the cable, which I would like to verify. Any suggestions for how I can repair it as soon as possible? I need to continue my data collection very soon.
Hi @user-3efb67 👋 ! For further debugging and to request a repair, please reach out to [email removed]
Thank you for pointing out the email address
Hello everybody! I am a Ph.D. student just starting out with Pupil Labs equipment, and I will soon need to analyze data gathered with a Pupil Core camera to assess gaze behavior in a VR driving simulator. I have managed to process data using Pupil Player v3.5.1, but I noticed that, unfortunately, only a few data analysis tools are available in the Player. How can I process the data to obtain a more precise analysis? Can I use my gaze? Where can I find all the documentation I need to analyze the tracking data correctly? Thx
How is the Pupil Labs VR/AR HTC Vive Pro eye-tracking add-on $2,000 CAD?
Hi @user-742ddc and welcome to the community 👋 Could you please provide more details about the specific analysis you have in mind? This way, I can offer you more tailored suggestions. In the meantime, make sure to explore our documentation, which contains all the necessary information about the data that is made available (https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter), and also take a look at the various analysis plugins included in Pupil Player (https://docs.pupil-labs.com/core/software/pupil-player/#plugins). These resources should help you get started with analyzing the eye-tracking data effectively.
I need to analyse the gaze distribution for each ROI on a three-monitor system (fixation times, fixation percentage). I could read the data with Python and segment and measure each area by hand, but I was looking for a more straightforward process.
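For a starting point, here's a hedged Python sketch of that post-processing: it loads a fixations export from Pupil Player's Raw Data Exporter and tallies fixation time per ROI. The ROI bounds are made up for illustration; substitute the normalized-coordinate regions that correspond to your three monitors:

```python
import pandas as pd

# Load the fixations export produced by Pupil Player's Raw Data Exporter.
fixations = pd.read_csv("exports/000/fixations.csv")

# Illustrative ROI bounds on the normalized x-axis (x_min, x_max);
# replace with the regions matching your three monitors.
rois = {
    "left_monitor":   (0.00, 0.33),
    "center_monitor": (0.33, 0.66),
    "right_monitor":  (0.66, 1.00),
}

total_ms = fixations["duration"].sum()  # fixation durations are reported in ms
for name, (x_min, x_max) in rois.items():
    in_roi = fixations[fixations["norm_pos_x"].between(x_min, x_max)]
    dwell_ms = in_roi["duration"].sum()
    print(f"{name}: {dwell_ms:.0f} ms, {100 * dwell_ms / total_ms:.1f}% of fixation time")
```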
Hi @user-04b947! Thanks for your question. If I may ask, what is your intended use case for eye tracking in VR? Our Add-on is typically used for research and prototyping. It exposes a lot of research-quality data to the user, such as pupillometry and eye state, at a high sampling frequency of 200 Hz. We also have a powerful and flexible API and an integration with Unity3D, enabling a high degree of customisation for your applications: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md
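To give a flavour of the API, here's a minimal Python sketch that subscribes to the real-time gaze stream over Pupil Capture's network interface (assuming the default Pupil Remote port 50020):

```python
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote for the port of the data publisher.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to all gaze topics.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.subscribe("gaze.")

while True:
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.loads(payload)
    print(topic.decode(), gaze["norm_pos"], gaze["confidence"])
```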
Game development and research
I'm working on a game that takes input from the eyes, which changes the game
For example, if you look at a book and read it, then depending on how much you read and which keywords you look at, more will be added to the story
If you look more at guns or weapons, more will spawn
So will enemies
It adapts to what the user is interested in
Pupil Core in a VR driving simulator
But it's a very early design
But $2,000 CAD is way too expensive for my project
Thanks for the overview! I agree these sorts of gaze-contingent interactions are interesting for game dev. That said, our equipment is produced in small quantities, and we're already pricing it relative to our academic discount model. Have you considered something DIY? Or perhaps just leverage the built-in eye tracking of some HMDs on the market.
I have built a DIY eye-tracking solution myself, but it's not reliable. I have considered the Quest Pro, the HP Reverb G2 Omnicept, the HTC Vive Pro Eye, and the Pimax eye-tracking add-on, but they're too expensive
What were the hurdles with your DIY solution?
The eye tracking wasn't reliable; half the time it didn't point where I was actually looking
I've got a question: where do I buy the Pupil eye-tracking add-on, and will I be able to cram it into my Vive Pro 2?
Hey @user-eede71 👋. Unfortunately, our VR add-on doesn't fit the Vive Pro 2. What do you need eye tracking for, if I may ask?
Hi, I would like to ask a question about the binocular add-on. If I start recording in a VR environment, such as a game or an architectural environment, does it give the exact locations of the focused points as an Excel sheet when finished?
Hey! I just checked, we have responded to your email 🙂
I already have face tracking, but I want eye tracking as well for a more immersive experience in VRChat and other games.
Okay! Our current VR add-on is used a lot for research and prototyping. Some users buy the add-on and develop their own mounts for different HMDs. We also have a repository called hmd-eyes that contains the building blocks to implement eye tracking in Unity3D. Most users start there to build their applications. If you're interested, check out the developer docs: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md