🥽 core-xr


user-037678 03 July, 2023, 11:35:38

Hi, I'm a university professor who runs a VR research project. Looking for a VR headset with eye-tracking capability (since the Vive Pro Eye is discontinued), I'm thinking of purchasing the Pupil Labs eye-tracking add-on. Two questions: 1) Does it work with the HTC Vive Pro 2? I only see that it works with the HTC Vive Pro in your documentation. 2) How much does it cost?

user-480f4c 03 July, 2023, 11:41:39

Hey @user-037678 👋 ! Thank you for your interest in our products 🙂 The price of the HTC Vive Pro add-on is €1400. However, our existing Vive add-on is not compatible with the Vive Pro 2. We can offer a de-cased version of our add-on with cameras and cable tree for custom prototyping. If you are interested, please send an email to [email removed]

user-2d1b3f 07 July, 2023, 16:31:35

Hi, I am looking for eye tracking in VR for a research project and trying to compare different options. What are the benefits of Pupil Labs compared to the HTC Vive Eye or the Focus 3 Eye? I assume it is access to raw data (video streams) and more possibilities when running eye tracking in the background with existing applications. Is that correct? Is there anything else?

nmt 11 July, 2023, 10:56:16

Hi @user-2d1b3f! Our VR Add-on only fits the Vive, Vive Pro, and Vive Cosmos. It's not directly compatible with the HMDs you mention, although some customers have retrofitted their Add-ons into different HMDs by prototyping mounts. Indeed, things like access to raw data, customisability, etc. are all motivating factors.

user-40d3d5 17 July, 2023, 12:19:01

Hello!

I am trying to record the raw eye and world (VR view) images through the Pupil Labs plugin for Unity. At the moment, when I initiate the recording from Unity, Pupil Capture records the eye images and world images along with the gaze and pupil data. I would like to turn OFF the recording of the gaze and pupil data because I want to record the raw images at the maximum frame rate, and I assume generating these additional data would negatively affect the frame rate. Then, I would like to generate the pupil and gaze data post-hoc.

Could someone please point me in the direction of how this could be done properly? I believe I would have to make changes to the source code, as I could not find the option to turn these OFF in the Pupil Capture settings.

user-ff2367 17 July, 2023, 13:15:05

Hello everybody. I would like to use the Neon module in a VR system with the Oculus Meta Quest 2. I'm concerned about the two coordinate systems and how I can correctly map the gaze data to the VR scene. Can anyone give me some tips on how to get started?
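
As a general starting point (not a Pupil Labs API; all names below are illustrative), mapping gaze into the VR scene usually amounts to rotating the gaze direction from the headset-local frame into world space using the headset pose that the engine reports, then casting a ray from that pose. A minimal Python/numpy sketch of the transform:

```python
# Hypothetical sketch: rotate a gaze direction given in the HMD-local frame
# into VR world coordinates using the headset pose reported by the engine.
import numpy as np

def gaze_to_world(gaze_dir_local, head_rotation, head_position):
    """gaze_dir_local: unit vector in the headset's local frame.
    head_rotation: 3x3 rotation matrix (headset-local -> world).
    head_position: headset origin in world coordinates."""
    direction = head_rotation @ np.asarray(gaze_dir_local, dtype=float)
    origin = np.asarray(head_position, dtype=float)
    return origin, direction / np.linalg.norm(direction)

# Example: headset yawed 90 degrees to the left, gaze straight ahead locally
yaw_90 = np.array([[0, 0, -1], [0, 1, 0], [1, 0, 0]], dtype=float)
origin, direction = gaze_to_world([0, 0, 1], yaw_90, [0, 1.6, 0])
print(origin, direction)  # ray now points along -x in world space
```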

nmt 19 July, 2023, 06:47:10

Hey @user-40d3d5 👋. No changes to the source code are needed. You can disable real-time pupil detection in the Capture settings. Note that you'd then need to do post-hoc pupil detection and calibration. Post-hoc calibration for our VR Add-on uses manual selection (natural features), and this has been shown to be less accurate than the default calibration that uses the reference locations in Unity3D. Are you running Pupil Capture on a separate computer to hmd-eyes/Unity3D? This is also recommended to ensure adequate computational resources and thus the achieved sampling rate.
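
Related to running Capture on a separate machine: recordings can also be started and stopped over the network with Pupil Remote (Capture's network API), so Unity does not have to drive the recording itself. A minimal sketch, assuming the default Pupil Remote port 50020 and that pyzmq is installed:

```python
# Start and stop a Pupil Capture recording from a script via Pupil Remote.
import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")  # replace with the Capture machine's IP

pupil_remote.send_string("R")        # start a recording
print(pupil_remote.recv_string())    # Capture acknowledges the request

# ... run the experiment ...

pupil_remote.send_string("r")        # stop the recording
print(pupil_remote.recv_string())
```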

user-40d3d5 21 July, 2023, 12:25:16

Thank you!

nmt 19 July, 2023, 06:48:26

Hey @user-ff2367! We're working on the building blocks for integrating Neon in Unity3D. We hope to have a basic release sometime in the coming weeks, so look out for an update 🙂

user-ff2367 19 July, 2023, 12:06:44

Hey @nmt thanks! Looking forward to this update 😀

user-185d95 19 July, 2023, 13:50:26

Hello Pupil Labs! I am using the HTC Vive Pro eye tracker with SteamVR and Unity to present 2-D stimuli to the participant on a virtual 'screen'. I need to precisely detect where participants are looking at my picture, and to do this I need to create a surface. I know this could easily be done with the AprilTags, but in my case the quality of the world view turned out to be so bad that the tags are not recognized (the original images have a resolution of 1000*750 and are sharp both on the screen and in VR). How can I adjust the screencasting quality? Is there an option in Pupil Capture, or do I need to do this in Unity? Alternatively, can I somehow manually show Pupil Capture where my tags are? Many thanks in advance! Below is a screenshot of my world view.

Chat image

nmt 20 July, 2023, 07:25:29

Hi @user-185d95! It's difficult to say why the image is low-quality with so many unknowns. That said, have you considered raycasting to find the intersection of gaze with your stimuli, thereby negating the requirement for virtual AprilTags? You can read about that here: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#default-gaze-visualizer
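
For reference, the idea behind the raycasting approach can also be sketched offline: intersect the world-space gaze ray with the plane of the virtual 'screen' and express the hit point in normalized picture coordinates. The Python below is illustrative only and all names are placeholders; inside Unity this would typically be a Physics.Raycast against a collider on the screen object instead.

```python
# Illustrative sketch: gaze-ray / screen-plane intersection in normalized coords.
import numpy as np

def gaze_on_screen(ray_origin, ray_dir, screen_origin, screen_x_axis, screen_y_axis):
    """All arguments are 3-element numpy arrays in world coordinates.
    screen_origin: the picture's bottom-left corner.
    screen_x_axis / screen_y_axis: vectors spanning its width and height."""
    normal = np.cross(screen_x_axis, screen_y_axis)
    denom = np.dot(ray_dir, normal)
    if abs(denom) < 1e-9:
        return None  # gaze ray is parallel to the screen plane
    t = np.dot(screen_origin - ray_origin, normal) / denom
    if t < 0:
        return None  # screen is behind the viewer
    hit = ray_origin + t * ray_dir
    # project onto the screen axes; (u, v) lie in [0, 1] when gaze is on the picture
    rel = hit - screen_origin
    u = np.dot(rel, screen_x_axis) / np.dot(screen_x_axis, screen_x_axis)
    v = np.dot(rel, screen_y_axis) / np.dot(screen_y_axis, screen_y_axis)
    return u, v
```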

user-185d95 24 July, 2023, 10:43:38

Thank you! I use my own scripts to analyze the data, not the default gaze visualizer, so I have never used raycasting before, but I will definitely take a look. Anyway, it turned out the issue was in the Unity script, and I was able to use AprilTags after fixing it.

user-3efb67 26 July, 2023, 09:46:00

Hi everybody. I think our lab's Vive Pro Pupil Add-on broke. I have the impression it is a loose contact at the cable, which I would like to verify. Any suggestions on how I can repair it as soon as possible? I would need to continue my data collection very soon.

user-480f4c 26 July, 2023, 09:52:44

Hi @user-3efb67 👋 ! For further debugging and to request a repair, please reach out to [email removed]

user-3efb67 26 July, 2023, 10:09:36

Thank you for pointing out the email address

user-742ddc 26 July, 2023, 14:24:26

Hello everybody! I am a Ph.D. student just starting out with Pupil Labs equipment, and I will soon need to analyze data gathered with a Pupil Core camera to assess gaze behavior in a VR driving simulator. I have managed to process data using Pupil Player v3.5.1, and I noticed that, unfortunately, only a few data analysis tools are available in the Player. How can I process the data to obtain a more precise analysis? Can I use my gaze data? Where can I find all the documentation I need to analyze the tracking data correctly? Thx

user-04b947 26 July, 2023, 18:19:34

How is the Pupil Labs VR/AR HTC Vive Pro eye tracking add-on $2000 CAD?

user-04b947 26 July, 2023, 18:19:58

Chat image

user-c2d375 27 July, 2023, 14:13:23

Hi @user-742ddc and welcome to the community 👋 Could you please provide more details about the specific analysis you have in mind? This way, I can offer you more tailored suggestions. In the meantime, make sure to explore our documentation, which contains all the necessary information about the data that is made available (https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter), and also take a look at the various analysis plugins included in Pupil Player (https://docs.pupil-labs.com/core/software/pupil-player/#plugins). These resources should help you get started with analyzing the eye-tracking data effectively.

user-742ddc 27 July, 2023, 14:24:58

I need to analyse the gaze distribution for each ROI on a three-monitor system (fixation times, fixation percentage). I might read the data with Python and segment and measure each area by hand, but I was looking for a more straightforward process.
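
As a rough illustration of the Python route, the sketch below tallies fixation time and percentage per ROI from a Pupil Player export. It assumes a fixations.csv with norm_pos_x, norm_pos_y, and duration columns (check the header of your own export), and the three ROI rectangles are placeholders for the three monitors:

```python
# Per-ROI fixation time and percentage from a Pupil Player fixations export.
import pandas as pd

# each ROI as (x_min, x_max, y_min, y_max) in normalized world-camera coordinates
ROIS = {
    "left_monitor":   (0.00, 0.33, 0.2, 0.8),
    "centre_monitor": (0.33, 0.66, 0.2, 0.8),
    "right_monitor":  (0.66, 1.00, 0.2, 0.8),
}

fixations = pd.read_csv("exports/000/fixations.csv")
total = fixations["duration"].sum()

for name, (x0, x1, y0, y1) in ROIS.items():
    inside = fixations[
        fixations["norm_pos_x"].between(x0, x1)
        & fixations["norm_pos_y"].between(y0, y1)
    ]
    dwell = inside["duration"].sum()
    print(f"{name}: {dwell:.0f} total fixation duration, {100 * dwell / total:.1f} %")
```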

nmt 27 July, 2023, 14:23:15

Hi @user-04b947! Thanks for your question. If I may ask, what is your intended use-case for eye tracking in VR? Our Add-on is typically used for research and prototyping. It exposes a lot of research-quality data to the user, such as pupillometry and eye state, at a high sampling frequency of 200 Hz. We also have a powerful and flexible API and an integration with Unity3D, enabling a high degree of customisation for your applications: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md
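
For context, the same real-time data that hmd-eyes consumes in Unity can also be read from a script over the Capture network API (ZMQ + msgpack). A short sketch, assuming Capture runs locally on the default port 50020 and that pyzmq and msgpack are installed:

```python
# Subscribe to real-time gaze data from Pupil Capture over its network API.
import zmq
import msgpack

ctx = zmq.Context()

# ask Pupil Remote for the subscription port
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

# subscribe to all gaze topics
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = sub.recv_multipart()
    gaze = msgpack.unpackb(payload)
    print(topic.decode(), gaze["norm_pos"], gaze["confidence"])
```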

user-04b947 27 July, 2023, 14:25:00

Game development and research

user-04b947 27 July, 2023, 14:25:54

I’m working on a game that takes input from the eyes, which changes the game

user-04b947 27 July, 2023, 14:27:02

For example, if you look at a book and read it, then depending on how much you read and which keywords you look at, more will be added to the story

user-04b947 27 July, 2023, 14:27:37

If you look more at guns or weapons, more will spawn

user-04b947 27 July, 2023, 14:27:46

So will enemies

user-04b947 27 July, 2023, 14:28:09

It adapts to what the user is interested in

user-c2d375 27 July, 2023, 14:28:18

Core in VR driving simulator

user-04b947 27 July, 2023, 14:28:24

But it's a very early design

user-04b947 27 July, 2023, 14:31:58

But $2000 CAD is way too expensive for my project

nmt 27 July, 2023, 15:36:51

Thanks for the overview! I agree these sorts of gaze-contingent interactions are interesting for game dev. That said, our equipment is produced in small quantities and we're already pricing it relative to our academic discount model. Have you considered something DIY? Or perhaps just leverage the inbuilt eye tracking of some HMDs on the market.

user-04b947 27 July, 2023, 16:24:29

I have built myself a DIY eye tracking solution, but it's not reliable. I have taken into consideration the Quest Pro, HP Reverb 2 Optic, HTC Vive Pro Eye, and the Pimax eye tracking add-on, but they're too expensive

nmt 27 July, 2023, 18:04:07

What were the hurdles with your DIY solution?

user-04b947 27 July, 2023, 19:14:45

The eye tracking wasn’t reliable, and half the time it didn’t register where I was looking

user-eede71 30 July, 2023, 22:17:46

I've got a question: where do I buy the Pupil eye tracking mod, and will I be able to cram it into my Vive Pro 2?

nmt 31 July, 2023, 17:36:03

Hey @user-eede71 👋. Unfortunately, our VR add-on doesn't fit the Vive Pro 2. What do you need eye tracking for, if I may ask?

user-8879f9 31 July, 2023, 11:02:49

Hi, I would like to ask a question about the binocular add-on. If I start recording in a VR environment, such as a game or an architectural environment, does it give the exact locations of the focused points as an Excel sheet when finished?

nmt 31 July, 2023, 17:40:02

Hey! I just checked, we have responded to your email 🙂

user-eede71 31 July, 2023, 17:51:14

I already have face tracking, but I want eye tracking as well for a more immersive experience in VRChat and other games.

nmt 01 August, 2023, 05:46:48

Okay! Our current VR add-on is used a lot for research and prototyping. Some users buy the add-on and develop their own mounts for different HMDs. We also have a repository called hmd-eyes that contains the building blocks to implement eye tracking in Unity3D. Most users start there to build their applications. If you're interested, check out the developer docs: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md

End of July archive