🥽 core-xr


user-3efcd2 02 March, 2023, 17:22:02

What is the easiest way to use Pupil Labs data in a real-time Unity application?

nmt 02 March, 2023, 17:58:08

See my message in the core channel

nmt 06 March, 2023, 16:45:40

Check out hmd-eyes: https://github.com/pupil-labs/hmd-eyes

user-7c714e 04 March, 2023, 10:24:03

Hello Neil, I have a question about the Add-on for the HTC Vive Pro. My PhD students and I cannot run the eye tracking module. The most important question now is whether the Add-on can work in our simulation software without programming against the API. Otherwise I will have to buy an HTC Vive Pro Eye, which would make the Pupil Labs Add-on redundant. Thank you in advance, Prof. B. Hristov

nmt 06 March, 2023, 16:47:27

Hi @user-7c714e. What simulation software are you running? We provide an integration for Unity3D (https://github.com/pupil-labs/hmd-eyes), but other applications would need their own integrations put together.

user-8779ef 04 March, 2023, 16:11:06

Hi folks. Pablo was, at one point, trying to fix longstanding issues in the post-hoc calibration of VR data. This involved adjusting the intrinsic matrix of the world camera used to project gaze into 2D screen coordinates.

Chat image

user-8779ef 04 March, 2023, 16:11:20

As you can see, the adjustment isn't reliable. Can you point me to where this adjustment is made in the code?

user-8779ef 04 March, 2023, 16:11:52

I have tried editing the world.intrinsics file, without luck

user-8779ef 04 March, 2023, 16:13:03

This used to work, and that leads me to believe that it's edited somewhere in Player.

user-8779ef 04 March, 2023, 16:42:02

OK, never mind, it is loading from world.intrinsics. I will just edit that file.

user-8779ef 04 March, 2023, 16:44:59

For this recording, the offset was eliminated by setting the y components of the intrinsics to 550.

Chat image

user-8779ef 04 March, 2023, 16:45:26

Happy to tell you more about this problem, which has been lingering since you guys released the VR insert.

user-8779ef 04 March, 2023, 16:46:21

The basic issue is that you guys never figured out which intrinsics to grab from Unity so that the 3D-to-2D projection is accurate.

user-8779ef 04 March, 2023, 16:57:34

And I believe a different method is used in real-time vs. offline

user-8779ef 04 March, 2023, 16:57:45

(post-hoc recalibration)

nmt 07 March, 2023, 13:30:06

Hi @user-8779ef. This area is not currently in active development – my recommendation going forward would be to use real-time calibration if possible

user-e80f9b 06 March, 2023, 16:40:53

How much is the eye tracking module?

nmt 06 March, 2023, 16:48:33

Hi @user-e80f9b. Transparent pricing is available on our website: https://pupil-labs.com/products/vr-ar/

user-e80f9b 06 March, 2023, 16:55:05

Which one is cheaper for eye tracking?

nmt 07 March, 2023, 14:04:43

Hey. Can you elaborate on what you're looking to achieve with eye tracking? Is it VR specifically or something more general?

user-099a43 06 March, 2023, 17:02:38

Hi @nmt I'm using the HTC Vive add-on with the official demo (3d_gaze_demo_vr_v1.4). After calibration, I found that the gaze point still drifts randomly during a fixation. Is there any way to solve this issue?

nmt 07 March, 2023, 13:33:35

The first thing to ensure is that you have good pupil detection, and then to build a 3D eye model. Feel free to share an example recording that contains the calibration with data@pupil-labs.com for feedback.

user-cd03b7 07 March, 2023, 03:10:40

Has anyone used the reference image mapper in a virtual environment? I'm looking into getting Pupil Binocular Add-on, but I'm not sure how scanning a 3D environment would work

user-d407c1 07 March, 2023, 13:14:33

Hi @user-cd03b7 ! Kindly note that the reference image mapper enrichment only works with Invisible and soon Neon. It is not available for Core.

user-cd03b7 07 March, 2023, 03:12:16

We'd like to see whether we can emulate real-life eye tracking metrics in a virtual environment. We have the 3D files (BIM360) of the real-life environment and would just rerun the same user tests with a Vive.

nmt 07 March, 2023, 13:36:25

Hi @user-cd03b7! So if I understand correctly, you want to present a 3D scan of your environment in VR, have the participants negotiate that space (in VR) and build heatmaps similar to what RIM does, with the aim of comparing real to virtual? Interesting proposition. I'm aware of a few research groups doing similar work. It would require quite a bit of work to get set up. What VR engine are you using?

user-cd03b7 07 March, 2023, 16:17:06

I assumed the Binocular Add-on would provide similar functionality to the Core. What metrics or data could we pull from the add-on? I'm not seeing a product data page; am I looking in the wrong spot?

user-cd03b7 07 March, 2023, 16:17:38

And how could I get in contact with these groups? We'd love to see what it looks like to get this off the ground

user-e80f9b 07 March, 2023, 17:58:19

@nmt ^

user-44c93c 07 March, 2023, 20:38:10

Calibration question. We are using the HTC Vive add-on hardware, but mounted to something other than an HTC Vive headset. Our setup includes two monitors running at 1280x1024 resolution that the user looks at to fuse a 3D image. We are having issues with the hmd-eyes Unity calibration. We have modified the Unity calibration to account for our camera intrinsics, and for the calibration scene to span both monitors. After calibration, gaze tracking near the center of the image is good, but as the user looks out towards the periphery, the gaze point is incorrectly pulled in towards the center. We are wondering whether modifying the shape of the calibration pattern (currently a 12-point circle with 1 point in the middle) to something more rectangular, matching the shape of the physical monitors, would yield better gaze tracking towards the periphery. Any other suggestions for improving calibration would be appreciated.

user-8779ef 08 March, 2023, 16:04:58

@user-44c93c I can help there. No affiliation with Pupil Labs.

There are two possibilities that I can think of. First, you may want to modify the world camera's intrinsic matrix. Your gaze is actually calibrated to Unity's reported 3D positions of the targets in the main camera's local space, not to their projection. For this reason, the world camera's intrinsic matrix will only affect the mapping of gaze vectors onto the scene view, not the accuracy of the gaze vector estimates themselves. I have a very simple script I can share that will help you modify that matrix, but make sure to back up your world.intrinsics first! I'll dig that up and follow up with a link.

The other issue may be that the estimated 3D position of the geometric eye model with respect to the 2D projection of the world view is incorrect. Picture the 3D eye in front of a frustum, with gaze rays coming out of the eye and intercepting the screen at the visualized gaze locations. Now imagine moving the eye closer or further away: the angles of the gaze rays will not change, just where they intercept the screen in front of the eye. If the estimate of the eye position is incorrectly close, you get artificial compression; if it's too far, artificial expansion. This is a much more difficult situation to deal with, because there are a lot of steps involved in estimating the eye's 3D position. Unlike the issue with the world.intrinsics matrix, inaccuracy of the 3D eye model's position would affect the quality of the gaze estimates.

So, there are at least two possibilities: one is an issue with the projection of your gaze vectors onto the screen, which would make the gaze estimation look inaccurate even if your gaze vectors are calibrated accurately; the other is an issue with the 3D eye model fit, and thus with the gaze estimates themselves. Perhaps the best way to distinguish between the two is to use a head-rest, use trig to calculate the true spherical position of your on-screen gaze targets relative to the head (degrees azimuth and elevation in head space), convert your gaze angles into the same spherical space, and compare. This would tell you whether there are true inaccuracies, or whether the inaccuracies are just due to a bad mapping from gaze estimates to the screen representation in Pupil Player.
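The head-rest check described above can be sketched numerically. A minimal sketch, assuming the eye sits on the screen-center normal at a known viewing distance and that the same atan2-based spherical convention is applied to both targets and gaze vectors (function names and values are illustrative, not from any Pupil Labs tool):

```python
import math

def target_angles_deg(x_mm, y_mm, distance_mm):
    """True azimuth/elevation (degrees) of an on-screen target relative
    to a head-fixed eye located on the screen-center normal."""
    azimuth = math.degrees(math.atan2(x_mm, distance_mm))
    elevation = math.degrees(math.atan2(y_mm, distance_mm))
    return azimuth, elevation

def gaze_angles_deg(gx, gy, gz):
    """Same spherical convention applied to a 3D gaze direction vector."""
    return (math.degrees(math.atan2(gx, gz)),
            math.degrees(math.atan2(gy, gz)))

# Target 100 mm right of screen center, viewed from 570 mm away:
az, el = target_angles_deg(100.0, 0.0, 570.0)  # az ~ 9.95 deg, el = 0 deg
```

Comparing the target angles with the angles of the calibrated gaze vectors reveals true angular error, independent of how gaze happens to be projected into the scene-video pixels.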

user-8779ef 08 March, 2023, 16:09:50

exploreIntrinsics.py

user-8779ef 08 March, 2023, 16:11:47

@user-44c93c Here's an ugly but simple script to modify the world camera's intrinsic matrix. In that script, I'm modifying only the vertical component; you might want to modify other components. ...but please know that this only modifies the visual indication of gaze accuracy, not the gaze estimates themselves.

Oh, oh, I just had an important question, and your answer may change my explanation above ... are you using Unity to render your stimuli, or something else?
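For readers following along, the kind of adjustment such a script makes can be sketched as follows. This is not the actual exploreIntrinsics.py; it only shows the arithmetic on the 3x3 camera matrix (function name and example values are illustrative). In a Core recording the matrix lives in the msgpack-encoded world.intrinsics file, so back that file up before writing anything back:

```python
# Pinhole camera matrix convention:
#   [[fx, 0, cx],
#    [0, fy, cy],
#    [0,  0,  1]]
# fy (and optionally cy) controls the vertical projection of gaze
# into scene-camera pixel coordinates.

def set_vertical_focal_length(camera_matrix, new_fy):
    """Return a copy of a 3x3 camera matrix with fy replaced."""
    m = [row[:] for row in camera_matrix]  # copy rows, leave input intact
    m[1][1] = float(new_fy)
    return m

K = [[600.0, 0.0, 320.0],
     [0.0, 600.0, 240.0],
     [0.0, 0.0, 1.0]]

K_adj = set_vertical_focal_length(K, 550)  # K itself is unchanged
```

Lowering fy from 600 to 550 here mirrors the earlier report of eliminating a vertical offset by setting the y components of the intrinsics to 550.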

user-44c93c 08 March, 2023, 20:56:39

Thanks for taking a look at this and for the Python script. As for your question about the stimuli: what the participant sees after calibration is not generated in Unity; it comes from a stereoscopic camera.

user-8779ef 08 March, 2023, 21:10:21

Perhaps that script will help :). For now, you can ignore my lengthy description above.

user-e80f9b 10 March, 2023, 05:24:21

?????

user-0b29bb 13 March, 2023, 01:09:32

Hi! I want to buy an eye-tracker add-on for VIVE Pro. I wonder about the price and how long would it take to be delivered to South Korea.

user-c2d375 13 March, 2023, 08:31:47

Hi @user-0b29bb 👋 Please send an email to info@pupil-labs.com and we will follow up on your request

user-7c714e 13 March, 2023, 13:29:29

OK, thank you!

user-fa0f94 13 March, 2023, 17:58:43

Hello everyone! I use an old HTC Vive with SteamVR and hmd-eyes. I want to set up the demo projects and I need to know whether I can use the .NET Framework option, because I don't have .NET 4 in the drop-down list. .NET 2 doesn't work at all. .NET Framework seems to work properly, but some demo scenes don't work well (some objects seem to be in the wrong places, or other objects don't disappear during the calibration :c ). So... is .NET Framework equivalent to .NET 4? I use the latest LTS Unity version (2021.3.20f1).

Chat image

nmt 14 March, 2023, 09:25:33

Hi @user-fa0f94 👋. It's worth checking out this: https://docs.unity3d.com/Manual/dotnetProfileSupport.html

If .NET 4 is not showing in the dropdown, it might not be installed (https://www.microsoft.com/en-us/download/details.aspx?id=17851)

user-fa0f94 14 March, 2023, 14:23:57

One more question. Currently I'm working on a car simulator and I want to collect data about the objects the test subject looks at. Is it possible to create a collision shape in the game engine as an AOI around, for example, a traffic sign, do a raycast from the head position in the gaze direction, and save timestamped information about the object hit by the ray? Then I would map the fixation timestamps collected by Pupil Capture onto my logged hit objects at the matching timestamps, so I would know which fixation corresponds to which object/collision shape (AOI). I know it should be possible, but I'd rather know whether it is the proper way of doing eye tracking research 😛 I'm worried about problems syncing time from Pupil Capture with time collected from the game engine :c

nmt 15 March, 2023, 07:31:36

I believe this is an approach used by other researchers. You can read about the raycasting here: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#gazedirection--gazedistance-instead-of-gazepoint3d Time sync between Unity + Capture is all taken care of. Recommend a thorough read of the developer docs, actually. It's very informative!
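Since hmd-eyes takes care of the clock sync, the final matching step reduces to a window lookup: for each fixation, collect the logged raycast hits inside its time window and take the most frequent AOI. A hypothetical sketch (the data layout and names are illustrative, not part of hmd-eyes; both streams are assumed to be on the shared Pupil clock):

```python
from collections import Counter

def fixation_targets(fixations, hits):
    """Assign each fixation the AOI hit most often within its window.
    fixations: list of (start_ts, end_ts) pairs
    hits: list of (ts, object_name) raycast-hit log entries,
          timestamped on the same clock as the fixations."""
    out = []
    for start, end in fixations:
        names = [obj for ts, obj in hits if start <= ts <= end]
        out.append(Counter(names).most_common(1)[0][0] if names else None)
    return out

fixations = [(0.0, 0.3), (0.5, 0.8)]
hits = [(0.05, "traffic_sign"), (0.15, "traffic_sign"),
        (0.25, "road"), (0.6, "road"), (0.7, "road")]
result = fixation_targets(fixations, hits)  # ["traffic_sign", "road"]
```

Majority voting inside the window makes the assignment robust to a stray raycast sample grazing a neighbouring collider mid-fixation.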

user-8779ef 15 March, 2023, 22:13:37

Hello, folks. I have just opened a recording made using the Unity plugin, and there appears to be a constant temporal offset between the calibration points and the imported features, which should overlap perfectly with the calibration points. This is important data, collected from a particular medical patient who travelled from out of state to participate. They have since left. I'm motivated to make this work. Have you seen this before?

user-8779ef 15 March, 2023, 22:14:31

Do you have any advice for how to investigate the issue and fix it?

user-8779ef 15 March, 2023, 22:18:18

Here's a visualization of the issue: the white sphere lags behind the green disc overlay (natural feature), which is drawn using data extracted from the notifications file exported from hmd-eyes.

Chat image Chat image

user-8779ef 16 March, 2023, 13:44:32

I can confirm that it is an issue with just calibration features. Eye movies and world movie are in sync

user-8779ef 16 March, 2023, 13:44:54

Perhaps the Unity time is off, but if the world frame index is accurate, then I think it will be OK.

nmt 16 March, 2023, 19:12:11

Hi @user-8779ef. It's difficult to say much about this without having access to the data. I suppose being from a patient it's not feasible to share. Is this the first recording where you've seen the offset?

user-cd03b7 16 March, 2023, 18:46:27

Can anyone confirm whether the Pupil drop-in binocular add-on for the HTC Vive will work with the Vive Pro 2?

user-8779ef 16 March, 2023, 18:57:52

They don't

user-8779ef 17 March, 2023, 02:57:13

I will try modifying the notification timestamps.

user-8779ef 19 March, 2023, 00:37:05

Fitting it inside is only half the battle. We will also need a Unity plugin for temporal sync, screen grabbing, and accurate ray casting. Is that on the roadmap yet?

user-8779ef 21 March, 2023, 12:53:23

Well, consider this a formal request :).

nmt 22 March, 2023, 17:14:06

Hey @user-8779ef. This isn't on an official roadmap. Feel free to add it as a feature request: https://feedback.pupil-labs.com/neon 🙂

user-cd03b7 22 March, 2023, 23:01:08

Unrelated, where are you guys getting your original Vive Pro headsets? Vive doesn't seem to be selling them on their website

user-8779ef 25 March, 2023, 20:11:35

Do you have an example of how to deal with this? Perhaps I could remove specific pldata entries and replace them with nearly identical ones with modified timestamps?
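A hedged sketch of the timestamp-shift idea. In a Core recording, each data stream "X" is stored as X.pldata (a msgpack stream of serialized messages) plus X_timestamps.npy. The sketch below only shows the offset arithmetic on a plain list of timestamps; reading and writing the .npy (numpy.load / numpy.save), and rewriting the msgpack payloads, which also carry their own "timestamp" fields, are left as comments because the exact payload layout should be checked against your own recording first (and the recording backed up):

```python
# Assumed workflow (verify against your recording before trusting it):
#   1. ts = numpy.load("notify_timestamps.npy")
#   2. shift ts by the measured constant offset
#   3. numpy.save("notify_timestamps.npy", ts)
#   4. if needed, unpack notify.pldata with msgpack, patch each
#      message's "timestamp" field by the same offset, and repack

def shift_timestamps(timestamps, offset_s):
    """Return a new list with every timestamp shifted by offset_s seconds."""
    return [t + offset_s for t in timestamps]

ts = [100.00, 100.10, 100.20]
shifted = shift_timestamps(ts, -0.35)  # remove a measured +0.35 s lag
```

The offset itself can be measured from the data, e.g. as the lag that best aligns the calibration-target trajectory with the imported natural-feature trajectory.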

End of March archive