vr-ar


user-8603f0 01 August, 2018, 00:05:51

This eye image was taken from the Pupil Labs eye tracker (Vive Add-on). Does anyone know why those white dots appear? When they encircle the pupil region, my pupil confidence drops drastically. Is there a way to reduce their influence and increase the accuracy?

Chat image

wrp 01 August, 2018, 05:42:29

@user-8603f0 the white dots are IR illuminators from our HTC Vive Add-on. Confidence should not go down due to the presence of the illuminators as they are accounted for in the algorithm. You can check the algorithm view of the eye window and adjust parameters if desired.

user-8603f0 01 August, 2018, 18:14:57

@wrp All right. It probably was a false positive. I was trying to find the source of the problem, and this seemed a convincing explanation. I see now that isn't the case. Let me try adjusting the parameters as you said. Thanks!

wrp 01 August, 2018, 18:54:11

Welcome @user-8603f0

user-57cf8e 02 August, 2018, 09:06:10

Hello, I have a question. When I start the 3D demo from the market, the gaze indicator (the yellow dot) does not follow my eye movement correctly and jumps around wildly. Has anyone ever had similar problems? We tested this with multiple people.

Edit: We use the HTC VIVE Add-on.

user-626718 02 August, 2018, 14:14:30

Hey again, I am still dealing with the performance problems and am trying to ignore the graphics problem for now. The thing is that the "Unity Gaze Data" I get from recording is not displaying the correct eye movements. I know the eye is performing optokinetic eye movements, but when processing the recorded pupil positions from the "Unity Gaze Data" CSV file in MATLAB, the expected typical pattern is not visible. How can this happen? Is there any way of changing something in the scripts to get more accurate data? I have been stuck on this problem for a while now and I really don't know what to do anymore.

user-626718 02 August, 2018, 14:16:04

I am really grateful for any kind of hint that might lead me in the right direction, as I am pretty new to Pupil Labs.

user-8603f0 02 August, 2018, 19:31:33

@user-57cf8e This was the response to my question about a similar problem I had with 3D gaze data. I do know Pupil Labs have made some improvements to the algorithm since then (it's been almost a year). Any update from Pupil Labs on this? Or is it still in the proof-of-concept stage? From my latest tests, I still see the 3D marker being volatile, moving around constantly without being stable.

Chat image

user-38ed5d 06 August, 2018, 17:01:47

I am trying to import the Unity HoloLens package, but I get "Assets/pupil_plugin/Scripts/Networking/UDPCommunication.cs(7,7): error CS0246: The type or namespace name `HoloToolkit' could not be found. Are you missing an assembly reference?" HoloToolkit is missing in the project view and was not present in the import dialog. I am new to Unity and want to help a colleague. What am I missing?

user-58d5ae 07 August, 2018, 09:16:59

You are missing HoloToolkit, the API from Microsoft for using HoloLens in Unity: https://github.com/keluecke/MixedRealityToolkit-Unity

user-3638ed 09 August, 2018, 14:33:23

Hey mates! I have a quick question on the subscription behavior within Unity. Is it possible to subscribe to two different topics like "gaze" and "blink" within one instance of the project?

I already checked the PupilTools script on GitHub for any predefined parameters in the PupilTools.SubscribeTo() function that I could use to bind the subscription to a script instance or something, but couldn't find any.

Any ideas on that?

Thanks a lot!
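
A minimal sketch of how a multi-topic subscription could look, assuming the hmd-eyes 0.5.x API in which PupilTools.SubscribeTo() takes a topic string (SubscribeTo is named in the discussion above; the UnSubscribeFrom counterpart and the dispatch of incoming messages are assumptions to verify against PupilTools.cs):

```csharp
using UnityEngine;

// Hypothetical listener that keeps two topic subscriptions alive in one
// running project instance; PupilTools is the static helper from the plugin.
public class MultiTopicListener : MonoBehaviour
{
    void OnEnable()
    {
        // Each call is assumed to add one more topic filter to the shared
        // subscription socket, so "gaze" and "blink" coexist.
        PupilTools.SubscribeTo("gaze");
        PupilTools.SubscribeTo("blink");
    }

    void OnDisable()
    {
        // Assumed counterpart call; check the exact name in the plugin source.
        PupilTools.UnSubscribeFrom("gaze");
        PupilTools.UnSubscribeFrom("blink");
    }
}
```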

user-f1d099 13 August, 2018, 18:41:58

Hello, our lab has just installed a new set of eye trackers on our HTC Vive and ran the software for the first time on this device. We have run into an issue with one of the eye cameras being very dark, with what we assume is the IR lights flickering. We will be uninstalling and reinstalling the cameras later today in an attempt to resolve the issue.

If anyone has some input, it would be appreciated.

wrp 14 August, 2018, 02:49:44

Hi @user-f1d099, we received an email from you at [email removed] and will respond there. Quick notes here: the flickering you are seeing is from the HTC Vive's Lighthouse system. You will need to ensure that the VR headset is being worn such that there are minimal gaps (it seems like the headset is partially off if this much IR light is coming into the headset).

user-626718 14 August, 2018, 12:58:24

Hi, my goal is to calculate the velocity of pursuit movements of the eye, so I guess it makes more sense to use the pupil coordinates from the recorded CSV data file. However, I am not sure how to convert the normalized coordinates of the pupil positions in the eye image frame into coordinates in m or mm. Is someone familiar with that? It would be nice to get some hints! :)
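
On the first half of the question: the normalized coordinates can at least be mapped back to eye-image pixels, assuming Pupil's documented norm_pos convention ([0, 1] range with a bottom-left origin, versus a top-left pixel origin). A metric (mm) conversion would additionally need the eye camera intrinsics and the distance to the pupil plane, which norm_pos alone does not carry. A rough sketch:

```csharp
using UnityEngine;

public static class PupilCoords
{
    // Maps a normalized pupil position to eye-image pixel coordinates.
    // Assumes norm_pos lies in [0, 1] with a bottom-left origin (Pupil's
    // convention) and flips y for the usual top-left image origin.
    public static Vector2 NormPosToPixels(Vector2 normPos, int imgWidth, int imgHeight)
    {
        float xPx = normPos.x * imgWidth;
        float yPx = (1f - normPos.y) * imgHeight;
        return new Vector2(xPx, yPx);
    }
}
```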

user-54f521 15 August, 2018, 13:00:47

Hello everyone, we use the Pupil Labs HoloLens version with the Unity plugin. Since the beginning, the pupil position data we get in Unity has been pretty poor, even with proper setup and calibration. It seems that the tracking in the Pupil Capture software is pretty stable, but in Unity the pupil marker position data is pretty inaccurate. Any tips on what might be going wrong?

user-8779ef 15 August, 2018, 14:21:36

@user-626718 I would think that you would actually want to use the 3D gaze normals, which are already in the world coordinate system. Typically, smooth pursuit velocity is reported in degrees per second.

user-626718 16 August, 2018, 13:09:24

@user-8779ef Thank you!

user-8779ef 17 August, 2018, 03:42:15

@user-626718 Actually, it is more accurate to say that the gaze normals are in head space. Consider that the head can contribute to rotational velocity! Gaze pursuit (eye + head) vs. eye pursuit. If you want eye + head, you will have to apply the head rotation to the gaze normals to recover gaze-in-world.
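
A minimal sketch of that last step, assuming gazeNormalHead is a gaze normal in head coordinates and that the HMD camera transform tracks the head (both names are illustrative):

```csharp
using UnityEngine;

public class GazeInWorld : MonoBehaviour
{
    // The tracked head transform, e.g. the HMD camera in the scene.
    public Transform hmdCamera;

    // Rotates a head-space gaze direction into world space. Translation is
    // irrelevant because a gaze normal is a pure direction.
    public Vector3 ToWorld(Vector3 gazeNormalHead)
    {
        return hmdCamera.rotation * gazeNormalHead;
    }
}
```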

user-6997ad 18 August, 2018, 04:09:09

Hey all. Using the Vive plugin, Windows, Unity 2018.2.1f1, Pupil Capture 1.8, and the latest hmd-eyes, 0.5.1. I can finish calibration in the 2D market demo and enter the market; however, none of the gaze visualization is working. The 3D market demo completes calibration without an error message or success message, and also moves to the market, but without any text instructions and with no indication of gaze visualization. I can't find any mention of a problem like this in the GitHub issues or here. Has anyone else seen this before?

user-97591f 20 August, 2018, 15:39:03

@user-6997ad We had the same problem too. Refer to my pull request here, or to issue 51; the market scenes work with our fix: https://github.com/pupil-labs/hmd-eyes/pull/52

user-24270f 21 August, 2018, 06:11:37

Is there a way to calibrate without a world camera and without it being in VR?

user-24270f 21 August, 2018, 06:11:57

Like, look at a screen marker at a known distance, or anything at all?

user-24270f 21 August, 2018, 09:02:19

Essentially, I have people wearing augmented reality glasses. I want to know what icons they are looking at... but I don't have access to a world camera due to the system.

user-b4961f 21 August, 2018, 11:13:25

Hi, I have a Vive plugin and I want to make my system wireless using TPCast. Did anyone here get such a system to work? Is there a way to connect the Pupil plugin to the TPCast? Would the mobile app work in this case? Can I use some Bluetooth/WiFi device to make the Pupil plugin wireless? Thanks!

user-626718 22 August, 2018, 07:40:09

@user-8779ef I am not sure I understood that right: as I am just interested in eye movement, can I use the gaze normals, or better said, the world coordinates? And is there an easy way to convert the coordinates into degrees? My idea was to convert the world coordinates back to vectors in a coordinate system with its origin at the observer, and to calculate the angle between two consecutive vectors to get the speed as an angular velocity. However, I have no clue how to match the normalized coordinate system of the world coordinates with the Unity coordinate system in order to factor in the distance from the observer to the normalized coordinate system.

user-626718 22 August, 2018, 07:41:10

Hope that explanation makes sense, I don't really know how to explain it better 😅
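
The angle-between-consecutive-vectors idea sketched in code, assuming gaze direction vectors (e.g. the 3D gaze normals discussed above, with the head fixed) and timestamps in seconds; Vector3.Angle returns degrees, so dividing by the sample interval gives deg/s:

```csharp
using UnityEngine;

public static class PursuitVelocity
{
    // Angular velocity between consecutive gaze samples, in degrees per
    // second. Assumes at least two samples and strictly increasing
    // timestamps.
    public static float[] DegreesPerSecond(Vector3[] gazeDirs, float[] timestamps)
    {
        var velocity = new float[gazeDirs.Length - 1];
        for (int i = 1; i < gazeDirs.Length; i++)
        {
            float dt = timestamps[i] - timestamps[i - 1];
            velocity[i - 1] = Vector3.Angle(gazeDirs[i - 1], gazeDirs[i]) / dt;
        }
        return velocity;
    }
}
```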

user-8779ef 22 August, 2018, 14:40:55

@user-626718 So, you're just interested in eye movements, even if the head is contributing to tracking behavior by rotating at the same time? Or... will you use a chin rest?

user-8779ef 22 August, 2018, 14:41:14

For now, I'll take you at your word, that head movements aren't relevant to your research question.

user-8779ef 22 August, 2018, 14:42:04

If that's the case, then yes, you can use the 3D gaze normals. ... but let me back up here, because I believe you may have stumbled upon a stopping point. The last time I checked, Pupil Labs' 3D calibration (the 3D market scene demo) was not working.

user-8779ef 22 August, 2018, 14:42:48

You'll find that if you use only the 2D calibration, you won't get 3D gaze normals. You will only get a point of regard on the viewing plane (I forget the exact variable name returned in the pupil gaze sample).

user-8779ef 22 August, 2018, 14:44:09

To convert this point of regard into a vector will take additional work. I have not put time into this, and can't really help you there. Sorry. Unfortunately, the hmd-eyes package has not yet received the attention that the mobile tracking package has received. I do understand that they have hired a developer for hmd-eyes, but I haven't yet seen evidence of their involvement.
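
One possible starting point for that conversion, assuming the 2D point of regard arrives as normalized viewport coordinates on the camera's viewing plane (whether the hmd-eyes value actually matches this convention would need to be verified):

```csharp
using UnityEngine;

public class GazeRayFrom2D : MonoBehaviour
{
    public Camera hmdCamera;

    // Unprojects a normalized viewport point ([0,1] x [0,1]) into a ray
    // through the scene, using Unity's built-in viewport unprojection.
    public Ray PointOfRegardToRay(Vector2 viewportPoint)
    {
        return hmdCamera.ViewportPointToRay(
            new Vector3(viewportPoint.x, viewportPoint.y, 0f));
    }
}
```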

user-6997ad 22 August, 2018, 19:23:50

Has anyone else experienced severe lag upon starting recording? Wondering if disabling some things (such as eye recordings which we don't need) might help alleviate this.

user-626718 23 August, 2018, 07:02:08

@user-8779ef Yeah, I can use a chin rest to be able to ignore head movements! Thanks a lot for your response! I was also trying to work with the 3D calibration and had the same impression as you! Maybe I can figure out an idea with the 2D calibration.

user-626718 23 August, 2018, 07:04:01

@user-6997ad I have the same trouble; however, I do not know how to solve it yet.

user-1486c3 23 August, 2018, 07:15:53

@user-6997ad Yeah, I had some problems with that. For me, it was the recording of the VR scene using the plugin that caused it. See if the lag is still there when you disable it.

user-29e10a 23 August, 2018, 13:17:47

@wrp Hi, do you guys plan to adapt the eye cameras to the Microsoft XR Platform? Since there are plenty of headsets available with similar cases, this could be (!) a good idea....

user-29e10a 23 August, 2018, 13:18:49

Maybe it could be (!) a good idea to ditch Oculus support, since it is too small for the eye cameras. I'm looking at you, 200 Hz cam.

user-6997ad 25 August, 2018, 06:05:13

Is anyone using hmd-eyes with the SteamVR plugin successfully? After a successful calibration, the controller and disk eye trackers are translated away from the view. I suspect this is an error in the way that PupilSettings.Instance.currentCamera is interpreted. Currently we are setting this to Camera.main, which should select the first "Camera (head)" in the SteamVR [CameraRig]. It would be nice if there was a demo scene that used SteamVR for controller tracking on the Vive, since this must be a common use case.
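
For reference, the assignment described above as a minimal script; PupilSettings.Instance.currentCamera comes from the message itself, and the rest is an assumption about a typical SteamVR scene:

```csharp
using UnityEngine;

public class UseSteamVRHeadCamera : MonoBehaviour
{
    void Start()
    {
        // Camera.main returns the first enabled camera tagged "MainCamera";
        // in a default SteamVR [CameraRig] this should resolve to the tracked
        // head camera. If gaze visuals end up translated away from the view,
        // it is worth logging which camera was actually picked.
        PupilSettings.Instance.currentCamera = Camera.main;
        Debug.Log($"hmd-eyes camera: {Camera.main.name}", Camera.main);
    }
}
```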

user-8603f0 27 August, 2018, 19:07:43

Can someone help me interpret the values from the Accuracy Visualizer? I am using the Vive Add-on eye tracker. I am trying to objectively quantify the calibration quality, so that I can make the user redo the calibration if it is not correct. The result of the first calibration was very accurate (subjectively); I could see the markers moving in the direction of my gaze. In the second calibration, the result was very poor, with the markers not moving in the correct direction. These were the values I got from the Accuracy Visualizer for the two calibrations:

1. (Very accurate)
First iteration: root-mean-square residuals: 33.21552888189745 pixels. Second iteration, ignoring outliers: root-mean-square residuals: 3.7490789299027294 pixels; used 549 data points out of the full dataset of 750: subset is 73.20 percent.
First iteration: root-mean-square residuals: 33.31842235503032 pixels. Second iteration, ignoring outliers: root-mean-square residuals: 5.445480268993712 pixels; used 553 data points out of the full dataset of 751: subset is 73.64 percent.
Angular accuracy: 2.7640669345855713. Used 729 of 4014 samples. Angular precision: nan. Used 173 of 4013 samples.

2. (Poor accuracy)
First iteration: root-mean-square residuals: 27.639481603847567 pixels. Second iteration, ignoring outliers: root-mean-square residuals: 4.724444186616267 pixels; used 585 data points out of the full dataset of 688: subset is 85.03 percent.
First iteration: root-mean-square residuals: 27.756651407035353 pixels. Second iteration, ignoring outliers: root-mean-square residuals: 3.4698570971874507 pixels; used 583 data points out of the full dataset of 688: subset is 84.74 percent.
Angular accuracy: 1.554818868637085. Used 621 of 3718 samples. Angular precision: nan. Used 410 of 3717 samples.

I can't understand how the second calibration used more data points but resulted in poor accuracy. All the other values indicate that the second calibration should be better than the first one, but in reality it is the other way around.

user-e08215 28 August, 2018, 20:26:45

Has anyone tried updating the project to the new Unity 2018.2.3?

user-23e10d 31 August, 2018, 19:06:08

We're using the Oculus DK2 tracker, and our gaze data seems to be complete garbage after calibration (not even remotely corresponding to the correct viewport coordinates, i.e. looking low and looking high both correspond to a y coordinate of 0, or close to it). Some calibrations seem slightly better than others, but it's quite inconsistent, and even the better calibrations usually don't track the eyes correctly in at least one direction. Has anyone else experienced this? We've tried increasing the size of the circle, seemingly to no avail. Has anyone else run into calibration problems with the DK2 in Unity?

mpk 31 August, 2018, 19:21:59

@user-23e10d we have not seen this before. Can you share a recording including the eye videos with data(at)pupil-labs.com for debugging?

user-23e10d 31 August, 2018, 20:01:45

After fiddling with the settings in Pupil Capture, movements seem a little more stable, but still rotated at the very least. I'll try to get a recording.

End of August archive