πŸ₯½ core-xr


user-fa97ea 07 October, 2021, 07:06:42

Hi! I used the accuracy visualizer plugin, but it doesn't recognize the top and bottom ranges. Does anyone have an idea? This was done in ScreenCastDemoScene.

Chat image

papr 11 October, 2021, 12:17:08

Hi, I was able to review the recording. Please see my notes below: The green outline is calculated based on the recognized reference locations. These are correct.

Orange lines visualize the error between recognized reference locations and predicted gaze locations. The accuracy visualizer excludes "outlier" samples from its calculation and visualization (samples with a larger angular error than a fixed threshold, default 5.0 degrees). In your case, many samples are being excluded (From your email: Used 1922 of 4642 samples). This is why your screenshot does not show all lines.
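Roughly, the exclusion step can be pictured like this (a hypothetical Python sketch of angular-error thresholding, not Pupil's actual implementation; function names are illustrative):

```python
import math

def angular_error_deg(v1, v2):
    """Angle in degrees between two 3D direction vectors (gaze vs. reference)."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))  # clamp for float safety
    return math.degrees(math.acos(cos_angle))

def split_outliers(pairs, threshold_deg=5.0):
    """Split (gaze, reference) direction pairs at the angular-error threshold."""
    inliers, outliers = [], []
    for gaze, ref in pairs:
        target = inliers if angular_error_deg(gaze, ref) <= threshold_deg else outliers
        target.append((gaze, ref))
    return inliers, outliers
```

With a 5.0 degree default, "Used 1922 of 4642 samples" corresponds to the `inliers` list holding 1922 of the 4642 pairs.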

I cannot tell for sure what is causing the calibration to be so inaccurate that the majority of the samples are regarded as outliers. The backprojections (yellow and green dots) look fairly good, too. Nonetheless, I feel like the issue could be related to the Unity camera intrinsics. Again, I cannot tell for sure, though.

Your eye models are fit fairly well, but the fit could be slightly better, the left eye's specifically. I see that you already perform the eye rotation procedure. I suggest doing it with more extreme angles. You can also perform it more quickly than in the recording.

user-d1efa8 07 October, 2021, 11:22:55

I usually get that when the model isn't really fitting correctly at those points (i.e. the confidence isn't high enough to register towards calibration). Try moving your eyes around a bit so that the model gets a better fit, freeze the model, and see how the confidence holds up when you fixate around those points. If it looks good, try running the calibration again. You can also play around with the camera settings in the Capture app to see if you can get a better image for processing; sometimes the image might be too bright or too dark, and finding that sweet spot helps provide a more stable model.

user-fa97ea 08 October, 2021, 04:49:44

Thank you very much for your answer. I tried it and the confidence was high enough (>0.8). However, it still remained unrecognized. Is it possible that the top and bottom of the retrieved data are swapped?

papr 08 October, 2021, 05:14:19

Note that the confidence displayed in Capture refers to the 2d confidence, not the 3d confidence. Feel free to share an example recording with us so that we can give concrete feedback. Flipping the eye image does not make a difference.

user-d1efa8 08 October, 2021, 05:02:50

Yeah, you can actually flip the image for each eye in the camera settings, but I'm not sure if that matters for calibration. Try changing it, though.

user-fa97ea 08 October, 2021, 05:35:27

ok, I'll share the data via e-mail, thank you.

user-fa97ea 14 October, 2021, 08:58:12

Hi, I tried some things and will share what I've noticed. When I checked the exported data, I noticed that the Y axis was pointing in the opposite direction from what I expected (the top area in Unity is negative in the exported data). I think this is why so many samples are being regarded as outliers. Is there any way to solve this?

papr 14 October, 2021, 09:03:08

If you are referring to the gaze_point_3d_y values, please be aware that they follow the 3d camera coordinate space https://docs.pupil-labs.com/core/terminology/#coordinate-system whose origin is at the center of the camera.
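If the sign difference comes purely from the camera convention (assuming the 3d camera space, like most camera coordinate systems, has +y pointing down, whereas Unity's +y points up), a minimal, hypothetical conversion would just flip the y component:

```python
def pupil_cam_to_unity(point_3d):
    """Convert a point from a y-down camera space (x right, y down, z forward)
    to a Unity-style y-up space (x right, y up, z forward) by flipping y.
    Sketch only: verify the axis conventions against your own scene setup."""
    x, y, z = point_3d
    return (x, -y, z)
```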

papr 14 October, 2021, 09:01:32

Could you please specify the exact name of the Y-axis export?

papr 14 October, 2021, 09:10:26

A possible test to verify a y-axis flip would be to increase the outlier threshold to a very large number (e.g. 360 degrees) and check whether the pupil data is indeed mapped to the wrong direction.

user-fa97ea 14 October, 2021, 09:19:20

Got it, I'll try this. Thanks!!

user-73b616 14 October, 2021, 16:46:08

Hello, one of the two cameras of my VR add-on stopped working and shows up as 3141, 25442, "Sonix Technology Co., Ltd." USB2 camera. It works via the standard Windows Camera app but not via the Pupil Labs software. I tried manual driver installation, which succeeded; however, selecting that camera in Pupil Service causes a BSOD with the error MULTIPLE_IRP_COMPLETE_REQUESTS. Any fix for that, please?

papr 14 October, 2021, 17:06:04

please contact info@pupil-labs.com in this regard

user-7aedda 19 October, 2021, 06:40:37

Hi! I'm using the HTC VIVE add-on and trying to capture pupil size only, but I found the pupil data is inaccurate, very small (~0.5-2 only, even with light stimulation), and I found that the number of supporting pupil observations is 0; what is wrong with my setting?

user-7aedda 19 October, 2021, 06:41:12

papr 19 October, 2021, 07:10:17

Please see https://docs.pupil-labs.com/core/best-practices/#pupillometry If you want, we can give more concrete feedback. In this case, please share a Pupil Capture recording with [email removed]

papr 19 October, 2021, 08:44:01

@user-7aedda It is not only about the positioning but also about sampling different eye angles.

That said, it is difficult to judge the eye models' quality from a single picture like this alone. Please create and share a short example recording with Pupil Capture in which you roll your eyes until you think the model is fit well, based on the description in the link above.

user-7aedda 19 October, 2021, 07:46:34

Thanks for your prompt reply! I finally get the correct pupil size data, I saw the pupil diameter 3D range from -1.6-4.1mm, but when I export it to .csv via player, no such data can be found, largest number of circle_3d_radius is 1.2 only

Chat image

papr 19 October, 2021, 07:51:41

Hi, your eye models are still not fit well. You can improve the fit by rolling your eyes like here https://youtu.be/7wuVCwWcGnE?t=14

Pupil Player displays the diameter in the timeline. You are looking at circle_3d_*radius*. Alternatively, have a look at the diameter_3d column.

Also, the timeline y-axis limits are not necessarily the total range of the data. See this code for details https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/pupil_producers.py#L219-L227

user-7aedda 19 October, 2021, 07:46:39

So how can the Pupil Player app display that data? How can I export it?

user-7aedda 19 October, 2021, 07:58:32

If I understand it correctly, my eye model still needs adjustment; that's why my pupil size data is wrong (see the picture attachment, the largest radius is only 1.2mm…).

Chat image

papr 19 October, 2021, 08:22:12

Correct πŸ™‚

user-7aedda 19 October, 2021, 08:42:24

I worked very hard to adjust my position and settings, still no luck; the pupil size is still very small (~0.7mm). Any tips for the HTC VIVE add-on?

user-7aedda 19 October, 2021, 08:41:08

Chat image

user-7aedda 19 October, 2021, 09:27:55

papr 19 October, 2021, 09:41:36

One more note: I know that the camera placement options in the VR add-on are very limited, but from the video it looks like the pupil would leave the field of view if the subject looked down. This will cause issues in the pupil detection.

user-7aedda 19 October, 2021, 09:28:30

I have tried my best to adjust it, I just don't know how to do it right; there's no tutorial online.

user-7aedda 19 October, 2021, 09:28:42

Any tips for it?

papr 19 October, 2021, 09:29:50

From the documentation: "A well-fit model is visualized by a stable circle that surrounds the modelled eyeball, and this should be of an equivalent size to the respective eyeball. A dark blue circle indicates that the model is within physiological bounds, and a light blue circle out of physiological bounds." This is not the case in your video. I suggest sampling larger angles and for a longer period of time. The eye movements can be faster than in your video.

user-e33900 22 October, 2021, 13:25:32

Hello! I am interested in starting a Pupil Capture recording when I play the scene in Unity, without pressing R. How exactly should I do that? I didn't understand exactly what you meant about the remote.

user-37ee59 25 October, 2021, 14:42:27

Hey all,

I am currently trying to query data about the eyes through the Pupil Labs Eye Tracker extension of the VIVE HMD. For this I would like to use the Python API. However, I am having a problem communicating with the Pupil Labs eye trackers.

When I try to get a response to the previously sent request with the line "ipc_sub_port = requester.recv().decode("utf-8")", my program waits forever. However, if I start the "Pupil Service" software in parallel, the eye trackers answer my program directly. The only problem then is that two connections to the eye trackers are established at the same time, which again does not work.

My question would be why the Pupil Labs eye trackers do not respond when I try to communicate with the trackers via the lines

    requester.send_string('SUB_PORT')
    ipc_sub_port = requester.recv().decode("utf-8")

to establish a connection.

This problem occurs with all types of "send_string". I've also tried the examples from the site https://docs.pupil-labs.com/developer/core/network-api/#pupil-remote

papr 25 October, 2021, 14:49:52

It is important that the script connects to the correct port. Check out the Network API menu in the Pupil Capture or Pupil Service main window. Both show their respective ports. Adjust them in the script accordingly.
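As a minimal sketch (the default Pupil Remote port 50020 is assumed here; check the Network API menu for the actual value), the request/response round trip with a receive timeout looks like this, so the script fails fast instead of hanging forever when nothing is listening:

```python
import zmq

def request_sub_port(host="127.0.0.1", remote_port=50020, timeout_ms=2000):
    """Ask Pupil Remote (running inside Capture/Service) for the IPC SUB port.
    Returns the port string, or None if nothing answered before the timeout."""
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.setsockopt(zmq.RCVTIMEO, timeout_ms)  # do not block forever on recv
    req.setsockopt(zmq.LINGER, 0)
    req.connect(f"tcp://{host}:{remote_port}")
    req.send_string("SUB_PORT")
    try:
        return req.recv_string()
    except zmq.error.Again:  # no Capture/Service answered in time
        return None
    finally:
        req.close()
```

If this returns None, nothing answered on that port, which usually means Capture/Service is not running or is listening on a different port.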

user-37ee59 25 October, 2021, 15:00:45

Hey, thanks for the fast reply. The ports should be correct, as my Python script starts to communicate with the eye trackers as soon as I start "Pupil Service". It seems like "Pupil Service" sends some kind of wake-up signal to the Pupil Labs hardware.

papr 25 October, 2021, 15:01:52

Regarding "to the Pupil Labs": I'm not sure what you are referring to by this.

user-37ee59 25 October, 2021, 15:02:46

Oh, sorry. I meant the Pupil Labs VIVE HMD add-on.

papr 25 October, 2021, 15:02:26

Please be aware that either Pupil Capture or Pupil Service needs to be running in order for these scripts to work.

user-37ee59 25 October, 2021, 15:03:46

Oh, they do? That already helps a bit. But in that case I get another error: the remote host closed the connection.

papr 25 October, 2021, 15:03:38

Yes, the hardware is accessed by the software. The software is responsible for processing the video provided by the hardware and performing the pupil detection/gaze estimation on it.

papr 25 October, 2021, 15:04:09

The script just accesses the result.

papr 25 October, 2021, 15:04:26

That sounds like the Capture or Service application shut down.

user-37ee59 25 October, 2021, 15:11:52

I'll look into that. You already helped me a lot. Thank you.

user-37ee59 25 October, 2021, 15:53:33

I'm trying to calibrate the Pupil Labs VIVE HMD add-on. I've tried to do it via code, but couldn't figure out how. I guess I can also manage to calibrate it with the "Pupil Capture" software, as it comes with a calibration feature... obviously. But that seems to require a video stream, and the add-on for the VIVE HMD does not come with a world camera. In order to calibrate the Pupil Labs VIVE HMD add-on, I need to get a live video stream from the perspective of the HMD. Is that correct? Do I then have to feed that video stream into the "Pupil Capture" software?

papr 25 October, 2021, 15:56:27

In order to calibrate the vive add-on, you need a client to display and provide reference locations. An example client would be our hmd-eyes project that implements the necessary functionality as a Unity plugin. I recommend reading the docs https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#getting-started There is also a section about calibrating and a list of demo scenes

user-6c2e03 26 October, 2021, 21:01:04

Hey! My lab is working with the Pupil Labs VIVE addon in Unity, and we're trying to use the eye tracking to limit the user's vision to a small area that follows their gaze. I've been able to consume data directly from the eye tracker by linking my own code to the gazeController.OnReceive3dGaze event. The window now follows the user's gaze very quickly, but there are some problems.

1, the window is very jittery. I notice that the gaze visualizer code you provide is less jittery than the window, which makes me think you performed some kind of smoothing on it. Edit: I looked into the gaze visualizer code and found that you just don't move the green ball if the confidence from the eye tracker is below a threshold. After implementing this in my code for the window, the jitters are a lot less. If there's anything you could think of to help me reduce it further, please still let me know though.

2, the window occasionally lags for about a second. This also happens with the gaze visualizer, sometimes it just freezes for a moment before snapping across the screen to the correct place.

I'm wondering if you have any advice on mitigating these effects or any others you may know about to increase the accuracy of the window. Ideally we want to use this for research purposes and we need it to be very fast and accurate. Thanks!

papr 27 October, 2021, 15:11:37

Hi! re 1) yes, we just filter by confidence. If you want to apply further smoothing, I can recommend https://cristal.univ-lille.fr/~casiez/1euro/ re 2) these are probably longer periods of low confidence which is why there is no update. But I cannot tell for sure. Otherwise, there might be something else blocking the thread longer than usual.
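For illustration only (the class name and defaults below are made up, not part of hmd-eyes), a confidence gate combined with simple exponential smoothing could look like this; the 1€ filter is the more principled, speed-adaptive version of the same idea:

```python
class GazeSmoother:
    """Confidence-gated exponential smoothing for 2D/3D gaze points.
    alpha near 1.0 follows the raw signal closely; smaller alpha smooths
    more, at the cost of lag."""

    def __init__(self, alpha=0.5, min_confidence=0.6):
        self.alpha = alpha
        self.min_confidence = min_confidence
        self._state = None  # last accepted (smoothed) position

    def update(self, point, confidence):
        if confidence < self.min_confidence:
            return self._state  # hold the last accepted position
        if self._state is None:
            self._state = tuple(point)
        else:
            self._state = tuple(
                self.alpha * p + (1 - self.alpha) * s
                for p, s in zip(point, self._state)
            )
        return self._state
```

Feeding each OnReceive3dGaze sample through `update` and moving the window only when a non-None position comes back reproduces the gating behavior, with the smoothing on top.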

user-e33900 27 October, 2021, 15:01:20

Hello! I am collecting some data in a csv (button clicks, head movements) in Unity, timestamped with Time.realtimeSinceStartup, and I need to sync it with the pupil data I record through Pupil Capture. I am wondering if there is an example to follow on how to call TimeSync and make sure that all of these time points will be synced.

papr 27 October, 2021, 15:12:58

You can use this function if time sync is set up correctly https://github.com/pupil-labs/hmd-eyes/blob/master/plugin/Scripts/TimeSync.cs#L46
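The underlying idea can be sketched like this (function names are illustrative; the linked TimeSync.cs is the actual implementation): measure the offset between the two clocks once, compensating for the request round trip, then shift your locally recorded timestamps by that offset.

```python
import time

def measure_clock_offset(get_pupil_time, local_clock=time.perf_counter):
    """Estimate offset = pupil_time - local_time, using the midpoint of the
    round trip (NTP-style) to compensate for request latency."""
    t_before = local_clock()
    pupil_t = get_pupil_time()  # e.g. a time request to the running Capture
    t_after = local_clock()
    local_midpoint = (t_before + t_after) / 2.0
    return pupil_t - local_midpoint

def local_to_pupil_time(local_t, offset):
    """Convert a locally recorded timestamp into Pupil clock time."""
    return local_t + offset
```

Applying `local_to_pupil_time` to every Unity-side timestamp in your csv puts both data streams on the Pupil clock, so they can be joined directly with the exported gaze data.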

user-e33900 27 October, 2021, 15:14:11

ok! thanks a lot for the quick reply!

user-4f89e9 27 October, 2021, 20:02:33

Hey there, I'm trying to figure out what product my university bought from Pupil Labs. I was able to find an invoice, but it just says "head mounted USB Camera"; the weight was 200 grams and the price was the same as the Epson & HoloLens add-ons. Could anyone help me figure out exactly which product was purchased, or should I just go through the support email with the PO number?

papr 27 October, 2021, 20:03:27

"Should I just go through the support email with the PO number?" That would be the easiest way. Please contact info@pupil-labs.com

papr 27 October, 2021, 20:03:47

But there should be an acronym, too. I might be able to tell you what it is based on this acronym.

user-4f89e9 27 October, 2021, 20:05:38

Would that be in the invoice ID? Under Item, all it says is "head mounted USB Camera", followed by the weight and commodity code.

user-4f89e9 27 October, 2021, 20:06:49

I'll go ahead and draft up an email to be safe just because the university doesn't even know if it's head or desk mounted πŸ€·β€β™‚οΈ

papr 27 October, 2021, 20:07:11

Yeah, I think this would be easiest.

user-4f89e9 27 October, 2021, 20:14:58

I do believe it's from 2017, if either model came out after that.

papr 27 October, 2021, 20:32:51

For everyone reading this: We were able to resolve the issue via a direct message. The product in question was a Pupil Core headset.

End of October archive