👁 core


user-f7028f 01 November, 2017, 06:13:23

Hi guys! I wonder, is there a feature to "save" a calibration between Pupil Capture/Service starts? I use an HMD, so I believe it's enough to calibrate once and then just use one "calibration model" every time I use my VR helmet.

user-5d12b0 01 November, 2017, 19:27:06

@user-f7028f , in our experience with the HMD add-on, you have to recalibrate every time you take the HMD off even if just for a second. Even if the HMD is sitting on your cheek just a few mm higher, the calibration will be very wrong.

mpk 01 November, 2017, 19:28:00

@user-5d12b0 this is true for now. We will add a 3d calibration that uses the 3d eye model. It will be robust against this kind of movement.

user-5d12b0 01 November, 2017, 19:28:34

sounds great!

mpk 01 November, 2017, 19:28:57

I really hope we can release this next week!

mpk 01 November, 2017, 19:29:32

It's been quite a while! Also this will get better with the release of the improved eye model. I hope for that to happen in about 4 weeks.

user-5d12b0 01 November, 2017, 19:30:15

Can you point me to any references detailing how the eye model and 3d calibration work?

user-5d12b0 01 November, 2017, 19:30:38

I don't plan on modifying it, I'm just curious as to how it works.

mpk 01 November, 2017, 19:31:11

None of that is well documented. The new eye model is not at all documented. But we will publish a detailed report once we have it finalized!

user-5d12b0 01 November, 2017, 19:31:44

ok, I look forward to it.

mpk 01 November, 2017, 19:32:13

We will certainly make an announcement!

user-45d36e 02 November, 2017, 13:04:17

Just reposting in case this got lost! I am just wondering how to improve the accuracy of our calibrations. For offline calibration with the manual marker, it looks like the gaze maps well onto the marker location, but then sometimes when we reposition the marker the gaze is way off. How can we improve this online and is there a way to manually correct it offline?

papr 02 November, 2017, 13:33:32

@user-45d36e Could you post a screenshot that visualizes what you mean?

user-45d36e 02 November, 2017, 13:37:07

Our data is confidential so I don't think that I can! But if we visualize the fixation during offline calibration, I can see that the pink circle sits far away from the marker, even though the person was looking at the marker. It seems the most accurate in the center, but when we move the marker to the right or left, even though they are allowed to move their head to look at it, it seems to get very inaccurate. If you need to see it, I'll see if I can shoot a demo video.

papr 02 November, 2017, 13:38:17

A demo video would be very helpful!

user-45d36e 02 November, 2017, 13:38:34

Ok. I'll aim to get one to you today or tomorrow.

user-45d36e 02 November, 2017, 15:10:19

My coworker Chris is going to get you a demo video tomorrow!

user-2798d6 02 November, 2017, 15:49:30

Hello! I have just started using offline calibration. If I go through the offline calibration process and it doesn't "work" but the eye videos look good, what other things could I check or adjust to salvage that recording?

papr 02 November, 2017, 15:50:11

@user-2798d6 There is a difference between offline pupil detection and offline calibration.

user-2798d6 02 November, 2017, 15:51:16

I know - The detection process seems to be fine, but then when I go through the calibration with natural features, I mark everything but the visualizer shows the person is looking in other places so it's not accurate

user-ecbbea 02 November, 2017, 15:51:29

Is there any documentation available about how the device is calculating depth in the 3D gaze vector? I assume it's using vergence, but I can't find anything. Thanks

papr 02 November, 2017, 15:53:47

@user-ecbbea I do not think that we have documentation on that other than the source code. https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/calibration_routines/gaze_mappers.py#L439-L447
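
For context, vergence-based depth estimation amounts to intersecting the two 3D gaze rays. In practice one takes the midpoint of the shortest segment between them, since the rays rarely intersect exactly. A minimal sketch of that geometry (not Pupil's actual implementation; see the linked source for that):

```python
import numpy as np

def vergence_point(p0, d0, p1, d1):
    """Midpoint of the shortest segment between two gaze rays.

    p0, p1: 3D ray origins (e.g. estimated eye ball centers)
    d0, d1: gaze direction vectors
    Returns None for (nearly) parallel rays, i.e. no usable vergence.
    """
    d0 = d0 / np.linalg.norm(d0)
    d1 = d1 / np.linalg.norm(d1)
    w = p0 - p1
    b = d0 @ d1
    denom = 1.0 - b * b              # a = c = 1 for unit directions
    if denom < 1e-9:
        return None
    s = (b * (d1 @ w) - (d0 @ w)) / denom  # parameter along ray 0
    t = ((d1 @ w) - b * (d0 @ w)) / denom  # parameter along ray 1
    return ((p0 + s * d0) + (p1 + t * d1)) / 2.0
```

The distance from the eyes to that midpoint is the estimated gaze depth.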

user-ecbbea 02 November, 2017, 15:54:29

@papr thanks!

papr 02 November, 2017, 15:55:12

@user-2798d6 How many natural features do you use? The calibration tries to find a general solution for the mapping function. Therefore it does not always match the given reference points. @user-45d36e this might apply to your problem as well.

user-2798d6 02 November, 2017, 15:55:34

Probably 8-10

user-2798d6 02 November, 2017, 17:13:30

Should I be using more than 8-10 points for the offline calibration?

papr 02 November, 2017, 17:16:46

No, 8 are more than enough if they are well spread.

user-2798d6 02 November, 2017, 17:21:34

Great! So are there any other things I should be looking at or adjusting if the calibration still isn't accurate?

papr 02 November, 2017, 17:25:50

Pupil data confidence should be high and the markers should be set as precisely as possible. The new version will improve marker placement and user interaction. This is currently a bit unintuitive, especially concerning how long a marker is displayed vs. when it is used for calibration.

user-2798d6 02 November, 2017, 17:29:11

ok - also, when I'm actually placing the markers, I'm noticing a faint marker that is flickering around the screen near where I'm marking, but not actually in the same spot. Is that possibly affecting my markers?

papr 02 November, 2017, 17:36:34

Can you post a screenshot of that?

user-2798d6 02 November, 2017, 17:39:08

It's the faint blue dot on the right computer keyboard. During my calibration that blue dot flickered all over the keyboard

Chat image

papr 02 November, 2017, 17:40:35

Oh ok. These are false positives of the circle marker detection. They are ignored if the calibration section is set to natural features.

user-2798d6 02 November, 2017, 17:41:47

got it - thank you!

user-b79cc8 03 November, 2017, 13:10:58

Here is a link to a quick demonstration video. The viewer was following the toy the whole time. https://we.tl/aTZ1hJ92JG Just a reminder, I am just wondering how to improve the accuracy of our calibrations. For offline calibration with the manual marker, it looks like the gaze maps well onto the marker location, but then sometimes when we reposition the marker the gaze is way off, especially when the object isn't in the centre. How can we improve this online and is there a way to manually correct it offline?

user-c828f5 03 November, 2017, 14:01:28

@papr Hey! So there is something funky going on with Pupil Capture on Ubuntu. Every time I run the application from the icon (application icon created in the search path), it opens up, blinks for a while (the way an application blinks when it's loading up) and then just closes on its own. The last time it happened I reinstalled the Pupil SDK and it worked well for a while. It's crashing again now.

user-c828f5 03 November, 2017, 14:01:46

Where can I run the application from directly, using the terminal?

papr 03 November, 2017, 14:16:26

I think you can run it with /opt/pupil_capture @user-c828f5

mpk 03 November, 2017, 14:46:51

also if you install the bundle on linux

mpk 03 November, 2017, 14:47:00

just type pupil_capture in terminal and it will start

user-2798d6 03 November, 2017, 19:42:13

Hello! When working with offline calibration, is there anything I can do about the pupil detection in this screenshot? When I watch during the pupil detection process, mostly everything looks fine, but then when I move to actually calibrate, what you see in the screenshot happens in a few places. Is there some way to adjust what it thinks is the pupil?

Chat image

mpk 04 November, 2017, 13:11:54

@LKH#3236 can you run detection in 2d mode? I think it will work better for your case.

papr 04 November, 2017, 13:16:20

@user-b79cc8 hey, thank you for the recording. I will have a look at it on Monday 🙂

user-83773f 06 November, 2017, 17:45:48

Hey guys, a bit of a beginner question here. What is the difference between pupil and gaze position? I just recorded some data to test Pupil Labs out: while I tried to keep my head still, I made several eye saccades and plotted the results in MATLAB (see image). It seems that for my x-coordinates my pupil position is just an attenuated copy of my gaze position, whereas for the y-coordinates it is the opposite. Is this correct? Or did I make an error in retrieving the data out of the csv files?

Chat image

user-2798d6 06 November, 2017, 20:30:25

Hello! Another calibration question - I ran detection in 2D mode (see mpk's comment above) and the calibration was still off. My question is how exact can I expect calibration to be? I know the glasses can get within 1 degree of accuracy, but in terms of calibration, can I expect for the marker to be EXACTLY where the person is looking so I only have to worry about that 1 degree, or is it typical for the gaze vs. the marker to be a little off? We are having people look at a musical score, so the notes are kind of tiny and any degree that the calibration is "off" means that we could be off by a whole measure of music.

user-97e499 07 November, 2017, 11:38:49

Hello, I got a calibration question too: What assumptions are made for screen marker and manual marker calibration? (Like: assume screen markers are all in one plane.) I tried digging through the source code but I didn't quite figure out how it works / at which places they differ. Sorry if this has been asked before.

papr 07 November, 2017, 11:47:59

@gendelo3 They only differ in the visualization. One is shown on the screen, the other one needs to be printed.

user-97e499 07 November, 2017, 12:14:34

@papr That I know; I was more interested in how the processing works.

papr 07 November, 2017, 12:40:42

@user-97e499 Ah, I understand. @user-894365 can tell you more about it. She has been working on improving the calibration marker detection.

user-894365 07 November, 2017, 13:16:58

Hi @user-97e499 In the screen marker calibration and the manual marker calibration, the algorithm does edge extraction and then detects clusters of ellipses. The center of the first cluster found is reported as the position of the marker. In the current algorithm, it is quite likely that it reports a position where there is no marker. So we are working on improving the calibration marker detection, to make sure that the algorithm reports the exact position of the marker.

papr 07 November, 2017, 13:17:15

@user-83773f Pupil positions are the positions of the detected pupil within the eye video frame. A gaze position is a pupil position that has been mapped to the world frame. The mapping function is estimated after running the calibration. The mapper uses the identity function to map pupil positions to gaze positions if no calibration was performed previously. I suspect that you did not calibrate and therefore get the strong correlation between pupil and gaze positions.

papr 07 November, 2017, 14:37:43

@user-b79cc8 I just finished watching your demo video. Which calibration method did you use for calibration in the demo? Circle Markers or Natural Features? The first thing I noticed is that it is not necessary to move the marker so much. It is important that the field of view of the world camera is covered. The easiest way to do that is to keep your head fixed and move the calibration marker within the world camera's view. E.g. the marker left the camera's view between 00:38 and 00:45. Therefore I suggest having a person monitor the Capture window and check that the marker is visible. Additionally, I would suggest showing the calibration marker only while you are looking at it, i.e. turn the paper with the marker upside-down during initial setup and only turn it towards the camera during calibration. This advice is unnecessary if you set the calibration range correctly in the Offline Calibration plugin.

user-2798d6 07 November, 2017, 16:31:21

@user-894365 - I'm not sure if I understand your comment about the calibration. I'm doing offline calibration with natural markers - could you tell me a little more about how your comment relates to that? Thank you!

user-2798d6 07 November, 2017, 16:31:47

I'm still getting my terminology down - still learning 😃

papr 07 November, 2017, 16:33:12

@user-2798d6 I think @user-894365 mentioned you unintentionally. 🙂

user-2798d6 07 November, 2017, 16:33:39

Oh! Ok thanks!

papr 07 November, 2017, 16:34:10

Btw @user-2798d6 The further away your target is, the more impact the 1-degree error has. At which distance is your subject looking at the music notes?

user-2798d6 07 November, 2017, 16:34:59

About as far away as from a computer - so less than a meter.
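
To put papr's point in numbers, a quick back-of-the-envelope check (the 90 cm viewing distance is an assumption based on "less than a meter"):

```python
import math

d_cm = 90      # assumed viewing distance to the musical score
err_deg = 1.0  # nominal gaze accuracy under ideal conditions
offset_cm = d_cm * math.tan(math.radians(err_deg))
print(f"~{offset_cm:.1f} cm")  # ~1.6 cm of gaze offset on the score
```

So even a perfect calibration leaves roughly 1.5 cm of uncertainty on the score itself.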

papr 07 November, 2017, 16:37:30

@user-2798d6 I think the current offline calibration using natural features suffers a lot from a bad user interface. The new version will allow you to set more markers in consecutive frames and therefore yield a more precise result.

user-2798d6 07 November, 2017, 16:40:58

Ok, great! Thank you!

user-c0934a 08 November, 2017, 12:24:53

Hi guys. I'm using unity with the pupil labs software and the unity pupil labs plugin. I would like to know where the user looks in the unity world but I don't manage to get this data

wrp 08 November, 2017, 12:32:23

@user-c0934a please could you raise this question in the 🥽 core-xr channel?

user-34aab4 08 November, 2017, 15:35:20

Hi everyone, I am quite new to Pupil and would like to know if there is somebody who could answer some questions regarding Pupil Remote?

papr 08 November, 2017, 15:36:00

Hey @user-34aab4 Welcome to the channel. Please ask your questions here. 🙂

user-34aab4 08 November, 2017, 15:40:57

Thanks 😃 I'm trying to work with Pupil Remote for the first time and I am not used to network protocols and stuff like that. I have Pupil Capture running on an Ubuntu system and would like to receive data on a Windows laptop. I installed Python and downloaded an example Python script from the pupil helpers repository (filter_gaze_on_surface). Finally, I installed the required dependencies using pip. Actually, I don't know if that's all that needs to be done, but when running the Python script (and having Capture running on the other computer), the script stops here: sub_port = req.recv_string(). It seems like there is no response. My question is if I forgot to do some basic stuff or if special versions of Python or zeromq need to be installed?

papr 08 November, 2017, 15:44:51

The filter_gaze_on_surface script subscribes to a specific data topic, in this case to gaze on a specific surface. To receive this data, the surface needs to exist. Surfaces are created using the surface tracker. Did you load this plugin and set up surfaces? This is kind of an advanced feature already. In your case, I would suggest starting with this example: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/pupil_remote_control.py
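
For reference, the core of that example is just a ZeroMQ REQ socket talking to Pupil Remote. A minimal sketch (the IP address is a placeholder for the machine running Capture; 50020 is Pupil Remote's default port):

```python
import zmq

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://192.168.1.100:50020")  # placeholder IP of the Capture machine

req.send_string("t")           # ask Capture for its current timestamp
print(req.recv_string())

req.send_string("SUB_PORT")    # ask for the port of the data publisher
sub_port = req.recv_string()
print("subscribe on port", sub_port)
```

If recv_string() blocks forever, the request never reached Capture, which points to a network or address problem rather than a Python one.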

user-34aab4 08 November, 2017, 15:48:35

Thanks for your answer! Yes, I already created a surface. However, I also tried the remote control script. To me it seems as if there is a general problem with the connection. The remote control script also stops when it tries to receive something: socket.send_string('t') followed by print(socket.recv_string())

papr 08 November, 2017, 15:49:44

Ah, nice! Did you change the ip address in the script to the ip address of the computer on which Pupil Capture is running?

user-34aab4 08 November, 2017, 15:50:52

Yes, I did. Is it important which python version I use?

papr 08 November, 2017, 15:51:32

We recommend using Python >= 3.5, but this is not critical for running the script.

papr 08 November, 2017, 15:52:09

Did you check if there is a network connection to the ubuntu computer, by e.g. pinging it?

user-34aab4 08 November, 2017, 15:54:39

I have a short question regarding your previous answer ("Ah, nice! Did you change the ip address in the script to the ip address of the computer on which Pupil Capture is running?"). I have to set the same IP address in the script that is shown in the Pupil Capture Remote plugin, right?

papr 08 November, 2017, 15:55:55

As far as I know, you do not need to change anything in Pupil Capture. You need to adapt the IP address in the example Python script that runs on Windows. Specifically this line: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/pupil_remote_control.py#L33

user-34aab4 08 November, 2017, 15:58:09

Adapting the IP address in the Python script is what I meant. However, I am not sure what to insert there. The IP address that is shown in the Capture Remote plugin?

user-34aab4 08 November, 2017, 16:00:31

And yes, I tried to ping and it works fine

papr 08 November, 2017, 16:00:49

Which address did you use?

papr 08 November, 2017, 16:02:11

Can you run `ip addr show` in a terminal on your Ubuntu computer? Is the IP address it shows the same as the one you pinged from the Windows machine?

user-34aab4 08 November, 2017, 16:04:30

If I get it right it's showing two ip addresses and I used one of them in the script

papr 08 November, 2017, 16:05:46

Ok, and it is probably not 127.0.0.1, correct? Then I assume that the network connection is there and the issue is on Pupil Capture side. Which Pupil Capture version do you use?

user-34aab4 08 November, 2017, 16:06:51

I am sorry that I do not know enough about that topic to follow in detail. Right now I used 127.0.0.1 but also tried other ones.

user-34aab4 08 November, 2017, 16:07:53

Maybe it's the best idea to contact somebody personally who has knowledge about stuff like that. I guess it's hard to do this just by texting. However, it's good to know that it should not be a problem with the version.

user-34aab4 08 November, 2017, 16:08:13

I am using capture version 0.0.14.7

user-34aab4 08 November, 2017, 16:08:38

sorry 0.9.14.7

papr 08 November, 2017, 16:09:40

I would suggest switching to personal messages since this is a setup issue and is not relevant for other users.

user-943415 09 November, 2017, 14:38:52

Hi! I received my pupil few days ago and started playing with it! I'm enjoying it! 😀

user-943415 09 November, 2017, 14:39:55

I have a question. I'm trying to produce heatmaps on a surface. I have the surface found on Pupil Capture and I recorded a short session. Then I open it on Pupil Player.

user-943415 09 November, 2017, 14:40:25

I am following what is written at https://docs.pupil-labs.com/#surface-tracking

user-943415 09 November, 2017, 14:41:11

"Surface Heatmaps It is possbile to dispay gaze heatmaps for each surface by selecting Show Heatmaps mode in the Surface Tracker menu. The surface is divided into a two dimensional grid to calculate the heatmap. The grid can be refined by setting the X and Y size field of each surface. "

user-943415 09 November, 2017, 14:41:32

I don't understand what X size and Y size are...

mpk 09 November, 2017, 14:45:25

@user-943415 welcome to Pupil !

mpk 09 November, 2017, 14:46:06

size x and y are the real world size of your surface. This can be in the units of your choice. (think pixels of a screen or cm of a physical surface.)

user-943415 09 November, 2017, 14:46:18

I also think it should say: (1) how the heatmap is going to be shown (clicking play?) (2) where it is going to be shown (in the main window while playing? in the debug window while playing?)

user-943415 09 November, 2017, 14:46:53

also, clicking on "Re-calculate gaze distributions" should give some visual feedback, I think. Now nothing happens

user-943415 09 November, 2017, 14:47:27

wow, thanks mpk!

user-943415 09 November, 2017, 14:48:30

so if I leave X size and Y size set to 1 (default value), what am I saying to the software? And if I set it to 1000? How does this change what the software does?

mpk 09 November, 2017, 14:48:58

if you set it to 1 you will not get heatmaps (If I remember correctly.)

mpk 09 November, 2017, 14:49:24

I recommend setting it to some sensible value like 200x300

mpk 09 November, 2017, 14:49:54

If you click recalculate and set the display to heatmap, you should see pretty overlays on the detected surfaces.

user-943415 09 November, 2017, 14:51:15

it worked! thanks!

user-943415 09 November, 2017, 14:51:45

I suggest you specify this important detail a bit better in the interface and in the documentation 😉

user-943415 09 November, 2017, 14:53:22

Would it be useful for you if I fork the project on github, make some small changes (just text, both in the documentation and the interface) and then open a pull request? Or do you prefer to manage this (important) part of your work in other ways?

mpk 09 November, 2017, 14:54:00

@user-943415 a fork and PR would be perfect!

mpk 09 November, 2017, 14:54:03

thank you!

user-943415 09 November, 2017, 14:57:23

perfect! going to do it!

user-943415 09 November, 2017, 14:57:31

since we are here, 2 more questions 😉

user-943415 09 November, 2017, 14:58:43

1) you say "size x and y are the real world size of your surface. This can be in the units of your choice. (think pixels of a screen or cm of a physical surface.)". but pixels don't have a physical dimension, right? I mean, pixels and centimeters are not interchangeable units, right?

mpk 09 November, 2017, 14:59:17

a pixel is a unit.

mpk 09 November, 2017, 14:59:23

if you measure in it.

mpk 09 November, 2017, 15:00:01

in the end what we mean by scale is that we don't know the real world scale or size. So you have to supply it.

mpk 09 November, 2017, 15:00:15

the units that you count in are arbitrary.

user-943415 09 November, 2017, 15:01:17

infact the second question is

user-943415 09 November, 2017, 15:02:04

2) why does the software need to know the real-world dimensions of the surface to compute the heatmap? i.e. what if I say the X size is 100000 cm = 1 km? Or is it only the ratio between Xsize and Ysize that counts?

mpk 09 November, 2017, 15:02:40

the relation is important.

mpk 09 November, 2017, 15:02:46

correct ratio.

mpk 09 November, 2017, 15:03:04

also we export the data in this scale.

mpk 09 November, 2017, 15:03:34

I think the size also affects the heatmap binning.
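
A hedged sketch of what that means, assuming (as mpk suggests) that the surface size feeds the histogram bin count; the gaze data here is made up:

```python
import numpy as np

# Hypothetical gaze points in normalized surface coordinates (0..1)
gaze = np.random.rand(500, 2)

# A 1x2 grid can only ever produce a near-uniform two-cell "heatmap",
# while 300x200 gives a usable resolution.
for xsize, ysize in [(1, 2), (300, 200)]:
    hist, _, _ = np.histogram2d(
        gaze[:, 0], gaze[:, 1],
        bins=(int(xsize), int(ysize)),
        range=[[0, 1], [0, 1]],
    )
    print((xsize, ysize), "->", hist.shape, "bins")
```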

user-943415 09 November, 2017, 15:05:35

yep, I tried with (xsize=1.0,ysize=2.0) and I get something almost uniform in orange

user-943415 09 November, 2017, 15:05:51

then I tried (xsize=3.0,ysize=2.0) and I get something that seems to make sense

user-943415 09 November, 2017, 15:06:08

the same results I get with (xsize=300.0,ysize=200.0)

user-943415 09 November, 2017, 15:07:45

so I still don't understand what this info is used for.

user-943415 09 November, 2017, 15:08:28

if the real world size is 350.0x200.0 and I specify 300.0x200.0, do I get unrealistic heatmaps? Or what do I get?

mpk 09 November, 2017, 15:08:52

@user-943415 this used to be the case in the past. But we may have changed that recently.

mpk 09 November, 2017, 15:09:09

then scale is only important for the export and the ratio

user-943415 09 November, 2017, 15:10:10

I'm asking it because we plan conducting some simple usability studies in the beginning and so I need to know what we are going to show to the stakeholders 😉

user-943415 09 November, 2017, 15:10:34

ehm, I think I don't understand 100% your sentence "then scale is only important for the export and the ratio"

user-943415 09 November, 2017, 15:10:45

sorry to bother and thanks!

user-943415 09 November, 2017, 15:11:57

if you prefer to "speak in code" you can point me to the part of code in which the related computations are made, I can read python 😉

user-943415 09 November, 2017, 16:38:11

I have seen that exporting with (xsize=3.0, ysize=2.0) produces a very pixelated heatmap_surfacename.png, while with (xsize=300.0, ysize=200.0) it is much more reasonable, so it is like you were saying before

user-943415 09 November, 2017, 16:55:39

also, replying to myself, this link has some answers https://groups.google.com/forum/#!topic/pupil-discuss/oWS7dCYac8U

mpk 09 November, 2017, 16:59:15

@here we just pushed v1.1! https://github.com/pupil-labs/pupil/releases/tag/v1.1

user-e604a3 09 November, 2017, 17:19:38

Happy Version 1.1! I love the new interface!

user-02665a 11 November, 2017, 09:57:50

hey i just wanted to stop by and say gz to the new release! It's probably time for me to undust the tracker once more and check it out! 😉

user-e7102b 11 November, 2017, 17:45:27

Hi, we recently received our Pupil Eye-tracker. I've successfully installed all the pupil software and have it up and running on my Macbook Pro 2017 running OS X 10.6.2 (very impressed with everything)! We use MATLAB 2017b and the psychtoolbox for stimulus presentation, so the next step is to figure out how to gain online access to the data stream via MATLAB. I found some third party code on github that should enable this (https://github.com/matiarj/pupil-helpers/tree/matlabAddV2/pupil_remote/Matlab_Python) but I'm having some issues trying to get this to work (I posted on the google group but didn't receive a response: https://groups.google.com/forum/#!topic/pupil-discuss/Ze6GLyN7LBw). Is this the recommended way to achieve MATLAB integration? I'm also having some trouble with the libuvc installation procedure listed on the mac developers section of the website (https://groups.google.com/forum/#!topic/pupil-discuss/0LTy5DJM7qo), but I don't know if this is related to the MATLAB issues. If you have any suggestions for how to resolve these issues I'd really love to hear them! Thanks, Tom Bullock (Postdoc, UCSB Attention Lab)

mpk 11 November, 2017, 19:17:53

@user-e7102b thanks for the post. We will have a look on monday!

user-e7102b 11 November, 2017, 19:18:26

Great - thanks 😃

user-29e10a 13 November, 2017, 08:47:43

Congrats on your v1.1 release – looks beautiful! Is there any news about the Oculus CV1 integration? We're eager to get our hands on it, so we would like to order ASAP 😃

user-e938ee 13 November, 2017, 11:27:35

Hello, I've recently got my pupil. I just have a small problem of running pupil_player on my Ubuntu workstation. When I try to run it (after installing it from the .deb on the site), I get this:

ImportError: ('Unable to load OpenGL library', "Failed to load dynlib/dll 'libGL.so.1'. Most probably this dynlib/dll was not found when the application was frozen.", 'libGL.so.1', 'libGL.so.1')

I installed both 64bit and 32bit versions of this library, tried LD_PRELOAD, but can't seem to figure it out.

user-9b7f2d 13 November, 2017, 12:13:20

Do you know if Pupil works with (1) Ogama or (2) PyGaze or (3) OpenSesame? I've searched on the web and the answer seems to be "no", but I thought I would ask here anyway. Thanks!

user-c47be0 13 November, 2017, 13:21:35

Hi @papr I'm working with Sara on this. Taking into account the online calibration (it was my first time doing it), I believe most of what you said is usually followed. In terms of offline calibration, I believe we use circle markers, and whatever we do, the calibration seems to be off consistently, usually in a specific direction. What else about the calibration range should I be doing? (It will not let me alter it in any way.) Sorry for the probably quite basic questions, this is all very new to me!

papr 13 November, 2017, 13:21:49

@user-e938ee The bundle should work out of the box. It comes with all required libs. Please try reinstalling the bundle, and revert your custom lib-loading changes if the reinstall does not work.

papr 13 November, 2017, 13:22:49

@user-9b7f2d No, we do not have native integrations with any of these. But we have a very simple network api that can easily be integrated into existing Python projects.

papr 13 November, 2017, 13:27:54

@user-c47be0 Did you alter the manual gaze correction settings for your gaze section? This feature offsets the gaze by a given amount in a specific direction. Do not worry too much about the calibration range for now. Make sure the marker is clearly visible within the recorded scene video and that the pupil detection works well.

papr 13 November, 2017, 13:30:20

@user-c47be0 Just to clarify: What do you mean by calibration range? The distance of the marker to the subject or the selected frame range in the recording which is used to calibrate?

wrp 13 November, 2017, 13:32:36

@user-29e10a thanks for the positive feedback! Regarding CV1 - we have been pushing back the release on this because we are waiting on new cameras. We hope to be able to have this available within the next 2 weeks... very very close to release!

wrp 13 November, 2017, 13:35:18

@user-9b7f2d adding on to what @papr notes on the network API - please see docs here: https://docs.pupil-labs.com/master/#interprocess-and-network-communication

user-c47be0 13 November, 2017, 13:38:32

@papr sorry, I think I was referring to the distance of the marker rather than the frame range. I hadn't altered the gaze correction settings, so should I try this?

papr 13 November, 2017, 13:47:46

@user-c47be0 Ok, good. Do not worry about the marker distance. Usually it is not required to alter the manual gaze correction. Did you upgrade to v1.1 already? Do you see the thicker part of the green lines? This is the calibration range and should wrap around the found markers (short white vertical lines).

Chat image

user-e938ee 13 November, 2017, 14:16:54

@papr Yeah, I ran it again from source and now it works, although with some stability issues (one of the eye cameras crashes often)

user-e938ee 13 November, 2017, 14:17:52

Is there any way to extract raw information from the live capture? Or at the very least from the recording? The data I'm after is pupil diameter and their relative movement

papr 13 November, 2017, 14:27:42

@user-e938ee See the post above about the network api. You can subscribe to the pupil data, which includes diameter and relative movement (if you use the 3d pupil detection).
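
A minimal sketch of that subscription (Pupil Remote defaults assumed; 127.0.0.1 is a placeholder, use the Capture machine's IP):

```python
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote (default port 50020) for the publisher port
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

# Subscribe to all pupil datums
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "pupil.")

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.loads(payload, raw=False)
    # 'diameter' is in image pixels; 'diameter_3d' (mm) needs the 3d detector
    print(datum.get("diameter"), datum.get("diameter_3d"))
```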

user-943415 13 November, 2017, 14:31:24

Thanks @papr and @wrp !

wrp 13 November, 2017, 14:48:07

You're welcome @user-943415

user-943415 13 November, 2017, 15:26:45

I found in this issue on github https://github.com/pupil-labs/pupil/issues/503 that cpicanco was able to import pupil data into Ogama. I sent him an email to see if I can get additional details or code about this.

papr 13 November, 2017, 15:27:18

@user-41f1bf 👆

user-e938ee 13 November, 2017, 20:27:07

@papr Thanks! I'll look into it.

user-97e499 14 November, 2017, 10:41:24

After installing v1.1.2, changing the source camera for eye0 doesn't work properly anymore (the Detect eye 0 circle stays activated, but the eye0 process crashes). Does anyone have a similar problem when switching through cameras? Pupil Capture selects my notebook's integrated camera for eye0 per default, which forces me to change the configuration.

papr 14 November, 2017, 10:43:22

@user-97e499 Could you specify what you mean by the Detect eye 0 circle stays activated? Could you post a screenshot?

user-97e499 14 November, 2017, 10:46:45

I'm talking about the "checkbox" thing where I should be able to toggle detecting eye0 and eye1. After trying to change the source for eye0 from the integrated webcam to "id0" or "id1", the detection window for eye0 crashes.

Chat image

user-97e499 14 November, 2017, 10:47:35

And clicking the green circle doesn't do anything anymore after that either.

papr 14 November, 2017, 10:49:13

@user-97e499 Would you mind opening a github issue for that? Please attach a note about your operating system and whether you are running from source or from bundle.

user-97e499 14 November, 2017, 10:49:37

I can do that, np

wrp 14 November, 2017, 15:29:25

@here We are very excited to announce new Pupil hardware! 200 FPS eye cameras - available starting today via https://pupil-labs.com/store. Check out our blog post on the new hardware for more details: https://pupil-labs.com/blog/2017-11/200-frames-per-second/

user-26898b 14 November, 2017, 16:18:39

I've been looking for a wearable device for our research project, but I need something that will record IR video of the iris, up close, plus capture pupillary measurements. Do y'all offer both features in one device?

papr 14 November, 2017, 16:20:03

@user-26898b What distance do you have in mind when you say "up close"?

user-26898b 14 November, 2017, 16:20:56

close enough to see discrete changes in individual iris musculature

papr 14 November, 2017, 16:22:32

@user-26898b Our eye cameras are adjustable but have an average distance of 2 cm.

papr 14 November, 2017, 16:23:52

The visibility of the iris musculature changes might depend on the camera resolution used. You might need to make a trade-off between resolution and captured frame rate.

mpk 14 November, 2017, 16:26:30

@user-26898b I'm sure our hardware can be set up to do what you need.

mpk 14 November, 2017, 16:26:52

You will have full access to the IR eye video feeds and pupillometry data

user-26898b 14 November, 2017, 16:27:40

I was looking at using the VisualEyes 525 system from Interacoustics, but it does so many other things that I don't need and the price is way up there. But it has the video I need.

user-26898b 14 November, 2017, 16:28:08

How could I look at a sample of what the video feed might look like of the iris?

mpk 14 November, 2017, 16:29:25

@user-26898b if you want we can make a recording with close up eye video to show you.

mpk 14 November, 2017, 16:29:46

Do you have a photo that shows what kind of image you are looking for?

user-26898b 14 November, 2017, 16:30:05

yes, one second I'll find it

user-26898b 14 November, 2017, 16:31:27

this is from the Interacoustics wearable hardware

Chat image

user-26898b 14 November, 2017, 16:32:14

it's called video frenzel

papr 14 November, 2017, 16:36:08

@user-26898b This is a screenshot of our Capture software's eye windows. You can record the eye video feed if you want to. The red and green circles are visualizing detection data. They will not be included in the recorded video.

Chat image

papr 14 November, 2017, 16:36:47

The focus of the cameras can be manually adjusted.

user-26898b 14 November, 2017, 16:37:28

could you send me one where they are looking forward, so I can see if the res on the iris is adequate, please

wrp 14 November, 2017, 16:39:10

@user-26898b we can send an image where the eye is looking into the camera. However the camera is not frontally located by design on the Pupil headset, therefore the eye images will likely not be central (like your example) unless looking directly into the camera.

user-26898b 14 November, 2017, 16:39:46

can it be altered so that the camera is frontally located?

wrp 14 November, 2017, 16:40:08

@user-26898b we are building a custom eye camera mount for another researcher

user-26898b 14 November, 2017, 16:40:22

our participants will be looking forward the whole time, so that we can look at discrete iris changes and get pupil measurements

wrp 14 November, 2017, 16:40:53

@user-26898b we can offer a custom eye camera arm extension so you can get frontal images of the eye

wrp 14 November, 2017, 16:41:07

Just send us an email to info@pupil-labs.com and we can follow up from there

user-26898b 14 November, 2017, 16:41:12

but would they still be close up views?

wrp 14 November, 2017, 16:41:17

Yes

user-26898b 14 November, 2017, 16:41:38

ok, I'll email

wrp 14 November, 2017, 16:41:51

Thanks

user-45d36e 15 November, 2017, 14:22:14

Hello--message from Prerna who is part of our team: "So we have a new release of pupil capture but as soon as we click C for calibration the pupil capture application hangs." Does anyone know how to resolve this issue with the newest version of pupil capture? Thank you!

papr 15 November, 2017, 14:27:59

@user-45d36e We are not aware of such an issue. Please create a Github issue including the following information:
- Operating system and version
- Running from bundle or from source?
- Which calibration method has been used
- The capture.log file, which you can find in the capture_settings folder

user-2798d6 15 November, 2017, 16:20:07

Hello! I've downloaded the new version of Player and Capture. When I try to do offline calibration with natural features, I click for a natural feature mark and the dot shows up elsewhere on the screen rather than where I just clicked. It's appearing diagonally up and to the left from where I click. Is anyone else having this issue?

papr 15 November, 2017, 16:21:20

@user-2798d6 How far away is it? What display and OS do you use? This sounds like the hdpi-factor of your screen is not calculated correctly.

user-2798d6 15 November, 2017, 16:21:46

On my 13inch screen, it's about 2 inches away diagonally

user-2798d6 15 November, 2017, 16:22:04

I'm in OS Sierra

papr 15 November, 2017, 16:25:06

What type of MacBook are you using? You can look it up by clicking on the Apple logo in the top left and opening the About This Mac window.

user-2798d6 15 November, 2017, 16:25:30

It's a MacBook pro with retina Early 2015

papr 15 November, 2017, 16:25:56

Ok, I am using the same version. I will try to reproduce your issue.

user-2798d6 15 November, 2017, 16:27:44

thanks!

papr 15 November, 2017, 16:32:44

@user-2798d6 I was able to reproduce the issue both running from source and running from bundle. Please create a Github issue.

user-2798d6 15 November, 2017, 16:40:34

How do Github issues work? I created one - but what happens from here?

wrp 15 November, 2017, 16:48:44

Thanks @user-2798d6 now we can keep track of the issue - https://github.com/pupil-labs/pupil/issues/934 - the issue will be closed when it has been resolved/fixed.

user-380f66 15 November, 2017, 17:54:50

Hello again, another question from my colleague Prerna: Is there a mechanism to synchronize 2 eye tracking recordings after the fact in Pupil Player? She wants to reduce technical issues by recording each person on a separate laptop and then sync the videos afterward. Can we do this by matching up timestamps, etc.?

user-2798d6 15 November, 2017, 19:17:29

When using 3D pupil detection, how accurate does the green circle around the eye need to be? A lot of times the circles around both eyes are two different sizes or they really don't match the eyeball size.

papr 15 November, 2017, 19:24:37

The green circle should be roughly (±3-5 mm) as big as the eyeball. The important part is that it is stable (opaque green, does not jump around). Move your eyes around if the green circle does not match your eyeball at all. This will generate better samples for the model than not moving the eyes.

user-2798d6 15 November, 2017, 19:57:15

Thanks!

user-988d86 16 November, 2017, 15:57:04

Is there any record of the different plugins people have made to go with the main software?

wrp 16 November, 2017, 16:05:42

@user-988d86 there is currently not a directory/archive/list of user contributed plugins

wrp 16 November, 2017, 16:05:48

This would be nice to have!

user-b9539a 16 November, 2017, 21:25:28

Hi! Has anyone used the MATLAB code? I see that we get one reading per timestamp, but is there any way we could get x and y coordinates and diameter for each eye rather than one?

user-ecbbea 17 November, 2017, 02:13:39

Just out of curiosity, when did the 200 hz binocular system launch? We just purchased the 120hz binocular system, and are a little sad that we weren't notified the 200hz system was launching so soon.

user-e7102b 17 November, 2017, 02:16:50

I'm also curious about this. We just purchased the 120Hz system but would have been willing to hold off for a few weeks for the 200 Hz system.

mpk 17 November, 2017, 06:50:35

@user-ecbbea @tombullock#3146 we released the 200hz system 2 days ago. We are getting our first production batch this week.

mpk 17 November, 2017, 06:51:12

If you want to upgrade just write us a email!

user-84047a 17 November, 2017, 12:07:38

Thanks @papr !

user-84047a 17 November, 2017, 12:08:14

Does anyone know if there is a way in pupil capture to view the error of each fixation estimate?

user-c828f5 17 November, 2017, 15:30:10

Hello everyone, I downloaded the Pupil Labs app for Android (OnePlus 3) and the app works just fine. I tried data recording on it. The app doesn't give us an option to change resolutions, so I found that the eye video was capped at 320X280 (?) and the scene video at 1280X720. The videos were written out as MJPEG files, which I couldn't read into VLC, MATLAB, or Python. I wanted to know the frame rates that I could get from mobile recording.

wrp 17 November, 2017, 16:14:35

@user-c828f5 You should be able to change the resolutions of the world camera Pupil Cam1 ID2.

wrp 17 November, 2017, 16:14:56

You will also need to open the videos in Pupil Player to view the videos

mpk 17 November, 2017, 17:03:07

@user-c828f5 settings are locked during recording. Once you stop that you should be able to change them.

mpk 17 November, 2017, 17:03:20

MJPEG video can be transcoded to h264 using ffmpeg
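
For reference, such a transcode could be scripted like this (file names are hypothetical; assumes an ffmpeg binary on the PATH):

```python
import subprocess

# Re-encode a Pupil Mobile MJPEG file as an H.264 MP4
subprocess.run(
    ["ffmpeg",
     "-i", "eye0.mjpeg",   # input MJPEG file (hypothetical name)
     "-c:v", "libx264",    # encode video as H.264
     "eye0.mp4"],
    check=True,
)
```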

user-07e14c 19 November, 2017, 19:09:05

Hi there! Glad to meet this community and to use the pupil hardware and open-source software!

user-07e14c 19 November, 2017, 19:22:21

I just received my pupil headset with the Intel RealSense R200 RGBD world camera but am having issues installing it on macOS (latest version) - I followed the instructions on https://docs.pupil-labs.com/#librealsense but cannot get it to work with Pupil Capture - the world camera is not recognized... any update on this?

mpk 20 November, 2017, 07:03:44

@Laurent#2505 does the realsense backend show up in the backend selection menu of Pupil Capture?

user-33d9bc 21 November, 2017, 21:17:35

hey, I'm new to this chat thread

user-33d9bc 21 November, 2017, 21:18:11

Has anyone any experience with the gaze interaction of the pupil glasses

user-33d9bc 21 November, 2017, 21:18:23

with a computer program/game

user-33d9bc 21 November, 2017, 21:18:36

and how accurate is it

wrp 22 November, 2017, 02:03:28

Hi @user-33d9bc welcome to the Pupil chat 👋

wrp 22 November, 2017, 02:05:03

Under ideal conditions gaze estimation accuracy is 0.6 deg. In the field (not ideal conditions) you may see less accuracy than this depending on the individual, environment, setup of hardware (e.g. focus of cameras), and quality of calibration

wrp 22 November, 2017, 02:06:07

An example (contributed by a researcher and gamer) can be seen in this video: https://youtu.be/X_BalnBOcpk

wrp 22 November, 2017, 02:06:52

@user-33d9bc could you give an example of what you are hoping to achieve (your application or research objective)?

user-d7b89d 22 November, 2017, 10:25:50

Hello everyone, I have a question regarding calibration accuracy with varying distances to AOIs. My colleagues do driving research. In this use case the distance to the AOI varies from approx. 50 cm to around 100 m. Do you have some recommendations to get accurate tracking for all distances?

Furthermore, these studies are carried out at night. We adjust contrast and brightness to get better world camera videos; do you have further recommendations to get a good world video? (We would have infrared beams supplementing the headlights.)

Thank you

user-23d980 22 November, 2017, 14:32:36

Hi all, my plug-in works great. I want to use a third-party Python library (pyserial) and wanted to know if it's possible to load it at run-time. I'd like to avoid having to build Pupil Capture from source, so the user can just drag and drop the plug-in, but we need to interface with a device over serial and I was wondering if there is any way to tell Pupil Capture where to load libraries from?

papr 22 November, 2017, 14:35:39

@user-23d980 Unfortunately, I did not find a way to do so yet. This is due to how pyinstaller bundles the applications. Runtime plugins can only use modules that are included in the bundle.

user-23d980 22 November, 2017, 14:36:23

ok no worries

user-23d980 22 November, 2017, 14:36:36

I found a small lib that might just work by copying & pasting

user-23d980 22 November, 2017, 14:36:40

so fingers crossed won't be an issue

user-23d980 22 November, 2017, 14:36:45

thanks for quick reply @papr

papr 22 November, 2017, 14:37:12

Good luck! Let us know if it works.

wrp 22 November, 2017, 14:46:42

Copying and pasting a lib relative to your plugin (e.g. in a dir next to your plugin) should work

papr 22 November, 2017, 14:48:38

@wrp @user-23d980 AFAIK, you will have to do so with all its dependencies as well. This does not work, though, for system modules that are not included in the bundle.

wrp 22 November, 2017, 14:49:43

Right
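
A minimal sketch of that vendoring approach, assuming the user plugin directory (pupil_capture_settings/plugins) and a vendor/ folder you create next to the plugin file; this is not an official loading mechanism:

```python
# my_plugin.py -- placed in the pupil_capture_settings/plugins folder.
# Pure-Python dependencies (e.g. pyserial) are copied into vendor/ next to it.
import os
import sys

VENDOR_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "vendor")
if VENDOR_DIR not in sys.path:
    sys.path.insert(0, VENDOR_DIR)

import serial  # resolved from vendor/ at runtime
```

As papr notes above, this only works for pure-Python packages whose own dependencies are either vendored too or already shipped in the bundle.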

user-7ca285 23 November, 2017, 00:33:56

Hi @wrp @papr sorry to bother, I've been using my trusty Pupil Labs glasses for quite a long while now. With the latest v1.1, I'm facing the same problem as what Sara mentioned "Sara - 11/15/2017 Hello--message from Prerna who is part of our team: "So we have a new release of pupil capture but as soon as we click C for calibration the pupil capture application hangs." Does anyone know how to resolve this issue with the newest version of pupil capture? Thank you!"

The calibration screen freezes and stutters. There are no calibration targets. If I Esc the calibration, the main screen is glitched up.

As I'm currently not using the same MBP as the one I used before, I haven't got a clue whether it's a problem with my current MBP setup or an incompatibility issue with v1.1. Hope to get this sorted so that I can continue using my Pupil glasses! Looking forward to the 200hz cameras too!

wrp 23 November, 2017, 02:35:59

@user-7ca285 what version of macOS, and computer specs for the MBP?

user-7ca285 23 November, 2017, 04:36:16

@wrp Hi thank you so much for the reply!!!!!

Chat image

user-7ca285 23 November, 2017, 04:38:18

And these are the screenshots of the problem

Chat image

user-7ca285 23 November, 2017, 04:38:58

Blank stuttering screen on calibration; and a glitched screen upon exit from calibration

user-7ca285 23 November, 2017, 04:39:06

Chat image

wrp 23 November, 2017, 04:40:40

@user-7ca285 this looks like an OpenGL issue, maybe related to Retina Macs. I will not be able to recreate this with my current machines unfortunately.

wrp 23 November, 2017, 04:40:47

@papr can you take a look at this when you get online?

wrp 23 November, 2017, 04:41:22

@user-7ca285 can you try a non full screen calibration

wrp 23 November, 2017, 04:41:39

uncheck the full screen option in the calibration plugin

user-7ca285 23 November, 2017, 04:42:09

Is this gonna help:

    2017-11-23 12:37:01,204 - MainProcess - [INFO] os_utils: Disabled idle sleep.
    2017-11-23 12:37:04,703 - world - [WARNING] camera_models: Loading dummy calibration
    2017-11-23 12:37:04,801 - world - [WARNING] launchables.world: Process started.
    2017-11-23 12:37:05,898 - eye0 - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend
    2017-11-23 12:37:05,912 - eye0 - [INFO] launchables.eye: Session setting are from a different version of this app. I will not use those.
    2017-11-23 12:37:06,095 - eye0 - [INFO] camera_models: No user calibration found for camera Pupil Cam1 ID0 at resolution (640, 480)
    2017-11-23 12:37:06,095 - eye0 - [INFO] camera_models: No pre-recorded calibration available
    2017-11-23 12:37:06,095 - eye0 - [WARNING] camera_models: Loading dummy calibration
    2017-11-23 12:37:06,364 - eye0 - [WARNING] launchables.eye: Process started.
    2017-11-23 12:37:26,807 - world - [ERROR] calibration_routines.finish_calibration: Did not collect enough data during calibration.
    2017-11-23 12:37:33,324 - eye0 - [INFO] launchables.eye: Process shutting down.
    2017-11-23 12:37:34,326 - MainProcess - [INFO] os_utils: Re-enabled idle sleep.

user-7ca285 23 November, 2017, 04:42:33

Really really appreciate the invaluable help!

wrp 23 November, 2017, 04:42:45

@user-7ca285 the log is not helpful, but thank you for posting

user-7ca285 23 November, 2017, 04:42:51

Ok great

user-7ca285 23 November, 2017, 04:42:58

I'll try as you suggested

wrp 23 November, 2017, 04:43:00

Please try a non-full screen calibration as requested above

wrp 23 November, 2017, 04:43:01

thanks

user-7ca285 23 November, 2017, 04:44:40

Chat image

user-7ca285 23 November, 2017, 04:46:13

It's the same results

wrp 23 November, 2017, 04:46:23

ok, thanks

wrp 23 November, 2017, 04:46:33

we will try to reproduce this issue on our end later today

user-7ca285 23 November, 2017, 04:46:43

Thank you soooo much!!

user-7ca285 23 November, 2017, 04:46:49

Really appreciate the help

user-7ca285 23 November, 2017, 04:56:05

@wrp Hi, my bad, yes, disabling "Full Screen" from the Calibration Plugin does remove the problem

wrp 23 November, 2017, 04:56:39

@user-7ca285 ok, that is helpful in narrowing down the issue.

user-7ca285 23 November, 2017, 04:56:51

Chat image

wrp 23 November, 2017, 04:57:17

@papr glfw fullscreen issue - please try to recreate on your dev machine if possible today

user-7ca285 23 November, 2017, 04:57:36

Thank you really awesome people!!!

wrp 23 November, 2017, 04:57:54

welcome - we will be working on it

user-7ca285 23 November, 2017, 04:58:00

Thank you again!

papr 23 November, 2017, 07:14:52

@user-7ca285 @wrp I was not able to reproduce the issue on my MacBook Pro (Retina, 13-inch, Early 2015), neither running from source nor from bundle. @user-7ca285 Are you running from bundle or from source? Re-run your dependency installation in case you are running from source. Nonetheless, I would highly recommend upgrading your OS to a newer version. macOS High Sierra runs very smoothly for me and the overall software support (specifically on homebrew) is much better.

user-29e10a 23 November, 2017, 08:50:54

Hi, does the Logitech C920 HD Pro work as a world cam? The C930 works nicely, but the 920 is on sale on Amazon just today...

user-23d980 23 November, 2017, 10:03:04

@papr fortunately I was able to just copy in the library and it worked

user-23d980 23 November, 2017, 10:03:20

Is there a way to work out the orientation of a square marker?

user-23d980 23 November, 2017, 10:03:24

relative to the world view?

user-7ca285 23 November, 2017, 10:17:12

@papr Ok great, thanks lots, I'll give that a try! Will update at a later date

user-7ca285 23 November, 2017, 10:23:51

@papr I'm running from bundle, btw.

papr 23 November, 2017, 13:27:40

@user-23d980 Do you mean 3d orientation? See https://github.com/pupil-labs/pupil/pull/872 for that.

user-23d980 23 November, 2017, 13:34:47

@papr yup I've come across that. My understanding is it requires a minimum of 2 markers to orient, which makes sense. I was trying to work out if there was a way to do it with just 1 but can't think of how

user-2968b9 23 November, 2017, 17:47:31

Hi all, my colleague is using Pupil Capture, and she keeps receiving a 'dropping frame' message on her screen. Do you know what this means, in particular? What could be causing it, and how could we alleviate it?

user-93741f 23 November, 2017, 17:49:29

Manual Marker Calibration: I am struggling to get a successful calibration using this method. It always returns "Did not collect enough data during calibration. Using:monocular 3d model", no matter how many marker points I use. Using pupil service v1.1.7. Any advice would be appreciated.

mpk 23 November, 2017, 17:49:58

@user-2968b9 a bit more info about the setup would be required. My guess here is you are using Pupil Mobile together with Pupil Capture?

user-2968b9 23 November, 2017, 17:50:07

Yes we are

mpk 23 November, 2017, 17:50:26

and the error is only happening when you start a recording?

user-2968b9 23 November, 2017, 17:50:32

What do you need to know? I can get the info from her

user-2968b9 23 November, 2017, 17:50:55

I'll check, she's operating in a different country, so she may take a little time to reply

mpk 23 November, 2017, 17:51:26

@user-2968b9 no I think I only need to know the answer to the above question

user-2968b9 23 November, 2017, 17:51:56

Okay, perfect. I've asked her, and I'll let you know as soon as she gets back to me. Thanks for the help. 😃

mpk 23 November, 2017, 17:52:13

because if this only happens at the beginning of the recording, then it's fine. Pupil Capture is simply waiting for the first h264 keyframe to start a new h264 mp4 file.

user-2968b9 23 November, 2017, 17:52:23

Ah I see

mpk 23 November, 2017, 17:52:39

we should silence the warning. It's an implementation detail.

user-2968b9 23 November, 2017, 17:53:35

Ah, fair enough, so it's not really a massive deal, so to speak?

mpk 23 November, 2017, 17:54:58

yes.

mpk 23 November, 2017, 17:55:11

It's intended and not harmful.

user-2968b9 23 November, 2017, 17:55:21

Ah fair enough

user-2968b9 23 November, 2017, 17:56:05

I'll let her know. If she tells me that it's coming up with greater frequency, I'll let you know. Thanks for the help!

user-25a0b6 24 November, 2017, 10:58:23

Morning folks. So, I have got an issue with the player updates and am wondering if anyone has come across this and solved it here. Thanks in advance.

user-25a0b6 24 November, 2017, 10:58:44

Updated my Pupil Capture and Player software, and now my old recordings from version 9.12.2 will not open in the Player; it throws out the following error. The interface still loads; when I attempt to load the recording folder anyway, it crashes. Please help!

Error Message on attempting to load a file is as follows: MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.

        loadLibraries()
        player - [INFO] launchables.player: Session setting are from a different version of this app. I will not use those.
        createNativeWindow()
        _glfwInitWGL()
        Loading opengl32.dll
        _glfwCreateContextWGL()
        loadWGLExtensions()
        choosePixelFormat()

It seems pretty self-explanatory, but I am not sure how to go about resolving this - sounds like some backward compatibility issue. Has this been solved, and if so, can anyone please point me to the thread?

Many Thanks

Worried researcher. 😦

user-25a0b6 24 November, 2017, 11:03:07

I have had a rummage and found the following thread, which I have tried to follow. Please see the report attached for details - including system specs etc. Thanks guys. Keeping my fingers crossed

Pupil_Player__Bug_on_Recordings.docx

user-6764f5 27 November, 2017, 11:20:05

Hi everyone,

I'd like to use image data from Pupil Labs for image processing, but I couldn't find detailed specifications of the camera. Does anyone know the focal length and the size of the CCD or CMOS sensor?

user-6764f5 27 November, 2017, 11:28:32

I'm using "high speed 200hz-binocular".

papr 27 November, 2017, 13:05:39

@user-6764f5 Is it possible that you mean the 120Hz eye cameras?

user-6764f5 27 November, 2017, 13:49:34

@papr Apologies, I mean the world camera.

user-36aef3 27 November, 2017, 16:31:14

Hello all, has anyone used pupil eye-tracker in a multi-monitor setup?

papr 28 November, 2017, 08:11:57

@user-36aef3 Hey 🙂 I am using the Pupil software in a multi-monitor setup for work. I think @user-41f1bf has also a lot of experience in this field.

papr 28 November, 2017, 08:32:44

@user-36aef3 For completion: You can specify on which monitor calibrations should be shown. You can also track multiple monitors using the surface tracker, in case you need gaze relative to the content shown on the monitors.
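
A sketch of what consuming that surface-relative gaze looks like over the network API, along the lines of the filter_gaze_on_surface helper (Pupil Remote defaults and one surface per monitor are assumptions):

```python
import zmq
import msgpack

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")   # Pupil Remote default port
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "surfaces.")  # e.g. one surface per monitor

while True:
    topic, payload = sub.recv_multipart()
    msg = msgpack.loads(payload, raw=False)
    for gaze in msg.get("gaze_on_srf", []):
        if gaze["on_srf"]:
            # norm_pos is normalized to that surface, i.e. that monitor
            print(topic.decode(), gaze["norm_pos"])
```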

user-006924 29 November, 2017, 01:16:24

Hi everyone,

user-006924 29 November, 2017, 01:18:15

My advisor is planning to order a Pupil Labs eye tracker and I'm not sure which one to choose between high resolution, high speed or 3D. Does anyone have any advice about this? Thanks a lot in advance.

wrp 29 November, 2017, 02:20:28

@NahalNrz#1253 Usually we would recommend the high speed world camera. It comes with 2 lenses (100deg and 60deg) and can capture at variable spatial and temporal resolutions (and is the smallest of the options). The 3d world camera uses the Intel RealSense R200 sensor; it provides depth data of your scene (RGBD video streams). It is a newer sensor in our system and should be used only if you need RGBD data and are comfortable working with this kind of data.

user-006924 29 November, 2017, 02:44:49

Thanks a lot

wrp 29 November, 2017, 02:47:52

Welcome @user-006924

user-006924 29 November, 2017, 02:57:55

Is there a big accuracy gap between the monocular and binocular versions? And in case it might be helpful: the eye trackers are intended to be used by older adults.

wrp 29 November, 2017, 07:29:30

@user-006924 - You can achieve high accuracy with a monocular system. However, a binocular system provides more data (views of both eyes), and therefore pupil detection for at least one eye should be high confidence due to redundancy in eye movements (e.g. looking extreme left or right). With a monocular eye tracker, an extreme look towards the nose may yield reduced confidence in pupil detection

mpk 29 November, 2017, 07:41:03

@user-006924 one more point: only with binocular hardware is gaze accurate at depths other than the calibration depth.

user-36aef3 29 November, 2017, 20:40:08

@papr thanks for the response. Could you point me to some documentation that describes this type of setup?

user-988d86 29 November, 2017, 22:29:26

Does anyone know of a plugin that captures data upon keystroke or clicking a button? Or anything somewhat similar to that?

mpk 30 November, 2017, 05:32:47

@user-988d86 Check out the Annotation Capture plugin; it does something like that.
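
Annotations can also be triggered programmatically (e.g. on a keystroke in your own experiment script) via the network API. The following is a hedged sketch, assuming Pupil Remote on its default port; the subject and field names have varied between Pupil versions, and the label is a made-up example:

    import zmq
    import msgpack

    ctx = zmq.Context()
    remote = ctx.socket(zmq.REQ)
    remote.connect("tcp://127.0.0.1:50020")

    # Query Pupil's own clock so the annotation lines up with recordings.
    remote.send_string("t")
    pupil_time = float(remote.recv_string())

    notification = {
        "subject": "annotation",    # consumed by the Annotation plugin
        "label": "keypress_space",  # hypothetical event label
        "timestamp": pupil_time,
        "duration": 0.0,
    }
    # Pupil Remote forwards ("notify.<subject>", msgpack payload) pairs
    # onto the IPC backbone.
    remote.send_string("notify." + notification["subject"], flags=zmq.SNDMORE)
    remote.send(msgpack.packb(notification, use_bin_type=True))
    print(remote.recv_string())  # acknowledgement from Pupil Remote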

user-8058d7 30 November, 2017, 12:49:15

How did you solve the problem caused by the height difference between the camera and the eyes?

user-8058d7 30 November, 2017, 12:49:59

When I experimented, there was an error depending on the distance between the eyes and the object.

user-8058d7 30 November, 2017, 12:50:14

I think this is caused by the height difference between the eyes and the world camera.

user-8058d7 30 November, 2017, 12:50:31

Does Pupil solve this problem?

user-8058d7 30 November, 2017, 12:51:19

My eye camera setup is monocular. Does this cause more errors?

papr 30 November, 2017, 13:00:37

@user-8058d7 Please be aware that gaze is only correctly estimated at the depth at which the calibration happened.

user-8058d7 30 November, 2017, 13:01:13

What does "estimated at the depth" mean?

user-8058d7 30 November, 2017, 13:01:28

Does Pupil estimate depth during the calibration step?

papr 30 November, 2017, 13:04:56

That was ambiguous, sorry. What I meant was that if you calibrate at a distance of 2 meters, gaze will only be estimated correctly at a depth of 2 meters (especially with 2d calibration; the 3d case is less sensitive to this issue). If the subject looks at objects that are further away, the gaze estimate may be inaccurate, depending on how far away the object is.

papr 30 November, 2017, 13:05:41

This is probably exactly what you experienced.
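
To make the depth dependence concrete, here is a back-of-envelope parallax estimate. It is purely illustrative, not Pupil's internal computation, and the 3 cm eye-to-world-camera offset is an assumed value:

    import math

    b = 0.03   # assumed eye-to-world-camera offset in meters (~3 cm)
    d0 = 2.0   # calibration depth in meters

    # Small-angle parallax error when a 2d mapping calibrated at depth d0
    # is applied to a target at depth d: error ~ b * |1/d - 1/d0| radians.
    for d in (0.5, 1.0, 2.0, 4.0):
        err = b * abs(1.0 / d - 1.0 / d0)
        print("target at {} m -> ~{:.2f} deg of parallax error".format(
            d, math.degrees(err)))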

user-8058d7 30 November, 2017, 13:05:50

OK, I got it

papr 30 November, 2017, 13:07:01

The positions of the cameras relative to the eyes are compensated for in the calibration.

user-8058d7 30 November, 2017, 13:09:28

How can you mitigate this phenomenon? There is considerable distance variation in actual experimentation or use.

user-8058d7 30 November, 2017, 13:09:45

Is binocular better for this problem?

papr 30 November, 2017, 13:11:37

Yes. The 3d calibration on a binocular device can handle depth much better than the monocular headset. We even add a depth estimate to the gaze point that is based on eye vergence.
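
Geometrically, the vergence idea amounts to intersecting the two 3d gaze rays: they rarely cross exactly, so one can take the midpoint of the shortest segment between them as the fixation point. A minimal sketch of that geometry, not Pupil's actual implementation:

    import numpy as np

    def nearest_point_between_rays(p0, d0, p1, d1):
        """Midpoint of the shortest segment between two 3D rays.

        p0, p1: ray origins (eyeball centers); d0, d1: gaze directions.
        Illustrates the vergence idea only, not Pupil's actual code.
        """
        d0 = d0 / np.linalg.norm(d0)
        d1 = d1 / np.linalg.norm(d1)
        w = p0 - p1
        a, b, c = d0 @ d0, d0 @ d1, d1 @ d1
        d, e = d0 @ w, d1 @ w
        denom = a * c - b * b            # ~0 when the rays are parallel
        if abs(denom) < 1e-9:
            return None
        s = (b * e - c * d) / denom      # parameter along ray 0
        t = (a * e - b * d) / denom      # parameter along ray 1
        return (p0 + s * d0 + p1 + t * d1) / 2.0

    # Example: eyes 6 cm apart, both verging on a point ~50 cm ahead.
    left = np.array([-0.03, 0.0, 0.0])
    right = np.array([0.03, 0.0, 0.0])
    target = np.array([0.0, 0.0, 0.5])
    print(nearest_point_between_rays(left, target - left, right, target - right))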

user-41f1bf 30 November, 2017, 14:11:09

@user-36aef3 I have been using Pupil in a multi-monitor setup from day one. Right now it is really robust. However, my experience is restricted to Ubuntu.

user-41f1bf 30 November, 2017, 14:12:18

What are you planning to do?

user-41f1bf 30 November, 2017, 14:12:46

@user-36aef3

user-36aef3 30 November, 2017, 14:17:21

@user-41f1bf We would like to do some experiments with a very wide field of view, hence the multi-monitor setup. We are working with Ubuntu as well.

user-41f1bf 30 November, 2017, 14:21:03

If you are planning calibration across monitors, you will need to adapt the screen-based calibration plugin.
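
As a starting point for such an adaptation, picking a target monitor with GLFW (which Pupil's windows are built on) could look like the sketch below, using the pyGLFW bindings. The monitor index and video-mode field names are assumptions that vary by setup and binding version:

    import glfw

    glfw.init()
    # Enumerate connected monitors; pick the one to show calibration on.
    monitors = glfw.get_monitors()
    for i, m in enumerate(monitors):
        print(i, glfw.get_monitor_name(m))

    # Fullscreen window on the second monitor (index 1), if present.
    target = monitors[1] if len(monitors) > 1 else monitors[0]
    mode = glfw.get_video_mode(target)
    window = glfw.create_window(mode.size.width, mode.size.height,
                                "calibration", target, None)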

user-41f1bf 30 November, 2017, 14:22:17

Yes it does

user-36aef3 30 November, 2017, 14:22:44

@user-41f1bf does it tolerate some head movement?

user-41f1bf 30 November, 2017, 14:23:53

From my experience, it does if you use 9 or more calibration points.

user-36aef3 30 November, 2017, 14:24:18

@user-41f1bf To be honest, my background is computer vision. I've only used Pupil with a single monitor before, for a small study. I don't even know where to start with a multi-monitor setup, and the psychologist in our lab doesn't have experience with this either.

user-41f1bf 30 November, 2017, 14:24:21

Do not trust just 5 calibration points.

user-41f1bf 30 November, 2017, 14:26:42

My background is behavioral science and I have managed to adapt the plugin. Pupil's code is clean and very easy to work with.

user-36aef3 30 November, 2017, 14:29:47

@user-41f1bf I actually found some of your posts on the mailing list regarding the screen-based calibration and the plugin you wrote. I think I'll study those first before asking any more questions. Thank you for pointing me in the right direction

user-41f1bf 30 November, 2017, 14:31:29

The screen tracker plugin I wrote is meant to avoid showing fiducial markers to participants.

user-41f1bf 30 November, 2017, 14:32:55

I have found it easier to detect the screen itself and use it as a "pupil surface".

user-41f1bf 30 November, 2017, 14:36:28

@user-36aef3 , to make my life easier, my advisor and I have adapted a correction algorithm.

user-41f1bf 30 November, 2017, 14:38:10

This way we increase our chances of quality data.

user-41f1bf 30 November, 2017, 14:39:55

The algorithm assumes stimuli are distributed evenly around the screen center and is useful when participants move their heads too much.
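
The posts do not spell the algorithm out, but under that assumption one plausible sketch is to treat the deviation of the mean gaze position from the screen center as a constant drift and subtract it:

    import numpy as np

    def recenter_gaze(gaze_xy, screen_center=(0.5, 0.5)):
        """Shift gaze samples so their mean lands on the screen center.

        gaze_xy: (N, 2) array of surface-normalized gaze positions.
        Only valid if stimuli really were balanced around the center;
        this is a guess at the idea, not the authors' actual code.
        """
        gaze_xy = np.asarray(gaze_xy, dtype=float)
        bias = gaze_xy.mean(axis=0) - np.asarray(screen_center)
        return gaze_xy - bias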

user-36aef3 30 November, 2017, 14:40:34

@user-41f1bf is this part of the code?

user-41f1bf 30 November, 2017, 14:41:31

No, it is not part of the plugin. The screen tracker only detects the screen and exports data.

user-41f1bf 30 November, 2017, 14:41:53

But it is also written in python

user-41f1bf 30 November, 2017, 14:42:36

Right now I am afk

user-36aef3 30 November, 2017, 14:42:57

@user-41f1bf thanks for the help

user-41f1bf 30 November, 2017, 14:43:49

Feel free to get in contact, we can talk.

user-36aef3 30 November, 2017, 14:47:17

@user-41f1bf thanks, I'll try using your plugins first to avoid asking silly questions

user-41f1bf 30 November, 2017, 14:48:30

Hmm... I don't mind. Feel free to ask.

user-41f1bf 30 November, 2017, 14:50:01

My code is NOT as clean as Pupil's code, so I would prefer your silly questions first.

user-aee266 30 November, 2017, 14:53:45

Is there an easy or recommended way to get the DIY world-camera bracket on the frame?

user-41f1bf 30 November, 2017, 14:58:36

I have been using screws

user-41f1bf 30 November, 2017, 14:59:44

Do you mean fixing the world camera to the frame?

user-aee266 30 November, 2017, 15:03:35

Both to get the PCB on the bracket and the bracket on the frame? I've mounted the PCB on the bracket with the tiny screws from the camera disassembly, but now I simply can't get the clamp on the back of the bracket onto the small cylinder shape on the frame... It's super inflexible.

user-41f1bf 30 November, 2017, 15:04:07

Ahhhhh...

user-aee266 30 November, 2017, 15:04:44

did you click it on before or after mounting the camera pcb?

user-41f1bf 30 November, 2017, 15:05:21

It is strong enough. The bracket will fit on the frame more easily without the cables and the PCB.

user-aee266 30 November, 2017, 15:05:32

It also looks like it's a new design....at least it looks different from the one in the documentation

user-41f1bf 30 November, 2017, 15:05:43

Hummmm...

user-aee266 30 November, 2017, 15:05:58

I'll try unscrewing the PCB and clicking it on...

user-41f1bf 30 November, 2017, 15:06:25

The one I am wearing has an X shape

user-41f1bf 30 November, 2017, 15:06:54

The world bracket has an X shape.

user-41f1bf 30 November, 2017, 15:07:16

With four places for screws

user-aee266 30 November, 2017, 15:07:48

I have the same one... did you mount the PCB after clicking it on?

user-41f1bf 30 November, 2017, 15:08:21

Yep

user-aee266 30 November, 2017, 15:08:36

I'll try that then, thanks!

user-41f1bf 30 November, 2017, 15:09:17

Try pushing with your thumb in the center

user-41f1bf 30 November, 2017, 15:10:09

Without the pcb

user-aee266 30 November, 2017, 15:10:25

Do you know if I can cut off the wire that goes from the USB cable directly to the microphone (the microphone I have already cut off)? I think it's a ground/shield connection... did you cut it off?

user-41f1bf 30 November, 2017, 15:11:05

I can send you a picture when I get to work.

user-41f1bf 30 November, 2017, 15:11:47

As far as I remember, I did not remove any cable.

user-41f1bf 30 November, 2017, 15:13:53

I do remember that I needed to force cables to fit them inside the bracket

user-aee266 30 November, 2017, 15:18:22

That did the trick... my world camera is now on and the cables nicely tucked in.

user-41f1bf 30 November, 2017, 15:20:06

It is an older design

user-aee266 30 November, 2017, 15:20:33

It's just the extra wire on the eye camera that went from the USB cable directly to a small metal piece on the microphone. I took off the microphone as directed in the documentation, but wasn't sure what to do about this extra wire (they didn't say in the documentation) and thought it might be needed for grounding etc.

user-41f1bf 30 November, 2017, 15:22:33

You are right, they did not say to remove the ground cable. 😀

user-aee266 30 November, 2017, 15:24:41

They just forgot to mention where to put/connect it...

user-41f1bf 30 November, 2017, 15:26:37

As far as I remember, I did not remove any cable. I cut off the auto focus and mic from the eye camera.

user-aee266 30 November, 2017, 15:27:47

You cut the autofocus as well...? They didn't mention this in the documentation...or how to do it.

user-41f1bf 30 November, 2017, 15:28:06

Also, as I did not know SMD soldering, I made my own IR connection

user-41f1bf 30 November, 2017, 15:28:52

Using a resistor and a single IR LED.

user-41f1bf 30 November, 2017, 15:29:26

You can choose to leave the autofocus intact

user-41f1bf 30 November, 2017, 15:30:36

You will be able to turn off the autofocus using the most up-to-date software.
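
Outside the Pupil UI, the same UVC control can be toggled from Python with Pupil Labs' pyuvc. A hedged sketch follows; the control's display name is device-dependent, and "Auto Focus" is an assumption here:

    import uvc

    dev = uvc.device_list()[0]      # first attached UVC camera
    cap = uvc.Capture(dev["uid"])
    for control in cap.controls:
        if control.display_name == "Auto Focus":
            control.value = 0       # 0 = off on most UVC cameras
    cap = None                      # release the device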

user-aee266 30 November, 2017, 15:33:41

Cool... THANKS! I was struggling with the SMD soldering earlier today and hope I got them placed correctly, as it was impossible for me to see the little notch that indicates the cathode end...

user-41f1bf 30 November, 2017, 15:34:54

I would recommend another camera to you; it is a little bit bigger than the HD-6000, but it will do the job.

user-41f1bf 30 November, 2017, 15:35:42

I am using an ELP-xxxxxx; you can find the model number in the pyuvc repository.
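
For anyone trying such a third-party camera, a minimal pyuvc capture loop looks roughly like this (the frame mode is an example; modes the camera does not support will raise an error):

    import uvc

    for dev in uvc.device_list():
        print(dev["name"], dev["uid"])   # locate the camera by name

    cap = uvc.Capture(uvc.device_list()[0]["uid"])
    cap.frame_mode = (640, 480, 30)      # width, height, fps
    for _ in range(60):
        frame = cap.get_frame_robust()   # tolerates occasional bad frames
        print(frame.img.shape)           # BGR image as a numpy array
    cap = None                           # release the device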

user-41f1bf 30 November, 2017, 15:36:45

This camera came with a 5v output

user-aee266 30 November, 2017, 15:37:06

I didn't know you could use another camera for the eye camera... I had a hard time finding it, but got it and it's mounted for now. Did it fit on the same Pupil bracket or did you make your own for this as well?

user-41f1bf 30 November, 2017, 15:37:28

Same pupil bracket

user-41f1bf 30 November, 2017, 15:38:04

95 fps

user-aee266 30 November, 2017, 15:38:23

Nice. I'll keep it in mind in case I've broken this one... I had a hard time replacing the IR filter and may have scratched the lens... we'll see.

user-41f1bf 30 November, 2017, 15:38:27

And it is far easier to work with.

user-aee266 30 November, 2017, 15:38:44

wow 95 fps is impressive

user-aee266 30 November, 2017, 15:39:11

What can the HD-6000 provide?

user-41f1bf 30 November, 2017, 15:39:22

30 fps

user-41f1bf 30 November, 2017, 15:40:20

The ELP should allow up to 120 fps with proper illumination.

user-41f1bf 30 November, 2017, 15:40:55

However, I haven't had time to keep exploring

user-41f1bf 30 November, 2017, 15:44:36

and testing

user-aee266 30 November, 2017, 15:44:50

I'm looking at the pyuvc repository on GitHub, but can't find a camera list...

user-aee266 30 November, 2017, 15:47:16

You mentioned this one on the GitHub issues page... ELP-USBFHD01M-L21... is it the one?

user-41f1bf 30 November, 2017, 16:31:34

Yes

user-41f1bf 30 November, 2017, 16:31:44

It is

user-98013c 30 November, 2017, 22:58:03

Hi, just a couple of questions: with the online accuracy calibration, what do the Angular Accuracy and Precision values mean? In a test run ours were 1.18048093324 and 0.00189682860559 respectively. What should the limits be?

End of November archive