core


user-6e1816 01 March, 2018, 01:31:49

I have installed pyglui v1.18, but now I want to install pyglui v1.9. How do I do that?

wrp 01 March, 2018, 01:44:05

@user-6e1816 the most recent release is v1.18 - https://github.com/pupil-labs/pyglui/releases/tag/v1.18

wrp 01 March, 2018, 01:45:16

@user-6e1816 what OS are you using

user-6e1816 01 March, 2018, 01:45:41

@wrp ubuntu

user-6e1816 01 March, 2018, 01:46:17

I know that the v1.18 is the most recent release.

wrp 01 March, 2018, 01:48:34

@user-e7102b surface stability looks much better. Thanks for sharing progress :+1:

wrp 01 March, 2018, 01:51:48

@user-6e1816 sudo pip3 install --upgrade git+https://github.com/pupil-labs/pyglui

wrp 01 March, 2018, 01:52:00

Did you already try this?

user-6e1816 01 March, 2018, 01:53:03

I am trying https://pupil-labs.com/blog/2017-12/real-time-object-recognition-using-fixations-to-control-prosthetic-hand/. But that project may use pyglui v1.9, and a pyglui-related error occurred when my version was v1.18

user-6e1816 01 March, 2018, 01:57:09

so I want to know if I could reinstall an earlier version.

wrp 01 March, 2018, 01:57:44

In terms of version formats v1.18 > v1.9 (also there was never a v1.9 release as far as I can see)

wrp 01 March, 2018, 01:58:09

You shouldn't need an earlier version

wrp 01 March, 2018, 02:59:14

Hi @user-8b1388 I would be happy to help you out. You note that you need help with setup. Have you looked through https://docs.pupil-labs.com yet?

papr 01 March, 2018, 06:54:04

@user-6e1816 @wrp Some earlier versions of Pupil require older pyglui versions, that is correct. You need to clone the repository, check out the version that you want to install, and replace the git+URL part of the pip command above with the local path to the repository.
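
A sketch of the full sequence papr describes (the tag name v1.8 is only an example, not a recommendation; list the actual tags with git tag):

```shell
# Sketch: install an older pyglui release from a local checkout.
# "v1.8" is an example tag; pick the release you actually need.
git clone https://github.com/pupil-labs/pyglui.git
cd pyglui
git checkout v1.8
git submodule update --init --recursive  # pyglui vendors some dependencies as submodules
sudo pip3 install --upgrade .
```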

user-e119ae 01 March, 2018, 07:06:32

Hi there, I'm trying Pupil Mobile with my tracker, which has the high-resolution world camera. When I connect it to my OnePlus 5 phone I can only see the feed coming from the eye camera; the world camera feed is not showing anything. Could you please help me out with this?

user-8b1388 01 March, 2018, 08:25:18

@wrp Hi! Thanks for your answer. I have a problem with accurate gaze tracking. @papr is helping me now.

user-8b1388 01 March, 2018, 08:26:03

i share link to my video folder.

wrp 01 March, 2018, 08:27:34

Sorry, I don't follow, do you mean that you shared a link already with @papr?

wrp 01 March, 2018, 08:27:36

please clarify

user-8b1388 01 March, 2018, 08:30:00

Yes, I shared the link with @papr. Also, I should get an order ID; I can do that this afternoon.

papr 01 March, 2018, 08:31:22

Yes, I asked @user-8b1388 to share a recording that includes the calibration procedure. I did not have the time to look at it yet though. But I did not ask for an order ID.

wrp 01 March, 2018, 08:31:48

I have also asked for order id via separate email thread

user-cf2773 01 March, 2018, 12:36:48

Following @user-ed537d question, is there a way to pull or calculate where the user is looking in terms of visual angle in relation to the world camera? I mean, ideally, if the user is gazing at a point straight ahead, I would like to get (0.5, 0.5)

papr 01 March, 2018, 12:48:21

@user-cf2773 @user-ed537d Yes, this is possible if you know the world camera's intrinsics. The accuracy visualizer uses this method to calculate the accuracy in terms of visual angles.

user-cf2773 01 March, 2018, 12:49:40

Aren't the world camera's intrinsics known by Pupil Capture already?

user-cf2773 01 March, 2018, 12:50:22

I am trying to get these directly from Pupil Remote

user-cf2773 01 March, 2018, 12:53:34

If I print result['norm_pos'] of the "gaze" topic from Pupil Remote, the coordinate varies only a tiny bit between one sample and another, regardless where I am gazing

mpk 01 March, 2018, 12:53:39

@user-cf2773 @papr the intrinsics do not need to be known if you use 3d mode.

mpk 01 March, 2018, 12:53:49

in this case just use gaze_point_3d

mpk 01 March, 2018, 12:54:09

and compare to a vector going straight out: (0,0,1)

user-cf2773 01 March, 2018, 12:54:42

OK I will try, thanks @mpk !

papr 01 March, 2018, 12:55:53

The norm_pos is normalized within the world camera image, where (0,0) is the bottom left corner and (1,1) the top right corner. Therefore it is expected to change very little between samples.
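
papr's coordinate convention can be written as a tiny conversion helper (a sketch; the 1280x720 frame size is just an example):

```python
def norm_pos_to_pixels(norm_pos, frame_size):
    """Convert Pupil's normalized coordinates ((0,0) = bottom-left,
    (1,1) = top-right) to image pixel coordinates with the usual
    top-left origin (the y axis is flipped)."""
    nx, ny = norm_pos
    width, height = frame_size
    return (nx * width, (1.0 - ny) * height)

# e.g. the center of a 1280x720 world frame:
print(norm_pos_to_pixels((0.5, 0.5), (1280, 720)))  # → (640.0, 360.0)
```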

user-13ea21 01 March, 2018, 13:58:46

Is there a way to set default settings so when pupil capture starts up the settings are the same?

user-cf2773 01 March, 2018, 14:43:04

It doesn't work. I have used 3d calibration and computed scipy.spatial.distance.cosine(data["gaze_point_3d"], (0, 0, 1)). It always returns values around 0.02 or 0.03, regardless of where I am looking

user-cf2773 01 March, 2018, 15:39:24

Any idea about what is going on? Do I need to follow a different calibration procedure/computation?

papr 01 March, 2018, 15:44:44

Note that scipy.spatial.distance.cosine returns the cosine distance, i.e. 1 - cos(angle to center), so 0.02 corresponds to cos(angle) ≈ 0.98. To calculate the actual angle, you need to call numpy.arccos() on one minus the result of scipy.spatial.distance.cosine(data["gaze_point_3d"], (0, 0, 1))

user-cf2773 01 March, 2018, 15:45:49

yes I did that too, but the variation is obviously similarly small

papr 01 March, 2018, 15:45:53

also make sure that the gaze_point_3d vector has the same sign as your "center" vector (0, 0, 1)

papr 01 March, 2018, 15:46:50

What x,y values do you get in gaze_point_3d if you look at the center? Are they close to zero?

user-cf2773 01 March, 2018, 15:47:15

Yes, but they are also close to zero when I look to the top-left or bottom-right corners

user-cf2773 01 March, 2018, 15:47:33

They are close to zero all the time, that's what I don't understand

user-cf2773 01 March, 2018, 15:48:00

In Capture, the gaze seems to be tracked properly, so I think it's just an issue of how the coordinates are expressed

papr 01 March, 2018, 15:48:32

What's the z value for them?

mpk 01 March, 2018, 15:50:10

@user-cf2773 what are the values of data["gaze_point_3d"]?

mpk 01 March, 2018, 15:50:19

do they make sense?

papr 01 March, 2018, 15:52:30

One thing to remember is that arccos returns radians. Did you convert them to degrees with np.rad2deg?

papr 01 March, 2018, 15:53:54

small x,y changes are ok if z is also small. the cosine distance function normalizes both vectors, so that is taken care of.
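
Putting the whole computation together (a sketch; it computes the cosine similarity directly rather than via scipy, since scipy.spatial.distance.cosine returns 1 - cos(angle)):

```python
import numpy as np

def gaze_angle_deg(gaze_point_3d, center=(0.0, 0.0, 1.0)):
    """Angle in degrees between gaze_point_3d and the camera's forward
    axis. Equivalent to np.rad2deg(np.arccos(1 - cosine_distance))."""
    g = np.asarray(gaze_point_3d, dtype=float)
    c = np.asarray(center, dtype=float)
    cos_sim = np.dot(g, c) / (np.linalg.norm(g) * np.linalg.norm(c))
    # clip guards against tiny floating point overshoot outside [-1, 1]
    return np.rad2deg(np.arccos(np.clip(cos_sim, -1.0, 1.0)))

print(gaze_angle_deg((0, 0, 484)))     # looking straight ahead → 0.0
print(gaze_angle_deg((-60, 26, 484)))  # roughly 7.7 degrees off-center
```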

user-cf2773 01 March, 2018, 15:54:44

data["gaze_point_3d"] = (-60, 26, 484)

papr 01 March, 2018, 15:55:47

This looks like a sensible value.

user-cf2773 01 March, 2018, 16:02:36

For some reason, after I closed Pupil Capture, reopened it, and recalibrated, it now seems to work: I get cos around 0 when I look at the center and different values when I look far away. Thanks a lot for your help!

papr 01 March, 2018, 16:03:30

Great! 🙂

user-3565f9 01 March, 2018, 16:58:20

Hey everyone! I just got a 120Hz binocular setup with the normal world camera, but we've been having trouble with calibration. Nothing we do seems to get us a reliable calibration (the gaze dot on the screen does not match where the wearer is looking). When we do get a calibration, it lasts for only a minute, and any movement of the stimuli or the head wrecks it. We're going to be running a study where people look at some paper advertisements, which is why we are trying to use these glasses.

mpk 01 March, 2018, 17:02:41

@user-3565f9 can you share an example recording with eye videos? This will allow us to help. You can send this via PM or email to info[at]pupil-labs.com

user-f1eba3 01 March, 2018, 17:04:59

Hi guys

user-f1eba3 01 March, 2018, 17:05:21

has anyone had problems calibrating the pupil ?

user-f1eba3 01 March, 2018, 17:05:33

witha HTC Vive

user-f1eba3 01 March, 2018, 17:06:06

Namely, the problem is that when I click on calibrate it just shows the white dot on the screen and it does not do anything

user-f1eba3 01 March, 2018, 17:06:16

even though I am looking straight at it 😄

mpk 01 March, 2018, 17:07:58

@user-f1eba3 this is a question for HMD eyes. @user-e04f56 can help you there.

papr 01 March, 2018, 17:19:00

@user-3565f9 It would help a lot if the recording includes the calibration, i.e. if you could start the recording before starting the calibration

user-e02f58 01 March, 2018, 18:36:40

May I ask about the expected release time of the saccade detector plugin?

papr 01 March, 2018, 18:37:30

Unfortunately, we do not have a time frame for its release yet.

user-e02f58 01 March, 2018, 18:38:49

Understood, thanks

user-b23813 01 March, 2018, 21:49:33

Hi guys, I am using 1.4.1 and I have an issue with the exported Excel file that includes the blink detection data. The data is shown correctly only for some of the blinks. I am sure there were many more blinks recorded, but these are not shown correctly in the file. As you can see with the pink shadow, there is a lot of data not correctly laid out in columns, which I guess should be the start timestamps, confidences, etc. However, it is difficult to tell what refers to what considering the amount of data. Has anyone else had this issue?

Chat image

papr 01 March, 2018, 21:51:09

@user-b23813 This looks like a software bug. Please raise an issue on github including this screenshot

user-b23813 01 March, 2018, 21:52:37

@papr thanks a lot, I will

mpk 02 March, 2018, 09:32:37

@here The Pupil Labs team has been busy! We just pushed an update of Pupil Capture, Player, and Service: https://github.com/pupil-labs/pupil/releases/tag/v1.5 !

user-24270f 02 March, 2018, 09:33:41

wheres the CV1 eye tracker

mpk 02 March, 2018, 09:34:06

@user-24270f coming soon 😃

user-24270f 02 March, 2018, 09:34:52

im sure i believe you

mpk 02 March, 2018, 09:35:47

@user-24270f we are very close but we had to revise the camera because there is SO little space in the Rift. Waiting for new parts now. Hope we can show something in the next weeks.

user-24270f 02 March, 2018, 09:35:59

these things take time

mpk 02 March, 2018, 09:36:35

It's challenging when you make an add-on and thus cannot modify the Rift design.

user-dfeeb9 02 March, 2018, 11:09:42

Capture allows creation of recordings without an active world video source

Thank you guys so much for this @mpk

user-0d187e 02 March, 2018, 11:16:05

I want to prevent the 3d model from updating after it's set (I have pointed out this issue before), can I achieve this by decreasing or increasing the "model sensitivity" value? Otherwise I don't know exactly how to achieve that by changing the code.

user-dfeeb9 02 March, 2018, 11:16:41

By the way @papr , @user-e7102b has updated our middleman repo with some nicer matlab examples over at https://github.com/mtaung/pupil_middleman if it might end up being useful to you guys

mpk 02 March, 2018, 11:16:57

@user-0d187e yes decrease the sensitivity.

papr 02 March, 2018, 11:17:03

@user-dfeeb9 Cool!

user-dfeeb9 02 March, 2018, 11:17:12

I still need to implement ntp timesync at some point

user-0d187e 02 March, 2018, 13:19:01

@mpk Thanks. Does setting the sensitivity to 0 stop the model change completely?

mpk 02 March, 2018, 13:20:10

@user-0d187e no, we have not implemented a freeze like that. We can do that though...

user-0d187e 02 March, 2018, 13:21:12

Ahh. How easy is it to modify the code for this if I just download the source code?

mpk 02 March, 2018, 13:22:23

super easy:

mpk 02 March, 2018, 13:23:41

just change the min value to something like 0.

user-0d187e 02 March, 2018, 13:29:44

min value of what?

user-0d187e 02 March, 2018, 13:31:34

the model sensitivity you mean

user-0d187e 02 March, 2018, 13:31:36

got it

user-0d187e 02 March, 2018, 13:33:46

I have no idea how to compile the code after the changes and make the executable file, but I will try

mpk 02 March, 2018, 13:36:48

@user-0d187e all you need to do is follow these steps : https://docs.pupil-labs.com/#developer-setup

mpk 02 March, 2018, 13:36:57

then make the changes and run python3 main.py

mpk 02 March, 2018, 13:37:07

it will auto compile the pupil detector for you.

user-0d187e 02 March, 2018, 13:37:19

cool

user-0d187e 02 March, 2018, 13:37:23

thx

user-f1eba3 02 March, 2018, 13:47:57

Hi guys

user-f1eba3 02 March, 2018, 13:48:29

Noob question: I was reading the docs and it states that :

user-f1eba3 02 March, 2018, 13:48:30

Once you plug the usb cables into your computer:

the right eye camera will show up with the name: Pupil Cam 1 ID0 the left eye camera will show up with the name: Pupil Cam 1 ID1

user-f1eba3 02 March, 2018, 13:48:41

Where should it show up?

papr 02 March, 2018, 13:49:24

@user-f1eba3 Click on the UVC Manager icon on the right. It should show you a selector with all available cameras

mpk 02 March, 2018, 13:50:00

@user-f1eba3 but you dont have to really worry about that. Just start Pupil capture and the right cameras will be selected for world, right and left eye.

user-c14158 02 March, 2018, 13:58:26

ger

user-29e10a 02 March, 2018, 14:30:50

@mpk @user-0d187e setting the min value to zero does not completely freeze the model; you have also implemented a check on the fit performance gradient, which is included in the eye model... I'm also looking to freeze the center of the fitted sphere (to a reasonable value; I work with the Vive add-on), since the radius is fixed.

user-f1eba3 02 March, 2018, 14:52:30

So basically what I am trying to do is game development and robotics with Unity, but to do that I will use Unreal Engine. Today I want to make a Python script or something similar to get the x and y position and show it in a console, for starters.

user-f1eba3 02 March, 2018, 14:52:44

Where should I look for code that interacts with Pupil?

papr 02 March, 2018, 14:53:43

Have a look at this script https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py

papr 02 March, 2018, 14:54:42

It uses Pupil Remote and the IPC backend to subscribe to the pupil data (uncomment line 24 to subscribe to gaze data) and prints the received data
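
In case it helps later readers, the core of that script can be sketched as follows (a sketch, assuming Pupil Capture runs locally with Pupil Remote on its default port 50020; the snippet gives up after one second if nothing is listening):

```python
import zmq
import msgpack

ctx = zmq.Context.instance()

# Ask Pupil Remote for the SUB port of the IPC backbone.
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")

if req.poll(timeout=1000):  # wait up to 1 s for a reply
    sub_port = req.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://127.0.0.1:{sub_port}")
    sub.subscribe("pupil.")  # use "gaze." to receive gaze data instead
    for _ in range(5):  # print a handful of datums, then stop
        topic, payload = sub.recv_multipart()
        datum = msgpack.unpackb(payload)
        print(topic, datum.get("norm_pos"), datum.get("confidence"))
else:
    print("Pupil Remote does not seem to be running.")
```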

user-f1eba3 02 March, 2018, 15:04:51

how could I have missed that

user-f1eba3 02 March, 2018, 15:04:58

I must have been super blind

user-f1eba3 02 March, 2018, 15:13:44

should I use the newest Python version if I am tinkering with some parts of Pupil?

papr 02 March, 2018, 15:14:15

We support >= Python 3.5, but I highly recommend Python 3.6

user-a49a87 02 March, 2018, 16:02:03

Hi. Is the pupil detection algorithm the same for the general Pupil Labs headset and for the HMD version? I wonder because the point of view and lighting are not the same.

user-ecbbea 02 March, 2018, 18:21:45

Hey just curious if anyone in here knows exactly how confidence is calculated for each sample? I can't find any information in the documentation

papr 02 March, 2018, 18:48:11

@user-a49a87 it is the same. The point of view is not relevant as long as you can see the pupil clearly. Different lighting is handled by the algorithm as well. Or to be specific: the operator is responsible for changing the UVC controls such that the pupil is detected well

papr 02 March, 2018, 18:56:26

@user-ecbbea to cite @marc here: support pixels / ellipse circumference, where support pixels are the number of pixels that were used to fit the ellipse.

papr 02 March, 2018, 18:57:50

This means that the confidence is high if the support pixels, the detected pupil edge pixels, are similar in shape to an ellipse.
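
As a toy illustration of that ratio (not the actual pupil-detectors implementation; the ellipse circumference here uses Ramanujan's approximation):

```python
import math

def ellipse_circumference(a, b):
    """Ramanujan's approximation for an ellipse with semi-axes a and b."""
    h = ((a - b) ** 2) / ((a + b) ** 2)
    return math.pi * (a + b) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))

def confidence(support_pixel_count, a, b):
    """Toy version of marc's formula: support pixels / ellipse
    circumference, clamped to [0, 1]."""
    return min(1.0, support_pixel_count / ellipse_circumference(a, b))

# An ellipse with 15 px and 10 px semi-axes whose edge is
# only partially supported by detected pupil edge pixels:
print(round(confidence(60, 15, 10), 2))  # → 0.76
```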

user-ecbbea 02 March, 2018, 19:00:59

Great thanks

user-ecbbea 02 March, 2018, 19:01:15

essentially the less circular the pupil looks, the less confidence?

user-ecbbea 02 March, 2018, 19:08:44

Hey is it possible to adjust the focus on the newer low-profile 200hz cameras? We can't seem to find an obvious way to do so

user-ecbbea 02 March, 2018, 19:08:53

(sorry for all the questions)

user-ecbbea 02 March, 2018, 19:13:06

ALSO: is it possible to get the CAD file for this? https://www.shapeways.com/product/KS8ME6JNV/pupil-labs-camera-arm-extender-for-120hz-and-200hz?optionId=64865617

papr 02 March, 2018, 19:22:57

@user-ecbbea the 200hz cameras do not have the possibility to adjust the focus

mpk 02 March, 2018, 19:57:18

@charles bukowski#5234 you can get the cad file here: https://github.com/pupil-labs/pupil-geometry/blob/master/Pupil%20Headset%20triangle%20mount%20extender.stl

user-a7d017 02 March, 2018, 20:57:02

Hello everyone, I am trying to get the hardware set up on Mac. I am able to get video feed from the two pupil cameras, but the world feed is not working. In the UVC manager, there are only two camera IDs listed. Selecting the UVC source button shows Local USB Source: Ghost capture. Capture initialization failed.

user-a7d017 02 March, 2018, 21:21:49

Additionally, I have just attempted to use the Pupil device on a Windows machine. Windows recognized and added only two cameras, the pupil cameras. Should it have also detected a third camera for the world?

user-2798d6 02 March, 2018, 21:24:14

Hello! Is there a way to check accuracy of calibration if doing offline natural marker calibration?

papr 02 March, 2018, 21:39:53

@user-2798d6 not yet, but this is quite high on my todo list

user-2798d6 02 March, 2018, 21:43:01

Thank you!

wrp 03 March, 2018, 02:21:50

@user-a7d017 what world camera does your setup have? Is it a 3d world camera (realsense r200 sensor?)

user-10e1e3 04 March, 2018, 08:53:17

Zadig messed up my notebook's built-in webcam. No matter what driver I choose, it isn't recognized anymore by any program. In the Device Manager it now appears under Generic USB devices, not Imaging Devices, and is labeled as "UVC 2.0 Webcam". I tried clicking on it and choosing update driver; it says the drivers are up to date. I tried uninstalling the driver and rebooting Windows; nothing changes. Help, I need my webcam.

user-6e1816 05 March, 2018, 01:50:58

I can't switch the resolution and frame rate in the "Pupil Capture" window; if I do, the window disappears and can't be opened again. My OS is Ubuntu.

wrp 05 March, 2018, 02:57:22

Hi @user-10e1e3, in order to restore the drivers you either need to have backups of the system drivers so you can roll back, or try running a Windows system update to get the most recent drivers.

wrp 05 March, 2018, 02:58:28

For those reading the above message. Please do not use Zadig to install drivers. You should be able to use Pupil driver installers that ship with Pupil Capture. If you are using Zadig, please make backup drivers and or backup your system before fiddling with driver installation.

wrp 05 March, 2018, 02:58:52

@user-6e1816 are you running the latest version of Pupil Capture?

wrp 05 March, 2018, 03:10:46

@user-6e1816 I was not able to recreate the behavior you noted. Could you please try restarting Pupil Capture with default settings?

user-7bc627 05 March, 2018, 03:43:49

@wrp I stuck the surface trackers on my iPhone and was trying it on myself. I am wondering, for the heat map, why is it always showing as red in the recording? I know that some of the time I wasn't looking at my mobile, but why is the heat map still red? Thanks!

wrp 05 March, 2018, 03:44:35

@user-7bc627 did you set the size of the surface?

user-7bc627 05 March, 2018, 03:46:03

Hmm, I am not sure. Now is like this

Chat image

wrp 05 March, 2018, 03:46:35

@user-cbb918 please specify x size and y size and then press the (Re)-calculate button

user-7bc627 05 March, 2018, 03:56:50

I see... hmm... sorry, but how can I know the size? Like, should I specify 30? 50? 100?

wrp 05 March, 2018, 03:58:08

You can specify the dimensions of the surface in pixels (e.g. the resolution of the screen size of your phone). You can supply other dimensions with relative proportions if you like as well. Basically the width and height are required in order to determine the output size of the png that is generated when you export the surface. The width and height also determine the number of bins for the heatmap

wrp 05 March, 2018, 03:59:33

example - iPhone 7 is (375, 667)

user-7bc627 05 March, 2018, 04:04:27

great I see! Another question... the visual attention is always showing in the middle of the left side. Meaning that... it did not track my eyes very well?

Chat image

wrp 05 March, 2018, 04:05:58

Please visualize the gaze position and you can see where you were looking and compare to the generated heatmap. I would suggest re-calibrating and ensuring that you have high confidence pupil detection and accurate gaze mapping.

user-7bc627 05 March, 2018, 04:06:20

ok I got it! Thanks!

wrp 05 March, 2018, 04:12:25

@user-7bc627 welcome 😄

user-c9c7ba 05 March, 2018, 08:51:22

hello,

Is there a good tutorial somewhere on how to adapt Pupil Capture and its algorithms for eye tracking in the HTC Vive?...

I am using the HMD-eyes plugin at the moment, but I am confused about, for example, which detection method to use

user-e02f58 05 March, 2018, 08:56:16

@wrp Hello, I am installing the Windows dependencies and I encounter an error when running python setup.py build in pupil/pupil_src/capture/pupil_detectors

Here is part of the build log from the Visual Studio cmd; it said "cannot open file 'boost_python3-vc140-mt-1_64.lib'"

The boost version it tries to link is incorrect; my computer has boost installed for vc141. Do you know how to link to the correct lib? (i.e. boost_numpy3-vc141-mt-1_64.lib)

Chat image

mpk 05 March, 2018, 09:20:04

@here The Pupil Mobile 200hz issue has been fixed. Please update to latest version (v0.21.0) !

user-f1eba3 05 March, 2018, 13:57:31

👏 👏

user-f1eba3 05 March, 2018, 14:36:16

Can you somehow run the helper scripts

user-f1eba3 05 March, 2018, 14:36:35

withouth the use of pupil capture through an API for example

mpk 05 March, 2018, 14:37:00

@user-f1eba3 you mean access the data with having the Pupil Capture windows?

papr 05 March, 2018, 14:38:49

*without 🙂

user-f1eba3 05 March, 2018, 14:39:50

withouth having the app yes

papr 05 March, 2018, 14:40:11

This is not possible at the moment.

user-f1eba3 05 March, 2018, 14:40:33

do you have an api for example to do something like pupil.init(port to connect)

user-f1eba3 05 March, 2018, 14:40:43

and then address that port with other programs

mpk 05 March, 2018, 14:41:10

@user-f1eba3 well if you don't need the world camera, then you can use Pupil Service

papr 05 March, 2018, 14:42:20

@user-f1eba3 There is no background service that would be able to start the applications via an API, no. The app's API entry point is exposed through Pupil Remote.

papr 05 March, 2018, 14:42:45

But Pupil Remote is only available if Capture/Service is already running.

user-9575cd 05 March, 2018, 15:25:37

Hello, can someone recommend reliable freeware software for stimulus presentation control and synchronisation with Pupil?

user-e7102b 05 March, 2018, 15:31:32

@user-9575cd PsychoPy is one popular free stimulus presentation package. It has both GUI and command line experiment building options.

papr 05 March, 2018, 15:32:41

There is no PsychoPy GUI integration for Pupil that I know of though. But integrating the Pupil Helper scripts into the generated Python code of PsychoPy should be fairly easy.

user-9575cd 05 March, 2018, 15:33:43

thanks, PsychoPy is actually the software I currently use, but I hopelessly got lost trying to connect it with Pupil

user-9575cd 05 March, 2018, 15:34:46

the idea about the helpers is interesting though. By any chance do you know of a source with more information?

papr 05 March, 2018, 15:34:52

Are you modifying a gui-generated experiment or do you build your experiment from scratch using the PsychoPy Python module?

user-9575cd 05 March, 2018, 15:36:07

Currently trying to start from scratch using the eye tracker demo from the workshop, but I constantly get an adlib error

papr 05 March, 2018, 15:37:15

These are the Pupil helper example scripts: https://github.com/pupil-labs/pupil-helpers Most of them are implemented using a loop that calls the socket's recv() method directly. This is problematic in your case since that is a blocking call that makes your ui unresponsive. You will need to check for available data within your ui loop before calling recv(). This way the ui loop stays responsive.
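
The non-blocking pattern papr describes can be sketched like this (a sketch: an inproc socket pair stands in for the Pupil SUB socket so the snippet runs on its own; in a real experiment "sub" would be the socket subscribed to gaze data):

```python
import zmq

ctx = zmq.Context.instance()

# Stand-ins for the Pupil IPC backbone: "pub" plays the role of
# Pupil Capture, "sub" the role of your subscription socket.
pub = ctx.socket(zmq.PAIR)
pub.bind("inproc://demo")
sub = ctx.socket(zmq.PAIR)
sub.connect("inproc://demo")
pub.send_string("gaze datum")

received = []
for _ in range(100):  # pretend this is the PsychoPy draw/ui loop
    # poll(timeout=0) returns immediately, so the ui loop never blocks
    if sub.poll(timeout=0):
        received.append(sub.recv_string())
    # ... draw stimuli, flip the window, check the keyboard, etc. ...

print(received)  # → ['gaze datum']
```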

user-9575cd 05 March, 2018, 15:39:22

Thanks for the direction, Papr! Didn't try helper yet, but it seems good idea

user-9575cd 05 March, 2018, 15:40:07

By the way, did any of you succeed in making a working experiment with Pupil and PsychoPy?

user-9575cd 05 March, 2018, 15:40:25

In one of the recent versions

papr 05 March, 2018, 15:43:01

My bachelor thesis was actually about integrating Pupil into PsychoPy. But the code is over 1.5 years old and is by no means up-to-date... By now I highly recommend the way mentioned above.

papr 05 March, 2018, 15:45:05

@user-9575cd You should also have a look at the Surface Tracker plugin and define a surface around your screen if you want to have gaze in relation to your on-screen stimuli. https://docs.pupil-labs.com/#surface-tracking

user-9575cd 05 March, 2018, 15:49:22

So grateful for the additional ideas and that spark of optimism! 😃 I was losing hope of making PsychoPy work.

user-072005 05 March, 2018, 15:54:57

Did the latest release solve the issues between the mobile app and pupil player?

papr 05 March, 2018, 15:55:25

The latest Pupil Mobile release, correct

user-072005 05 March, 2018, 15:55:52

excellent, I'll continue my research then

user-e7102b 05 March, 2018, 15:56:23

@user-9575cd Alternatively, if you're able to get hold of a copy of MATLAB (not free, but cheap for students and sometimes available for free via University libraries), you can download Psychtoolbox3 (PTB) for free and use that for stimulus control. I have created a working example for controlling pupil_capture using PTB and sending event codes. This method seems to work well, but it is more clunky than just doing everything within python.

user-9575cd 05 March, 2018, 15:59:51

Thanks, Tom. Perhaps it is even possible to have a look at your example somewhere online?

user-e7102b 05 March, 2018, 16:02:14

Thanks @papr .

user-9575cd 05 March, 2018, 16:02:15

Once again, big thanks. You guys are so helpful!

papr 05 March, 2018, 16:02:31

👍

user-f1eba3 05 March, 2018, 16:28:55

where can I better understand the concept of gaze

user-f1eba3 05 March, 2018, 16:28:58

or gazing ?

papr 05 March, 2018, 16:29:39

Do you mean gaze as the term that we usually refer to in the Pupil software?

user-f1eba3 05 March, 2018, 16:30:30

not just as the literal meaning

user-f1eba3 05 March, 2018, 16:30:34

like a definition

user-f1eba3 05 March, 2018, 16:31:07

I am searching for a debate or something similar to fully grasp the idea of gazing

user-f1eba3 05 March, 2018, 16:47:18

So now I managed to understand most of the concepts used in Pupil and how the helpers work

user-f1eba3 05 March, 2018, 16:47:48

How do you think I should proceed in order to build a plugin for Unreal Engine

user-f1eba3 05 March, 2018, 16:47:57

?

papr 05 March, 2018, 16:48:49

You should look for a zmq implementation that you can use in the Unreal Engine.

user-f1eba3 05 March, 2018, 16:57:12

I am also going to take the hmd eyes project as an example

papr 05 March, 2018, 16:58:13

I would recommend that as well!

user-f1eba3 05 March, 2018, 16:58:41

cool

user-cf2773 06 March, 2018, 12:55:50

@Clover#9724 I do use PsychoPy and Pupil together for my experiments. Where are you stuck in the integration?

user-cf2773 06 March, 2018, 12:57:49

I am currently having issues with Pupil Capture 1.4.1. When I calibrate the tracker, the gaze is tracked properly for the first 4 or 5 seconds, and then it gets lost and I need to recalibrate. Any idea on why this is happening? I stay steady in front of the screen, so I am not moving at all.

papr 06 March, 2018, 13:14:45

@user-cf2773 is this with 2d or 3d mapping

user-cf2773 06 March, 2018, 13:14:54

3d

papr 06 March, 2018, 13:15:09

How do the confidence graphs look in the world window?

user-cf2773 06 March, 2018, 13:15:41

usually 100% and steady, but sometimes one eye goes to 0 for no reason

user-cf2773 06 March, 2018, 13:16:07

in the eye camera I can see the pupil tracked properly and the green circle surrounding the eye

papr 06 March, 2018, 13:16:23

And the green circle does not change in position?

user-cf2773 06 March, 2018, 13:16:57

with some calibrations yes, but then I recalibrate and it is generally steady

papr 06 March, 2018, 13:17:47

And is it opaque? The more translucent the green line is, the lower the model confidence.

papr 06 March, 2018, 13:18:43

Can you open the 3d debug window? You can find it in the eye window's pupil detector settings

user-cf2773 06 March, 2018, 13:24:05

Yes it is quite opaque

user-cf2773 06 March, 2018, 13:24:36

And I have the 3d debug windows open, the blue ball seems to be steady

user-cf2773 06 March, 2018, 13:25:46

But for one of the eyes, there are three models competing with each other

papr 06 March, 2018, 13:25:55

What do you mean when you say "it gets lost" (citation from your first message)? Do you mean that the gaze visualization does not show up anymore, or that the tracking is very inaccurate?

user-cf2773 06 March, 2018, 13:26:35

It becomes inaccurate and then it doesn't show anymore, and I get a lot of nan from pupil remote gaze

user-cf2773 06 March, 2018, 13:26:43

Also, often when I do the calibration I get "reflection detected" in the world camera

papr 06 March, 2018, 13:27:40

nan values? Which gaze datum fields are nan?

user-cf2773 06 March, 2018, 13:28:12

gaze_point_3d

user-cf2773 06 March, 2018, 13:30:20

also, the two magenta markers in pupil capture tracking the gaze are rarely overlapping: they diverge more and more

papr 06 March, 2018, 13:34:25

This is a recent change. Low confidence data is mapped monocularly to avoid worsening a binocular result; the previous version discarded such data. The competing models are an indicator of low model confidence. It means that new 2d pupil data does not fit the current 3d model, and therefore new models are built up.

papr 06 March, 2018, 13:35:04

Try to find a different position for the eye camera with the competing models. Sometimes a different point of view yields more stable models.

user-cf2773 06 March, 2018, 13:36:01

I have been trying for quite a while and with different people. Do you have any recommendations on how the cameras should be positioned (e.g., distance from the eyes, in front of or below the eyes, etc.)?

mpk 06 March, 2018, 13:36:51

@user-cf2773 if you are having a hard time we can do a video support session to debug this issue. Best way forward is to write us an email to info[at]pupil-labs.com

user-cf2773 06 March, 2018, 13:38:16

Will do, thanks a lot!

user-d9bb5a 06 March, 2018, 13:38:32

Good afternoon. Can you tell me what I'm doing wrong? Why does my marker jerk and not stop twitching?

papr 06 March, 2018, 13:39:50

Which marker are you talking about?

mpk 06 March, 2018, 13:40:22

@user-d9bb5a please provide a sample video for us to look at. You can share a google drive link in this chat.

user-d9bb5a 06 March, 2018, 13:41:21

oh... one second))) I mean the markers that are needed for the surface

user-d9bb5a 06 March, 2018, 13:42:45

I can send the video tomorrow (I'm in a meeting right now)...

user-d9bb5a 06 March, 2018, 13:45:25

could it be that I used the wrong size?

papr 06 March, 2018, 13:48:07

How many markers do you use to track you surface?

user-d9bb5a 06 March, 2018, 13:48:31

4

user-d9bb5a 06 March, 2018, 13:48:47

or 8

user-d9bb5a 06 March, 2018, 14:00:55

Is it possible to make a heat map from a video without markers?

papr 06 March, 2018, 14:02:16

You mean a heatmap that is relative to a gaze target? No, this is only possible by defining the gaze target as a surface.

user-d9bb5a 06 March, 2018, 14:03:22

That is a great pity. Thanks... then I need to work out why the markers are not detected, or why they twitch.

papr 06 March, 2018, 14:05:39

As @mpk mentioned, the most effective way to help you is to share a recording of the surfaces with us. Until then we would have to guess possible reasons and this is not feasible. 🙂

user-d9bb5a 06 March, 2018, 14:07:19

Yes, tomorrow I will share it with you, since your help is very much needed. Apparently it just got confused. On the old computer everything worked, but on the new one I still cannot get it working (((( Thanks.

user-f1eba3 06 March, 2018, 14:14:55

has anyone used c++ to do a 0mq request to pupil ?

papr 06 March, 2018, 14:15:35

I think the hmd-eyes project uses the c++ zmq library.

user-f1eba3 06 March, 2018, 14:34:00

maybe I am mistaken, but I guess not: https://github.com/pupil-labs/hmd-eyes/blob/master/python_reference_client/zmq_tools.py

user-f1eba3 06 March, 2018, 14:34:26

I will try to make a request using zeromq with c++ somehow

papr 06 March, 2018, 14:40:19

I think the hmd-eyes project uses the c# zmq version. You should probably use this http://zeromq.org/bindings:cpp if you want c++ bindings

papr 06 March, 2018, 14:41:13

Specifically https://github.com/jship/CpperoMQ looks like a good high level lib to use.

papr 06 March, 2018, 14:41:42

I would recommend reimplementing the Pupil Helpers in c++ to get started.

papr 06 March, 2018, 14:42:30

You will also require https://github.com/msgpack/msgpack-c for communication with Pupil
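
A minimal Python sketch of the two pieces such a port needs — msgpack for the payload and a zmq REQ socket for the transport — mirroring the conventions of the pupil-helpers `remote_control.py` script. The function names `pack_notification`/`send_notification` are illustrative, not from the helpers repo; port 50020 is Pupil Remote's default and may differ on your setup:

```python
import msgpack


def pack_notification(subject, **kwargs):
    """Build the two frames Pupil Remote expects for a notification:
    a topic string ('notify.' + subject) and a msgpack-encoded dict
    that contains at least the 'subject' key."""
    payload = dict(subject=subject, **kwargs)
    return "notify." + subject, msgpack.packb(payload, use_bin_type=True)


def send_notification(addr, subject, **kwargs):
    import zmq  # imported here so the packing helper stays testable offline
    topic, payload = pack_notification(subject, **kwargs)
    socket = zmq.Context.instance().socket(zmq.REQ)
    socket.connect(addr)
    socket.send_string(topic, flags=zmq.SNDMORE)  # frame 1: topic string
    socket.send(payload)                          # frame 2: msgpack payload
    return socket.recv_string()                   # Pupil Remote acknowledges


if __name__ == "__main__":
    print(send_notification("tcp://127.0.0.1:50020",
                            "recording.should_start", session_name="demo"))
```

A C++ port would reproduce exactly this frame layout with msgpack-c and a zmq REQ socket.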

user-f1eba3 06 March, 2018, 14:43:31

@papr i was also thinking about reimplementing the filter_messages helper as well

papr 06 March, 2018, 14:44:40

That is a good starting point, because you do not need any msgpack up to line 55 — it sends simple byte strings instead.

user-f1eba3 06 March, 2018, 14:46:24

great idea

user-f1eba3 06 March, 2018, 14:46:44

what is with these 2 lines in remote_control: if __name__ == '__main__': and from time import sleep, time ?

papr 06 March, 2018, 14:51:40

The first line is a Python convention that you can ignore. The second line just imports the system sleep and time functions such that we can use them in the script.

user-f1eba3 06 March, 2018, 14:54:02

I thought it was something like #ifndef

user-f1eba3 06 March, 2018, 14:54:04

in c

papr 06 March, 2018, 14:57:02

If you import a python file the interpreter runs all of its code. This line avoids running the code within the if statement during such an import. It is only executed if the file is called directly. In that case the file's magic __name__ variable is set to '__main__' instead of the file's actual name.
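
The guard papr describes, in a tiny standalone sketch (the function name is made up for illustration): `greet()` is usable from an import, while the `print()` only fires when the file is executed directly.

```python
def greet():
    # __name__ is '__main__' when the file is run directly,
    # otherwise it is the module's own name
    return "hello from " + __name__


if __name__ == "__main__":
    # only reached via e.g. `python this_file.py`, never via `import`
    print(greet())
```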

user-f1eba3 06 March, 2018, 15:41:14

Would you recommend this https://github.com/jship/CpperoMQ over ZeroMqs bindings

user-f1eba3 06 March, 2018, 15:41:17

?

user-e7f18e 06 March, 2018, 15:41:34

Can somebody help me run pupil-capture service properly? It is not working properly for me and I am not able to fix the problem

user-f1eba3 06 March, 2018, 15:41:35

I mean, it has not been maintained for 2 years now

papr 06 March, 2018, 15:42:30

@user-f1eba3 Oh, I did not see that it has been stale for that long. I would go for the most active one

papr 06 March, 2018, 15:42:45

@user-e7f18e What exactly does not work?

user-e7f18e 06 March, 2018, 15:43:32

eye1 is not catching the red circle on the pupil. I am not able to find out whether it is a hardware problem with the HTC Vive

papr 06 March, 2018, 15:44:14

@user-e7f18e But you are able to see the eye1 video?

user-e7f18e 06 March, 2018, 15:44:23

yes

papr 06 March, 2018, 15:44:33

So your problem is bad pupil detection?

papr 06 March, 2018, 15:44:53

Can you share a screenshot of your eye1 window with us?

user-e7f18e 06 March, 2018, 15:45:03

exactly — that is why the hmd-eyes demo for the 3D scene is not as accurate as shown by the hmd-eyes people

user-e7f18e 06 March, 2018, 15:45:08

sure

user-f1eba3 06 March, 2018, 15:45:25

so would you recommend I rather use this : https://github.com/zeromq/cppzmq?

user-e7f18e 06 March, 2018, 15:45:44

@papr

Chat image

papr 06 March, 2018, 15:46:30

@user-f1eba3 I did not use it personally, but this looks like the official c++ version by the zmq people. I would give it a try

papr 06 March, 2018, 15:49:42

@user-e7f18e I see two issues: 1) Try to move the cameras such that the eye is more centered. You can do so by adjusting the lens-to-eye distance of the hmd. 2) You need more contrast between pupil and iris. Solving step 1) might solve 2) already. In case that it does not: Try playing around with the gamma and gain controls in the UVC Source menu.

user-e7f18e 06 March, 2018, 16:14:07

I changed the distance of lenses right now but step 2 I have no idea how to do that

user-e7f18e 06 March, 2018, 16:14:23

could you please help @papr

papr 06 March, 2018, 16:14:58

Sure, no problem. Would you mind sharing another screenshot with the new eye positions?

user-e7f18e 06 March, 2018, 16:15:29

Sure

papr 06 March, 2018, 16:15:43

You need to open the UVC source settings to adjust the uvc controls. You can do so by clicking on the camera icon on the right of the eye windows.

papr 06 March, 2018, 16:19:46

Then expand the Image Post Processing sub menu. You should see a lot of sliders for different options. I would recommend to increase the gain and gamma values slightly. The exact values depend on your exact lighting conditions. Adjust them such that the pupil has a high contrast and the pupil detection works well.

user-e7f18e 06 March, 2018, 16:22:05

current scenario.

Chat image

user-e7f18e 06 March, 2018, 16:23:27

looks proper now let me change the settings you said.

user-e7f18e 06 March, 2018, 16:30:33

@papr is it ok with gain 5 and gamma 110.

Chat image

papr 06 March, 2018, 16:31:03

Do you use the 120Hz or the 200Hz eye cameras?

user-e7f18e 06 March, 2018, 16:31:54

120

papr 06 March, 2018, 16:32:19

eye1 looks out of focus. You can adjust the focus by rotating the lens if you use the 120Hz system.

user-072005 06 March, 2018, 16:42:00

Sorry, I read the docs on the calibration and am still a little confused on the workflow. I plan on using the pupil mobile app to study cyclists. Do I need to calibrate every time I start recording or can I save a calibration file for the start of the ride and record a series of short recordings throughout the ride and use the first recording's calibration file for all of them? (I'm assuming here that the glasses don't get moved between recordings)

papr 06 March, 2018, 17:46:00

@user-072005 It is currently not implemented to export/import calibrations from one recording into another. Even if it were, it would only work if your assumption that the glasses do not move held — and I doubt the glasses do not move during cycling. I would recommend doing a calibration at the start of each recording.

user-381730 06 March, 2018, 19:48:55

Hello there! I just have a small question, it's probably a bit stupid but I gotta ask.

My supervisor has made some modifications to Pupil and so he sent me his source code. Now I need to build the program from the code, and if I'm understanding this correctly, I need to install all the dependencies listed here?: https://docs.pupil-labs.com/#windows-dependencies I've been installing stuff for 2 hours now... is this the correct way to do it?

papr 06 March, 2018, 19:50:45

@user-381730 Unfortunately, this sounds like the way to go. Although it is much easier to install the dependencies on Mac or Linux...

user-381730 06 March, 2018, 19:51:40

Ah yes, I usually dual boot with Linux, but Windows decided to "repair" my disk, making my Linux partition unusable. Don't have the time to do a full reinstall now, unfortunately.

papr 06 March, 2018, 19:52:51

Alternatively, you could try and extract your supervisor's changes into a plugin and use the bundle. I honestly think that you save the most time by reinstalling ubuntu and the linux dependencies 😕

user-381730 06 March, 2018, 19:54:18

Probably, though I use Manjaro(Arch) which would make the installation more difficult anyway. :/ I guess doing a plugin is possible..but I just needed to try a few things with his version. Didn't realize it would be this difficult to install.

user-e7102b 06 March, 2018, 19:57:18

Hello, would it be possible to have an individual's calibration settings persist if pupil_capture gets shut down or pupil headset gets temporarily unplugged/replugged? Currently it seems like they get erased.

papr 06 March, 2018, 20:02:10

@user-381730 yeah, sorry about that, but Windows is very tedious in that regard. Are you allowed to share his changes with me? I could tell you if they are easily convertible to a plugin.

papr 06 March, 2018, 20:02:38

And probably also give hints on how to do that

papr 06 March, 2018, 20:06:30

@user-e7102b the calibration is persisted in the gaze mapping plugin. It should be session- and disconnect-persistent

papr 06 March, 2018, 20:07:14

@user-e7102b is there any indication that Pupil Capture does not terminate correctly?

user-e7102b 06 March, 2018, 20:10:59

@papr Hmm, OK I haven't tested this extensively, but I'm midway through a pilot study at the moment and we disconnected/reconnected the tracker and the calibration went bad.

papr 06 March, 2018, 20:13:32

Ah, there is a difference! I am very sure that the calibration itself (gaze mapping function parameters) stay the same. Do you use 2d or 3d detection? Why do you disconnect the tracker explicitly? Does the subject do anything during the disconnected time?

user-e7102b 06 March, 2018, 20:20:02

We use 2D detection. We run some very long studies so it can be useful to disconnect the participant to allow them to move around, go to the bathroom etc. I understand that there is the chance that the participant could knock the eye-tracker and this could disrupt the calibration, but on this occasion I made sure this didn't happen.

papr 06 March, 2018, 20:21:35

@user-e7102b 2d calibration is very prone to slippage. If you do long recordings it is highly recommended to do multiple re-calibrations

papr 06 March, 2018, 20:22:18

There is always some kind of slight slippage that builds up over time.

papr 06 March, 2018, 20:23:33

@user-381730 What happens if you execute the opencv exe?

user-e7102b 06 March, 2018, 20:24:51

Sure, we do plan to build multiple calibrations into our protocol.

user-381730 06 March, 2018, 20:26:39

@papr, it extracted. :) I think I have managed to install everything now, but some files have newer versions than those specified in the guide; I hope it works anyway. Though I'm having trouble starting the application now — when I run main I get error messages that I'm missing () on calls to print. Shouldn't I use Python 3? I'm kind of a beginner at Python so I'm not sure.

user-e7102b 06 March, 2018, 20:26:45

Would you recommend we use 3D tracking? We're just having participants view stimuli on a monitor, so 2D seemed like the better option. I tried 3D tracking but this seemed less accurate.

papr 06 March, 2018, 20:31:07

@user-381730 Yes, you need Python 3! Python 2 is not compatible with our code

papr 06 March, 2018, 20:31:47

@user-e7102b Do you require the data for online interaction?

user-381730 06 March, 2018, 20:32:06

Yeah, I'm using 3.7 at the moment; doesn't seem to work though.

I'm starting to wonder if my supervisor sent me the correct code — it didn't have any main.py in the pupil_src folder...

user-e7102b 06 March, 2018, 20:32:40

No online interaction at the moment - just passive recording.

papr 06 March, 2018, 20:33:31

@user-381730 3.7 is very new 😄 Did not try that yet but should work in theory. Maybe it is a very old version of Pupil?

papr 06 March, 2018, 20:35:24

@user-e7102b Then I would recommend 2d tracking and calibration, but only for monitor/control usage. Record the eye videos and include the calibration procedure in the recording. Then you will be able to use offline calibration and also profit from future pipeline improvements.

user-381730 06 March, 2018, 20:36:55

Maybe, not sure actually, probably a year old at least.

But I just cloned the official repo and tried running it — definitely something wrong with the versions;

user-381730 06 March, 2018, 20:37:01

Chat image

papr 06 March, 2018, 20:38:09

Did you call git clone or did you click the Download zip button on github?

user-381730 06 March, 2018, 20:38:23

I just took the zip 😛

user-e7102b 06 March, 2018, 20:38:49

@papr Thanks for the advice - I'll make sure I record the calibrations.

papr 06 March, 2018, 20:39:50

Yeah, thought so 😛 The zips do not include the git information for some reason. You will have to explicitly clone the repository. You should have a git shell (comes with the git installation) on your computer. Use it to clone the repository

user-381730 06 March, 2018, 20:40:27

A yeah I know how git works, but then this could be why the modded version doesn't work as well.. hnn

papr 06 March, 2018, 20:40:56

Is the modded version on github?

user-381730 06 March, 2018, 20:41:01

nope

user-6952ca 06 March, 2018, 20:42:50

Hello! I'm having difficulty getting the pupil glasses to function as a mouse. I have the mouse_control.py code from the Helpers repo, and I have seen the demonstration on YouTube. Any basic steps to replicate this?

papr 06 March, 2018, 20:43:24

@user-6952ca Did you setup the required surface?

user-6952ca 06 March, 2018, 20:43:59

I did! I used 8 of the markers to outline a monitor.

papr 06 March, 2018, 20:44:15

Ok nice! Did you rename the surface as well?

user-6952ca 06 March, 2018, 20:44:24

Yep!

papr 06 March, 2018, 20:44:43

Did you do a calibration?

user-6952ca 06 March, 2018, 20:44:53

Yep

papr 06 March, 2018, 20:45:38

Are you receiving any events in the script?

user-6952ca 06 March, 2018, 20:51:30

there is only one line of readout: x_dim: 2560, y_dim: 1440

papr 06 March, 2018, 20:53:39

Can you print out the value of gaze_position in line 56? Is there any output?

user-381730 06 March, 2018, 20:56:58

Still trying to get the official repo working. When I run the main file, it keeps saying that the GLFW library can't be found, but I have triple-checked the file paths and all; still can't get it to work.

papr 06 March, 2018, 20:57:58

What is the exact error message?

user-381730 06 March, 2018, 20:58:25

Chat image

user-381730 06 March, 2018, 20:59:18

I had the same issue with glew but I noticed that the filepaths were wrong so I fixed it, but I still get this one :S

papr 06 March, 2018, 20:59:26

@user-381730 Did you set up a virtual env? Otherwise use python3 main.py please

user-6952ca 06 March, 2018, 20:59:31

@papr getting a constant stream of outputs while looking at the tracked surface

user-381730 06 March, 2018, 21:00:07

Nope, but python3 doesn't work — the command is not recognized even though it is on the PATH :S

papr 06 March, 2018, 21:00:09

@user-6952ca Nice! Could you post a single line in here?

papr 06 March, 2018, 21:00:32

@user-381730 You will definitely need to fix your python 3 installation first....

papr 06 March, 2018, 21:01:11

Did you reboot?

user-381730 06 March, 2018, 21:01:25

Nope, maybe I should try.

papr 06 March, 2018, 21:01:32

🤞

user-381730 06 March, 2018, 21:01:32

Cause it is added;

user-381730 06 March, 2018, 21:01:49

Chat image

papr 06 March, 2018, 21:02:45

@user-381730 I believe you. With "fix your installation" I meant to get it working. I do not have enough insight into the magic world of pain that is called Windows to help you there. =/

user-e7102b 06 March, 2018, 21:04:11

@papr Also, another question. I'm sending commands to pupil remote (running on a machine with Mac OS X High Sierra) from a stimulus display machine (running Mac OS X Sierra) via an ethernet cable directly linking the two machines (I've set the IP address to be static for this connection). I've noticed that on a couple of occasions the IP address for pupil remote has changed, which of course means that the pupil middleman server fails to connect. The eye-tracker machine is also connected to the local network via a second ethernet connection, so perhaps this is confusing pupil remote? It seems that restarting the machine resets the pupil remote address back to the original (correct) address. Is it possible to set the pupil remote address to a static address?

user-6952ca 06 March, 2018, 21:06:08

@papr This is probably not exactly one line, but here is some of the output: {'topic': 'pupil', 'confidence': 1.0, 'ellipse': {'center': [82.98399353027344, 99.7783203125], 'axes': [33.62591552734375, 37.53098678588867], 'angle': 22.96988868713379}, 'diameter': 37.53098678588867, 'norm_pos': [0.2593249797821045, 0.5842569986979167], 'timestamp': 28123.65402, 'method': '2d c++', 'id': 1}]}, 'timestamp': 28123.652015}], 'timestamp': 28123.644999, 'camera_pose_3d': None}

papr 06 March, 2018, 21:07:25

@user-6952ca For some reason you are receiving pupil data instead of surface gaze data. Can you check line 44? What does it say?

user-b5e0c9 06 March, 2018, 21:08:38

Yeah, I just tried rebooting, didn't help I'm afraid.

Chat image

user-b5e0c9 06 March, 2018, 21:09:07

God just why did it have to mess up my Linux installation this day of all days. :/

user-6952ca 06 March, 2018, 21:09:30

sub.setsockopt_string(zmq.SUBSCRIBE, 'surface')
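
For reference, ZMQ SUB subscriptions are plain prefix filters, so that line should only let topics starting with 'surface' through (Pupil publishes surface events under "surfaces.<name>"). A pure-Python sketch of the matching rule — `topic_matches` is an illustrative helper, useful for double-checking that no second SUBSCRIBE with an empty string is active elsewhere in the script:

```python
def topic_matches(subscription: str, topic: str) -> bool:
    # libzmq's SUB-side filter: a message passes if its topic frame
    # starts with the subscription bytes; '' matches everything
    return topic.startswith(subscription)


assert topic_matches("surface", "surfaces.screen")   # surface gaze passes
assert not topic_matches("surface", "gaze.2d.01.")   # plain gaze is dropped
assert topic_matches("", "pupil")                    # '' lets everything in
```

If gaze or pupil data still arrives, look for another subscription (often the empty string) registered on the same socket.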

papr 06 March, 2018, 21:09:33

@user-e7102b Yes, that should be possible. Disable the User primary network interface option in the Pupil Remote interface and set the ip address + port

user-b5e0c9 06 March, 2018, 21:10:29

But I can start the Python shell separately though, still get errors about missing GLFW

papr 06 March, 2018, 21:10:34

@user-b5e0c9 Oh, alright. python executes python 3. That is good! One issue less 😃

user-b5e0c9 06 March, 2018, 21:11:06

yeah 😄

user-b5e0c9 06 March, 2018, 21:11:31

I don't have any other Python version installed so it shouldn't be able to run anything else

papr 06 March, 2018, 21:11:54

Yeah, you showed it with the first command in the screenshot 👍

user-b5e0c9 06 March, 2018, 21:12:37

Maybe I can try and add GLFW to my path instead?

papr 06 March, 2018, 21:12:43

Did you repeat the glfw-lib copy part?

user-b5e0c9 06 March, 2018, 21:13:01

Yep, also tried redownloading in case it got corrupted or something.

papr 06 March, 2018, 21:13:34

Can you show me the contents of your pupil_external folder?

user-b5e0c9 06 March, 2018, 21:13:47

Sure

user-b5e0c9 06 March, 2018, 21:14:04

Chat image

papr 06 March, 2018, 21:15:43

Looks good. And that is the one in the repository that you manually cloned, correct? Sorry for the dumb questions, but I want to make sure that we do not overlook anything by accident

user-b5e0c9 06 March, 2018, 21:16:38

No, I cloned it from git (with the Windows Bash tool) so it should be correct.

user-6952ca 06 March, 2018, 21:17:13

@papr here is a better pasting of the output, so I think gaze position is there. sorry for the confusion. {'topic': 'gaze.2d.01.', 'norm_pos': [-0.8056872850215058, -1.9805976641233527], 'confidence': 0.8257696120612803, 'timestamp': 28773.085415, 'base_data': [{'topic': 'pupil', 'confidence': 0.6615392241225607, 'ellipse': {'center': [119.89435577392578, 88.1365737915039], 'axes': [39.73045349121094, 96.71500396728516], 'angle': 94.72855377197266}, 'diameter': 96.71500396728516, 'norm_pos': [0.3746698617935181, 0.6327642758687337], 'timestamp': 28773.08761, 'method': '2d c++', 'id': 0}, {'topic': 'pupil', 'confidence': 0.99, 'ellipse': {'center': [230.96566772460938, 190.4365997314453], 'axes': [14.059481620788574, 17.84821128845215], 'angle': 118.0363540649414}, 'diameter': 17.84821128845215, 'norm_pos': [0.7217677116394043, 0.20651416778564458], 'timestamp': 28773.08322, 'method': '2d c++', 'id': 1}]

papr 06 March, 2018, 21:17:27

Oh that is what I meant by manually cloned. Sorry for the ambiguity.

user-b5e0c9 06 March, 2018, 21:19:02

No problem. I now tried adding glfw-3.2.1.bin.WIN64/lib-vc2015 to the path but it didn't seem to help

user-b5e0c9 06 March, 2018, 21:20:20

Are these paths correct?

user-b5e0c9 06 March, 2018, 21:20:25

Chat image

user-b5e0c9 06 March, 2018, 21:20:57

"pupilScripts" is my "work" folder

papr 06 March, 2018, 21:22:10

@user-6952ca That is a normal gaze datum. You should not be receiving it...

papr 06 March, 2018, 21:28:04

@user-b5e0c9 Whats the exact path to your pupil externals folder?

user-b5e0c9 06 March, 2018, 21:29:48

C:\pupilScripts\pupil\pupil_external

user-6952ca 06 March, 2018, 21:30:32

@papr here is the error I am getting:
Traceback (most recent call last):
  File "mouse_control.py", line 83, in <module>
    move_mouse(x, y)
  File "mouse_control.py", line 22, in move_mouse
    m.move(x, y)
  File "/path/python3.6/site-packages/pymouse/x11.py", line 128, in move
    fake_input(d, X.MotionNotify, x=x, y=y)
  File "/path/python3.6/site-packages/Xlib/ext/xtest.py", line 103, in fake_input
    y = y)
  File "/path/python3.6/site-packages/Xlib/protocol/rq.py", line 1347, in __init__
    self._binary = self._request.to_binary(*args, **keys)
  File "/path/python3.6/site-packages/Xlib/protocol/rq.py", line 1069, in to_binary
    static_part = struct.pack(self.static_codes, *pack_items)
struct.error: required argument is not an integer

papr 06 March, 2018, 21:32:15

@user-6952ca do you get that error consistently? can you print x and y in line 81?

papr 06 March, 2018, 21:34:51

@user-6952ca I think I know the issue. Please change line 82 to move_mouse(int(x), int(y))
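
The struct.error arises because X11's fake_input requires integer pixel coordinates while Pupil delivers normalized floats, hence the int() cast. A sketch of the full conversion — `denormalize` is an illustrative helper, not the actual function in mouse_control.py, and it assumes the conventional bottom-left origin of Pupil's normalized surface coordinates:

```python
def denormalize(norm_pos, x_dim, y_dim):
    """Map a normalized (0..1) surface gaze position to integer pixels."""
    x, y = norm_pos
    # flip y: surface coordinates have their origin at the bottom left,
    # screen/mouse coordinates at the top left
    return int(x * x_dim), int((1.0 - y) * y_dim)


# e.g. gaze at the centre of the 2560x1440 monitor from the log above:
# denormalize((0.5, 0.5), 2560, 1440) -> (1280, 720)
```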

user-6952ca 06 March, 2018, 21:36:55

@papr Yep, that did the trick! thanks for your help! 😁

papr 06 March, 2018, 21:38:08

No problem

papr 06 March, 2018, 21:39:09

Thank you for finding that issue! I committed a fix to the repository

user-b5e0c9 06 March, 2018, 22:00:30

Ok, I will downgrade to Python 3.6 and do all the steps again using the official repo, I probably missed something somewhere.

user-e00b32 06 March, 2018, 22:42:33

Hi all. New to Pupil and Discord. I just collected a big data set using the binocular Pupil headset, but it looks like a lot of gaze_positions were dropped (e.g. only 1892 out of 1984 frames were accounted for). The pupil_positions look fine (e.g. on average ~7 recordings per frame). World video is being recorded at 30 Hz and pupil is being recorded at 120 Hz. Is there a simple way to remap the gaze positions offline (like a player plugin or something)? Additionally, if there is a more appropriate place to ask this question, please let me know.

user-723401 07 March, 2018, 00:57:00

Hello,

user-723401 07 March, 2018, 00:58:05

I have just received my pupil labs eyetracking set. I need to implement it with my HTC vive headset. Where do I start?

wrp 07 March, 2018, 02:29:16

Hi @user-723401 please start with docs here: https://docs.pupil-labs.com/#htc-vive-add-on

wrp 07 March, 2018, 03:00:48

@user-e00b32 Welcome to the chat/community 😸 Question - did you record the calibration procedure as well? If so, then you can run pupil detection with different paramaters in Pupil Player and re-calibrate as well.

user-3a0646 07 March, 2018, 07:41:26

It's a new day and a new attempt at building from source

user-3a0646 07 March, 2018, 07:42:18

I'm having issues when running bootstrap.bat. That should generate b2.exe. When I run the bat file, it fails. If I look at the generated logs it says "c1: fatal error C1083: Cannot open source file: 'yyacc.c': No such file or directory "

user-3a0646 07 March, 2018, 07:43:06

Shouldn't that file be included with Boost?

user-6e1816 07 March, 2018, 08:12:10

@wrp I'm sorry I didn't make myself clear, I am running the latest version of Pupil Capture, the problem as shown in the video

user-3a0646 07 March, 2018, 08:16:35

Hmm, the solution seems to be to download the ZIP and not the EXE to extract things

wrp 07 March, 2018, 08:19:08

@user-6e1816 thanks for the video. I just tried the same steps as you (with the bundle - not from source on Ubuntu 16.04) but was not able to replicate the crash

wrp 07 March, 2018, 08:19:48

@user-6e1816 did you try starting again with default settings - delete capture settings

wrp 07 March, 2018, 08:20:39

@user-3a0646 if you have notes or suggestions for dev instructions please make an issue in pupil-docs repo: https://github.com/pupil-labs/pupil-docs

user-6e1816 07 March, 2018, 08:24:28

@wrp just click General Settings - Restart with default settings?

wrp 07 March, 2018, 08:25:42

Yes, you can click the Restart with default settings button. Since you're running from source you can also delete the capture_settings dir in your pupil repo.

wrp 07 March, 2018, 08:26:04

@user-6e1816 did you also try recreating this issue when running from app bundle?

user-3a0646 07 March, 2018, 08:32:39

@wrp I will do that once I have figured out on how to get it working. 😄

user-3a0646 07 March, 2018, 08:33:04

There is some stuff that is kinda unclear 😛

user-6e1816 07 March, 2018, 08:37:11

@wrp starting again with default settings doesn't work, but I also can't replicate this crash when running from the app bundle

user-3a0646 07 March, 2018, 08:48:04

I managed to do every step of the process now, but running the main.py file causes this error

user-3a0646 07 March, 2018, 08:48:29

Chat image

user-3a0646 07 March, 2018, 08:52:29

I have triple checked that I got the correct files in the correct locations. :/

wrp 07 March, 2018, 08:53:05

@user-3a0646 use the .bat file to run capture

user-3a0646 07 March, 2018, 09:03:26

Aha, hmm tried it, now get some other errors :S

user-3a0646 07 March, 2018, 09:04:02

Chat image

wrp 07 March, 2018, 09:08:08

@user-3a0646 I would suggest actually trying to build the optimization_calibration - python pupil_src/shared_modules/calibration_routines/optimization_calibration/setup.py build

wrp 07 March, 2018, 09:08:32

and python pupil_src/shared_modules/pupil_detectors/setup.py build

wrp 07 March, 2018, 09:08:46

so you can isolate errors prior to running the .bat

user-3a0646 07 March, 2018, 09:08:54

alright, will try it

wrp 07 March, 2018, 09:09:27

Note - that these can take quite some time to build

user-3a0646 07 March, 2018, 09:09:45

Alright, well they didn't as they failed instantly 😄

wrp 07 March, 2018, 09:09:52

ok

user-3a0646 07 March, 2018, 09:10:09

Chat image

wrp 07 March, 2018, 09:10:55

hmn... maybe cd into this dir first

wrp 07 March, 2018, 09:11:03

and then just run python setup.py build

user-3a0646 07 March, 2018, 09:12:01

Which directory?

user-3a0646 07 March, 2018, 09:12:23

oh wait

user-3a0646 07 March, 2018, 09:12:25

I see

wrp 07 March, 2018, 09:12:30

ok

user-3a0646 07 March, 2018, 09:13:45

Chat image

user-3a0646 07 March, 2018, 09:13:55

Can't find eigen...hmm

user-3a0646 07 March, 2018, 09:15:35

Chat image

user-3a0646 07 March, 2018, 09:37:13

Running python pupil_src/shared_modules/calibration_routines/optimization_calibration/setup.py build now works but getting errors when running python pupil_src/shared_modules/pupil_detectors/setup.py build I have double checked that cython is installed

user-3a0646 07 March, 2018, 09:46:42

Chat image

wrp 07 March, 2018, 09:48:14

cython version?

wrp 07 March, 2018, 09:48:31

also try running from within the dir

user-3a0646 07 March, 2018, 09:52:16

I have installed Cython‑0.27.3‑cp36‑cp36m‑win_amd64.whl

user-3a0646 07 March, 2018, 09:53:41

Hmm I tried changing folder as you said, does this mean that it worked?

user-3a0646 07 March, 2018, 09:53:45

Chat image

user-9325d9 07 March, 2018, 09:55:14

Hi all, is there a way to use the new pupil cameras outside of pupil (e.g. with directshow)?

user-3a0646 07 March, 2018, 09:55:40

But if I try to run run_capture, I still get errors that it can't find DLLs

user-3a0646 07 March, 2018, 09:55:45

Chat image

wrp 07 March, 2018, 09:59:02

@user-3a0646 It looks like the detectors did not build in this case

wrp 07 March, 2018, 09:59:27

@user-3a0646 try clearing the build dir within pupil_detectors

wrp 07 March, 2018, 10:00:06

and run python setup.py build again

user-3a0646 07 March, 2018, 10:01:48

I did as you said, now it is building, taking a good amount of time

wrp 07 March, 2018, 10:01:58

ok, that is progress

wrp 07 March, 2018, 10:02:20

@user-9325d9 what are you trying to achieve exactly?

wrp 07 March, 2018, 10:03:21

@user-3a0646 you may need to hit enter a few times in the cmd prompt if ceres/glog is hanging

user-3a0646 07 March, 2018, 10:04:17

Alright, well the build process crashed, saying it ran out of space but I have 50gb free :S

wrp 07 March, 2018, 10:04:50

that's an error I have not seen before @user-3a0646

user-3a0646 07 March, 2018, 10:04:59

Chat image

user-3a0646 07 March, 2018, 10:07:57

I tried doing it again and I think it worked this time, will try the bat again

user-3a0646 07 March, 2018, 10:08:48

Nope

user-3a0646 07 March, 2018, 10:08:56

Still missing the detector

wrp 07 March, 2018, 10:09:01

@user-3a0646 the cmd prompt shows that build failed

user-9325d9 07 March, 2018, 10:09:24

@wrp we are trying to read the video stream captured with the high-speed cameras, outside of Pupil (we use a different framework)

user-3a0646 07 March, 2018, 10:11:46

Yeah I know, but I tried rerunning it and this time I didn't get the space error. Then I tried the bat again; it still says the detector DLL is missing 😦

user-3a0646 07 March, 2018, 10:14:09

Here is the CMD output for first building the detector and then running the bat : https://gist.github.com/anonymous/01707959e811f16e730fd96c5028b44e

user-3a0646 07 March, 2018, 10:27:29

Tried rebuilding multiple times now, still the same error :S

user-f1eba3 07 March, 2018, 12:12:47

@papr after some struggles: the library https://github.com/jship/CpperoMQ is too deprecated to work with (I would have to change to c++11 if I wanted to use it). Thought it might be useful information for future development

user-e7f18e 07 March, 2018, 12:56:51

can anybody help me with the settings of pupil capture service

user-02de58 07 March, 2018, 13:09:58

So I tried installing it on another machine now; still got problems, but not the same ones. It says I'm missing the pyav module, but it is installed — I have double checked.

user-02de58 07 March, 2018, 13:10:05

(It's me, Kalle)

Chat image

user-e7f18e 07 March, 2018, 13:35:51

It would be a great help if anybody from Pupil Labs could help me with the settings of pupil capture service before calibration by taking remote control of my machine

mpk 07 March, 2018, 13:40:59

@user-e7f18e please write an email in info[at]pupil-labs.com to schedule video support.

user-02de58 07 March, 2018, 15:12:04

Seems like more people than me have issues with av: https://stackoverflow.com/questions/49153792/python-missing-dll-from-installed-module

user-6952ca 07 March, 2018, 15:24:11

After messing around with mouse_control.py, I am trying to improve the accuracy of the mouse cursor movements. I have a few questions: 1) What calibration/camera settings should I use for optimal performance (e.g., 2d or 3d)? 2) Correcting for eye blinks: whenever an eye blink occurs, the position of the cursor moves drastically. 3) Ideal surface tracking settings: currently, I am using a 32" computer monitor at eye level and standing a few steps away. I created a Google Slide show in which I placed the fiducial markers around the perimeter of the slides and moved a fixation cross around the screen.
Thanks for all the outstanding assistance you all have been providing!

user-e7f18e 07 March, 2018, 15:35:56

@mpk i sent the mail into info[at]pupil-labs.com and waiting for the reply, the mail id is info@pupil-labs.com right?

mpk 07 March, 2018, 15:36:29

Yes. We got the mail. I'll reply in a bit!

mpk 07 March, 2018, 15:39:07

@user-6952ca I would try 2d and 3d and use the better one. Eye blinks can be filtered via confidence: just discard all data with confidence lower than 0.9 in the pupil helpers script. Your surface tracking setup seems OK. Sometimes printed markers work better.

user-e00b32 07 March, 2018, 16:23:07

@wrp Thanks for the welcome and the response! Unfortunately, we did not start recording the calibration procedure as a video until later in the data set collection... I've done some exploring, but want to confirm - the calibration coefficients are not saved anywhere?

user-e7f18e 07 March, 2018, 16:32:55

@mpk I am still at the office and waiting for your reply — could you please let me know the timing?

mpk 07 March, 2018, 16:33:32

I am not in the office anymore. I'll reply tomorrow. Sorry

user-e7f18e 07 March, 2018, 16:34:50

Could you please tell me your office timing tomorrow? I need to set-up the plan accordingly

user-3a0646 07 March, 2018, 16:59:06

Still nobody here who knows why Python can't find the av module?

user-f1eba3 07 March, 2018, 17:16:16

did someone from this chat develop the hmd eyes app for Unity? and if so can we chat a little bit 😄 ?

mpk 07 March, 2018, 17:31:22

@user-f1eba3 that would be @user-e04f56 in hmd eyes chat

user-e7102b 07 March, 2018, 17:40:38

Hi, just some feedback that might be helpful for you guys: I just upgraded to the latest version of pupil_capture (v1.5) and my pupil_remote setup, which was working perfectly, stopped working. I made sure all the settings were exactly the same, so this was quite puzzling. After much troubleshooting, the solution was to delete the pupil_capture_settings folder in the user directory (I'm running Mac OS X High Sierra) before restarting Pupil Capture.

user-072005 07 March, 2018, 17:41:19

Does the pupil mobile app only record for about 5 minutes? Is it possible to record for longer?

papr 07 March, 2018, 17:51:42

@user-072005 Pupil Mobile is able to record for much longer. Please make sure that you have enough storage available.

papr 07 March, 2018, 17:53:21

@user-e7102b upgrading usually removes/ignores the old settings. Therefore you automatically start with the default settings after upgrading.

user-e7102b 07 March, 2018, 18:00:10

@papr yes, all the settings were reset to default when I upgraded, however pupil-remote just would not work despite me ensuring the IP address/port were set up exactly as before. When I manually deleted the pupil_capture_settings folder and restarted pupil_capture, it magically started working.

user-072005 07 March, 2018, 18:31:08

Can I replace the cord that comes with the eye trackers with another usb-C to USB cord?

mpk 07 March, 2018, 18:33:57

@user-072005 yes, any usb-c cable of decent build quality will work.

user-072005 07 March, 2018, 19:24:24

great, thanks

user-072005 07 March, 2018, 20:16:48

I added a 256 GB micro SD to my phone and set it to save to the SD card. But it seems to shut off randomly still. It should have sufficient memory for a short (~5-10 min) bike ride (I'm studying cyclists). I thought maybe it's because the phone was in the rider's pocket, but the file didn't save properly when this happened so I think it isn't a press of the record button stopping it. When the problem occurs, the info file is missing from the folder. It doesn't always cut the recording early either. What could be the cause of this?

user-072005 07 March, 2018, 20:17:01

Oh and I'm using pupil mobile

user-3a0646 07 March, 2018, 20:40:41

Odd, trying to build from source here. I now get an assertion exception about my uvc version. The latest release on git is 0.11, but in the python code there is an assertion that the version must be 0.13 or higher. If I remove the assertion, I get another (real) exception saying that my pyndsi version is too old, and that too is the latest. :S

user-3a0646 07 March, 2018, 20:43:02

Removing that assertion too though makes it compile and I can run the application (finally, after 1.5 days of trying hehe). Now let's just see what errors I will encounter...

user-6952ca 07 March, 2018, 21:08:39

@mpk Thanks for the info on the eyeblinks! I am noticing that the accuracy for the cursor movements is drastically less than that of the gaze position in relation to the world view (which makes sense) . I am curious as to how the calibration process affects the accuracy of surface tracking. Calibrating at different distances from the monitor has different effects on the cursor movement accuracy. My colleagues and I would like to eventually write a program that obfuscates an image on a monitor except for a small spotlight around the point of fixation in real time, so accuracy of fixation on the screen is paramount.

user-1ad73e 07 March, 2018, 22:13:03

Hey, is it possible to somehow record the Unity Scene with the Recorder when using Vive mount, similar to the front camera capture option? Would really like to do this, but couldn't find info about it.

user-1ad73e 07 March, 2018, 22:13:58

(Need to analyze focus patterns in the scene)

user-6e1816 08 March, 2018, 02:00:44

After I insert the "from object_detector_app import Object_Detection" into the world.py, the main.py can't run successfully.

Chat image

user-6e1816 08 March, 2018, 02:00:47

Chat image

user-163330 08 March, 2018, 07:53:29

Hey! I have a quick question. Does anyone know if it is possible to measure pupil size?

papr 08 March, 2018, 09:07:30

@user-3a0646 The assertions are there for a reason. There are some features that depend on the newest versions. You might need to build uvc and ndsi from source to get the newest versions.

user-06a050 08 March, 2018, 09:11:35

Hey! Has anyone tried to incorporate Leap Motion data into Capture? The plan is to record hand movement and save it along with the data which is saved by Pupil Capture. I'm having some trouble accessing the Leap Motion data per Pupil frame from the backbone.

papr 08 March, 2018, 09:14:51

@user-6952ca Surface detection is independent of gaze calibration. Mapping gaze to a surface is a simple linear transformation: https://en.wikipedia.org/wiki/Homography

The prediction error is measured in degrees of visual angle. This unit is distance independent. Absolute errors (in screen pixels) will be higher if the subject is further away than if it is closer to the screen.

I would recommend a chin rest + 2d calibration in your case. @user-41f1bf has a lot of experience with using Pupil for screen-based experiments. He might be able to comment on that.
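For illustration, applying a homography to a gaze point is just a few lines of arithmetic; a pure-Python sketch (the matrices here are made-up examples, not a real surface calibration):

```python
# Hedged sketch: map a 2d gaze point through a 3x3 homography H, as in
# the surface mapping described above. H values below are illustrative.
def apply_homography(H, point):
    x, y = point
    # homogeneous coordinates: [x', y', w'] = H @ [x, y, 1]
    xp = H[0][0] * x + H[0][1] * y + H[0][2]
    yp = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xp / w, yp / w)  # divide out w to get back to 2d

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(apply_homography(identity, (0.25, 0.75)))  # → (0.25, 0.75)
```

In practice the matrix would come from the surface tracker's marker detection, not be hand-written.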

papr 08 March, 2018, 09:21:55

@user-163330 Yes, the pupil datum contains the pupil size in pixels. If you use the 3d detection mode you can read out the pupil size in millimeters

papr 08 March, 2018, 09:22:47

@user-06a050 Could you elaborate on your setup and which data you are trying to access from where?

papr 08 March, 2018, 09:24:30

@user-1ad73e Yes, this is definitely possible. I saw @user-e04f56 working on that. Please ask in the vr-ar channel on how to do that

user-1ad73e 08 March, 2018, 09:26:29

@papr thx!

papr 08 March, 2018, 09:32:26

@user-6e1816 Where does this object detector come from? I advise against adding module imports in main.py! The launcher needs to be as clean as possible in order to launch processes correctly. I would recommend putting such code into a python file which is then placed into your plugin folder. Capture/Player will automatically try to load it. If you run python3 main.py debug it will even tell you details about what is not working.

On the other hand, I am wondering why there is no Python exception trace. This must be an underlying C library that crashes the program...

user-06a050 08 March, 2018, 09:40:19

@papr Of course: I'm publishing hand motion sensor data to the backbone, using zmq_tools.Msg_Streamer in a separate thread. It publishes on the topic "hand". Inside recent_events I want to access this data, but only data which fits the current Pupil frame time-wise.

My current way of doing this gives me all data which was ever published on the "hand" topic since starting capture, every time recent_events is called.

Currently I use this:

def recent_events(self):
  while self.hand_sub.new_data:
    t, p = self.hand_sub.recv()
    ...

self.hand_sub being a Msg_Receiver instance which is subscribed to the hand topic. The data I'm publishing on the hand topic has the Pupil timestamp.

Is that comprehensible? 😅

user-3a0646 08 March, 2018, 09:40:49

@papr Alright, well I will test it out, if I get any problems, I know where to look.

papr 08 March, 2018, 09:47:00

@user-3a0646 In the uvc case this might cause very subtle timing problems that result in unsynchronized data. This would render any recording useless. Just as a warning.

papr 08 March, 2018, 09:48:59

@user-06a050 Yes, that is helpful. I am happy to see people making use of our backbone infrastructure! Your setup looks good to me. What is the exact issue that you have with it?

user-06a050 08 March, 2018, 09:56:05

@papr The next step I want to make is to write this backbone data into the events dic, so I can use the data inside other plugins. For that to work, I need only the hand data from the backbone which fits to the current pupil frame. My current method gives me all the data which was ever published in the hand topic every time recent_events is called.

user-3a0646 08 March, 2018, 09:56:18

Alright, good to know.

user-06a050 08 March, 2018, 09:57:41

@papr A naive way of doing this would be to iterate over all that data and compare timestamps, then I could only save the relevant data in events. But this is infeasible because the list of entries in that topic grows fast, very fast.

papr 08 March, 2018, 10:02:40

@user-06a050 The received data will never be ahead in time. Therefore the last recv() in your while loop yields the most recent datum. Do you record the data within the Leap Motion application? Or do you want to store everything with Pupil? In the second case I would not discard any of the data and would put everything that you received in a single recent_events call into the events dictionary.

def recent_events(self, events):
  recent = []
  while self.hand_sub.new_data:
    t, p = self.hand_sub.recv()
    recent.append(p)
  events['hands'] = recent
papr 08 March, 2018, 10:03:52

This data will automatically be saved to the pupil_data file. Therefore you need to make sure that it is serializable with msgpack.

user-06a050 08 March, 2018, 10:08:46

@papr That is exactly what I'm doing... But as I've said: the while loop iterates over all data which was ever published since starting capture. Is there something I can do wrong with publishing to the topic?

user-3a0646 08 March, 2018, 10:10:12

@papr Any idea on how I can build these modules myself from source? Sorry, I'm a real Python noob. Maybe there is a tutorial about it?

user-06a050 08 March, 2018, 10:10:56

@papr Oh and the data from the Leap Sensor is read inside a different python thread and send to the backbone. This thread is started outside of the Pupil plugin.

papr 08 March, 2018, 10:12:20

@user-06a050 Do I understand it correctly that calling recv() twice yields the same data? Everything you publish is received exactly once. Therefore, if you receive the same data over and over again, you must be sending it over and over again.

papr 08 March, 2018, 10:13:44

@user-3a0646

git clone <repository> <repo_target_folder>
cd <repo_target_folder>
pip3 install .
user-06a050 08 March, 2018, 10:13:56

@papr Yes, the same data plus something more. Haven't checked that, will do! Thanks!

papr 08 March, 2018, 10:15:25

@user-06a050 I recommend to only publish the most recent data. If you need previous data, cache it in the plugin
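That caching can be done with a bounded deque inside the plugin; a stdlib-only sketch (the class and field names are illustrative, not Pupil api):

```python
# Hedged sketch: keep only the N most recent hand datums in the plugin
# instead of publishing history over the network. Names are made up.
from collections import deque

class HandCache:
    def __init__(self, maxlen=120):  # e.g. roughly 1 s of data at 120 Hz
        self._cache = deque(maxlen=maxlen)

    def append(self, datum):
        self._cache.append(datum)  # oldest entries fall off automatically

    def recent(self):
        return list(self._cache)

cache = HandCache(maxlen=3)
for i in range(5):
    cache.append({"frame": i})
print(cache.recent())  # only the last three datums survive
```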

user-3a0646 08 March, 2018, 10:16:22

Thanks, will test it!

user-f1eba3 08 March, 2018, 10:27:48

happy International Women's Day to y'all cool devs

user-e7f18e 08 March, 2018, 11:11:13

@wrp could you please help me for setup the pupil capture software properly?

user-e38712 08 March, 2018, 11:42:57

Hello guys, I'm new to Pupil Labs eye-tracker and I have one question - can I use 120Hz model with Xioami mi5 through mobile app?

user-3a0646 08 March, 2018, 12:07:10

@papr so I managed to install pyndsi by downloading some dependencies for it. Now I'm having trouble installing uvc. It needs DLLs from both libusb (which I managed to find) and libuvc, which I didn't find. So now I have to run cmake to generate the build script for libuvc. The problem is that I don't know the command for this, i.e. the one to include the other files, step 3 on this link https://github.com/pupil-labs/libuvc/blob/master/INSTALL_WINDOWS.md

user-feb6c2 08 March, 2018, 12:10:47

I have a problem with opening the recordings. When we try to drag the folder into the player it shows an error message that the video is not valid. Also there is a hotkey in Capture, T, which I don't see documented anywhere. Where should I use it?

papr 08 March, 2018, 12:11:42

@user-feb6c2 Could you post a screenshot of your recordings folder content?

papr 08 March, 2018, 12:12:34

@user-feb6c2 Be aware that, if you did not provide a custom session name, the recordings folder should be named something like 000 and not 2018_03_08

papr 08 March, 2018, 12:13:01

@user-3a0646 Do you mean that you do not know <pthread_dir>?

user-3a0646 08 March, 2018, 12:13:26

yes, or the include statement. basically how do I write the arguments for cmake?

user-feb6c2 08 March, 2018, 12:13:37

@papr I can't it is not on my computer but i will ask. I found the recordings in the folder but i couldn't play them

papr 08 March, 2018, 12:15:29

@user-3a0646 If I understand it correctly you are not supposed to change the cmake files manually but via Visual Studio.

papr 08 March, 2018, 12:16:15

The pthread_dir seems to refer to the folder extracted in Step 1.

user-3a0646 08 March, 2018, 12:16:34

Ah yes, but I read it as I need to use cmake first and then visual studio

user-3a0646 08 March, 2018, 12:17:16

I¨ll take a screenshot

user-3a0646 08 March, 2018, 12:17:21

wait a minute

papr 08 March, 2018, 12:18:32

Looks like you need to adapt lines 37 and 39 of the CMakeLists.txt file https://github.com/pupil-labs/libuvc/blob/master/CMakeLists.txt#L37
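If those lines are where the pthread paths are set, the adapted lines might look like the following; both paths are placeholders for wherever pthreads-win32 was extracted (a sketch, not verified against that CMakeLists.txt):

```cmake
# Hypothetical edit, assuming lines 37/39 point at the pthread paths;
# replace the paths with your own extraction directory.
set(PTHREAD_INCLUDE_DIR "C:/pthreads-win32/include")
set(PTHREAD_LIBRARY "C:/pthreads-win32/lib/x64/pthreadVC2.lib")
```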

user-3a0646 08 March, 2018, 12:19:14

aaaa that could be it

user-3a0646 08 March, 2018, 12:19:18

let me try

user-3a0646 08 March, 2018, 12:23:41

Nope

Chat image

papr 08 March, 2018, 12:25:08

Click on finish. I do not think that you are supposed to add the libs there

papr 08 March, 2018, 12:25:26

But in the table of the main view

papr 08 March, 2018, 12:26:05

Click on Add Entry and see if it opens a window that allows you to set include and lib paths.

user-3a0646 08 March, 2018, 12:27:10

I clicked finished and it failed

user-3a0646 08 March, 2018, 12:27:26

Microsoft.Cpp.Default.props was not found

papr 08 March, 2018, 12:31:26

Unfortunately I cannot help you with that issue. =/

user-3a0646 08 March, 2018, 12:32:27

A shame the author couldn't have documented it clearer 😦 Three hours wasted on this problem

user-3a0646 08 March, 2018, 12:32:31

;P

papr 08 March, 2018, 12:33:01

@user-e7f18e didn't we talk about your setup last week?

user-e7f18e 08 March, 2018, 12:33:31

yes @papr

user-3a0646 08 March, 2018, 12:33:32

Why aren't there any precompiled DLLs to find anywhere? Isn't that half the point of DLLs, to easily share code?

user-e38712 08 March, 2018, 12:33:33

Do you have any experience with Pupil Mobile android app guys?

papr 08 March, 2018, 12:34:13

@user-e38712 We have not tested this model. But it should work in theory if it has a usb-c connector

papr 08 March, 2018, 12:35:08

@user-3a0646 I have an idea. Download the newest windows bundle. It includes pre-compiled dlls. Maybe you can copy them from there.

papr 08 March, 2018, 12:35:23

@user-e7f18e What do you need help with specifically then?

user-e38712 08 March, 2018, 12:42:17

To connect Pupil to my phone I would need USB-C to USB-C, right? I already have an OTG micro-USB cable and a micro-USB to USB-C adapter, but I'm not sure if it would work 😛 Probably not, because there will be lower speed

papr 08 March, 2018, 12:42:50

@user-e38712 Correct, you would need a usb-c to usb-c cable

user-e7f18e 08 March, 2018, 12:43:15

@papr pupil capture software says my pupil is out of focus and I can not set it properly

user-e7f18e 08 March, 2018, 12:43:37

it would be great help if anybody helping me for the setup using a remote session

user-e38712 08 March, 2018, 12:44:07

@papr ok, when I'll have opportunity I'll try it with mi5 and let u know if it works

papr 08 March, 2018, 12:44:19

@user-e7f18e Please write an email to info@pupil-labs.com

user-3a0646 08 March, 2018, 12:44:37

@papr That's a smart idea

papr 08 March, 2018, 12:44:38

@user-e38712 That would be great! 🙂

user-e7f18e 08 March, 2018, 12:45:10

I already sent a mail to them as instructed by @mpk but no response yet

papr 08 March, 2018, 12:49:09

@user-e7f18e Please be patient. We are a small company and we do our best to keep up with the support requests.

user-3a0646 08 March, 2018, 12:49:21

Ok well maybe I

user-3a0646 08 March, 2018, 12:49:29

Maybe I'm being stupid here now.

user-e7f18e 08 March, 2018, 12:50:20

@papr sure no problem at all

user-3a0646 08 March, 2018, 12:51:48

I did as you said, cloned the pyuvc repo and then ran pip install . on it. Before that I configured some file paths inside the setup.py file. Now when I run the install I get an error saying "Cannot open include file: 'libuvc/libuvc_config.h', no such file or directory". And yes, if I navigate to this folder with the file explorer I have a file called libuvc_config.h.in, not sure what this is about :/ The libuvc repo is also cloned from github

user-3a0646 08 March, 2018, 12:52:19

Like here: https://github.com/pupil-labs/libuvc/tree/master/include/libuvc

user-feb6c2 08 March, 2018, 12:57:35

@papr

Chat image

user-feb6c2 08 March, 2018, 12:57:37

Chat image

user-feb6c2 08 March, 2018, 12:58:20

@papr I drop the mp4 into the player but It doesn't play it.

papr 08 March, 2018, 12:58:46

@user-feb6c2 Ah, no, you need to drop the entire folder. The one named 000

user-feb6c2 08 March, 2018, 12:59:24

ah okey thank you !

user-4d2126 08 March, 2018, 13:34:34

Hey guys, today I have to make the decision whether we are going to keep the pupil or not but I am still not sure to be honest. My boss thinks that it can be helpful but we are still struggling to produce an accurate results.

Based on the feedback I have gotten here so far, my capture procedure is as follows: the test subject sits in a chair with a mobile device (5-6 inch) held in a grip on a tripod in front of him, at a height that allows him to operate the device (we need him to be able to play the game). I am using the 2D mapping mode with the calibration being done directly on the mobile device (screen marker calibration recorded on PC and played back on mobile). So far so good, but when the subject starts playing he naturally cannot hold his head still, and any change in the angle of his view completely messes up the tracking (I tried to fix this manually in post processing but that's just way too tedious). We cannot really use any sort of headrest since that would mess with the "natural gaming environment" even more than the grip already does.

Now, I really think that the technology is neat and would love to keep it, but due to my incompetence (we are launching our new game soon and there really was not that much time to play around with this despite having a month) or due to Pupil not really being that much of a fit for our specific purpose (choose one), I have not been able to produce results based on which I could confidently recommend the purchase to my boss.

To sum this up, any last minute advice before I have to make a decision? Whatever the result I appreciate what you are doing here and wish you a lot of success!

user-feb6c2 08 March, 2018, 14:01:44

@papr I managed to play the video and export the data, but now I got .csv files and even after converting them Excel still doesn't open them; it says that there is a loading error. Do you know something about that? Thank you!

papr 08 March, 2018, 14:02:55

@user-feb6c2 Excel says that there is a loading error? Would you mind sharing the csv file with me?

user-feb6c2 08 March, 2018, 14:05:29

export_info.csv

user-feb6c2 08 March, 2018, 14:05:33

gaze_positions.csv

user-feb6c2 08 March, 2018, 14:05:35

pupil_positions.csv

user-feb6c2 08 March, 2018, 14:08:34

With the other computer I could open it, it just doesn't work on the one I made the recordings with.

user-feb6c2 08 March, 2018, 14:10:45

Can I ask all the questions that came up while studying the instructions?

papr 08 March, 2018, 14:10:48

@user-feb6c2 The csv files seem to be correct. Maybe the excel conversion messes something up, but I can not say for sure.

user-feb6c2 08 March, 2018, 14:11:20
  1. The T hot key in capture, what is it good for?
  2. When should I choose the plugins?
  3. I haven't found any information about the accuracy visualizer, hololens relay and log history plugins, while I can't find the surface heatmaps plugin and remote annotation plugin
  4. About the third party plugins and network plugins I would need some detailed explanation
  5. Can the plugins influence the eye detection and the calibration?
  6. I found the recording files but I can't play them even with Pupil Player, it says it is not a valid recording
  7. With the plugins I don't understand some details: What is a UDP transport? How can I use different devices and multiple sensors with different sampling rates? What are the video frames? How can I register a surface? Where is the marker tracking plugin? What are the timestamps? What is the practical difference between online and offline fixation detection? How can I subscribe to a topic of interest?
user-feb6c2 08 March, 2018, 14:11:46

@papr Can I ask all the questions that came up while studying the instructions?

user-feb6c2 08 March, 2018, 14:12:28

Sorry, it is a lot, but for me the instructions on the web page are not really clear 😦

papr 08 March, 2018, 14:12:57

Ok, lets tackle them one by one 🙂

papr 08 March, 2018, 14:14:52
  1. The T hotkey starts the accuracy test procedure. Just perform the same steps as in the calibration. The difference is that it starts the accuracy visualizer plugin instead of creating a new calibration.
papr 08 March, 2018, 14:16:46
  2. The plugin structure is designed so that you can enable or disable plugins based on your needs. You can load and unload all plugins in the Plugin Manager. Click the Plugin Manager icon on the right to open it. Plugins can be loaded and unloaded at all times.
papr 08 March, 2018, 14:18:40

3.1. The accuracy visualizer has two phases: 1. collection of pupil data and reference points; 2. mapping of pupil data using the current calibration and comparison of the mapping result to the reference points.

papr 08 March, 2018, 14:19:20

3.2. The hololens relay is a specialized plugin for our Hololens integration. https://github.com/pupil-labs/hmd-eyes/tree/master/unity_pupil_plugin_hololens

papr 08 March, 2018, 14:20:25

3.3 The log history just lists all past log messages in the side menu. This is just a convenience plugin if you missed a log message.

papr 08 March, 2018, 14:21:06

3.4 Surface heatmaps can be enabled in the Surface Tracker plugin. Open it via the Plugin Manager, mentioned in 2.

papr 08 March, 2018, 14:22:26

3.5. There is a Capture Annotation plugin to which you can send annotations remotely via our network api.

user-f1eba3 08 March, 2018, 14:22:32

can I ask something more C++ related than Pupil, but that still has to do with the device

user-f1eba3 08 March, 2018, 14:22:52

?

papr 08 March, 2018, 14:23:05

@user-f1eba3 Yes, but I might not be able to answer it.

papr 08 March, 2018, 14:24:51

@user-feb6c2
  4. We will do this separately.
  5. Plugins can access pupil and gaze data. The pupil detection algorithm itself cannot be influenced via plugins.

user-f1eba3 08 March, 2018, 14:26:09

So I am sort of stuck. I'm trying to write a simple app that subscribes to Pupil, because once I have that I can very quickly build a plugin for Unreal Engine. I have tried cpperoMQ and it simply breaks at every step. Now should I try to use this and cmake it on windows (yes, I'm only using windows for my current setup), or do you see some other way I can go forward with this?

user-f1eba3 08 March, 2018, 14:27:45

by this i ment

user-f1eba3 08 March, 2018, 14:27:47

https://github.com/zeromq/cppzmq

papr 08 March, 2018, 14:33:25
  6. has been solved above
  7.1. https://en.wikipedia.org/wiki/User_Datagram_Protocol
  7.2. You will have to synchronize all devices' clocks and record data for each sensor separately. Afterwards you can correlate the data by time. This is only possible if you synchronized the clocks beforehand!
  7.3. Do you mean in which format they are or how you can access them?
  7.4. A surface is registered by opening the Surface Tracker plugin, clicking the A hotkey and showing the surface to the camera
  7.5. Marker Tracker plugin = Surface Tracker plugin. See question 2
  7.6. Timestamps are created by a monotonic clock and are only useful when comparing timestamps to each other. If you want them to be meaningful (e.g. representing the current time) you will have to synchronize Pupil Capture's clock with your reference clock.
  7.7. The fixation detectors are based on the same method to detect fixations: max dispersion within a minimum duration. They differ only in the event format
  7.8. https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py#L23
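The topic subscription in 7.8 comes down to zmq's prefix filtering; a self-contained loopback sketch (assumes pyzmq is installed; the real Pupil IPC additionally wraps payloads in msgpack, which is omitted here):

```python
# Hedged sketch: zmq SUB sockets filter messages by topic prefix, which
# is what "subscribing to a topic" means here. This loopback PUB/SUB
# pair stands in for Pupil Capture's PUB port.
import time
import zmq

ctx = zmq.Context.instance()
pub = ctx.socket(zmq.PUB)
port = pub.bind_to_random_port("tcp://127.0.0.1")

sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:%d" % port)
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze")  # prefix filter, like "pupil."

time.sleep(0.2)  # give the subscription time to propagate

pub.send_multipart([b"pupil.0", b"filtered out"])  # does not match the prefix
pub.send_multipart([b"gaze", b"delivered"])        # matches, will arrive

topic, payload = sub.recv_multipart()
print(topic, payload)  # only the gaze message is received
sub.close()
pub.close()
```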
papr 08 March, 2018, 14:34:19

@user-f1eba3 I would recommend cppzmq. But I would look for precompiled dlls...

user-f1eba3 08 March, 2018, 14:38:10

that s what i'm also looking for

user-feb6c2 08 March, 2018, 14:40:49

@papr thank you so much! 7.3: In the introduction of the frame publisher they say that it broadcasts video frames, but I don't understand what kind of frames and how they can be useful?

papr 08 March, 2018, 14:43:33

It allows external applications to access the video frames of the Pupil headset. This is needed because it is not possible for multiple applications to access the same camera at the same time. This is a developer feature only.

user-feb6c2 08 March, 2018, 14:46:23

Okey thank you so much ! I think I will get back to you again because we have difficulties with the program but so far thank you again !

user-06a050 08 March, 2018, 15:42:15

@papr I'm still struggling. The problem was that I receive the same data when calling recv twice. I made sure that my code itself doesn't send the data twice and now I made sure that there is only one socket connection per thread (one pub for the thread which sends the Leap Sensor data and one sub for the Pupil plugin) just to be sure. But the problem persists, I get the same data twice when calling recv! Do you have any suggestions where to look for solutions?

papr 08 March, 2018, 15:45:15

First I would make sure that this is not due to Leap Motion sending duplicated data. Start by sending a simple counter value from one thread to the other. But another question: do you use Python's multiprocessing or threading module? If you are using Python threads, you can use a shared queue instead of zmq.

papr 08 March, 2018, 15:47:06

And if you use multiple processes, I would suggest using the Push-Pull pattern and not Pub-Sub. Pub-Sub is for one-to-many relations; Push-Pull for one-to-one
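The shared-queue alternative for the threaded case needs no sockets at all; a stdlib-only sketch (the sensor thread is a stand-in for the Leap Motion listener callbacks):

```python
# Hedged sketch: with Python threads (not processes) a queue.Queue can
# replace the zmq socket, as suggested above. Each datum is consumed
# exactly once, so no duplicates can appear on the reader side.
import queue
import threading

hand_q = queue.Queue()

def sensor_thread():
    # stands in for the Leap Motion on_frame callback
    for i in range(3):
        hand_q.put({"counter": i})

t = threading.Thread(target=sensor_thread)
t.start()
t.join()

received = []
while not hand_q.empty():  # drain, like recent_events would each frame
    received.append(hand_q.get_nowait())
print(received)  # → [{'counter': 0}, {'counter': 1}, {'counter': 2}]
```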

user-06a050 08 March, 2018, 15:51:43

Using a simple print from the function which executes ipc_pub.send() I made sure that this gets called only once per Leap datum. I don't create any threads myself; this is done by the Leap SDK (and Pupil of course). Using threading.get_ident() I checked that there are only the two described threads at play and made sure the socket connections aren't used across thread borders. The final solution should send the Leap data to the backbone, because we potentially want to use this in different applications too.

papr 08 March, 2018, 16:08:52

So data also arrives duplicated if you send a simple incrementing counter value?

user-06a050 08 March, 2018, 16:13:16

Yes

user-4d2126 08 March, 2018, 16:16:20

@papr Could you please check my post above (3 hours ago)? I have to make the decision now and would really like to know whether I can achieve the desired results (if it is technically possible given enough time and understanding of how everything works) or not.

user-06a050 08 March, 2018, 16:20:31

@papr Here is some code, maybe this helps. It's quite simple and short. https://pastebin.com/SzNRwais

papr 08 March, 2018, 16:20:36

@user-4d2126 I am afraid that Pupil might not be the correct tool to achieve your goals. I would suggest trying a remote eye tracking system that uses the phone's front camera (or any camera that is fixed to the phone itself). But I do not know what the current state of the art is for such systems' accuracies. I would love to hear back from you in case you try some. I wonder how close Pupil gets, in terms of accuracy, to such specialised tools...

user-4d2126 08 March, 2018, 16:22:23

@papr Thanks for your answer, appreciate it. Hope that in time there will be a solution for this. Definitely gonna check this channel from time to time to stay informed on the topic. Also thanks for all the help along the way! Good luck with your endeavours.

papr 08 March, 2018, 16:22:49

Good luck to you, too!

papr 08 March, 2018, 16:24:09

@user-06a050 ~~I do not see the problem. That is exactly the output I would expect.. 🤔~~ Never mind, I overlooked something

user-06a050 08 March, 2018, 16:24:09

btw this is awesome, is there a way to buy you a coffee? Or a beer?

user-06a050 08 March, 2018, 16:24:21

@papr Well that's funny 😄

user-06a050 08 March, 2018, 16:25:53

on_connect and on_frame are running in one thread and recent_events in another, if that's important

papr 08 March, 2018, 16:26:25

Where do you initialize LeapListener.counter?

user-06a050 08 March, 2018, 16:27:01

Oh sorry, that'd be inside the LeapListener class, as a class property

user-06a050 08 March, 2018, 16:27:12

With 0

papr 08 March, 2018, 16:28:06

ok, was just a detail

papr 08 March, 2018, 16:29:17

Your code looks correct to me.

user-06a050 08 March, 2018, 16:29:30

And the output?

papr 08 March, 2018, 16:29:38

The output is unexpected

user-06a050 08 March, 2018, 16:29:50

Is it normal to receive the same data twice? Because you said before that it is not.

user-06a050 08 March, 2018, 16:29:56

Ah okay

papr 08 March, 2018, 16:31:35

Could you try to run the LeapListener + leap api stuff in a separate python script?

user-06a050 08 March, 2018, 16:31:55

Will do, gimme a minute

user-06a050 08 March, 2018, 16:36:49

No change

user-06a050 08 March, 2018, 16:37:30

Well except for the fact that there is obviously no FRAME X output in the Pupil output

user-06a050 08 March, 2018, 16:40:39

Oh wait, there is something amiss

user-f1eba3 08 March, 2018, 16:40:46

has anyone here succesfully used zmq for c++ on windows?

user-06a050 08 March, 2018, 16:45:52

Okay, I did everything correct. The output is still wrong.

user-06a050 08 March, 2018, 16:48:10

Not everything, then the output wouldn't be wrong. You know what I mean.

papr 08 March, 2018, 16:53:31

Ok, lets try to simplify the test further. Please ~~create a~~ use this stand-alone sending example, that does not rely on leap motion api: https://gist.github.com/papr/94982539daa1cca1e02957523958f9c1

user-f1eba3 08 March, 2018, 16:55:51

@papr is someone around the comunity more c++ capable than me :))) ?

user-f1eba3 08 March, 2018, 16:56:13

I've really been struggling for the past 2 days to subscribe to Pupil using C++

papr 08 March, 2018, 16:57:38

@user-f1eba3 Are you struggling with C++ itself as a language, with the cppzmq lib compilation, the cppzmq api, or the general concept of how to use zmq?

user-f1eba3 08 March, 2018, 16:58:03

cppzmq lib compilation

user-f1eba3 08 March, 2018, 16:58:28

I did not find a tutorial or something to work with 0mq and cpp

user-f1eba3 08 March, 2018, 16:58:37

there are a bunch with python though

papr 08 March, 2018, 16:59:12

Did you manage to get step 1 of the cppzmq build instructions running?

user-06a050 08 March, 2018, 17:00:12

Had to change time.monotonic_ns() to time.monotonic(). The output is here: https://pastebin.com/qpuSwGMp

I removed some of the output because I was slow and the log was really long...

papr 08 March, 2018, 17:02:07

Eh, yeah, monotonic_ns() is a Python 3.7 feature. My bad. But the output is extremely unexpected.

user-06a050 08 March, 2018, 17:02:28

Especially since there is no Leap involved now

papr 08 March, 2018, 17:03:54

It probably does not play a role at all, but please execute /usr/bin/python3.5 main.py within /home/patrick/Code/pupil/pupil_src

user-f1eba3 08 March, 2018, 17:04:41

I'm running on Windows, not on Mac

papr 08 March, 2018, 17:07:15

@user-f1eba3 Are you using https://cmake.org/runningcmake/?

papr 08 March, 2018, 17:08:21

@user-f1eba3 You need to start by building libzmq: https://github.com/zeromq/libzmq/blob/master/INSTALL https://github.com/zeromq/libzmq/tree/master/builds/msvc

user-06a050 08 March, 2018, 17:17:00

@papr Tried that, still no change. Is it of any importance that I regularly have to change the base class of my plugin to make Capture actually load it? When I use Producer_Plugin_Base it sometimes just isn't loaded until I switch it to Plugin and activate it in the settings menu. After that I can switch the base class back to Producer_Plugin_Base.

papr 08 March, 2018, 17:19:26

Producer_Plugin_Base is not a base class you should be using for a custom plugin.

user-06a050 08 March, 2018, 17:19:55

Oh, that's not mentioned anywhere, I think 😄

user-06a050 08 March, 2018, 17:20:00

Will change it

papr 08 March, 2018, 17:21:42

Even though it makes sense semantically, it was used in Pupil <v1.0 to display producer plugins in a different drop-down menu. In the current version, the Plugin_Manager excludes all plugins from its UI that inherit from any of these: System_Plugin_Base, Base_Manager, Base_Source, Calibration_Plugin, Gaze_Mapping_Plugin, Producer_Plugin_Base.
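
A user plugin should therefore inherit from Plugin directly. A minimal sketch (the class name is made up; the try/except only makes the snippet readable outside of Pupil, where the plugin module is unavailable):

```python
try:
    from plugin import Plugin  # provided by Pupil's plugin loader
except ImportError:
    # Stand-in so the sketch can be read/run outside of Pupil.
    class Plugin:
        uniqueness = "by_class"

        def __init__(self, g_pool):
            self.g_pool = g_pool


class Hand_Data_Receiver(Plugin):
    """Example user plugin; because it inherits from Plugin (and not
    Producer_Plugin_Base etc.), the Plugin Manager lists it in its UI."""

    def __init__(self, g_pool):
        super().__init__(g_pool)

    def recent_events(self, events):
        # Called once per world frame; process incoming data here.
        pass
```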

user-072005 08 March, 2018, 17:24:11

Sorry to jump in in the middle of these other discussions. I'm still having some trouble with the pupil mobile app. I added a 256 GB micro SD to my phone and set it to save to the SD card. But it seems to stop recording randomly still. It should have sufficient memory for a short (~5-10 min) bike ride (I'm studying cyclists). I thought maybe it's because the phone was in the rider's pocket, but the file didn't save properly when this happened so I think it isn't a press of the record button stopping it. When the problem occurs, the info file is missing from the folder. It doesn't always cut the recording early either. What could be the cause of this?

user-06a050 08 March, 2018, 17:25:14

@papr But sadly it doesn't help with my current problem 😄 Well it's been 4 straight hours now with no advancements, I think this'll have to wait until Monday. Thank you so much for your help! Is there a way to donate? It wouldn't be much, but this/you deserves something.

papr 08 March, 2018, 17:25:36

@user-072005 This might be related to https://github.com/pupil-labs/pupil-mobile-app/issues/10 Please post a comment with your experience there

user-072005 08 March, 2018, 17:26:43

Ok I'll do that

papr 08 March, 2018, 17:26:45

@user-06a050 Referring us to your colleagues and mentioning us positively in your research publications would be more than enough 😃

user-06a050 08 March, 2018, 17:27:33

@papr Will do, thanks again! Have a nice day!

papr 08 March, 2018, 17:27:40

@user-06a050 You too!

user-2798d6 08 March, 2018, 20:21:31

Hello! I have a Capture file that when I drop it into Player, sometimes Player shuts down, sometimes it will work until I want to add a plugin (specifically the eye overlay). Do you have any suggestions for what I can do? I've gotten as far as offline pupil detection and then it usually shuts down.

user-2798d6 08 March, 2018, 20:38:44

PS-I just downloaded the latest Player version, and it still shuts down when I try to add eye overlay

user-2798d6 08 March, 2018, 20:42:40

One more question - I've found I have to delete my settings when I download the latest update, but then all of my manual calibration points disappear and I have to redo. Is there a way around that?

user-02ae76 08 March, 2018, 20:43:15

Hey guys, I keep running into an issue and I'm deciding if I need to post it on GitHub. I am having trouble with Pupil Capture only detecting my world cam unless I delete the pupil_capture_settings, and even then, the problem reappears the next time I open the application. When I do get the eye cam, I get a crazy amount of error messages that loop continually. I've added a screenshot of this. I am thinking it could possibly be a cable error; however, I haven't had an issue with this cable before, so it seems unlikely. I just lack any reliability in getting the program to work, which is very frustrating.

Chat image

user-6e1816 09 March, 2018, 01:15:09

@papr This object detector come from world.py. Now I move it to the plugin folder, there is still a crash...

Chat image

user-e1dd4e 09 March, 2018, 01:24:15

So when I go to record the marker that shows where I am looking seems to always be off by a few inches. I will look at the center of my screen and the marker will show up a few inches below that even though it seems to calibrating properly.

wrp 09 March, 2018, 02:21:11

Hi @user-02ae76 You are using a 200hz eye camera system, correct? There might be a hardware issue here, just to check are you using the cables that shipped with your Pupil headset?

wrp 09 March, 2018, 02:21:36

@user-e1dd4e please try the accuracy test after calibrating and let us know results

user-cbb918 09 March, 2018, 07:05:54

Hi @wrp and everyone. We are trying to get Pupil Capture working on a Mac with OS 10.9 (Mavericks), but it crashes at startup. We have tried the latest version of the Pupil software and the previous one. It works fine on another computer running El Capitan. I attach the crash report. Any hint about what might be happening would be great. Thanks

crash_report.pdf

wrp 09 March, 2018, 07:18:34

Thanks @user-cbb918 I will take a look today

mpk 09 March, 2018, 08:08:50

@user-cbb918 the issue is not OS version related but comes from the fact that your 10.9 machine runs an Intel CPU that's not an i3/i5/i7; the bundle only works with these CPU types. If you need to run on this machine you will need to install all dependencies and run from source. I hope this is helpful!

mpk 09 March, 2018, 08:11:41

@user-2798d6 we need a bit more info. Are you using Pupil Mobile? If so, and you are using 200hz cameras, note that there was a bug that broke recordings. We have since fixed this.

user-2798d6 09 March, 2018, 15:07:28

@mpk I am not using pupil mobile

user-072005 09 March, 2018, 16:46:53

Hi, I'm having trouble with Pupil Mobile changing the camera settings during the recording. I'm using it outside. I will set it to have the frame rate be dynamically altered, but during a short recording (under 10 min), it will change back to the default, and then my recording is useless as it is all washed out. Is there something I can do to address this?

user-8779ef 09 March, 2018, 18:34:19

Hey guys, what module do you use to read and display frames from the world.mp4 without having to convert with ffmpeg?

user-8779ef 09 March, 2018, 18:35:07

...and what module should I look at as an example?

mpk 09 March, 2018, 19:10:52

@user-8779ef we use pyav. Check out their examples on github: https://github.com/mikeboers/PyAV/blob/master/examples/save_frames.py

mpk 09 March, 2018, 19:12:15

@AprilG#6958 I recommend raising an issue with our Android devs here: https://github.com/pupil-labs/pupil-mobile-app we can address this quickly and push a release.

mpk 09 March, 2018, 19:13:25

@user-2798d6 in this case I will need the logfile or terminal output after the crash. This is an unknown bug.

user-8779ef 09 March, 2018, 19:28:42

@mpk You use it for loading and displaying the images as well?

mpk 09 March, 2018, 19:30:10

It's part of our source, yes. Have a look at their other examples or our source code.

user-072005 09 March, 2018, 19:39:17

Has anyone else had trouble with pupil mobile changing the camera settings mid-recording?

user-8779ef 09 March, 2018, 20:29:10

Thanks mpk

user-cbb918 10 March, 2018, 09:33:38

Many thanks @mpk, we will try to get it compiled from source

user-2798d6 10 March, 2018, 17:41:26

@mpk - I'm not sure how to send you the log file or terminal output. Do you want me to send the Capture file?

user-072005 11 March, 2018, 17:38:56

I'm studying cyclists and using Pupil Mobile, and I'm having a problem with the recording shutting off randomly. I've tested a bunch of ideas for the cause while not on a bike (such as it bouncing around in the pocket or having connection issues with the cord), but it has never shut off prematurely. I am now wondering if the IMU in the app is having trouble with the cyclists' acceleration. Is this possible? Is it possible to disable the IMU recording?

user-072005 11 March, 2018, 17:58:04

Nevermind, it wasn't that. It's Wi-Fi cutting in and out, does the app actually need wifi?

papr 11 March, 2018, 20:47:47

@user-072005 I will try to replicate this. Thank you for debugging the issue. The wifi is only used for streaming but should not terminate a running recording.

papr 11 March, 2018, 20:49:09

This is actually easy to reproduce.

user-072005 12 March, 2018, 00:37:54

Glad I was helpful this time and not just asking a million questions.

user-2798d6 12 March, 2018, 03:42:31

I accidentally clicked on "Clear Natural Features" in the offline calibration in Player...is there anyway to undo this?

papr 12 March, 2018, 08:08:07

@user-2798d6 No, unfortunately not. =/ Maybe this button should be positioned at the very bottom of the offline calibration menu...

user-06a050 12 March, 2018, 09:37:25

Hey! I'm still stuck with the problem from last week: I've got one Python script which sends data into the backbone and another one that's supposed to receive it. The receiving end (recent_events in my Pupil plugin) somehow gets every message that was ever sent for that topic on every call. I expect it to only receive messages it hasn't received before. @papr we talked about this a few days ago. I have no freaking idea where to look for solutions or what to try next. Here's some code: https://pastebin.com/ASAHpEW9 Don't be bothered by the Leap naming of things; this is because I want to include Leap Motion sensor data in Pupil, but as you can see in the code I'm not using actual Leap data, because I'm currently trying to fix this problem. Any help would be greatly appreciated!

user-c14158 12 March, 2018, 10:46:22

hello, I have an issue with Pupil Player (latest version). When I load a recording, only eye1 seems to work; eye0 confidence is at 0 for the entire recording, and if I use the "eye video overlay" plugin the image is frozen.

user-c14158 12 March, 2018, 10:47:07

If I try to do an offline pupil detection, both eye videos are running, but after that only eye1 is working

user-c14158 12 March, 2018, 10:48:09

during the recording with Capture (also latest version), both eyes seemed to work (at least the confidence threshold was varying over time for both eyes)

user-c14158 12 March, 2018, 10:48:17

I'm using Windows 10

papr 12 March, 2018, 10:49:03

Hey @user-c14158 would you mind sharing a screenshot of your recording folder that includes file sizes? How did you record the videos? Pupil Capture or Pupil Mobile?

user-c14158 12 March, 2018, 10:51:26

Chat image

user-c14158 12 March, 2018, 10:51:28

The videos were recorded with Pupil Capture; I can play both eye videos with VLC

papr 12 March, 2018, 10:53:47

Ok, that would have been my next question. 🙂 Could you upload the recording and share it with pp@pupil-labs.com ? I would like to replicate the issue locally.

user-c14158 12 March, 2018, 10:59:02

sure, does a zip of the folder work for you?

papr 12 March, 2018, 10:59:10

Yes, it does

user-c14158 12 March, 2018, 11:04:42

done, thx for your help

papr 12 March, 2018, 12:48:12

@user-06a050 Could you try a different topic name than hand?

papr 12 March, 2018, 12:59:09

@user-06a050 Funny enough, we can reproduce your issue if we publish/subscribe with hand but not with something different, e.g. han1...

user-06a050 12 March, 2018, 13:18:32

@papr That's interesting... Will try it later and let you know if it works! Thanks!

user-1d894f 12 March, 2018, 14:14:18

Hi everyone! Got a problem with Intel RealSense + MacBook Pro. After starting Pupil Capture, everything goes OK for about 30 sec, after that video from Intel crashes and only re-plug and restarting app can return video, again for around 30 sec. It's ok on Windows, but on Mac - crash. What can I do to fix that?

papr 12 March, 2018, 14:19:40

How old is your Macbook @user-1d894f ?

user-1d894f 12 March, 2018, 14:22:36

A1706, pretty new, 2017 I guess (got it only this month)

papr 12 March, 2018, 14:23:08

Alright, does that mean that you use a usb-c to usb-c cable to connect the headset to the macbook?

user-1d894f 12 March, 2018, 14:24:45

I tried both options: a USB-C to USB-C cable, and USB-C to USB 3.0 over a hub.

papr 12 March, 2018, 14:25:37

Could you share the capture/log file with us after such a crash happened?

user-1d894f 12 March, 2018, 14:30:28

I guess here it is

user-1d894f 12 March, 2018, 14:30:29

2018-03-12 17:25:58,880 - MainProcess - [INFO] os_utils: Disabled idle sleep.
2018-03-12 17:26:02,561 - world - [ERROR] video_capture.realsense_backend: Camera failed to initialize. No cameras connected.
2018-03-12 17:26:02,895 - world - [WARNING] launchables.world: Process started.
2018-03-12 17:26:04,654 - eye1 - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
2018-03-12 17:26:04,654 - eye1 - [INFO] camera_models: No user calibration found for camera Ghost capture at resolution [320, 240]
2018-03-12 17:26:04,654 - eye1 - [INFO] camera_models: No pre-recorded calibration available
2018-03-12 17:26:04,654 - eye1 - [WARNING] camera_models: Loading dummy calibration
2018-03-12 17:26:04,813 - eye1 - [WARNING] launchables.eye: Process started.
2018-03-12 17:26:06,866 - eye1 - [INFO] launchables.eye: Process shutting down.
2018-03-12 17:26:07,871 - MainProcess - [INFO] os_utils: Re-enabled idle sleep.

papr 12 March, 2018, 14:31:03

Which Capture version do you use?

user-1d894f 12 March, 2018, 14:31:40

1.5.12

papr 12 March, 2018, 14:32:12

Ok. We will try to reproduce this issue.

user-1d894f 12 March, 2018, 14:32:29

Thx

user-f1eba3 12 March, 2018, 14:32:37

hey

user-f1eba3 12 March, 2018, 14:33:10

I wanted to ask something but while thinking of the question I answered myself 😄

user-f1eba3 12 March, 2018, 14:33:14

Love those moments

user-f1eba3 12 March, 2018, 14:49:39

Could somebody help me write a zmq request from a third party library

user-f1eba3 12 March, 2018, 14:49:41

?

user-02ae76 12 March, 2018, 14:57:30

@wrp sorry for the delay, I am using the 200Hz eye cam system. We actually were not using the supplied cables because I operate on a MacBook Pro which only has USB-C ports. We used an AmazonBasics USB-C to USB-C cable. I was worried it might be the video transfer capacity of the cable. We have had more luck using the iMac charging cable (also C to C), so I may check out what specs it has to determine what could be going wrong.

user-e02f58 12 March, 2018, 15:24:42

@papr You sent me a plugin reference that uses the Python serial library before. Is it possible to modify the plugin without installing the Pupil dependencies? Sorry for the noob question, but I am new to Python and also not used to developing large programs that include a bunch of (20+) libraries.

user-e02f58 12 March, 2018, 15:27:05

@papr Actually, I made an external electronics device for studying a subject's eye movement (when focusing on an arriving object). It is a ball screw rail with a visual stimulus marker that gets closer and closer to the subject's eyes. My goal is to use serial communication to synchronize Pupil to my device, so that I will have the timestamp and travel distance of the visual stimulus. I have just been reading about ZMQ these days, and had a hard time installing the dependencies on Win10 and Mac, not quite successfully. So I am looking for another solution that does not require building from source.

papr 12 March, 2018, 15:28:23

@user-c14158 I found the issue with your recording. For some reason there is a time offset in your eye1 data. The video is not broken. Can you give more details on how you made the recording? Capture version, bundle/source, OS, etc?

papr 12 March, 2018, 15:30:00

Never mind, the info.csv contains all I need. Do all your recordings have this issue or only this one in particular?

papr 12 March, 2018, 15:32:28

@user-e02f58 You can use the serial module when running from source. But you will have to install the module folder within the plugins directory. Your custom plugin should be there as well. E.g. on mac

~/pupil_capture_settings/plugins/pyserial/
~/pupil_capture_settings/plugins/my_custom_plugin.py
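
A rough sketch of such a plugin under the layout above. The port name, baud rate, and one-ASCII-float-per-line wire format are assumptions about the custom hardware, not anything Pupil defines (the try/except fallbacks only let the sketch be read outside of Pupil):

```python
try:
    import serial  # pyserial, installed into pupil_capture_settings/plugins/
except ImportError:
    serial = None

try:
    from plugin import Plugin  # available inside Pupil
except ImportError:
    class Plugin:  # stand-in so the sketch runs outside of Pupil
        def __init__(self, g_pool):
            self.g_pool = g_pool


def parse_distance(line: bytes) -> float:
    """Parse one distance reading (assumed format: one ASCII float per line)."""
    return float(line.strip())


class Serial_Stimulus_Sync(Plugin):
    """Hypothetical plugin: read stimulus-distance lines from the rail
    controller and pair them with Pupil timestamps."""

    def __init__(self, g_pool, port="/dev/ttyACM0", baudrate=115200):
        super().__init__(g_pool)
        self.conn = serial.Serial(port, baudrate, timeout=0)

    def recent_events(self, events):
        line = self.conn.readline()  # non-blocking thanks to timeout=0
        if line:
            distance = parse_distance(line)
            # g_pool.get_timestamp() is Pupil's synchronized clock.
            print(self.g_pool.get_timestamp(), distance)

    def cleanup(self):
        self.conn.close()
```
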
user-c14158 12 March, 2018, 16:01:28

We had this issue with successive recordings, but the issue disappeared after restarting Pupil Capture

papr 12 March, 2018, 16:04:14

If you say successive recordings, do you mean that the issue did not appear in the first few recordings but then in the later ones? Can you estimate after how many recordings that happened?

user-c14158 12 March, 2018, 16:14:19

We had this issue in one session involving multiple recordings (this issue was present from the first recording of the session), however when restarting pupil capture and in different recording sessions (for example on a different day), the problem did not appear.

papr 12 March, 2018, 16:22:58

Mmh. Please tell us in case this happens again.

user-c14158 12 March, 2018, 16:28:22

Sure, can you think of a way for me to retrieve the data from the undetected eye ?

papr 12 March, 2018, 16:32:25

Yes. You fit a linear function to both eye0 and eye1 timestamps, and subtract the difference in the functions' intercepts. This will align the eye timestamps but is not a 100% correct restoration.

papr 12 March, 2018, 16:32:34
import numpy as np
from scipy.stats import linregress

wts = np.load('world_timestamps.npy')
ets0 = np.load('eye0_timestamps.npy')
ets1 = np.load('eye1_timestamps.npy')

def fit(ts):
    return linregress(np.arange(ts.shape[0]), ts)

result = list(map(fit, [wts, ets0, ets1]))
intercepts = [r.intercept for r in result]
slopes = [r.slope for r in result]
print('timestamp offsets')
print(f'\teye0 - wts: {intercepts[1] - intercepts[0]}')
print(f'\teye1 - wts: {intercepts[2] - intercepts[0]}')
print('timestamp divergence')
print(f'\teye1 - eye0: {slopes[2] - slopes[1]:f}')
user-c14158 12 March, 2018, 16:33:05

ok i will try that, thx a lot

papr 12 March, 2018, 16:33:38

The above script yields these results for your recording:

timestamp offsets
    eye0 - wts: 286.6918953607383
    eye1 - wts: 0.298578210244159
timestamp divergence
    eye1 - eye0: 0.000064
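
To actually repair such a recording, the estimated intercept offset can be subtracted from the affected eye's timestamps. A sketch (the offset value comes from the script's output and is specific to each recording):

```python
import numpy as np


def align_timestamps(ts, offset):
    """Shift eye timestamps by the intercept offset estimated above."""
    return np.asarray(ts) - offset


# Usage on the affected recording (back up the file first):
# ets0 = np.load("eye0_timestamps.npy")
# np.save("eye0_timestamps.npy", align_timestamps(ets0, 286.6918953607383))
```
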
user-02ae76 12 March, 2018, 18:07:17

Question: Are there any recommended transfer rate specs for usb-c cables (for attaching headset to laptop)? I believe I may not have one that is powerful enough. (Amazon basics USB-C to USB-C). Recommendations for tried and true cables are also appreciated. Trying to avoid using an A to C adapter for my laptop port if possible.

user-02ae76 12 March, 2018, 18:13:04

Trying to decide if I would be better off using the original USB-A to C cable with an adapter/extender or whether it would be overall better to just use a long (10ft) USB-C to C cable.

user-2798d6 12 March, 2018, 18:37:01

I've noticed that when I download a new version of the software, I have to delete the user settings, but then all of my calibration points are gone. Is there a way to keep the work I've done on a file in a previous version of the software when I download and use the new version?

user-072005 12 March, 2018, 18:52:05

Has anyone used the glasses outside with the pupil mobile app? My calibration has been turning out terribly and I was wondering if anyone had some tips.

user-1bcd3e 13 March, 2018, 07:58:14

Hi guys, just a short question: I'm about to choose between buying a new MacBook Pro 2017 and a MacBook Air. Did you find any particular problem with these laptops? For example, any problems with the USB-C ports of the new MacBook Pro? Any suggestion is very welcome 😃

mpk 13 March, 2018, 08:14:18

@user-1bcd3e USB-C on the MacBook is fine. You will just need a USB-C to USB-C cable.

user-1bcd3e 13 March, 2018, 08:15:44

@mpk ok thanks a lot!

papr 13 March, 2018, 08:16:13

@user-1bcd3e See my link above if you need a recommendation for such a cable

user-1bcd3e 13 March, 2018, 08:17:33

@papr Yes I got it ... very helpful... thanks! 😃

mpk 13 March, 2018, 11:41:12

@user-2798d6 when you update, we have to reset the settings. It is very hard to test that old sets of settings remain valid as the code changes.

user-e38712 13 March, 2018, 13:08:03

hello guys, remember when I was talking about testing Pupil Mobile with the Xiaomi Mi5?

user-e38712 13 March, 2018, 13:09:26

all cameras seem to work fine, but the right eye camera is inverted 180 degrees

user-e38712 13 March, 2018, 13:09:32

left is fine

papr 13 March, 2018, 13:09:51

This is due to the physical orientation of the right eye camera

user-e38712 13 March, 2018, 13:10:37

is it possible to calibrate using the Pupil Mobile app?

user-e38712 13 March, 2018, 13:11:08

I also tried to record something, but I can't find its location

user-e38712 13 March, 2018, 13:13:32

ok, found it in Videos

papr 13 March, 2018, 13:53:18

@user-e38712 You are not able to calibrate on the phone. You could stream the video over wifi to a computer running Pupil Capture and calibrate there. Or you can do offline calibration in Player.

user-072005 13 March, 2018, 14:36:09

When I use the glasses outside, the camera gets washed out because of overexposure. Is there a setting other than the autoexposure priority I can adjust to help this?

mpk 13 March, 2018, 14:36:43

@user-072005 what camera? Eye or world?

user-072005 13 March, 2018, 14:37:13

World, both a bit but I think the world is the bigger problem

user-072005 13 March, 2018, 14:37:30

But when I disconnect from wifi, it won't let me change the autoexposure priority

papr 13 March, 2018, 14:38:25

Are you sure that is due to the loss of wifi and not due to the recording running?

user-072005 13 March, 2018, 14:38:45

Yes, I checked it while I wasn't recording as well

user-072005 13 March, 2018, 14:39:17

Actually, I can't seem to change anything after auto-exposure mode

user-072005 13 March, 2018, 15:07:42

Were you able to replicate it? Perhaps it is my phone? I'm using samsung galaxy s8 because the school already owned it.

user-cdabd8 13 March, 2018, 15:10:44

Would an absolutely primitive method like wearing sunglasses over the cams work?

papr 13 March, 2018, 15:10:59

@user-072005 Could you list the exact steps to reproduce the issue? I will have a try on my One Plus 3

user-072005 13 March, 2018, 15:16:15

Ok, start not connected to the wifi. I have the glasses facing outside so it's bright and I can see if the setting changes immediately. I opened the Pupil Cam1 ID2 and tapped the upper left button for settings, then tried to change the auto-exposure priority to "the frame rate may be dynamically varied by the device". When I click it, the setting looks changed, but the camera doesn't actually change, and if I reopen the settings tab, it's back at "the frame rate must remain constant".

user-072005 13 March, 2018, 15:16:24

And I was never recording during this

user-072005 13 March, 2018, 15:18:28

@user-cdabd8 I'll try that out if I need to

user-072005 13 March, 2018, 15:33:28

Ah I see it was assigned on GitHub. Thanks for the help guys @mpk and @papr I'm leaving to conduct the study on Friday, so hopefully you won't hear from me quite as often soon

user-2da779 13 March, 2018, 15:44:35

Hi, I am using the 1.3.9 version of Pupil Player, and I'm trying to do offline calibration. In the earlier version I used to be able to see the frame number in the bottom left of my screen. However, in this version I can't find it, which is making it hard for me to enter the calibration range.

user-2da779 13 March, 2018, 15:46:56

Also, I have been recording the videos using 3D detection and mapping mode; however, when I run these videos in Pupil Player and select calibration mode 3D, the calibration status shows "failed".

papr 13 March, 2018, 15:48:54

@user-2da779 Please upgrade to the newest version. You can set calibration/mapping ranges easily from the trim marks.

user-2da779 13 March, 2018, 15:49:11

thank you

papr 13 March, 2018, 15:50:33

@user-2da779 In regard to your second issue: Did you record the calibration procedure? If yes, did you run the circle marker detection? If no, did you define natural features and change the mode for the section?

user-2da779 13 March, 2018, 15:54:39

I haven't clicked on the circle marker detection. I will try that, thank you!

papr 13 March, 2018, 15:56:59

Just to elaborate on what the issue was: Without the circle marker detection there are no reference points that the calibration can use. Therefore it fails. I guess the user feedback could be improved in this regard.

user-2da779 13 March, 2018, 16:05:16

Ok, so I clicked on circle marker detection and selected recalibrate. It shows failed calibration. However, when I change the calibration mode to 2D, the mapping is completed.

papr 13 March, 2018, 16:10:47

Did you use offline pupil detection as well?

user-2da779 13 March, 2018, 16:15:23

yep, and the detection mode for that was 3d too

papr 13 March, 2018, 16:15:46

Mmh, then it should have worked.

user-e3f7ca 13 March, 2018, 21:58:58

how do you run calibration on pupil htc vive addon?

user-e3f7ca 13 March, 2018, 22:01:09

When I press C in Pupil Capture, it says "Calibration requires world capture video input"

user-d9bb5a 14 March, 2018, 09:47:30

MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
Running PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
Running PupilDrvInst.exe --vid 1443 --pid 37425
OPT: VID number 1443
OPT: PID number 37425
Running PupilDrvInst.exe --vid 1443 --pid 37426
OPT: VID number 1443
OPT: PID number 37426
Running PupilDrvInst.exe --vid 1133 --pid 2115
OPT: VID number 1133
OPT: PID number 2115
Running PupilDrvInst.exe --vid 6127 --pid 18447
OPT: VID number 6127
OPT: PID number 18447
Running PupilDrvInst.exe --vid 3141 --pid 25771
OPT: VID number 3141
OPT: PID number 25771
world - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
world - [WARNING] camera_models: Loading dummy calibration
world - [WARNING] launchables.world: Process started.
Running PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
Running PupilDrvInst.exe --vid 1443 --pid 37425
OPT: VID number 1443
OPT: PID number 37425
Running PupilDrvInst.exe --vid 1443 --pid 37426
OPT: VID number 1443
OPT: PID number 37426
Running PupilDrvInst.exe --vid 1133 --pid 2115
OPT: VID number 1133
OPT: PID number 2115
Running PupilDrvInst.exe --vid 6127 --pid 18447
OPT: VID number 6127
OPT: PID number 18447
Running PupilDrvInst.exe --vid 3141 --pid 25771
OPT: VID number 3141
OPT: PID number 25771

mpk 14 March, 2018, 09:50:17

@user-d9bb5a you have a realsense world camera. Make sure that this backend is enabled.

user-d9bb5a 14 March, 2018, 09:55:07

yes))I deleted the settings and everything started

wrp 14 March, 2018, 10:07:54

@user-b23813 please put the size of the screen in pixels for the surface size

user-b23813 14 March, 2018, 10:09:02

@wrp so it should be, for example, 1920x1080 for both surfaces?

wrp 14 March, 2018, 10:09:35

1920, 1080 for the full screen surface, and if the other surface is smaller, make it proportionally smaller

wrp 14 March, 2018, 10:10:08

the size that you specify for the surface is used for exporting a png of the surface - so it can be proportional (does not need to be exact)

wrp 14 March, 2018, 10:10:26

the size is also used to determine the number of bins used in creating the heatmap histogram

user-b23813 14 March, 2018, 10:12:08

@wrp thanks I ll try that

wrp 14 March, 2018, 10:12:51

You're welcome @user-b23813

user-072005 14 March, 2018, 14:22:49

I'm looking at the offline calibration section of the user docs. If I want to use natural feature selection to calibrate, would I still also use the marker detection first, then add natural features? So I would use both, not just natural feature selection?

wrp 14 March, 2018, 14:24:01

@user-072005 you do not need to use markers at all if you want to use natural features

wrp 14 March, 2018, 14:24:31

You can also define multiple calibration sections if you like

wrp 14 March, 2018, 14:25:12

but ranges can not overlap (e.g. you can't apply two calibration methods onto the same range)

user-072005 14 March, 2018, 14:25:59

ok thanks. So, if I want to define a section in player, I would click the "set from trim marks" for the calibration? Nothing seems to happen when I do this

papr 14 March, 2018, 14:31:35

@user-072005 That's because the default trim mark positions are aligned with the default calibration section ranges. Move the trim marks around and click the Set from trim marks button again

user-072005 14 March, 2018, 14:35:44

so, a little embarrassed to ask this, but what do the trim marks look like? I can't find something that will move

wrp 14 March, 2018, 14:36:08

@user-072005 at the end of the seek bar in the bottom of the Pupil Player window

user-072005 14 March, 2018, 14:36:32

Oh I see it, thanks

wrp 14 March, 2018, 14:37:35

Chat image

wrp 14 March, 2018, 14:37:46

For others who might wonder about "trim marks"

wrp 14 March, 2018, 14:39:00

the green box around the seekbar determines the trim marks. By default the trim section includes the entire duration of the recording.

papr 14 March, 2018, 14:39:59

The trim marks also define which section of the recording will be exported.

user-072005 14 March, 2018, 14:42:40

This is a bit of a theoretical question, but do you know if the size of the calibration marker has much impact on the calibration? I printed it out on a regular letter sized sheet of paper and it wasn't picked up well at the distance I was calibrating at. I was wondering if I print a bigger one, is it as if I am calibrating closer than I really am? And I'm generally trying to decide how big I'll print it.

mpk 14 March, 2018, 14:46:46

The recommended diameter of the printed marker is 94mm

user-072005 14 March, 2018, 14:47:09

ok I'll stick to that

mpk 14 March, 2018, 14:47:24

recommended distance is 1-2.5 m

user-072005 14 March, 2018, 14:48:04

Would that be true even if the person wearing the eye tracker is expected to be looking at objects further away than that?

mpk 14 March, 2018, 14:48:30

Then no. Try something like 3 m, and see if the marker is still tracked.

user-072005 14 March, 2018, 14:49:06

ok, and then maybe scale the marker according to the additional distance if it isn't?

mpk 14 March, 2018, 14:49:37

only if the marker is not detected.

user-072005 14 March, 2018, 14:49:55

ok I'll experiment with that today

user-072005 14 March, 2018, 14:50:04

Thanks

user-072005 14 March, 2018, 15:50:00

@mpk 94 mm is the diameter of the outermost white ring?

user-02ae76 14 March, 2018, 15:54:16

@papr thanks for the cord recommendations! Unfortunately, I believe I have isolated the issue as being a software problem. I tested using both of my headsets with their in-box cables on my iMac and got strange results: both showed the world view no problem, however the eye camera shows up as just a black screen with nothing getting picked up (almost as if the light isn't on). I then retested on my MacBook Pro using an Apple USB-C to C cable and got both images for both headsets. I did try several fixes on the iMac (deleting settings, selecting input sources, etc.) but cannot get an eye image on my iMac. I used to have no issues using this machine. I think this confirms at least that my headsets are functional; however, I'm considering posting on GitHub since I've had such issues with consistency.

user-02ae76 14 March, 2018, 15:54:45

Any ideas why my eye image might show up as a black screen on one machine and show up fine on another?

user-2798d6 14 March, 2018, 16:30:41

Hi @mpk - I sent you the file of the recording that keeps shutting down when I try to work with it in Player. You also mentioned a log file and terminal output - will you get that from me sending the recording file or is that something separate?

mpk 14 March, 2018, 16:33:09

@arispawgld#8014 make sure to use the latest version of Pupil Capture. The issue you are outlining sounds like you are using an older version of capture.

user-02ae76 14 March, 2018, 16:39:07

@mpk I'll check this out, I believe I do have the latest version but I'll update if installing the new version doesn't help.

user-ecbbea 14 March, 2018, 17:30:08

Hey, where do I download the little QR codes so I can bound my pupil positions to within a monitor?

user-f1eba3 14 March, 2018, 17:35:17

has anyone built the zmq library for Windows?

user-ecbbea 14 March, 2018, 17:35:23

thank you @papr

papr 14 March, 2018, 17:35:41

@user-f1eba3 Are you still struggling with that? 😕

user-f1eba3 14 March, 2018, 17:35:55

yup

user-f1eba3 14 March, 2018, 17:36:10

I found a way to run my code in Unreal (I guess)

user-f1eba3 14 March, 2018, 17:36:23

I just need to compile the god damn library

user-ecbbea 14 March, 2018, 17:45:58

are you locked into using unreal? I've had success using Unity

user-dc2842 14 March, 2018, 17:47:08

Hi! Could I ask what is the progress of saccade detection plugin and do we know more/less release day?

papr 14 March, 2018, 17:49:15

@user-dc2842 I am sorry to report that there has not been any progress with the saccade detection plugin yet. My colleague and I have been busy with implementing audio playback for Player.

user-dc2842 14 March, 2018, 17:52:34

@papr ok, understood. I found a paper with a robust algorithm for saccade detection (doi:10.1109/EMBC.2014.6944931), hope maybe it will help you somehow in the future : )

user-f1eba3 14 March, 2018, 18:20:13

@user-ecbbea yes. I want to integrate Pupil in a bigger system that uses multiple robots and everything is developed under C++/Unreal

user-d72566 15 March, 2018, 09:26:07

Hi, has anyone here been successful in getting a normal webcam working with this software? A friend of mine managed to do it on an older version on Mac and I'm trying to port his changes to the up-to-date Windows version, which has changed a lot. His "hack" was based on a modified backend called non_uvc_backend.py. I've managed to add a new category "Non-UVC-Source" to the GUI and I start a new thread where I want the stream to work from. However, it doesn't seem to work. https://gist.github.com/Baxtex/f2aacef643c50157e6c1fa167f9cd2cd Does anyone have experience with doing this? I need it to try a few things. The modified backend is based on fake_backend.py

papr 15 March, 2018, 09:33:12

@user-d72566 Ideally you would not have to open a new thread. You would grab the image in the main thread. This avoids code complexity.

papr 15 March, 2018, 09:36:14

For the backend you need two plugin classes: a Source and a Manager. The Manager lists all available cameras that your Source class can use to grab a video from. After selecting a camera, a single Source object is instantiated and is responsible for grabbing the frame from the camera during the recent_events() call and adding it to the events dict

papr 15 March, 2018, 09:42:25

Also, this is not the correct way anymore to display a frame: https://gist.github.com/Baxtex/f2aacef643c50157e6c1fa167f9cd2cd#file-non_uvc_backend-py-L46

As mentioned above, you need to provide a Frame object that is placed into the recent_events events dict with the key 'frame'.

Let me know if you have any further questions. I am happy to see that people use the possibility to write their own backends 😃
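A minimal sketch of the Source pattern described above - recent_events() grabs one frame and adds it to the events dict. The class and camera names here are illustrative stand-ins, not Pupil's actual base classes:

```python
import time


class Frame:
    """Minimal stand-in for Pupil's frame object."""
    def __init__(self, img, timestamp, index):
        self.img = img            # height x width x 3 BGR uint8 array
        self.timestamp = timestamp
        self.index = index


class CustomSource:
    """Hypothetical Source: grabs one frame per recent_events() call."""
    def __init__(self, camera):
        self.camera = camera      # any object with a read() -> (ok, img) method
        self.frame_count = 0

    def recent_events(self, events):
        # Grab exactly one frame and expose it under the 'frame' key.
        ok, img = self.camera.read()
        if ok:
            events['frame'] = Frame(img, time.time(), self.frame_count)
            self.frame_count += 1


# Quick demonstration with a stand-in camera:
class _DemoCamera:
    def read(self):
        return True, 'demo image'


events = {}
src = CustomSource(_DemoCamera())
src.recent_events(events)
```

The Manager side would then only be responsible for listing cameras and instantiating one such Source.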

user-d72566 15 March, 2018, 10:08:40

Hmm, interesting, thanks. I will try a few things, will report back here later today! 😃

user-d72566 15 March, 2018, 10:14:12

I believe I have done this, more or less. Let me post my full backend file and an image of how it looks.

user-d72566 15 March, 2018, 10:18:26

Here is the backend: https://gist.github.com/Baxtex/f6e553de2d56dcf61dd45809e9ac1670 This is how it looks: https://imgur.com/a/TJUSg When I press any of the buttons, it will start a new thread but then nothing happens. Gonna experiment with it a bit. PS: I have included None_UVC_Source, None_UVC_Manager in the __init__.py file

papr 15 March, 2018, 10:39:32

@user-d72566 None_UVC_Source does not have a recent_events() method

user-d72566 15 March, 2018, 10:48:18

Aha I see

user-41bb50 15 March, 2018, 12:42:31

is this what is used for an IR pass filter? https://i.imgur.com/EzhA01w.jpg

user-d72566 15 March, 2018, 13:11:25

@papr I've now tried to implement this method with the help of uvc and fake backend but I'm probably doing it wrong because nothing happens. 😄 However, I tried just copying over the activate method from fake_backend, and it actually started. I'm talking about these lines: https://gist.github.com/Baxtex/8c0fdec6d894bfbb4b319fc6226ae08d Not sure if this is something I can use? Sorry for these questions, I don't know Python very well but I've learned a lot just trying to get this working. 😄

papr 15 March, 2018, 13:14:28

@user-d72566 Yes, this is something you should use, but you need to adapt the notification: {'subject': 'start_plugin', 'name': 'Non_UVC_Source', 'args': settings} and settings such that the keys match the keyword arguments of Non_UVC_Source
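The point is that the 'args' dict gets unpacked into the plugin constructor, so its keys must match the constructor's keyword arguments exactly. A toy illustration (the class body and settings values here are made up):

```python
# Hypothetical plugin class; the keyword argument names are what matter.
class Non_UVC_Source:
    def __init__(self, g_pool=None, name='webcam', fps=30):
        self.name = name
        self.fps = fps


# Keys of settings must match the __init__ keyword arguments above.
settings = {'name': 'my webcam', 'fps': 30}
notification = {'subject': 'start_plugin',
                'name': 'Non_UVC_Source',
                'args': settings}

# This mirrors what Capture does when it receives the notification:
source = Non_UVC_Source(g_pool=None, **notification['args'])
```

A key in 'args' that does not correspond to a keyword argument would raise a TypeError at instantiation.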

user-d72566 15 March, 2018, 14:14:01

Alright, I did what you said and have been able to deal with the errors one at a time until I got to recent_events. I have no idea what source I should use there. I'm guessing I need the frame? So I could do like in the fake_backend: frame = self.get_frame() ?

papr 15 March, 2018, 14:14:50

Sounds correct. Whatever function generates your Frame object

mpk 15 March, 2018, 14:15:03

@user-41bb50 for the DIY headset we recommend Agfa Precisa 100

papr 15 March, 2018, 14:15:21

@user-d72566 e.g.: events['frame'] = self.get_frame()

user-d72566 15 March, 2018, 14:18:52

Hmm, alright, seems like my img isn't set properly now, gonna investigate...

papr 15 March, 2018, 14:20:11

The software expects the img property to return a height x width x depth BGR uint8 array
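For reference, an img array of that shape could be built with numpy (assuming numpy is available; the solid-blue content here is arbitrary):

```python
import numpy as np

height, width = 480, 640
# height x width x depth, BGR channel order, uint8 values:
img = np.zeros((height, width, 3), dtype=np.uint8)
img[:, :, 0] = 255  # channel 0 is blue in BGR order
```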

user-d72566 15 March, 2018, 14:21:05

Isn't that what I do with the make_img function?

user-d72566 15 March, 2018, 14:21:22

Or, maybe trying to do 😄

papr 15 March, 2018, 14:26:09

Could you provide a current version of your code?

user-d72566 15 March, 2018, 14:26:38

Sure, I just cleaned it up a little, had a lot of junk.

papr 15 March, 2018, 14:30:18

Ah, ok, the reason why line 30 fails is the following: You defined a read-only property called img in line 40. In line 30 you try to set it which fails. Change line 30 to self._img = img
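The pitfall in miniature (a toy class, not Pupil's actual Capture): a property defined without a setter cannot be assigned to, but its backing attribute can.

```python
class Capture:
    def __init__(self, img=None):
        self._img = img

    @property
    def img(self):
        # Read-only: there is no corresponding setter.
        return self._img


cap = Capture()
try:
    cap.img = 'new frame'    # fails: the property has no setter
except AttributeError:
    pass                     # "can't set attribute", as seen above
cap._img = 'new frame'       # works: assign to the backing attribute
```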

user-d72566 15 March, 2018, 14:34:31

Oh, I understand! Then that is fixed, thanks! Now I don't have any errors anymore, which is unfortunate, as it is still not working. However the background now turned black instead of grey, so something is happening at least. 😄

papr 15 March, 2018, 14:43:30

@user-d72566 Ah, a subtle but fatal issue: You overwrote gl_display(). The base class contains the code to draw the image. You have to call super().gl_display() in line 84 instead of pass
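A toy illustration of why a pass-only override silently disables the base behaviour, and how delegating with super() restores it:

```python
class Base:
    def gl_display(self):
        # Stands in for the base class's actual drawing code.
        return 'image drawn'


class Broken(Base):
    def gl_display(self):
        pass  # swallows the base behaviour: nothing is drawn


class Fixed(Base):
    def gl_display(self):
        return super().gl_display()  # delegate back to the base class
```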

user-d72566 15 March, 2018, 14:46:20

Hmm, I did what you said. It fixed some of the rendering issues, but it is still not functioning, just a white background I'm afraid.

papr 15 March, 2018, 14:52:21

_recent_frame is used to draw the frame

user-41bb50 15 March, 2018, 15:03:04

is this what is used for an IR pass filter? https://i.imgur.com/EzhA01w.jpg

wrp 15 March, 2018, 15:03:56

@user-41bb50 please see @mpk response above 👆

user-41bb50 15 March, 2018, 15:25:35

sorry for being blind, and also for double posting

user-41bb50 15 March, 2018, 15:27:15

don't films exposed to ambient light/sunlight get damaged over time?

papr 15 March, 2018, 15:46:26

@user-41bb50 You need to get the film developed unexposed. Afterwards it should work for many years.

user-41bb50 15 March, 2018, 15:47:56

so developed, or unexposed?

papr 15 March, 2018, 15:48:47

Sorry for being unclear. You need to get it developed without exposing it to sunlight beforehand.

user-41bb50 15 March, 2018, 15:49:02

thanks

user-d72566 15 March, 2018, 16:26:23

I'm back! I just tried your modified version (great many thanks btw!) however, the import on line 12 doesn't work, I don't have anything called manager_classes

papr 15 March, 2018, 16:28:06

@user-d72566 Ah, yes, I had to add this import because I used your backend as a runtime plugin instead of modifying the source. Just delete lines 12 and 277

user-d72566 15 March, 2018, 16:28:39

Ah, I will try!

user-d72566 15 March, 2018, 16:35:00

Hmm, the results seem to be equal to that of the fake_backend, minus some of the counters. So I got like a green/blue image.

papr 15 March, 2018, 16:35:16

Yes, this is expected, isn't it?

user-d72566 15 March, 2018, 16:36:41

Ah, not really, my goal is to try and use my regular webcam as a source, sorry if I was unclear 😛

papr 15 March, 2018, 16:37:36

ok, but that is the next step. Your uploaded version is expected to do exactly that: Show a static image with a gradient.

user-d72566 15 March, 2018, 16:38:36

Ah

papr 15 March, 2018, 16:39:11

To be exact, get_frame now needs to call capture.read() and use its output instead of self._img

papr 15 March, 2018, 16:40:15

You won't need another thread for that. Important though: only call capture.read() once per recent_events/get_frame call. You need to lose the while loop that you used in your background thread
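The one-read-per-call pattern might look roughly like this; FakeCapture stands in for a cv2.VideoCapture-style camera, whose read() likewise returns an (ok, image) pair:

```python
class FakeCapture:
    """Stand-in for cv2.VideoCapture: read() returns (ok, image)."""
    def __init__(self):
        self.frames = ['frame0', 'frame1', 'frame2']

    def read(self):
        return (True, self.frames.pop(0)) if self.frames else (False, None)


def recent_events(capture, events):
    # One read() per call, no while loop: the main loop drives the pacing.
    ok, img = capture.read()
    if ok:
        events['frame'] = img


cap = FakeCapture()
events = {}
recent_events(cap, events)   # grabs exactly one frame
```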

user-d72566 15 March, 2018, 16:40:36

Hmm ok, so it's a little like what I had from the original mod then.

user-d72566 15 March, 2018, 16:40:43

Alright, I will try it!

user-d72566 15 March, 2018, 16:49:22

Hmm, maybe this is working, I now have a black screen with artifacts at the top. I think it could be because of the different parameters inside get_frame. Will experiment a little 😃 If I cover the camera, the artifacts disappear. If I uncover it, the artifacts come back, so something is definitely working. 😄

user-d72566 15 March, 2018, 17:18:48

But you are sure that it shouldn't be on a separate thread? Because the UI and the whole application turn really slow when I choose the camera

papr 15 March, 2018, 17:28:17

That depends on how many fps the camera is running at. If you need to start a thread, then it should be started and stopped from within Non_UVC_Source

user-d72566 15 March, 2018, 17:30:32

It's only 30 fps, but I have set that I think. Yeah I don't know really, just that everything is really slow, as if the main thread gets blocked

papr 15 March, 2018, 17:33:50

Yeah, that is probably the case.

papr 15 March, 2018, 17:34:54

You should add a small sleep(1/60) to your recent_events method if you move the capture.read() loop into a background thread. Else your main loop will run at 100% CPU usage

user-d72566 15 March, 2018, 17:38:50

I could try. Btw, this is how it looks: https://imgur.com/a/xdvkO

user-d72566 15 March, 2018, 18:25:19

I've now tried starting a separate thread to save the image that the recent function then fetches. The GUI is much more responsive now, but the background is just grey, no artifacts at all: https://gist.github.com/Baxtex/4b726038cfae389609ca206bfaaa6c20 so it is sadly not working as expected.

papr 15 March, 2018, 18:54:12

Which Webcam are you using?

user-d72566 15 March, 2018, 19:22:00

Just my laptop's built-in, a really generic cheap China thing. However I'm gonna try a better external one tomorrow (USB 2).

user-d84237 16 March, 2018, 02:20:34

Are there any ways to improve accuracy after calibration? The three visualized gaze positions after calibration often appear in completely different positions. Also, even though calibration seems to be successful, the gaze positions are slightly different from where exactly a user is looking. Any suggestions?

user-d72566 16 March, 2018, 08:19:59

Turns out, I had confused some function names 😃 Now however, when the thread saves the image to self.g_pool.capture.img, I get an error: AttributeError: can't set attribute. I then tried using just self.img, but that didn't work either, though it didn't throw any error. Hmm. This is the current state of my code: https://gist.github.com/Baxtex/153fc8f42ca6ce1bd44d14a6d798e174

user-b116a6 16 March, 2018, 09:02:30

Hey guys, when trying to detect eye 1 from Capture I get the error: "Eye 1: init failed. Capture is started in ghost mode. No images will be supplied." and the screen of the detect eye 1 window is grey and no image is shown. Detect eye 0 works fine and the world camera also works fine. I'm using macOS 10.13.3 and Pupil Capture v1.5-12. Can anyone help?

papr 16 March, 2018, 09:05:00

@user-b116a6 Has eye1 been working before?

user-b116a6 16 March, 2018, 09:05:51

@papr Yes

papr 16 March, 2018, 09:09:20

@user-d72566 This is the problem of the read-only property in line 100 again. Have a look at this small tutorial that explains how they work: https://www.programiz.com/python-programming/property

To solve your problem you either have to assign the read image to capture._img or remove the property definition.

papr 16 March, 2018, 09:11:56

@user-b116a6 Please write an email to info@pupil-labs.com including your order number, Discord username, a short description of the problem, and a list of things that you tried to get it working again.

user-d72566 16 March, 2018, 09:36:05

Ah, I have some experience with getters and setters (I'm a C# and Java dev), I guess I'm just too used to getting informative compiler errors. :D I did what you said, put in the underscore and now the camera finally works! Many many many thanks to you for helping me out with this!

papr 16 March, 2018, 09:42:45

Great to hear that it is working! Just a thing I stumbled upon: https://gist.github.com/Baxtex/153fc8f42ca6ce1bd44d14a6d798e174#file-none_uvc_backend-py-L112

papr 16 March, 2018, 09:43:59

@user-d72566 max(1, 0/60) will always return 1. Sleeping that long will make your ui unresponsive.

user-d72566 16 March, 2018, 09:44:52

Ah, yes I noticed, so I use the original delay instead:

user-d72566 16 March, 2018, 09:45:19

max(0,1./self.fps - spent)
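That pacing formula in context (a sketch; the fps value is illustrative): sleep only for whatever is left of the frame interval after the work done this iteration, and never for a negative duration.

```python
import time

fps = 30
frame_interval = 1.0 / fps   # ~33 ms per frame at 30 fps

start = time.monotonic()
# ... grab and process one frame here ...
spent = time.monotonic() - start

# Sleep the remainder of the frame slot; max() clamps at zero
# in case processing took longer than one frame interval.
remaining = max(0.0, frame_interval - spent)
time.sleep(remaining)
```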

user-b116a6 16 March, 2018, 09:52:20

@papr I don't seem to have the order number; I am a student and this is part of my thesis project. Is it necessary to include the order number or can I include it in a later email correspondence? It is time-sensitive that I find a way to continue and hopefully a replacement is not needed.

papr 16 March, 2018, 09:55:12

@user-b116a6 I understand. Let's try to do some debugging then. Disconnecting/reconnecting the camera does not work?

papr 16 March, 2018, 09:55:39

What headset do you use? 120/200Hz?

user-b116a6 16 March, 2018, 09:57:32

@papr 120Hz, binocular, I tried disconnecting/reconnecting, reverting to a previous version of the Pupil software, switching to Windows 10, rebooting several times, changing USB ports, tried killing zombie processes but not found any

papr 16 March, 2018, 10:01:26

@user-b116a6 can you check if this connector is correctly connected?

Chat image

papr 16 March, 2018, 10:04:02

Or if there are any visible defects with the cables?

user-b116a6 16 March, 2018, 10:05:13

@papr the cables seem to be connected properly for both cameras

papr 16 March, 2018, 10:06:45

@user-b116a6 Is Pupil Cam1 ID1 listed in the selector of the UVC Manager menu?

user-b116a6 16 March, 2018, 10:07:51

@papr No it is not listed, only Pupil Cam1 ID2 and Pupil Cam1 ID0 are listed

papr 16 March, 2018, 10:10:22

@user-b116a6 Please remove the connectors of both eye cameras carefully, switch the cameras and try again.

papr 16 March, 2018, 10:10:51

Which cameras are listed now?

user-b116a6 16 March, 2018, 10:17:27

@papr Thank you very much, this was the only thing that I didn't try. The connector of eye 1 was a bit loose. Everything works fine now.

papr 16 March, 2018, 10:18:05

Ok, happy to help. Good luck with your thesis!

user-d72566 16 March, 2018, 12:03:10

@papr Sorry to bother you again but I haven't managed to find anything about this. When I try to calibrate (I've tried all the different methods), the whole application crashes when the calibration is finished and the full screen mode closes:

File "C:\work\pupil\pupil_src\shared_modules\calibration_routines\calibrate.py", line 351, in preprocess_3d_data
ref_vector = g_pool.capture.intrinsics.undistortPoints(ref_vector)
AttributeError: 'NoneType' object has no attribute 'undistortPoints'

I haven't touched these files at all, is there something I have to change? 😃

papr 16 March, 2018, 12:46:14

@user-d72566 Your Non_UVC_Source class needs an intrinsics attribute.

papr 16 March, 2018, 12:49:36

See the camera_models.py file for the different intrinsics classes. I guess you should start with a Dummy_Camera instance and run the Camera Intrinsics Estimation procedure afterwards in order to calculate the correct intrinsics for your camera. Are you using your webcam as world or eye camera?

user-d72566 16 March, 2018, 12:50:33

Hmm ok, right now I'm using my homebuilt headset, both world and eye are webcams.

papr 16 March, 2018, 12:57:24

Ok, that is fine. The eye cameras always use a Dummy_Camera intrinsics instance

user-d72566 16 March, 2018, 12:59:05

Ok! I looked at the fake_backend.py file, I just tried writing "self._intrinsics = Dummy_Camera(size, self.name)" in my make_img function; it just crashes though, it says it is not implemented.

user-b116a6 16 March, 2018, 13:00:13

Sorry to bother again, the calibration process does not start: instead of the usual markers showing on a white screen, only the white screen is shown. I tried deleting the folder ~/capture_settings and restarting the application but no luck.

papr 16 March, 2018, 13:01:15

@user-d72566 Could you provide the traceback please

user-d72566 16 March, 2018, 13:01:30

sure, just a sec

papr 16 March, 2018, 13:02:18

@user-b116a6 Please use the Restart with defaults button in the general settings

user-b116a6 16 March, 2018, 13:03:20

@papr Yes I tried that also. Does it matter that I leave the detect eye windows open?

papr 16 March, 2018, 13:12:22

If you close the world window, the eye windows should close automatically as well

user-d72566 16 March, 2018, 13:18:54

@papr its the name property that is causing trouble, trying to fix it

user-b116a6 16 March, 2018, 13:20:06

@papr Yes I tried restarting the application, restarting with default settings, deleting the ~/capture_settings folder but no luck in all cases.

papr 16 March, 2018, 13:21:01

@user-b116a6 Which Mac and which version of macOS do you use? Could you disable the full screen option and try again?

user-b116a6 16 March, 2018, 13:22:36

Yes I read somewhere that it may be caused by the Retina display. I have the MacBook Pro 15" Retina running High Sierra 10.13.3. Let me try that and get back to you.

user-d72566 16 March, 2018, 13:24:39

Ok, so I managed to fix the name property. So now when I select the camera, the application crashes, no error message or anything. If I comment out the "self._intrinsics = Dummy_Camera(size, self.name)" it works again, just like before. (but it crashes at calibration of course)

papr 16 March, 2018, 13:30:10

@user-d72566 Did you check the log file? Maybe there is more information about the crash in it

user-d72566 16 March, 2018, 13:30:53

I will

user-d72566 16 March, 2018, 13:32:18

you mean the capture.log ?

user-d72566 16 March, 2018, 13:33:35

what thee... "AttributeError: 'None_UVC_Source' object has no attribute '_name'"

user-b116a6 16 March, 2018, 13:34:14

@papr Yes, disabling the fullscreen did it.

papr 16 March, 2018, 13:34:44

@user-b116a6 You are using v1.5 correct?

user-b116a6 16 March, 2018, 13:34:56

Yes, 1.5-12

user-d72566 16 March, 2018, 13:35:21

@papr isn't this correct? https://gist.github.com/Baxtex/fb43065787c1ee4e11bec8f9f59f942f

user-d72566 16 March, 2018, 13:37:05

Or am I just being stupid about the properties again? 😄

papr 16 March, 2018, 13:37:28

My guess is that you need to define self._name before calling make_img(), since make_img accesses self.name

user-3565f9 16 March, 2018, 13:37:33

What would be some good specs for a computer collecting the eyetracking data?

papr 16 March, 2018, 13:38:30

@user-3565f9 Intel Core i5, 8GB Ram, and lots of storage for the videos

papr 16 March, 2018, 13:39:07

You could do with less but these are the recommended specs 🙂

user-d72566 16 March, 2018, 13:45:06

Yup, that was it! Again, thanks!

user-88dd92 16 March, 2018, 14:02:04

has anyone recorded sounds using the Pupil Capture?

papr 16 March, 2018, 14:02:40

@user-88dd92 Yes 🙂 I did

user-88dd92 16 March, 2018, 14:03:20

how did you input the audio to pupil? simple microphone or something a bit more fancy?

papr 16 March, 2018, 14:03:50

On Mac I used the built-in microphone. On Linux, a USB webcam mic

user-88dd92 16 March, 2018, 14:05:33

Cool, thank you very much! I'm working on Linux and have had some trouble with playing the audio files back. I'll go get a proper mic (i.e. not one made from a pair of earphones). Thanks papr

papr 16 March, 2018, 14:06:07

@user-88dd92 Do you mean playback in e.g VLC?

papr 16 March, 2018, 14:06:19

Or in Pupil Player?

user-88dd92 16 March, 2018, 14:08:26

either one... basically even when recording with the audio plugin and exporting using the video exporter (Pupil Player), the MP4 files have no associated audio... as such I assume no audio was recorded.

user-88dd92 16 March, 2018, 14:09:00

I will give it a go and if I'm still stumped I will come back and once again ask the experts!

user-88dd92 16 March, 2018, 14:09:01

Cheers

user-88dd92 16 March, 2018, 14:31:10

@papr What version of Pupil Capture are you using? Cheers

user-88dd92 16 March, 2018, 14:31:11

V

papr 16 March, 2018, 14:32:06

I am running the current master from source

user-88dd92 16 March, 2018, 14:32:07

1.5 seen in message on top. Please disregard!

user-88dd92 16 March, 2018, 14:32:16

thanks

user-c14158 16 March, 2018, 14:51:07

hello, what is the meaning of the green circle in the eye image?

user-c14158 16 March, 2018, 14:54:58

also I have some issues with the settings of the camera; I cannot set the exposure for the eye camera to auto (for example) or change the exposure time.

user-c14158 16 March, 2018, 14:55:34

I also have the same problem with the R200 camera where I cannot change a lot of variables

wrp 16 March, 2018, 14:55:43

Hi @user-c14158 the green circle is a visualization of the eyeball from the 3d model

wrp 16 March, 2018, 14:56:07

@user-c14158 what OS and version of Pupil are you using (running from src or from app bundle)?

user-c14158 16 March, 2018, 14:57:16

Win 10, version 1.5.12 of Pupil (running from the app bundle)

user-c14158 16 March, 2018, 14:59:53

regarding the visualization of the eyeball, I assume it is not supposed to vary a lot during the recording (it does in my case). Does that mean that the model sensitivity value is wrong?

user-b116a6 16 March, 2018, 15:50:46

Hello again, I'm trying to convert from PUPIL EPOCH to UNIX EPOCH and I found here a possible solution: https://groups.google.com/forum/#!msg/pupil-discuss/rnarge_J7Go/pOtATxQSAgAJ However, when executing the Python files I get these errors

Chat image

user-b116a6 16 March, 2018, 15:53:30

Firstly, uvc could not be found so I changed it to pyuvc. I don't know if that can work, so if anyone succeeded in changing the timestamp in any other way I'd appreciate the help.

papr 16 March, 2018, 16:09:00

@user-b116a6 You will need to pip3 install git+https://github.com/pupil-labs/pyuvc and change the source code back to from uvc import ...

user-b116a6 16 March, 2018, 16:23:30

@papr Now this error is shown when executing this command

Chat image

papr 16 March, 2018, 16:38:30

Would you mind sharing the complete output, please?

user-b116a6 16 March, 2018, 16:41:22

@papr

Chat image

papr 16 March, 2018, 17:41:21

@user-b116a6 You will need our version of libuvc as well https://docs.pupil-labs.com/#install-libuvc

papr 16 March, 2018, 17:41:33

Generally, try to use the dependencies in our docs

user-b116a6 16 March, 2018, 17:45:27

@papr Thank you, I followed those instructions, however I don't remember if it was before or after I formatted my Mac. Let me try that again and I will let you know. Thanks again.

user-b116a6 16 March, 2018, 20:03:27

@papr I followed those steps and I got an error here: pip3 install git+https://github.com/pupil-labs/pyndsi Everything else worked, however when I tried to run pupil_time_sync_master.py I got a "ModuleNotFoundError : No module named uvc"

user-d84237 17 March, 2018, 06:14:06

Are there any ways to improve accuracy after calibration? The three visualized gaze positions after calibration often appear in completely different positions. Also, even though calibration seems to be successful, the gaze positions are slightly different from where exactly a user is looking. Any suggestions?

user-b116a6 17 March, 2018, 13:11:19

Hey guys, I want to use Pupil eye tracking for the real time analysis of the data recorded. Starting I want to e.g. receive and manipulate the fixations while the eye tracking recording session is in progress. Is it possible and can anyone guide me in the proper direction? Thank you.

papr 17 March, 2018, 13:15:08

@user-d84237 A slight estimation error of ~1 degree (depending on mapping method) is expected.

papr 17 March, 2018, 13:15:59

@user-b116a6 Do you know about the Pupil Helpers repository?

papr 17 March, 2018, 13:16:21

I suggest to have a look at the example scripts there.

user-6e1816 19 March, 2018, 07:41:23

There are gaze_point_3d_x, gaze_point_3d_y and gaze_point_3d_z in the gaze_positions.csv; I would like to know which one represents the gaze distance?

papr 19 March, 2018, 07:42:11

The z value. The unit is millimeters.

user-6e1816 19 March, 2018, 07:47:32

https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/calibration_routines/gaze_mappers.py#L247 and the 500 is the default value? If I use the screen_marker_calibration method, does this mean the distance between my eye and the screen is 500mm?

papr 19 March, 2018, 07:48:40

Correct. This distance is needed to initialize the bundle adjustment optimization.

user-6e1816 19 March, 2018, 07:52:00

So if the distance between my eye and the screen is 1m, I just change the 500 to 1000?

papr 19 March, 2018, 07:54:08

That is correct 🙂

user-6e1816 19 March, 2018, 07:55:32

OK, thanks!

user-78dc8f 19 March, 2018, 15:21:29

Hi folks. We are following up on an issue posted here back in January. We have a calibration phase, then we stopped the video, and then a recording phase. We are trying to integrate analysis across these phases - to basically apply the calibration to the recording phase. Step 1 was to merge the videos. We finally figured out how to do that. But to analyze the data in Pupil Player, we need to merge the 'other' files in the Pupil Capture recording folder, yes? Can someone provide some guidance on exactly what Pupil Player needs?

papr 19 March, 2018, 15:22:48

@user-78dc8f Do I remember correctly that you recorded on Pupil Mobile?

user-78dc8f 19 March, 2018, 15:23:09

@papr yes

papr 19 March, 2018, 15:27:08

Ok, then you will need to concatenate the timestamp files only. These are numpy arrays. Use numpy.load(), numpy.concatenate(), and numpy.save() to do so.

papr 19 March, 2018, 15:27:47

Make sure that you do not lose any frames when concatenating the videos though!

user-78dc8f 19 March, 2018, 15:27:57

@papr ok...so these are the .time files?

user-78dc8f 19 March, 2018, 15:28:22

@papr or the .npy files?

papr 19 March, 2018, 15:28:56

the npy files. If they do not exist, open the recording in Player once and Player will generate them for you

papr 19 March, 2018, 15:30:15

Only merge timestamps of the same cameras. Do not merge eye0 and eye1 timestamps for example.
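The load/concatenate/save recipe, sketched end to end. Throwaway temp files stand in here for the real per-recording world_timestamps.npy files; adapt the paths to your recording folders, and merge only timestamps of the same camera:

```python
import os
import tempfile

import numpy as np

tmp = tempfile.mkdtemp()
# Stand-ins for the timestamp files of two recordings of the SAME camera:
np.save(os.path.join(tmp, 'world_ts_part1.npy'), np.array([0.0, 0.1, 0.2]))
np.save(os.path.join(tmp, 'world_ts_part2.npy'), np.array([0.3, 0.4]))

# Load both arrays, concatenate in recording order, save the result.
part1 = np.load(os.path.join(tmp, 'world_ts_part1.npy'))
part2 = np.load(os.path.join(tmp, 'world_ts_part2.npy'))
merged = np.concatenate((part1, part2))
np.save(os.path.join(tmp, 'world_timestamps.npy'), merged)
```

The merged array must stay frame-aligned with the merged video, so the concatenation order has to match the order in which the videos were joined.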

user-78dc8f 19 March, 2018, 15:30:39

@papr ok (and, yes, merging across cameras would be daft 😉)

user-78dc8f 19 March, 2018, 15:31:29

@papr then we should put the merged .mjpeg files and merged .npy files into a folder and, in theory, pupil player should be happy?

papr 19 March, 2018, 15:31:47

Correct!

user-78dc8f 19 March, 2018, 15:31:49

@papr (at least, it should open...)

user-78dc8f 19 March, 2018, 15:32:08

@papr thanks. We'll explore (back in a month....ha ha)

user-b116a6 19 March, 2018, 15:45:00

@papr Following up on the message I sent on Saturday: I managed to write a script to communicate with the IPC backbone, and I subscribed to receive metrics with topic 'fixation' and am writing them to a JSON file. Is it possible now to use the methods called from Player that generate the fixation_report.csv and fixations.csv so I can generate them myself without using the Player app? I am asking because I need to calculate the fixations in real time, not offline through the Player app. Thank you in advance.

papr 19 March, 2018, 16:09:19

@user-b116a6 Do I understand it correctly that you want to write online fixations to a csv file?

papr 19 March, 2018, 16:10:31

This is the Pupil Player code that exports the fixations to csv: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/fixation_detector.py#L436-L471

user-b116a6 19 March, 2018, 16:17:54

@papr Yes, I have seen this file. Currently I am set to record 70 frames/sec and I write to a file only the metrics with topic fixation. Can I e.g. call a method from fixation_detector.py that will calculate the fixations based on the recorded metrics and create files similar to those created by Player, or do I have to implement my own (e.g. I-VT) algorithm to calculate them in real time?

papr 19 March, 2018, 16:21:46

@user-b116a6 Aah, I think I understand your issue now. You want to aggregate the fixations as Player does, correct?

user-c14158 19 March, 2018, 16:22:12

Hi @wrp I don't know if you saw my reply last friday (I forgot to quote you) and I was wondering if you thought of something regarding my issue

papr 19 March, 2018, 16:22:52

@user-b116a6 Since Capture sends out fixation events as soon as they fulfill the minimum fixation definition.

papr 19 March, 2018, 16:25:23

@user-b116a6 The online fixations have an id field. Fixation events that belong to the same fixation have the same id. You can accumulate fixation events until their id changes and calculate metrics, e.g. total duration, based on the collected fixation events
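One possible way to aggregate such a stream of events by id (the field names other than id are illustrative; check the actual fixation message schema):

```python
from itertools import groupby

# Simulated stream of online fixation messages: consecutive events with
# the same id belong to the same fixation.
events = [
    {'id': 0, 'timestamp': 10.00, 'duration': 120.0},
    {'id': 0, 'timestamp': 10.05, 'duration': 170.0},
    {'id': 1, 'timestamp': 11.00, 'duration': 150.0},
]

fixations = []
for fid, group in groupby(events, key=lambda e: e['id']):
    group = list(group)
    # Assuming duration accumulates, the last event of a group carries
    # the fixation's total duration.
    fixations.append({'id': fid,
                      'start': group[0]['timestamp'],
                      'duration': group[-1]['duration']})
```

In a live subscriber the same grouping can be done incrementally: buffer events until the incoming id differs, then emit one aggregated fixation.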

user-b116a6 19 March, 2018, 16:47:28

@papr Yes that is exactly what I need to do, thank you, I will try that and let you know.

user-c828f5 19 March, 2018, 23:41:37

Hey guys, has anyone experienced a very good pupil track but a very bad post-calibration gaze point? I'm detecting pupils in offline mode and the detection works beautifully. However, after Natural Features calibration, the gaze point appears all messed up. Gaze from Recording (I performed the same calibration live while recording) is much better than offline calibration.

papr 20 March, 2018, 08:34:30

@user-c828f5 I can have a look if you share the recording with me. Which Pupil version do you use?

user-78dc8f 20 March, 2018, 09:17:36

@papr Good morning. Working on what we discussed yesterday regarding merging calibration and test phase videos. I got ffmpeg and Python installed and working. Managed to merge the videos and the numpy timestamps file, but I'm getting some errors when I try to play back in Pupil Player. As a test, I copied 4 files to a folder: eye0_timestamps.npy, eye0.mjpeg, world_timestamps.npy, world.mp4. These are the original calibration files. Pupil Player won't open these. I must be missing some other file?

papr 20 March, 2018, 09:18:52

Ah, you will need the info.csv file as well. But you can take it from any of the original recordings. No need to merge them.

user-78dc8f 20 March, 2018, 09:20:04

@papr copied that file over. Now when I drop that into pupil player, player crashes...

user-78dc8f 20 March, 2018, 09:21:34

@papr just re-verified that the original opens with no problems...

papr 20 March, 2018, 09:22:11

Can you share the player.log file with us?

user-78dc8f 20 March, 2018, 09:22:26

@papr sure. how do i do that?

papr 20 March, 2018, 09:23:30

Just drag and drop the file into discord

user-78dc8f 20 March, 2018, 09:24:15

@papr
2018-03-20 09:20:51,961 - MainProcess - [INFO] os_utils: Disabled idle sleep.
2018-03-20 09:20:53,229 - player - [ERROR] player_methods: No valid dir supplied
2018-03-20 09:20:57,445 - player - [INFO] launchables.player: Starting new session with '/Users/nfb15zpu/Documents/J-Files/Dyadic/TestCalibration'
2018-03-20 09:20:57,446 - player - [INFO] player_methods: Updating meta info
2018-03-20 09:20:57,447 - player - [INFO] player_methods: Checking for world-less recording
2018-03-20 09:20:58,440 - player - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend
2018-03-20 09:20:59,022 - player - [INFO] launchables.player: Application Version: 1.5.12
2018-03-20 09:20:59,022 - player - [INFO] launchables.player: System Info: User: nfb15zpu, Platform: Darwin, Machine: C02PX3J2FVH8.local, Release: 17.4.0, Version: Darwin Kernel Version 17.4.0: Sun Dec 17 09:19:54 PST 2017; root:xnu-4570.41.2~1/RELEASE_X86_64
2018-03-20 09:20:59,045 - player - [INFO] camera_models: No user calibration found for camera world at resolution (1280, 720)
2018-03-20 09:20:59,046 - player - [INFO] camera_models: No pre-recorded calibration available
2018-03-20 09:20:59,046 - player - [WARNING] camera_models: Loading dummy calibration
2018-03-20 09:20:59,173 - player - [ERROR] launchables.player: Process Player crashed with trace:
Traceback (most recent call last):
  File "launchables/player.py", line 245, in player
  File "shared_modules/file_methods.py", line 55, in load_object
FileNotFoundError: [Errno 2] No such file or directory: '/Users/nfb15zpu/Documents/J-Files/Dyadic/TestCalibration/pupil_data'

2018-03-20 09:20:59,173 - player - [INFO] launchables.player: Process shutting down.
2018-03-20 09:21:01,176 - MainProcess - [INFO] os_utils: Re-enabled idle sleep.

papr 20 March, 2018, 09:25:44

Ah ok, you will need a pupil_data file as well. Again, take it from any of the original recordings. Player generates empty ones when converting Pupil Mobile recordings.

user-78dc8f 20 March, 2018, 09:27:58

@papr Bingo! Now original plays and my merged files play. I'll hand this off to my grad student from here so she can test if we can apply the calibration to the test phase now...will report back soon.

user-78dc8f 20 March, 2018, 09:29:00

@papr one question: you mentioned the possibility of dropping video frames. Should I check that, or can I just assume that if the video plays in pupil player that my timestamps and video frames match up?

papr 20 March, 2018, 09:34:18

Mmh, I think it would be best to ensure that, especially if you merge more than two videos. See this link on how to count the video frames: https://stackoverflow.com/questions/2017843/fetch-frame-count-with-ffmpeg

I suggest using the "Decode and count the number of frames" method. Timestamps can be counted by loading the merged numpy file and checking the shape of the loaded array.
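As a sketch, both counts could be automated like this (file names are placeholders taken from this thread; ffprobe must be on the PATH):

```python
import subprocess

import numpy as np


def count_timestamps(npy_path):
    """Number of entries in a Pupil *_timestamps.npy file."""
    return np.load(npy_path).shape[0]


def count_video_frames(video_path):
    """Decode the video and count its frames with ffprobe
    (the slow-but-exact method from the linked answer)."""
    out = subprocess.check_output([
        "ffprobe", "-v", "error", "-count_frames",
        "-select_streams", "v:0",
        "-show_entries", "stream=nb_read_frames",
        "-of", "default=nokey=1:noprint_wrappers=1",
        video_path,
    ])
    return int(out.strip())

# e.g. check: count_video_frames("world.mp4") == count_timestamps("world_timestamps.npy")
```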

user-78dc8f 20 March, 2018, 09:36:36

@papr thanks

user-e38712 20 March, 2018, 10:50:33

hello guys

user-e38712 20 March, 2018, 10:51:24

is there documentation somewhere of the data format for the Offline Surface Tracker, like this one for the other csv files? https://github.com/pupil-labs/pupil-docs/blob/master/user-docs/data-format.md

user-e38712 20 March, 2018, 10:53:01

I mean these files

user-e38712 20 March, 2018, 10:53:04

Chat image

user-b23813 20 March, 2018, 11:19:16

Hi, I am using the Pupil Labs with 2 eye-cameras (2-D) and I noticed that the pupil diameter in the export file differs between the eyes. Is this normal? Which value of the two should I use as the pupil diameter? Maybe the means of the two eyes? Thank you.

papr 20 March, 2018, 11:21:55

@user-b23813 The 2d Pupil diameter is measured in pixels. Therefore it is very expected that the values differ between the eyes since it depends on the eye to camera distance.

user-b23813 20 March, 2018, 11:25:10

Thanks @papr.

user-b116a6 20 March, 2018, 13:12:18

@papr Following up on yesterday's topic about replicating the fixations.csv file in real time: I am trying to test my code with the pupil_data file before moving to a real-time analysis test, but I seem to be off in the duration of the fixations. Also, searching for dict['method'] == 'gaze' doesn't seem to return anything; it only returns the pupil method, although the method field in the csv file states gaze. This data exists in the pupil_data file, right?

papr 20 March, 2018, 13:20:06

@user-b116a6 The method is selected based on the pupil detection method. The pupil method uses the pupil 3d model vectors to calculate fixations. If no 3d model data is available, the gaze method unprojects 2d gaze using the camera's intrinsics and uses the resulting 3d gaze vectors for fixations.

papr 20 March, 2018, 13:20:29

In general, I recommend using the pupil method

user-b116a6 20 March, 2018, 14:05:34

@papr Alright, I will use pupil then, however I don't seem to understand how the start_timestamp and duration is calculated. Initially I set it to receive the timestamp of the first frame as the start_timestamp and to calculate the duration I subtracted that from the timestamp of the last frame with the same id but that seems to be wrong now that I am checking my output with the actual csv file.

papr 20 March, 2018, 14:14:33

The offline fixation detector does the aggregation a bit differently. It defines a maximum duration and uses binary search in order to find the actual end of a fixation quickly.
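A rough illustration of that idea (this is not Pupil's actual implementation; it assumes a simple max-pairwise-distance dispersion measure and that dispersion grows roughly monotonically as the window extends):

```python
import numpy as np


def dispersion(points):
    """Max pairwise distance of 2D gaze points (a simple dispersion measure)."""
    pts = np.asarray(points, dtype=float)
    diffs = pts[:, None, :] - pts[None, :, :]
    return np.sqrt((diffs ** 2).sum(-1)).max()


def find_fixation_end(points, start, max_disp, max_len):
    """Binary-search the largest window points[start:end] whose dispersion
    stays within max_disp, capped at max_len samples. Returns the exclusive
    end index."""
    lo, hi = start + 1, min(start + max_len, len(points))
    while lo < hi:
        mid = (lo + hi + 1) // 2  # bias upward to find the largest valid end
        if dispersion(points[start:mid]) <= max_disp:
            lo = mid
        else:
            hi = mid - 1
    return lo
```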

user-b116a6 20 March, 2018, 16:05:07

@papr I ran some tests with the code I have written, offline and online, and the results do not deviate very much; the fixations file I create is almost identical to the one created by Player. Thanks for the help. 😀 Another question I have: what metrics should I use to calculate saccades? Should I find an algorithm online and implement it, since it is still under development at Pupil Labs?

papr 20 March, 2018, 16:07:59

@user-b116a6 Yes, that would be great. It would be very appreciated if you could make a pull request when you are done and contribute your saccade detector to the Pupil project.

papr 20 March, 2018, 16:08:42

Also, thank you very much for verifying the functionality of the Player fixation detector. 🙂

papr 20 March, 2018, 16:10:31

@user-dc2842 recommended http://ieeexplore.ieee.org/document/6944931/ but I do not think that this algorithm is suitable for online application.

user-c828f5 20 March, 2018, 20:31:47

@papr Thanks, Should I email it to you or send it over using Discord?

papr 20 March, 2018, 20:32:57

As you prefer.

user-8944cb 20 March, 2018, 23:24:03

Hi Everyone, our lab has just received the Pupil Labs eye tracker. It is great to have a forum with previous questions! I will go over them, but have several questions to get me started... What calibration method would you recommend for eye tracking during fast movement at distances between 1.5 and 3 meters (the gaze distance will change within this range during the trials): the manual marker, single marker, or natural features calibration? Also, does anyone know of information regarding synchronization of eye tracking with motion capture?

wrp 21 March, 2018, 01:48:48

Hi @user-8944cb I would suggest trying the manual marker calibration. I would also suggest recording eye videos and starting the recording prior to calibrating, so that you can re-calibrate and even re-run the pupil detection algorithms post-hoc in Pupil Player (if you want to have that option, it is best to record both the calibration and the eye videos).

wrp 21 March, 2018, 01:49:45

@user-8944cb regarding mocap + Pupil - please check out community contributed scripts in pupil-community repo: https://github.com/pupil-labs/pupil-community#scripts

mpk 21 March, 2018, 11:23:04

@here We just pushed a Pupil software release: https://github.com/pupil-labs/pupil/releases/tag/v1.6

user-f1eba3 21 March, 2018, 16:20:11

Hi there,

user-f1eba3 21 March, 2018, 16:20:46

I was trying to calibrate pupil on an htc vive and I changed the video settings as instructed on the website

user-f1eba3 21 March, 2018, 16:21:10

Now what should I do such that pupil recognises my world camera

papr 21 March, 2018, 16:21:41

There is no world camera on the htc vive add on.

user-f1eba3 21 March, 2018, 16:22:06

well can't I link it tot the htc camera ?

user-f1eba3 21 March, 2018, 16:22:11

since I have it

papr 21 March, 2018, 16:22:43

You need to provide the reference markers yourself for the calibration. The idea is that you visualize markers in your virtual scene and calibrate gaze to that scene.

papr 21 March, 2018, 16:23:13

You will have to use the HMD Calibration procedure for that.

user-f1eba3 21 March, 2018, 16:23:36

Ok

user-8944cb 21 March, 2018, 22:21:50

@wrp Thanks for your reply! When you say start recording before the calibration, do you mean do the calibration as part of the recording? Do I then press 'C' or 'R' to record during calibration? Or do you mean first record, calibrate, and then record again?

wrp 22 March, 2018, 08:49:59

@user-8944cb Example workflow:
1. Ensure the pupil is well detected and eye cameras are properly adjusted
2. In Recorder click record eye - this will record eye videos when you start recording
3. Start recording by pressing the R button in the GUI or r on the keyboard when the world window is in focus
4. Start calibrating by pressing the C button in the GUI or c on your keyboard when the world window is in focus
5. Stop recording when done by pressing r or clicking R in the GUI
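For what it's worth, the same workflow can also be driven over the network via Capture's Pupil Remote plugin; here is a hedged sketch (it assumes the default Pupil Remote port 50020 and that pyzmq is installed):

```python
# Single-letter Pupil Remote commands mirroring the hotkeys above.
COMMANDS = {
    "start_recording": "R",
    "stop_recording": "r",
    "start_calibration": "C",
    "stop_calibration": "c",
}


def send_command(socket, action):
    """Send one Pupil Remote command and return Capture's reply string."""
    socket.send_string(COMMANDS[action])
    return socket.recv_string()

# Usage against a running Capture instance (requires pyzmq):
#   import zmq
#   ctx = zmq.Context()
#   remote = ctx.socket(zmq.REQ)
#   remote.connect("tcp://127.0.0.1:50020")  # adjust host if Capture runs elsewhere
#   send_command(remote, "start_recording")
#   send_command(remote, "start_calibration")
#   ... run the calibration choreography ...
#   send_command(remote, "stop_recording")
```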

user-88dd92 22 March, 2018, 11:15:46

Hello all,

user-88dd92 22 March, 2018, 11:18:51

I am having problems getting audio in my recordings. I am using v1.6 of Capture and Player (the bundle, not from source); however, I cannot see an audio.mp4 being exported anywhere. Do I need portaudio and its pyaudio wrapper, as if I were running from source?

user-88dd92 22 March, 2018, 11:18:55

Thank you!

papr 22 March, 2018, 12:01:43

@user-88dd92 Your recording needs an audio.mp4 in order to play back audio correctly. There is no audio-only export.

user-88dd92 22 March, 2018, 12:06:23

Hi papr, when recording videos I can see the audio plugin module say 'recording audio.mp4', but this file is not generated in the recordings folder or in the exports folder; am I looking for it in the wrong place? Furthermore, when I open the .mp4 file (world.mp4 or world_viz.mp4) in an external player (VLC), the details show no audio in the file.

papr 22 March, 2018, 12:14:42

@user-88dd92 What microphone do you use and which operating system?

user-88dd92 22 March, 2018, 12:15:37

I am using headphone + mic that pupil capture recognizes on Ubuntu OS (also I have checked that sounds can be reproduced and there are no driver errors)

user-88dd92 22 March, 2018, 12:17:49

the headphones+mic is Microsoft LiveChat LX-3000

user-88dd92 22 March, 2018, 12:18:24

they seem to work fine on Ubuntu for everything else (videochat and audio recording)

papr 22 March, 2018, 12:21:26

Let me test a similar setup.

user-88dd92 22 March, 2018, 12:21:37

great! thank you very much!

user-88dd92 22 March, 2018, 14:20:12

@papr, did you face similar issues or did it all run smoothly for you? Once again, many many thanks for all your help.

papr 22 March, 2018, 14:28:06

Unfortunately, I am having trouble with listing audio input devices in the Audio Capture plugin if I run the bundle. We will try to resolve this issue as soon as possible.

user-88dd92 22 March, 2018, 14:32:13

OK, I will stay posted to see any developments. Thank you so very much for the terrific support and continuous development of this awesome tool! Cheers

user-f1eba3 22 March, 2018, 15:18:52

is there some documentation regarding the 2d vs 3d mode and each of its upsides ?

papr 22 March, 2018, 15:28:51

Short version: 2d mode uses polynomial regression for mapping based on the pupil's normed 2d position. Can be very accurate but is prone to slippage and does not extrapolate well outside of the calibration area.

3d mode uses bundle adjustment to find the physical relation of the cameras to each other, and uses it to linearly map pupil vectors that are yielded by the 3d eye model. The 3d model is constructed from a sequence of 2d ellipses.

papr 22 March, 2018, 15:29:59

3d mode is on average a bit less accurate than 2d mapping but is less prone to slippage due to the refitting of the eye model.

user-f1eba3 22 March, 2018, 15:42:42

So regarding the Calibration procedure with hmd eyes

mpk 22 March, 2018, 15:42:59

For hmd eyes use 2d

user-f1eba3 22 March, 2018, 15:43:57

should I import the Unity packages, considering I do not intend to develop anything further in Unity and just want to calibrate?

papr 22 March, 2018, 15:45:57

@user-f1eba3 So did you switch to Unity? I thought you were developing for a different platform?

user-f1eba3 22 March, 2018, 15:48:57

so i developed my zmq plugin for unreal

user-f1eba3 22 March, 2018, 15:49:21

and I am able to receive data from pupil capture

user-f1eba3 22 March, 2018, 15:49:30

its just that i want now to calibrate my device

user-f1eba3 22 March, 2018, 15:49:53

because at the moment i don't see any gaze data (since I never calibrated Pupil)

user-f1eba3 22 March, 2018, 15:50:08

and from what i understand the best way to do it is using Unity

papr 22 March, 2018, 15:50:59

Ehm, there should be some dummy gaze. make sure to subscribe to gaze and that at least one eye process is running.

user-f1eba3 22 March, 2018, 15:51:35

ok but either way i'm going to need to calibrate the pupil on the way

papr 22 March, 2018, 15:51:39

These are the relevant notifications for hmd calibration: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/calibration_routines/hmd_calibration.py#L51-L60

user-f1eba3 22 March, 2018, 15:51:39

so i decided to do it now

papr 22 March, 2018, 15:53:07

But if you are able to import Unity packages, why bother writing your own Unreal code? I thought the two platforms were incompatible with each other (I am no expert in this though!)

user-f1eba3 22 March, 2018, 15:53:18

well

user-f1eba3 22 March, 2018, 15:53:38

The idea is that we are going to use Pupil on a bigger research platform

user-f1eba3 22 March, 2018, 15:53:49

openEASE, and most of that is developed in Unreal

papr 22 March, 2018, 15:55:20

I would recommend to reuse code as much as possible. So if you are able to import the hmd eyes Unity packages, then do so. No need to rewrite everything then.

user-f1eba3 22 March, 2018, 15:55:34

i don't want to

user-f1eba3 22 March, 2018, 15:56:11

My question is as follows: to perform just the calibration, do I need the packages?

papr 22 March, 2018, 15:57:38

No. Calibration is done in Pupil Capture. You will need to provide reference positions via notifications (see my link above). Capture will collect pupil data on its own and calibrate when the procedure is done.
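A hedged sketch of packing such a notification (the subject and field names are read off the hmd_calibration.py lines linked above and may differ between Pupil versions; all values are placeholders):

```python
import msgpack  # pip install msgpack


def make_notification(subject, **kwargs):
    """Serialize a Pupil notification as (topic, payload) for the IPC backbone."""
    notification = dict(subject=subject, **kwargs)
    topic = "notify." + subject
    return topic, msgpack.packb(notification, use_bin_type=True)

# One reference datum for a marker displayed in the HMD (placeholder values):
ref = {"norm_pos": (0.5, 0.5), "timestamp": 0.0, "id": 0}
topic, payload = make_notification("calibration.add_ref_data", ref_data=[ref])

# Send via a zmq REQ socket connected to Pupil Remote (port 50020):
#   remote.send_string(topic, flags=zmq.SNDMORE)
#   remote.send(payload)
#   remote.recv_string()
```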

user-f1eba3 22 March, 2018, 16:54:18

Thx for the info

user-f1eba3 22 March, 2018, 19:34:08

Hey papr

user-8944cb 22 March, 2018, 22:21:55

Hi Everyone, I am exploring possibilities of using Pupil Mobile (we have recently purchased the headset). Can I use a phone that is not directly provided by the company (assuming it is one of the supported models), purchase the cable, and download the app, or does it have to be ordered directly from the company? Thanks!

wrp 23 March, 2018, 03:18:15

@user-8944cb - Our current tests show that the Moto Z2 Play, OnePlus 3/3T/5/5T, Nexus 5x, and Nexus 6p are the most reliable and robust devices for Pupil Mobile. While other Android devices with USBC ports can theoretically support Pupil headsets, we have found that not all USBC controllers are created equal and not all vendors provide full support of the USBC spec on a driver/firmware level.

user-c71581 23 March, 2018, 03:30:29

I think I might have posted in the wrong channel earlier:

Hey guys, working on the diy pupil hardware.

Are there alternatives to the exposed black film (can't find any locally)? I purchased the $25 recommended alternative in the Google "list of materials" - unfortunately it was a glass NIR filter that was too large.

Tried using polarizing filters (salvaged from 3d real-d glass) and I'm having trouble with getting the lens to focus and display a clear image.

I'm assuming that if I try the exposed black film, it should fix the problem with focusing a clear eye image; but I'm having trouble finding the material

user-c71581 23 March, 2018, 03:31:22

The ir leds work for certain: double checked with my cell phone camera and you can see the violet light glowing

Also tried manually adjusting the physical distance of the eye camera mount, the software focus, and the lens finger adjustment

Oh, also tried floppy disk magnetic film as a filter as well... No luck

user-c71581 23 March, 2018, 03:32:53

I tried using the undeveloped film inside the canisters for point-and-shoots from Wal-Mart, but that had a silver foil that I would need to develop with chemicals and expose to the sun or something.

wrp 23 March, 2018, 03:35:32

Just responded in the vr-ar eyes channel. This is the correct channel for this post. Thanks for reposting here

wrp 23 March, 2018, 03:37:11

as noted earlier in the thread (but somewhat hard to find) - for the DIY headset we recommend Agfa Precisa 100

user-c71581 23 March, 2018, 07:17:24

Thank you! - just to double check (is this it?): https://www.freestylephoto.biz/1175268-AgfaPhoto-CT-Precisa-E6-Slide-100-ISO-35mm-x-36-exp.?gclid=EAIaIQobChMIycjsrvKB2gIVA57ACh2uJwy0EAQYASABEgLxz_D_BwE

Also, are there any other alternatives? I'm building the headset for research and am running short on time (trying to find materials within driving distance).

user-88dd92 23 March, 2018, 13:11:28

Does anyone know if previous versions of Pupil Capture that support the 200Hz sampling rate can record sound? Cheers

papr 23 March, 2018, 13:40:07

Capture 1.5 should work for audio recordings on Linux

user-88dd92 23 March, 2018, 13:40:26

thanks! I will give it a try!

user-c71581 23 March, 2018, 16:26:42

Ok, reading through old posts: basically buy the "Agfa CT Precisa 100", develop it unexposed to sunlight, and it should work as an IR-pass filter

papr 23 March, 2018, 16:26:58

Correct

user-c71581 23 March, 2018, 17:00:21

I'm really grateful that this Discord chat exists - it's really helpful, and if I had found it earlier it would have saved me a lot of time and frustration, haha

This was on the Google docs material sheets for DIY pupil - https://docs.google.com/spreadsheets/d/1NRv2WixyXNINiq1WQQVs5upn20jakKyEl1R8NObrTgU/pub?single=true&gid=0&output=html

under depreciated tabs: https://imgur.com/a/fnp11 - ir-bandpass filter for eye cam - NIR Optical Filter, 850DF20, 11.5mm painted edge

For future reference: if anyone is building according to the list and cannot find "exposed (black) film negatives" (which means film developed without exposure to sunlight), note that the NIR optical filter above does not fit the DIY pupil as a NIR-pass filter

user-c71581 23 March, 2018, 17:02:13

Left side is the DIY lens for eye camera, right side is the nir pass filter that is too large and is on the Google docs shopping list link

user-c71581 23 March, 2018, 17:07:21

Also, depending on your area: most photo labs have been taken over by Wal-Mart, CVS, and Walgreens (USA). Consulting them for film negatives is not possible because those centers outsource the development of their film and no longer do it in house.

user-b116a6 23 March, 2018, 17:09:08

Hey guys, I don't know if this is quite relevant; however, I am developing a web application locally which will be used as a GUI for the commands I have been using so far. The user will log in, press a button, and on the click of the button I want to first execute pupil_time_sync_master.py, which is provided to synchronize the Pupil clock with the Unix epoch. The rest of the process is not relevant. I have tried everything so far but the script does not execute. Can anyone help? I am using macOS High Sierra 10.13.3, XAMPP 7.2.3-0 to host the web server, HTML, and Python.

user-c71581 23 March, 2018, 17:10:21

Try local photo labs if available in the area, friends'/family's old photo boxes (becoming obsolete due to digital photography), or the "Agfa Precisa 100" as recommended in the previous chat.

user-c71581 23 March, 2018, 17:25:54

And I see why that's the wrong size... That subheading says "deprecated" as in not recommended rather than "depreciated" as in lower price.

Can we change that material list subheading to "not recommended", or simply remove those entries? The links are either sold out or dead and add a lot more confusion.

Rather, place the "Agfa Precisa 100" there with a description saying to develop the film unexposed to sunlight to use it as a NIR-pass filter for the DIY pupil eye camera

user-c71581 23 March, 2018, 17:30:21

And add a link to the Discord chat at the top of that Google document, so users who are doing DIY can read through the chat and find up-to-date information on materials and previously asked questions

user-8944cb 23 March, 2018, 19:02:02

@wrp Thanks for your answer! I recorded and calibrated. Now I am trying to record again. However, when I move the recorded file to Pupil Player it says "No user calibration found for camera eye1 at resolution (400,400)...Loading Dummy calibration...No pre-recorded calibration available", and the same for the second eye. What am I doing wrong? Thanks!

user-b23813 24 March, 2018, 12:48:11

Hi guys, I am trying to export blinks with confidence of 0.2 and less. I changed both the onset and offset thresholds to 0.2 but the export file includes blinks with confidences higher than 0.2 too. I still get blinks with confidences higher than 0.2 even if I change the offset threshold to 0.25 and leave the onset to 0.2. Should I have done something differently? I also wanted to ask what the filter length (in seconds) means at the pupil player? Thanks!

user-42b39f 26 March, 2018, 13:21:17

Hi All,

user-42b39f 26 March, 2018, 13:24:15

Our lab just received the Pupil Labs hardware (2x 200Hz eye cameras + world camera). Following the instructions, I am getting pupil detection, but when it comes to the calibration process I am getting this error message: "Not enough ref point or pupil data". What could be the issue? Thanks for the help

user-f1eba3 26 March, 2018, 13:25:29

Hi, has anyone had some experience with msgpack?

user-f1eba3 26 March, 2018, 13:30:37

I want to use the msgpack C version to receive the data from Pupil Capture

wrp 27 March, 2018, 04:14:13

@user-8944cb did you calibrate in Pupil Capture? If not, then you will need to conduct calibration offline. Please see the offline calibration section here: https://docs.pupil-labs.com/#data-source-plugins

wrp 27 March, 2018, 04:16:17

@user-42b39f what calibration procedure are you using? If you are seeing this message it means that either reference points (e.g. markers shown on screen or manual markers) were not well detected or that the pupil was not well detected. Based on your note I would assume that the references were not detected or specified. Please follow up with more information about your calibration process.

user-42b39f 27 March, 2018, 06:28:47

@wrp I tried 3 methods (Capture 1.16.14 on a MacBook Pro, OS 10.13.3): screen calibration (on 2 screens, the MacBook laptop screen and a Thunderbolt display, in full screen mode), manual marker, and finger calibration. With the screen calibration process, I did not notice how I could add more markers. I increased the sample duration as well, but it did not change the error message. For the manual marker calibration, I guess the marker is recognized because it fills with blue at each new position and the stop marker fills with red. Note that I went through the eye processes to be sure that the pupil was well detected. With the 200 Hz camera, it looks like the image is out of focus, but I think the positioning was rather correct (pupils were recognized nearly continuously). I changed the resolution from the default (192,192) to (400,400) but it did not change anything. The exposure is not exactly the same for the 2 cameras, but I do not have access to an automatic exposure mode (perhaps that is normal?).

user-e38712 27 March, 2018, 07:37:56

Hi guys, I want to build Pupil on Windows and I'm on last stage trying to run run_capture.bat. I'm getting ModuleNotFoundError: No module named 'msgpack'

however msgpack is installed

C:\work\pupil\pupil_src>pip install msgpack
Requirement already satisfied: msgpack in c:\python36\lib\site-packages

user-e38712 27 March, 2018, 07:38:59

any ideas why it's not seeing module?

user-e38712 27 March, 2018, 07:43:15

ok, I found errors when installing msgpack

user-e38712 27 March, 2018, 07:43:31

pip install msgpack
Collecting msgpack
  Using cached msgpack-0.5.6.tar.gz
Installing collected packages: msgpack
  Running setup.py install for msgpack ... error
Exception:
Traceback (most recent call last):
  File "c:\python36\lib\site-packages\pip\compat\__init__.py", line 73, in console_to_str
    return s.decode(sys.stdout.encoding)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xbe in position 155: invalid start byte

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\python36\lib\site-packages\pip\basecommand.py", line 215, in main
    status = self.run(options, args)
  File "c:\python36\lib\site-packages\pip\commands\install.py", line 342, in run
    prefix=options.prefix_path,
  File "c:\python36\lib\site-packages\pip\req\req_set.py", line 784, in install
    **kwargs
  File "c:\python36\lib\site-packages\pip\req\req_install.py", line 878, in install
    spinner=spinner,
  File "c:\python36\lib\site-packages\pip\utils\__init__.py", line 676, in call_subprocess
    line = console_to_str(proc.stdout.readline())
  File "c:\python36\lib\site-packages\pip\compat\__init__.py", line 75, in console_to_str
    return s.decode('utf_8')
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xbe in position 155: invalid start byte

user-e38712 27 March, 2018, 07:48:06

do you know how to solve it?

user-c9c7ba 27 March, 2018, 08:05:43

Hello, does anybody have good documentation about the data received from Pupil Capture through ZeroMQ?...

I have subscribed to gaze and I receive the topics gaze.3d.0, then gaze.3d.1 and gaze.3d.01?..

papr 27 March, 2018, 08:49:57

@user-c9c7ba The different topic endings indicate which pupil data the gaze datum is based on:
- 3d indicates the gaze mapping type.
- A trailing .0. indicates that the gaze datum was monocularly mapped based on eye0 data
- A trailing .1. indicates that the gaze datum was monocularly mapped based on eye1 data
- A trailing .01. indicates that the gaze datum was binocularly mapped based on eye0 and eye1 data
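As an illustration, a small helper could classify incoming gaze data by that suffix (a sketch; the commented usage assumes pyzmq, msgpack, and Capture's default Pupil Remote port 50020):

```python
def classify_gaze_topic(topic):
    """Return which eye(s) a gaze datum was mapped from, based on its topic suffix."""
    suffix = topic.rstrip(".").split(".")[-1]
    return {
        "0": "monocular (eye0)",
        "1": "monocular (eye1)",
        "01": "binocular (eye0 + eye1)",
    }.get(suffix, "unknown")

# Usage against a running Capture instance (requires pyzmq + msgpack):
#   import zmq, msgpack
#   ctx = zmq.Context()
#   req = ctx.socket(zmq.REQ)
#   req.connect("tcp://127.0.0.1:50020")
#   req.send_string("SUB_PORT"); sub_port = req.recv_string()
#   sub = ctx.socket(zmq.SUB)
#   sub.connect("tcp://127.0.0.1:%s" % sub_port)
#   sub.setsockopt_string(zmq.SUBSCRIBE, "gaze")
#   topic, payload = sub.recv_multipart()
#   print(classify_gaze_topic(topic.decode()), msgpack.unpackb(payload, raw=False))
```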

papr 27 March, 2018, 08:50:58

@user-e38712 Try running pip3 install msgpack The 3 is important (as far as I know)

user-e38712 27 March, 2018, 08:52:57

with pip3 I'm getting same Exception

user-e38712 27 March, 2018, 08:53:25

this is my pip version if it can help solve problem: pip 9.0.3 from c:\python36\lib\site-packages (python 3.6)

papr 27 March, 2018, 08:53:56

@user-e38712 This looks correct. Mmh

user-c9c7ba 27 March, 2018, 08:54:48

@papr thanks for the reply... then when I need to construct a graph where x is time and y is gaze position, should I use the .01 topic with pupil_point?....

and when I need to construct another graph where x is time and y is eye position, can I use .0. for it?

papr 27 March, 2018, 08:55:00

@user-e38712 Could you try running pip3 install -U msgpack_python

user-e38712 27 March, 2018, 08:59:03

@papr I got the same error. I'm not the only one facing that problem, there are few issues on github but msgpack dev keeps saying "go to stackoverflow" or "it's pip problem not msgpack"

papr 27 March, 2018, 09:05:47

@user-c9c7ba Not exactly. Let me clarify the term we use:

pupil datum: The result of the pupil detector. The norm_pos field is the pupil position relative to the eye image.

gaze datum: The result of mapping one or two pupil datums into the world camera coordinate system. The norm_pos field is the gaze position relative to the world camera image.

If you want to visualize eye positions, you will have to visualize pupil data. If you want to visualize gaze data you will need to use the gaze data.

papr 27 March, 2018, 09:06:32

@user-e38712 Mmh, that is annoying =/ Unfortunately I am not able to reproduce this issue.

user-e38712 27 March, 2018, 09:10:23

@papr yes it is :/ No problem, thanks anyway for your time

user-e38712 27 March, 2018, 09:10:54

I hope it will not take me too long to find solution

papr 27 March, 2018, 09:12:43

@user-e38712 What is the full folder path from which you try to install?

user-e38712 27 March, 2018, 09:16:54

@papr actually from C:\Users\zezima\Downloads, does it even matter?

user-e38712 27 March, 2018, 09:18:42

all other libs I've installed from same path and everything went smooth

papr 27 March, 2018, 09:19:28

@user-e38712 It does matter if it had included non-unicode characters: https://stackoverflow.com/questions/25036897/pip-install-unicodedecodeerror

wrp 27 March, 2018, 09:19:35

@user-e38712 what versions of python are installed on your Windows OS and what version of Windows OS?

user-e38712 27 March, 2018, 09:21:43

@papr you're right, but there are no non-unicode characters @wrp Python 3.6.1, Windows 10 Pro 64bit

user-e38712 27 March, 2018, 09:22:08

@wrp Python is 64bit as well ofc

wrp 27 March, 2018, 09:29:41

@user-e38712 in this case you should not need to specify pip3 since you only have python3 installed on your system

wrp 27 March, 2018, 09:31:41

an alternative is to use a wheel from Gohlke https://www.lfd.uci.edu/~gohlke/pythonlibs/#msgpack

wrp 27 March, 2018, 09:32:01

and then pip install <name of .whl file>

user-e38712 27 March, 2018, 09:32:53

@wrp yes, I have no other 2.x python versions installed. pip or pip3 are doing the same. I have also newest version of pip. Cool I'll try this out in a moment

user-e38712 27 March, 2018, 09:34:48

@wrp looks good

C:\Users\zezima\Downloads>pip install msgpack-0.5.6-cp36-cp36m-win_amd64.whl
Processing c:\users\zezima\downloads\msgpack-0.5.6-cp36-cp36m-win_amd64.whl
Installing collected packages: msgpack
Successfully installed msgpack-0.5.6

wrp 27 March, 2018, 09:38:28

If you want to install msgpack from pip install msgpack then you need Visual Studio or MSVC++ installed

wrp 27 March, 2018, 09:38:49

so, perhaps it did not build due to missing compilers/dependencies

user-e38712 27 March, 2018, 09:39:12

@wrp much thanks man, looks like with msgpack lib everything is fine now

wrp 27 March, 2018, 09:39:22

the wheel file on the other hand ships with libs compiled for you, so you don't need to have MSVC++ installed

wrp 27 March, 2018, 09:39:45

so, if you want to use pip install msgpack please follow the instructions in the README of the github repo for msgpack-python

user-e38712 27 March, 2018, 09:39:49

@wrp I'm doing everything step by step like here: https://github.com/pupil-labs/pupil-docs/blob/master/developer-docs/windows.md

wrp 27 March, 2018, 09:39:56

did you install MSVC?

wrp 27 March, 2018, 09:41:14

were you using x64 Native Tools Command Prompt?

user-e38712 27 March, 2018, 09:41:31

both questions: Yes

wrp 27 March, 2018, 09:44:05

well, then in your case I don't know what else to say. Setting up a dev system for Pupil on Windows is quite the process. If you did not start from a clean slate (freshly installed OS), then there could be some environment conflicts - but hard to say without looking. Pleased that there was a workaround for you though

user-e38712 27 March, 2018, 09:49:15

@wrp yes, there are many steps to do and my OS is not freshly installed. For now I'm taking a break, thanks for the help and if I have more problems I'll let you know 😃 BTW today I'll test Pupil 1.6. Some cool features!

wrp 27 March, 2018, 09:50:27

@user-e38712 are you running from source or you tested the v1.6 bundle?

wrp 27 March, 2018, 09:54:15

@user-42b39f please could you try calibration again on the screen using screen marker calibration method. Please ensure that you are using full screen and that there are no other markers in the scene (e.g. a printed marker visible on the table)

user-e38712 27 March, 2018, 09:55:32

@wrp I'll test it from bundle, got another problem while running run_capture.bat. I'll go back to it tomorrow

wrp 27 March, 2018, 09:56:32

@user-e38712 is there an absolute necessity to run from src for your work/research?

wrp 27 March, 2018, 09:56:59

I ask because another option could be to develop a plugin or subscribe to IPC over the network

wrp 27 March, 2018, 09:57:18

this way you could just run the bundle and focus on developing the plugin/application

user-e38712 27 March, 2018, 09:58:59

@wrp not sure, I want to develop a plugin. But it's not necessary to run from source?

wrp 27 March, 2018, 09:59:26

@user-e38712 you do not need to run from source if all you want to do is develop a plugin

wrp 27 March, 2018, 10:00:07

https://docs.pupil-labs.com/#plugin-api

wrp 27 March, 2018, 10:00:34

You can add your plugin to the /plugins directory within the pupil_capture_settings dir or pupil_player_settings dir

wrp 27 March, 2018, 10:00:47

these dirs get created in your home dir after you run the bundle the first time
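
Since plugin development is the topic here, this is a minimal sketch of what a file dropped into that plugins directory might look like. The `Plugin` base class and the `recent_events` hook come from Pupil's plugin API (linked above); the import is guarded so the snippet can be read outside of Pupil, and the class name and counting logic are illustrative, not part of the official API:

```python
# Minimal sketch of a user plugin for Pupil Capture/Player, based on
# the plugin API docs. The `plugin` module is provided by the Pupil
# runtime; the guarded import lets this file be inspected standalone.
try:
    from plugin import Plugin
except ImportError:  # not running inside Pupil
    Plugin = object


class Fixation_Counter_Example(Plugin):
    """Counts fixation events seen in recent_events (illustrative only)."""

    def __init__(self, g_pool=None):
        if Plugin is not object:
            super().__init__(g_pool)
        self.fixation_count = 0

    def recent_events(self, events):
        # Pupil delivers detected fixations under the "fixations" key
        # of the events dict passed to every plugin each frame.
        self.fixation_count += len(events.get("fixations", []))
```

Dropped into `pupil_capture_settings/plugins`, a class like this becomes selectable from the Plugin Manager in the bundle, no source checkout required.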

user-e38712 27 March, 2018, 10:00:49

@wrp good to know 😉 I should have checked it on my own in the docs, my friend told me it's necessary

wrp 27 March, 2018, 10:01:18

@user-e38712 you can do a lot with plugins 😄 but it really depends on what you are trying to achieve

user-e38712 27 March, 2018, 10:05:10

@wrp actually I want to create a plugin that will count the number of fixations, their time, and the average time on surfaces, and be able to generate several different heatmaps based on these data. I don't know if I'm able to write a GUI in a plugin. I also want to develop features like generating charts, e.g. time/fixations on a specified surface

wrp 27 March, 2018, 10:06:32

This should all be possible via plugin - @papr do you see any blocking cases based on @user-e38712's description above

user-e38712 27 March, 2018, 10:11:36

@wrp Thanks guys, you are very helpful. I have to go now, I'll come back later and read @papr papr response. Have a good day!

wrp 27 March, 2018, 10:16:12

likewise 👋

papr 27 March, 2018, 10:24:56

In case you want to do this in an offline fashion: I would recommend working with the exported CSV data instead of writing a plugin. The export should contain all the data that you need.
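
To illustrate the CSV route suggested above, here is a small sketch that aggregates fixation count and mean duration per surface from an export. The column names `surface` and `duration` are assumptions; check them against the headers of your actual Pupil Player export before using this:

```python
import csv
import io
from collections import defaultdict


def summarize_fixations(csv_file):
    """Aggregate fixation count and mean duration per surface.

    Assumes columns named 'surface' and 'duration' (ms); adjust the
    names to match the headers of your actual Pupil Player export.
    """
    stats = defaultdict(lambda: {"count": 0, "total_ms": 0.0})
    for row in csv.DictReader(csv_file):
        s = stats[row["surface"]]
        s["count"] += 1
        s["total_ms"] += float(row["duration"])
    return {
        name: {"count": s["count"], "mean_ms": s["total_ms"] / s["count"]}
        for name, s in stats.items()
    }


# Tiny synthetic stand-in for an exported file:
demo = io.StringIO("surface,duration\nposter,300\nposter,500\nshelf,200\n")
print(summarize_fixations(demo))
```

The resulting dict is easy to feed into matplotlib or any charting tool for the heatmaps/charts mentioned above.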

user-42b39f 27 March, 2018, 11:18:47

@wrp I performed a screen calibration again with the default settings and with no other markers nearby and got the same message. As I don't know how to add more ref points in this calibration process, I guess that it has something to do with the pupil detection. However, when I go into the eye0 and eye1 processes and set the parameters with the algorithm view on, it looks as if the pupils were well recognized: I have a red ellipse contouring them and a central dot. Is there any way of checking the quality of the pupil detection during the calibration procedure?

user-c14158 27 March, 2018, 11:50:30

Hi, I have a new issue with the latest version of Pupil Capture (1.6.11, 200 Hz eye camera, and I'm on Windows 10): the eye 1 video regularly freezes for a short moment and the gaze mapping on the world image freezes with it. The problem occurs when the resolution is set to 192x192 at any framerate. If I lower the absolute exposure time the video seems to work fine, but then the value is automatically reset to 32 after a few seconds and the video starts freezing again. I do not have the same issue with the eye 0 video. And the same lines appear continuously in the prompt.

user-c14158 27 March, 2018, 11:50:43

Chat image

mpk 27 March, 2018, 12:25:09

@user-c14158 can you share the logfile?

user-c14158 27 March, 2018, 13:24:46

sure, however all those lines are not in the logfile

user-c14158 27 March, 2018, 13:25:05

capture.log

mpk 27 March, 2018, 13:28:20

@user-c14158 can you contact us via info[at]pupil-labs.com ? I think this might be a HW issue.

user-8944cb 27 March, 2018, 13:37:50

Hi @wrp , thanks for replying. Yes, I did calibrate in Pupil Capture, and I tried the screen calibration method (not sure how to add additional calibration points), the manual marker method (tried adding more than 9 points for calibration), and the single marker calibration. I tried on two computers, and in the 1.16.14 and the previous capture software. In all of them the accuracy and precision values in the 'accuracy visualizer' after calibration look good, but I got this error for every single video I recorded after the calibration and tried to open in Pupil Player. I can't find any information about this in the Pupil Docs. Thank you!

user-c14158 27 March, 2018, 13:42:03

@mpk I tried again with Pupil Capture 1.5.12 and I can't replicate this issue, that's why I thought it was SW related

papr 27 March, 2018, 13:46:07

@user-c14158 We think that the v1.6 issue is due to the stripe detector that we implemented. It resets the camera's frame rate automatically when stripes are detected. Your camera seems to trigger a lot of false positive detections.

Please share an example recording with us that was recorded using v1.5 and includes the eye videos. You need to enable the Record eyes option in the recorder menu.

We can use it to improve the false positive rate of our detector.

user-c14158 27 March, 2018, 14:45:01

@papr I shared the recording with info@pupil-labs.com

papr 27 March, 2018, 14:46:12

Thank you

user-f1eba3 27 March, 2018, 15:37:40

Hi Pupil guys, is there anywhere a specification of the message that is sent on every topic?

user-f1eba3 27 March, 2018, 15:38:32

I'm trying to receive data from Pupil Service in a static way with the core zmq C API, which means I need to know exactly how the message is encoded when being sent 😄

papr 27 March, 2018, 15:43:01

A zmq message is composed of one or more zmq frames. Notifications are composed of at least two zmq frames. The first simply contains the topic, and the second frame includes msgpack encoded data.

mpk 27 March, 2018, 15:45:14

@user-f1eba3 I recommend this: https://docs.pupil-labs.com/#the-ipc-backbone

mpk 27 March, 2018, 15:46:27

to print the message you are interested in to learn about the specific payload.

mpk 27 March, 2018, 15:48:01

@user-f1eba3 just to clarify, don't use Python instead of C; just use the script to introspect and learn about the format on the wire.
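
A sketch of that introspection approach, assuming pyzmq and msgpack-python are installed and Pupil Capture/Service is running with Pupil Remote on its default port 50020 (adjust the address/port to your setup). It follows the two-frame layout described above: the first frame is the plain-text topic, the second is a msgpack-encoded payload.

```python
# Subscribe from Python, print every message, then use what you see
# to implement the C client. The imports are guarded so this file can
# be read without the dependencies installed.
try:
    import zmq
    import msgpack
except ImportError:
    zmq = msgpack = None


def split_topic(frame):
    """The first zmq frame of a Pupil message is the plain-text topic."""
    return frame.decode("utf-8")


def introspect(addr="tcp://127.0.0.1", req_port=50020, topic="pupil."):
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect(f"{addr}:{req_port}")
    req.send_string("SUB_PORT")        # ask Pupil Remote for the SUB port
    sub_port = req.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"{addr}:{sub_port}")
    sub.subscribe(topic)
    while True:
        # frame 0: topic string, frame 1: msgpack-encoded payload
        frames = sub.recv_multipart()
        print(split_topic(frames[0]), msgpack.unpackb(frames[1], raw=False))


# introspect()  # uncomment with a live Pupil instance running
```

Once the printed payloads make the wire format clear, the same REQ "SUB_PORT" handshake and multipart receive translate directly to the zmq C API (`zmq_msg_recv` per frame, then a C msgpack decoder).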

user-921ec1 27 March, 2018, 21:49:36

Hi just testing out pupil mobile, and can't seem to calibrate or lock on to pupil? Is there a way to do this in standalone mode (i.e., just running from the app, no connection to pupil service/recorder on a pc)?

user-051e49 27 March, 2018, 21:59:11

Hi guys, I want to use Pupil on Windows in real time but don't know how to start. I want to create a program for pattern recognition based on YOLO (Darknet). My question is: is it possible to stream the data (Pupil CSV files with the metadata & mp4 video) into my own module without compiling Pupil? And if it's possible, how should I start?

wrp 28 March, 2018, 04:29:04

@user-921ec1 Pupil Mobile app will not perform pupil detection. It is designed to be used for local recording of sensor data (eye videos, world videos, audio, imu data) and/or for broadcasting video data over a wifi network to another computer running Pupil Capture.

wrp 28 March, 2018, 04:29:57

If you record data locally on your Android device with Pupil Mobile, you will use Pupil Player to perform offline (post-hoc) pupil detection and calibration for gaze estimation. This means that you should include the calibration procedure(s) in your recording.

wrp 28 March, 2018, 04:30:38

Alternatively you could stream over wifi to another computer running Pupil Capture and perform pupil detection and calibration on the subscriber/client desktop/laptop

wrp 28 March, 2018, 04:33:13

@krzysztof.kozubek#6593 Maybe you want to take a look at https://pupil-labs.com/blog/2017-12/real-time-object-recognition-using-fixations-to-control-prosthetic-hand/ - this community member has a very well documented project that demonstrates how you can create a plugin and extend Pupil to do object classification

user-051e49 28 March, 2018, 05:33:11

@wrp yes I saw it. If I understand correctly, to create object classification I don't need to change Pupil's code or use the source code? [email removed] @zezima)

user-42b39f 28 March, 2018, 06:28:17

With regards to my calibration issues, is it normal that the cameras are fitted in a way that on the left eye the upper lid appears upward and on the right eye it is downward?

mpk 28 March, 2018, 06:29:38

@user-42b39f yes, it's just a display issue; you can set "rotate image" in the eye window general settings to remedy this.

user-42b39f 28 March, 2018, 10:12:23

@mpk thanks for your reply. Unfortunately it does not give me any indication of how to solve the calibration issue. I still can't get any calibration functioning. How can I check the quality of the pupil detection during the calibration process? I don't really know how to qualify good pupil detection or how to optimize it.

user-f1eba3 28 March, 2018, 11:16:53

I'm very close to getting my C++ communication with Pupil working, but for some reason I got stuck

user-f1eba3 28 March, 2018, 11:18:49

I even made a Google Doc to better explain where I got stuck

user-f1eba3 28 March, 2018, 11:19:04

If anyone is kind enough to take a look, please do: https://docs.google.com/document/d/1AYT1HqmhT2tnS5kr6gi5J34aIaZnjc7GSSqrYYtiL8w/edit

mpk 28 March, 2018, 11:50:38

@user-42b39f I think the best way to debug is to set up a quick video meeting. Can you reach out to us via email (info[at]pupil-labs.com ) ?

user-42b39f 28 March, 2018, 12:35:17

@mpk yes I will, thanks

user-02ae76 28 March, 2018, 15:31:25

Hey, I've been running into a persistent issue where I cannot get Pupil Groups to successfully run on my MacBook Pro (2017). I usually can get both cameras and instances working on the first go, then the second instance "loses" the eye tracker and will no longer recognize it. I am wondering if this is a processing power issue, but I am unsure how to isolate it.

user-02ae76 28 March, 2018, 16:06:19

If anyone has tips or experience with Pupil Groups, I'd greatly appreciate any advice.

user-f1eba3 28 March, 2018, 17:24:13

So I managed to receive data in Unreal with C++. Now the question is how I can get the device calibrated as easily as possible

user-f1eba3 28 March, 2018, 17:24:37

I might attempt to reproduce something from hmd-eyes, but is there any other way?

user-1bcd3e 28 March, 2018, 19:15:12

hey! Today I made a pretty long recording with my MacBook Pro 2017 and Pupil Capture (around 5.12 GB it seems) but I'm not able to open it in Pupil Player 😦 I suppose the file is too big to be loaded by Player.... is there any possibility to solve this issue? xx

user-e7102b 28 March, 2018, 20:37:45

@mpuccioni#0374 I can open similar size files on my MacBook Pro 2017. They take a while to load but it works ok. How much ram do you have on your machine?

user-c71581 28 March, 2018, 22:26:45

@papr hey papr, the CT Precisa 100 came in today - I dunno if you're in America, but Walmart, CVS, Walgreens and a bunch of other places develop but do not return the film negatives

Do you have any advice? I'm thinking an online company, possibly

Chat image

user-1bcd3e 29 March, 2018, 05:47:37

@user-e7102b i have 8 GB ram !

user-1bcd3e 29 March, 2018, 05:47:50

I'll try again thnks

user-e7102b 29 March, 2018, 05:49:08

Ah ok, I have 16GB on my machine. Maybe just give it a few minutes to load...

user-1bcd3e 29 March, 2018, 05:49:45

oh yes !

user-1bcd3e 29 March, 2018, 05:50:03

I'll let u know 😃

user-f1eba3 29 March, 2018, 14:01:40

Does anyone know where the schema of the Pupil messages on the different topics is stored in code?

user-b23813 29 March, 2018, 17:11:17

Does anyone know how I can export only the blinks with a confidence lower than 0.2? I can't understand what to put as onset and offset confidence thresholds to get this result...

user-c71581 29 March, 2018, 22:10:48

So I'm running into problems with the "CT Precisa 100": nobody in the area develops this film - I even checked Houston, since I'm in Texas - so the only route is to pray that someone does it online

If anyone at Pupil Labs has extra film negatives that they've used and is willing to sell them to me, I'll buy them and pay for postage, considering that developing my film takes about half a month

user-c71581 29 March, 2018, 22:12:52

I would simply order a color slide film developing kit from Amazon, but that's going to run over $120

user-bfecc7 30 March, 2018, 14:13:59

Hey everyone, just looking for advice on which method of calibration I should use. We are using Pupil Mobile and saving directly to the phone (no wifi; we'll put the data into Pupil Player after collection), and walking around outside. There is no set depth of view that our subjects will be looking at, as the task is simply to explore the environment. With that in mind, and after reading about the various calibration methods, I believe that the natural features method is probably the best way to go, but I'm looking for any and all suggestions. Thanks!

user-ed537d 30 March, 2018, 23:31:47

@papr apologies for bringing up old topics and apologies to anyone that has tried contacting me regarding the matlab script. My question is a follow-up on the conversation you had with @user-cf2773 in early March. I calibrated using 2D; how would I go back and determine the visual angle X/Y of where the user is looking..... it's publishing time 🙏

user-ed537d 30 March, 2018, 23:36:27

@user-e7102b strange, are you porting the information over on the same computer? I remember having issues a while back with this; the problem was an internal matlab buffer that I was able to work around based on the placement of fclose or fopen. Another workaround I used was downsampling the data on the Python side using a circular buffer that would average whatever pupil info it received and send it at the "new" sampling rate

user-8b186c 31 March, 2018, 04:35:08

Hello everyone! Does anybody know what image sensor model is used in the Pupil eye camera?

user-c71581 31 March, 2018, 04:36:42

Do you mean the mathematical formula?

user-8b186c 31 March, 2018, 13:35:53

I mean the image sensor chip model and its manufacturer... Omnivision, Sony? And what kind of shutter is it using - global or rolling?

user-d2e36d 31 March, 2018, 15:20:51

rolling

user-d2e36d 31 March, 2018, 15:21:30

only global shutter cameras I can find are TTL (analog)

user-d2e36d 31 March, 2018, 15:21:43

at least the tiny ones

user-d2e36d 31 March, 2018, 15:23:22

should I be using a stack of 3 or 4 of these? Then the visible wavelength still passes but barely

user-8779ef 31 March, 2018, 21:28:57

Hey folks, I've found some concerning timestamps coming back from the fixation detector.

Chat image

user-8779ef 31 March, 2018, 21:29:29

The fixation durations suggest that some fixations begin before the previous fixation has ended.
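
One way to check this observation systematically from exported data: a small helper that flags fixations whose end time (start plus duration) runs past the next fixation's start. It assumes start timestamps in seconds and durations in milliseconds, which matches typical Pupil exports but should be verified against your own files:

```python
def overlapping_fixations(fixations, duration_in_ms=True):
    """Return index pairs (i, i+1) where fixation i's end time
    exceeds fixation i+1's start time.

    `fixations` is a list of (start_timestamp, duration) tuples
    sorted by start time. Timestamps are assumed to be in seconds
    and durations in milliseconds (adjust `duration_in_ms` if your
    export uses seconds for both).
    """
    scale = 1e-3 if duration_in_ms else 1.0
    overlaps = []
    for i in range(len(fixations) - 1):
        start, dur = fixations[i]
        end = start + dur * scale
        if end > fixations[i + 1][0]:
            overlaps.append((i, i + 1))
    return overlaps
```

Feeding in the (start_timestamp, duration) columns of a fixations export would quantify how many adjacent fixations overlap, which is useful evidence when reporting the issue.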

End of March archive