πŸ‘ core


user-c9c7ba 01 April, 2018, 08:53:31

Hi, one more question... I am using the HMD-eyes package in Unity, but I am confused about sending requests, especially this piece of code. I am using the HTC Vive. Why is the frame size 1000x1000? What is this threshold? And why should I use these translations?:

Send(new Dictionary<string, object> {
    { "subject", "calibration.should_start" },
    { "hmd_video_frame_size", new float[] { 1000, 1000 } },
    { "outlier_threshold", 35 },
    { "translation_eye0", Calibration.rightEyeTranslation },
    { "translation_eye1", Calibration.leftEyeTranslation }
});
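A hedged reading of those parameters, worth verifying against the hmd-eyes docs: hmd_video_frame_size is the pixel size of the HMD viewport that gaze gets mapped into (the demo just uses 1000x1000), outlier_threshold is the distance in pixels beyond which calibration samples are rejected as outliers, and translation_eye0/translation_eye1 are the physical offsets of the eye cameras relative to the HMD origin, which the HMD gaze mapper needs. The notification itself is ordinary IPC traffic, so the same request can be sketched in Python; the port and the translation values below are assumptions for illustration:

```python
# Sketch, not the hmd-eyes implementation: the same notification sent from
# Python via Pupil Remote. The port (50020) and the translation values are
# assumptions for illustration. Requires pyzmq and msgpack-python.
try:
    import zmq
    import msgpack
except ImportError:
    zmq = msgpack = None  # the pure helper below still works without them


def build_notification(subject, **payload):
    """Return (topic, body) for a Pupil notification.

    Notifications travel on the IPC backbone as two frames: the topic
    string "notify.<subject>" and a msgpack-encoded dict that must
    contain the "subject" key.
    """
    body = {"subject": subject}
    body.update(payload)
    return "notify." + subject, body


def send_notification(socket, subject, **payload):
    # Pupil Remote forwards notifications sent over its REQ socket:
    # frame 1 is the topic, frame 2 the msgpack-serialized payload.
    topic, body = build_notification(subject, **payload)
    socket.send_string(topic, flags=zmq.SNDMORE)
    socket.send(msgpack.packb(body, use_bin_type=True))
    return socket.recv_string()  # Pupil Remote replies with a confirmation


if __name__ == "__main__" and zmq is not None:
    ctx = zmq.Context()
    remote = ctx.socket(zmq.REQ)
    remote.setsockopt(zmq.RCVTIMEO, 1000)    # don't block forever
    remote.setsockopt(zmq.LINGER, 0)         # drop queued frames on exit
    remote.connect("tcp://127.0.0.1:50020")  # default Pupil Remote address
    try:
        print(send_notification(
            remote,
            "calibration.should_start",
            hmd_video_frame_size=[1000, 1000],   # per-eye viewport in pixels
            outlier_threshold=35,                # sample rejection cutoff
            translation_eye0=[30.0, 0.0, 0.0],   # made-up offsets in mm
            translation_eye1=[-30.0, 0.0, 0.0],
        ))
    except zmq.Again:
        print("no reply; is Pupil Capture/Service running?")
```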

user-875de4 01 April, 2018, 22:44:24

This is a cool project

user-875de4 01 April, 2018, 22:44:34

What fps can be expected?

user-b116a6 02 April, 2018, 11:10:51

Hello again, I will be using surfaces to define my AOI as the screen of my laptop. Is there any way to split that AOI into 100 segments based on the screen's dimensions? Also, what data does the surfaces topic return when subscribing through the IPC backbone? Thank you.
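A hedged sketch covering both halves of that question: as far as I know, datums published on the "surfaces" topic carry (among other fields) a "gaze_on_srf" list whose "norm_pos" entries are normalized to the surface itself, so splitting the AOI into 100 segments can be done client-side by bucketing those positions into a 10x10 grid. The port and field names below are assumptions to verify against the docs; requires pyzmq and msgpack-python:

```python
# Sketch: subscribe to the "surfaces" topic via the IPC backbone and bucket
# on-surface gaze into a 10x10 grid (100 AOI segments). Field names such as
# "gaze_on_srf" and "norm_pos" should be verified against the current docs.
try:
    import zmq
    import msgpack
except ImportError:
    zmq = msgpack = None  # grid_cell() below works without them


def grid_cell(norm_pos, rows=10, cols=10):
    """Map a surface-normalized (x, y) position to a grid cell index 0..99.

    On-surface positions can fall slightly outside [0, 1] when gaze
    leaves the surface, so clamp before bucketing.
    """
    x = min(max(norm_pos[0], 0.0), 0.999999)
    y = min(max(norm_pos[1], 0.0), 0.999999)
    return int(y * rows) * cols + int(x * cols)


if __name__ == "__main__" and zmq is not None:
    ctx = zmq.Context()
    remote = ctx.socket(zmq.REQ)
    remote.setsockopt(zmq.RCVTIMEO, 1000)
    remote.setsockopt(zmq.LINGER, 0)
    remote.connect("tcp://127.0.0.1:50020")  # default Pupil Remote port
    remote.send_string("SUB_PORT")           # ask where the SUB socket lives
    try:
        sub_port = remote.recv_string()
    except zmq.Again:
        sub_port = None  # Capture is not running
    if sub_port:
        sub = ctx.socket(zmq.SUB)
        sub.setsockopt(zmq.RCVTIMEO, 5000)
        sub.connect("tcp://127.0.0.1:%s" % sub_port)
        sub.setsockopt_string(zmq.SUBSCRIBE, "surfaces")
        for _ in range(100):  # bounded demo loop
            try:
                topic, payload = sub.recv_multipart()
            except zmq.Again:
                break
            datum = msgpack.loads(payload, raw=False)
            for gaze in datum.get("gaze_on_srf", []):
                print(topic, grid_cell(gaze["norm_pos"]))
```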

user-b116a6 02 April, 2018, 13:50:50

Regarding my previous post, I used the script to create the markers, printed them, placed them on the 4 corners of my screen, and tried to add a new surface. I was using the Show Markers and Surfaces mode of the Surface Tracker in Pupil Capture. When I click "add surface" it starts the process and recognizes the 4 markers, but not reliably: the green colour shown on the markers is not static and flickers on and off. Will this be a problem later in the recording process? I tried to record some short videos and it always says no gaze on any surfaces.

user-b571eb 02 April, 2018, 14:52:10

Hi there, I have a question about designing visual search tasks. Does anyone happen to know some software I can try?

user-e7102b 02 April, 2018, 16:26:19

@user-ed537d Thanks for responding to my questions re your MATLAB script! Yes, I was trying to port the information within the same computer. I haven't looked at this for several weeks (we ended up not needing live gaze interaction for our current studies) but I'll take another look at it soon. However, I do recall playing around with the downsampling in the python script and not having much luck.

user-e7102b 02 April, 2018, 16:33:07

@user-b116a6 I use a similar setup (printed markers attached to screen) and noticed a similar issue with the markers sometimes flickering on/off. I had to make the markers larger to make them work more consistently at the participant viewing distance (1m20cm). Detection is much more consistent now, although some markers do still flicker...but from the data I've looked at so far this doesn't seem to affect the surface.

user-e7102b 02 April, 2018, 16:36:58

@user-3070d9 PsychoPy (python based) and Psychtoolbox (toolbox for MATLAB) are both good options for visual search tasks.

user-b571eb 02 April, 2018, 16:43:10

@user-e7102b thanks a lot

user-b116a6 02 April, 2018, 16:51:42

@user-e7102b Ohh, I'll try using larger markers then and hopefully it will be okay, thanks for the input. Any idea about the data sent through the IPC backbone or are you exporting the data analysis reports from the Player at a later stage? I want to use the data as they are received and I saw in the documentation that the topic surfaces exist. Thanks again.

user-e7102b 02 April, 2018, 16:59:37

@user-b116a6 No problem. If you happen to use Psychtoolbox, I wrote a script here for positioning markers at screen corners, resizing, etc. (https://github.com/mtaung/pupil_middleman/blob/master/matlab_functions/Display_Surface_Markers.m)

user-e7102b 02 April, 2018, 17:03:52

@user-b116a6 No idea about the IPC backbone stuff, sorry. I'm currently just using pupil for passive gaze recording. Perhaps if you make the surface more consistent you will see data in the "surfaces" topic?

user-b116a6 02 April, 2018, 17:55:36

@user-e7102b Thanks for the help, when I fix the issue with the flickering hopefully I will be receiving data too.

user-ed537d 02 April, 2018, 17:55:57

@user-e7102b I may have time to help you today if you'd still like

user-ed537d 02 April, 2018, 17:57:14

ahhhhh i think i found the mistake in the code i uploaded

user-ed537d 02 April, 2018, 18:00:40

also as a side note, I've found a way to use the Matlab Python API to load zmq and pull values, but in my experience this also leads to lag

user-ed537d 02 April, 2018, 18:00:46

will post the correction in a second

user-ed537d 02 April, 2018, 18:01:56

are you using pupilRead?

user-ed537d 02 April, 2018, 18:02:01

and do you set a timeout on the udp socket?

user-e7102b 02 April, 2018, 18:38:06

@user-ed537d I have a bit of time today too, so any help would be great! Yes, I'm using pupilRead, and setting a timeout on the udp socket. When you post the correction I'll have another go at getting it to work. Thanks again.

user-ed537d 02 April, 2018, 19:14:09

i'll dm you

user-88dd92 03 April, 2018, 12:00:04

Hello all, I am getting the following message in Pupil Capture when trying to record audio: "World: could not identify audio stream clock". Has anyone solved this issue? Thanks

mpk 03 April, 2018, 12:24:47

@user-88dd92 this is news to us. Could you post an issue outlining the problem here: https://github.com/pupil-labs/pupil/issues/new ?

user-e38712 03 April, 2018, 12:44:13

Hi guys, first of all thanks @papr for your point of view on the module-writing question I asked a few days ago.

Do you know why, when I record using Pupil Mobile while streaming and calibrating in Pupil Capture, there is no gaze data when I try to open the recording in Player afterwards? In the streamed view I could see my gaze position.

user-88dd92 03 April, 2018, 13:09:35

@mpk The issue has been posted as: Audio stream clock Issue #1139

user-88dd92 03 April, 2018, 13:09:42

thanks for all the help !

user-1bcd3e 03 April, 2018, 14:20:54

Hi guys, we are starting a new research project where we will record long interactions between people (around 40 minutes per session). Is there any possibility of cutting a long recording into shorter parts to make it easier to load into Pupil Player later? Files that big will be very hard to open on our Mac...

user-1bcd3e 03 April, 2018, 14:21:01

Thanks!

user-a04957 03 April, 2018, 14:51:41

Hi everybody, I know that the orange lines indicate the "angular distance between mapped pupil positions (red) and their corresponding reference points (blue)". BUT if I do a screen calibration, shouldn't there be only one (blue) reference point per marker and not multiple? Obviously I did not fully understand the "Visualize mapping error" view. I would be happy if someone could help me with that ;)

What light conditions are ideal to minimize this error?

Thanks!

Chat image

mpk 03 April, 2018, 15:03:14

@user-a04957 there will be more than one blue point per site if you move your head. We sample about 30 times per second, and each marker is usually presented for 1-4 seconds.

mpk 03 April, 2018, 15:04:21

@user-1bcd3e we are working on memory optimizations right now. I think 40 min recordings will be much more manageable in the near future.

user-072005 03 April, 2018, 15:08:52

When I record (single eye) with the app, after 15-30 min it will stop and say the max file size has been exceeded. The memory card isn't full, so what is limiting the file size?

mpk 03 April, 2018, 15:43:25

@user-072005 do you know what the file size is in this case?

user-d72566 03 April, 2018, 17:58:18

Hi, I also have a question about calibration and accuracy. As I understand it, there are two metrics: angular accuracy and angular precision. If I press C and calibrate, and then press T, I look at some markers and the Accuracy Visualizer opens and tells me these values. If I recalibrate, the values change. What can be considered good and bad accuracy? I don't understand these values.

Also, sometimes when I calibrate, it fails after I'm done, printing about 10 error messages saying "markers detected. Please remove all other markers". This can happen when I try to test accuracy as well.

Sometimes one or both accuracy metrics show nan (not a number).

user-aa5ce6 04 April, 2018, 01:22:54

Hey, I am wondering if the Oculus Rift DK2 Binocular Add-on also works with the Oculus Rift version sold commercially now (not the DK2). Any experiences or thoughts on how to add eye tracking to the Oculus Rift?

wrp 04 April, 2018, 04:13:11

Hi @user-aa5ce6, this is something we (Pupil Labs) have been working on for a while. We have a working prototype, but are still waiting on a revision of our new cameras with optics that will enable us to capture the eye region of a wide range of users. Our CV1 release is well behind schedule due to the limited space within the CV1, which requires very specialized cameras/optics in order for it to be an end-user (after-market) add-on. That being said, it is in the works and we really hope to have something to share with you all soon

wrp 04 April, 2018, 04:14:50

@user-d72566 Gaze accuracy, under ideal conditions, has been measured to be within 0.6 degrees (in 2d mode). However, due to varying eye appearance, facial geometry, and other factors, you may see something like 1 to 2 degrees of angular accuracy in the wild with the 3d mode. You should be aiming for something between 1-2 deg with the 3d mode.

wrp 04 April, 2018, 04:28:57

@user-d72566 regarding the message "markers detected. Please remove all other markers" - do you have other markers (e.g. manual markers) visible in the world camera (or multiple markers visible on screen)?

user-a04957 04 April, 2018, 06:30:55

Hi, how is it possible that norm_pos_x (column 4) and norm_pos_y (column 5) have values outside the range [0, 1]? What does that mean (does it mean anything at all)? This happens even with a confidence of 1.0 (see picture, column 3). I also see very strange pupil diameter values.

Thank you!

@mpk Thank you, that makes sense now

Chat image

user-d72566 04 April, 2018, 07:29:55

@wrp Huh, I hadn't noticed that there were two different detection & mapping modes, 2D and 3D. I can't find anything about them in the docs. What are the pros and cons of each mode? Can I find more info somewhere? The only thing I found was this: https://docs.pupil-labs.com/#notes-on-calibration-accuracy but it doesn't explain the 2D or 3D modes.

But am I doing the procedure correctly? I mean, first I do a calibration and then I can just open the Accuracy Visualizer plugin and check the accuracy. Now I know that I should aim for 2 or less. Is there some literature or reference I can use for this? (I'm using Pupil for my Bachelor's thesis.)

Regarding the markers, no, I don't have any other markers visible. Not sure if it is a bug. If I restart the application and calibrate, I don't get the errors, but if I do multiple calibrations in the same instance of the program, that's when it starts printing them. I'm running Pupil version 1.5.12.

user-ef998b 04 April, 2018, 08:59:31

Dear pupil users, please help me! I cannot set up the system on my computer. Can somebody show me the way? Thanks

user-ef998b 04 April, 2018, 09:01:50

The installation is causing me some difficulties

user-b571eb 04 April, 2018, 10:03:21

I plan to use PyGaze. I read the previous chat about this and it seems it is only possible to synchronize the clocks. Does anyone have experience using PyGaze, and how would one add support for it?

wrp 04 April, 2018, 10:33:02

@user-d72566 yes, we should provide more information about the difference of 2d vs 3d mode in the docs. A coarse summary is here: https://docs.pupil-labs.com/#pupil-detection

Please update to v1.6 - https://pupil-labs.com/software

Re accuracy test: Yes, you would do a calibration, then accuracy test right after (e.g. t on the keyboard). @user-92dca7 can you reference a paper on physiological limitations of human vision for @user-d72566 ?

wrp 04 April, 2018, 10:33:57

@user-ef998b please could you provide some information about your system: 1. Computer specs 2. Operating system and version 3. Pupil hardware you are using 4. Pupil software version number 5. What difficulties/issues are you observing?

user-112ecc 04 April, 2018, 13:04:39

I have the same problem as @user-a04957 . The "norm_pos_x" and "norm_pos_y" values are supposed to be normalized, but there are many points outside the [0, 1] range (surprisingly). Points with low confidence values (<0.6) are not displayed.

Chat image

user-112ecc 04 April, 2018, 13:07:36

*(surprisingly with few points that have negative values)
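For what it's worth, out-of-range values are not necessarily a bug: the gaze mapper can place an estimate outside the world frame (and hence outside [0, 1]) while pupil detection confidence stays high. A minimal post-hoc filter, assuming the standard gaze_positions.csv column names from the Pupil Player export:

```python
# Sketch: drop low-confidence gaze rows and rows whose normalized position
# falls outside the frame. This is post-hoc cleanup, not a fix for the
# mapping itself. Column names assume a standard gaze_positions.csv export.
import csv


def clean_gaze(rows, min_confidence=0.6):
    """Yield rows with confidence >= threshold and norm_pos inside [0, 1]."""
    for row in rows:
        if float(row["confidence"]) < min_confidence:
            continue
        x, y = float(row["norm_pos_x"]), float(row["norm_pos_y"])
        if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0:
            yield row


if __name__ == "__main__":
    import os
    if os.path.exists("gaze_positions.csv"):  # path from a Player export
        with open("gaze_positions.csv", newline="") as f:
            kept = list(clean_gaze(csv.DictReader(f)))
        print(len(kept), "rows kept")
```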

user-a04957 04 April, 2018, 13:14:42

Hi,

whenever I work with multiple markers (using only the "offline surface tracker" in the Player) and try to set the surfaces, my computer crashes. I am using Windows 10: would it be more stable to use Linux? Do you guys have any idea/workaround? The setup can be seen in the picture. Thanks!

Chat image

mpk 04 April, 2018, 13:16:40

@user-a04957 can you send the error/exception that is raised in terminal?

mpk 04 April, 2018, 13:17:13

@user-112ecc are you looking at the gaze topic, pupil or gaze on surface?

user-a04957 04 April, 2018, 13:28:46

@mpk yes, here it is. Thanks! PS: Is there a way to save the surfaces defined in Pupil Capture (where are they stored)? (e.g. save a config file to a USB stick) => If I define specific surfaces for 64 markers, it would be pretty nice to export them to another machine.

Chat image

user-c6967b 04 April, 2018, 15:15:31

hi there

user-d72566 04 April, 2018, 15:32:46

@wrp Alright, thanks.

I'm using a modified backend, so I'm running from source. I have tried 1.6 but it gave me lots of errors, so I will stick with my version for now. :D

Regarding the accuracy, a test does not seem to be needed? When I calibrate and check the plugin, it might say 3.23354665. Then if I calibrate again, I might get 1.7545343545. This is all without running the test, just a regular calibration.

Another question about Pupil Player. When I load a recorded video, it always says "no pre-recorded calibration available. Loading dummy calibration". Does this mean that the Player ignores the calibration I did in Pupil Capture? I've tried including the user_calibration_data file in the recordings folder, but Pupil Player ignores it. I thought calibration data was exported when recording?

(Ignore the image)

Chat image

user-f1eba3 04 April, 2018, 16:53:06

Hi guys

user-f1eba3 04 April, 2018, 16:54:06

Do you know in the code of https://github.com/pupil-labs/pupil where the sending methods are written ?

user-f1eba3 04 April, 2018, 16:54:31

I want to observe their behavior so that I can accomplish my own communication in cpp
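If I remember the source layout correctly, the socket wrappers live in pupil_src/shared_modules/zmq_tools.py (the Msg_Streamer/Msg_Receiver classes); everything on the wire is plain ZeroMQ plus msgpack, so a C++ client only needs a zmq binding (e.g. cppzmq) and msgpack-c to speak the same format. A Python sketch of the framing a client has to reproduce (most topics use two frames; some, such as video frames, append extra binary frames):

```python
# Sketch of the IPC wire format: frame 0 is the topic string, frame 1 is a
# msgpack-encoded dict. Reproducing exactly these frames from C++ (cppzmq +
# msgpack-c) should interoperate with Pupil's Python side.
try:
    import msgpack
except ImportError:
    msgpack = None  # required for the payload frame


def notification_frames(subject, **payload):
    """Build the two ZeroMQ frames for a "notify.<subject>" message."""
    if msgpack is None:
        raise RuntimeError("msgpack-python is required")
    body = {"subject": subject}
    body.update(payload)
    return [("notify." + subject).encode(),
            msgpack.packb(body, use_bin_type=True)]


def parse_message(frames):
    """Decode a received [topic, payload] frame pair back into Python values."""
    if msgpack is None:
        raise RuntimeError("msgpack-python is required")
    return frames[0].decode(), msgpack.unpackb(frames[1], raw=False)
```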

user-dae976 04 April, 2018, 18:41:06

Hi everyone! I’m trying to have Matlab/PTB3 talk to Pupil Capture to start recording, send triggers, etc. I’m using the pupil middleman scripts (from @user-e7102b & @user-dfeeb9, https://github.com/attlab/pupil_middleman). I can get the sample_mm.py script running, and it opens the Pupil Remote plugin in the Pupil Capture GUI successfully. In addition, in the command output I can see the various triggers being sent from Matlab and received by the middleman script (e.g., mm_modules.pyudp:Received buffer: b'START_CAL'. b'START_CAL' 1521745885021). However, the triggers don’t seem to show up in Pupil Capture or do anything there (e.g., calibration/recording doesn’t start when I send the commands from Matlab). I’m running on Windows 7 (running both Matlab and Pupil Capture on a single machine). I’m hoping I might get some suggestions from the community for troubleshooting. I have some experience with Matlab but almost none with Python, so it’s possible I’m missing something very obvious. Thanks!

user-e7102b 04 April, 2018, 19:00:42

@user-dae976 Welcome fellow MATLAB user! It sounds like the matlab>python side of things is working OK, but not the python>pupil capture. Did you check to make sure the addresses in sample_mm.py and pupil remote are the same? @user-dfeeb9 wrote the python script, so he might have some other suggestions? Re not seeing the triggers appear in Pupil Capture, that's not surprising (this should be possible, but I think we need to add a few lines of code to the python script to make it work). However, when you play back the recording in pupil player, you should see the annotations then.

user-e7102b 04 April, 2018, 19:01:53

Also, bear in mind that I haven't tested this on Windows (only Mac). But it shouldn't really matter...

user-dfeeb9 04 April, 2018, 19:15:03

Hi @user-dae976, thanks for pinging us on this; it's good to see my code being put to the test. As @user-e7102b said, we'd like to check that you have everything connected to the correct addresses (a basic check, but good for sanity), though my guess is that they are. In fact, given that the py code is able to start the annotations plugin (if I understand you correctly re: opening the remote plugin), the py script is properly connected to Pupil Remote. The problem therefore is with annotations and triggers. I have some ideas about this, but can I ask what your setup is like and what you are recording? Also, have you looked at the Pupil Capture log? In theory, any annotations/triggers you send should register in that log, so if they aren't showing up in the Pupil Capture logs, they aren't being sent/received properly. If they show up in the logs but not in Pupil Player, then it's more likely an issue with your Player setup

user-dae976 04 April, 2018, 19:28:07

Hi @user-dfeeb9 and @user-e7102b, thanks for the quick replies, much appreciated! I think I described it poorly before, but yes, Python is starting the annotations plugin, so I think those addresses are correct. What can I tell you about the setup that would be useful? I'm running Matlab 2015a (32-bit; I have to use this because I am also using a reach-tracker that works better on 32-bit Matlab). The Pupil Capture GUI works fine when I use the commands there (e.g., I can calibrate, start a recording, see the file it creates, etc.). I really haven't changed the setup much from how it comes when you download it from the pupil-labs site. I'm recording in 2D. However, the Matlab script doesn't start a recording remotely, so there aren't files created in the recordings directory when I run the script. I can look at the output in the command window that opens with Pupil Capture (I think this is similar to the log file, though tell me if there is somewhere else I should look), and I don't see anything pop up when I send various commands through Matlab. Are there any additional scripts (besides those posted on the pupil_middleman github) that I'm supposed to have downloaded?

user-e7102b 04 April, 2018, 19:35:46

I don't think there are any other scripts that you need to download. One question - which version of python are you using?

user-dae976 04 April, 2018, 19:44:00

python 3.6.4

user-8944cb 04 April, 2018, 19:47:24

Hello everyone, I have several beginner questions, and will greatly appreciate some help.
1. Is the coordinate system created by the area seen by the world camera at any given time, meaning that it changes every time the head moves? Or is a coordinate system somehow created that can be compared across different head positions?
2. Does defining a surface mean creating the surface borders as a coordinate system that remains constant even when the head moves?
3. Using the surface plugin, is it possible to define several surfaces at different depths with respect to the person wearing the eye tracker, and then get the heatmaps?
4. Can the eye tracker detect the pupil when it is completely dark? (I know that there would be no gaze positions, but will the pupil detection still be accurate?) Thanks so much!

user-e7102b 04 April, 2018, 19:49:50

@user-dae976 Can you try just running this simple python script to start a remote recording:

pupilStartRecord.py
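(For anyone reading along, a minimal stand-in for such a script might look like the sketch below. It relies on Pupil Remote's single-character string commands, "R" to start a recording, "r" to stop, "C"/"c" to start/stop calibration, on the default port 50020; check these against your Pupil Remote settings. Requires pyzmq.)

```python
# Sketch: start/stop a Pupil Capture recording remotely via Pupil Remote.
# Port 50020 is the default; adjust for your setup.
import sys

try:
    import zmq
except ImportError:
    zmq = None

# Human-readable actions mapped to Pupil Remote command strings
COMMANDS = {"start": "R", "stop": "r", "calibrate": "C"}


def remote_command(action):
    """Translate a human-readable action into a Pupil Remote command."""
    return COMMANDS[action]


if __name__ == "__main__" and zmq is not None:
    action = sys.argv[1] if len(sys.argv) > 1 else "start"
    ctx = zmq.Context()
    sock = ctx.socket(zmq.REQ)
    sock.setsockopt(zmq.RCVTIMEO, 1000)  # don't hang if Capture is absent
    sock.setsockopt(zmq.LINGER, 0)       # drop queued frames on exit
    sock.connect("tcp://127.0.0.1:50020")
    sock.send_string(remote_command(action))
    try:
        print(sock.recv_string())  # Capture acknowledges each command
    except zmq.Again:
        print("no reply; is Pupil Capture running?")
```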

user-aa5ce6 04 April, 2018, 19:49:54

@wrp thanks a lot! hope to hear announcements soon. 😃

user-dae976 04 April, 2018, 20:01:09

@user-e7102b Yes, that script worked! Started a recording in pupil capture

user-e7102b 04 April, 2018, 20:09:16

Ok great. That is essentially the same code we use to send commands from python>pupil capture, so it looks like python>pupil is working OK, as well as matlab>python. My guess is that the issue is related to the format of the triggers/annotations (as @user-dfeeb9 suggested earlier). Perhaps something specific to Windows? @user-dfeeb9 have you tried running our scripts on Windows?

user-dfeeb9 04 April, 2018, 20:13:57

Yep, I use them on windows. My current implementation is nearly identical to what's on the pupil-middleman repo with the addition of some timer threads. They run as needed in windows 10

user-e7102b 05 April, 2018, 02:13:03

@papr In an earlier discussion you suggested recording the eye videos and the calibration procedure along with the main recording, so that we could take advantage of the offline calibration tools. I have been recording the calibration procedure, but in a separate recording file from the tracking session (i.e. start rec > calibrate > stop rec; start rec > run task > stop rec). It occurred to me that you may have been suggesting recording both the calibration and the tracking session in the same, continuous recording file? Will my approach cause problems if/when I come to do offline calibration, or is this OK? Thanks

user-a04957 05 April, 2018, 06:20:11

Hi everybody, is there a way to save the surfaces defined in Pupil Capture (where are they stored)? (e.g. save a config file to a USB stick) => If I define specific surfaces for 64 markers, it would be nice to export them to another machine. Thanks

wrp 05 April, 2018, 06:47:02

@user-e7102b yes @papr's suggestion (and my suggestion) would be to record calibration and tracking session in the same recording. If you want to conduct offline calibration, then you will need to have the calibration procedure present/visible in the tracking session recording.

user-e7102b 05 April, 2018, 06:52:00

@wrp Thanks - that's good to know.

wrp 05 April, 2018, 06:53:20

@user-e7102b you're welcome

user-d72566 05 April, 2018, 07:16:18

Hi, I have another question about Pupil Player. When I load a recorded video, it always says "no pre-recorded calibration available. Loading dummy calibration". Does this mean that the Player ignores the calibration I did in Pupil Capture? I've tried including the user_calibration_data file in the recordings folder, but Pupil Player ignores it. I thought calibration data was exported when recording?

wrp 05 April, 2018, 08:00:56

Hi @user-d72566 is the message you are seeing related to the camera_models? Like this:

player - [INFO] camera_models: No user calibration found for camera world at resolution (1280, 720)
player - [INFO] camera_models: No pre-recorded calibration available
player - [WARNING] camera_models: Loading dummy calibration

If so, this is the camera calibration (not gaze calibration)

wrp 05 April, 2018, 08:01:41

@user-d72566 IIRC you are using a custom camera/backend, correct? Please clarify

user-d72566 05 April, 2018, 08:04:17

@wrp Yes that is the message I get and yes I am using a custom backend. I feel a little lost, what is the difference between camera and gaze calibration?

wrp 05 April, 2018, 08:04:34

@user-d72566 are you using a custom world camera as well?

user-d72566 05 April, 2018, 08:04:46

@wrp Yep

wrp 05 April, 2018, 08:04:56

ok, this makes sense then

wrp 05 April, 2018, 08:05:29

@user-d72566 camera calibration refers to estimating/calibrating for camera intrinsic parameters - please see: https://docs.pupil-labs.com/#camera-intrinsics-estimation

wrp 05 April, 2018, 08:07:43

Gaze calibration creates a mapping so that you can map pupil coordinates into the world coordinate space

wrp 05 April, 2018, 08:09:33

@user-d72566 out of curiosity would you mind/be able to share information about the cameras you are using in your setup/backend you are implementing?

user-d72566 05 April, 2018, 08:27:52

So camera intrinsics estimation is for modifying the scene/world view to capture more? It could actually be useful to me, as my scene camera has a low FOV. I tried the plugin, following the steps described in the docs, but I can't get it to work properly. I click show pattern, resize the window, and then press c; nothing happens.

The cameras I use are just regular webcams, not UVC compliant, hence my custom backend, which allows non-UVC devices to be used. They are pretty low-res though: the eye camera is 640x480 at 30 fps and the scene camera is 1280x720 at 30 fps.

So the section called Screen Marker Calibration is just for gaze then?

wrp 05 April, 2018, 08:29:27

@user-d72566 you need to capture at least 10 images of the pattern for camera intrinsic estimation - this will enable you to rectify camera distortion from your camera lens and estimate depth accurately

wrp 05 April, 2018, 08:30:06

screen marker calibration is for gaze calibration, correct

user-d72566 05 April, 2018, 08:30:56

Alright. But if I do the camera calibration, it will be exported during recording and then used in Player, is that correct? My modified backend is here, btw: https://github.com/Baxtex/pupil/blob/master/pupil_src/shared_modules/video_capture/none_uvc_backend.py Not perfect, but it works. (Shoutout to papr, he helped me a lot.)

user-f1eba3 05 April, 2018, 13:48:00

Any pupil capture/service developer around ?

user-d1ad4f 05 April, 2018, 14:05:13

Hi, is there a way to use the eye tracker remotely with an iPhone?

mpk 05 April, 2018, 14:15:33

@user-d1ad4f no unfortunately not. Only Android.

user-e38712 05 April, 2018, 14:27:08

Hello guys, why is there no gaze data available when I calibrate a Pupil Mobile stream in Pupil Capture and record in Pupil Mobile?

user-e38712 05 April, 2018, 14:27:44

when I choose Gaze From Record option

user-d1ad4f 05 April, 2018, 14:56:15

do you recommend a specific Android device and version?

user-d1ad4f 05 April, 2018, 15:06:22

I see they work with the Motorola Z2 Play; is there any other device that could work better and offer more recording time?

user-f1eba3 05 April, 2018, 15:32:51

Hi, so obviously you are using msgpack to serialize and deserialize the messages. Is there a definition written down somewhere?
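As far as I know there is no formal schema file: messages are msgpack-encoded dicts, and the expected keys are conventions that depend on the topic prefix ("pupil.", "gaze.", "notify.", "surfaces", ...), described in the docs rather than in an IDL. A roundtrip shows the encoding; the field values below are made up for illustration and require msgpack-python:

```python
# Sketch: msgpack roundtrip of a gaze-like datum. The keys are conventions,
# not a schema; the values here are invented for illustration.
try:
    import msgpack
except ImportError:
    msgpack = None


def encode(datum):
    # use_bin_type distinguishes bytes from str on the wire
    return msgpack.packb(datum, use_bin_type=True)


def decode(payload):
    # raw=False decodes msgpack strings back to Python str
    return msgpack.unpackb(payload, raw=False)


if msgpack is not None:
    gaze_like = {
        "topic": "gaze",              # conventional topic name
        "norm_pos": [0.48, 0.52],     # illustrative values
        "confidence": 0.97,
        "timestamp": 12345.678,       # Pupil clock, not Unix epoch
    }
    assert decode(encode(gaze_like)) == gaze_like
```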

wrp 06 April, 2018, 01:51:52

@user-d1ad4f The Z2 Play is best so far in terms of maximizing recording duration, as it has an expansion interface on the back where you can hot-swap batteries, plus an external SD card slot

wrp 06 April, 2018, 01:53:35

Other devices work as well, like the Google Nexus 5X and 6P and the OnePlus 3/3T/5/5T, but these devices cannot take extra battery power like the Z2 or an external SD card

user-dae976 06 April, 2018, 02:00:49

I imagine a few people on here use (or have used) chin rests for stability - does anybody have recommendations for models/brands? Thanks!

user-e7102b 06 April, 2018, 02:05:52

@user-dae976 We use SR Research Head Supports. They're super heavy duty and will last forever. However, they cost over $1500 when you factor in shipping, tax, import duties etc., so not cheap. I've not found a good alternative, but would love to hear if anyone else has suggestions.

user-2fbdee 06 April, 2018, 02:10:32

@user-dae976 We have ordered one from Taobao https://item.taobao.com/item.htm?spm=a230r.1.14.90.452662c6UsXujr&id=564882492485&ns=1&abbucket=8#detail

user-2fbdee 06 April, 2018, 02:11:19

It looks a bit funny, and it may not be as good as the professional ones, but at least it only costs around 4-5 USD

user-e7102b 06 April, 2018, 02:34:30

@user-2fbdee Interesting. I wonder if this will extend far enough for taller adults?

user-2fbdee 06 April, 2018, 02:37:45

@user-e7102b I will keep you guys posted when I have my hands on one of those, as I will be running some experiments that would require participants to go through passages for almost 10 mins

user-fd3b2b 06 April, 2018, 07:54:27

Hello, I have a Pupil headset with an R200 world camera and binocular eye cameras. The problem I have is similar to https://github.com/pupil-labs/pupil/issues/767. One difference is that the world camera runs via RealSense, not UVC. So I want to know if the RealSense 2.0 SDK is supported as an alternative.

user-f6beb0 06 April, 2018, 12:53:32

Hi, I am currently trying to get Pupil with an R200 camera working on Ubuntu 16.04. I have several problems, but the core one is that I can't get uvcvideo patched correctly. I have been checking the issues and searching the internet, but installing librealsense v1.12.1 (legacy) fails at the patching step. The result is that Pupil does not detect the camera. Has anyone here had the same problem and found a solution?

user-f6beb0 06 April, 2018, 12:55:47

Unloading existing uvcvideo driver... modprobe: ERROR: could not insert 'uvcvideo': Exec format error

user-af87c8 06 April, 2018, 14:00:29

Hi! I have a question regarding intrinsic camera estimation and actually applying it. I'm confused about whether Pupil Player automatically applies the intrinsic camera estimation or not (I selected fisheye during recording; the "show undistorted image" view looks good to me). In the Player I get the distorted view (there is no plugin to "show undistorted image"), but the surface marker shows me an UNdistorted rectangle. My questions: 1) On what are calibrations, fixations, surfaces, saccades etc. calculated, the distorted or the undistorted image? 2) Where do I change this in Pupil Player? Thanks for the response!

user-489bd5 06 April, 2018, 14:03:59

Hi guys, I have a quick question concerning Pupil for HoloLens. I have been using it for a week and unfortunately the wires at the plug broke. I'm adding a photo of the damaged part. Is it possible to repair it easily?

Chat image

user-a04957 06 April, 2018, 14:49:40

Hi, I have a question: why is the Start Time (Synced) in info.csv different from the first timestamp in gaze_positions.csv? Are they using the same time source?

Does this mean the first entry in gaze_positions.csv can also refer to a Unix-epoch time smaller than the one given in Start Time (System)?

Thank you very much!
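A hedged sketch of the usual interpretation: timestamps in gaze_positions.csv are on the Pupil clock (the "synced" time base), not Unix time, and data buffered shortly before the recording started can carry timestamps smaller than Start Time (Synced). The offset between Start Time (System) and Start Time (Synced) converts between the two clocks; the numbers below are made up:

```python
# Sketch: convert a Pupil-clock timestamp (as found in gaze_positions.csv)
# to Unix epoch time using the two start times from info.csv. All numbers
# below are invented for illustration.
def pupil_to_unix(ts, start_time_system, start_time_synced):
    """Shift a Pupil-clock timestamp onto the Unix epoch."""
    return ts + (start_time_system - start_time_synced)


start_system = 1522800000.25  # info.csv "Start Time (System)", Unix seconds
start_synced = 1024.50        # info.csv "Start Time (Synced)", Pupil clock
first_gaze = 1023.90          # can precede start_synced: data buffered
                              # before the recording started

unix_ts = pupil_to_unix(first_gaze, start_system, start_synced)
```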

user-67e255 06 April, 2018, 15:52:21

hi @user-489bd5 this is Douglas from the hardware team. Sorry for your inconvenience, we'll send you replacement hardware as soon as possible. I've sent you a direct message with further details

user-ed537d 06 April, 2018, 17:49:50

@user-e7102b In reference to the pupil recording etc., I have some code that adds zmq to Matlab via the Matlab Python API, and I have it working. I can post the code here if anyone would like it

user-af87c8 06 April, 2018, 17:51:00

@user-ed537d I would be interested as well; we have a working solution now, but it's interesting to see how other people did it 😃

papr 06 April, 2018, 17:55:20

@behinger#4801 @user-ed537d This will be the official Matlab example in the future https://github.com/pupil-labs/pupil-helpers/pull/26

user-af87c8 06 April, 2018, 18:00:45

@papr If I understand correctly, you open a socket from Matlab to Python, and then Python opens a zmq socket to Pupil? We used a direct Matlab zmq interface + a custom plugin to decode text commands in Pupil

user-ed537d 06 April, 2018, 18:01:27

I'll take a look in a second

user-ed537d 06 April, 2018, 18:01:58

Ok @papr, I just wanted to add everything I found while implementing our solution.

user-ed537d 06 April, 2018, 18:02:12

😬

papr 06 April, 2018, 18:02:57

@user-af87c8 No, the example uses the Matlab bindings for the czmq library and communicates directly with Pupil Remote. No middle man software required.

user-af87c8 06 April, 2018, 18:04:03

ah sorry, I was a bit confused - I had clicked through the wrong branch. Now it's clear. Let me check what we used

papr 06 April, 2018, 18:04:13

This was the closest we found to a Matlab zmq implementation. https://github.com/fagg/matlab-zmq

user-af87c8 06 April, 2018, 18:05:11

there are multiple ones. I think we found three (or four?). We used the only one that did not crash matlab & compiled fine

papr 06 April, 2018, 18:05:30

The Readme in the pr also lists the msgpack library that the example uses

user-af87c8 06 April, 2018, 18:06:52

we used: https://github.com/UCL-CATL/cosy-zeromq-matlab/tree/master/zmq There was some problem installing matlab-zmq

papr 06 April, 2018, 18:07:15

BTW, the example is only tested on Matlab 2017a, Ubuntu 17.10

user-af87c8 06 April, 2018, 18:07:18

(Ubuntu 16, matlab2016)

user-af87c8 06 April, 2018, 18:08:01

ok, cool, thanks! Can you by chance also help me out with the intrinsic camera settings I asked above?

papr 06 April, 2018, 18:11:07

Unfortunately not. I do not know that part by heart and I won't have the means to look it up for another week. I am currently on mobile only.

user-af87c8 06 April, 2018, 18:13:28

ok. Should I write a ticket? Will someone else check Discord? I also found a USB mainboard bus (or possibly the Unix driver for it, can't tell) that does not allow the same bandwidth as other USB buses. (On Dell Precision 1700s (we checked 3 of them) the maximal bandwidth is quite limited.) Is that something interesting for you?

papr 06 April, 2018, 18:17:17

Regarding the intrinsics issue, yes, please create an issue and assign me. I will check it as soon as I am back in the office.

user-ed537d 06 April, 2018, 18:19:25

@papr I found that with afagg's zmq GitHub repo there were serious issues with lag

papr 06 April, 2018, 18:19:31

Regarding the USB bus stuff, I would say that this belongs in the category personal setup details that is so specific that it is difficult for us to document such things. Nonetheless, I would appreciate it if you could name the details here. This way we can search for them in case that we need them.

user-af87c8 06 April, 2018, 18:20:42

ok. I will do this next week. We will check out an external PCI USB bus and see whether that fixes the problem. Otherwise we will need the split-hub schematics and will upgrade our setup πŸ˜‰ Thanks a lot for your effort & the amazing piece of software!

user-af87c8 06 April, 2018, 18:20:49

enjoy your weekend, bye!

papr 06 April, 2018, 18:21:02

You too!

user-ed537d 06 April, 2018, 18:41:04

Here is a very raw version of something I pulled out of the module I built in Matlab to send zmq commands. It uses the Matlab-Python API to load the zmq packages and thus doesn't require the C-compiled code that others, like Andrew Fagg, have on their GitHub. I found that this was the most reliable.

user-ed537d 06 April, 2018, 18:41:09

p.remoteIP = '127.0.0.1';
p.remotePort = '50020';

isRemoteConnected = false;

% import python packages
zmq = py.importlib.import_module('zmq');
zmqM = py.importlib.import_module('zmq.utils.monitor');
% this.msgpack = py.importlib.import_module('msgpack');

% this must be run before running Requester
context = zmq.Context();
remAddress = sprintf('tcp://%s:%s', p.remoteIP, p.remotePort);

% Requester Initialize
socket = zmq.Socket(context, zmq.REQ);

% connect and block node
block_until_connected = 1;
if block_until_connected == 1
    monitor = socket.get_monitor_socket();
    socket.connect(remAddress);
    for attempt = 1:5
        status = zmqM.recv_monitor_message(monitor);
        if double(status{'event'}) == zmq.EVENT_CONNECTED
            fprintf('Pupil Remote: Event Connected\n');
            break
        elseif double(status{'event'}) == zmq.EVENT_CONNECT_DELAYED
            fprintf('Trying to connect to Pupil Remote again: Attempt %d\n', attempt);
        else
            fprintf('ZMQ Connection Failed: Attempt %d\n', attempt);
        end
    end
    socket.disable_monitor();
    isRemoteConnected = true;
else
    socket.connect(remAddress);
    fprintf('Pupil Remote connection NOT tested...check ip and port\nIP: %s\nPort: %s\n', p.remoteIP, p.remotePort);
end

user-ed537d 06 April, 2018, 18:50:33

remoteMountLocation = '/path/to/dir/tosave/recordings';
filename = 'nameOfFile';

% Set Time Base
cmd = 'T 0.0';
socket.send_string(cmd);
zMsgRecv = char(socket.recv_string());
fprintf('Sent command %s and received %s\n', cmd, zMsgRecv);

% set mount/save location and record
mountLocation = remoteMountLocation;
filePathName = [mountLocation filename];
cmd = ['R', ' ', filePathName];
socket.send_string(cmd);
zMsgRecv = char(socket.recv_string());
fprintf('Sent command %s and received %s\n', cmd, zMsgRecv);

% set isRecording property to true
isRecording = true;

%%

% stop recording
cmd = 'r';
socket.send_string(cmd);
zMsgRecv = char(socket.recv_string());
fprintf('Sent command %s and received %s\n', cmd, zMsgRecv);

% set isRecording property to false
isRecording = false;

user-ef998b 07 April, 2018, 08:39:19

Hi everyone! I'm new here. I have finally managed to install the software, but there is a problem using the device: when I click on "detect eye", the little popup window shows a picture of the things in front of me and not my eye/pupil. It says something like: "capture failed to provide frames". Can someone please help me? Thank you very much.

user-2fbdee 07 April, 2018, 09:01:39

Hey guys, I am wondering how I can integrate different psychophysiological measures, like eye tracking and galvanic skin response... any advice?

user-e7102b 07 April, 2018, 14:46:34

@user-2fbdee Sure. Of course, it depends on the devices you're trying to integrate, but basically you'll need to send event codes simultaneously to the different recording devices, enabling you to synchronize the data in offline processing. For example, in our lab we use Matlab/Psychtoolbox for stimulus control. With each stimulus presentation we send a numerical event code to Pupil Capture via a UDP connection (https://github.com/mtaung/pupil_middleman#pupil-middleman), and to a Brain Products ActiCHamp EEG system via a custom USB/LabJack setup.
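
The event-code step described above can be sketched in a few lines of Python. This is a minimal sketch, not the pupil_middleman protocol itself: the port number and plain-text payload are assumptions, so match them to whatever the receiving end expects.

```python
import socket

def send_event_code(code, host="127.0.0.1", port=5005):
    """Fire a numeric event code as a single UDP datagram.

    Port and payload format are placeholders; adapt them to the
    receiver (e.g. the pupil_middleman relay linked above).
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(str(code).encode("ascii"), (host, port))
```

Calling `send_event_code(42)` at each stimulus onset, while writing the same code to the EEG trigger channel, leaves matching markers in both data streams for offline alignment.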

user-e7102b 07 April, 2018, 16:54:54

I have a question. Our lab is currently running a study where we record eye data from participants across multiple sessions on different days. I'd like to compare pupil diameter across these sessions, but there will likely be variability between sessions in camera position and distance from the eye. My understanding is that pupil diameter is measured in pixels. Will I be able to correct the pupil diameter values to compensate for variations in camera distance? I'm using a binocular Pupil headset in 2D mode (algorithm), and I'm recording both eye videos. Any suggestions would be appreciated. Thanks!

papr 07 April, 2018, 17:00:54

@user-e7102b You could try running offline pupil detection in 3d mode and comparing the resulting 3d pupil diameters, which are measured in mm. This assumes a well-fit 3d model, though. Please be aware that the 3d model does not take refraction by the cornea into account.

user-e7102b 07 April, 2018, 17:28:12

@papr Thanks, I'll give that a try.

user-fc793b 09 April, 2018, 00:51:45

Hi all - I'm wondering if there is already a way to import eye and world camera data of a custom headset from a network source (such as through the lab streaming layer, http, or some network protocol). For example, a raspberry pi is broadcasting camera data, which I would like to receive and analyze on a local desktop running Pupil Capture. Has this been done? Can this be easily done with a plugin, or might I have to dig deeper into re-writing some of the source code?

wrp 09 April, 2018, 04:27:18

@user-fc793b starting from pyndsi's example might be the best intro to ndsi

user-78dc8f 09 April, 2018, 10:18:42

Greetings. We are trying to set up a protocol for offline calibration. The first step is to detect the eye and set parameters for the eye model in algorithm mode. It looks like we have 4 parameters we can play with: intensity range, pupil min, pupil max, and model sensitivity. Any advice on how to optimize these parameters to achieve a good calibration?

user-78dc8f 09 April, 2018, 10:49:05

min and max seem straightforward, as does pupil intensity. Any advice on setting model sensitivity?

user-e02f58 09 April, 2018, 11:04:20

Hello, how can my plugin react when the Pupil app starts recording? I tried the following code but it does not work:

def on_notify(self, notification):
    if notification['subject'] is "recording.should_start":
        print('my plugin: recording started')

papr 09 April, 2018, 11:07:27

@user-e02f58 try == instead of is. Also look for recording.started instead of should_start
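
A standalone sketch of that fix, runnable without Pupil: compare the subject with `==` rather than `is`, and listen for `recording.started`, which fires once the recording is actually running. The `rec_path` field here is an assumption about the notification payload.

```python
class RecordingWatcher:
    """Minimal stand-in for a Capture plugin's on_notify handler."""

    def __init__(self):
        self.recording = False

    def on_notify(self, notification):
        # '==' compares string values; 'is' compares object identity and
        # is not reliable for strings built at runtime.
        if notification["subject"] == "recording.started":
            self.recording = True
            print("my plugin: recording started in", notification.get("rec_path"))
        elif notification["subject"] == "recording.stopped":
            self.recording = False
```

Feeding it a hand-built notification dict is enough to exercise the logic outside of Capture.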

user-e02f58 09 April, 2018, 11:09:36

@papr == works for me, thanks!

user-e02f58 09 April, 2018, 11:16:33

I also want to get the x and y of pupil_positions. Can I know which file contains the gaze finder, so that I can use it as a reference?

user-e02f58 09 April, 2018, 12:09:32

I wonder if I can get the x of pupil_position with something like this:

def recent_events(self, events):
    if 'pupil_positions' in events:
        pp = events['pupil_positions']
        print(pp.norm_pos_x)
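
That attribute access would fail: in Capture, events['pupil_positions'] holds a list of pupil datum dicts, and each datum stores its position as an (x, y) tuple under the 'norm_pos' key rather than separate norm_pos_x/norm_pos_y fields. A sketch of the dictionary-based access, exercised here on hand-built dicts:

```python
def latest_pupil_xy(events):
    """Return the (x, y) of the most recent pupil datum, or None.

    Field names follow Pupil's pupil datum format; events is the dict
    passed to a plugin's recent_events.
    """
    positions = events.get("pupil_positions", [])
    if not positions:
        return None
    x, y = positions[-1]["norm_pos"]
    return x, y
```

Inside a plugin, the same pattern would be `for pp in events.get('pupil_positions', []): x, y = pp['norm_pos']`.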

user-f1eba3 09 April, 2018, 12:53:22

Is there anybody with more insight into the Pupil Capture implementation who could help me with some questions?

user-dfeeb9 09 April, 2018, 14:22:36

Hi again, I have a question about vision-correction lenses and the Pupil trackers: is there currently a native solution for accommodating corrective lenses? When testing the trackers on people wearing corrective lenses, as expected, I don't get a very good capture

user-af87c8 09 April, 2018, 18:28:05

@John Spencer#6980 I would be interested in that as well

user-e7102b 09 April, 2018, 19:40:06

I'm encountering a lot of issues using Pupil Player (latest version) to open and export recordings on my MacBook Pro (2017, 16GB RAM, 2.5 GHz Core i7). The files I'm trying to open are either 10- or 20-minute recordings (Pupil headset, 120 Hz binocular, 30 Hz world). Player often crashes when attempting to open the files. Furthermore, when I get the files to open and hit the export button, it frequently fails or only exports a portion of the data. Do I not have a powerful enough system to perform these actions? I've found that I can reduce the likelihood of crashing by closing all other power-hungry applications on the machine, but even with a freshly rebooted machine and nothing else open, I'm still having trouble.

user-e7102b 09 April, 2018, 19:57:18

This might be entirely coincidental, but I seem to have more luck with the exports if I don't attempt to play back the recording first i.e. just load the folder into pupil player, then hit "e" immediately.

user-6e1816 10 April, 2018, 02:24:50

Does the data recorded with Capture contain the detailed system time?

wrp 10 April, 2018, 02:48:55

Hi @user-ef998b apologies for the delayed reply. You noted:

When i click on detect eye, the little popup window shows the picture of the things infront of me and not my eye/pupil. It says something like this: capture failed to provide frames

Based on your description it would seem that the eye window is displaying the video feed from the world camera. Please restart Pupil Capture with default settings. General > Restart with default settings.

Please also let us know what OS you're using, OS version - if the above does not resolve the issue/behavior you are experiencing.

wrp 10 April, 2018, 02:53:55

@user-78dc8f In most cases you can use the default settings for the pupil detector. I would treat model sensitivity as an advanced setting. The protocol I would recommend is as follows: 0. Adjust the eye cameras and ensure that the participant's eye is within the frame for all eye movements. Ask the participant to move the eye around to sample extreme positions, or to look at a point and roll their head, in order to build up the 3d model. 1. Check that the pupil is detected in the eye windows. If not, try adjusting the min/max pupil detection parameters.

user-2fbdee 10 April, 2018, 07:29:22

Thanks @user-e7102b

user-2fbdee 10 April, 2018, 07:29:46

By the way, regarding the chin rest, here are some photos (it seems to work well

user-2fbdee 10 April, 2018, 07:30:04

Chin Rest (Low cost)

Chat image

user-2fbdee 10 April, 2018, 07:30:28

Chat image

user-2fbdee 10 April, 2018, 07:30:40

Hope it helps!

user-e02f58 10 April, 2018, 09:56:09

Hello, does anyone know how to get a timestamp in a self-created plugin? I tried self.g_pool.timestamps but the program crashes

user-e02f58 10 April, 2018, 10:02:00

The plugin base class's g_pool has no 'timestamps' attribute, but base classes like Gaze_Producer_Base do have timestamps

papr 10 April, 2018, 10:17:06

@user-e02f58 g_pool.get_timestamp, IIRC. g_pool is just a reference to a bunch of useful objects and functions. Its base class is meaningless.

user-e02f58 10 April, 2018, 10:24:47

@papr oic, I can use self.g_pool.get_timestamp() to get the number. Is there a list that describes all the useful objects and functions in g_pool?

papr 10 April, 2018, 10:31:12

The dir() function is usually used to inspect python objects. https://docs.python.org/3/library/functions.html#dir

user-bf07d4 10 April, 2018, 11:11:03

Hello, I'm new here and have some problems with the driver installation; can somebody help me? I can't find libusbK Usb Devices; in the activate-source menu the cameras are "unknown", and the Pupil window says "camera already in use or blocked"

user-e02f58 10 April, 2018, 11:11:04

In gaze_producers.py, it gets timestamps using self.g_pool.timestamps[self.g_pool.capture.get_frame_index()]

but I cannot use this in my plugin

papr 10 April, 2018, 11:20:26

That's because gaze producers includes Pupil Player plugins. These are not compatible with Pupil Capture. Have a look at the imports in world.py for a list of Capture-compatible plugins

papr 10 April, 2018, 11:21:23

@user-bf07d4 this means that the driver installation was not successful. Try running Capture with administrator rights

user-bf07d4 10 April, 2018, 11:22:43

I've already tried but it still doesn't work

papr 10 April, 2018, 11:23:22

Please see the driver troubleshooting section in the docs then

user-bf07d4 10 April, 2018, 11:24:04

Already done 😒

user-bf07d4 10 April, 2018, 11:25:36

My computer uses windows 7 my cam is logitech c615 and hd 6000

papr 10 April, 2018, 11:28:08

Well, we only support Windows 10.

user-bf07d4 10 April, 2018, 11:28:48

thank you very much!!! Now I will try in another pc

papr 10 April, 2018, 11:31:59

@user-bf07d4 and another thing that just came to my mind: You will have to install drivers manually due to not using the standard cameras. https://github.com/pupil-labs/pyuvc/blob/master/WINDOWS_USER.md

user-bf07d4 10 April, 2018, 11:33:08

thanks πŸ˜€

user-bf07d4 10 April, 2018, 12:19:57

@papr Help please, I'm doing the installation on a Win 10 OS following the instructions on this page ("How to prepare your system for uvc on Windows (8 and later)"), but I can't find the wheel file for the step "download the wheel file from the releases page and do pip install uvc-0.7.2-cp35-cp35m-win_amd64.whl"

user-bf07d4 10 April, 2018, 12:20:10

point 8

user-bf07d4 10 April, 2018, 12:20:33

@papr

papr 10 April, 2018, 12:21:39

Use the newest wheel that you can find on the github release page

user-bf07d4 10 April, 2018, 12:23:16

ok all clear now (maybe)! thank you again

user-bf07d4 10 April, 2018, 12:45:31

Chat image

user-bf07d4 10 April, 2018, 12:46:13

this is when I try to launch the program @papr

user-f1eba3 10 April, 2018, 13:19:48

How can you simulate dummy data from Pupil Capture for others? Can you record some pupil data, or are there any dummy data available? I don't need anything specific yet, just something to simulate the behavior.

mpk 10 April, 2018, 13:36:19

@user-f1eba3 if you set the eye images to fake capture you should get dummy pupil and gaze data.

user-f1eba3 10 April, 2018, 13:39:05

How can one do that ?

user-04d904 10 April, 2018, 16:53:27

Hi all, I'm fairly new to Pupil Labs and am trying to make sense of the data export. Regarding gaze positions, Pupil gives normalized coordinates based on the world frame. I'm trying to understand what the world frame looks like in order to transform gaze onto a surface. Any help would be greatly appreciated

user-e7102b 11 April, 2018, 01:28:47

Question: is there a way to batch export raw data from multiple participants in pupil player? I'm aware that there is a batch export option, but this only seems to export the .mp4 and numpy files (I need the csv files, surface data etc.). If this doesn't exist, it would be really useful. It's very tedious to manually export each file if you have a ton of data, like we do.

user-b19122 11 April, 2018, 07:19:17

Hi All,

user-b19122 11 April, 2018, 07:21:51

Whenever I open Eye0 & Eye1 in debug mode, Pupil Capture crashes, with a prompt displaying "out of memory". Can anyone help me?

user-ef998b 11 April, 2018, 09:11:40

Dear @wrp, thanks for your help, but it is still not working. The following text appears: "EYE1: done updateing drivers! EYE1: Init failed. capture is started in ghost mode. No images will be supplied. EYE1: no user calibration found for camera Ghost capture at resolution 320,240. EYE1: no pre-recorded calibration available. EYE1: loading dummy calibration."

user-ef998b 11 April, 2018, 09:11:51

After this, nothing happens

user-ef998b 11 April, 2018, 09:12:03

Do you have any suggestions?

user-ef998b 11 April, 2018, 09:12:08

Thank you very much.

user-b91aa6 11 April, 2018, 09:19:53

Question: why pupil lab in VR can compensate slippage? How this process is done?

user-1d8719 11 April, 2018, 10:01:36

Hi, I am completely new to this app, and I don't know who I should write to. I created my own camera and am trying to find out how to obtain the software to start collecting data. Thank you in advance, Ana

wrp 11 April, 2018, 10:03:09

@user-04d904 gaze positions are relative to the world camera frame. So (0,0) would be the bottom left of the world camera video frame, and (1,1) top right corner. Re surfaces: you can use the Offline Surface tracker and export with this plugin loaded in order to get gaze positions relative to each surface
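
That coordinate convention can be turned into pixel coordinates with a short helper; note the y flip, since image pixels conventionally start at the top left. A sketch, assuming a known frame size:

```python
def norm_to_pixel(norm_pos, frame_size):
    """Map Pupil's normalized world coordinates ((0, 0) = bottom left,
    (1, 1) = top right) to pixel coordinates with the origin at the
    top left, as used by most image libraries."""
    x, y = norm_pos
    width, height = frame_size
    return x * width, (1.0 - y) * height
```

For gaze relative to a surface, the Offline Surface Tracker export already provides surface-normalized coordinates, so the same scaling applies with the surface's own dimensions.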

wrp 11 April, 2018, 10:04:01

@user-e7102b currently there is no way to batch export raw data from multiple participants in Player. This is a much-needed feature for sure. Please create an issue for this in the GitHub repo if there isn't one there already.

wrp 11 April, 2018, 10:06:58

@user-b19122 could you supply the specs of your system (I recall this is running on Windows). E.g. cpu and ram specs.

wrp 11 April, 2018, 10:07:36

@user-ef998b can you check the device manager to see if drivers are installed in libusbK category

wrp 11 April, 2018, 10:08:14

@user-1d8719 thanks for getting in touch. What camera are you using? Software is available at https://github.com/pupil-labs/pupil/releases/latest

wrp 11 April, 2018, 10:10:59

@user-b91aa6 I have responded to your question in the πŸ₯½ core-xr channel.

user-1d8719 11 April, 2018, 10:11:58

I am using the Logitech C525, and I have changed the light as it is done in the video. Is it a demo or the complete software? Thank you πŸ˜ƒ

wrp 11 April, 2018, 10:12:49

@user-1d8719 Pupil software is used by many researchers around the world πŸ˜„ - so more than just a demo. We are always working to add more features and improve - so feedback is always welcome.

user-1d8719 11 April, 2018, 10:13:24

Cool!! Thank you so much, I will!

wrp 11 April, 2018, 10:14:12

You're welcome!

user-ef998b 11 April, 2018, 10:24:12

Dear @wrp, I am really not finding this libusbK category. Where is it exactly? Thank you

wrp 11 April, 2018, 10:26:36

Device Manager > libusbK

wrp 11 April, 2018, 10:27:28

if Drivers are correctly installed, then you should see libusbK category within the Device Manager on Windows

wrp 11 April, 2018, 10:27:45

based on your message @user-ef998b I assume that you are using Windows 10 - please confirm that this is the case.

user-ef998b 11 April, 2018, 10:28:14

yes, i am using windows 10

wrp 11 April, 2018, 10:29:07

Do you see drivers installed for Pupil Cam in other categories of the device manager?

user-ef998b 11 April, 2018, 10:29:12

Pupil Cam1 ID2 is installed

wrp 11 April, 2018, 10:29:23

in what category?

user-ef998b 11 April, 2018, 10:29:40

libusbK

wrp 11 April, 2018, 10:31:12

@user-ef998b how many cameras are listed in libusbK category? You can also go to view > show hidden devices in the device manager

wrp 11 April, 2018, 10:31:32

Please also start Pupil Capture and go to General > restart with default settings

user-ef998b 11 April, 2018, 10:32:53

only one is there: cam1id2

wrp 11 April, 2018, 10:33:18

are other cameras installed in other categories. You have a binocular or monocular system?

user-ef998b 11 April, 2018, 10:34:25

I have a binocular one, and yes, under 'Cameras' there are two: ID0 and ID1

wrp 11 April, 2018, 10:35:36

To debug driver installation on Windows 10, please do the following:
1. Unplug Pupil Labs hardware
2. Open Device Manager
    2.1. Click View > Show Hidden Devices
    2.2. Expand the libUSBK devices category and expand the Imaging Devices category within Device Manager
    2.3. Uninstall/delete drivers for all Pupil Cam 1 ID0, Pupil Cam 1 ID1, and Pupil Cam 1 ID2 devices within both libUSBK and Imaging Devices Category
3. Restart Computer
4. Start Pupil Capture (With admin privlidges on Windows 10)
    4.1. General Menu > Restart with default settings
5. Plug in Pupil Headset - Please wait, drivers should install automatically (please ensure you have admin user privlidges)
user-ef998b 11 April, 2018, 10:36:18

i will give it a try

user-ef998b 11 April, 2018, 10:36:28

i really appreciate your help

user-ef998b 11 April, 2018, 10:36:53

will get back to you with the results soon

wrp 11 April, 2018, 10:36:54

@user-ef998b I hope that we can resolve driver issues soon.

user-ef998b 11 April, 2018, 10:37:15

Thank you!

user-b19122 11 April, 2018, 11:11:20

@wrp System specs are [email removed], 8 GB RAM, 1 TB HDD, 64-bit Win10.

user-b19122 11 April, 2018, 11:13:23

I don't think it is an issue with the specs.

Chat image

user-b19122 11 April, 2018, 11:13:40

have a check prompt message

Chat image

user-ef998b 11 April, 2018, 11:56:20

Dear @wrp, I have tried your suggestion, but nothing changed. Unfortunately it is now worse, because there isn't a main camera picture. Any advice? Thanks.

user-04d904 11 April, 2018, 14:14:29

Thanks @wrp for your help

user-ef1c12 11 April, 2018, 14:25:23

Hi - I am new to Pupil and when I open Pupil Capture on MacOS 10.12, I get the following error message: "EYE0: Init failed. Capture is started in ghost mode. No images will be supplied"

user-ef1c12 11 April, 2018, 14:25:36

How can I fix this?

user-ef1c12 11 April, 2018, 15:56:49

FYI I have installed all the MacOS dependencies

user-d72566 11 April, 2018, 16:44:02

Is there a repository for Pupil plugins? I was just wondering if there are any plugins for detecting and exporting saccades and smooth pursuits?

user-b9f3db 11 April, 2018, 17:58:33

Hi all, has anyone had any experience measuring pupil dilation?

mpk 12 April, 2018, 06:46:34

@user-ef1c12 have you tried running using a bundle? Does it work then?

mpk 12 April, 2018, 06:47:15

@user-d72566 maybe this is relevant: https://github.com/pupil-labs/pupil-community ?

wrp 12 April, 2018, 07:06:10

@user-ef998b please send me a DM and we can try to coordinate a way to fix this for you.

user-d72566 12 April, 2018, 07:41:33

@mpk Thanks, will take a look.

I have a feature request: it would be really nice if it were possible to zoom in on the graphs in Player, e.g. by holding CTRL and using the scroll wheel. Just a thought. πŸ˜ƒ

user-d72566 12 April, 2018, 07:47:58

Also, I'm wondering if it is possible to export annotations together with, for example, fixations? If I were to export fixations and blinks, it would be nice if they could include columns for annotations as well.

user-ef1c12 12 April, 2018, 08:04:37

@mpk I am sorry, what do you mean by "using a bundle"?

mpk 12 April, 2018, 08:05:07

@user-d72566 regarding timeline zoom: This is something we plan to implement soon!

mpk 12 April, 2018, 08:06:03

Regarding joining annotations with other data: I have not thought about this before. If there is a nice way to do this in a general way, I'd say we can do it.

mpk 12 April, 2018, 08:06:36

@user-ef1c12 I mean just download the app from here and run it: http://github.com/pupil-labs/pupil/releases/latest

user-ef1c12 12 April, 2018, 08:07:41

I get the error message on the app

user-ef1c12 12 April, 2018, 08:14:13

FYI, the headset doesn't show up on the list of USB devices while it's plugged in

Chat image

mpk 12 April, 2018, 08:39:44

@user-ef1c12 it should look like this:

mpk 12 April, 2018, 08:39:54

Chat image

mpk 12 April, 2018, 08:40:03

if not there is a USB cable issue.

user-ef1c12 12 April, 2018, 08:40:52

@mpk What can I do about it?

mpk 12 April, 2018, 08:41:19

try a different usb port? Different cable? Different Computer?

mpk 12 April, 2018, 08:41:47

if nothing helps, please reach out to info@pupil-labs.com for support on the hardware.

user-ef1c12 12 April, 2018, 08:42:55

@mpk I have tried all USB ports (they are all functioning), I can't use a different cable since it's a Pupil-specific cable

user-ef1c12 12 April, 2018, 08:43:31

Are you sure it's not a software issue? I have seen other users report the same issue before

mpk 12 April, 2018, 08:44:08

If the hardware does not show up in the device manager and you are on Mac, this is for sure not a SW issue. Make sure to restart the machine just in case.

mpk 12 April, 2018, 08:44:44

make sure that the USB-C connector is fully engaged with the Clip of our headset.

user-ef1c12 12 April, 2018, 08:45:34

That was the issue!

user-ef1c12 12 April, 2018, 08:46:14

It's working, thank you!

mpk 12 April, 2018, 08:52:26

@user-ef1c12 glad to hear I could help!

user-ef1c12 12 April, 2018, 09:53:30

@mpk Is it normal if the pupil/iris image is out of focus? I had seen a much crisper pupil/iris image with the previous Pupil headset

user-b19122 12 April, 2018, 09:55:21

@wrp - I tried uninstalling & followed your instructions, but Pupil Capture still crashes when we open debug mode for both eyes (0 & 1). The other issue is that when we start the device & both eye cameras, it gives: eye1 - [ERROR] uvc: Could not init 'Analog video standard'! Error: Error: Pipe error.

What is this?

mpk 12 April, 2018, 09:59:27

@user-b19122 we have not seen this issue before. Can you try a different machine?

mpk 12 April, 2018, 09:59:51

@user-ef1c12 if you are using a 200Hz system, yes, the image looks less nice, but tracking should work fine regardless.

user-ef1c12 12 April, 2018, 10:00:43

@mpk Got it. Thank you!

user-d72566 12 April, 2018, 10:38:09

@mpk Yeah, a tool for merging the annotations and the blinks, fixations etc. into CSV files would be great. Right now I'm doing it manually; I don't have that much video to work with at the moment (just 30 minutes), but it still takes an hour or two. I can imagine that having more video would take a substantial amount of time.

user-d72566 12 April, 2018, 10:46:29

As an example: I exported the blinks into CSV and inserted a new column with the label "Task". Then I had to move the trim handle inside Player to the first task in my experiment. Let's say that task 1 started about 30 seconds into the video. I move the trim handle to 30 seconds and look at the minimum frame index to determine where the task started. I then continue to watch until task 2 starts and do the same thing again to determine the end frame. Then I look in the CSV file and mark all cells that occurred in this frame span as task 1. This is pretty cumbersome, if you ask me, and I have to do it for both blinks and fixations. Maybe there is an easier way?
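
The hand-marking step described above can be scripted once the task frame spans are noted down. A sketch that tags exported rows by frame span; the start_frame_index column name is an assumption about the export, so adjust it to match the actual CSV header:

```python
def label_rows_by_task(rows, tasks, frame_key="start_frame_index"):
    """rows: dicts parsed from a Player export CSV (blinks, fixations);
    tasks: list of (label, first_frame, last_frame) spans noted once
    per recording. Returns copies of the rows with a 'task' column
    filled in, replacing the manual cell-marking step."""
    labelled = []
    for row in rows:
        row = dict(row, task=None)
        for label, first, last in tasks:
            if first <= row[frame_key] <= last:
                row["task"] = label
                break
        labelled.append(row)
    return labelled
```

The same task list can then be reused for both the blink and the fixation exports, so the spans only need to be determined once per recording.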

user-b19122 12 April, 2018, 11:43:37

@wrp Check the image (algorithm mode) & let me know whether it is correct?

Chat image

user-072005 12 April, 2018, 13:24:19

I am using Pupil Mobile and can record for 15-30 min, and then I get a message saying that the max file size has been exceeded and the recording cuts off. I have more room on the SD card, so I am curious what is limiting the file size?

user-82fd94 12 April, 2018, 17:32:25

Hello, we just got a second eye tracker, with the newer 200Hz fixed-focus cameras. We're having a lot of trouble with calibration relative to the old 120Hz cameras, and the camera view seems out of focus at any reasonable distance from the eye, with or without the extenders. Also, the framerate in the software seems to be capped at 120Hz. Any suggestions for what to do?

user-82fd94 12 April, 2018, 17:38:46

I see others have had the same issue, but our calibration is objectively worse than with the other device (testing them sequentially)

user-e7102b 12 April, 2018, 17:50:22

@user-82fd94 We've had a 120 Hz headset for a few months now, and just purchased a 200 Hz headset. I've not had a chance to play around with it much yet, but I noticed that the eye looks less focused, although apparently this is normal and shouldn't affect calibration. I've played around with 2D screen-marker calibration and it looks OK. Are you trying to do 2D or 3D? Re the framerate: if you drop the resolution of the eye cameras, you should see the higher frame rate options appear.

user-82fd94 12 April, 2018, 17:56:27

We're doing the 2D, and using the screen markers. I'll try changing the resolution as well, thank you.

user-82fd94 12 April, 2018, 18:40:58

Lowering the resolution seemed to fix it, we're getting similar angular accuracy for both now. Thanks @user-e7102b

user-e7102b 12 April, 2018, 18:43:51

Good to know! Seems counter-intuitive that lower resolution = better calibration...but hey, whatever works πŸ˜ƒ

user-6e1816 13 April, 2018, 02:31:20

I want to know the timestamp format in world_timestamps.npy; even better, please tell me the source code location.

user-b19122 13 April, 2018, 03:58:23

@wrp Camera Eye0 gives quite a good view, but Eye1 is a bit dark. How can we adjust the settings for the Eye1 cam? I didn't see any options.

user-bab6ad 13 April, 2018, 07:47:29

Hi everyone. I am a researcher in Austria. A colleague is actually our main eye-tracking person, but for a demo we wanted to stream from a headset and the Android phone to the desktop. It always says "Make sure 'time_sync' is loaded", and then: 'time_sync' we are the leader, 'time_sync' Become clock master with rank 6.5something, 'time_sync' 4.54 MG-CU removed, 'pyre.pyre_node' Group default-time_sync-v1 not found, 'time_sync' 6.54 MG-CU added...

user-bab6ad 13 April, 2018, 07:48:03

No video appears on the desktop. If I connect the headset to the desktop directly it works; switching on the individual cameras on the Pupil Mobile phone also works

mpk 13 April, 2018, 07:52:41

@user-bab6ad does the phone show in the pupil mobile menu of pupil capture? Did you select the correct camera in the menu below?

user-bab6ad 13 April, 2018, 07:56:15

@mpk hm, I did not see any menu on the phone. I just see the three dots for the settings. Or do you mean the main sensor listing in the middle?

user-bab6ad 13 April, 2018, 07:58:04

Maybe this is it: the only phone we have with USB-C is a Pixel with Android, I think 8.1. Can that be the problem?

mpk 13 April, 2018, 08:01:34

I was referring to the sidebar menu in Pupil Capture. I don't know if 8.1 is an issue.

user-bab6ad 13 April, 2018, 08:03:14

@mpk there I went to the NDSI manager and selected Pupil Mobile. When I select it, the time_sync messages from above appear and the screen is grey. But maybe I need to select a camera in another screen then?

mpk 13 April, 2018, 08:04:15

Chat image

mpk 13 April, 2018, 08:04:24

does the bottom menu show the camera you want?

mpk 13 April, 2018, 08:05:00

if you select the camera there and you still get a gray window, it's a phone OS issue (unfortunately).

user-bab6ad 13 April, 2018, 08:05:10

Hm, the laptop has Windows. I quickly checked the version of Pupil Capture; it looks different to me

user-bab6ad 13 April, 2018, 08:05:35

I start pupil v.1.6.11 on windows

mpk 13 April, 2018, 08:05:43

thats fine.

user-bab6ad 13 April, 2018, 08:06:28

ok, so I can select 'Pupil Mobile' in the manager dropdown, but it does not show me a remote host

user-bab6ad 13 April, 2018, 08:06:35

we are on the same WiFi, I checked that

mpk 13 April, 2018, 08:08:56

then the WiFi is blocking ports. Can you try opening a hotspot with a third phone and connecting both the laptop and the Pupil Mobile phone to that?

user-bab6ad 13 April, 2018, 08:09:35

ah, that is a possibility

user-bab6ad 13 April, 2018, 08:09:38

I will try that

user-bab6ad 13 April, 2018, 08:20:13

it now says exception in thread-8, in the end it goes to zhelper.py 489, GetAdaptersAddress: argument 4 expected LP_IP_ADAPTER_ADDRESSES instance instead of LP_IP_ADAPTER_ADRESSES

user-bab6ad 13 April, 2018, 08:20:28

o_O this error message looks as if it makes no sense

mpk 13 April, 2018, 08:21:28

@user-bab6ad sorry to hear that. I'm not sure what the issue is; sounds like an odd WiFi config. The error is in one of the external libs we use, so nothing I can directly help with 😦

user-bab6ad 13 April, 2018, 08:23:25

@mpk ok, one last question: it now also does not start when I switch back to the default network. Is there a command to reset that? Then I can play with the WiFi

mpk 13 April, 2018, 08:24:01

yes. Just delete the folder called pupil_capture_settings in your user directory.

user-bab6ad 13 April, 2018, 08:24:16

ok, perfect, thx!

papr 13 April, 2018, 13:01:51

@user-6e1816 it's a one-dimensional float64 numpy array. Use numpy.load() to load it yourself
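
For illustration, a minimal sketch of what loading such a file looks like. A synthetic array is round-tripped here since the real file comes from a recording folder; in practice you would point np.load at the world_timestamps.npy inside your recording.

```python
import numpy as np

# world_timestamps.npy is a 1-D float64 array of per-frame timestamps (seconds).
# We round-trip a synthetic array here for illustration; in practice, load the
# world_timestamps.npy from your recording folder instead.
synthetic = np.array([0.0, 1 / 30, 2 / 30], dtype=np.float64)
np.save("world_timestamps.npy", synthetic)

timestamps = np.load("world_timestamps.npy")
print(timestamps.dtype, timestamps.ndim)  # float64 1
```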

user-9b14a1 13 April, 2018, 18:02:08

Hi. Last weekend I did some offline mobile binocular recordings. With Pupil Player v1.2.7 I could successfully run offline detection and video export. Then I got the idea to try v1.6 and updated to it on macOS High Sierra. Now Pupil Player v1.6 just shuts down after I drag a local recording folder onto it. To keep working I restored my old v1.2.7 binaries, but suddenly the export doesn't start anymore; the progress just stops at the in point. Are there dependencies to fulfil or permissions to set manually? Again: never touch a running system.

user-9b14a1 13 April, 2018, 18:16:02

I will be back on Monday afternoon πŸ˜ƒ Don't worry, but maybe someone has an idea.

... πŸ˜€ v1.6 is working now. Maybe it was an invisible window in the display arrangement (with the primary screen on the right), or a missing update after the upgrade.

Chat image

user-42b39f 13 April, 2018, 20:38:45

Hi all. I am looking into some of pyglui's functionality and downloaded the example from GitHub. I cannot understand why, in pyglui/example/example.py, I can't import "draw_concentric_circles". I checked that pyglui is in the path, and there is no error with the imports from pyglui.cygl.utils.

papr 13 April, 2018, 20:45:41

@user-42b39f the example is unfortunately out of date. Please create an issue and I will update it the coming week.

user-6e1816 14 April, 2018, 05:53:52

@papr I know this, but what I really want to know is the format of this time. I used to think it came from time.time(), but apparently not.

user-42b39f 14 April, 2018, 06:14:57

Oops, I did not notice that this issue had been reported already. Is draw_concentric_circles deprecated or no longer supported? I tried to get a list of all the methods in pyglui, but as I am new to Python I didn't find out how. I tried various combinations of dir(), help(), getmembers...

papr 14 April, 2018, 06:25:24

@user-6e1816 by default, Pupil time is a monotonic clock whose epoch has no inherent meaning. But you can use Pupil Remote or Pupil time sync to change the epoch to e.g. the Unix epoch.

papr 14 April, 2018, 06:26:30

@user-42b39f it is not supported anymore since we changed the calibration marker drawing method.

papr 14 April, 2018, 06:28:10

@user-6e1816 Small addition: The time unit is always seconds
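
If you would rather keep the default epoch and align the clocks afterwards, one common approach is to record one sync pair (Pupil time and Unix time read at roughly the same instant) and shift by the offset. A minimal sketch; the function name and values are illustrative, not part of the Pupil API:

```python
# Illustrative only: align a monotonic "Pupil time" to Unix time using one
# sync measurement pair taken at (approximately) the same instant.
def pupil_to_unix(pupil_ts, pupil_at_sync, unix_at_sync):
    """Shift a Pupil timestamp (seconds, arbitrary epoch) onto the Unix epoch."""
    offset = unix_at_sync - pupil_at_sync
    return pupil_ts + offset

# e.g. at sync time the Pupil clock read 1234.5 s while Unix time was 1523948400.0
print(pupil_to_unix(1240.5, 1234.5, 1523948400.0))  # 6 s after the sync instant
```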

user-e55abf 16 April, 2018, 02:28:03

Hi all, I am interested in installing an IR camera inside a VR headset, recording pupil movement, and uploading the video into Pupil Capture + Pupil Player to get a readout of eye movement. I am not concerned with gaze (i.e. I only need one camera). Is there a smart/efficient way to do this?

user-3e880b 16 April, 2018, 03:09:59

Hello,

user-3e880b 16 April, 2018, 03:10:37

I am having a problem using Pupil Remote to connect 2 desktops

user-3e880b 16 April, 2018, 03:10:59

One desktop is connected to pupillab eye tracker and the other is for remote visualization

user-3e880b 16 April, 2018, 03:11:29

Can anyone help me with how to do that?

user-babd94 16 April, 2018, 09:23:15

Hello, I purchased the "Epson Moverio BT-300 Binocular Mount Add-on" through a Japanese agency, but I broke the wires in the plug. Could you tell me how to repair it? I am attaching a photo of the damaged part.

Chat image

papr 16 April, 2018, 09:26:06

@user-babd94 Do you feel comfortable changing the cables yourself?

user-babd94 16 April, 2018, 09:31:31

@papr I think that wiring can be done if it is not too difficult

papr 16 April, 2018, 09:32:47

Please write an email to info@pupil-labs.com including your order details, the image and a reference to this discord conversation.

user-babd94 16 April, 2018, 09:39:32

Certainly. I will send it after contacting the agency. Thank you.

user-e55abf 16 April, 2018, 18:43:35

@papr, any advice on pulling only the eye tracking data, without using a world camera?

user-006924 16 April, 2018, 22:14:36

Hi everyone, I just got my Pupil Labs eye tracker (previously I was using a DIY version). It's the high speed world camera, 200 Hz binocular version. I was wondering how I can zoom out the eye cameras, since even after adjusting them on their rail I can't get a decent view of each eye. Also, one of the eye cameras is upside down; I already flipped the image, but I was wondering if there's another solution for this problem.

wrp 17 April, 2018, 01:54:29

Hi @user-006924 nice to hear from you again. The 200hz eye cameras do not zoom or focus. The right eye image is upside down by design; the orientation of the images does not affect gaze estimation. Did you try the extension arms (the orange arms that come with the headset)?

wrp 17 April, 2018, 01:56:12

You can learn more about the additional parts in the docs here: https://docs.pupil-labs.com/#additional-parts

wrp 17 April, 2018, 01:56:42

If you are still having difficulty capturing an eye image, please share or send images/videos of the eye so we can provide feedback.

user-d74bad 17 April, 2018, 02:20:50

hey, I'd like to open the camera feed from the two IR eye cameras with pyuvc and opencv. When I run:

import uvc
import logging
import cv2

logging.basicConfig(level=logging.INFO)

dev_list = uvc.device_list()
cap = uvc.Capture(dev_list[0]['uid'])
cap.frame_mode = (400, 400, 60)
cap.bandwidth_factor = 1.3

while True:
    frame = cap.get_frame_robust()
    cv2.imshow("img", frame.img)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap = None

user-d74bad 17 April, 2018, 02:20:59

I only get a blank, black screen

user-d74bad 17 April, 2018, 02:21:10

Is there something else I have to initialize? Values, etc?

wrp 17 April, 2018, 02:22:04

@user-d74bad did you take a look at https://github.com/pupil-labs/pyuvc/blob/master/example.py already?

user-d74bad 17 April, 2018, 02:22:08

I've tried bgr and gray frame attributes as well

user-d74bad 17 April, 2018, 02:22:11

that's for the world camera

user-d74bad 17 April, 2018, 02:22:18

and I can get that working easily

user-d74bad 17 April, 2018, 02:22:26

with the example I posted

user-d74bad 17 April, 2018, 02:22:44

I just change the device list index to 1

user-d74bad 17 April, 2018, 02:24:00

@wrp yes

user-d74bad 17 April, 2018, 02:24:58

I do get a little bit of feedback: Estimated / selected altsetting bandwith : 210 / 256. !!!!Packets per transfer = 32 frameInterval = 166666

user-d74bad 17 April, 2018, 02:25:07

not sure if that's normal

wrp 17 April, 2018, 02:28:06

and why not use bgr?

user-d74bad 17 April, 2018, 02:28:12

that doesn't work 😦

user-d74bad 17 April, 2018, 02:28:22

just was trying some other formats

wrp 17 April, 2018, 02:29:39

@user-d74bad to clarify - you are trying to display frames from Pupil Labs 200hz eye camera, correct?

user-d74bad 17 April, 2018, 02:29:57

@wrp yes, exactly

wrp 17 April, 2018, 02:36:09

@user-d74bad I will try to replicate this issue and get back to you

user-d74bad 17 April, 2018, 02:36:21

beautiful, thanks

wrp 17 April, 2018, 02:53:50

@user-d74bad what OS are you using?

user-d74bad 17 April, 2018, 02:54:57

Mac OSX

user-d74bad 17 April, 2018, 02:58:35

I've also opened a working camera feed with ffmpeg of the ir cameras

user-d74bad 17 April, 2018, 02:58:56

I had to specify a certain codec though

wrp 17 April, 2018, 03:03:54

@user-d74bad I am able to successfully get frames from the eye cameras as well - but perhaps opencv is not able to display the format we are supplying - all the data is there - but I also see black screen using opencv's imshow function. We will need to update the example in pyuvc. I will talk with my team about this today

user-d74bad 17 April, 2018, 03:04:55

Interesting, you're able to display the frames with ffmpeg or opencv?

user-d74bad 17 April, 2018, 03:05:07

Oh gotcha

user-d74bad 17 April, 2018, 03:05:10

I see now

user-d74bad 17 April, 2018, 03:05:25

Great, thanks for working on that

wrp 17 April, 2018, 03:09:43

alternatively - the exposure and other settings may be set such that you can not see the image, hence black screen from opencv's imshow

wrp 17 April, 2018, 03:13:27

@user-d74bad it is in fact the exposure mode and gamma

Example - if you do the following you will see the image in opencv's imshow window

controls_dict = dict([(c.display_name, c) for c in cap.controls])
controls_dict['Auto Exposure Mode'].value = 1
controls_dict['Gamma'].value = 200

user-d74bad 17 April, 2018, 03:14:23

Heck yes! I'll try when I get back to the lab, and I pass the controls dict to the capture object somehow?

user-d74bad 17 April, 2018, 03:15:59

Oh, or those are passed by reference and affect the cap through their object reference?

wrp 17 April, 2018, 03:16:08

you don't need to pass the controls dict - they are passed via their reference in this simple example at least πŸ˜„

user-d74bad 17 April, 2018, 03:16:20

Gotcha, great thanks

wrp 17 April, 2018, 03:32:41

@user-d74bad welcome πŸ˜„

user-88ecdc 17 April, 2018, 07:24:15

Good day to you people. I have a question regarding analysis of data using surfaces. For my study, I used a single surface, and I want to compare the number of fixations that this particular surface received versus all other fixations in the dataset. Do you know how to do that? I hope it makes sense; thanks in advance πŸ˜ƒ Best regards, Alexander.

papr 17 April, 2018, 07:26:24

Hey @user-88ecdc You can export the fixations on the surface and all fixations in two separate csv files. Fixations are identified by their timestamp. Then it is just a matter of comparing the two files.
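
The comparison papr describes can be sketched in a few lines. The column name "start_timestamp" is an assumption; check the headers of your actual export files. Synthetic rows stand in for rows read with csv.DictReader:

```python
# Sketch of comparing "all fixations" vs "fixations on surface" exports by
# timestamp. Column name "start_timestamp" is an assumption -- check the
# headers in your export. Synthetic rows stand in for csv.DictReader rows.
all_fixations = [{"start_timestamp": "1.0"}, {"start_timestamp": "2.0"},
                 {"start_timestamp": "3.0"}, {"start_timestamp": "4.0"}]
surface_fixations = [{"start_timestamp": "2.0"}, {"start_timestamp": "4.0"}]

surface_ts = {row["start_timestamp"] for row in surface_fixations}
n_on_surface = sum(row["start_timestamp"] in surface_ts for row in all_fixations)
print(f"{n_on_surface} of {len(all_fixations)} fixations were on the surface")
```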

user-88ecdc 17 April, 2018, 07:55:12

Thanks for answering so quickly @papr ! I assume that exporting all fixations is done by using the Raw Data Exporter, but how do i export a csv file that only contains the fixations on one surface?

user-8fe915 17 April, 2018, 08:09:31

Hey, I have a problem with binocular recordings using two 120Hz cameras. The absolute timestamps of the two pupil signals are offset by 100-500 ms. E.g. in this picture, the orange trace leads the other one. I'm using very basic code for plotting (see next message).

Chat image

user-8fe915 17 April, 2018, 08:09:40

pldata = pl_file_methods.load_object(os.path.join(filename, 'pupil_data'))

def plot_trace(pupil):
    t = [p['timestamp'] for p in pupil]
    x = [p['norm_pos'][0] for p in pupil]
    plt.plot(t, x, 'o')

plot_trace([p for p in pldata['pupil_positions'] if p['id'] == 0])
plot_trace([p for p in pldata['pupil_positions'] if p['id'] == 1])

user-8fe915 17 April, 2018, 08:10:04

Has this ever happened to anyone else? I will try to do more recordings and tests soon, but I thought it's worthwhile to ask

papr 17 April, 2018, 08:12:21

@user-8fe915 Could you shortly summarize which hardware and software/os you used for the recording?

user-af87c8 17 April, 2018, 08:15:08

sure, 120Hz trackers, 60Hz worldcam, Ubuntu 16.04, newest compiled pupil (would need to check, downloaded it ~7 days ago)

papr 17 April, 2018, 08:15:30

@user-88ecdc You need to start the offline fixation detector and the offline surface detector, and after both finished their computation, you just need to hit export. The offline fixation detector exports the "all fixations" file and the offline surface detector exports the fixations on each surface

user-af87c8 17 April, 2018, 08:15:35

(need to go, will be back in ~2h)

papr 17 April, 2018, 08:16:49

@user-af87c8 that means that you probably recorded the eyes as well. Could you please enable the eye video overlay plugin and check if the eye videos are in sync?

user-af87c8 17 April, 2018, 09:30:55

@papr the videos are delayed as well

papr 17 April, 2018, 09:32:45

Is this the case for all of your recordings?

user-af87c8 17 April, 2018, 10:16:24

I would need to check - but how can this occur for even a single recording?

user-af87c8 17 April, 2018, 10:19:05

also the delay is increasing over time. Does pupillabs rely on continuous framerate of the videos? Or is it using the timestamps of the eye video frames

papr 17 April, 2018, 11:18:50

@user-af87c8 We have not seen this behavior before. And we also do not know what the cause could be. We specifically do not rely on a fixed frame rate but on the frame timestamps to correlate the data.

papr 17 April, 2018, 11:19:16

The timestamps are generated by the cameras.

papr 17 April, 2018, 11:21:41

btw, @user-af87c8, you can achieve the same result by loading the world/eye0/eye1_timestamps.npy files with numpy. They should yield the same timestamps.

user-af87c8 17 April, 2018, 11:34:56

I don't follow, do the cameras have their own clocks? Or do you mean the eye-process (and then the timestamps are in system time)

I checked, at least the number of timestamps (numpy.shape) and the number of frames in the video (checked with ffprobe) are the same. I occasionally get a frame drop (libjpegturbo finds a corrupt file, will soon fix this) during recording, but framedrops should not influence the mapping of timestamp + video

papr 17 April, 2018, 11:37:57

The camera timestamps are based on the system's monotonic clock that is used for the USB connection timings AFAIK

mpk 17 April, 2018, 11:38:33

@user-af87c8 just to confirm you are using a Pupil Headset with usb-c clip?

mpk 17 April, 2018, 11:39:18

framedrops do not affect the clock or timestamps.

mpk 17 April, 2018, 11:39:32

are you setting uvc controls manually?

user-af87c8 17 April, 2018, 11:40:01

no, we have the old USB 2.0 connectors, I don't set uvc controls manually

mpk 17 April, 2018, 11:40:22

@user-af87c8 so every camera is exposed separately?

mpk 17 April, 2018, 11:40:29

or a single connection?

user-af87c8 17 April, 2018, 11:40:43

single connection to the usb hub (the clip), then three cables (not sure what you mean)

mpk 17 April, 2018, 11:42:28

three cables from the headset to your PC?

papr 17 April, 2018, 11:42:31

Concerning the numpy timestamps: Pupil data is based on the frames' timestamps. Therefore [p['timestamp'] for p in pldata['pupil_positions'] if p['id'] == 1] should yield the same timestamps as np.load('eye1_timestamps.npy')

user-af87c8 17 April, 2018, 11:43:34

@papr ok this I understand. @mpk no. Single connection (as in a single USB cable) from the PC to the hub/clip . Then from the clip three cables to the three cameras

mpk 17 April, 2018, 11:44:42

@user-af87c8 ok thanks. Then it's on the same USB bus, and I don't know what else could be the issue.

user-af87c8 17 April, 2018, 11:45:24

yes. I requested the alternative building instructions to make use of multiple USB hubs last week, but did not get a response so far

user-af87c8 17 April, 2018, 11:46:12

we are now checking another recording to see how easy we can reproduce it. But this is a huge problem for us

mpk 17 April, 2018, 11:49:18

@user-af87c8 of course this is a huge problem. We have spent a lot of time getting this sort of thing right. It's the basis for all other work.

user-af87c8 17 April, 2018, 11:50:23

As I said, there are sometimes framedrops (libjpegturbo complaining about incomplete frames); I guess it is due to the high framerate being at the limit of USB 2.0. These occur occasionally but might add up over time

mpk 17 April, 2018, 11:50:33

@user-af87c8 I'm not sure if we received the email requesting alternative hardware. Could you send it again?

mpk 17 April, 2018, 11:51:01

framedrops do not affect the timestamps, I think the reason lies elsewhere.

user-af87c8 17 April, 2018, 11:51:01

sure, info@pupil-labs.com ?

mpk 17 April, 2018, 11:51:05

yes.

user-af87c8 17 April, 2018, 11:51:25

ok, sent (thanks already!!)

papr 17 April, 2018, 12:23:25

@user-af87c8 I just tried to reproduce your issue on a binocular 120Hz headset (with the usb-c clip though, since we do not have the older one). You mentioned that the delay got bigger over time. What time span are we talking about until the difference is noticeable?

user-af87c8 17 April, 2018, 12:29:43

after 10 minutes you can easily see it if you look at blinks

user-af87c8 17 April, 2018, 12:29:56

or saccade onsets with half speed

papr 17 April, 2018, 12:31:24

Yeah, blinks is usually what we use to determine sync

papr 17 April, 2018, 12:32:59

I made a three minute recording and was not able to reproduce the issue. I will do another 10-minute one

user-ef1c12 17 April, 2018, 12:46:53

@mpk Camera Eye1 is not working anymore and I get the following error message. Could you please let me know what I need to do?

Chat image

user-ef1c12 17 April, 2018, 12:48:09

Camera Eye1 appears as "Unknown" when I attempt to select it in the Local UVC sources

Chat image

user-67e255 17 April, 2018, 12:52:37

hi @user-ef1c12 it's Douglas from the hardware team. Let's schedule a quick debug meeting - i'll direct message you a link

user-ef1c12 17 April, 2018, 12:54:18

ok great, thank you @user-67e255

user-ecbbea 17 April, 2018, 17:31:01

Hey, just curious if there is any news on the release of the HTC Vive addon 200 hz version. We're getting ready to collect some data, and already have a 200hz headset, but noticed only the 120hz version is available for the Vive

user-d74bad 17 April, 2018, 18:35:28

@wrp that did indeed work for me, I submitted a pull request on the pyuvc github to help others who want to do the same

wrp 18 April, 2018, 04:00:54

Thanks for following up @user-d74bad we will review the PR this week 😸

wrp 18 April, 2018, 04:03:37

@user-ecbbea we have finalized the camera design for the 200hz Vive and Vive PRO and are gearing up for production. We will let you and the community know when it is available. These new cameras will also enable us to (finally) make an add-on for Oculus CV1 as well. I hope to be able to share something concrete with you all soon on a release date.

papr 18 April, 2018, 06:56:12

@user-d74bad @wrp I reviewed and merged the PR

user-8be7cd 18 April, 2018, 09:08:35

The YUV output from the frame publisher plugin appears to be yuv422. Is there any way to change this to yuv420?

user-b0c902 18 April, 2018, 09:10:20

How is the norm position X and Y for the frames aggregated in the fixations csv file?

mpk 18 April, 2018, 09:24:18

@user-8be7cd you would have to convert that yourself. We simply transmit the yuv format we are getting from the cameras.

user-8be7cd 18 April, 2018, 09:43:39

But doesn't the camera give MJPEG anyway, which uvc/pyuvc converts to YUV?

user-8be7cd 18 April, 2018, 09:46:20

or rather libjpeg-turbo

papr 18 April, 2018, 09:50:31

@user-b0c902 IIRC it is the mean gaze position of the gaze data that the fixation is based on.

user-b0c902 18 April, 2018, 09:54:28

I calculated the average based on the gaze positions, but it doesn't match. I then tried excluding the gaze positions that were below the confidence threshold, and that gave a closer result, yet still not the same. I just want to be sure of how it aggregates the gaze positions.

user-d9bb5a 18 April, 2018, 10:33:10

Good afternoon. After reinstalling Windows, the computer does not see the world camera. What should I do? Could you tell us which driver we might have failed to install?

wrp 18 April, 2018, 11:16:51

@user-d9bb5a please try running the latest version of Pupil Capture as admin to install drivers

user-d9bb5a 18 April, 2018, 11:25:01

and how do I do that? as an admin?

user-d9bb5a 18 April, 2018, 11:27:58

We installed the latest version, but the world camera is not displayed

papr 18 April, 2018, 11:37:24

Right click the app, "Run as administrator"

wrp 18 April, 2018, 11:40:20

@user-d9bb5a you are using 3d world cam correct?

user-d9bb5a 18 April, 2018, 11:42:29

Yes thank you. Everything worked out.

user-d9bb5a 18 April, 2018, 11:43:53

and thanks for the improvement of your product, the latest updates are very good. But could you please explain the point about iMotions Exporter -- #1118

papr 18 April, 2018, 11:44:39

@user-d9bb5a You can export your recording to the iMotions format. This is only of interest if you use https://imotions.com/

user-d9bb5a 18 April, 2018, 11:48:41

understand, thanks, read here

user-ecbbea 18 April, 2018, 18:54:29

@wrp thanks for the info! is there a timeline in terms of when production will be ready? even vaguely, like a week/month etc? cheers

user-ecbbea 18 April, 2018, 22:21:09

also, is it possible for us to beta test the new hardware? we can pay, as well as provide feedback. we're just in a time crunch to collect data.

user-b116a6 19 April, 2018, 10:35:03

Hey guys, I have successfully set up the physical surface markers on my PC screen and written a script using the IPC backbone to receive the metrics for the surfaces topic. What interests me is receiving the fixations, in particular their norm_pos and timestamp on the surface, but there isn't such a field in the message received. Should I calculate them myself based on some id, or am I doing something wrong?

user-b116a6 19 April, 2018, 10:36:56

Or should I also tap into the fixations topic and compare the timestamps received from the fixations and surfaces topics?

user-2334b9 19 April, 2018, 10:54:04

@wrp I have logged in. all OK

wrp 19 April, 2018, 10:56:52

Hi @user-2334b9 I have sent you a Direct Message - thanks!

user-b116a6 19 April, 2018, 16:13:46

Hey guys, can someone give me an insight into the question above, as it is quite time sensitive? I have already written the script to receive the metrics for the fixations and surfaces topics, but I don't understand which information is useful for calculating the fixations on a surface in a real-time process. Any idea/insight will be very much appreciated. Thank you in advance.

papr 19 April, 2018, 17:04:41

@user-b116a6 Actually, the online surface tracker does not map fixations to surfaces. But that is a useful feature that could be added to the next release.

papr 19 April, 2018, 17:06:36

Alternatively you can use the surface's m_from_screen matrix to transform the fixations' norm_pos into normalized surface coordinates

user-b116a6 19 April, 2018, 17:17:45

@papr I thought it had something to do with the gaze_on_srf list that is received. How can I use the m_from_screen matrix to transform the fixations' norm_pos? Currently I am saving the norm_pos of the fixations as they are received, but I have trouble finding the correlation with the messages received from the surfaces topic.

user-b116a6 19 April, 2018, 17:24:51

Is there an alternative way to do this mapping? Say, given a timestamp from the surfaces topic where the on_srf field is True, map this timestamp to a fixation timestamp if it exists, and assume that the fixation's norm_pos was inside the area created by the physical markers?

papr 19 April, 2018, 17:36:50

I would simply cache the most recent surface events, and use their m_from_screen matrix on receiving a fixation. Maybe add a maximum time difference between surface event and fixation

papr 19 April, 2018, 17:43:29

But this can easily be calculated in the plugin. And should be published directly. I will add this tomorrow

user-b116a6 19 April, 2018, 17:44:06

How could I use this in a real-time process though and use the m_from_screen to transform a fixation's norm_pos?

papr 19 April, 2018, 18:01:02

You subscribe to both, fixations and surfaces, and try to apply the most recent fixation to the most recent surface.

user-b116a6 19 April, 2018, 18:10:47

@papr Yes, I thought of that after I asked the question, but how would I use the m_from_screen matrix to normalize the fixation's norm_pos and what is the expected result after that normalization? I seem to be a bit confused... πŸ˜•

papr 19 April, 2018, 18:51:43

The matrix is meant as a linear transformation. Matrix multiply it with the norm pos vector

user-b116a6 19 April, 2018, 19:43:25

@papr Okay then I will try that, after that I assume that the fixation happened within the bounds of the surface if the output of the normalized coordinates are positive? Shouldn't another value exist in the norm_pos vector since the m_from_screen matrix is 3x3, or do I set it to 0?

papr 19 April, 2018, 19:46:38

Ah, you need to convert it to a homogeneous point

papr 19 April, 2018, 19:47:15

The fixation is on the surface if x and y are between 0 and 1

papr 19 April, 2018, 19:48:11

The location of the fixation should correlate with the gaze that was mapped to the surface

user-b116a6 19 April, 2018, 19:50:42

@papr Okay thank you very much for the clarification, by saying converting it to a homogeneous point you mean that the 3rd value of the norm_pos vector should be 1 right?

user-e91538 19 April, 2018, 19:59:07

Anyone know why, when calibrating the stuff on the HTC Vive using hmd-eyes, no calibration stuff gets saved? (Also, in the dev console, it keeps trying to access files that don't exist....)

papr 19 April, 2018, 20:08:02

@user-b116a6 correct. And after multiplying you need to normalize the third dimension back to 1
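
The steps papr describes (homogeneous point, matrix multiply, renormalize the third component, then check that x and y fall in [0, 1]) can be sketched as follows. The identity matrix is a placeholder; in practice you would use the m_from_screen matrix from a received surface event:

```python
import numpy as np

def map_to_surface(norm_pos, m_from_screen):
    """Map an (x, y) norm_pos into surface coordinates via a 3x3 homography."""
    point = np.array([norm_pos[0], norm_pos[1], 1.0])  # homogeneous point
    mapped = m_from_screen @ point
    mapped /= mapped[2]  # normalize the third dimension back to 1
    return mapped[:2]

# Identity matrix for illustration; use the m_from_screen of a surface event.
m = np.eye(3)
x, y = map_to_surface((0.4, 0.6), m)
on_surface = 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0
print(x, y, on_surface)
```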

user-b116a6 19 April, 2018, 20:09:47

@papr Okay got it, thank you very much, I'll try that and hopefully it'll work.

user-d7efdf 19 April, 2018, 20:13:58

Dear Pablo

user-d7efdf 19 April, 2018, 20:16:28

I need a guide for setting up WiFi

user-d7efdf 19 April, 2018, 20:18:26

I'm not an expert on computer and information technology

user-d7efdf 19 April, 2018, 20:19:49

Clear and complete instructions would help me a lot

user-e91538 19 April, 2018, 20:25:53

clear and complete instructions would be nice 😦

user-5054b6 19 April, 2018, 22:05:24

I have 2 quick (I think) questions as I write a grant proposal. I want to have multiple children wear the pupil-labs headsets in classroom settings, and then will have them and others watch the resulting video records. So I'm interested in getting the most watchable videos and I'll also need to synchronize multiple records. I assumed the monocular high resolution system is what I want, but the specs on the high speed lens seem comparable (except for the lack of autofocus) - any advice on what system will produce the best video in a free-ranging environment? As for synching, I know in theory how to do this, but I'm interested whether others are doing multi-headset work.

wrp 20 April, 2018, 02:41:26

Hi @user-e91538 we received your emails and responded via email. Thanks for following up here via chat.

papr 20 April, 2018, 09:59:37

@user-b116a6 This one is for you: https://github.com/pupil-labs/pupil/pull/1157/commits/4bb1aebae85699a1ad78a20b796140c6a10ce727 Fixations will be included as fixations_on_srf in the surface events. Would you mind testing it and giving feedback?

user-b116a6 20 April, 2018, 10:56:04

@papr Thank you very much for your prompt response and commit, I just tested it from source and works really well with the script I implemented as I mapped the norm_pos values on the surface and the mapping was correct. Great work, thanks for implementing it in such short notice.

papr 20 April, 2018, 11:02:40

Nice! Great to hear that it works!

user-847773 20 April, 2018, 12:56:50

@papr Hello Pablo,

I'm Shubham Chandel, a masters student at Queen Mary University of London, and I'm planning to do my thesis with the Pupil headset. I'm facing an issue with the eye camera: it doesn't seem to be working (init failed error). I've tried the troubleshooting steps and even tried installing the drivers separately, but still no progress. I have installed the software on Windows 8.1. I would really appreciate it if you could help me resolve the issue. Thanks.

user-8fba8e 20 April, 2018, 13:56:45

Hello, I have a Moto Z2 Play with Pupil Mobile and I want to transfer the recordings to a MacBook Pro. How can I do this and view the videos in Pupil Player?

user-aa0833 20 April, 2018, 14:13:50

hi, I'm using the HTC Vive addon for my research in foveated rendering. On your website you mention that the eye cameras transfer their images with USB2.0, however the product comes with USB C - USB 3.0. I would prefer using a cable of about 5m to provide sufficient mobility to the users. Can I use a convertor to microUSB without experiencing data loss? Do you have other ideas?

mpk 20 April, 2018, 14:15:51

@user-aa0833 yes you can use a converter. We use a usb3.0 cable but usb2.0 will be fine as well.

papr 20 April, 2018, 14:17:54

@user-8fba8e I recommend to use the Android File Transfer application

user-8fba8e 20 April, 2018, 14:51:34

When I use Pupil Mobile, how do I calibrate the glasses?

user-d72566 20 April, 2018, 14:56:59

When exporting fixation data to a CSV file, it has a column named dispersion. What is the method for calculating this dispersion?

user-e91538 20 April, 2018, 17:48:03

Hello! I am running an experiment with the pupil-labs eye tracker. We are asking multiple persons to drive on a test-track and drive specific scenarios. Unfortunately, the weather is quite changing and it is extremely difficult to configure correctly the eye cameras to have a good pupil detection (especially when the sun is strong). Is there any documentation or anyone that could advise me a procedure to follow to optimize the pupil detection? Thank you

papr 20 April, 2018, 18:04:42

@user-d72566 maximum cosine distance between gaze/pupil vectors that belong to the fixation

user-e91538 20 April, 2018, 18:21:22

I went through the procedure presented on https://www.youtube.com/watch?v=6sWmOcGMDTk but I feel that it is missing some tuning to adapt the algorithm to a sunny environment.

user-d72566 20 April, 2018, 18:23:30

@papr thanks!

user-29e10a 20 April, 2018, 19:08:21

Is there a chance to get a phi/theta rotation out of the fitted eyeball in 3d detection mode?

user-6f2339 20 April, 2018, 21:08:25

How can I calibrate the eyetracker for doing a study while going as a copilot?

user-d72566 22 April, 2018, 08:59:18

@papr I have been looking at the code now, more specifically in fixation_detector.py There are a lot of references to dispersion but I can't find the actual algorithm that calculates it. Thought I could take a look at it and maybe mimic the algorithm for blinks and saccades. You could perhaps point me in the right direction? πŸ˜‰

user-d72566 22 April, 2018, 09:15:08

Aha, seems it has changed a little from my version too:

def vector_dispersion(vectors):
    distances = pdist(vectors, metric='cosine')
    return np.arccos(1. - distances.max())

papr 22 April, 2018, 09:25:00

Yes. But this is a very recent change in order to disregard outliers

user-d72566 22 April, 2018, 09:43:13

aha

papr 22 April, 2018, 09:49:10

Disregard is incorrect though. Take into account is the more fitting term.
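For anyone following along, the dispersion measure discussed above can be reproduced as a small self-contained sketch (assumes NumPy and SciPy; the two sample vectors are made up for illustration):

```python
# Sketch of the fixation dispersion measure quoted above: the largest
# pairwise cosine distance between unit gaze vectors, converted back
# to an angle in radians.
import numpy as np
from scipy.spatial.distance import pdist

def vector_dispersion(vectors):
    distances = pdist(vectors, metric='cosine')  # 1 - cos(angle) pairwise
    return np.arccos(1. - distances.max())

# Two unit vectors 10 degrees apart -> dispersion of ~0.1745 rad (10 deg)
a = np.array([0., 0., 1.])
ang = np.deg2rad(10)
b = np.array([0., np.sin(ang), np.cos(ang)])
disp = vector_dispersion(np.vstack([a, b]))
```

`np.degrees(disp)` converts the result to degrees if that is more convenient.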

user-e1dd4e 23 April, 2018, 04:20:50

For some reason Player shuts down whenever I try to load a specific recording. It only does it for that one and not any of the others.

papr 23 April, 2018, 04:57:13

@user-e1dd4e Could you share this recording with info@pupil-labs.com via e.g. Google Drive? We will have a look at possible causes.

user-c351d6 23 April, 2018, 09:02:29

Chat image

user-c351d6 23 April, 2018, 09:05:05

Hey guys, we are using the 200Hz eye camera. I'm just wondering whether the focus of the picture we get is high enough to do accurate eye tracking. Is there a way to adjust the focus? Other eye pictures seem to be sharper.

papr 23 April, 2018, 09:13:12

The 200hz cameras do not have adjustable focus. Do not worry about it. The pupil detection seems to work fine.

mpk 23 April, 2018, 09:15:12

@user-c351d6 please make sure to use the 192x192 resolution.

user-c351d6 23 April, 2018, 12:00:24

And I have another question. Is there a list of supported Android devices anywhere, apart from the three devices in the documentation, or a list of requirements for an Android device? For example, do they need a native USB-C connection, or is it possible to use other devices with an adapter?

papr 23 April, 2018, 12:12:10

@user-c351d6 it is very unlikely that it will work with an adapter. Even having a usb-c connector is not a guarantee, unfortunately.

mpk 23 April, 2018, 12:22:02

@user-c351d6 Wolf#0823 I recommend using a Nexus 5X, P6, OnePlus 5, or Pixel 2

user-1f74d7 23 April, 2018, 12:42:45

Hi! I want to use eye tracking for my thesis and I am new to all this. Is it possible to see how long a fixation lasts? And how can I record this?

Thanks πŸ˜‰

user-072005 23 April, 2018, 13:58:24

Is it possible to record more than 15-30 min with the app?

user-072005 23 April, 2018, 14:09:08

@user-1f74d7 You can detect fixations using the fixation detector plugin (online through Capture and offline through Player). There is a section in the user docs on fixation detection; it says the duration of a fixation is recorded in Capture. In Player there is a raw data exporter to get the data.
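As a rough illustration of working with the exported data (the inline rows and exact column names here are invented for the example, not a guaranteed match for the real export format):

```python
# Hypothetical sketch: reading fixation durations from an exported CSV
# and computing their mean. Column names are assumptions for illustration.
import csv, io

sample = io.StringIO(
    "id,start_timestamp,duration,dispersion\n"
    "1,100.00,250.0,1.2\n"
    "2,101.50,180.0,0.9\n"
)
durations = [float(row["duration"]) for row in csv.DictReader(sample)]
mean_duration = sum(durations) / len(durations)
```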

user-c351d6 23 April, 2018, 15:32:47

@mpk Did you try the OnePlus 5T as well?

papr 23 April, 2018, 15:33:27

Yes, I am 99% sure that it works.

user-9b14a1 23 April, 2018, 16:50:23

... πŸ˜€ v1.6 is working now. Maybe it was an invisible window in the display arrangement, with the primary screen on the right... or a missing update after the upgrade.

user-6f2339 23 April, 2018, 17:44:58

Hi, how do I play videos recorded in pupil mobile in pupil player?

wrp 24 April, 2018, 02:39:33

Hi @user-6f2339 You visualize datasets by dragging a dataset into Pupil Player - https://docs.pupil-labs.com/#pupil-player

wrp 24 April, 2018, 03:04:24

@user-6f2339 You will need to first transfer the data recorded with Pupil Mobile on the Android device to your desktop/laptop and then drag dataset into Pupil Player. Please note that you should have included a calibration routine within your recording made with Pupil Mobile so that you can calibrate post-hoc in Pupil Player

user-c494ef 24 April, 2018, 06:44:10

hi, we have successfully deployed and run pupil labs on NVIDIA Jetson TX2. While it does work it consumes a lot of CPU on average 80%+ on all 4 cores. Any recommendation?

wrp 24 April, 2018, 06:45:51

@user-c494ef are you displaying video/graphics or running headless without a UI?

user-c494ef 24 April, 2018, 06:47:35

with graphics on, I would like to run headless if that is possible

papr 24 April, 2018, 06:48:20

It is not possible to run completely headless. You can save some CPU by minimizing all windows.

papr 24 April, 2018, 06:48:52

Lower eye video resolution helps as well.

user-c494ef 24 April, 2018, 06:58:12

well we initially used 320x180; however we found that 640x480 with a lower frame rate was a little bit better. In any case, one of the major problems is that the pupil detection consumes about 40% CPU on all cores. Are you planning any optimizations in that area soon?

user-c494ef 24 April, 2018, 07:12:23

or have you experimented with compile time optimizations so far?

user-a04957 24 April, 2018, 07:20:07

Hi, since today my player always crashes with this error. Anyone has an idea?

Chat image

user-a04957 24 April, 2018, 07:56:56

... i cannot play any recordings 😦

papr 24 April, 2018, 08:36:27

@user-c494ef Do you need pupil detection and mapping for a real time application? If not you can do that offline after a recording.

user-c494ef 24 April, 2018, 09:01:55

@papr yes its for a real time application

user-c494ef 24 April, 2018, 09:02:22

so disabling is not an option

user-11dbde 24 April, 2018, 09:03:59

hello, can anyone give me a hint on how to stream pupil data to the HoloLens wirelessly?

papr 24 April, 2018, 09:04:48

@user-11dbde check out the hmd eyes project. It includes a Hololens application to do so, if I remember correctly

user-11dbde 24 April, 2018, 09:05:00

thank you

user-11dbde 24 April, 2018, 09:05:05

I will have a look there

papr 24 April, 2018, 09:06:26

@user-c494ef an easy solution would be to use two separate computers. One running Capture and the second one running your application

user-11dbde 24 April, 2018, 09:14:01

the application is running on hololens

user-11dbde 24 April, 2018, 09:14:24

i want to send eye data to the app in a wireless setup

papr 24 April, 2018, 09:14:47

The app connects wirelessly to Pupil Capture.

papr 24 April, 2018, 09:16:22

There is no pupil detection on the Hololens itself.

papr 24 April, 2018, 09:17:37

Could you describe your setup a bit more in detail? I have the feeling that I am misunderstanding your question.

user-11dbde 24 April, 2018, 09:18:23

I think i see what i need now

user-11dbde 24 April, 2018, 09:18:26

thanks!

user-c494ef 24 April, 2018, 10:08:08

@papr yeah, the problem is that we want to have a wearable autonomous system, so more PCs means more of a power problem. One solution we have investigated so far is using Pupil Mobile; are you by any chance planning to do pupil detection on the phone?

wrp 24 April, 2018, 10:23:06

@user-c494ef we do not plan on having pupil detection on the Pupil Mobile Android app.

user-c494ef 24 April, 2018, 10:45:55

okay, any chance that a CNN-based approach gets implemented, as suggested by https://arxiv.org/abs/1711.00112?

wrp 24 April, 2018, 11:16:55

@user-c494ef Pupil Mobile will not be conducting pupil detection - if you are interested in implementing the approach in the paper, it would be great to see an attempt to add something like this as an alternative approach to pupil detection pipeline in a fork of Pupil as a proof of concept πŸ˜„ - I'm sure the community would be interested to see this πŸ˜„

user-c494ef 24 April, 2018, 12:21:35

hehe πŸ˜‰

user-8b414f 24 April, 2018, 13:25:52

Hi everyone! Do you know if it is possible to know where the calibration points appear on the screen, in coordinates, so they can be used to transform the normalised data to cm? (We controlled distance from the screen and head movements, and we know at what distance from the centre of the screen the different stimuli appear.) I found 3 calibration files that are automatically generated when I calibrate (eye 0 camera, eye 1 camera and world camera), but I don't know if they can be useful. Thank you πŸ˜ƒ

user-c9c7ba 24 April, 2018, 13:33:28

Hi,

I am using eye trackers for the HTC Vive and currently I wanted to know: are my focus and distance optimal?

Chat image

user-f1eba3 24 April, 2018, 15:16:05

Hi guys

user-f1eba3 24 April, 2018, 15:16:48

Regarding the messaging of Pupil Service to its subscribers: does the structure of the message become volatile at any time?

papr 24 April, 2018, 15:22:14

What do you mean by volatile? The gaze messages might differ depending on being 2d/3d and mono/binocular?

user-f1eba3 24 April, 2018, 15:27:33

but once set, it will not take any other form?

papr 24 April, 2018, 15:34:35

monocular and binocular mapping is decided dynamically based on the pupils' confidence

papr 24 April, 2018, 15:37:38

Therefore it can change over time

user-6f2339 24 April, 2018, 16:25:00

@wrp how do I include this calibration routine?

papr 24 April, 2018, 16:25:29

You start the recording before calibrating instead of doing it the other way around.

user-6f2339 24 April, 2018, 16:40:21

I have a problem, when I transfer the data from the phone to the computer and open it with pupil player, the recordings do not show any gaze or fixations. Is this a setup problem?

papr 24 April, 2018, 16:41:24

@user-6f2339 This is expected behavior. You will have to run offline pupil detection and offline calibration in order to see gaze data. See the docs for more information.

user-6f2339 24 April, 2018, 20:12:38

@papr I did the offline pupil detection and offline calibration but still cannot see the gaze or fixations.

papr 25 April, 2018, 07:24:12

@user-6f2339 You need to activate the offline fixation detector to get fixations. They are based on gaze, therefore you will only start seeing fixations if you have gaze data. If there is no gaze, the offline calibration failed. There can be multiple reasons for that:
1. Not enough pupil data
2. Not enough reference markers (either circle markers or manually annotated ones)
3. You enabled 3d pupil detection but set the calibration section to 2d, or vice versa
4. The calibration does not converge

user-c9c7ba 25 April, 2018, 08:26:52

Hi,

I wanted to ask, is there a possibility to get the pupil gaze coordinates in degrees? For example horizontal, where the middle is 0,0? I am researching the coordination between head and gaze movement. Head movement is expressed in degrees by the HTC Vive tracking system.

papr 25 April, 2018, 08:32:54

@user-c9c7ba Yes, this is possible if you know the intrinsics of the hmd.

papr 25 April, 2018, 08:34:32

We use this method in the fixation detector to convert 2d gaze norm_pos values to unprojected 3d vectors: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/fixation_detector.py#L113-L121

papr 25 April, 2018, 08:35:11

Afterwards you can use these vectors to calculate the cosine distance and convert it to degrees.
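A minimal sketch of that pipeline, assuming a simple pinhole model with placeholder intrinsics (real code should use the camera's calibrated matrix; Pupil's unprojectPoints additionally accounts for lens distortion):

```python
# Hedged sketch: back-project 2d normalized gaze points to 3d rays with
# a made-up pinhole intrinsic matrix, then measure the angle between
# rays in degrees. Placeholder values, not Pupil's real calibration.
import numpy as np

width, height = 1280, 720                  # assumed frame size
K = np.array([[1000.,    0., width / 2.],
              [   0., 1000., height / 2.],
              [   0.,    0., 1.]])         # hypothetical intrinsics

def unproject(norm_pos):
    # norm_pos is (x, y) in [0, 1]; Pupil's y axis is flipped vs. pixel rows
    px = np.array([norm_pos[0] * width, (1. - norm_pos[1]) * height, 1.])
    v = np.linalg.inv(K) @ px              # back-project to a 3d ray
    return v / np.linalg.norm(v)

def angle_deg(p0, p1):
    v0, v1 = unproject(p0), unproject(p1)
    return np.degrees(np.arccos(np.clip(v0 @ v1, -1., 1.)))
```

With this, the angular distance of a gaze point from the image center is `angle_deg((0.5, 0.5), gaze_norm_pos)`.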

user-c9c7ba 25 April, 2018, 08:56:14

@papr thanks for the response. What do I need to know for my HMD? Screen resolution?

And this line of code: width, height = capture.frame_size — frame_size is 1280x720 in my Pupil Capture, but my HMD resolution is 2160x1200 (1080x1200 per eye). Which should I use?

user-c9c7ba 25 April, 2018, 09:49:11

One more question: vectors = capture.intrinsics.unprojectPoints(locations)

This method is very complicated; do I need it?

papr 25 April, 2018, 10:14:39

Yes, this is the most important function when transforming 2d points to vectors. Also, the intrinsics should be as exact as possible.

user-c9c7ba 25 April, 2018, 10:26:55

OK, I found that method, but can you give me some information about properties?

def __init__(self, K, D, resolution, name):
    self.K = np.array(K)
    self.D = np.array(D)
    self.resolution = resolution
    self.name = name

user-c9c7ba 25 April, 2018, 10:27:19

resolution is the resolution of the HMD, and name is not important... but what are K and D?

papr 25 April, 2018, 10:31:38

Ok, in the hmd case it is a bit different to unproject the 2d point. Have a look at the hmd eyes examples. There is an example that projects a 3d ray into the scene. Based on this ray you can calculate your gaze in degrees.

user-c9c7ba 25 April, 2018, 10:33:50

OK, I'll try it :)

papr 25 April, 2018, 10:41:23

See the πŸ₯½ core-xr channel in case of hmd-eyes specfic questions

user-c9c7ba 25 April, 2018, 10:49:22

Yes, but I have a feeling that this channel is currently not very active

user-6f2339 25 April, 2018, 12:40:16

@papr Thank you very much. I managed to solve the issue.

user-c9c7ba 26 April, 2018, 09:41:17

Hi,

I have a big problem now. When I open Pupil Capture and start a recording, it takes 14 seconds or so to start. I am using Pupil Capture v1.5.12; I have worked with this version for 2 months and everything worked fine, so I don't know what happened. I have tried resetting to default settings and reinstalling this version. I also tried version 1.4 and it was the same. Any idea what is happening?

papr 26 April, 2018, 11:25:19

@user-c9c7ba something on your system must have changed. Which OS do you use?

user-c9c7ba 26 April, 2018, 11:41:09

Windows 10

user-c9c7ba 26 April, 2018, 11:42:59

but I am using hmd-eyes for data export... and I don't really need the recording feature. I have removed the socket message that starts recording, and I can still export data.

papr 26 April, 2018, 11:43:12

@user-c9c7ba could you confirm that the issue persists on v1.6?

papr 26 April, 2018, 11:43:52

@user-c9c7ba I do not understand? What did you change? The hmd-eyes/unit part?

papr 26 April, 2018, 11:44:19

And just to clarify: Capture is slow when starting, but hmd-eyes is as fast as always?

user-c9c7ba 26 April, 2018, 11:48:20

OK, I will try v1.6, but is v1.6 tested with the hmd-eyes plugin for Unity?

Capture takes 25 seconds to start recording and is frozen in the meantime... with Unity it's the same when I send the start-recording message.

papr 26 April, 2018, 11:49:02

Aah. I misunderstood. I will try to replicate the issue.

papr 26 April, 2018, 11:54:50

@user-c9c7ba do you also record audio?

user-c9c7ba 26 April, 2018, 11:55:21

yes

papr 26 April, 2018, 11:56:33

In unity? Or in Capture?

user-c9c7ba 26 April, 2018, 11:58:36

This is my pupil capture

Chat image

user-c9c7ba 26 April, 2018, 11:58:53

and this is how recording is triggered from Unity

Chat image

user-7bef48 26 April, 2018, 12:57:05

Hi, drivers for Windows 10 are correctly installed and in working condition, but Pupil Capture doesn't see any camera of the Pupil Headset. This headset works fine with an iMac (and the Windows 10 computer works fine with all other applications) . Here is the Pupil Capture.exe screenshot. Any help is very welcome! :-)

MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
Running PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
Running PupilDrvInst.exe --vid 1443 --pid 37425
OPT: VID number 1443
OPT: PID number 37425
Running PupilDrvInst.exe --vid 1443 --pid 37426
OPT: VID number 1443
OPT: PID number 37426
Running PupilDrvInst.exe --vid 1133 --pid 2115
OPT: VID number 1133
OPT: PID number 2115
Running PupilDrvInst.exe --vid 6127 --pid 18447
OPT: VID number 6127
OPT: PID number 18447
Running PupilDrvInst.exe --vid 3141 --pid 25771
OPT: VID number 3141
OPT: PID number 25771
world - [WARNING] video_capture.uvc_backend: Updating drivers, please wait...
world - [WARNING] video_capture.uvc_backend: Done updating drivers!
world - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
world - [WARNING] camera_models: Loading dummy calibration
world - [WARNING] launchables.world: Process started.
world - [ERROR] calibration_routines.screen_marker_calibration: Calibration requires world capture video input.

wrp 26 April, 2018, 13:04:08

@user-7bef48 do you see Pupil Cam listed in the libusbK category in the Device Manager on Windows 10?

user-7bef48 26 April, 2018, 13:05:40

Hi wrp, Yes, Pupil Cam1 ID0, Pupil Cam1 ID1 and Pupil Cam1 ID2 (each 3 times, because I've tried each of my 3 USB ports)

wrp 26 April, 2018, 13:06:25

Ok - any other pupil processes running in the background if you inspect task manager?

wrp 26 April, 2018, 13:07:11

Also what version of Pupil Capture?

user-7bef48 26 April, 2018, 13:08:20

Pupil Capture is running twice

user-7bef48 26 April, 2018, 13:08:54

but there were 2 windows opened.

wrp 26 April, 2018, 13:09:16

Can you restart Pupil Capture with default settings. General menu > restart with default settings

user-7bef48 26 April, 2018, 13:09:29

Capture Version 1.6.11

wrp 26 April, 2018, 13:10:07

Based on your description all drivers seem properly installed, running latest version of Pupil software, and on Windows 10.

wrp 26 April, 2018, 13:11:03

(I'll be AFK for a bit will respond async)

user-7bef48 26 April, 2018, 13:11:15

Some text is shown when the app starts, but it disappears too quickly for me to read.

user-7bef48 26 April, 2018, 13:13:54

Here is the text :

Chat image

wrp 26 April, 2018, 13:34:54

@user-7bef48 can you send a screenshot of drivers in device manager?

wrp 26 April, 2018, 13:36:08

Also expand the Cameras section in device manager if this category exists

wrp 26 April, 2018, 13:36:28

I have a suspicion that this is driver related

user-7bef48 26 April, 2018, 13:37:03

The headset is currently connected to one of the 3 USB ports

Chat image

user-7bef48 26 April, 2018, 13:38:05

The Device Manager is in "hidden drivers shown"

user-7bef48 26 April, 2018, 13:38:15

mode

papr 26 April, 2018, 13:55:44

This screenshot shows that the drivers are not correctly installed.

user-7bef48 26 April, 2018, 13:56:02

ok πŸ˜ƒ

user-7bef48 26 April, 2018, 13:56:55

I can desinstall all the drivers

user-7bef48 26 April, 2018, 13:57:07

"uninstall" sorry

user-7bef48 26 April, 2018, 14:01:04

All drivers were uninstalled. Pupil Capture was restarted with original settings. Same problem.

Chat image

wrp 26 April, 2018, 14:26:50

@user-7bef48 we are about to release a new version of Pupil Capture with driver installer that addresses recent changes to Windows 10. You can either use a tool like Zadig to overwrite system assigned drivers for Pupil Cam with libusbK drivers or wait for v1.7 Pupil software (coming very soon)

user-7bef48 26 April, 2018, 14:42:25

@wrp Thanks for this information. I'm not time-stressed, because the experiment will only occur in some months, so I prefer to wait for v1.7. Thanks for all your help. Best,

wrp 26 April, 2018, 14:43:38

ok, thanks

user-d72566 26 April, 2018, 15:14:04

Hi. I have a question about pupil dilation. In Capture, a graph is shown displaying the dilation. Is it possible to export the dilation to a csv file? If not, do you know any paper on how to calculate the dilation?

papr 26 April, 2018, 15:15:32

@user-d72566 The dilation is exported as part of the pupil data in the raw data exporter. Be aware that the 2d pupil diameter is measured in image pixels and the 3d diameter depends on the quality of the 3d model

user-d72566 26 April, 2018, 15:16:49

Hmm I had missed that one. What is the difference between the raw data and gaze_positions for example?

papr 26 April, 2018, 15:17:30

pupil data is relative to the respective eye camera space. Gaze is mapped pupil data into the scene camera space

user-d72566 26 April, 2018, 15:17:58

Aha.

user-d72566 26 April, 2018, 15:18:06

I am looking at pupil_positions now, it has a column labelled diameter, this is the one you are talking about? πŸ˜ƒ

papr 26 April, 2018, 15:18:44

This is the 2d pixel diameter, correct.

user-d72566 26 April, 2018, 15:18:54

I see, thanks!

user-d72566 26 April, 2018, 15:20:04

But where is the 3D diameter?

papr 26 April, 2018, 15:20:39

It is only available for 3d detected pupil positions

papr 26 April, 2018, 15:20:58

Should be called diameter_3d

user-d72566 26 April, 2018, 15:23:42

Hmm alright, then what are their formats? Here is an example row:

2D: 55.5888373471996 3D: 1.81006565884836

2D looks like mm, but 3D looks like it's cm?

papr 26 April, 2018, 15:27:29

55 mm would be very huge for a pupil πŸ˜„ 2d is in pixels (and therefore depends on the eye camera's distance to the eye) and is only sensible as a relative value compared to other diameter values.

user-d72566 26 April, 2018, 15:28:42

Indeed it would be very big! πŸ˜„ It has been a long day hehe. Alright, then I can assume that the 3D diameter is in mm. My readings span from about 1.8–5, which seems reasonable.

papr 26 April, 2018, 15:31:00

3d is in mm, but as I said, this depends on the fitness of the 3d model.

papr 26 April, 2018, 15:31:23

yeah, sounds good

user-d72566 26 April, 2018, 15:31:49

I hear you! Still, might be worth investigating. Thanks πŸ˜ƒ
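To sum up the exchange above in code (the inline sample rows mimic an exported pupil positions CSV; treat the exact layout as an assumption):

```python
# Sketch: separating the 2d pixel diameter from the 3d mm diameter in
# an exported pupil positions CSV. Sample rows are invented.
import csv, io

sample = io.StringIO(
    "pupil_timestamp,diameter,diameter_3d\n"
    "100.01,55.58,1.81\n"
    "100.02,54.90,1.79\n"
)
rows = list(csv.DictReader(sample))
diameter_px = [float(r["diameter"]) for r in rows]      # image pixels, relative only
diameter_mm = [float(r["diameter_3d"]) for r in rows]   # mm, model-quality dependent
```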

user-c71581 27 April, 2018, 02:15:22

Hey guys, it's been awhile - thank you for the help previously with the DIY filter and other assistance.

After my experiments are over, I plan on building a second DIY build and setting up a start to finish Video tutorial with Updated reference materials and time line of acquiring those materials for future builders.

The data collection and other biometric equipment I'm running for 60 trials at 45 minutes apiece is going very smoothly. Thank you again

user-c71581 27 April, 2018, 02:16:18

Also, I dunno if it's still in the github for manual eye calibration, but I might be able to add that for the DIY camera if that's needed

user-c71581 27 April, 2018, 02:16:30

*lens

user-c351d6 27 April, 2018, 07:08:06

Hi guys, I tried to connect the eye tracker to a OnePlus 5T via USB-C and started the app. The app is not displaying any cameras at all, just the three built-in sensors: audio, imu, and key (however, I have no idea what "key" could be, because it's not responding to anything). Is there anything further I have to do before I can use the headset together with the app?

mpk 27 April, 2018, 07:10:41

@user-c351d6 you should get a permissions prompt for the USB devices. Maybe restart the phone? Try a different USB-C cable?

user-c351d6 27 April, 2018, 07:21:00

@mpk Unfortunately, no prompt, and the USB diagnosis app says there is nothing connected (this time all cables are properly connected, though). I'll try to find a second USB-C to USB-C cable to check whether the cable or the phone is the problem.

mpk 27 April, 2018, 07:25:50

@user-c351d6 we use the same phone for development and it works for us. FYI...

user-c351d6 27 April, 2018, 07:27:57

@mpk 5 or 5T? We actually also bought a high quality usb cable.

wrp 27 April, 2018, 07:57:43

We have 5 and 5t - both working

wrp 27 April, 2018, 07:57:55

Please enable USB OTG

user-c351d6 27 April, 2018, 08:18:49

@wrp Thank you, it's working now. You may want to update your documentation.

wrp 27 April, 2018, 08:31:47

Thanks for the feedback @user-c351d6 - the OnePlus is the only device that I know of that requires one to explicitly enable USB OTG

wrp 27 April, 2018, 08:32:08

We will add a note to docs just to clarify this detail though so that others who try do not get stuck in the same place.

user-8caac2 28 April, 2018, 12:00:10

Hi people, I am new to Pupil πŸ˜ƒ I want to use the ETD to track the movement of my right eye. Thus, I want to receive the vertical and horizontal angle of my eye. I am able to subscribe to the 'gaze' topic in my own program. Which key-value pairs are the eye position's vertical and horizontal angles? Is it msg['base_data'][0]['ellipse']['axes'][0/1]? Many thanks in advance and cheers, Sunacni
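Not an authoritative answer, but a hedged sketch of where angular eye position usually lives in a pupil datum: the ellipse 'axes' are the ellipse size in pixels, not angles, while 3d pupil detection carries spherical angles ('theta'/'phi', in radians). The sample dict below is fabricated for illustration:

```python
# Illustrative only: extracting assumed spherical eye angles from a
# fabricated pupil datum and converting them to degrees.
import math

pupil_datum = {
    "ellipse": {"axes": [40.2, 41.0]},  # ellipse size in pixels, not angles
    "theta": 1.62,                       # assumed vertical angle (radians)
    "phi": -1.41,                        # assumed horizontal angle (radians)
}
vertical_deg = math.degrees(pupil_datum["theta"])
horizontal_deg = math.degrees(pupil_datum["phi"])
```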

user-f68ceb 29 April, 2018, 07:44:17

Hi There – can anyone share the exact settings/set-up (soft-/hardware) to get PupilLabs Mobile running on an LG Android Nexus 5x? Would be great – I cannot seem to get a video signal onto my phone. Many Thanks

user-8944cb 29 April, 2018, 21:46:43

Hello, I have been using Pupil Player for the past month, and recently it stopped opening. When I try to open it I get the following. I tried deleting and installing it again, but am getting the same error. It is happening on one more computer, while on a third computer it is still working. Will be happy for any advice. Thank you!

Chat image

wrp 30 April, 2018, 04:24:38

@user-f68ceb Are you using the USBC-USBC cable that shipped with the Nexus 5x? If so, you will need to change to USBC-USBC cable - like this one made choetech - https://www.amazon.de/Certified-CHOETECH-marker-Delivery-MacBook/dp/B06XDB343M/ref=sr_1_17?ie=UTF8&qid=1524714193&sr=8-17&keywords=choetech+usbc+to+usbc

wrp 30 April, 2018, 04:24:48

The issue here is likely cable related.

wrp 30 April, 2018, 04:25:59

@user-8944cb is this issue being observed with the same dataset in Pupil Player? Can you provide us with some information about the dataset - size of the dataset, and the files contained within it.

user-f68ceb 30 April, 2018, 08:28:16

Hi @wrp – I bought the cable that was recommended in your link. Still does not seem to work. I was wondering if you can send screenshots of ALL the settings in Pupil Capture?

wrp 30 April, 2018, 10:33:40

Hi @user-f68ceb just to confirm - you are not seeing any video feeds of eye or world cameras on the Nexus 5x when running Pupil Mobile? There should be no further settings to enable on the Nexus 5x as it natively supports USB OTG - are there any prompts when you connect the device and/or options in the top settings/shade of Android when you connect the Pupil headset?

user-f68ceb 30 April, 2018, 14:09:52

Hi @wrp – I got it working with the 1.6.14mac update, Nexus 5x, suggested cables and default settings. :-)))

wrp 30 April, 2018, 14:14:20

@user-f68ceb pleased to hear, and thanks for the update!

user-8944cb 30 April, 2018, 14:59:44

Hi @wrp , the issue is observed even before trying to upload (drag) a recording of a dataset. When I try opening the program I am receiving this error, and pupil player won't open. Thanks!

papr 30 April, 2018, 15:01:08

This is not expected behavior. How do you start Player?

user-8944cb 30 April, 2018, 15:23:17

I extract the files after downloading, and then just double-click the Pupil Player icon within the 'pupil_player_windows_x64_v1.6.11' folder.

user-3d9cdd 30 April, 2018, 17:00:16

@user-f1eba3 did you ever get anywhere with the unreal implementation?

End of April archive