core


user-2798d6 01 November, 2018, 02:36:39

Hello! I have two hopefully quick questions. #1 - I'm noticing that my fixation durations only go up to just below 1000 ms and never over. Is it part of the program that fixations are only recorded up to 1000 ms and then a new fixation begins? #2 - I am using the fixation detector and the Vis Circle plugins. Sometimes the circle and fixation ring don't align. What would be the reason for that?

user-2798d6 01 November, 2018, 02:36:44

Thank you so much for your help!

wrp 01 November, 2018, 06:10:25

@user-2798d6 what is the max_duration of the fixation detector set to in Pupil Player?

papr 01 November, 2018, 07:24:43

@user-2798d6 the fixation ring displays the average gaze location of the gaze that belongs to it. In longer fixations with higher dispersion you won't see all the gaze at once since it is spread over multiple frames. Therefore, it might look like it is not aligned

user-2798d6 01 November, 2018, 13:06:06

@wrp - yep, that would be my issue! πŸ˜„ Thanks for pointing me in the right direction. And thank you @papr!

user-a04957 01 November, 2018, 13:31:20

Hello everybody:

I am wondering if the gaze positions account for the fisheye distortion of the world view?

Meaning: let's say a change of 100 px is a bigger "gaze movement" in the center of the fisheye window than it is at the borders (due to the fisheye distortion). ==> Is this accounted for, or is the gaze position at the border of the fisheye-distorted world window inaccurate?

Thank you!

papr 01 November, 2018, 15:00:52

@user-a04957 3d gaze is corrected for distortion, yes

user-2798d6 01 November, 2018, 15:42:14

Hello! A followup question about adjusting the maximum for fixation duration: I previously had a fixation range for 200-1000ms and then changed it to 200-4000ms. But then I lost some fixations that were previously there. I know some fixations were combined into longer fixations, but there were some single fixations on one item that were 500-600 ms that now aren't showing up. Is there a reason for that?

user-2798d6 01 November, 2018, 15:42:17

Thank you!

papr 01 November, 2018, 15:56:49

@user-2798d6 We do some basic outlier detection. It might be that these were somehow filtered out. The outlier detection needs improvement.

papr 01 November, 2018, 15:57:31

I would recommend running the fixation detector with different settings and looking for overlapping detections, as you might have done already

user-a04957 01 November, 2018, 16:41:06

@papr Thank you! Does 3d gaze mean that I can use the norm_pos_x and norm_pos_y coordinates and they are corrected too? Thank you!

mpk 01 November, 2018, 16:55:10

@user-a04957 norm_pos_x is not corrected; only gaze_point_3d is corrected. If you want to correct norm_pos I recommend using cv2 functions and the intrinsics supplied in the recording.
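
A minimal numpy-only sketch of the correction mpk describes - assuming a simple radial (k1, k2) distortion model with made-up coefficients; in practice cv2.undistortPoints with the intrinsics from the recording's world.intrinsics file does this for you:

```python
import numpy as np

def distort(p, k1, k2):
    # Apply radial (k1, k2) distortion to a normalized image point.
    r2 = p[0] ** 2 + p[1] ** 2
    return p * (1 + k1 * r2 + k2 * r2 ** 2)

def undistort(pd, k1, k2, iters=20):
    # Invert the radial distortion by fixed-point iteration -
    # the same idea cv2.undistortPoints uses internally.
    p = np.array(pd, dtype=float)
    for _ in range(iters):
        r2 = p[0] ** 2 + p[1] ** 2
        p = pd / (1 + k1 * r2 + k2 * r2 ** 2)
    return p

def norm_pos_to_pixels(norm_pos, width, height):
    # Pupil's norm_pos has (0, 0) at the bottom-left of the frame;
    # pixel coordinates have (0, 0) at the top-left.
    return np.array([norm_pos[0] * width, (1 - norm_pos[1]) * height])
```

To undistort a norm_pos value you would first convert it to pixel coordinates, normalize with the camera matrix, then apply the inversion above (or just call cv2.undistortPoints with the full intrinsics).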

user-910385 01 November, 2018, 17:05:24

@wrp yes, that is right

user-1809ee 02 November, 2018, 10:03:52

Hi, I am using a Tobii Vive device in Unity3d; is it possible to use Pupil Labs software in that case?

user-1809ee 02 November, 2018, 10:04:12

i see there is a tool hmd eyes

user-1809ee 02 November, 2018, 10:05:17

but I cannot be sure, since it is used with the HTC Vive

user-1809ee 02 November, 2018, 10:05:57

can we also use the Unity plugin with the Tobii Vive?

papr 02 November, 2018, 10:07:06

Unless the Tobii device registers as normal usb cameras on your computer (very doubtful), no, sorry.

papr 02 November, 2018, 10:07:56

I think Tobii uses proprietary hardware which requires their own software for access.

user-1809ee 02 November, 2018, 10:08:32

Ok, we'll try. Yes, Tobii has software but it's quite expensive

user-1809ee 02 November, 2018, 10:08:43

thanks for quick response

user-1809ee 02 November, 2018, 10:09:04

we'll let you know if it works

papr 02 November, 2018, 10:09:05

Good luck! Let us know your results πŸ™‚

user-1809ee 02 November, 2018, 10:09:08

πŸ˜ƒ

user-41643f 02 November, 2018, 10:14:27

Hi papr, I was curious if you could point me to an existing recording I could play back in pupil player?

papr 02 November, 2018, 10:16:51

Sure! See this recording for offline pupil detection and calibration: https://drive.google.com/open?id=1OugotQQHsrO42S0CXwvGAa0HDvZ_WChG

Check out this recording for the offline surface tracker functionality: https://drive.google.com/uc?export=download&id=0Byap58sXjMVfZUhWbVRPWldEZm8

user-41643f 02 November, 2018, 10:53:55

Thanks papr! I was able to play the surface tracker recording and export x,y coordinates as CSV. Is it possible to get a Z coordinate in CSV using binocular cameras?

papr 02 November, 2018, 11:32:51

@user-41643f if you enable 3d pupil detection and mapping you should have access to the gaze_point_3d values

user-41643f 02 November, 2018, 11:33:05

thanks!

user-a04957 02 November, 2018, 11:50:49

@papr How can I map the gaze_point_3d to normalized [0,1] coordinates? Thank you

papr 02 November, 2018, 11:56:29

@user-a04957 you can project it to the camera's image plane using the camera's intrinsics (including distortion). If I am not mistaken, this would be equal to norm_pos

mpk 02 November, 2018, 12:30:19

@user-a04957 if you want it without distortion, just use the same fn but with the distortion coefficients set to zero.
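
What papr and mpk describe can be sketched as a plain numpy pinhole projection with an optional (k1, k2) radial term, standing in for cv2.projectPoints; passing zeros for the distortion gives the undistorted projection:

```python
import numpy as np

def project_point(point_3d, K, dist=(0.0, 0.0)):
    # Pinhole projection of a 3d point (camera coordinates) with an
    # optional (k1, k2) radial distortion term; cv2.projectPoints is
    # the full-featured equivalent.
    x, y, z = point_3d
    xn, yn = x / z, y / z                  # normalized image plane
    k1, k2 = dist
    r2 = xn * xn + yn * yn
    f = 1 + k1 * r2 + k2 * r2 * r2         # zeros -> no distortion
    u = K[0, 0] * xn * f + K[0, 2]         # to pixel coordinates
    v = K[1, 1] * yn * f + K[1, 2]
    return u, v

def pixels_to_norm_pos(u, v, width, height):
    # Back to Pupil's norm_pos convention: (0, 0) bottom-left,
    # (1, 1) top-right.
    return u / width, 1 - v / height
```

The camera matrix K and the distortion coefficients are assumptions here - in practice they come from the recording's intrinsics file.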

user-8944cb 02 November, 2018, 13:02:07

Hi @mpk , I am interested in getting 2d gaze positions without distortion; do I need to do the same thing? If so, can you please explain what it means to "use the same fn but with distortion coeffs set to identity" - how do I do this? thanks!

user-1bcd3e 02 November, 2018, 14:22:44

hey, hello to everyone! Is there any possibility to replace the world camera with one with higher resolution? Or do you have any project to improve the resolution of the world camera? Thanks a lot

mpk 02 November, 2018, 14:35:00

@user-1bcd3e our current camera does 1080p. If you need 4K, you could get a version of our headset with a USB-C connector at the world camera mount point and a generic mount. This allows you to mount a Logitech 4K camera.

user-b23813 02 November, 2018, 14:45:01

Hi @papr, in some of my recordings no blink detection report can be exported. Blinks are not shown with the visualiser either. I tried both the 'pupil from recording' and 'offline pupil detection' options but neither worked. The offline blink detection plugin was active during recording. This is weird because the confidence of both eyes was high and the drops and gains in confidence were recorded, as shown by the values in the upper left corner of the screen when I played the recording. I started and stopped recording several times without changing the settings, so I had many sections of recording, for some of which blink detection reports were exported fine. Would you suggest anything else to try to export the blink detection reports? Thanks

Chat image

user-2798d6 03 November, 2018, 00:54:24

@papr : is it possible for me to turn the outlier detection off?

user-9bca2f 03 November, 2018, 05:32:25

Hi guys... can we use this code on a Raspberry Pi? Thanks in advance

papr 03 November, 2018, 10:05:03

@user-2798d6 not as a setting. You will have to change the source code for that

user-07d4db 03 November, 2018, 13:59:12

hello everybody 😃 I am conducting a psychological experiment using the Pupil Labs eye tracking glasses. Concerning this, I've got some questions:

user-07d4db 03 November, 2018, 14:00:49
1. How can I define which units of measurement I am recording? I am only interested in a) duration of fixations and b) number of fixations on predefined areas of interest. Can anybody help me with this issue? Thank you so much!

user-624fa4 03 November, 2018, 21:06:43

Hi everyone. I'm really new to using an eye tracker and at my university we chose Pupil. I will conduct an experiment with 90 people and I have understood so far that Pupil Player does not allow joint analysis by groups (someone correct me if I am wrong). I would do this with the raw data. Basically, in addition to the individual heat maps, I need the position data of the fixations and the duration of the fixations. The duration I understood; however, regarding the position in the file, it is not clear to me what the numbers norm_pos_x 47385810382993000,00 and norm_pos_y 4405092041169440,00 (as examples) represent. I have already understood that the lower left corner is 0,0 and the upper right is 1,1. Can someone help me?

user-1809ee 04 November, 2018, 10:51:38

Hi, we have tried it with the Tobii Vive; as you said, we couldn't make it work

user-1809ee 04 November, 2018, 10:53:22

in the Pupil Capture eye window, the 'activate source' options appeared as unknown

user-1809ee 04 November, 2018, 10:53:29

i just want to let you know

user-71969b 04 November, 2018, 13:10:50

Hey everyone, I am also very new to this. I built my own 'homemade' headset, with two cameras, one facing the eye and the other the surroundings. Is it possible to use the Pupil Capture software with my headset or do I need the Pupil Labs one? thank you

wrp 05 November, 2018, 04:20:07

Thanks for the update @user-1809ee

wrp 05 November, 2018, 04:23:59

@user-07d4db Welcome to the Pupil community chat 👋 Responses to your questions: 1. Units of measurement - please see https://docs.pupil-labs.com/#data-format 2. Fixations within AOI - Please use the surface tracker and fixation detector - when you export data with Pupil Player with both plugins active you will get fixations within each surface as .csv data. Please see: https://docs.pupil-labs.com/#capture-fixation-detector and https://docs.pupil-labs.com/#surface-tracking

wrp 05 November, 2018, 04:29:37

@user-624fa4 Welcome to the chat 👋 Responses to your questions: 1. Multi-participant analysis - out of the box, Pupil software only supports single participant visualization/analysis. You can export raw data for each participant as csv data and then ingest this gaze/pupil/surface/fixation/etc data to perform aggregate analysis. 2. Data format - please see: https://docs.pupil-labs.com/#data-format -- norm_pos_x and norm_pos_y for gaze positions are the normalized gaze position where 0,0 is the bottom left corner of the world camera frame and 1,1 the top right corner of the world camera frame. The norm_pos data is floating point data - so I'm not sure what you are referencing with the number 47385810382993000,00 for example (was this maybe imported to a csv reader program with the incorrect settings? Can you send us the csv file so we can give you concrete feedback?)

wrp 05 November, 2018, 04:30:04

@user-71969b please see https://docs.pupil-labs.com/#diy - if your cameras are UVC compliant, then they should be able to work with Pupil software.

wrp 05 November, 2018, 04:31:37

@user-9bca2f RPI would likely not have enough CPU power to run Pupil software's pupil detection algorithms at a high frame rate. Some members of the community are using single board computers (SBCs) but I'm not sure what frame rates they are achieving with their setups. Perhaps someone in the community would like to step in here.

user-62b13b 05 November, 2018, 10:14:59

Hello, I guess the question I have is a common one. I am currently using the Pupil hardware for a study where people are looking at stimuli where it is impossible to use the markers. Previously I used the SMI semantic gaze mapping function, which allows you to manually code which part of a reference image people are looking at in order to calculate dwell time etc. How do people solve this issue with Pupil Labs eye tracking data? Can anybody assist with that?

user-9bca2f 05 November, 2018, 12:42:53

@wrp so can you tell me how I can make it a mobile device?

wrp 05 November, 2018, 12:47:23

@user-9bca2f I would encourage you to consider Pupil Mobile https://docs.pupil-labs.com/core-mobile

papr 05 November, 2018, 12:48:43

Interesting how channel linking takes preference over the actual link πŸ€”

user-41643f 05 November, 2018, 21:20:42

Hey papr, does the system have an audio component or a supported plugin? I want to incorporate audio collection as well - do you have a recommended channel to follow, or a system someone is familiar with (e.g. a good midfield mic)?

wrp 06 November, 2018, 01:17:18

@user-41643f audio capture works with built in mics and supported USB mics in Pupil Capture. Audio can also be recorded via built in mic on Android with Pupil Mobile.

user-2be752 06 November, 2018, 02:03:41

Hi guys, I'm trying to load in Pupil Player the recording I took with Pupil Mobile, but it just won't load it. The log says:

2018-11-05 17:52:02,042 - MainProcess - [INFO] os_utils: Disabled idle sleep.
2018-11-05 17:52:03,869 - player - [ERROR] player_methods: No valid dir supplied (/Applications/Pupil Player.app/Contents/MacOS/pupil_player)
2018-11-05 17:52:17,714 - player - [INFO] launchables.player: Starting new session with '/Users/teresa/recordings/20181104203042389'
2018-11-05 17:52:17,716 - player - [INFO] player_methods: Updating meta info
2018-11-05 17:52:17,717 - player - [INFO] player_methods: Checking for world-less recording
2018-11-05 17:52:17,718 - player - [ERROR] launchables.player: Could not generate world timestamps from eye timestamps. This is an invalid recording.
2018-11-05 18:00:29,561 - MainProcess - [INFO] os_utils: Re-enabled idle sleep.

Chat image

wrp 06 November, 2018, 05:03:34

@user-2be752 - saw your issue on github here: https://github.com/pupil-labs/pupil-mobile-app/issues/29

wrp 06 November, 2018, 05:03:38

and responded there with questions

wrp 06 November, 2018, 05:03:54

perhaps we can continue the discussion via the issue if that is ok with you?

user-2be752 06 November, 2018, 05:55:01

@wrp of course! I've answered there too. Thanks!

user-29e10a 06 November, 2018, 10:16:41

@papr @user-af87c8 Hi, in terms of precision: What is the difference in angular resolution of the 3D eye model if I detect the pupil in VGA or QVGA? Since QVGA uses half the pixels in each direction, I assume the precision is half as good. I think the "edges" of the eye ball projection are most sensitive to resolution (since the density of degree-values is highest here)... so what are (technically) the highest precisions possible at 0, 10, 20 degrees of visual line of sight? I'm not talking about gaze, because that would add other error sources. Did anybody calculate them? I noticed your gaze precision did not change from VGA to 200x200 px on your website 😃

papr 06 November, 2018, 10:22:01

@user-29e10a

Precision is calculated as the Root Mean Square (RMS) of the angular distance (in degrees of visual angle) between successive samples during a fixation.

1. Degrees of visual angle are independent of the selected resolution.
2. Precision is measured in world camera space, not eye camera space.
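
The quoted precision definition can be sketched in a few lines - assuming gaze direction vectors as input:

```python
import numpy as np

def angular_distance_deg(v1, v2):
    # Angle in degrees of visual angle between two gaze direction vectors.
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def precision_rms(directions):
    # RMS of the angular distances between successive samples
    # during a fixation, per the quoted definition.
    dists = [angular_distance_deg(a, b)
             for a, b in zip(directions, directions[1:])]
    return float(np.sqrt(np.mean(np.square(dists))))
```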

user-29e10a 06 November, 2018, 10:28:45

@papr Thanks, but I'm not talking about world space. When the 3D detector calculates the model, the center of the ellipse clearly depends on the resolution, doesn't it? So if the ellipse center is shifted one pixel to the right, the difference in degrees is twice as high on QVGA as on VGA, or am I wrong?

papr 06 November, 2018, 10:37:15

@user-29e10a Ah, ok, thank you for the clarification. 2d pupil ellipses are fitted with sub-pixel accuracy. Therefore, the effect on precision should not be as high as you would expect.

user-8944cb 06 November, 2018, 18:00:09

Hi, is there a way to open the "world.intrinsics" file to see the camera matrix and distortion parameters after calibration? thanks!

papr 06 November, 2018, 18:00:43

Hey, did you receive my email?

papr 06 November, 2018, 18:01:00

And yes, this is possible if you have the source code installed.

papr 06 November, 2018, 18:09:46

@user-8944cb Save this file in the shared_modules folder: https://gist.github.com/papr/e14382fb4d7af5f4da9997f9f6b79f53

Execute it with python3 display_intrinsics.py <path to intrinsics file>

user-8944cb 06 November, 2018, 18:27:08

Hi @papr , yes, I have received your email - Thank you very much for your help and the informative reply! If I understand correctly, my options to get the undistorted 2d gaze are either export to i-motions, or post process the distorted gaze points from Pupil based on the camera intrinsics parameters. I don't program in Python, but will try to do the latter first, as we already have a code written for the format of the distorted gaze points (norm_pos_x and norm_pos_y) from Pupil. Again, thanks!

user-2be752 06 November, 2018, 19:05:52

Hi, do you recommend streaming the recording from Pupil Mobile to Pupil Capture while running an experiment, or not? If I want to use the time sync plugin, it needs to be streaming. However, I find that there's a bit of frame drop when it's streaming.

user-9f40a2 06 November, 2018, 20:27:43

hello all, which is the ideal data to use to measure the gaze of a person? And can you explain the difference between the following: gaze_point_3d, eye_center_3d, gaze_normal_0, eye_center_1, gaze_normal_1? I'm using the eye tracker that attaches to the HoloLens. Also, how can I measure fixations using this eye tracker? Please excuse my lack of knowledge

user-e0a0e6 06 November, 2018, 20:29:31

Hello everyone, I don't know if I am in the right place to ask this, but I am currently working on a research project using the Gear VR as an HMD. The question I would like to ask is: is there a way to use Pupil with the Gear VR? We're really looking forward to using your technology, but this specific research requires a Gear VR to work.

papr 06 November, 2018, 21:47:53

@user-2be752 time sync works without streaming as well

papr 06 November, 2018, 21:49:14

@user-9f40a2 Check https://docs.pupil-labs.com/#data-format

papr 06 November, 2018, 21:49:54

@user-e0a0e6 check out the hmd-eyes project on github for more information regarding vr integrations

user-9f40a2 06 November, 2018, 21:50:19

@papr thank you for your response, but I checked the document and I would like further clarification please

user-2be752 06 November, 2018, 21:50:48

@papr thanks!! so do you then recommend not streaming the recording from Pupil Mobile to Pupil Capture while running the experiment?

papr 06 November, 2018, 21:51:22

@user-2be752 streaming is really just meant for monitoring. If you don't need streaming, turn it off

papr 06 November, 2018, 21:51:56

But what do you need time sync for, if you do not intend to stream the data anyway?

user-2be752 06 November, 2018, 21:53:32

@papr oh okay, I might have misunderstood, but I wanted to use time sync so the recording from Pupil Mobile will be time-synced to the timestamps of the main computer (which is recording subjects' responses). Does that make sense?

papr 06 November, 2018, 21:53:51

Yes, that makes sense

papr 06 November, 2018, 21:55:34

@user-9f40a2 What exactly do you want to know? Pay attention to the coordinate systems that are mentioned in the docs. These make the difference between the different fields

user-2be752 06 November, 2018, 21:56:35

@papr so with that purpose, is it still okay for me not to stream?

papr 06 November, 2018, 21:58:18

You will just need a way to combine the recordings

papr 06 November, 2018, 21:59:06

How do you record subject response timestamps? Which response exactly?

user-2be752 06 November, 2018, 22:01:55

we want to sync keyboard presses in Matlab (Windows)

user-9f40a2 06 November, 2018, 22:07:37

i want to measure the gaze position @papr

papr 06 November, 2018, 22:10:20

@user-9f40a2 Use the gaze norm_pos. It is the easiest way to find the gaze position within the recorded video

user-9f40a2 06 November, 2018, 22:13:43

is that the norm_pos or the gaze_normal that you are referring to, @papr?

papr 06 November, 2018, 22:18:51

norm_pos

user-9f40a2 06 November, 2018, 22:23:54

thank you. So what is gaze_normal, @papr?

papr 06 November, 2018, 22:26:12

The 3d eye model has a normal vector (it is normal to the 3d model's pupil) and it is mapped into the world camera coordinate system. That's the gaze normal.

user-9f40a2 06 November, 2018, 22:29:41

thank you very much @papr

user-8944cb 07 November, 2018, 00:16:00

Hi @papr , I have another small question: the gaze file that is exported via the iMotions exporter is a .tlv file. What is the best way to open the file? Will it give me the position, the same as norm_pos_x and norm_pos_y? thanks!

papr 07 November, 2018, 07:05:39

@user-8944cb it is a text file similar to csv. The difference is the column delimiters

mpk 07 November, 2018, 10:47:14

@here We are pleased to announce the latest release of Pupil software v1.9! We highly recommend downloading the latest application bundles: https://github.com/pupil-labs/pupil/releases/tag/v1.9

user-dfeeb9 07 November, 2018, 18:03:56

@mpk Hi, this looks great. Just a question - do the changes to the annotation plugin affect how we're to send payloads from other systems?

mpk 07 November, 2018, 20:22:18

@user-dfeeb9 its a bit different now: https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py

mpk 07 November, 2018, 20:23:11

However I think only a small change would be required to make it as easy as it was before. I have made a suggestion to add this to Pupil: https://github.com/pupil-labs/pupil/issues/1378
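
For reference, the new flow in the linked remote_annotations.py helper boils down to building a msgpack-encoded dict with an "annotation" topic and publishing it on Pupil Remote's PUB port. A sketch - the function names are ours, only the dict layout is taken from the helper; requires pyzmq and msgpack for the sending part:

```python
def new_annotation(label, timestamp, duration=0.0, **extra):
    # Annotation payload layout as in the remote_annotations.py helper;
    # extra keyword arguments become custom fields of the annotation.
    note = {"topic": "annotation", "label": label,
            "timestamp": timestamp, "duration": duration}
    note.update(extra)
    return note

def send_annotation(pub_socket, note):
    # Publish on a zmq PUB socket connected to the port reported by
    # Pupil Remote's PUB_PORT request. Not exercised here.
    import msgpack, zmq  # imported here so the sketch stays optional
    pub_socket.send_string(note["topic"], flags=zmq.SNDMORE)
    pub_socket.send(msgpack.packb(note, use_bin_type=True))
```

The timestamp should be in Pupil time, which is why the bytestring approach from the old plugin no longer applies directly.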

user-dfeeb9 07 November, 2018, 20:25:17

I see, thanks for the information. Currently @user-e7102b and myself have been sending simple bytestring packets with the old annotation system. I haven't had a chance to look in detail at the new way of communicating with my stimuli etc. yet, so I'll need to see how best to work under your new system, but I may not update until I can

user-2798d6 07 November, 2018, 21:36:21

Hello! I am having some trouble with audio syncing with video. There was a post by @user-68d457 on Oct. 19 about the same issue, but I didn't see a solution. My audio is slightly ahead of the video when I open Player. Is there a way to adjust or fix this? Part of my research deals with fixation placement related to the audio, so they really need to be aligned. Thank you so much for your help!

user-2798d6 07 November, 2018, 21:38:04

I see that @user-68d457 responded about a fix, but it involves adjusting the audio file. Is there any other solution besides adjusting all of my audio files? Thank you!

papr 07 November, 2018, 22:45:24

@user-2798d6 was it recorded with Pupil Mobile?

user-624fa4 07 November, 2018, 23:19:18

@wrp Hello, I am sending one of the exported files here. I opened them in excel so maybe that's why the data I mentioned is wrong. I intend to do analysis using the raw data and videos. My experiment is with people shopping on a website that we create. We will use the very basic metrics such as fixations, saccades, heatmaps, first fixation time and fixation sequences in each of the Areas of Interest.

fixations_03.11_-_000.csv

user-3f0708 07 November, 2018, 23:47:21

@user-624fa4 good night, what plugins are you using to get the basic metrics, fixation time, and fixation sequence in each of the areas of interest?

user-2798d6 08 November, 2018, 02:20:43

@papr - It was not, it was recorded on my MacBook Air through Capture.

wrp 08 November, 2018, 02:28:31

@user-624fa4 I think the issue you are seeing is due to your import settings in excel. I have downloaded your csv file and imported to google sheets: https://docs.google.com/spreadsheets/d/1Qz9m3otu4ZjybkGzgdw1wBdm8hDpvbjqkjJ4T1-jSrc/edit?usp=sharing so you can get an idea of what this file is supposed to look like. Hope this helps

user-41643f 08 November, 2018, 20:34:01

Howdy, I was a little confused by this note: "Pupil headsets with 3d world camera are not compatible with Pupil Mobile" - the highspeed world camera should be fine? Thanks!

user-41643f 08 November, 2018, 21:12:18

Another question I have: "Pupil Mobile on Android (Supported Devices: Moto Z2 Play, Nexus 5x, Nexus 6p, OnePlus 3)" Since Samsung is not listed, I am curious whether Samsung, which runs on Android, is supported, as I have a Galaxy S9 phone

papr 08 November, 2018, 22:01:27

You are correct, 3d is not supported but high speed cams are

user-41643f 08 November, 2018, 22:01:46

thanks!

user-37c9fb 08 November, 2018, 22:02:36

I'm trying to sync the time of the Pupil Labs eye tracker with another device, but cannot figure out how to relate the (synced) start time of the eye tracker to the local time. Please help explain a little bit. Thanks.

Chat image

papr 08 November, 2018, 22:02:37

@user-41643f Regarding the phone: Most important is that your phone has a usb-c connector. As to the listed phones: These are the devices known to be working. We cannot guarantee anything for other devices.

papr 08 November, 2018, 22:04:29

@user-37c9fb Synced time is a monotonic clock that has a random time epoch. System time is the Unix timestamp. Subtract one from the other to get the clock difference, which you can use to sync time with the other device's data
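
The subtraction papr describes, in Python's own clock terms - time.monotonic standing in here for Pupil's epoch-less clock:

```python
import time

def clock_offset():
    # Offset between the Unix clock and an epoch-less monotonic clock;
    # Pupil time behaves like the latter, with its own random epoch.
    return time.time() - time.monotonic()

def monotonic_to_unix(t_monotonic, offset):
    # Convert a monotonic/Pupil timestamp into a Unix timestamp.
    return t_monotonic + offset
```

Measuring the offset once and applying it to every timestamp keeps the sub-millisecond resolution of the monotonic clock.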

user-37c9fb 09 November, 2018, 07:22:28

@papr The other thing is that the precision of the system time is only seconds; does this mean that I could not sync the time to a millisecond level?

papr 09 November, 2018, 07:29:24

@user-37c9fb The system time should have a higher precision. Let me look into that.

papr 09 November, 2018, 07:30:34

But yes, Unix time is not as precise as the monotonic clock with a random clock start.

mpk 09 November, 2018, 07:44:44

@user-37c9fb but usually system time is still sub-second. I'll make an issue in Pupil Mobile.

mpk 09 November, 2018, 07:46:55

@user-37c9fb the other way of doing time sync is to use our time_sync protocol. A sample clock master can be found here: https://github.com/pupil-labs/pupil-helpers/blob/master/network_time_sync/pupil_time_sync_master.py

user-88dff1 09 November, 2018, 12:52:27

Hello πŸ˜ƒ

user-88dff1 09 November, 2018, 12:53:23

Received the pupil devkit. But one camera is flipped compared to the other. Which means I've got 2 eyes moving independently and opposite of each other :/

user-88dff1 09 November, 2018, 12:53:55

Any ideas for this, other than going into the source and flipping the calculations too? There's a 'flip display' option. But it doesn't affect the calculations, just the camera feed

papr 09 November, 2018, 12:54:47

@user-88dff1 Yes, this is expected, since one of the eye cameras is actually physically flipped.

papr 09 November, 2018, 12:55:05

The camera being flipped does not have any effect on gaze estimation.

user-88dff1 09 November, 2018, 12:56:36

I see. So it's expected in the Pupil Capture to see two unrelated points moving around

user-88dff1 09 November, 2018, 12:58:26

Also, didn't get the 'world camera' attachment. Does this mean I cannot calibrate at all?

papr 09 November, 2018, 13:00:40

That is correct. Could you PM me your order id?

user-88dff1 09 November, 2018, 13:00:58

(and last question πŸ˜› any plans on supporting FPGA offloading for this in the future?)

papr 09 November, 2018, 13:03:10

Do you mean FPGA offloading for the pupil detection? No, this is not planned. But feel free to implement it; the algorithm is open source. 🙂

user-88dff1 09 November, 2018, 13:19:54

Sent my order ID via PM

user-2798d6 09 November, 2018, 15:43:50

Hi @papr - I just wanted to check back in with you on the audio alignment issue. I recorded on my MacBook Air.

papr 09 November, 2018, 15:49:21

@user-2798d6 This is still being worked on. Unfortunately, I cannot give you a time estimation for when this is being fixed. Don't hesitate to ask in regular intervals. πŸ™‚

user-2798d6 09 November, 2018, 17:06:18

Will do - thanks @papr

user-624fa4 09 November, 2018, 20:17:26

Hello @wrp. Thank you for your help again. Some other questions (I read in github but do not know if I understand correctly): - my experiment will be with a fictitious e-commerce site. Is there any way I can demarcate the surfaces for this site? - if I use surfaces, can I export raw data and know only the position of fixations within that surface?

user-2798d6 09 November, 2018, 20:33:43

@papr - If it helps with the audio fix, I'm noticing that the offset gets more pronounced as the recording continues. So at the beginning of a video it seems almost aligned, but by the end of a 5 minute video, it's noticeably more pronounced.

papr 09 November, 2018, 20:37:30

@user-2798d6 Yes, that confirms that this is the bug that we know of. The problem is that there are audio frames missing in the audio files that need to be filled with silence.

user-2798d6 09 November, 2018, 20:38:11

Cool! Thanks for working on it @papr . I appreciate you all!

user-624fa4 09 November, 2018, 20:38:29

Hello everyone, What analysis software do you use or indicate for raw data? 😬

papr 09 November, 2018, 20:39:25

@user-624fa4 what do you mean by "demarcate"? Use Pupil Player to open, visualize and export recordings

papr 09 November, 2018, 20:39:35

The export format is csv. See the docs for details.

user-624fa4 09 November, 2018, 20:42:01

@papr Sorry. I mean define

papr 09 November, 2018, 20:47:48

@user-624fa4 Just print surface markers and tape them to your screen πŸ˜‰

user-624fa4 09 November, 2018, 20:49:44

@papr But how will it identify scrollbar, for example? Navigating from page to page ... sorry if the questions are for beginners, but I'm pretty lost with so much information.

papr 09 November, 2018, 22:03:59

The only thing the tracker does is map gaze onto the surface. The tracker does not know what is within the surface

papr 09 November, 2018, 22:04:19

You could make a screen recording and map the gaze onto that, for example

user-2798d6 10 November, 2018, 17:01:38

Hello! Is there a way to get a comprehensive scan path to show over a shot of the world view? My scene recording doesn't move around a whole lot, and I'd love to be able to see a representation of the scan path from the entire recording.

user-2798d6 10 November, 2018, 17:01:41

Thank you!

user-37c9fb 10 November, 2018, 19:19:39

@papr so, the monotonic clock has a random clock start - the pupil epoch, right? When, or under what conditions, will this pupil epoch reset? If I connect it to an Android device and use Pupil Mobile to record several trials, is it possible the pupil epoch resets between trials?

papr 10 November, 2018, 19:26:41

@user-37c9fb capture syncs time with Android. There might be a time reset at the beginning but afterwards it should stay monotonic

user-37c9fb 10 November, 2018, 19:29:42

@papr Can I understand it as: the start point won't be reset as long as it stays connected to the Android device?

papr 10 November, 2018, 19:33:15

Yes. There might be very small clock adjustments between recordings, but just to keep clocks in sync.

papr 10 November, 2018, 19:33:32

No random resets should occur.

user-07d4db 11 November, 2018, 23:16:48

Hello everybody! 😃 I have got another question concerning my research with Pupil Labs: 1. The first and very basic problem is pupil detection: the red point, in Pupil Capture, doesn't stick constantly to the pupil of the recorded participant. Furthermore, there is no explanation of how to adjust the following parameters specifically in order to stabilize the pupil detection: pupil min-max range, pupil intensity, pupil mode.

Do you have any tips on how to solve this problem, and an instruction where these aspects are explained specifically?

2. The calibration doesn't work when areas of interest are defined (through the surface tracker plugin). Maybe it doesn't work because the pupil detection didn't work in the first step. Or do we have to choose a specific calibration method when using the surface tracker plugin?

And do you have any further hints on where I can find explanations of how to use the eye tracking glasses from your company, apart from the website? An answer would again help me a lot with my research using the Pupil Labs eye tracking glasses! Thank you!!!

user-76218e 11 November, 2018, 23:51:52

Hi, I was wondering what the driver for your high speed world camera is. Can we use the Pupil glasses as a normal webcam if we install the right driver? Our project would like to process the live video. Thank you.

wrp 12 November, 2018, 04:06:14

Hi @user-07d4db I have responded to your points via email. Just to summarize for those reading your comment here: it seems like you are not yet achieving robust pupil detection. This could be due to camera position. In most cases you will not need to adjust the pupil detection parameters if you have optimally positioned/adjusted the eye cameras. Calibration is not returning accurate results because pupil detection/the eye view is not robust. All documentation is online at https://docs.pupil-labs.com

wrp 12 November, 2018, 04:08:21

@user-76218e the cameras work with the UVC protocol. On macOS and Linux the cameras will work like a webcam, for example. On Windows you will need to install libusbK drivers - you can install drivers by running Pupil Capture with admin privileges.

You note "Our project would like to process the live video" - have you looked at our Frame Publisher plugin and this script that demonstrates how to subscribe to/receive the frames: https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py

user-738c1f 12 November, 2018, 05:27:14

hello guys, i have one fundamental question about 'surface tracking'

user-738c1f 12 November, 2018, 05:27:36

i want to know the reason for using 'surface tracking'.

user-738c1f 12 November, 2018, 05:27:55

why do we have to define a surface?

wrp 12 November, 2018, 05:30:54

@user-738c1f surfaces are used to locate 2d surfaces within your field of view. This will enable you to automatically estimate gaze relative to a specific surface. Concrete example: You want to know where a participant is looking (gaze position on surface) on a page of a magazine. You could affix markers to this page of the magazine and then have your participant look at the magazine, and the gaze positions would be mapped relative to this surface so that you could generate a heatmap (for example) or later compare gaze data for this surface with gaze data of other participants looking at this surface/page of a magazine.

user-c4492b 13 November, 2018, 04:15:04

@papr I understand the Player is capable of reviewing and replaying the eye tracking / pupil data, however, what packages are out there for reporting on this data? Ideally in the aggregate. Any Python or Jupyter Notebooks floating around? Any direction would be greatly appreciated! Thank you

user-76218e 13 November, 2018, 04:28:40

Hi, thanks for your reply. I have two basic questions regarding our Pupil glasses. Our team invested in the Pupil glasses with the 2d world camera and 200 Hz binocular eye cameras. One question: we subscribe to the gaze topic with a script from the Python helpers called message_filter, and the printout includes 3d gaze locations. How do you get this 3d gaze data without using a 3d world camera? Another question: does every individual need to calibrate the glasses for themselves, since our project might involve several participants? Or should we actually calibrate every time the glasses are put on? Thank you.

user-738c1f 13 November, 2018, 04:45:27

@wrp thanks for your kind reply.

user-738c1f 13 November, 2018, 04:46:27

also i have one more question. i got the raw data from the surface tracker; however, i cannot exactly understand each term's meaning. could you explain more about this?

Chat image

user-516564 13 November, 2018, 05:49:09

Hey guys, my Pupil DIY headset just got stolen... Do you know if there is anyone that manufactures it in the US? Or is the only way to get it from Shapeways in Europe?

papr 13 November, 2018, 09:00:40

@user-c4492b Currently, we do not provide any advanced examples for data reporting/aggregations. Unfortunately, I do not know any open-source tools for that either. We provide the possibility to export your data in an http://imotions.com/ compatible format though.

papr 13 November, 2018, 09:04:26

@user-76218e You should calibrate every time the subject or the eye/world camera positions/relations change. Additionally, it is recommended to integrate the calibration into your experiment and record the procedure as well. This allows you to do offline calibration after the fact.

papr 13 November, 2018, 09:06:27

@user-516564 Do you mean a manufacturer for the frame alone?

papr 13 November, 2018, 09:12:34

@user-738c1f

"world_timestamp": Associated world frame timestamp
"world_frame_idx": Associated world frame index
"gaze_timestamp": Timestamp of the mapped gaze point
"x_norm": Normalized gaze x coordinate
"y_norm": Normalized gaze y coordinate
"x_scaled": Scaled gaze x coordinate
"y_scaled": Scaled gaze y coordinate
"on_srf": Mapped gaze is within the surface definition
"confidence": Inherited confidence value from the original gaze datum

Normalized coordinates: - Bottom left corner -> (0, 0) - Top right corner -> (1, 1)

Scaled coordinates: - Bottom left corner -> (0, 0) - Top right corner -> (surface width, surface height)
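
The coordinate conventions above can be sketched in a few lines of Python. This is a minimal illustration, not Pupil code; the surface size (a 210 x 297 mm page) is an invented example:

```python
def norm_to_scaled(x_norm, y_norm, surf_width, surf_height):
    """Convert normalized surface coordinates (origin at the bottom-left
    corner, top-right corner = (1, 1)) into scaled surface units."""
    return x_norm * surf_width, y_norm * surf_height

def is_on_surface(x_norm, y_norm):
    """Gaze lies on the surface iff both normalized coords are within [0, 1]."""
    return 0.0 <= x_norm <= 1.0 and 0.0 <= y_norm <= 1.0

# Example: a surface defined over a 210 x 297 mm magazine page
print(norm_to_scaled(0.5, 0.5, 210, 297))  # -> (105.0, 148.5)
print(is_on_surface(1.2, 0.4))             # -> False: x_norm > 1, off-surface
```

Values outside [0, 1] (including negative ones) are still valid data points; they simply fall outside the surface definition, which is what the on_srf column records.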

user-d9bb5a 13 November, 2018, 09:18:31

And ScanPath not fixed in the new version?

papr 13 November, 2018, 09:27:14

@user-d9bb5a Unfortunately not.

user-d9bb5a 13 November, 2018, 09:28:10

Oh, we will wait)

user-c4492b 13 November, 2018, 17:35:55

@papr thank you the reply !

user-624fa4 13 November, 2018, 18:46:12

@papr Hello, I was researching about Ogama here and saw that you sent a PM with the details for @user-d79ff5. Could you give me some guidelines?

user-2c62ff 13 November, 2018, 19:21:15

I have the HTC Vive Binocular Add-on and was able to successfully mount these in a Vive pro. Is it possible to automatically detect IPD without first calibrating the Vive pro?

user-2c62ff 13 November, 2018, 19:21:43

We manually measure IPD but this seems often off with a few mm

papr 13 November, 2018, 19:54:01

@user-624fa4 I checked; I PMed him about a different issue. I do not know how Ogama works, unfortunately

user-624fa4 13 November, 2018, 21:56:43

@papr Ok. Thanks. As the message appeared right after his question I thought it was about it.

user-ce3667 13 November, 2018, 23:59:25

Hi everyone. I was just wondering if there is a 32-bit version of Pupil Capture, Player and Services. Thanks in advance

wrp 14 November, 2018, 00:14:55

@user-ce3667 sorry, only 64bit available

user-ce3667 14 November, 2018, 00:19:51

@wrp okay cheers mate

user-738c1f 14 November, 2018, 08:38:46

Thanks a lot @wrp . However, in x_norm and y_norm there are some negative values. Does that mean the tracking is out of the surface?

Chat image

papr 14 November, 2018, 08:43:06

@user-738c1f that is correct. It is also out of surface if one of the norm pos values is bigger than 1

user-9dbee3 14 November, 2018, 20:20:00

can someone explain to me, in the case of offline calibration and using Pupil while walking around, how do we get the gaze position from the eye position? It seems to me that since the objects are not on the calibration plane it would be hard to tell?

papr 14 November, 2018, 20:34:35

@user-9dbee3 gaze is always calibrated to the field of view of the world camera. Gaze is not calibrated to real world objects

user-9dbee3 14 November, 2018, 20:35:33

oh, I see . thanks!

user-9dbee3 14 November, 2018, 20:37:22

I am using the HTC vive with pupil labs installed, in which case I assume the main camera is just the VR game input then

papr 14 November, 2018, 20:58:45

In the vr case you calibrate to the field of view of each eye display. If you meant this by vr output, then yes πŸ™‚

user-8944cb 16 November, 2018, 16:30:15

Hi, I will be happy for your help with two questions: 1) When I record with Pupil Capture I get occasional warnings that say: "WORLD:Turbojpeg.jpeg2yuv:b'CorruptJPEG data: 25 extraneous bytes before marker 0xd3". What does this mean, and is there anything I can do to prevent it from happening? 2) When I export a recording in Pupil Player I sometimes get red errors that say, for example, that X or Y equals something (looks like coordinates of some sort) - how can I prevent those errors, and how do they affect the exported data? Thank you!

user-e2056a 16 November, 2018, 19:22:35

@papr Hi! We have been using Pupil Player to extract fixation data; however, I noticed that the fixation count export was constantly 0. A month ago I extracted the same eye recording and the fixation count was about 20, and the settings then were the same as we are using now (max dispersion 1.01 degrees, min duration 100 ms, max duration 4000 ms). Is there any other setting that we might have messed up to cause the difference? Thank you!

user-e2056a 16 November, 2018, 19:27:20

@papr Hi, another thing we noticed about our data is that the heatmap output is consistently 0 KB or 1 KB, though the gaze_positions.csv showed some valid gaze data (about 500 gaze points). Again, I was wondering what we might have done wrong in the heatmap generation process. I followed the "using the Offline Surface Detector plugin to generate heatmaps" instructions in the Pupil docs. Related to that, in gaze_positions_on_surface_<surface_name>_<surface_id>.csv, x_norm and y_norm should be coordinates between 0 and 1, but we saw lots of y_norm data that is greater than 1. Could you shed some light on this? Thanks!

user-2be752 16 November, 2018, 19:51:51

Hi, I was wondering, can you create heatmaps without surface tracking?

user-7db3ca 17 November, 2018, 18:36:48

Hi all. I recorded two videos with the same subject and same eye/world camera positions. Video 1 contains calibration markers. Video 2 doesn't contain calibration markers. Can I use the calibration from video 1 to calibrate video 2? Thank you.

papr 17 November, 2018, 18:45:55

@user-7db3ca This feature is under development and will probably be released in v1.11

papr 17 November, 2018, 18:47:14

@user-2be752 Currently, you can only create heatmaps using surfaces.

papr 17 November, 2018, 18:47:44

@user-e2056a Did you change Pupil Player versions?

user-7db3ca 17 November, 2018, 19:06:01

Thank you, papr. Estimated release date?

papr 17 November, 2018, 19:13:26

There is no estimate yet.

user-64b0d2 19 November, 2018, 15:02:11

Hi all, I'm trying to set up an experiment where I need to measure pupil diameter as accurately as possible. I only managed to find a single diameter value in my exported raw data, either for the 2d or the 3d model (which I assume is the average of both eyes?), but not for each eye separately. Is there a way to get the pupil diameter for each eye separately?

papr 19 November, 2018, 15:03:35

@user-64b0d2 the exported pupil_positions.csv contains pupil diameter for each eye separately

papr 19 November, 2018, 15:04:51

Be aware that the 2d pupil diameter is in pixels and can vary based on eye-camera-to-eye distance. The 3d pupil diameter is in mm but not corrected for refraction effects
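
To split the export by eye, you can group the rows of pupil_positions.csv by their eye id column. A minimal sketch (the column names follow the exported header of recent versions, and the sample rows are invented; check your own export's header if it differs):

```python
import csv
import io
from collections import defaultdict

# Invented excerpt of an exported pupil_positions.csv, trimmed to a few columns
sample = """timestamp,id,confidence,diameter,diameter_3d
100.01,0,0.99,42.0,3.1
100.01,1,0.97,40.5,3.0
100.02,0,0.98,42.2,3.2
"""

diameters_by_eye = defaultdict(list)
for row in csv.DictReader(io.StringIO(sample)):
    # "diameter" is in eye-image pixels (2d model);
    # "diameter_3d" is in mm (3d model, not corrected for refraction)
    diameters_by_eye[row["id"]].append(float(row["diameter_3d"]))

for eye_id, values in sorted(diameters_by_eye.items()):
    print(f"eye {eye_id}: mean 3d diameter = {sum(values) / len(values):.2f} mm")
```

For a real recording, replace `io.StringIO(sample)` with `open("pupil_positions.csv")`.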

user-309b26 19 November, 2018, 19:55:10

Hi, can anyone point me to a reference showing how, in just a few lines of code, I can extract the gaze positions from the eye tracker?

papr 19 November, 2018, 20:17:39

@user-309b26 offline after the fact, or online/realtime?

user-66516a 19 November, 2018, 21:26:33

Hi! I'm having trouble exporting data from my recordings. I'm using the Pupil Labs add-on for an HMD on Windows 10, and used Pupil Capture to record data. When I use Pupil Player to export the data, I end up with empty files, while the raw data does not seem to be empty (I don't recall any issue with the recording). Most files are 1 KB and only contain the header. Any idea how to fix this issue? (I already downloaded the latest version of Pupil Player, restarted Pupil Player with default settings, and deleted the player settings folder.)

Chat image

papr 19 November, 2018, 21:28:20

@user-66516a Please restore to defaults in the General Settings and try again

user-66516a 19 November, 2018, 21:34:45

@papr Thanks for the response. I've just tried again, but it didn't work. I still get empty files. :/

papr 19 November, 2018, 21:35:17

Just to be sure, you did not enable Offline Pupil Detection?

papr 19 November, 2018, 21:36:01

If not, please share the recording with data@pupil-labs.com and we will have a look

user-66516a 19 November, 2018, 21:38:17

I'm not sure, is it in the "Plugin Manager" tab ?

user-66516a 19 November, 2018, 21:38:41

If yes, nothing "offline" seem to be enabled

papr 19 November, 2018, 21:41:09

No, it can be enabled in the Pupil From Recording menu

user-66516a 19 November, 2018, 21:42:29

No, it is set to "Pupil From Recording". I'm sending you a recording. Thx!

user-738c1f 20 November, 2018, 02:04:07

@papr thank you so much

user-e78e77 20 November, 2018, 02:32:19

Chat image

wrp 20 November, 2018, 06:54:25

Hi @user-e78e77 could you provide us with some context? It looks like you either do not have cameras connected, or that drivers are not installed (are you using Windows?)

user-738c1f 20 November, 2018, 07:29:34

@papr @wrp excuse me, i have a question. I got the data but i don't know the meaning of frame and timestamp. What do they mean? Also, why does it start at 6378-something? And what is the unit of each column, for example x_norm's unit?

Chat image

wrp 20 November, 2018, 07:46:40

Hi @user-738c1f please see https://docs.pupil-labs.com/#data-format

user-e2056a 21 November, 2018, 03:18:35

@papr I dont think I changed Pupil Player version, the version I'm using now is v1.8.

user-e2056a 21 November, 2018, 03:19:36

@papr will a change of version fix the y-norm data > 1 issue?

papr 21 November, 2018, 07:44:31

@user-e2056a norm data is also valid if it is smaller than 0 or bigger than 1. In the case of surfaces it means that the gaze was not on the surface. See the on_surf column.

papr 21 November, 2018, 07:46:35

@user-e2056a could you also share a recording that exhibits your other issue with not detecting fixations? Please share it with data@pupil-labs.com

user-738c1f 21 November, 2018, 08:49:24

@wrp thank you for help

user-738c1f 21 November, 2018, 08:51:26

i have a question. i want to see a gaze plot as well as a heatmap of the data. However, Pupil Player only showed the heatmap and raw CSV data. I want to know how to see a gaze plot together with the heatmap.

user-bab6ad 21 November, 2018, 14:08:23

Hi everyone. I have a question: I recorded an experiment a while ago, and wonder how and if I can run pupil size detection over that recording afterwards? Thanks for the help!

papr 21 November, 2018, 14:13:39

@user-bab6ad Yes, this is possible if you recorded the eye videos

papr 21 November, 2018, 14:14:05

See https://docs.pupil-labs.com/#data-source-plugins

user-bab6ad 21 November, 2018, 14:14:08

@papr yes, found it now. I was just blind! I recorded the eye videos back then, yes, and did the calibration

user-bab6ad 21 November, 2018, 14:14:25

thx again!

user-68d457 21 November, 2018, 14:23:43

@papr @user-2798d6 Just in case we could be talking about a slightly different audio alignment bug - I've just looked at a 20 minute test recording, and there's no drift as the recording goes on, just a constant offset between audio and video. The same fix worked as before (edit audio_timestamps.npy to match the first audio timestamp to the first world timestamp - I am doing this with a simple Matlab script currently).
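
The constant-offset fix described above can also be done in a few lines of Python instead of Matlab. This is a sketch, not an official tool; it assumes the recording folder contains audio_timestamps.npy and world_timestamps.npy in standard .npy format, and it only removes a constant offset, not drift:

```python
import numpy as np

def align_audio_to_world(audio_ts, world_ts):
    """Shift all audio timestamps by a constant so that the first audio
    timestamp coincides with the first world timestamp."""
    return audio_ts + (world_ts[0] - audio_ts[0])

# Synthetic example: the audio clock starts 0.35 s ahead of the world clock
world_ts = np.array([10.00, 10.03, 10.07])
audio_ts = np.array([10.35, 10.37, 10.39])

fixed = align_audio_to_world(audio_ts, world_ts)
print(fixed)  # first entry now equals world_ts[0]

# On a real recording you would load and re-save the file, e.g.:
# ts = np.load("audio_timestamps.npy")
# np.save("audio_timestamps.npy",
#         align_audio_to_world(ts, np.load("world_timestamps.npy")))
```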

user-c5a9dc 21 November, 2018, 19:12:26

Hi there! My eyeball model keeps jumping between different sizes during a recording. Are there any tweaks to the settings that would prevent this?

user-29e10a 22 November, 2018, 07:24:54

Hi, as of v1.9 it seems there are some disharmonies between the GUI and my system (Windows). If I click into the eye video windows, the buttons in the menu become invisible. They're still there and clickable, but not visible.

Chat image

user-29e10a 22 November, 2018, 07:25:42

Is it just me or does anybody else have this "problem"? πŸ˜‡

user-e435ae 22 November, 2018, 13:56:07

Hi all! I would like to use the Pupil eye tracker to access the gaze of a human subject. In particular, I need to translate the gaze positions obtained from the tracker into an absolute frame (the frame of a motion capture system). This would help me, for instance, to know whether or not the subject is looking at an object (without using the surface tracker). To do so, I need to know where the tracker's relative reference frame is located. I looked for this information in the User Docs but didn't manage to find it. Does anyone know this, or have a better idea of how to do it? Thank you in advance 😬

papr 22 November, 2018, 14:12:21

@user-e435ae The closest solution is to use the surface tracking feature. It does not do head pose estimation though...

user-21d960 22 November, 2018, 17:08:38

Hey, I want to start developing with a PupilLabs eyetracker. I have heard of OkazoLabs eventIDE, but it seems kinda sketchy, anyone have experience with them?

papr 22 November, 2018, 18:04:03

@user-21d960 I have never heard of it

user-21d960 22 November, 2018, 18:07:36

What is commonly used to create programs for eyetrackers then?

papr 22 November, 2018, 18:14:25

Our software is written in Python. You can use any text editor to modify the code. This is often not needed, though. As a start, you can use our network interface to access the data.

papr 22 November, 2018, 18:15:52

See these examples https://github.com/pupil-labs/pupil-helpers
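
As a concrete starting point, a realtime gaze subscriber over the network interface can be sketched like this. It follows the pattern of the pupil-helpers scripts; the request port 50020 is Pupil Remote's default, and the `topic_matches` helper just mirrors the plain prefix filtering that ZMQ SUB sockets apply:

```python
def topic_matches(topic, prefix="gaze"):
    """ZMQ SUB sockets filter messages by a plain prefix match on the topic."""
    return topic.startswith(prefix)

def receive_gaze(host="127.0.0.1", req_port=50020, n=10):
    """Connect to Pupil Remote, look up the SUB port, and print n gaze datums.
    Requires Pupil Capture running with the Pupil Remote plugin enabled."""
    # Imports kept local so the sketch can be read without the deps installed.
    import zmq      # pip install pyzmq
    import msgpack  # pip install msgpack

    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect(f"tcp://{host}:{req_port}")
    req.send_string("SUB_PORT")   # ask Pupil Remote for the SUB port
    sub_port = req.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{host}:{sub_port}")
    sub.setsockopt_string(zmq.SUBSCRIBE, "gaze")  # prefix subscription

    for _ in range(n):
        topic, payload = sub.recv_multipart()
        datum = msgpack.unpackb(payload, raw=False)
        print(topic.decode(), datum["norm_pos"], datum["confidence"])

# receive_gaze()  # uncomment while Pupil Capture + Pupil Remote are running
```

Note that `recv_multipart` returns bytes, so decode the topic before passing it to `topic_matches`.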

user-21d960 22 November, 2018, 18:18:41

Right, but eventIDE is basically a program that allows us to make, for example, a slideshow and test someone's eyesight with different gaze tests, etc. It takes the Pupil Labs eye tracker's data and sees how well the patient is performing these tests. From what I understand, the software you linked just acquires the data.

papr 22 November, 2018, 18:20:29

That is correct regarding the examples. I don't know about eventIDE.

user-21d960 22 November, 2018, 18:21:49

here is an example of what eventIDE can do

user-21d960 22 November, 2018, 18:21:50

https://www.youtube.com/watch?v=GZnPoAOsXto

papr 22 November, 2018, 18:35:39

You are right, it looks indeed kinda sketchy. Why does the screen capture rotate?

user-c5a9dc 22 November, 2018, 18:38:01

@papr Any insights into the question of eyeballs which change in diameter? Is it possible to peg the eyeball model to a specific size?

papr 22 November, 2018, 18:40:33

@user-c5a9dc I think there were previous github issues on that topic. Please, could you also share a recording with data@pupil-labs.com such that we can investigate?

user-21d960 22 November, 2018, 18:44:53

@papr im not sure i just found this online

user-21d960 22 November, 2018, 18:45:05

so when people want to create programs with pupil, do they just make one from scratch?

user-c5a9dc 22 November, 2018, 18:52:30

@papr: Yes, I just opened an issue yesterday but haven't found anything else on it. I can share a recording which has this problem, will do so.

mpk 23 November, 2018, 09:56:54

@user-c5a9dc regarding the eyeball size change: the relevant slider is called Model sensitivity; it can be found in the 3d eye tracker settings.

user-c5a9dc 23 November, 2018, 10:24:00

Yes, can you describe what it means? I.e. what does the parameter do under the hood?

user-c5a9dc 23 November, 2018, 10:25:46

I haven't found a description on your docs.

user-c5a9dc 23 November, 2018, 10:59:45

I've found this piece of code: https://github.com/pupil-labs/pupil/blob/e0781a373439f2cf33b9612c066cb31c3e4c1136/pupil_src/shared_modules/pupil_detectors/singleeyefitter/EyeModel.h (lines 79-81), but it appears that it's not used.

user-fe3dd2 23 November, 2018, 18:22:23

Hello. We are trying to calibrate and are having an issue where, when looking far to either side, the confidence of one eye will drop down to 0.20. When looking far up or down, the confidence of both eyes similarly drops. It looks like it loses tracking of the entire eye (green circle) rather than just the pupil. Any advice?

user-9429ba 24 November, 2018, 16:37:12

Hi - I have been using the Pupil glasses successfully on a Windows 10 machine, but am having driver problems on a different Windows 7 laptop. The eye cameras give no eye image and only open in 'ghost mode'. On digging around, I think the most likely reason is out-of-date drivers. But the only drivers I now have access to are from a previous release, i.e. before version 1.9-7. However, when you go to the driver download from the Pupil docs page, the link is obsolete (below). Are there newer/latest drivers for version 1.9 that someone can point me to? Or other suggestions? Thanks a lot! https://drive.google.com/uc?export=download&id=0Byap58sXjMVfR0p4eW5KcXpfQjg

papr 24 November, 2018, 19:26:11

@user-9429ba hi, unfortunately, we do not support windows 7. We recommend upgrading your laptops.

user-624fa4 24 November, 2018, 19:56:57

I captured four sessions with Pupil Capture. For all of them I set the surfaces with the Offline Surface Tracker in the Pupil Player. But for the last two it is not saving the .png file for heatmaps. Any help?

user-9429ba 24 November, 2018, 20:19:25

no worries - it's just nice to know that that's the likely problem. Thanks!

user-309b26 24 November, 2018, 20:35:53

how do you read the pupil_data file in Python?

user-624fa4 25 November, 2018, 17:43:40

Regarding Pupil Player not exporting the PNG file of the heatmaps: I already tried "Restart with default settings". I've also tried re-exporting from recordings that previously generated the heatmap PNG files. Nothing works. Do I need to download the software again?

papr 25 November, 2018, 17:44:30

@user-624fa4 No, reinstalling will not help if "Restart with defaults" did not help.

papr 25 November, 2018, 17:45:05

@user-624fa4 Does the gaze export work? i.e. is gaze_positions.csv filled with data?

user-6b1b1f 26 November, 2018, 07:09:49

Hi all, is pupil labs supported on Ubuntu 18.04? I can get the cameras to work on 16.04 but there is a driver initialization issue on 18.04.

user-6b1b1f 26 November, 2018, 07:10:48

Ah, I just saw this question was answered previously.

wrp 26 November, 2018, 07:12:31

@user-6b1b1f thanks for following up on this 😸 Just for consistency - Pupil software does run on Ubuntu 18.04, and should not require any customizations/setup other than downloading the app from https://github.com/pupil-labs/pupil/releases/latest

user-6b1b1f 26 November, 2018, 07:14:23

@wrp thanks. thats what I did but it fails. I can share a screenshot if you are available

wrp 26 November, 2018, 07:14:54

Sure, would be happy to take a look at a screenshot? BTW what Pupil hardware are you using?

user-6b1b1f 26 November, 2018, 07:15:20

Chat image

user-6b1b1f 26 November, 2018, 07:15:28

We are using the addon for the HTC Vive

user-6b1b1f 26 November, 2018, 07:15:42

The only thing I can see relevant in dmesg is

user-6b1b1f 26 November, 2018, 07:15:46
[ 1249.384129] uvcvideo: Found UVC 1.00 device Pupil Cam1 ID0 (05a3:9230)
[ 1249.421572] uvcvideo 1-6.1.3:1.0: Entity type for entity Extension 3 was not initialized!
[ 1249.421576] uvcvideo 1-6.1.3:1.0: Entity type for entity Processing 2 was not initialized!
[ 1249.421578] uvcvideo 1-6.1.3:1.0: Entity type for entity Camera 1 was not initialized!
[ 1249.421706] input: Pupil Cam1 ID0: Pupil Cam1 ID0 as /devices/pci0000:00/0000:00:14.0/usb1/1-6/1-6.1/1-6.1.3/1-6.1.3:1.0/input/input231
user-6b1b1f 26 November, 2018, 07:17:39

Right now Im guessing its something with uvcvideo on 18.04 vs 16.04 but Im not sure

user-6b1b1f 26 November, 2018, 07:17:51

note that we have a windows setup as well and it works no problem

user-6b1b1f 26 November, 2018, 07:18:01

and we just tested it on a 16.04 machine and that works as well

wrp 26 November, 2018, 07:20:05

@user-6b1b1f a few notes: 1. I would recommend upgrading to the latest version of Pupil software (v1.9) 2. Try changing udev rules for running libuvc as normal user (this usually does not need to be done with the app bundle), but give it a try:

echo 'SUBSYSTEM=="usb",  ENV{DEVTYPE}=="usb_device", GROUP="plugdev", MODE="0664"' | sudo tee /etc/udev/rules.d/10-libuvc.rules > /dev/null
sudo udevadm trigger
wrp 26 November, 2018, 07:20:43

@user-6b1b1f thanks for the comprehensive notes - good to know that the hardware is working, now it seems like it is a permission issue on your specific installation of 18.04

user-6b1b1f 26 November, 2018, 07:22:07

OK. that is good - can you suggest a fix for that? Thanks.

user-6b1b1f 26 November, 2018, 07:22:20

whoops too soon. I will try that

user-6b1b1f 26 November, 2018, 07:22:47

I am using using 1.9.

wrp 26 November, 2018, 07:22:59

πŸ†—

user-6b1b1f 26 November, 2018, 07:31:59

hmm that didnt seem to work

user-6b1b1f 26 November, 2018, 07:32:09

the rules file was updated though

user-6b1b1f 26 November, 2018, 07:33:49

I had to open the permissions completely

user-6b1b1f 26 November, 2018, 07:33:51

0777

user-6b1b1f 26 November, 2018, 07:33:53

works now

user-6b1b1f 26 November, 2018, 07:33:57

thanks!

wrp 26 November, 2018, 07:35:04

"I had to open the permissions completely" - open permissions completely for what exactly?

user-6b1b1f 26 November, 2018, 07:36:42

plugdev group

user-6b1b1f 26 November, 2018, 07:36:50

echo 'SUBSYSTEM=="usb", ENV{DEVTYPE}=="usb_device", GROUP="plugdev", MODE="0777"'

wrp 26 November, 2018, 07:37:33

Thanks for the notes @user-6b1b1f - is your user sudo user?

user-6b1b1f 26 November, 2018, 07:40:30

my user has sudo access

wrp 26 November, 2018, 07:41:01

ok, thanks for the notes. This should not require 0777 access, but we will look into it further

user-6b1b1f 26 November, 2018, 07:43:01

should the plugdev group be one of my users groups?

user-6b1b1f 26 November, 2018, 07:43:38

current user is not under many groups actually

user-6b1b1f 26 November, 2018, 07:44:34

[email removed] groups test
test : test sudo video

user-6b1b1f 26 November, 2018, 07:44:56

user permissions couldve gotten messed up somewhere but I dont recall doing anything

user-6b1b1f 26 November, 2018, 07:46:44

hmm yeah that seems to be on my end. my other users' permissions seem to have plugdev

user-6b1b1f 26 November, 2018, 07:50:26

I just added my user to plugdev and that didnt seem to fix it, so for now 0777 seems necessary for me until I can figure out whats up

papr 26 November, 2018, 07:59:36

I have also found that one needs to reboot after changing such permissions and user groups.

user-6b1b1f 26 November, 2018, 08:25:18

OK. that is good to know I did not do that. I will double check later. Thank you for your help

user-6b1b1f 26 November, 2018, 08:47:12

Now it is working. Under calibration, there is HMD Calibration and 3D HMD Calibration, but I don't see any documentation on using these calibration routines. Does a manual exist on how to use them?

papr 26 November, 2018, 08:49:41

@user-6b1b1f They are meant to be used with the unity plugin or a similar hmd client

papr 26 November, 2018, 08:49:51

See the hmd-eyes project on Github for details

user-6b1b1f 26 November, 2018, 08:50:15

OK.

user-624fa4 26 November, 2018, 10:55:50

@papr Yes, the CSV files contain data. I opened the files of session 002 in Pupil Player again; last Friday and Saturday I had managed to export its heatmaps. I did the same procedure yesterday (November 25), and Pupil Player no longer exports the heatmaps. 😦

papr 26 November, 2018, 12:42:58

@user-624fa4 Could you share one of the recording that exhibit this behavior with data@pupil-labs.com ?

user-63b99d 26 November, 2018, 18:57:54

Hello all, my name is Vasi. I am here because I am interested in learning and following along. I am hoping to be able to contribute to the project in the near future; however, I will only follow at this point. I am excited to be here and to learn about such innovative and ingenious programs.

user-63b99d 26 November, 2018, 18:59:24

Oops, apologies name is updated😊

papr 26 November, 2018, 20:16:35

Hi @user-63b99d Welcome to the channel!

user-63b99d 26 November, 2018, 20:43:46

Thank you @papr

papr 26 November, 2018, 20:44:23

@user-63b99d May I ask what your background is and how you got to know about Pupil?

user-63b99d 26 November, 2018, 20:50:17

@papr Certainly! Previously I was a student and dabbled a bit in gaming and game development. Most recently I became curious about app development for the medical community. I heard about Pupil on GitHub. I am alway excited and intrigued to learn new things so I decided to follow the thread and here I am. 😊

user-624fa4 26 November, 2018, 22:30:49

Hello @papr I sent it to you.

papr 26 November, 2018, 22:31:55

@user-624fa4 Thank you, I have received them.

user-624fa4 26 November, 2018, 22:40:51

@papr I made a new test recording now and the heatmaps were exported. Was it any particular problem with these others? Luckily it was a pre-test of the experiment but I did not want to take that risk with the final data.

papr 26 November, 2018, 22:42:36

@user-624fa4 I will let you know when I had a look at your data.

user-f3048f 26 November, 2018, 23:59:58

Hi there! I'm wondering if anybody has used GazeCode for manual coding with Pupil. If yes, what was the experience? I'm having some problems running GazeCode in Matlab with the Pupil 200 Hz cameras. Any input is welcome. Cheers!

user-d81c81 27 November, 2018, 02:31:05

hey, i'm having trouble building boost.python - can anyone help me?

user-f27d88 27 November, 2018, 02:37:05

hello @user-d81c81 , please tell us more about the details, I will try to help you.

user-d81c81 27 November, 2018, 02:51:11

https://docs.pupil-labs.com/#boost

user-d81c81 27 November, 2018, 02:52:55

@user-f27d88 i have installed boost; now i'm trying to build boost.python and failing to do so. can you please help me figure out the reason?

user-f27d88 27 November, 2018, 02:58:38

Sorry, I don't have a windows laptop nearby :< But you can provide the environment info like system version and error logs so I can try to fix it.

user-d81c81 27 November, 2018, 02:59:36

how do i provide you the error log?

papr 27 November, 2018, 07:02:07

@user-d81c81 Is there a specific reason why you are building from source on windows? Often, it is much easier to just download the application releases from our github page.

user-f27d88 27 November, 2018, 07:16:17

I got some errors when I install libuvc on macOS 10.14, I provided a solution at https://github.com/pupil-labs/pyuvc/issues/30, I'm not sure there is better choice. I also found that the docs ask us to run make && sudo make install in https://github.com/pupil-labs/pyuvc but make && make install in https://docs.pupil-labs.com/#install-libuvc. I guess one of the docs is wrong.

papr 27 November, 2018, 07:26:49

@user-f27d88 Hi, I was working on that issue yesterday as well. I made changes to the CMakeLists file, but they need to be tested on Windows first. I will make a PR later.

user-f27d88 27 November, 2018, 07:34:09

@papr Great, Thank you.

papr 27 November, 2018, 10:17:14

@user-f27d88 This is the promised libuvc PR: https://github.com/pupil-labs/libuvc/pull/32

user-d81c81 27 November, 2018, 12:13:46

@papr how to download app releases from github?

user-e0772f 27 November, 2018, 12:16:26

hello. I'm having problems on macOS 10.14 when installing pyglui with pip3. Any suggestions?

papr 27 November, 2018, 12:25:20

@user-e0772f what is the exact error message?

papr 27 November, 2018, 12:27:01

@user-d81c81 https://github.com/pupil-labs/pupil/releases/tag/v1.9

papr 27 November, 2018, 12:27:21

Choose the download according to your operating system

user-e0772f 27 November, 2018, 12:28:05

pyglui/ui.cpp:664:10: fatal error: 'gl.h' file not found
#include "gl.h"
         ^~~~~~
1 error generated.
error: command 'clang' failed with exit status 1

user-d81c81 27 November, 2018, 12:28:27

@papr thanks man

papr 27 November, 2018, 12:30:25

@user-e0772f please make sure that you installed OpenGL

user-e0772f 27 November, 2018, 12:38:23

according to my pip install list I have installed OpenGL: Package Version


av 0.4.2.dev0
cysignals 1.7.2
Cython 0.29.1
ipaddress 1.0.22
msgpack 0.5.6
ndsi 0.4
nose 1.3.7
numexpr 2.6.8
numpy 1.15.4
pip 18.1
psutil 5.4.8
PyAudio 0.2.11
PyOpenGL 3.1.0
pyre 0.3.2
pyzmq 17.1.2
scipy 1.1.0
setuptools 40.5.0
uvc 0.13
wheel 0.32.2

papr 27 November, 2018, 12:43:49

@user-e0772f are you on Mac OS or Linux?

user-e0772f 27 November, 2018, 12:58:25

I'm on Mac OS as I said before, but I think I'm going to try Windows 10 on another laptop

user-e0772f 27 November, 2018, 12:59:49

because I'm looking for an isolated environment for a pupil project, but in this case I need to install all dependencies using brew

papr 27 November, 2018, 13:04:49

@user-e0772f ah yes, I did not see that. Is there a reason for running from source? 99% of the custom stuff can be done via plugins or the network API. No need to run from source for that

user-b0c902 27 November, 2018, 13:31:15

Hi can someone help me with how to resolve this please ?

user-b0c902 27 November, 2018, 13:31:59

Chat image

papr 27 November, 2018, 13:35:17

@user-b0c902 could you please close Capture and upload the capture.log file in the pupil_capture_settings folder?

user-d81c81 27 November, 2018, 13:47:36

@papr i've downloaded the files please tell me how to run it

papr 27 November, 2018, 13:49:46

@user-d81c81 after unpacking the 7z file, there should be a folder with a bunch of files. One of them is Capture.exe. Right click it and start it with administrator rights

papr 27 November, 2018, 13:50:57

Also please see the getting started section of our documentation. It is linked on our website.

user-d81c81 27 November, 2018, 13:52:27

i got 3 sub folders

user-d81c81 27 November, 2018, 13:55:18

is it pupil_capture?

user-d81c81 27 November, 2018, 13:57:57

@papr please send a video link explaining it if you know of one

wrp 27 November, 2018, 14:13:07

@user-d81c81 you will have 3 subfolders: 1. pupil_capture_, 2. pupil_player_, 3. pupil_service_

user-d81c81 27 November, 2018, 14:13:37

@wrp i got it. how do i capture?

wrp 27 November, 2018, 14:13:50

Open the pupil_capture subfolder

wrp 27 November, 2018, 14:14:08

and find pupil_capture.exe, right click it, and run it as administrator

user-d81c81 27 November, 2018, 14:14:29

i want the eye coordinates in a given time interval

user-d81c81 27 November, 2018, 14:14:34

@wrp

user-d81c81 27 November, 2018, 14:15:17

should i do anything after opening pupil_capture.exe or is it enough?

papr 27 November, 2018, 14:17:41

@user-d81c81 please read the getting started section of our documentation.

wrp 27 November, 2018, 14:18:14

https://docs.pupil-labs.com/

user-d81c81 27 November, 2018, 14:18:16

i have read the documentation; i was stuck at building boost.python

wrp 27 November, 2018, 14:18:42

why are you building Pupil from source @user-d81c81 - based on above messages it seems you are running the application bundle. Can you clarify?

user-d81c81 27 November, 2018, 14:18:48

so i downloaded the files from the link given by @papr

user-d81c81 27 November, 2018, 14:19:09

@wrp now i am trying to run the downloaded application

wrp 27 November, 2018, 14:19:17

yes, and is it running?

user-d81c81 27 November, 2018, 14:19:40

only capture is running. what's the next step?

wrp 27 November, 2018, 14:21:19

@user-d81c81 - https://docs.pupil-labs.com/#capture-workflow

wrp 27 November, 2018, 14:21:36

Please read through the workflow

user-d81c81 27 November, 2018, 14:21:49

ok thank you

user-2be752 27 November, 2018, 19:52:53

Is it possible to connect two eyetrackers to the same computer and record from both of them at the same time from Pupil Capture?

papr 27 November, 2018, 19:55:51

@user-2be752 You will need to run two instances of Pupil Capture. Be aware that this might be very resource intensive. We recommend either using two separate Pupil Mobile instances or two separate computers running Pupil Capture. You can synchronize them using our Pupil Groups and Time Sync plugins.

papr 27 November, 2018, 20:02:44

Good news everyone. You can now be notified about our releases via Github without having to read the day-to-day development notifications. πŸ‘

Chat image

user-46c590 28 November, 2018, 12:26:21

hello! I'm looking for raw data extract plugin and i could not find it on the pupil capture software

user-46c590 28 November, 2018, 12:26:38

how can i use it?

papr 28 November, 2018, 12:35:43

@user-46c590 Capture stores the data in an intermediate format. Open the recording in Pupil Player and use the Raw Data Exporter there.

user-46c590 28 November, 2018, 12:39:43

i want to extract real time pupil data into a csv file. what is the easiest way to do that?
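For real-time access (as opposed to post-hoc export in Player), the usual route is Capture's network API. Below is a minimal sketch of subscribing to pupil data over ZMQ and appending it to a CSV file. It assumes Pupil Remote is enabled on its default port 50020 and that the third-party packages pyzmq and msgpack are installed; the function and file names (stream_to_csv, pupil_row, pupil_data.csv) are illustrative, not part of the Pupil API.

```python
import csv


def pupil_row(datum):
    """Pick the fields of interest out of a pupil datum dict."""
    return [datum["timestamp"], datum["confidence"], datum["diameter"]]


def stream_to_csv(path, host="127.0.0.1", port=50020, n_samples=100):
    # Third-party deps: pip install pyzmq msgpack
    import msgpack
    import zmq

    ctx = zmq.Context()

    # Ask Pupil Remote for the SUB port, then subscribe to pupil data.
    remote = ctx.socket(zmq.REQ)
    remote.connect(f"tcp://{host}:{port}")
    remote.send_string("SUB_PORT")
    sub_port = remote.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{host}:{sub_port}")
    sub.subscribe("pupil.")  # all pupil topics, both eyes

    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "confidence", "diameter"])
        for _ in range(n_samples):
            topic, payload = sub.recv_multipart()
            datum = msgpack.unpackb(payload, raw=False)
            writer.writerow(pupil_row(datum))


# With Capture running, uncomment to record 100 samples:
# stream_to_csv("pupil_data.csv")
```

This avoids writing a Capture plugin entirely; the script can run on the same machine or any machine that can reach the Capture host over TCP.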

user-21d960 28 November, 2018, 22:37:26

anyone use experiment building tools? im looking for a good one

user-c02411 28 November, 2018, 23:56:18

Hello, would anyone know how to make ffmpeg recognized in python mac os for opencv or youtube-dl modules?

user-95b6b7 29 November, 2018, 10:14:06

[email removed] Has anyone used a Pupil Labs set with PsychoPy? Is there an easy way to send digital time-stamps via parallel port to Pupil Capture?

user-95b6b7 29 November, 2018, 16:05:00

To elaborate my question a bit further.. We're tracking pupil diameter as a measure of fear response during a simple aversive learning task. The goal is to extract trial-by-trial fear responses. Worst case scenario - we can just time-lock it to the onset of the task based on video recordings from frontal camera (we don't need perfect time-precision), but I would rather prefer having proper time-stamps at the beginning of each trial (like you normally do when, for example, registering galvanic skin response).

papr 29 November, 2018, 16:40:46

@user-95b6b7 what do you need the parallel port for?

user-dfeeb9 29 November, 2018, 17:04:28

Would you have any advice on appropriately setting pupil-min when adjusting pupil tracking parameters? I understand the main source of accuracy is placement, but does that mean we should try not to change these settings at all? More specifically, how do you use this pupil-min setting, and will it affect, for example, diameter measurement data?

user-81072d 29 November, 2018, 19:11:34

@user-95b6b7 - we're using Pupil+Psychopy, but the other way around (gaze data informs our psychopy program).

FWIW though, the pupil remote plugin has a pretty straightforward ZMQ interface. I don't know if it makes sense for your workflow, but you could request the current Pupil timestamp and save that in your psychopy log? Or maybe you want to use custom annotations?

https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py
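The custom-annotation route can be sketched roughly like the linked pupil-helpers script: ask Pupil Remote for Capture's clock ("t") so markers line up with the recording, then publish an annotation payload to the PUB port. This is a hedged sketch, not the canonical implementation; it assumes pyzmq and msgpack are installed, Pupil Remote is on its default port 50020, and the Annotation plugin is enabled in Capture. The helper names (make_annotation, send_trial_marker) are illustrative, and the exact topic/field conventions may differ between Pupil versions, so check the helper script above against your release.

```python
def make_annotation(label, timestamp, duration=0.0):
    """Build an annotation dict in the shape the annotation plugin expects."""
    return {
        "topic": "annotation",
        "label": label,
        "timestamp": timestamp,
        "duration": duration,
    }


def send_trial_marker(label, host="127.0.0.1", port=50020):
    # Third-party deps: pip install pyzmq msgpack
    import msgpack
    import zmq

    ctx = zmq.Context()
    remote = ctx.socket(zmq.REQ)
    remote.connect(f"tcp://{host}:{port}")

    # Use Capture's own clock so the marker lines up with the recording.
    remote.send_string("t")
    capture_time = float(remote.recv_string())

    # Annotations are published to Capture's PUB port as msgpack payloads.
    remote.send_string("PUB_PORT")
    pub_port = remote.recv_string()
    pub = ctx.socket(zmq.PUB)
    pub.connect(f"tcp://{host}:{pub_port}")

    annotation = make_annotation(label, capture_time)
    pub.send_multipart(
        [annotation["topic"].encode(), msgpack.packb(annotation, use_bin_type=True)]
    )


# With Capture recording, call once per trial, e.g.:
# send_trial_marker("trial_03_onset")
```

Called once per trial from the PsychoPy script, this gives per-trial markers without any parallel-port hardware, since everything goes over TCP to the Capture machine.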

user-9429ba 30 November, 2018, 10:30:34

Hi- I recently did a successful recording, but when exporting the data in Player I get a message about user calibration not being available, despite calibrating:

Export World Video - [INFO] camera_models: Previously recorded calibration found and loaded!
Export World Video - [INFO] camera_models: No user calibration found for camera eye0 at resolution (192, 192)
Export World Video - [INFO] camera_models: No pre-recorded calibration available
Export World Video - [WARNING] camera_models: Loading dummy calibration
Export World Video - [INFO] camera_models: No user calibration found for camera eye1 at resolution (192, 192)
Export World Video - [INFO] camera_models: No pre-recorded calibration available
Export World Video - [WARNING] camera_models: Loading dummy calibration

It did cross my mind that this could be because I mistakenly pressed the T key to validate when meaning to log one of my custom annotations - then just continued. Could this be the case? And should I be concerned about this message?

Thanks a lot, /Richard

papr 30 November, 2018, 10:34:32

@user-9429ba This message is about the camera intrinsics calibration. This does not mean the user gaze calibration. Which world cam do you use?

user-9429ba 30 November, 2018, 10:55:33

hmm, not sure of the name of the world camera (I looked in the Player and Capture settings). It is a new 200Hz binocular - w120 e200b on the box. Does that help?

papr 30 November, 2018, 10:59:47

@user-9429ba yes. Can you check if there are any *.intrinsics files in your recording folder?

user-9429ba 30 November, 2018, 11:17:26

Yes, here is the intrinsics file

world.intrinsics

papr 30 November, 2018, 11:21:00

@user-9429ba thank you. I will look into the issue

user-9429ba 30 November, 2018, 11:37:30

thanks a lot!

user-95b6b7 30 November, 2018, 16:59:35

@papr I'm running pupil capture on a separate computer and would like to send time-stamps on each trial. (I did it via parallel port before when measuring galvanic skin response).

End of November archive