core


user-c351d6 01 October, 2018, 11:26:38

Hey guys, are you making progress with the issue that calculated fixations are not saved in Pupil Player? Is there already a release date?

papr 01 October, 2018, 13:15:08

@user-c351d6 It is being worked on but there is no release date yet

papr 01 October, 2018, 13:15:52

@user-81fd53 Activate the Frame Publisher plugin in Capture and use this to receive the frames: https://github.com/pupil-labs/pupil-helpers/blob/0df77b47cebd49a6c35b6769da483c115a626836/pupil_remote/recv_world_video_frames.py
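For reference, a minimal sketch of the receiving side (assuming Pupil Remote on its default port 50020 and the Frame Publisher set to JPEG; see the linked helper for the full version):

```python
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote for the SUB port of the IPC backbone.
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

# Subscribe to world frames published by the Frame Publisher plugin.
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:{}".format(sub_port))
sub.setsockopt_string(zmq.SUBSCRIBE, "frame.world")

while True:
    # Frame messages carry three parts: topic, msgpack header, raw image buffer.
    topic, payload, frame_data = sub.recv_multipart()
    header = msgpack.loads(payload, raw=False)
    print(header["timestamp"], header["width"], header["height"], len(frame_data))
```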

papr 01 October, 2018, 13:16:30

@user-c828f5 offline_calibration_gaze

user-828f57 01 October, 2018, 13:27:38

Hey there everyone! New to the server here. I'm a Media Technology M.Sc. student from Denmark. This semester our group (3 students) will be working with Pupil Labs, especially the HMD one, probably doing a case study on disabled people and interactions in VR. Hope this is the right channel to introduce yourself in. I'll probably ask for help on here, and if anyone has tips and tricks on working with disabled persons, be sure to throw it our way. Happy eye tracking everyone! Oh also, please @ tag me, if you want to get in contact. I've turned off regular notifications

user-c828f5 01 October, 2018, 13:27:48

Thanks @papr , how do I read that file? I tried finding something on the website but no luck.

user-c828f5 01 October, 2018, 13:28:05

Welcome to the group, @user-828f57!

wrp 01 October, 2018, 13:28:38

Hi @user-828f57 welcome to the Pupil community server 👋🏽 please direct questions regarding vr/ar to vr-ar channel

user-828f57 01 October, 2018, 13:29:29

@wrp Ah yes I see, thanks!

user-89dccf 01 October, 2018, 15:44:37

hi

user-89dccf 01 October, 2018, 15:45:09

Could I use a mobile camera app with the Pupil eye-tracking library?

papr 01 October, 2018, 15:45:43

Do you mean e.g. the front camera of your phone?

user-89dccf 01 October, 2018, 15:45:51

yes

papr 01 October, 2018, 15:46:23

No, this is not possible. This is a different type of eye tracking, called remote eye tracking. Pupil only does head-mounted eye tracking.

user-89dccf 01 October, 2018, 15:51:10

I want to use the mobile front camera to get the pupil's diameter. Could I use a CNN to do this?

papr 01 October, 2018, 15:53:11

There might be work on that, but nothing specific that I could name. As I said, Pupil does not do remote eye tracking. Remote eye tracking comes with its own set of problems, e.g. head pose estimation, that Pupil does not have to solve.

user-89dccf 01 October, 2018, 15:57:15

Okay. Thanks for the information.

user-e7102b 01 October, 2018, 21:45:54

Hey @papr , a few months back I enquired whether there were any plans to incorporate a Batch Raw Data Export function into Pupil Player. I'm just checking in to see if there are any updates.

user-e7102b 01 October, 2018, 21:47:20

Thanks

user-14d189 02 October, 2018, 04:53:39

Hi there,

user-14d189 02 October, 2018, 04:55:09

I am trying to better understand the confidence data of the pupil detections. Are there any documents you could recommend?

user-14d189 02 October, 2018, 04:55:52

Same with the calculation of the gaze data and its confidence.

user-14d189 02 October, 2018, 04:56:39

I assumed there would be a correlation between both confidence factors, but so far I could not find one.

user-14d189 02 October, 2018, 04:56:58

Thanks for any hints.

papr 02 October, 2018, 06:50:09

@user-e7102b Hey, unfortunately we have not had the time to start the Batch Exporter rework yet. Are you only looking to export prerecorded pupil and gaze data, or do you need to export offline-detected data as well?

papr 02 October, 2018, 06:52:50

@user-14d189 The confidence is the ratio of potential pupil edge pixels that lie on the fitted pupil ellipse, i.e. how noisy the detected pupil is.

papr 02 October, 2018, 06:53:32

The above applies to the 2d detector. The 3d detector also takes the current model fitness into account.

papr 02 October, 2018, 06:56:09

Gaze confidence is derived from pupil confidence. 2d gaze calibration/mapping works via polynomial regression, and 3d calibration is calculated using bundle adjustment. 3d mapping comes down to projecting vectors from the eye camera coordinate system into the scene coordinate system.
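To illustrate the 2d idea (a toy sketch, not Pupil's actual implementation): fit a low-order bivariate polynomial that maps normalized pupil positions to the reference locations collected during calibration.

```python
import numpy as np

def design_matrix(px, py):
    # Example term set: 1, x, y, xy, x^2, y^2 (Pupil's exact terms may differ).
    return np.column_stack([np.ones_like(px), px, py, px * py, px ** 2, py ** 2])

# Stand-in calibration data: (N, 2) pupil positions and matching reference points.
pupil_xy = np.random.rand(50, 2)
ref_xy = pupil_xy * 0.9 + 0.05

A = design_matrix(pupil_xy[:, 0], pupil_xy[:, 1])
coeffs, *_ = np.linalg.lstsq(A, ref_xy, rcond=None)  # (6, 2) mapping coefficients

# Map a new pupil position to a gaze estimate:
gaze = design_matrix(np.array([0.5]), np.array([0.5])) @ coeffs
print(gaze)
```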

user-ba85a3 02 October, 2018, 12:50:08

Hi there, here is a list of things I would like to know:
1) I found the STL files of some eye tracker parts on GitHub. Is it possible to have the STEP files? STL is less practical than STEP, especially when you need to make adjustments in 3D CAD.
2) Do the eye tracking glasses work fine even if the subject is wearing glasses? I performed some tests but the pupil detection seems tricky; I saw the IR light reflected on the glasses' lenses. Any suggestions?
3) A few months ago I bought a pair of eye tracking glasses (w120 e200b) ready to use. Is it possible to purchase the same device but disassembled? I mean the same device, but with the wiring pre-assembled and the three cameras with their electronics already mounted.
Thanks in advance for your reply.

papr 02 October, 2018, 12:52:06

@user-ba85a3 Please write an email to info@pupil-labs.com concerning these questions.

user-ba85a3 02 October, 2018, 12:52:32

ok thanks @papr

user-e7102b 02 October, 2018, 15:36:29

@papr Ok thanks for the update. I'm just looking for the most straightforward way to extract prerecorded pupil, gaze, surface and annotation data. I'm not currently doing any offline processing with the data. I export the data in pupil player and then I have created a processing pipeline in MATLAB to automatically import the necessary .csv files and extract the data that I need. The slow part at the moment is downloading each pupil recording from our remote server, loading it into pupil player, exporting, then re-uploading the exported data to the server for further processing. Multiply this by several hundred recordings of between 5-30 mins (huge multi-session project) and this gives you an idea of what I'm dealing with.

papr 02 October, 2018, 15:43:51

@user-e7102b I understand. This should be automatable with some Python code. Use these functions to load the intermediate data format.
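As a sketch of what that could look like (assuming the load_pldata_file helper from pupil_src/shared_modules/file_methods.py and a v1.8-style recording; the paths below are hypothetical):

```python
import sys
sys.path.append("path/to/pupil/pupil_src/shared_modules")  # hypothetical checkout path

from file_methods import load_pldata_file

# Returns a named tuple of (data, timestamps, topics) for the given topic.
gaze = load_pldata_file("path/to/recording", "gaze")
for ts, datum in zip(gaze.timestamps, gaze.data):
    print(ts, datum["norm_pos"], datum["confidence"])
```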

user-e7102b 02 October, 2018, 16:08:07

@papr Thanks, I'll give this a go.

user-14d189 03 October, 2018, 02:31:57

@papr thanks heaps! I'll get my head around it.

user-14d189 03 October, 2018, 02:36:14

@user-e7102b Could I ask what you use for importing CSV into MATLAB? The fastest one I found was 'dlmread', but I had to manually delete the columns with string data ('3Dc++' in pupil_positions.csv and base_data / column F in gaze_positions.csv).

user-e7102b 03 October, 2018, 02:52:55

hey @user-14d189 I recall running into a similar problem when I tried to use dlmread (the string data caused an error). I got around it by using textscan

user-e7102b 03 October, 2018, 02:54:16

I can share my scripts if that helps?

user-14d189 03 October, 2018, 02:55:33

Thank you @user-e7102b, it would be nice to see how you solved it. All other methods I tried took ages.

user-14d189 03 October, 2018, 02:56:21

to load into matlab.

user-14d189 03 October, 2018, 04:37:16

@user-e7102b Thanks heaps!

user-14d189 03 October, 2018, 04:37:29

I ended up with code like this

user-14d189 03 October, 2018, 04:37:48

```matlab
clear all;
fileID = fopen('gaze_positions.csv', 'r');
if fileID == -1
    disp('Error, check file name')
else
    headers = fgetl(fileID);
    C = textscan(fileID, ...
        '%f %d %f %f %f %s %f %f %f %f %f %f %f %f %f %f %f %f %f %f %f', ...
        'Delimiter', ',');
    fclose(fileID);
end
p(1:size(C{1}, 1), 1) = C{1, 1};
% etc.
```

user-14d189 03 October, 2018, 04:38:23

and it's very fast.

user-e7102b 03 October, 2018, 05:09:35

@user-14d189 Here you go: https://github.com/tombullock/pupil_labs_analysis_matlab

user-e7102b 03 October, 2018, 05:10:57

This might save you some time...

user-14d189 03 October, 2018, 05:15:46

thank you! That's a huge time-saver! @user-e7102b

user-e7102b 03 October, 2018, 05:22:46

@user-14d189 no probs!

user-9572a3 03 October, 2018, 15:07:41

I keep reading that you can invert the camera image in the general settings of the Capture app, but I'm unable to find the option for either of our USB cameras. Here is a screen shot of the view with the general settings open. Any suggestions for finding the invert image option, or how to invert the video at the OS level? Thank you!

Chat image

wrp 03 October, 2018, 18:07:52

@user-9572a3 you can flip the eye camera images from the general menu in the eye windows. But there is no option to flip the world video. Could you let us know what hardware you are using?

user-4d8c8c 04 October, 2018, 05:57:10

Hi! My research team found your open source eye-tracking software and we're interested in applying it to our research. We have a Tobii eye-tracking device; do your APIs effectively provide friendly functions that can take raw data from the eye-tracker and provide functions that handle saccades, fixations, regressions, etc?

user-3f0708 04 October, 2018, 15:13:57

What is the iMotions Exporter for?

papr 04 October, 2018, 15:25:13

@user-3f0708 You use that for this service: https://imotions.com/

papr 04 October, 2018, 15:53:26

@user-4d8c8c You can access all of our data in real time via our network interface. We have some analysis functions, e.g. blink and fixation detection, surface tracking, etc.

user-9572a3 04 October, 2018, 17:04:14

Hi @wrp, we are using a 3D-printed headset with Logitech and Microsoft brand cameras. I sort of inherited the project, but my understanding is they were recommended by Pupil for making a DIY headset. As for flipping the world camera, I ended up just doing it on the headset, as that seemed simplest. We are still working on getting the focuser back on the eye cam to allow for better pupil detection. We plan to have people try this out at an upcoming open house. Any tips on having a setup that allows people to easily try the cameras on and get a sense of its capabilities? We are thinking of having it plugged into a laptop that is outputting the Pupil Capture app to a nearby display. If you know of a cleaner way to do this, please let us know! Thank you!

wrp 04 October, 2018, 17:36:34

Hi @user-9572a3 thanks for the response. Yes, flipping the world cam to be non-inverted is the correct way to go for the DIY Headset.

wrp 04 October, 2018, 17:38:15

Your demo setup seems good. I recommend printing out some manual markers for calibration and maybe try practicing single marker calibration with the manual marker.

user-5ea250 05 October, 2018, 17:20:28

Hi there, has anyone seen this error on a Mac: LSOpenURLsWithRole() failed with error -10810 for the file /Applications/Pupil Service.app

user-8779ef 07 October, 2018, 12:40:06

@papr Able to reproduce the issue!?!?! Wooooh! Thanks for the update!

user-a6b05f 08 October, 2018, 00:28:02

Hi, I'm currently using Pupil Lab's HoloLens add-ons. Was wondering whether the mount provided could be produced in different colours? (Odd question I know)

papr 08 October, 2018, 06:46:58

@user-a6b05f Hi. Please write an email to info@pupil-labs.com in regard to your question.

user-f8df56 08 October, 2018, 09:22:51

Hi, I'm working together with researchers from a hospital and they have questions about the values given after the calibration. What do the root-mean-square residuals mean exactly? What does the percentage of used data points mean - does this say something about the reliability?

papr 08 October, 2018, 13:26:53

Hey @user-f8df56 We do some basic outlier filtering before calibrating. The percentage shows how many data points have been removed as outliers.

The residuals are measured in degrees between the reference location and the associated gaze positions.

Please also see https://docs.pupil-labs.com/#notes-on-calibration-accuracy
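For intuition, the residual for one sample is just the angle between the gaze direction and the reference direction - a toy sketch with hypothetical direction vectors:

```python
import numpy as np

def angular_residual_deg(gaze_dir, ref_dir):
    # Angle in degrees between two 3d direction vectors.
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    ref_dir = ref_dir / np.linalg.norm(ref_dir)
    cos_angle = np.clip(np.dot(gaze_dir, ref_dir), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angle))

# A gaze direction ~0.02 rad off the reference is roughly a 1.15 degree residual.
print(angular_residual_deg(np.array([0.0, 0.02, 1.0]), np.array([0.0, 0.0, 1.0])))
```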

user-af87c8 08 October, 2018, 15:28:02

@user-f8df56 (not Pupil Labs here) Using the 2D algorithm we found a typical validation error of 0.8 to 1.5 degrees (but 1.5 degrees was our recalibration threshold, so there is some selection bias in our numbers). That is for a full-screen calibration with 13 points, ~70 visual degrees horizontal and 30 vertical.

user-0eef61 08 October, 2018, 17:25:25

Hi, I am interested in purchasing the HTC Vive add-on and I wanted to ask you a question. Can I record gaze data in Unity with the C# programming language? I am interested in recording data to understand where the user is looking in the scene and in recording the pupil size (pupil dilation). Is this possible with Unity and C#? If so, can you please send me any related developer documentation?

user-ba85a3 09 October, 2018, 10:11:52

Hi guys, I know it's not the right place for this kind of request, but I need it urgently. I would like the phone number of Pupil Labs' headquarters for a hardware repair of my glasses. My shipper is asking me for it. I didn't find it on the website.

papr 09 October, 2018, 10:15:57

@user-0eef61 Technically, this is possible, yes. Unfortunately, I do not know where to implement this though.

wrp 09 October, 2018, 10:30:31

@mpk please DM @user-ba85a3 re contact

papr 09 October, 2018, 10:53:45

@wrp I did already

wrp 09 October, 2018, 13:01:58

Thanks

user-0eef61 09 October, 2018, 13:21:37

@papr Thank you for your reply. So I am guessing that I can use C# in Unity to record gaze data such as eye coordinates and pupil diameter? I must emphasize that besides getting the pupil diameter I want to get data that shows where the user is looking on the screen. Or can I program the application to save a number to a csv file whenever the user is looking at a specific object on the screen? Is this possible?

papr 09 October, 2018, 13:23:40

@user-0eef61 Your unity application can subscribe to and receive every type of data that Capture generates, e.g. gaze direction, pupil diameter, etc. You are free to visualize and/or save the data in the format of your choice.

user-0eef61 09 October, 2018, 13:25:20

@papr That's great. Glad to hear that. I was making sure before I purchased the htc vive add-on. Thanks for your reply.

papr 09 October, 2018, 13:27:23

Sure thing! Complete data access is one of the biggest advantages of being open source. 🙂

user-62af71 10 October, 2018, 02:25:23

May I throw in a quick question? I unplugged the Pupil eye tracker before I stopped recording, so I have a recording folder with .mp4 files but some .npy files missing. Is there a way to repair the folder? Thanks

papr 10 October, 2018, 07:07:32

@user-62af71 are you able to open the videos in VLC?

user-ce3b2e 10 October, 2018, 07:23:31

@papr I had the same situation as @user-62af71 two weeks ago, and sadly I couldn't repair the videos even with untrunc. I figured out that I can re-record with Pupil Capture if one uses the video file input in Pupil Capture. Then I have to close and reopen Pupil Capture, because the videos don't run synchronized before that; after loading the last settings they seem to be in sync. Is there any Pupil Player restore function (planned), since offline calibration basically just needs the videos (+ video timestamps)?

papr 10 October, 2018, 08:57:41

@user-ce3b2e This is a work-around that looks synced but really is not. The eye processes start with different delays -> no guarantee for a synced start. If the timestamp file was not found the file backend will create evenly spaced timestamps based on the average fps of the video. These cannot be correlated to timestamps generated by Capture.

@user-62af71 @user-ce3b2e Currently there is no supported/official way to repair broken recordings. Please understand that good gaze mapping requires precise timing. Interpolating timestamps works, yes, but it results in inaccurate gaze mapping.
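A sketch of the kind of interpolation described above (assumed, not the actual backend code), which also shows why it cannot recover real Capture timing:

```python
import numpy as np

n_frames = 9000   # hypothetical frame count of the recovered video
avg_fps = 30.0    # hypothetical average fps reported by the container

# Evenly spaced stand-in timestamps: fine for playback, useless for correlating
# with pupil/gaze timestamps that Capture would have recorded.
timestamps = np.arange(n_frames) / avg_fps
np.save("world_timestamps.npy", timestamps)
```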

user-af87c8 10 October, 2018, 11:08:44

Heyho! Is there an official way to describe which Pupil Labs eye/world cameras one has? Like a model number or the like?

wrp 10 October, 2018, 11:31:26

pupil_w120_e200b is a high-speed world camera with 200Hz binocular eye cameras. We include PIDs in an online catalog. I'll link it here.

user-af87c8 10 October, 2018, 11:32:10

Thanks, cool! We have an older Pupil Labs eye tracker; is there a list of all models, and how can we find out which one ours is?

wrp 10 October, 2018, 11:34:16

@user-af87c8 older models are not included here

wrp 10 October, 2018, 11:34:43

Send an image or email us at [email removed]

user-af87c8 10 October, 2018, 11:49:27

OK, I'll send some pictures. Thanks in advance!

user-c351d6 10 October, 2018, 13:09:21

Hi guys, I got a question from one of my students who is working with your eye tracker. Is it actually possible to show tracked surfaces in an exported video?

mpk 10 October, 2018, 14:13:04

@user-c351d6 currently not possible because these are drawn in opengl.

user-40e41e 10 October, 2018, 15:24:40

Hello, can I send markers to the recording?

user-40e41e 10 October, 2018, 15:24:52

so as to know the precise timing of the stimulus presentation?

papr 10 October, 2018, 15:31:43

@user-40e41e Yes, you can do this via annotations. See this example: https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py

user-40e41e 10 October, 2018, 15:33:13

Thank you papr! Can I ask how it is sent?

user-40e41e 10 October, 2018, 15:33:26

Because for true timing it should be in parallel.

papr 10 October, 2018, 15:34:48

Yes, that is correct. You need to add a timestamp to the annotation. We correlate it after the fact in Player based on this timestamp.

See our time sync protocol for how to synchronize time between Capture and other clocks: https://github.com/pupil-labs/pupil-helpers/tree/master/network_time_sync
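A condensed sketch of the linked example (assumes Pupil Remote on its default port and the annotation plugin active in Capture; the label is hypothetical):

```python
import time
import zmq
import msgpack

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")

# Query Capture's clock so the annotation timestamp lives in Pupil time.
req.send_string("t")
pupil_time = float(req.recv_string())

# Annotations are published to the IPC backbone; ask for the PUB port first.
req.send_string("PUB_PORT")
pub_port = req.recv_string()
pub = ctx.socket(zmq.PUB)
pub.connect("tcp://127.0.0.1:{}".format(pub_port))
time.sleep(0.2)  # give the PUB socket a moment to connect (slow-joiner caveat)

annotation = {
    "topic": "annotation",
    "label": "stimulus_onset",  # hypothetical label
    "timestamp": pupil_time,
    "duration": 0.0,
}
pub.send_string(annotation["topic"], flags=zmq.SNDMORE)
pub.send(msgpack.dumps(annotation, use_bin_type=True))
```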

user-40e41e 10 October, 2018, 15:35:36

Thank you! I'll have a look... I bet I will come back with more questions 😄

papr 10 October, 2018, 15:35:53

I am looking forward to them! 🙂

user-2c3334 10 October, 2018, 15:47:11

Hello, I just plugged in the headset

user-2c3334 10 October, 2018, 15:47:56

but when I open capture nothing appears. I get the error that calibration requires world video capture

user-2c3334 10 October, 2018, 15:49:44

capture_not_appearing

Chat image

wrp 10 October, 2018, 16:00:31

@user-2c3334 can you ensure that the cables are fully connected to the computer and the headset? The USB port on the headset sometimes requires a bit of force to fully connect the cable. Also please try restarting Pupil Capture with default settings.

user-8779ef 10 October, 2018, 19:59:12

Are there any published studies that characterize the pupil system's latency, accuracy, and precision? I'm looking for a peer reviewed citation.

user-8779ef 10 October, 2018, 20:00:08

And, yes, I understand completely that this changes with each version, calibration quality, etc. I'm motivating a project proposal, and need to convince the reader that the tracker is sufficiently low-latency, etc.

wrp 11 October, 2018, 08:34:48

@user-8779ef your question has responses in the research-publications channel

user-dbadee 11 October, 2018, 18:47:57

How do I open Pupil Capture on Windows? I can't find the .exe file or the icon that launches the software that the video tutorial mentions.

user-dbadee 11 October, 2018, 18:51:05

Do I need to download all the windows dependencies before I can do that?

user-a6a5f2 11 October, 2018, 19:03:49

Does the Pupil eye model algorithm depend upon, or work better with, eye motion, as opposed to when the eye is more or less stationary, e.g. during a vision test when a subject is asked to fixate on a point?

user-4580c3 12 October, 2018, 02:51:48

Hi there:)

user-4580c3 12 October, 2018, 02:51:58

I have a question

user-4580c3 12 October, 2018, 02:52:03

Chat image

user-4580c3 12 October, 2018, 02:52:25

What is the solution in this situation?

wrp 12 October, 2018, 02:53:50

@user-a6a5f2 the eye model does not require eye movement, once a model has been built.

wrp 12 October, 2018, 02:54:32

@user-4580c3 Was the eye camera showing up in the eye0 window before, or is this the first start of Pupil Capture?

user-4580c3 12 October, 2018, 02:57:43

@wrp

user-4580c3 12 October, 2018, 02:59:22

@wrp This is the equipment used. It used to be normal, but now it is abnormal.

wrp 12 October, 2018, 03:17:07

@user-4580c3 If the eye camera feed is not showing up in the eye0 window, please check the following:
1. Is the headset connected via USB? Sometimes the collar clip on the USB-C female connector requires a bit of force to ensure that the cable is fully connected.
2. Windows specific - are the drivers properly installed? Please run Pupil Capture with admin privileges. Do you see cameras connected in the libusbK category in the device manager?
3. Restart Pupil Capture with default settings (World window > General > Restart with default settings).

user-4580c3 12 October, 2018, 03:54:58

🤔

user-4580c3 12 October, 2018, 04:38:29

As in the picture, the window opens but does not run

user-4580c3 12 October, 2018, 04:38:36

Chat image

user-4580c3 12 October, 2018, 04:38:43

@wrp

wrp 12 October, 2018, 04:39:01

@user-4580c3 what version of Windows is this? Looks like Windows 7/8?

wrp 12 October, 2018, 04:39:17

Your graphics drivers need to be updated in order for Pupil Capture to run on your system.

user-4580c3 12 October, 2018, 04:39:22

@wrp win7

wrp 12 October, 2018, 04:39:23

Also - we only support Windows 10

wrp 12 October, 2018, 04:39:34

So, please try on Windows 10 machine if possible. We can only provide support for Win 10. While Pupil software might work on other versions of Windows, we do not officially support it.

user-4580c3 12 October, 2018, 04:40:43

Only win 10?

user-a6a5f2 12 October, 2018, 04:43:52

Thanks @wrp

wrp 12 October, 2018, 04:47:12

@user-4580c3 We only support Windows 10, correct.

user-4580c3 12 October, 2018, 04:47:52

@wrp 👌 😀 thanks

user-02665a 12 October, 2018, 06:55:03

Hey guys, a general question: we have both the Pupil Labs glasses and the Tobii Pro Glasses 2. As the Pupil Labs glasses showed some deficits in low-light environments, calibration, etc., we have somewhat switched to the Pro Glasses 2.

user-02665a 12 October, 2018, 06:55:32

One major drawback of that, however, is that there is only "natural features" AOI tracking instead of 2d-marker-based AOI tracking like Pupil Labs has.

user-02665a 12 October, 2018, 06:55:47

That is a lot less stable and requires huge amounts of manual post-processing.

user-02665a 12 October, 2018, 06:56:33

Is there a possibility to import raw Tobii gaze data and frames into Pupil Player, annotate AOIs (delimited by 2d markers), and have Pupil do the analysis? Has anyone heard of a project trying that?

mpk 12 October, 2018, 09:20:20

@user-02665a Maybe we can help with the low-light problems? This should be possible with Pupil Labs glasses.

mpk 12 October, 2018, 09:27:04

I'm not sure if I understand the last paragraph of your post...

papr 12 October, 2018, 09:29:35

@user-dbadee You do not need to download the dependencies.
1. Download the bundle here: https://github.com/pupil-labs/pupil/releases
2. Unzip using 7z
3. Run Capture.exe with administrator rights

user-5d123d 12 October, 2018, 14:16:45

Why can't I use the computer camera when I open the Pupil software?

user-5d123d 12 October, 2018, 14:16:48

user-02665a 12 October, 2018, 15:32:33

@mpk sorry I was a bit in a hurry. I meant... Is there a project that allows importing tobii gaze data into pupil labs for offline area of interest analysis?

user-dbadee 12 October, 2018, 18:23:49

@papr I haven't been able to locate the Capture.exe file, where is it?

user-dbadee 12 October, 2018, 18:24:07

What are the dependencies necessary for, then?

papr 13 October, 2018, 08:58:06

@user-dbadee what did you download exactly?

papr 13 October, 2018, 08:58:41

@user-02665a I am not aware of such a project, sorry

mpk 13 October, 2018, 09:55:23

@user-02665a If you can export video with timestamps and gaze data in an open format, you could convert it so it can be opened in Player.

user-02665a 14 October, 2018, 09:32:25

That's what I was thinking too, and wondering if someone had maybe already done it. Otherwise it's going to be the perfect job for my student assistant 😄

user-121273 15 October, 2018, 09:22:26

Hi everybody, does anyone know which IR sensor is built into the eye camera? What are the wavelength (nm) and intensity of the emitted light, and how dangerous is it for the eyes?

papr 15 October, 2018, 09:22:59

@user-121273 Are you referring to the IR emitting LEDs?

user-121273 15 October, 2018, 09:23:46

I think yes, that should be the only IR source.

wrp 15 October, 2018, 09:24:35

@user-121273 the IR LED emitters have been tested and meet human safety requirements

user-121273 15 October, 2018, 09:27:37

ok, that's reassuring for now 😃 Is there something like a sensor description, or is there a specification for the emitted light intensity?

wrp 15 October, 2018, 09:28:25

Hi @user-b0b274 please email info@pupil-labs.com and ask for Photobiological test report summary

wrp 15 October, 2018, 09:28:31

and we can share information with you

user-121273 15 October, 2018, 09:28:46

ok thank you=)

user-20de15 15 October, 2018, 12:27:07

Chat image

user-20de15 15 October, 2018, 12:27:17

Anyone know what it could be? (It doesn't open.)

papr 15 October, 2018, 12:29:43

@user-20de15 This means that the optimization_calibration module was not built correctly. This is most likely due to an incomplete dependency setup.

May I ask why you are running from source? This setup is very difficult on Windows and is often not required. We recommend running the bundled application.

user-20de15 15 October, 2018, 12:42:23

Where can I get this? I can't find it.

user-20de15 15 October, 2018, 12:47:50

I'm running pupil_capture.exe, I didn't download the source

papr 15 October, 2018, 12:52:37

That's what I meant by bundle

papr 15 October, 2018, 12:53:03

@user-20de15 What CPU do you have?

papr 15 October, 2018, 13:05:52

@user-20de15 Did you download Capture from this page: https://github.com/pupil-labs/pupil/releases ?

user-9a4baa 15 October, 2018, 13:59:35

Hello, is there a set method to correct fisheye distortion in Pupil Player? Otherwise, is it recommended to correct it manually? I see there is the file world.intrinsics that presumably has the matrices needed for the correction - does anyone know the file format or how to load it into Python?

papr 15 October, 2018, 14:06:59

@user-9a4baa It is loaded automatically in Player and it is used to map e.g. gaze correctly.

You can use this function to load the file: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/file_methods.py#L60-L75

You can also use https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/camera_models.py#L62 to load the intrinsics file into a class that is able to project/unproject points based on the loaded intrinsics
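A sketch of reading the file directly with the linked load_object helper (paths hypothetical; the key layout is an assumption based on the camera_models.py source of that era):

```python
import sys
sys.path.append("path/to/pupil/pupil_src/shared_modules")  # hypothetical checkout path

from file_methods import load_object

intrinsics = load_object("path/to/recording/world.intrinsics")
# Entries are keyed by resolution string; each holds the camera matrix and
# distortion coefficients needed to undistort points or images.
for key, model in intrinsics.items():
    if isinstance(model, dict):
        print(key, model.get("cam_type"), model.get("camera_matrix"))
```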

user-9a4baa 15 October, 2018, 15:19:23

Hi @papr thank you for your reply!

user-babd94 16 October, 2018, 08:59:19

Hello, I am using Pupil Capture, but the lower half of the world camera image turns gray. I am attaching an image. This does not always happen; I think it occurs mainly when looking at a computer display. Can you tell me how to solve it?

Chat image

papr 16 October, 2018, 09:00:31

@user-babd94 Just for clarification: The top part of the images changes according to the camera movement but the lower part of the image stays gray?

user-babd94 16 October, 2018, 09:03:23

@papr Yes. Also, the height of the gray part fluctuates irregularly.

user-babd94 16 October, 2018, 09:08:02

Sorry, I forgot to tell you that we are using Logicool's C525 webcam as the world camera.

papr 16 October, 2018, 09:09:09

@user-babd94 It looks like there is not enough USB bandwidth to transmit complete frames.

mpk 16 October, 2018, 09:10:04

@user-babd94 this camera must send bigger mjpeg frames. This could be fixed with a change in the source code. Can you run Pupil from source?

papr 16 October, 2018, 09:10:32

Initially it should help to lower the resolution as well, I think.

user-babd94 16 October, 2018, 09:18:06

@mpk Is it the procedure at the link below? I am able to do that.

user-babd94 16 October, 2018, 09:26:55

https://docs.pupil-labs.com/#run-pupil-capture-from-source

papr 16 October, 2018, 09:27:14

Yes, this is the correct link

user-babd94 16 October, 2018, 10:30:08

I was able to run from source. Where in the source code can I make the change to send bigger mjpeg frames?

user-af87c8 16 October, 2018, 11:21:55

@user-babd94 We had very similar problems. Do you by any chance have a very long USB cable or a USB extension? Do you have the USB 3 hub (clip at the eye tracker) or USB 2?

user-babd94 16 October, 2018, 13:03:04

@user-af87c8 I disassembled a commercial webcam and connected it to the cable of the Pupil Labs eye tracker.

user-babd94 16 October, 2018, 13:04:20

The eye tracker and computer are connected with one USB Type-C cable.

user-af87c8 16 October, 2018, 13:20:31

Have you tried a different USB bus (different USB card)? We had trouble with our Dell T1500 (I think); its USB bus did not work without this error.

user-af87c8 16 October, 2018, 13:21:16

Ah sorry, I missed that you have a custom webcam. Sorry! Then what mpk said is definitely more appropriate 😃

user-babd94 16 October, 2018, 13:26:08

@user-af87c8 Thank you for the answer. I will try your method, too.

user-76a852 17 October, 2018, 01:36:07

hi there

user-76a852 17 October, 2018, 01:38:22

What is the solution in this situation?

Chat image

wrp 17 October, 2018, 01:40:07

@user-76a852 please start Capture with admin privileges

wrp 17 October, 2018, 01:46:18

Right click and run as admin

user-76a852 17 October, 2018, 01:48:30

I can't find "run as admin"

wrp 17 October, 2018, 01:49:04

Perhaps you do not have admin privileges on your windows machine

user-d45407 17 October, 2018, 15:35:40

Hi, I am having trouble getting the player to work. I put the file in but it says this:
```
player - [INFO] fixation_detector: Gaze postions changed. Recalculating.
player - [WARNING] offline_surface_tracker: No gaze on any surface for this section!
```

user-d45407 17 October, 2018, 15:36:26

It repeats it a couple of times and then doesn't output anything.

papr 17 October, 2018, 15:46:07

@user-d45407 Did you record the recording with Pupil Mobile? In this case you need to use Offline Calibration to generate gaze data.

user-d45407 17 October, 2018, 15:46:57

@papr I didn't use pupil mobile.

user-d45407 17 October, 2018, 15:47:57

I am trying to make surfaces appear using the surface detector

papr 17 October, 2018, 15:48:37

Can you send a screenshot of your surface?

user-d45407 17 October, 2018, 15:49:00

my backend manager was pupil mobile! I changed that and am going to see if that works!

user-d45407 17 October, 2018, 15:52:16

Alright, that didn't work. Here is the screenshot.

Chat image

papr 17 October, 2018, 15:52:52

Your markers are very small! Try to reduce the "min_marker_perimeter" value in the surface tracker plugin

user-d45407 17 October, 2018, 15:56:58

I moved it down to 30 and I am still getting the error. It also comes up if I toggle the plugin off.

papr 17 October, 2018, 16:09:19

The warning is independent of the detection! Let's solve the detection first.

user-d45407 17 October, 2018, 16:11:51

Alright.

user-d45407 17 October, 2018, 16:15:51

I tried loading an older recording that had worked, and it didn't come up either. I think that I might have to reinstall player.

user-d45407 17 October, 2018, 16:26:45

here is a screen shot of the player. this is all I see for a while:

Chat image

papr 17 October, 2018, 16:27:09

Did you different markers maybe? Try adding a new surface

papr 17 October, 2018, 16:27:23

Is this the only window that comes up?

user-d45407 17 October, 2018, 16:27:28

yes

papr 17 October, 2018, 16:28:09

Please close Player, go to the pupil_player_settings folder, and delete the user settings files

user-d45407 17 October, 2018, 16:33:39

done

user-d45407 17 October, 2018, 16:36:49

alright, it is up and running!

papr 17 October, 2018, 16:40:35

Next step: gaze. You either need gaze data in the recording or you need to use offline calibration.

user-d45407 17 October, 2018, 16:44:48

I have been using offline calibration. It is easier to adjust the mappings that way.

user-e7102b 17 October, 2018, 18:05:06

Hi, I'd like to try running pupil mobile with our pupil headset and a Google Pixel cellphone. I have the headset connected to the Pixel via the standard cable and a USB A > C converter. I've installed the pupil mobile app successfully, but when I run it the pupil headset doesn't appear to be recognized. Does anyone have insight into why this might not be working? Perhaps either the cellphone or adapter are not compatible? Thank you.

mpk 17 October, 2018, 19:09:03

Check if you phone needs USB otg mode explicitly enabled. This is the case for a few phones.

user-e7102b 18 October, 2018, 01:47:21

Thanks, however I've searched through the phone settings and can't find an option to explicitly enable/disable OTG on the Google Pixel. A google search didn't reveal anything either.

wrp 18 October, 2018, 01:52:56

@user-e7102b the issue here might be the cable that you are using. Not all USBC-USBC cables are compliant with USBC spec. This cable - https://www.amazon.com/CHOETECH-Hi-Speed-Compatible-Devices-Including/dp/B017W2RWB8/ref=sr_1_7?ie=UTF8&qid=1539827514&sr=8-7&keywords=choetech+usbc+to+usbc should work well.

user-e7102b 18 October, 2018, 01:59:38

Yeah, I suspect the cable or cable adapter may be causing problems. I've ordered the cable - thanks for the suggestion!

wrp 18 October, 2018, 03:51:00

You're welcome @user-e7102b, looking forward to feedback. I know that there are some other users in the community using Pixel devices with Pupil Mobile successfully, but we do not have first-hand experience with the device, so we cannot give any guarantees.

user-82f104 18 October, 2018, 14:56:21

could it be that the 3d eye model has a problem with stereotypical left-right eye movements? My experiment relies on the 3d normal vectors from pupil combined with my own calibration outside of pupil software (with a head tracking rig). I feel like after a while of doing the stereotypical left-right saccades that are required in the experiment the fixation slowly drifts away from the fixation point and after a while I get fixation breaks. When I move my eye in all directions for a few seconds I feel like the problem is alleviated for a little bit. This makes me suspect that the ever-updating eye model can not perfectly deal with movement that only goes in one dimension and it gets worse during the experiment. I would like to just fix the model parameters after calibration but there is no option for that..

mpk 18 October, 2018, 15:03:49

@user-82f104 this could be an issue, if the movements are very constrained the 3d model fitter could get mislead. We are currently working on improving the pipeline. Could you share a recording with eye videos for debugging?

user-82f104 18 October, 2018, 15:07:41

hmm I think for some reason pupil capture never showed me the recording options at all. I've never dealt with that problem because I didn't need that functionality anyway.. If I can make that work I could send some eye videos

mpk 18 October, 2018, 15:08:42

The recording option can be found in the world window. There is an 'R' icon on the left.

user-d45407 18 October, 2018, 15:45:41

So I am trying to determine how often a person looks at a specific surface and how long they look at it. I am able to export the data, but the data isn't in a usable form. Is there an easy way to convert it? I am thinking a regular expression might work, but if there is already something available I would prefer to use that.

user-dbadee 18 October, 2018, 18:43:22

I am confused as to how I can run the capture software

user-dbadee 18 October, 2018, 18:43:39

What is the file I need to run on Linux or Windows?

papr 18 October, 2018, 18:47:08

@user-dbadee there should be a binary called pupil_capture in your PATH on linux after installing it. On Windows simply run the pupil_capture.exe in the extracted folder

papr 18 October, 2018, 18:51:05

@user-d45407 What do you mean by unusable form? Are you referring to the multi-line matrices? These are in quotes and are therefore valid csv files. What tool/framework did you use to parse them?

user-dbadee 18 October, 2018, 19:36:08

@papr Figured it out, thank you! Also, how accurate should the calibration get us? Is it better to have the calibration box large (i.e. go close to the screen when calibrating)?

user-4580c3 19 October, 2018, 01:52:55

Chat image

user-4580c3 19 October, 2018, 01:53:03

What is the yellow line?

wrp 19 October, 2018, 02:49:51

Hi @user-4580c3 After a calibration, the Accuracy Visualizer plugin shows with a green polygon the area in which you calibrated. Clusters of points show the position of the detected marker/calibration reference point and the position of the estimated gaze position. Yellow/orange lines connect the detected marker/reference point and the estimated gaze position. Shorter yellow lines mean less disparity between the two (good mapping); longer lines mean more disparity between the two (less accurate mapping).

user-68d457 19 October, 2018, 14:32:13

@papr Hi, running Capture/Player v1.8.26 on Win 10 64-bit - when capturing audio alongside the video recordings, I am getting a sync issue when I subsequently load them in Player - the audio plays back in time (i.e. it matches what's happening in the waveform display), but the world video lags by ~1.5 seconds. The lag is still there after exporting. Grateful for any advice. Thanks, Ian.

PS. Player is also crashing if the recording is allowed to play through to the end:
```
Traceback (most recent call last):
  File "launchables\player.py", line 464, in player
  File "shared_modules\audio_playback.py", line 337, in recent_events
IndexError: index 433 is out of bounds for axis 0 with size 433
```

user-8779ef 19 October, 2018, 18:16:51

So, how long until you guys integrate this new Occipital Structure Core module into your build?? 😃 😃 😃

user-8779ef 19 October, 2018, 18:17:07

Chat image

user-8779ef 19 October, 2018, 18:17:18

Chat image

user-8779ef 19 October, 2018, 18:18:07

They claim "lighthouse accurate" inside-out tracking that works outdoors.

papr 19 October, 2018, 18:23:37

Patent-pending.

user-8779ef 19 October, 2018, 21:45:42

Is it funny that I can't tell if you're being sarcastic?

user-e7102b 19 October, 2018, 23:36:10

@wrp The cable you suggested arrived. I used it to connect the pupil headset to the pixel cellphone, but the headset is still not recognized. Do you have any other suggestions? Thanks

user-c828f5 20 October, 2018, 01:00:16

Hey guys, I wanted to hear about your experience in dealing with these situations. The 3D model is perfect. The parameters in Pupil Player are spot on. But the fit seems to break when the pupil moves towards the periphery. What are the possible causes for this?

Chat image

user-c828f5 20 October, 2018, 01:04:16

Another example

Chat image

user-9fc5a0 21 October, 2018, 17:52:15

Hey guys, is the Samsung S9 supported for remote recording? And can I use another laptop to record remotely (how)?

papr 21 October, 2018, 17:53:20

@user-9fc5a0 See the list of officially supported hardware: https://github.com/pupil-labs/pupil-mobile-app/#supported-hardware

user-4580c3 22 October, 2018, 01:38:21

Hi there :) Do you have a Pupil Labs function guide manual?

user-738c1f 22 October, 2018, 01:45:23

Hi guys, I have a question. Do you know of papers which used the Pupil Labs eye tracker? They are very hard to find.

user-738c1f 22 October, 2018, 01:47:05

@user-452421EWANKWON You can find the function guide manual in the docs on the Pupil Labs homepage.

user-4580c3 22 October, 2018, 01:48:20

@user-738c1f thanks:)

papr 22 October, 2018, 05:31:38

@user-738c1f these docs also include a link to our public spreadsheet that lists all publications citing Pupil

user-738c1f 22 October, 2018, 06:34:46

@papr yes, i missed it. Thank you so much

user-bcdff7 22 October, 2018, 13:20:35

Hi, I am interested in the eye camera (200Hz binocular) for neuroscience research. However, we need millisecond precision and locking between pupil dilation and other physiological measurements (e.g., EEG and HR), either by timestamps or by triggering. We work in a MATLAB environment at the moment and would like to keep it like that. Does anybody have experience with this camera and this kind of research, and do you think it would work for that?

mpk 22 October, 2018, 15:52:49

@user-bcdff7 Precision should be within your requested range. Latency is between 10 and 20 ms if you need the data in MATLAB, though the Pupil Labs-MATLAB interface does introduce jitter. Do you need the data in real time?

user-bcdff7 22 October, 2018, 16:07:06

Thanks for the answer! We do not need it in real time. I guess the jitter is smaller than the 10-20 ms latency, right? Also, will this change dramatically if I switch to Python? I mean, where do this jitter and delay come from? Are they MATLAB-specific or are they coming from the USB hub? If it is the latter, it should not matter whether it is Python or MATLAB, right?

papr 22 October, 2018, 16:08:26

All data can be exported to csv with Pupil Player. There are multiple users that import this data into their matlab scripts

user-c3ba3c 22 October, 2018, 16:12:25

Hi, I'm having trouble importing some data into Pupil Player.

user-c3ba3c 22 October, 2018, 16:12:41

```
MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
player - [ERROR] player_methods: No valid dir supplied (pupil_player.exe)
player - [INFO] launchables.player: Session setting are from a different version of this app. I will not use those.
player - [INFO] launchables.player: Starting new session with 'D:\glassesTest\data\pupil-labs\2018_10_18\009'
player - [INFO] player_methods: Updating meta info
player - [INFO] player_methods: Updating recording from v1.3 to v1.4
player - [INFO] player_methods: Updating meta info
player - [INFO] player_methods: Checking for world-less recording
player - [INFO] player_methods: Updating recording from v1.4 to v1.8
player - [ERROR] launchables.player: Process player_drop crashed with trace:
Traceback (most recent call last):
  File "launchables\player.py", line 646, in player_drop
  File "shared_modules\player_methods.py", line 233, in update_recording_to_recent
  File "shared_modules\player_methods.py", line 592, in update_recording_v14_v18
  File "shared_modules\file_methods.py", line 112, in _next_values
  File "msgpack_unpacker.pyx", line 501, in msgpack._unpacker.Unpacker.unpack
  File "msgpack_unpacker.pyx", line 461, in msgpack._unpacker.Unpacker._unpack
TypeError: unhashable type: 'dict'
```

user-c3ba3c 22 October, 2018, 16:12:49

Any suggestions?

user-c3ba3c 22 October, 2018, 16:13:29

Windows 8.1, recorded with pupil capture 1.3.13, tried multiple versions of player, none worked

user-2f4be1 22 October, 2018, 18:28:40

Hi, I'm trying to download capture. I've downloaded the bundle from https://github.com/pupil-labs/pupil/releases, unzipped using 7z... but I'm only finding pupil_capture.exe.manifest. Where is the pupil_capture.exe file?

wrp 23 October, 2018, 00:00:15

@user-2f4be1 once you extract the files with 7zip, you run pupil_capture.exe

user-03389c 23 October, 2018, 13:56:30

@wrp I'm not sure if I've downloaded the wrong package. All I can find in the extracted file is pupil_capture.exe.manifest

user-324a3b 24 October, 2018, 01:52:09

Hi,

user-324a3b 24 October, 2018, 01:55:00

On a Windows source build ("python setup.py build") I have a problem. I've followed the instructions carefully, and when I run setup.py I get this error: fatal error C1007: unrecognized flag '-Ot' in 'p2'. How do I solve it?

wrp 24 October, 2018, 01:57:04

@user-03389c Please go to https://github.com/pupil-labs/pupil/releases/latest and use 7zip on Windows 10 to extract the archive; then you will see the exe.

wrp 24 October, 2018, 01:57:56

@user-324a3b is building from source necessary for you? Could you use plugins or subscribe to the API instead?

wrp 24 October, 2018, 01:58:44

I ask because building from src on Windows is quite tricky

user-324a3b 24 October, 2018, 02:02:54

I have to use the Pupil source on Windows, because I am trying to merge in other code.

wrp 24 October, 2018, 10:12:49

@user-324a3b you might want to check discussion in software-dev regarding windows dependency setup

user-324a3b 24 October, 2018, 10:49:19

@wrp okay, i will go there.

user-a49f5b 24 October, 2018, 16:03:29

Hi, I'm working on an experiment to track eye motion within a workstation (i.e. a kitchen counter and cabinet setup).

So my question is: if I know the dimensions of the workspace, do I need to stand back far enough to calibrate the entire workstation? Or is a calibration in a small area (e.g. 5'x5') enough for me to track eye movement in the larger (e.g. 10'x10') workstation?

mpk 24 October, 2018, 16:58:54

@user-a49f5b the latter should apply.

user-a49f5b 24 October, 2018, 17:02:17

@mpk Thanks. Now, would manual markers be better, or would utilizing the surface markers be better?

mpk 24 October, 2018, 17:04:20

@user-a49f5b you would use the circle markers for calibration and the surface tracker to track AOIs and generate heatmaps.

user-a49f5b 24 October, 2018, 17:10:05

AOIs = Areas of Interest? And can I not calibrate with the surface markers?

papr 24 October, 2018, 17:36:06

@user-a49f5b AOI = Area of Interest, correct. The surface markers do not have a clear point of fixation. That is why we use the circle markers for calibration.

user-a49f5b 24 October, 2018, 17:52:04

@papr Thank you

user-dbadee 24 October, 2018, 21:26:00

Is the field of vision as seen by the world camera during calibration the only area that the gaze tracking will be accurate during a recording? In other words, if I were to look around during a recording and the world camera sees other things from when we were calibrating, will the accuracy go down?

user-910385 24 October, 2018, 22:19:47

Hi, any time I create a recording on a phone, I'm unable to see fixations in the Player (although I can see them clearly in Capture). I have tried using the offline fixation detector - is there something I could be missing? The data doesn't seem to be showing up in the Player.

user-2798d6 25 October, 2018, 00:27:22

Hello! Could someone tell me which repository to watch on GitHub so I know when a new version is out? I'd like an email notification about new software, but not about all of the other issues and pull requests. Is that possible? Thank you so much!

wrp 25 October, 2018, 02:46:49

@user-910385 are you saving the recordings with Pupil Capture or Pupil Mobile?

wrp 25 October, 2018, 02:53:40

@user-2798d6 you should watch https://github.com/pupil-labs/pupil - however you will get notifications for more than releases as you note. There are two options that might work for you, both are RSS subscriptions: 1. Subscribe to Pupil github releases feed here: https://github.com/pupil-labs/pupil/releases.atom 2. Subscribe to Pupil Labs blog feed here: https://pupil-labs.com/feed.xml (includes other posts in addition to release announcements)

There are also third party tools to monitor only releases like https://newreleases.io/ (but this would require an auth with a third party service - disclosure I have not tried/used any of the third party services, but found them mentioned in this rather long issue about being able to subscribe/be notified about releases only).

Hope this helps 😸

wrp 25 October, 2018, 02:56:02

@user-dbadee You do not need to keep your head stabilized to maintain accuracy. You can move your head, walk around, etc and still achieve high accuracy gaze mapping results with Pupil.

user-910385 25 October, 2018, 03:27:35

@wrp i'm saving the recordings on pupil mobile

wrp 25 October, 2018, 03:47:38

@user-910385 did you already perform offline pupil detection and offline gaze mapping in Pupil Player?

user-324a3b 25 October, 2018, 04:21:14

@wrp do you have Visual Studio 2017 Preview version 15.3 ?

user-aa11b1 25 October, 2018, 11:34:22

Hello, how accurate is gaze tracking across the population? Have you performed any studies on this? The color and shape of the iris and the shape of the eyeball are pretty different between people; no eyeball is a perfect sphere.

user-68d457 25 October, 2018, 13:45:43

Hi @papr. An update on the AV sync issue (running on Win10) that I messaged about recently: Looking at audio_timestamps.npy and world_timestamps.npy, the first audio timestamp was ~1.5 s earlier than the first world timestamp, and the duration (last timestamp minus first timestamp) was slightly longer for the audio than for the world video. I tried adding a fixed offset to all audio timestamps so that the first audio timestamp became equal to the first world timestamp. When I loaded the recordings back into Player (now with the modified audio_timestamps.npy), the AV sync was perfect + Player no longer crashed with the index out of bounds error when I allowed the recording to play through to the end. Hope this helps in identifying the issue. Cheers, Ian.
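For anyone hitting the same problem, the one-off fix described above amounts to something like this (run inside the recording folder; back up the original file first):

```python
import numpy as np

audio_ts = np.load("audio_timestamps.npy")
world_ts = np.load("world_timestamps.npy")

# Shift all audio timestamps so the first audio sample aligns with the first
# world frame (the ~1.5 s offset described above).
audio_ts += world_ts[0] - audio_ts[0]
np.save("audio_timestamps.npy", audio_ts)
```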

user-c351d6 25 October, 2018, 13:51:37

Hi, just a short question. In the documentation about the offline surface tracker you explain the following: fixations_on_surface_<surface_name>_<surface_id>.csv - a list of fixations that have occurred on the surface. I'm a bit confused: is it not just a list of fixations which occurred while the surface was being tracked, with the column "on_srf" giving the information whether the fixation was on the surface?

user-af87c8 25 October, 2018, 14:15:36

@user-aa11b1 (not a Pupil Labs employee) You could check out this comparison of 12 eye trackers (Pupil Labs not included) from Holmqvist: https://www.researchgate.net/publication/321678981_Common_predictors_of_accuracy_precision_and_data_loss_in_12_eye-trackers; our research group has a current comparison in a homogeneous subject pool and gets an accuracy of 1.17 degrees (25th/75th percentile: 0.97/1.38), but using the 2D algorithm.

user-aa11b1 25 October, 2018, 14:19:07

2d algorithm?

user-b23813 25 October, 2018, 14:39:04

Hi @papr I am interested only in collecting data about pupil diameter and blinks. Can I use the eye tracker without calibration, or would that somehow affect the pupil diameter and blink measurements? Thank you.

papr 25 October, 2018, 14:39:36

@user-b23813 You do not need calibration for that.

papr 25 October, 2018, 14:41:38

@user-aa11b1 He is referring to the 2d pupil detection + gaze mapping algorithm that we use in Pupil

user-b23813 25 October, 2018, 14:43:17

Thanks @papr

user-aa11b1 25 October, 2018, 14:46:34

That tells the angle of the eye, right?

papr 25 October, 2018, 14:47:43

Not exactly. The 2d detection result only gives the pupil ellipse in eye image coordinates. If you want 3d vectors you need the 3d mapping approach.

papr 25 October, 2018, 14:48:18

See our technical report for details on the 2d algorithm: https://pupil-labs.com/blog/2014-05/pupil-technical-report-on-arxiv-org/

user-aa11b1 25 October, 2018, 14:48:44

What's the use case of the former? Will it suffice for the HMD add-ons to know where on the screen the user is looking?

papr 25 October, 2018, 14:49:28

Please see the hmd-eyes project for details on how to calibrate HMDs.

user-2dca50 25 October, 2018, 16:05:08

I can't export my pupil data to CSV; it gives only the headers and no data. I'm running Windows 7 with only a single eye camera. Capture works: it sees the pupil on camera, shows the red circle and the 3D model, and shows the pupil diameter. Any ideas why I can't export? Does anyone have a small raw eye-data file I could use to check whether I can export? I need to find out if the problem is in Capture or in Player/export.

user-dbadee 25 October, 2018, 18:19:27

Is the field of vision as seen by the world camera during calibration the only area that the gaze tracking will be accurate during a recording? In other words, if I were to look around during a recording and the world camera sees other things from when we were calibrating, will the accuracy go down?

papr 25 October, 2018, 19:02:14

@user-dbadee Calibration is relative to the field of view of the world camera, not relative to the actual part of the world you are looking at.

papr 25 October, 2018, 19:03:45

@user-2dca50 1) we only support Windows 10 2) please share the recording with data@pupil-labs.com

user-910385 25 October, 2018, 21:11:50

@wrp I performed offline pupil detection and still nothing shows as far as fixations. I can't find where offline gaze mapping is in Pupil Player - any help there?

papr 25 October, 2018, 21:13:22

@user-910385 check out the Gaze From Recording plugin and select offline calibration

user-41643f 25 October, 2018, 21:31:37

Can someone point me at a conversation about object recognition?

papr 25 October, 2018, 21:40:36

@user-41643f This is rarely a topic in this channel. If you need gaze mapping on regions of interest: Check out our surface tracker

user-76218e 26 October, 2018, 02:43:54

Hi, I found filter_messages.py in pupil-helpers. With that script we can read data for, e.g., pupil.0. I was wondering if we could read only the diameter of pupil.0, since the printed message is too much for me and I only need some of the fields. Thank you.

user-14d189 26 October, 2018, 05:01:03

Hi there. I have a question about your fixation detection. We work with velocity, and therefore I would need the time frame you consider for your dispersion threshold. What do you use there?

user-14d189 26 October, 2018, 05:02:04

And about your online explanation of the minimum duration:

user-14d189 26 October, 2018, 05:02:22

"Minimum Duration (temporal, milliseconds): The minimum duration in which the dispersion threshold must not be exceeded."

user-14d189 26 October, 2018, 05:03:43

Is the dispersion measured between consecutive points or relative to the initial point?

user-4580c3 26 October, 2018, 05:08:12

Hi

user-4580c3 26 October, 2018, 05:09:05

AOI Sequence >include function?

papr 26 October, 2018, 06:04:12

@user-76218e No, there is no way to subscribe to diameter only. I suggest you subscribe to pupil, access the diameter field and discard the rest.
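A minimal sketch of that suggestion (assuming Pupil Remote on its default port):

```python
import zmq
import msgpack

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:{}".format(sub_port))
sub.setsockopt_string(zmq.SUBSCRIBE, "pupil.0")

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.loads(payload, raw=False)
    # Keep only the fields you care about, discard the rest.
    print(datum["timestamp"], datum["diameter"])
```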

papr 26 October, 2018, 06:09:24

@user-14d189 The minimum duration can be set by the user. Unfortunately, I don't know the default off the top of my head right now. Dispersion is measured as the maximum distance between all samples collected in the minimum duration window. We remove some outliers though, so "all samples" is not 100% correct.
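As a toy illustration of that definition (outlier removal omitted; gaze samples as 2d points, e.g. in visual degrees):

```python
import numpy as np
from scipy.spatial.distance import pdist

def max_dispersion(points):
    """Maximum pairwise distance of the gaze samples in one duration window."""
    return pdist(np.asarray(points)).max()

window = np.array([[0.0, 0.0], [0.3, 0.1], [0.1, 0.4]])
print(max_dispersion(window))  # a fixation if this stays below the threshold
```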

papr 26 October, 2018, 06:11:00

@user-4580c3 could you elaborate on your question?

user-14d189 26 October, 2018, 10:02:58

Thanks @papr! As I understand it, fixations can be determined by 3 characteristics: minimum length in time, maximum length in time, and staying below a maximum velocity / below a maximal dispersion over time. I believe that all of those characteristics are addressed in your offline fixation detector and that they are independent.

papr 26 October, 2018, 10:04:31

Correct

user-14d189 26 October, 2018, 10:04:31

Or in other words, the gaze needs to hover around a narrow spot for a minimum amount of time to be a fixation, and cannot do so for longer than the maximum.

papr 26 October, 2018, 10:05:20

The maximum time constraint is kind of artificial though, and helps us classify fixations much faster.

papr 26 October, 2018, 10:07:20

In other words: the relevant constraints are minimum time and maximum dispersion.

user-14d189 26 October, 2018, 10:09:27

And the dispersion indicates the small area in degrees. And to have a comparison to velocity, I need the time. If it is calculated between data points, then the eye recording frequency is the base for the calculation, isn't it?

user-14d189 26 October, 2018, 10:11:49

Quote: In other words: the relevant constraints are minimum time and maximum dispersion.

papr 26 October, 2018, 10:12:19

Not exactly. Each data point has a timestamp. The difference between them is used as the base for duration.

user-14d189 26 October, 2018, 10:12:34

I have to verify that. Reducing the dispersion should have the same effect as reducing the minimum time.

papr 26 October, 2018, 10:12:54

The reason is that we do not assume a fixed rate since we remove low confidence data points first

papr 26 October, 2018, 10:14:06

I don't think they are exactly equivalent since estimation noise affects dispersion while duration is very exact

user-14d189 26 October, 2018, 10:15:57

ah good to know.

user-14d189 26 October, 2018, 10:18:37

So you would say the dispersion angle is like a cone that includes all eye movements around an area, and is not exactly comparable to velocity based algorithms.

papr 26 October, 2018, 10:19:46

Correct.

user-14d189 26 October, 2018, 10:20:21

Thanks for filling me in. Makes more sense now.

user-14d189 26 October, 2018, 10:22:13

Do you by chance know if there is a velocity-based fixation algorithm available, or who might have already done one?

papr 26 October, 2018, 10:23:59

I know that there were some users implementing their own detection algorithms, but I don't know if they published their work already

user-14d189 26 October, 2018, 10:25:40

@papr cheers! I'll keep looking, and in the meantime I have a little Matlab script.
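
For anyone hunting for a starting point, here is a toy velocity-threshold (I-VT style) sketch in Python. It is purely illustrative and not Pupil's detector; the threshold values and input layout are assumptions you would need to adapt.

```python
import numpy as np

def ivt_fixations(gaze_deg, timestamps, velocity_threshold=30.0, min_duration=0.1):
    """Classify fixations: consecutive samples whose inter-sample velocity
    stays below velocity_threshold (deg/s) for at least min_duration seconds.

    gaze_deg: (N, 2) gaze angles in degrees; timestamps: (N,) seconds.
    Returns a list of (start_time, end_time) tuples.
    """
    dt = np.diff(timestamps)
    velocity = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) / dt
    slow = velocity < velocity_threshold

    fixations, start = [], None
    for i, is_slow in enumerate(slow):
        if is_slow and start is None:
            start = i
        elif not is_slow and start is not None:
            if timestamps[i] - timestamps[start] >= min_duration:
                fixations.append((timestamps[start], timestamps[i]))
            start = None
    if start is not None and timestamps[-1] - timestamps[start] >= min_duration:
        fixations.append((timestamps[start], timestamps[-1]))
    return fixations
```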

user-9a4baa 26 October, 2018, 13:26:27

Is there any documentation on minimum spec requirements for the computer? I am running a binocular 200 Hz Pupil Labs headset while capturing 30 Hz 1920x1080 world video, using macOS 10.14 Mojave on a late-2013 Intel i7-4558U 2.8 GHz. The only app I have running in the foreground is Pupil Capture, and my frame rate varies from about 12 to 29 fps. I want to guarantee solid performance if we need to buy a new PC. Any links to documentation? I googled and searched the Pupil Labs website and could not find a single mention of processor requirements.

user-af87c8 26 October, 2018, 13:32:49

@user-14d189 we implemented one (or rather used an implementation based on Tobias Knappen's work): github.com/behinger/etcomp. The pipeline is not polished yet (first the paper, then the polishing :-)). Basically you want to run this with et = 'pl': https://github.com/behinger/etcomp/blob/master/code/functions/et_preprocess.py. You would likely need pupil-src compiled; we have a make file, try it out 😉 (you need Pupil compiled for some recommended steps: recalibration, surface mapping). You can likely skip some steps of the pipeline, but we should probably discuss this offline. Once you have Pupil Labs calibrated data as a pandas DataFrame (gx, gy, and an is_blink column), you could use https://github.com/behinger/etcomp/blob/master/code/functions/detect_saccades.py directly.

user-af87c8 26 October, 2018, 13:34:25

Just FYI, this is not a plugin; everything is standalone, completely separate from any GUI Pupil Labs uses.

user-af87c8 26 October, 2018, 13:37:42

Second FYI: we use interpolation to get a constant sampling rate. I think this is not strictly necessary, but it was easier for us.
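
A minimal sketch of that kind of resampling, using numpy's linear interpolation (the variable names are illustrative, not from the etcomp code):

```python
import numpy as np

def resample_to_fixed_rate(timestamps, gx, gy, rate_hz=200.0):
    """Linearly interpolate gaze samples onto a fixed-rate time grid."""
    t_uniform = np.arange(timestamps[0], timestamps[-1], 1.0 / rate_hz)
    gx_uniform = np.interp(t_uniform, timestamps, gx)
    gy_uniform = np.interp(t_uniform, timestamps, gy)
    return t_uniform, gx_uniform, gy_uniform
```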

papr 26 October, 2018, 13:56:41

@user-af87c8 What type of interpolation do you use?

user-2dca50 26 October, 2018, 15:18:34

@papr Thanks for looking at the data I sent to you. I was able to confirm the pupil data are there in the pupil.pldata file and figured out how to read it: I manually opened it in a hex editor, found the timestamp field 156 bytes in (an 8-byte double float; copying it into Matlab's hex2num gives 7.1088331e+4), and the diameter_3d field 178 bytes in, reading 2.1408. The next data point is offset 561 bytes: timestamp 7.1088325e+4, diameter_3d 2.1572. There must be an easier way to get these data out! I didn't want to have to write code to retrieve data from files, and I am wondering why Pupil Player is unable to export this raw data.

user-2dca50 26 October, 2018, 15:30:32

@papr I realize Win7 is not supported, but it looks like Capture works. See above: I was able to manually retrieve a few data points from the pupil.pldata file showing timestamps spaced 8.5 ms apart (≈120 Hz) and 3D pupil diameters of 2.14 and 2.15 mm. Thanks for helping me figure out how to export from Pupil Player so I don't have to write my own file reading and writing code; I am not very good at that! And I realize you have some C code to read the raw files, which would be even harder for me to implement. I may be able to write or run a Matlab or Igor Pro script. Does anyone have such a thing for reading Pupil data?
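
For anyone in the same situation: the .pldata container is a stream of msgpack-encoded (topic, payload) pairs, where each payload is itself a msgpack-serialized dictionary (this matches the open-source file_methods.py reader in the pupil repository). A minimal standalone reader sketch in Python, assuming the msgpack package is installed:

```python
import msgpack

def read_pldata(path):
    """Yield (topic, datum) pairs from a .pldata file, e.g. pupil.pldata."""
    with open(path, "rb") as fh:
        for topic, payload in msgpack.Unpacker(fh, raw=False, use_list=False):
            yield topic, msgpack.unpackb(payload, raw=False)

for topic, datum in read_pldata("pupil.pldata"):
    if "diameter_3d" in datum:  # only present when the 3d detector ran
        print(datum["timestamp"], datum["diameter_3d"])
```

From there, writing out a CSV that Matlab or Igor Pro can import is only a few more lines.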

user-8944cb 26 October, 2018, 22:22:43

Hi, is there a way to export the gaze positions from Pupil Player without a correction for the fisheye distortion of the recording, i.e. as if the world view were a flat 2D plane? Or alternatively, to somehow post-process the data to reverse the distortion correction? Thanks!

user-c4492b 28 October, 2018, 21:29:13

What's up everyone. Can someone point me to documentation about running more than one pair of Pupil glasses at the same time? I'd appreciate any direction, thank you.

user-4c85cf 28 October, 2018, 23:06:34

Has anyone had any experience with dealing with glare from printed fiducial marks on paper?

papr 29 October, 2018, 09:54:54

@user-8944cb This is possible if you rename world.intrinsics before opening Player. This way Player will load intrinsics that assume no distortion and therefore won't apply any correction.
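
For example, a tiny Python snippet to do the rename non-destructively (the recording path is hypothetical; renaming rather than deleting lets you restore the original intrinsics later):

```python
import os

rec = "/path/to/recording"  # hypothetical recording folder
os.rename(os.path.join(rec, "world.intrinsics"),
          os.path.join(rec, "world.intrinsics.backup"))
```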

papr 29 October, 2018, 09:58:21

@user-c4492b You will have to run multiple instances of Pupil Capture to do so. Please be aware that running pupil detection at 4x 200 Hz (assuming binocular headsets) requires a very powerful CPU. Also, if you want to record from both devices, your storage medium needs to be very fast. We recommend using a single computer for each device. Use the Pupil Groups and Time Sync plugins to synchronize the instances between the two host computers.

user-66516a 29 October, 2018, 14:36:33

Hi all, I've been struggling with an eye camera issue all day. I'm working with the Pupil Labs Vive add-on, and it had been working fine until last Friday. Now I get this error: "eye0 - [WARNING] video_capture.uvc_backend: Capture failed to provide frames. Attempting to reinit." My computer seems to detect both "Pupil Cam 1 ID0" and "Pupil Cam 1 ID1" (I see them in "Bluetooth and other devices" and in Pupil Capture), but "Pupil Cam 1 ID1" doesn't appear in the Device Manager. I uninstalled the drivers completely and reinstalled them using libusbK 3.0.7.0 and Zadig; tried plugging the Pupil Labs add-on into different USB 3.0 ports; tried different USB 3.0 to USB-C cables; and tried plugging it into 3 different computers (all on Windows 10), and the issue remains the same. Since the hardware was left untouched since Friday afternoon, I'm a bit surprised. Has anyone dealt with the same problem? (I already read the following posts: https://github.com/pupil-labs/pupil/issues/1120 & https://github.com/pupil-labs/pyuvc/issues/32) Thanks!

Chat image

papr 29 October, 2018, 14:43:29

@user-66516a Do I understand correctly that on all 3 computers the ID1 cam is not listed in any of the Device Manager categories?

user-66516a 29 October, 2018, 14:43:53

Yes, exactly

papr 29 October, 2018, 14:51:39

@user-66516a This might be a hardware issue. Please write an email to info@pupil-labs.com describing the steps you have already taken to solve the issue.

user-66516a 29 October, 2018, 15:07:48

Ok, thanks!

user-9d45c9 29 October, 2018, 16:22:35

@user-66516a, a similar thing happened to me: one of the camera feeds stopped in the middle of the day. I needed to return the hardware to Pupil and they repaired it (or gave me a new one, I can't tell them apart).

user-66516a 29 October, 2018, 16:36:26

@user-9d45c9 Thanks for the information. That's a relief!

user-66516a 29 October, 2018, 16:37:39

I wonder what's causing it not to work anymore so suddenly though

papr 29 October, 2018, 16:38:47

It might be that one of the cables is disconnected from the eye cam

user-29e10a 29 October, 2018, 19:16:07

@user-66516a the same happened to us with two different hardware sets. We sent them to Pupil to investigate.

user-8944cb 29 October, 2018, 20:04:10

@papr Thanks for your reply. I tried renaming the file before opening Player, and also tried deleting it; however, I am getting the same exported gaze points as when it is present in the recording folder. Is there anything else I can try / should do? Thanks!

user-910385 30 October, 2018, 23:24:40

Hi, what is the best way to calibrate for mobile devices? Is there a best practice?

wrp 31 October, 2018, 00:08:42

@user-8944cb please send an email to [email removed]

wrp 31 October, 2018, 00:09:32

@user-910385 to clarify, you are going to be showing stimulus on the mobile device screen, correct?

user-11dbde 31 October, 2018, 13:17:37

Hello. I am having problems using Pupil Mobile. I have a Motorola Moto Z3 and am trying to stream to my desktop PC. The capture input is set to Pupil Mobile in Capture. Now Capture does not even start:

user-11dbde 31 October, 2018, 13:17:38

ctypes.ArgumentError: argument 4: <class 'TypeError'>: expected LP_IP_ADAPTER_ADDRESSES instance instead of LP_IP_ADAPTER_ADDRESSES

papr 31 October, 2018, 13:47:24

@user-11dbde Hi, which OS do you use?

user-11dbde 31 October, 2018, 13:48:10

Hi. I am using Windows 10.

papr 31 October, 2018, 13:48:26

Is this the issue you described here? https://github.com/pupil-labs/pupil-mobile-app/issues/28

user-11dbde 31 October, 2018, 13:48:35

yes

papr 31 October, 2018, 13:48:49

ok, I will have a look at it

user-11dbde 31 October, 2018, 13:49:03

Thank you.

user-8944cb 31 October, 2018, 15:47:31

@wrp Thanks for your reply, I sent an email to [email removed]

wrp 31 October, 2018, 16:54:34

Thanks @user-8944cb

End of October archive