πŸ‘ core


wrp 01 June, 2019, 01:16:20

@user-20faa1 OnePlus devices have proven to be robust for Pupil Mobile use. We also used to recommend the Moto Z3, but that device is becoming harder to find. (Update: the Z4 is coming out soon and may be worth checking out.)

user-16ad22 02 June, 2019, 11:22:30

Hi! I am developing an eye tracking demo with Pupil eye trackers. I need to transform the sphere_center data in the eye cam frame into my world frame. Can you specify the three axis directions of the eye pinhole camera? Thanks!

papr 02 June, 2019, 12:55:14

@user-16ad22 Check out the eye_center0/1_3d fields in the gaze data. They are the sphere_centers in world coordinates
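For readers who want to grab these fields programmatically, here is a minimal sketch of subscribing to gaze data over Pupil's network API. Port 50020 is the Pupil Remote default; the key names ("eye_centers_3d" for binocular mappers, "eye_center_3d" for monocular ones) are assumptions and may differ between versions.

import zmq
import msgpack  # assumes pyzmq and msgpack-python are installed

ctx = zmq.Context()
# Pupil Remote listens on port 50020 by default; ask it for the SUB port
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# subscribe to all gaze topics
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:{}".format(sub_port))
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.loads(payload, raw=False)
    # key name is an assumption: binocular mappers expose "eye_centers_3d",
    # monocular ones "eye_center_3d" (world camera coordinates, in mm)
    centers = datum.get("eye_centers_3d") or datum.get("eye_center_3d")
    print(topic.decode(), centers)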

user-16ad22 02 June, 2019, 14:08:34

@papr Yes, but can I use circle_3d_center or sphere_3d_center? I am not using the world camera, so I need to know the definition of the eye camera coordinate system.

papr 02 June, 2019, 14:11:22

@user-16ad22 Check out the linked image in : https://github.com/pupil-labs/pupil/issues/1506

The same applies to the eye cameras.

user-bc3dc3 03 June, 2019, 07:45:37

Very basic question, but how do i run this program?

user-bc3dc3 03 June, 2019, 07:45:57

I downloaded it, but all I get is a bunch of folders

user-bc3dc3 03 June, 2019, 07:46:03

@here

papr 03 June, 2019, 08:07:49

@user-bc3dc3 On which operating system are you trying to run Pupil?

papr 03 June, 2019, 08:09:18

@user-bc3dc3 And did you download the software from here or from a different place? https://github.com/pupil-labs/pupil/releases

user-bc3dc3 03 June, 2019, 08:09:50

Cheers but i found another pc and got it working

user-2798d6 03 June, 2019, 22:46:11

Hello - I am getting "nan" for some dispersion values - what does that mean?

user-a6cc45 04 June, 2019, 14:21:57

Hello everyone! ; ) How should I calculate how many fixations were NOT on any defined surface?

Is it a good idea to check in fixations.csv which fixations are on defined surfaces (i.e. appear in the files with the prefix fixations_on_surface_<surface_name>) and treat those which do not appear in any fixations_on_surface... file as "not_on_any_surface"?

user-9c99b6 04 June, 2019, 14:59:29

Hello! I’m new with using pupil labs and I have a few questions.

  1. If I’m doing offline pupil detection and I’m working with the gaze from offline calibrations, and it detects something that isn’t a marker, is there a way to delete that incorrect calibration? If so, how?
  2. If I do manual edit mode, can I save the session? I know you can export it, but can you save it as the original video?

Thank you!

papr 04 June, 2019, 15:00:18

@user-a6cc45 sounds correct
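A minimal pandas sketch of the approach described above, assuming a Player export folder with fixations.csv plus per-surface fixations_on_surface_<name>.csv files in a surfaces/ subfolder; the folder layout and the shared "id" column are assumptions and may differ between Player versions.

import glob
import pandas as pd

export_dir = "recordings/000/exports/000"  # hypothetical export path

all_fixations = pd.read_csv(export_dir + "/fixations.csv")

on_any_surface = set()
for path in glob.glob(export_dir + "/surfaces/fixations_on_surface_*.csv"):
    surf = pd.read_csv(path)
    # assumes the surface export reuses the fixation "id" column; some versions
    # also provide an on_srf flag that could be used to filter further
    on_any_surface.update(surf["id"].tolist())

not_on_any = all_fixations[~all_fixations["id"].isin(on_any_surface)]
print(len(not_on_any), "fixations were not on any defined surface")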

papr 04 June, 2019, 15:02:30

@user-9c99b6 1. You can select a calibration range from which calibration markers will be selected. 2. Edited calibration markers should be saved automatically. Please let me know if you were not referring to calibration markers in this case.

papr 04 June, 2019, 15:05:16

@user-2798d6 this is a bug which I have not looked into since we are about to replace the current fixation detector with a third party implementation.

user-9c99b6 04 June, 2019, 15:11:45

Thank you @papr

user-9c99b6 04 June, 2019, 15:11:55

Also, if I want to do calibration on subsets of the video, so I want to trim it and do calibration on different sets, how do I do that?

user-2798d6 04 June, 2019, 15:14:16

@papr - will I be able to go back and re-detect fixations from previous video with the new detector? Do you have an eta on that?

user-741ae5 04 June, 2019, 15:34:34

Hello, I'm trying to run the pupil player on an OS X El Capitan 10.11.6 and it just does not open. What can I do to resolve this?

user-8e47a4 04 June, 2019, 16:57:04

Hi, short question:

What is the best way to use the existing Pupil software for Linux in a headless environment? We plan to connect the Pupil to a Raspberry Pi and forward all generated data via zmq. Is there an existing solution for this?

papr 04 June, 2019, 17:17:30

@user-9c99b6 we have a tutorial series on YouTube on that. I am currently on mobile so it is difficult to link. I can do that later when I am back at the computer

papr 04 June, 2019, 17:17:53

@user-2798d6 yes, of course. Eta ~2 weeks

papr 04 June, 2019, 17:18:33

@user-741ae5 not even the gray window opens? Can you share the player.log file in the pupil_player_settings folder?

papr 04 June, 2019, 17:19:53

@user-8e47a4 there is no official completely headless version. I have heard that there are a few people that made it work but I do not know any by name off the top of my head

user-741ae5 04 June, 2019, 17:22:43

@papr I can not install it

user-8e47a4 04 June, 2019, 17:22:56

@papr I couldn't find any when looking for it. I'll try to get to work on it tomorrow. If it turns out nicely I'll submit a pull request with an added headless and mqtt flag πŸ˜ƒ
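There is no official solution (see above); the following is only a rough sketch of such a relay, assuming the paho-mqtt 1.x client API, a hypothetical broker hostname, and the standard Pupil Remote/SUB socket pattern for receiving data from Capture.

import zmq
import paho.mqtt.client as mqtt  # assumed extra dependency, not part of Pupil

PUPIL_HOST = "127.0.0.1"       # e.g. the Raspberry Pi running Pupil Capture
BROKER_HOST = "broker.local"   # hypothetical MQTT broker address

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://{}:50020".format(PUPIL_HOST))
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect("tcp://{}:{}".format(PUPIL_HOST, sub_port))
for topic in ("pupil.", "gaze."):
    sub.setsockopt_string(zmq.SUBSCRIBE, topic)

client = mqtt.Client()
client.connect(BROKER_HOST, 1883)
client.loop_start()

while True:
    topic, payload = sub.recv_multipart()
    # forward the raw msgpack payload; MQTT subscribers decode it themselves
    client.publish("pupil/" + topic.decode(), payload)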

papr 04 June, 2019, 17:30:06

@user-8e47a4 sounds good!

papr 04 June, 2019, 17:30:20

@user-741ae5 on which operating system are you?

papr 04 June, 2019, 17:30:58

Ah, macOS, I see. You can install it by opening the dmg file and copying the application into the Applications folder

user-8e47a4 04 June, 2019, 17:31:00

@papr still would happily accept links to previous works tho ^^

papr 04 June, 2019, 17:31:17

@user-8e47a4 will keep you posted if I come across them

user-8e47a4 04 June, 2019, 17:31:28

@papr awesome, thanks!

user-741ae5 04 June, 2019, 17:33:44

@papr yes, I've already copied to the application folder

papr 04 June, 2019, 17:35:28

@user-741ae5 then it is installed πŸ‘ next step is opening the application. Does a gray window appear?

user-741ae5 04 June, 2019, 17:37:32

@papr this message appears: Pupil Player terminated unexpectedly. Click Reopen to open the application again. Click to see more information and click Apple.

papr 04 June, 2019, 17:38:00

@user-741ae5 I see. What CPU do you have?

papr 04 June, 2019, 17:39:57

If you do not know the model number by heart, you can look it up in the About This Mac menu, which you can reach by clicking on the apple symbol in the top left.

user-741ae5 04 June, 2019, 17:44:29

@papr

Chat image

papr 04 June, 2019, 18:06:05

@user-741ae5 yeah, that is what I suspected. The bundle only works on the Core i series. You will have to run from source. See the documentation for the required dependencies and their installation.

user-2798d6 04 June, 2019, 18:08:25

Thanks @papr! By chance, do you know if the new fixation detector will correct current issues of not detecting some fixations or not detecting full fixations? I've been having some issues (I sent an email) with not all fixations being detected and with some fixations taking several frames to be detected so the duration is not totally accurate.

papr 04 June, 2019, 18:15:45

@user-2798d6 the new approach is based on the NSLR algorithm, which segments the complete gaze series and then classifies each segment as fixation, smooth pursuit, saccade, or post-saccadic oscillation. Therefore you will get a classification for each gaze datum.
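For reference, a sketch of classifying a gaze trace with an NSLR-HMM implementation (pupil-labs/nslr-hmm). The call signature and class constants below are taken from that repository's README and should be treated as assumptions; inputs are timestamps in seconds and gaze positions in degrees.

import numpy as np
import nslr_hmm  # assumed: the pupil-labs/nslr-hmm package

# placeholder signal: timestamps in seconds, N x 2 gaze angles in degrees
t = np.linspace(0, 2, 400)
xy = np.column_stack([np.sin(2 * np.pi * t) * 5, np.zeros_like(t)])

# segments the trace and labels every sample / segment
sample_class, segmentation, seg_class = nslr_hmm.classify_gaze(t, xy)

names = {
    nslr_hmm.FIXATION: "fixation",
    nslr_hmm.SACCADE: "saccade",
    nslr_hmm.PSO: "post-saccadic oscillation",
    nslr_hmm.SMOOTH_PURSUIT: "smooth pursuit",
}
for cls in np.unique(sample_class):
    print(names.get(cls, cls), "{:.1%} of samples".format(np.mean(sample_class == cls)))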

user-741ae5 04 June, 2019, 18:19:59

@papr I have this CPU too; on it the application does not open, but it also does not display any error. It should work, right?

Chat image

user-741ae5 04 June, 2019, 18:23:56

@papr the os is High Sierra 10.13.6

papr 04 June, 2019, 18:27:58

@user-741ae5 that looks better indeed πŸ€” could you please share the player.log file in the pupil_player_settings folder if it is there now?

user-741ae5 04 June, 2019, 18:30:28

@papr Sorry for my ignorance, but where do I find this folder?

papr 04 June, 2019, 18:31:50

It should be in your home folder

user-741ae5 04 June, 2019, 18:35:07

@papr I figured I'd find it in this folder, but I could not find it.

papr 04 June, 2019, 18:36:38

@user-741ae5 OK. Without any log messages it is difficult to judge what is going wrong. Please try installing the source dependencies and run from source.

papr 04 June, 2019, 18:38:01

@user-741ae5 https://docs.pupil-labs.com/#macos-dependencies

user-efaef3 04 June, 2019, 18:39:45

Hello! I am doing a small research project using the Pupil software and eye tracker. My supervisor told me to learn about marking up a paper page for the eye tracker. She guided me to the Pupil Labs website; however, I haven't found anything on this matter there. I would be very happy for any links and advice. In particular, I have to mark up a paper page with survey questions for my experiment.

user-efaef3 04 June, 2019, 18:40:21

The survey is in the Word document

user-741ae5 04 June, 2019, 18:43:45

@papr ok, I will try this, thanks for the attention

user-741ae5 04 June, 2019, 20:21:17

@papr I followed the steps in the link that you mentioned and everything went well, but the .bat files still complain that no application is defined to execute them

papr 04 June, 2019, 20:24:11

@user-741ae5 the .bat files are for running on Windows. You have to start the application from the terminal.

papr 04 June, 2019, 20:24:29

@user-741ae5 I guess I should have given you the link that points to the actual start of the section: https://docs.pupil-labs.com/#installing-dependencies

user-741ae5 04 June, 2019, 20:41:01

@papr I have this:

Traceback (most recent call last):
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/launchables/world.py", line 126, in world
    from file_methods import Persistent_Dict
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/shared_modules/file_methods.py", line 27, in <module>
    ), "msgpack out of date, please upgrade to version (0, 5, 6 ) or later."
AssertionError: msgpack out of date, please upgrade to version (0, 5, 6 ) or later.

papr 04 June, 2019, 20:41:36

@user-741ae5 Please run pip3 install msgpack==0.5.6 -U

user-741ae5 04 June, 2019, 20:48:28

@papr I did and tried again. Now I have this:

MainProcess - [INFO] os_utils: Disabled idle sleep.
world - [INFO] launchables.world: Application Version: 1.12.274
world - [INFO] launchables.world: System Info: User: marcellaknipl, Platform: Darwin, Machine: iMacdeMarcella, Release: 17.7.0, Version: Darwin Kernel Version 17.7.0: Wed Apr 24 21:17:24 PDT 2019; root:xnu-4570.71.45~1/RELEASE_X86_64
world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/launchables/world.py", line 147, in world
    from plugin_manager import Plugin_Manager
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/shared_modules/plugin_manager.py", line 14, in <module>
    from calibration_routines import Calibration_Plugin, Gaze_Mapping_Plugin
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/shared_modules/calibration_routines/__init__.py", line 15, in <module>
    from .fingertip_calibration import Fingertip_Calibration
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/shared_modules/calibration_routines/fingertip_calibration/__init__.py", line 12, in <module>
    from calibration_routines.fingertip_calibration.fingertip_calibration import (
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/shared_modules/calibration_routines/fingertip_calibration/fingertip_calibration.py", line 13, in <module>
    import torch
  File "/usr/local/lib/python3.7/site-packages/torch/__init__.py", line 79, in <module>
    from torch._C import *
ImportError: dlopen(/usr/local/lib/python3.7/site-packages/torch/_C.cpython-37m-darwin.so, 9): Library not loaded: /usr/local/opt/libomp/lib/libomp.dylib
  Referenced from: /usr/local/lib/python3.7/site-packages/torch/lib/libshm.dylib
  Reason: image not found

world - [INFO] launchables.world: Process shutting down.
MainProcess - [INFO] os_utils: Re-enabled idle sleep.

papr 04 June, 2019, 20:49:24

Ok, it looks like there is an issue with your pytorch installation πŸ˜•

papr 04 June, 2019, 20:52:05

@user-741ae5 Please be aware that "PyTorch is supported on macOS 10.10 (Yosemite) or above." https://pytorch.org/get-started/locally/ -- Since you are running on old mac hardware, I am assuming that you might run an older macos version as well. Not sure if this is the actual reason.

user-741ae5 04 June, 2019, 20:57:01

@papr so I would have to downgrade my os?

papr 04 June, 2019, 20:57:49

@user-741ae5 Which version are you running? If anything, you would need to upgrade if your version is older than 10.10

user-741ae5 04 June, 2019, 21:00:10

High Sierra 10.13.6

papr 04 June, 2019, 21:01:10

Ok, this version is new enough. Could you please retry installing pytorch? pip3 install torch torchvision -U

papr 04 June, 2019, 21:02:17

I am going offline soon. If reinstalling does not help, we will have to continue tomorrow. πŸ˜‰

user-741ae5 04 June, 2019, 21:05:09

@papr I have this:

Requirement already up-to-date: torch in /usr/local/lib/python3.7/site-packages (1.1.0)
Requirement already up-to-date: torchvision in /usr/local/lib/python3.7/site-packages (0.3.0)
Requirement already satisfied, skipping upgrade: numpy in /usr/local/lib/python3.7/site-packages (from torch) (1.16.4)
Requirement already satisfied, skipping upgrade: pillow>=4.1.1 in /usr/local/lib/python3.7/site-packages (from torchvision) (6.0.0)
Requirement already satisfied, skipping upgrade: six in /usr/local/lib/python3.7/site-packages (from torchvision) (1.12.0)

I will try to run the pupil again

user-741ae5 04 June, 2019, 21:05:36

Ok @papr , thanks for the attention.

papr 04 June, 2019, 21:13:13

@user-741ae5 Have a nice evening! I hope you can get it running! Let us know if you do πŸ™‚

user-dd52c0 05 June, 2019, 08:01:33

Hi there! I just recorded a session in the pupil mobile app (latest beta version), transferred it to my Windows PC and wanted to play it with the player app (version 1.11 and 1.12). This resulted in the following error (see screenshot) and an immediate shutdown of the player. Any recommendations?

Chat image

user-9c99b6 05 June, 2019, 16:29:22

@papr Could you send me the link when you have a chance? I can't find the video you're talking about. Thank you!

papr 05 June, 2019, 16:31:28

@user-9c99b6 I linked it above

user-9c99b6 05 June, 2019, 16:34:25

@papr Oh, thanks!

user-96755f 05 June, 2019, 19:47:10

Hello everybody, I'm so excited that I'm going to start my experiments. Everything is possible thanks to this amazing community, always ready to answer all my questions and solve all my problems. So I have one last thing to ask: I'm going to use surface markers in my experiment. I'm going to show the subject a picture as a prime, and after that two images that will appear together on screen, repeated 60 times. What's the fastest way to work with 120 surfaces? The experiment will last 20 minutes, more or less. Should I split my recordings? Rec - 10 stimuli - stop. Rec - 10 stimuli - stop. Something like this. Or just one big recording? The problem is that I don't know if my PC has enough power to support more than 10 surfaces at a time.

user-8fd8f6 05 June, 2019, 22:40:00

@papr hi, what is the definition of theta and phi?

user-8fd8f6 05 June, 2019, 22:40:20

in the pupil position file

user-78538a 07 June, 2019, 09:18:07

Hi everyone. @papr @wrp I have the same issue as @user-dd52c0: "IndexError: tuple index out of range". So I cannot open and analyze my recording. I am thus in trouble now, since I do research with the recordings and my participant will show up in about 2 hours. It seems like I cannot work with the video recording at all 😦

papr 07 June, 2019, 09:21:46

@user-78538a @user-dd52c0 I think this bug has been fixed in https://github.com/pupil-labs/pupil/pull/1510 and will be released in the upcoming v1.13 release.

@user-78538a If I remember correctly, we made an internal prerelease bundle for Windows which I can send you via PM

user-dd52c0 07 June, 2019, 10:09:55

@papr Thank you!

user-78538a 07 June, 2019, 10:51:38

@papr Thank you very much indeed. Saved my day!

user-741ae5 07 June, 2019, 16:40:15

Hello @papr, I could not run it on that CPU; I had to use another computer for this (configuration attached). But now I have another problem: I have a recording made with Pupil Mobile that, when dragged onto the Player window, does not convert the files.

Chat image

papr 07 June, 2019, 16:46:33

@user-741ae5 can you check if it returns the same error as in @user-dd52c0's screenshot above?

user-8fd8f6 07 June, 2019, 17:44:31

@papr Hi, what is the definition of theta and phi in the pupil positions file?

user-741ae5 10 June, 2019, 16:34:45

Hello @papr, apparently @user-dd52c0 and I get the same error. Is there any release forecast for version 1.13?

papr 10 June, 2019, 16:37:28

@user-741ae5 Currently, we plan to release it within the next two weeks

user-741ae5 10 June, 2019, 16:49:06

ok @papr, I'll wait for it to try again. Thanks.

user-d3153d 11 June, 2019, 18:00:10

Hello, we have some problems with the heatmap and gaze map. We got a 1 KB heatmap .png file from a 3-minute video. Is this video too short to get a normal heatmap? And for the gaze map, we have the offline data and gaze mappings in the folder "offline data". How can we use those files to get the gaze map? Thank you

user-0cf021 11 June, 2019, 21:20:48

@papr @user-5d12b0 Hi, I've seen you have been involved in the development of the Pupil LSL Relay plugin. Thank you for the detailed instructions! We have run into an issue with adding the plugin to the Capture. We've added it to the folder as specified in the Pupil docs, however it had no effect. Is it necessary to run the Capture from source for it to register the LSL Relay plugin? Thank you for your response πŸ™‚

papr 11 June, 2019, 21:21:52

@user-0cf021 check the logs, it will tell you if there was an error while loading the plugin

papr 11 June, 2019, 21:22:14

You should find it next to the plugins folder

user-0cf021 11 June, 2019, 21:24:45

@papr Thanks for the fast response! I will try it first thing tomorrow and come back to you πŸ˜ƒ

user-b13152 12 June, 2019, 08:16:32

Hi, I would like to ask: let's say on one screen there are multiple areas of interest (e.g., bell, house). How can I find the fixation values for each object?

user-b13152 12 June, 2019, 08:17:29

Let's say I decide to find fixation information for numbers 1, 2, and 3... How can I do it? Thanks a lot in advance for your help.

Chat image

user-0a2ebc 12 June, 2019, 08:19:27

Could you help me with how to create a gaze plot out of 30 participants, please?

user-0a2ebc 12 June, 2019, 08:19:55

I can only find the code for creating a gaze plot for one participant... thanks

user-8fd8f6 12 June, 2019, 14:18:28

Hi, @papr. I want to find the head_pose_tracker_model.csv file. Do you have any idea?

papr 12 June, 2019, 14:55:13

@user-8fd8f6 it should be in the export folder after activating the plugin and hitting the export button

papr 12 June, 2019, 14:56:14

Have you built a model successfully? You can use the debug window to visualize the built model.

user-8fd8f6 12 June, 2019, 14:58:10

@papr Thank you for your response. Which icon should I activate in the plugin?

papr 12 June, 2019, 15:02:46

@user-8fd8f6 You need to enable the head pose estimation plugin from the plugin manager menu as first step. Have you done this?

papr 12 June, 2019, 15:03:00

BTW, on which operating system are you using Pupil Player?

user-8fd8f6 12 June, 2019, 15:03:17

Windows

papr 12 June, 2019, 15:03:55

@user-8fd8f6 the head pose tracker is not supported on Windows yet πŸ™

user-8fd8f6 12 June, 2019, 15:04:26

I have a Mac system too, Does it work there?

papr 12 June, 2019, 15:04:33

@user-8fd8f6 yes, it does

user-8fd8f6 12 June, 2019, 15:04:37

Thank you.

papr 12 June, 2019, 15:06:56

@user-8fd8f6 have you seen our video tutorial on YouTube on this topic?

user-8fd8f6 12 June, 2019, 15:07:25

No, Could you please send me the link?

papr 12 June, 2019, 15:07:49

@user-8fd8f6 https://youtu.be/9x9h98tywFI

user-bea039 13 June, 2019, 06:49:46

Hi, I'm looking for a connector for the world camera. Does anyone know the name of this connector?

Chat image

user-df4bad 13 June, 2019, 07:16:51

I have the Pupil Mobile app on my phone and Pupil Capture on my laptop, and I have connected the Pupil eye tracker to the phone and am able to view my eye videos and world video. But can someone help me with how to use the features of Pupil Capture while my headset is connected to the phone? Features like calibration and plugins, as I don't find them in the Pupil Mobile app.

papr 13 June, 2019, 07:18:50

@user-755e9e do you know the answer to @user-bea039's question?

papr 13 June, 2019, 07:22:01

@user-df4bad these features are not available there. Either you stream the videos to Capture and proceed as if the cameras were connected directly to Capture, or you make a recording with Pupil Mobile and run calibration etc. after the fact in Player. For the second case, I recommend watching our YouTube tutorials.

user-df4bad 13 June, 2019, 07:24:33

I guess streaming would be better as

user-df4bad 13 June, 2019, 07:25:12

@papr Can I know how to stream them?

papr 13 June, 2019, 07:28:54

@user-df4bad When starting with defaults in Capture, select the Pupil Mobile Manager in the UVC Manager menu. Your Pupil Mobile device should show up. Select it and hit the "auto activate" button. You should see the video streams pop up in all three videos shortly after

papr 13 June, 2019, 07:29:55

Please be aware that streaming is mostly meant for monitoring purposes and you might experience frame drops if the wifi connection is not good enough. If you do not need the data in real time, I recommend to go the offline-route

user-755e9e 13 June, 2019, 07:43:37

Hi @user-bea039 , the connector is called JST-SH1 4 poles. You can find it here https://de.rs-online.com/web/p/leiterplattensteckverbinder-gehause/3531096/?sra=pstk .

user-bea039 13 June, 2019, 07:44:23

@user-755e9e Thank you!

user-0cf021 13 June, 2019, 08:38:26

@papr in the capture log, it is importing it, but then it doesn't seem to load (and it cannot be found in the list of plugins in the gui), but there's no error either. Are you seeing anything suspicious in it?

capture.log

user-c1220d 13 June, 2019, 09:21:44

Hi, I'm trying to run offline calibration through Pupil Player, but several times an error occurs when I try to perform offline pupil detection, preventing me from continuing with the procedure. When dropping the file into Pupil Player, 2 red strings appear on the screen:

user-c1220d 13 June, 2019, 09:21:47

EYE0/1: Process eye0/1 crashed with trace:[] Traceback(most recent call last):[] File "launchables\eye.py",line 339, in eye [] File "shared modul

user-c1220d 13 June, 2019, 09:22:19

is there any way to solve it?

user-df4bad 13 June, 2019, 09:27:54

@papr

user-df4bad 13 June, 2019, 09:27:59

Thanks a lot

user-df4bad 13 June, 2019, 09:28:03

It worked

papr 13 June, 2019, 09:36:21

@user-c1220d could you please share the full traceback. It looks like it got cut off in your previous message

user-c1220d 13 June, 2019, 09:46:01

Chat image

user-c1220d 13 June, 2019, 09:46:04

actually the error message is that one

papr 13 June, 2019, 09:47:39

@user-c1220d can you check the log file please. It should contain the full error message

user-c1220d 13 June, 2019, 09:50:57

I acquired the file using Pupil Remote. May I ask where I can find the log file? It doesn't seem like I have something like this in my file folder.

papr 13 June, 2019, 09:51:26

player.log should be in the pupil_player_settings folder in your computer's home folder

user-c1220d 13 June, 2019, 09:52:48

player.log

user-c1220d 13 June, 2019, 09:52:53

thank you

papr 13 June, 2019, 09:55:42

@user-0cf021 maybe it is not in the correct folder. Do you run from source or from bundle?

papr 13 June, 2019, 09:56:55

@user-c1220d could you please shutdown player, delete the user_settings_eye0/1 files next to the log file, and restart capture?

papr 13 June, 2019, 09:57:25

And try again. The full error is:

2019-06-13 11:52:16,427 - eye1 - [ERROR] launchables.eye: Process Eye1 crashed with trace:
Traceback (most recent call last):
  File "launchables\eye.py", line 339, in eye
  File "shared_modules\glfw.py", line 556, in glfwCreateWindow
Exception: GLFW window failed to create.
user-c1220d 13 June, 2019, 10:00:57

thank you very much, apparently before opening a new file in Player I have to delete them from the previous one

user-c1220d 13 June, 2019, 10:01:02

it works

user-df4bad 13 June, 2019, 10:01:43

The right eye, i.e. eye 0, is inverted. Is that a problem with the camera, or does there need to be a change in the settings?

wrp 13 June, 2019, 10:03:08

@user-df4bad The image is upside down purposely because the sensor is flipped physically. The orientation has no effect on pupil detection/gaze estimation. You can leave it that way or flip the image in the Eye 0 window GUI.

user-df4bad 13 June, 2019, 10:20:21

Cool then.. But during calibration there are a few red-orange lines which depict the gaze error, I assume. Can we calculate the percentage?

papr 13 June, 2019, 10:21:00

@user-df4bad the error is calculated in degrees and can be found in the accuracy visualizer menu

user-df4bad 13 June, 2019, 10:52:33

Got it.. Thanks for the prompt reply

user-50a1c1 13 June, 2019, 10:55:55

I am doing calibration at a distance of around 3 m. Reading the docs, I have figured out that manual marker calibration is the best. But the gaze location isn't accurate, so is there any other calibration method I should choose, or some other change that needs to be made to reduce the error?

papr 13 June, 2019, 11:11:22

@user-50a1c1 checkout the accuracy visualizer menu. What does the gaze accuracy field say?

user-dd52c0 13 June, 2019, 11:18:56

Hi again! I wanted to ask if there is any possibility in Player to change the eye camera video brightness and contrast of a mobile app recording?

papr 13 June, 2019, 12:50:15

@user-dd52c0 This is not supported within Player. You will have to do this with a video editor before using the video in Player.

user-dd52c0 13 June, 2019, 13:02:29

@papr thanks for the response. Are there any recommendations on how to use Pupil outside in high-brightness environments (especially with the mobile app)? I have a recording from an a/c cockpit that seems to be unusable - unfortunately...

Chat image

user-0cf021 13 June, 2019, 13:21:07

@papr From bundle

user-0cf021 13 June, 2019, 13:29:39

That's what made me think that perhaps it is necessary to run the Capture from source, as everything else seemed to be fine

papr 13 June, 2019, 15:32:27

@user-dd52c0 I do not think there is much you can do to make this video usable πŸ˜• The video is just overexposed. πŸ˜•

papr 13 June, 2019, 15:33:06

@user-0cf021 in which folder is the plugin? I would be interested in the full path

user-c37dfd 13 June, 2019, 18:23:53

Hi. I am using pupil mobile for the first time using the motorola phone provided by pupil labs. In the app, the left eye is appearing right side up; however, the right eye is appearing upside down. We do not see a way to invert the eye image. How do we get the right eye image to be right side up?

papr 13 June, 2019, 18:26:58

@user-c37dfd See Will's answer from above: "The image is upside down purposely because the sensor is flipped physically. The orientation has no effect on pupil detection/gaze estimation."

user-c37dfd 13 June, 2019, 18:32:24

@papr thanks so much for the quick response!

user-6d6c44 13 June, 2019, 19:53:51

Hi all, I'm looking for the least resource-intensive way to run pupil-capture on a laptop. We need to run some other programs simultaneously and found that we're low on resources. I intend to play around with settings, but I figured I may as well see if anyone has already gone through that process. Thanks in advance!

papr 13 June, 2019, 19:59:47

@user-6d6c44 turn off the Pupil detection and minimize the windows. That should give you a lot of resources back

user-6d6c44 13 June, 2019, 20:01:16

@papr Thank you!

user-14d189 14 June, 2019, 08:50:47

Hi there, did anyone experience difficulties with the Android 9.0 Pie update and running Pupil Mobile?

wrp 14 June, 2019, 11:18:20

@user-14d189 please see the pinned comment re Pupil Mobile on Android v9

user-56be96 14 June, 2019, 16:58:59

Which USB-C cam can you recommend? For better quality than the standard cam.

user-8fd8f6 15 June, 2019, 18:24:04

@papr Hi, I used my Mac System for the head position system. It works. But I didn't use the markers when I captured the files. Is there any way for me to track the head position offline? (Just with the captured files)

papr 15 June, 2019, 18:29:00

@user-8fd8f6 no, the head pose tracking is strictly based on April tags. You can do the tracking in Player but your world video needs to include the markers.
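One way to check after the fact whether a recorded world video contains any usable markers is to scan it with an AprilTag detector. The sketch below uses the pupil-apriltags Python binding and the tag36h11 family as assumptions; Player ships its own detector, so this is only a quick sanity check.

import cv2
from pupil_apriltags import Detector  # assumed binding, installed separately

detector = Detector(families="tag36h11")        # family assumed for head pose tracking
cap = cv2.VideoCapture("recording/world.mp4")   # hypothetical path inside the recording

seen_ids = set()
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for detection in detector.detect(gray):
        seen_ids.add(detection.tag_id)

print("marker ids found in the world video:", sorted(seen_ids))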

user-8fd8f6 15 June, 2019, 18:31:00

Ok, thank you. Do you have any document about theta and phi? Are they in radians?

papr 15 June, 2019, 18:37:47

Yes they are in radians

papr 15 June, 2019, 18:38:38

But I don't know where their coordinate origin is.

user-8fd8f6 15 June, 2019, 19:44:55

@papr what do you mean by "You can do the tracking in Player"? By checking the video frames?

papr 15 June, 2019, 19:46:48

You can run head pose tracking in real time in Capture, or using a recording offline in Player, as it is demonstrated in the YouTube tutorial

user-0cf021 15 June, 2019, 20:01:21

@papr C:\Users\experiment\pupil_capture_settings\plugins\pylsl\__init__.py

user-5bc59f 16 June, 2019, 00:55:35

@papr Hi, can I decide how many screen markers to use when I calibrate?

user-b13152 17 June, 2019, 03:30:10

Hi.. I want to know the unit of the fixation duration. I set the duration of fixations between 300 - 1000 milliseconds, but the durations in the export contain so many digits. Thanks..

Chat image

user-b13152 17 June, 2019, 09:11:31

Hi @papr can you help me?

user-888056 17 June, 2019, 10:03:33

Hello, after doing offline surface tracking, how do I export the data related to the surface?

papr 17 June, 2019, 10:04:49

@user-b13152 There is definitely something wrong with these numbers. These should be floats with a single decimal. We are releasing a new eye movement detector in our next release, v1.13, which is planned for this week. It will replace the old fixation detector and should have fewer problems than the old model πŸ˜‰

user-888056 17 June, 2019, 10:05:06

just found the folder.... with the surface info

user-888056 17 June, 2019, 10:06:00

but can we get coordinates?

user-888056 17 June, 2019, 10:06:34

I mean coordinates of the surface

papr 17 June, 2019, 10:08:57

@user-888056 there should be a surface positions file

user-888056 17 June, 2019, 10:10:31

Chat image

user-888056 17 June, 2019, 10:10:45

perhaps I need to select something to export them?

user-888056 17 June, 2019, 10:15:19

Got it! πŸ˜„

user-888056 17 June, 2019, 10:17:11

I hadn't added the surface properly.

papr 17 June, 2019, 10:19:32

You will get one positions file per surface.

papr 17 June, 2019, 10:19:34

So you need at least one surface.

user-888056 17 June, 2019, 10:20:03

Thank you papr. I have it now

user-b13152 17 June, 2019, 10:49:36

thank you very much @papr .. I'll be waiting for the new one.

user-888056 17 June, 2019, 14:17:49

would there be a reason why there are no fixations_on_surface files, although I know that there should be?

papr 17 June, 2019, 14:18:24

@user-888056 did you enable the fixation detector?

user-888056 17 June, 2019, 14:18:46

offline fixation detector?

papr 17 June, 2019, 14:18:49

correct

user-888056 17 June, 2019, 14:22:49

thanks

user-562e70 17 June, 2019, 19:41:50

hi everyone

user-562e70 17 June, 2019, 19:42:27

I don't know if it's proper to ask here, but I wonder if there is a Pupil add-on for the Oculus Rift S

user-562e70 17 June, 2019, 19:42:42

i would really, really need that πŸ˜‰

user-0cf021 18 June, 2019, 13:11:58

@papr Hi again, we're still having this issue with loading the LSL plugin. We're running Capture from bundle and the plugin is in C:\Users\experiment\pupil_capture_settings\plugins. Do you think running it from source (and putting the plugin in the respective folder) would solve this issue, or do you think there is a different issue? I've attached the log in case you'd like to have a look at it. Thank you for your help!

capture.log

user-11498a 19 June, 2019, 13:19:48

Hi,

We have been using Pupil Player to process data just fine. But for one recording that we did recently, Pupil Player does not load the world video and only occasionally loads the audio recording. It gave an error that reads something like "player initialization failed. Source file could not be found at ...eye1.mp4". We did not record eye1 at all. The individual mp4 files for the audio, world and eye videos look just fine. Do you have any idea how to fix it?

papr 19 June, 2019, 13:21:13

Hi @user-11498a Could you please share the recording with data@pupil-labs.com ?

papr 19 June, 2019, 13:21:27

The error message might be unrelated to the actual problem

user-11498a 19 June, 2019, 13:24:47

Thanks for the reply. Yes. The recording lasts for about 30 mins. So the files are pretty large. Is there an efficient way for me to share it other than through email?

papr 19 June, 2019, 13:26:37

@user-11498a ok, in this case, please share the player.log file in the pupil_player_settings folder after Player crashed.

papr 19 June, 2019, 13:27:05

Depending on what the logs say, you can do some local tests which might give us more insight into the problem.

user-11498a 19 June, 2019, 13:29:38

Ok. Where can I locate this file? Player didn't actually crash. It's just a grey screen with the timeline and everything.

papr 19 June, 2019, 13:30:55

aah, ok

papr 19 June, 2019, 13:32:03

Have you used Pupil Mobile to record the data?

user-11498a 19 June, 2019, 13:32:23

no, on a surface laptop through a usb wire

papr 19 June, 2019, 13:32:45

Can you check if the recording includes a world_timestamps.npy file?

user-11498a 19 June, 2019, 13:33:31

it does

papr 19 June, 2019, 13:35:06

Can you playback the world video file in VLC?

user-11498a 19 June, 2019, 13:35:21

yes, i can

papr 19 June, 2019, 13:36:32

Ok, then I need the player.log file

papr 19 June, 2019, 13:36:48

It is in the pupil_player_settings folder which you should find in your user folder

papr 19 June, 2019, 13:37:12

Just try searching for it in the file explorer (if you are on windows)

user-11498a 19 June, 2019, 13:37:41

I just checked the length of the three mp4 files. They are different. During recording, we noticed that the world recording had some lags. Could this be the reason? If I trim them to be the same number of frames, would that fix it?

papr 19 June, 2019, 13:38:35

No, this is ok. Player should be able to handle this.

papr 19 June, 2019, 13:39:22

As I said, I need the log file in order to better diagnose the problem.

user-11498a 19 June, 2019, 13:40:59

Sorry about the delay. I had to switch computers. Here it is:

player.log

papr 19 June, 2019, 13:43:08

Have you opened one of the "unsuccessful" recordings before sending the log file? If not, please do so. The log file is overwritten each time Player is started, and the log file you sent does not show any problems.

user-11498a 19 June, 2019, 13:47:39

here is after trying to load the recordings

player_1.log

papr 19 June, 2019, 13:49:21

ok, this log file does not show a problem either. πŸ€”

papr 19 June, 2019, 13:50:33

I think I will need access to the recording in order to diagnose the exact problem.

user-11498a 19 June, 2019, 13:52:41

ok, i will compress it and email it to you

user-11498a 19 June, 2019, 13:52:45

tex

user-11498a 19 June, 2019, 13:52:47

thx

papr 19 June, 2019, 13:52:55

You can use ffmpeg to compress the videos

papr 19 June, 2019, 13:53:51

After installing ffmpeg, you should be able to run ffmpeg.exe -i <original video path> <compressed video path>

user-11498a 19 June, 2019, 15:18:55

Hi, I have sent the files to [email removed] Please take a look. Thanks.

user-a7dea8 19 June, 2019, 17:21:36

Quick question: if I use offline calibration and pupil detection, is there a way to export videos from Pupil Player that will also generate the pupil_positions and gaze_positions spreadsheets?

papr 19 June, 2019, 17:22:30

Yes, but these are two different plugins. The world video exporter and the raw data exporter.

user-fe9abf 19 June, 2019, 17:23:47

Hi, so we are doing research that involves having subjects look at an easel board with stimuli (cards, magnets, etc) on the board and observing what the subjects look at based on audio cues. We are having some trouble calibrating the gaze data because the subjects often move away from or towards the surface during the trial and in between calibrations. I know you can use the surface plugin to add a surface and observe a heat map, but is there a way to use the plugin to improve gaze detection in 3d space while still observing temporal data? For example, would adding the easel board or the cards as surfaces be useful for preventing error in calibration due to the subject moving around? Thanks!

papr 19 June, 2019, 17:26:26

The calibration is independent of the surface tracking. It is a three step process. 1) pupil detection 2) gaze mapping 3) surface mapping

papr 19 June, 2019, 17:35:25

@user-a7dea8 to elaborate on that: the exporter does not care whether the data was prerecorded or generated offline. In the end, which data is exported depends on the active plugins.

user-11498a 19 June, 2019, 20:17:20

Hi, we have been using the audio recording. It introduces some delays. How do we get around that issue and make sure that the video and audio are perfectly synced?

papr 19 June, 2019, 20:21:18

@user-11498a on which platform did you record?

user-8fd8f6 19 June, 2019, 20:44:09

@papr Hi, Is the "circle_3d_radius" the r in the below picture?

Chat image

user-8fd8f6 19 June, 2019, 20:44:31

in the pupil position file

papr 19 June, 2019, 20:56:15

@user-8fd8f6 no. Phi and theta are related to the location of the 3d circle on the eyeball. The eyeball has a fixed size that does not change. The circle_3d radius is related to the size of the pupil

papr 19 June, 2019, 20:56:32

The r in this picture is the eye ball size

user-8fd8f6 19 June, 2019, 20:59:40

So theta and phi are not as shown in the picture either?

papr 19 June, 2019, 21:13:31

@user-8fd8f6 spherical coordinates define a three-dimensional point. In the case of the pupil datum, we describe the orientation of the eye model either as a vector in the cartesian coordinate system or as (phi, theta, r) in the spherical coordinate system. Since the eye (modelled as a sphere) has a fixed radius, the pupil datum omits the r value. The pupil is modelled as a flat disk (circle 3d) that is tangential to the eyeball at the location (phi, theta, r). The radius of this disk is dependent on the current size of the pupil.

papr 19 June, 2019, 21:14:45

So the circle_3d has nothing to do with a spherical coordinate system or the picture that you posted. I hope this clears up any potential misunderstandings. πŸ™‚
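To make the relationship concrete, here is a small sketch converting a unit direction vector (e.g. circle_3d_normal) into (phi, theta). The axis assignment below is a generic convention chosen for illustration; the convention Pupil uses internally may assign the axes differently.

import numpy as np

def cart_to_spherical(v):
    """Unit direction vector -> (phi, theta) in radians.
    Axis assignment here is an assumption, not necessarily Pupil's convention."""
    x, y, z = v / np.linalg.norm(v)
    theta = np.arccos(y)       # polar angle, measured from the +y axis
    phi = np.arctan2(z, x)     # azimuth in the x-z plane
    return phi, theta

# e.g. a circle_3d_normal pointing roughly toward the eye camera
phi, theta = cart_to_spherical(np.array([0.1, -0.2, -0.97]))
print(phi, theta)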

user-8fd8f6 19 June, 2019, 21:25:21

perfect answer! Thank you

user-a21e3c 19 June, 2019, 21:48:38

Hi all. I am currently trying to generate heatmaps using Pupil Player, to no avail. Whenever I click "(Re)-calculate gaze distributions" in the Offline Surface Tracker plugin, I get the following error:

player - [WARNING] offline_surface_tracker: No gaze on any surface for this section!

I do not believe the issue is with my surface marker size, as if I change the mode to "Show markers and Surfaces" the markers are detected just fine. It appears that none of my gaze data is being generated or carried over from Capture. Adjusting the X and Y sizes under the surface tab does not appear to help. Any advice?

papr 19 June, 2019, 22:20:28

@user-a21e3c do you see the gaze data being visualized as green dots during playback?

papr 19 June, 2019, 22:22:41

This error message indicates that there is no gaze data. If you recorded gaze data in Capture, make sure that the Gaze From Recording mode is active.

user-a21e3c 20 June, 2019, 14:30:05

@papr Yes, I can see the gaze data being visualized as green dots. Where can I find Gaze From Recording mode in Capture?

papr 20 June, 2019, 14:30:57

Did you mean to say Player? In your initial post you mentioned that you were using Offline Surface Tracking.

user-a21e3c 20 June, 2019, 14:31:45

Sorry I misread your message. Thought you meant Gaze From Recording was something you selected when recording in Capture

papr 20 June, 2019, 14:32:27

In Player, you have an icon with concentric circles on the right. From its menu you can activate Gaze From Recording and Offline Calibration

user-a21e3c 20 June, 2019, 14:33:56

Ah, I see thanks! I still can see the green dot representing eye tracking throughout the video, and the surface markers are still highlighted in Blue when I select Show Markers and Surfaces. But I still get the "No gaze data for any surface on this section!" Warning when I try to generate a heatmap

papr 20 June, 2019, 14:37:39

Do you see the red line around your surface?

papr 20 June, 2019, 14:38:32

You might need to set up a surface first. Marker detection is the first step in surface tracking. The second is setting up a surface based on a set of detected markers

papr 20 June, 2019, 14:38:52

You can add surfaces on the left by hitting the A button

user-a21e3c 20 June, 2019, 14:41:39

Yes that did it! Unfortunately the heatmap is now entirely red. I do recall seeing this problem earlier though in the Discord chat, if it's already been answered I can look there and stop bothering you πŸ˜ƒ

user-a21e3c 20 June, 2019, 14:46:05

Figured it out. Thanks for the help!

user-f3a0e4 20 June, 2019, 14:57:19

Trying to open pupil player and keep getting the following error: player - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend

Any ideas?

papr 20 June, 2019, 14:58:42

@user-f3a0e4 You can ignore that error if you are not using a Realsense 3D camera

user-f3a0e4 20 June, 2019, 14:59:45

The issue is that the grey pupil player screen that usually pops up doesn't appear

user-f3a0e4 20 June, 2019, 15:00:21

I'm therefore unable to drag and drop a saved file

user-f3a0e4 20 June, 2019, 15:04:52

In fact, today I was having issues connecting remotely to the device and decided to "restart with default settings", which fixed the issue. However, since then I have had non stop errors, especially with calibration using pupil mobile. Does anyone know if any of the default settings might be causing this?

user-f3a0e4 20 June, 2019, 15:06:52

The error for calibration is:

WORLD: collected 0 monocular calibration data
WORLD: Collected 0 binocular calibration data
WORLD: not enough ref point or pupil data available for calibration

user-f3a0e4 20 June, 2019, 15:07:07

This error does not happen when calibrating through local USB

papr 20 June, 2019, 15:11:14

@user-f3a0e4 1. In player, you need to drop the whole folder, not a single file. 2. Make sure that you are streaming all cameras to Capture

user-f3a0e4 20 June, 2019, 15:13:09

Yeah I have tried, but it literally won't allow me to drop anything in there as the grey window will not open. It was working earlier this morning, but there now seems to be some sort of error

papr 20 June, 2019, 15:19:12

@user-f3a0e4 ~~What is the error?~~ Can you be specific which folder you are dropping onto Player?

user-a21e3c 20 June, 2019, 15:22:45

Hi, question of clarification. Under the Calibration menu in Player, I have the option to set a range for "Calibration" and for "Mapping". I am assuming that the difference between these two is that the trimmed section one selects for "Calibration" is used to identify the region of the video with the calibration marker, and the trimmed section one selects for "Mapping" is the region of space that calibration corresponds to?

Additionally does this mean that if the wearer significantly changes their visual perspective, a recalibration must be in order each time?

user-888056 20 June, 2019, 15:23:05

Hello papr, I have a stupid question. When you see the video in Player, how do you plot the gaze position? I mean, how do you overlay the frame with the position, since we have multiple eye recordings for the same frame?

user-f3a0e4 20 June, 2019, 15:23:46

I'm trying to open a recording. To do so, I usually just open pupil player and a grey blank page appears saying "drop a recording directory onto this page". However, now when I try to open pupil player, this grey screen does not appear?

papr 20 June, 2019, 15:24:55

@user-f3a0e4 ah, ok, now I understand. Can you try deleting the user_settings* files in the pupil_player_settings directory and try starting Player again?

papr 20 June, 2019, 15:25:50

@user-888056 We create a many-to-one relationship for each world frame. Each gaze datum is associated with its closest world frame (by time)
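A minimal numpy sketch of that many-to-one matching, assuming world_timestamps.npy from the recording folder and a sorted array of gaze timestamps in Pupil time (seconds):

import numpy as np

def closest_world_frame(world_ts, gaze_ts):
    """Return, for each gaze timestamp, the index of the nearest world frame."""
    world_ts = np.asarray(world_ts)
    gaze_ts = np.asarray(gaze_ts)
    idx = np.clip(np.searchsorted(world_ts, gaze_ts), 1, len(world_ts) - 1)
    left, right = world_ts[idx - 1], world_ts[idx]
    use_left = (gaze_ts - left) < (right - gaze_ts)   # pick the closer neighbour
    return np.where(use_left, idx - 1, idx)

world_ts = np.load("world_timestamps.npy")            # from the recording folder
gaze_ts = world_ts[0] + 0.004 * np.arange(10)         # placeholder gaze times (~250 Hz)
print(closest_world_frame(world_ts, gaze_ts))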

papr 20 June, 2019, 15:26:57

@user-a21e3c Correct, the mapping range is the range that the calibration is applied to. Yes, recalibrating regularly is recommended, especially in 2d mode.

user-a21e3c 20 June, 2019, 15:28:06

What happens if we select a range for Calibration but not one for Mapping?

papr 20 June, 2019, 15:32:45

@user-a21e3c Then you won't get any gaze data

papr 20 June, 2019, 15:33:01

But the default gaze range is the complete recording

user-f3a0e4 20 June, 2019, 15:34:03

@papr I hope i'm not missing it, but I can't find anything for user_settings or pupil_player_settings when searching the entire pupil labs directories

papr 20 June, 2019, 15:34:49

@user-f3a0e4 you can find pupil_player_settings in your user folder

user-f3a0e4 20 June, 2019, 15:38:58

@papr amazing that worked! thanks!

user-f3a0e4 20 June, 2019, 15:46:56

@papr are you able to help with my other issue? Today, for seemingly no reason, calibrating when using pupil mobile is always unsuccessful. This cannot be because of pupil detection or positioning because when performing the same calibration using the local USB the calibration is perfect?

user-f3a0e4 20 June, 2019, 15:47:17

The error for calibration is:

WORLD: collected 0 monocular calibration data
WORLD: Collected 0 binocular calibration data
WORLD: not enough ref point or pupil data available for calibration

user-a21e3c 20 June, 2019, 15:49:22

What does "Surface Cache is not Build Yet" mean?

user-a21e3c 20 June, 2019, 16:19:19

I was working on generating heatmaps for a series of surfaces in a single video in Player. Everything was fine at first until I hit the 1 minute mark, at which point I had no gaze data for any part of the video afterwards. I tried playing around with a few of the settings, and now I lost the gaze data from the first minute as well. No green circles or fixation points at any point during the video anymore. Reopening the video in a separate Player tab does not restore the data. What can I do?

papr 20 June, 2019, 16:21:11

@user-f3a0e4 Please make sure the Time Sync plugin is active before calibrating.

papr 20 June, 2019, 16:24:02

@user-a21e3c I am not sure. We will be releasing a new version of the surface tracker this week. It might solve your first problem. Not sure about the gaze data. Are you using offline calibration?

user-11498a 20 June, 2019, 16:24:38

@papr we record on a surface pro 6 laptop

user-a21e3c 20 June, 2019, 16:25:38

Yes I am. Gaze data is restored when I select Gaze Data from Recording, but every fixation point is about an inch off of where it should be. Offline Calibration was far more accurate while it was working, so I chose that instead.

I'm also seeing under the Offline Calibration tab that the Detection Progress is stuck at 100% and never says "Complete". Does that tell you anything?

user-a21e3c 20 June, 2019, 16:26:02

*an inch of the screen, not an inch of the real world

papr 20 June, 2019, 16:29:23

@user-11498a Could you share an example recording with [email removed] that makes clear that the audio is offset?

papr 20 June, 2019, 16:29:37

@user-a21e3c Can you try recalculating?

user-a21e3c 20 June, 2019, 16:42:39

Yes, does not appear to do anything. Here is the terminal log:

player - [INFO] gaze_producers: Calibrating section 1 (Unnamed section) in 3d mode...
Calibration Section 1 - [INFO] calibration_routines.finish_calibration: Dismissing 2.99% pupil data due to confidence < 0.80
Calibration Section 1 - [INFO] calibration_routines.finish_calibration: Collected 657 monocular calibration data.
Calibration Section 1 - [INFO] calibration_routines.finish_calibration: Collected 0 binocular calibration data.
Ceres Solver Report: Iterations: 6, Initial cost: 1.950097e+01, Final cost: 1.618617e+01, Termination: CONVERGENCE
Calibration Section 1 - [INFO] gaze_producers: Offline calibration successful. Starting mapping using monocular 3d model.
player - [INFO] gaze_producers: Cached offline calibration data to C:\Users\Coltan\recordings\TestingWithCode\000\offline_data\offline_calibration_gaze
player - [INFO] offline_surface_tracker: Gaze postions changed. Recalculating.
player - [WARNING] offline_surface_tracker: No gaze on any surface for this section!
player - [INFO] fixation_detector: Gaze postions changed. Recalculating.
Fixation detection - [INFO] fixation_detector: Starting fixation detection using 3d data...

^Last line seems to imply some process is incomplete but running, but it's been sitting like that for almost 15 minutes now with no updates.

papr 20 June, 2019, 16:44:02

Still no gaze data visible? What is your mapping range set to?

user-a21e3c 20 June, 2019, 16:52:08

No, no gaze data. I have three sections in the video, each of which consists of a calibration and a surface. Currently I'm only working with the first one for simplicity, so the calibration range is the first 15 seconds or so, and the mapping range is everything after that until the next section's calibration

user-a21e3c 20 June, 2019, 16:52:35

Gaze data randomly reappeared a few minutes ago for the first section only, but disappeared when I tried to calibrate the second section

user-a21e3c 20 June, 2019, 16:57:31

Okay, now I have the gaze data for the first section's surface but nowhere else. The heatmap generated nicely; not sure how that could have happened, though, if gaze data is not available for the calibration.

papr 20 June, 2019, 16:58:07

@user-a21e3c Maybe it could help if you watch our youtube tutorials on offline calibration. Did you watch them already?

user-a21e3c 20 June, 2019, 16:59:41

Are you referring to these?

Chat image

papr 20 June, 2019, 16:59:56

Yes πŸ™‚

user-a21e3c 20 June, 2019, 17:00:24

Okay haha, yes they did help! But they helped me get to where I am now, and can help no more

papr 20 June, 2019, 17:01:35

Can you extend the timelines at the bottom. They should show the current setup of the sections. Could you make a screenshot of that?

user-a21e3c 20 June, 2019, 17:04:58

Chat image

user-a21e3c 20 June, 2019, 17:05:53

There is nothing below eye0 FPS

papr 20 June, 2019, 17:12:04

It does not look like the gaze mapping ranges are set up correctly

papr 20 June, 2019, 17:13:26

Actually, it looks like you have no gaze mapper set up at all

papr 20 June, 2019, 17:14:09

@user-a21e3c In the gaze mapper menu, click: New Gaze Mapper

user-a21e3c 20 June, 2019, 17:14:48

Where is the gaze mapper menu?

papr 20 June, 2019, 17:15:31

open the offline calibration menu. it has three sub menus: references, calibrations, gaze mappers

papr 20 June, 2019, 17:15:41

open the gaze mapper section

user-a21e3c 20 June, 2019, 17:16:53

Hm I don't see that. Let me just send you a few screenshots of what I do see

user-a21e3c 20 June, 2019, 17:19:25

Chat image

user-a21e3c 20 June, 2019, 17:19:32

Chat image

user-a21e3c 20 June, 2019, 17:20:58

I am now realizing this is version 1.10. Perhaps I should update to 1.12 before attempting to troubleshoot further?

papr 20 June, 2019, 17:22:02

Yeah, I just wanted to note that. The videos refer to v1.11 and higher

user-a21e3c 20 June, 2019, 17:22:24

Okay I will do that and try again

user-a21e3c 20 June, 2019, 17:35:08

Another side question, is there any advantage to inverting the eye camera image when using Capture?

user-a21e3c 20 June, 2019, 17:35:33

Or I guess it says "Flip image display" in version 1.12 now

papr 20 June, 2019, 17:37:54

@user-a21e3c no

papr 20 June, 2019, 17:38:39

This is just for visualization purposes

user-f3a0e4 21 June, 2019, 08:02:38

@papr Thank you. Reactivating the time sync plugin fixed the issue πŸ˜€

user-5c16b8 21 June, 2019, 10:45:17

Hi all, we use the pupil-labs eye tracker (200Hz) in complete darkness and we're struggling to get it to work properly. With the lights turned on it works perfectly, no problems with the detection of the pupil. However, when the pupil dilates due to the complete darkness, the pupil is not detected anymore or the confidence drops below 0.3. We made sure to set the pupil size of the 3D detector large enough, so that should not be the issue. For some participants we even observe a partly "bright" pupil. With our old 120Hz pupil-labs headset we did not have this issue. Is there a way to solve this? I have to note that we use version 1.8 at the moment, because we wanted to keep settings similar between different experiments.

user-5c16b8 21 June, 2019, 10:45:47

Screenshot of the "bright" pupil effect

Chat image

papr 21 June, 2019, 10:56:35

@user-5c16b8 please try to run it with the 320x240 resolution

user-5c16b8 21 June, 2019, 10:58:00

@papr It's the new 200 Hz cam, so if I remember it correctly it only has 400 x 400 or 192 x 192, right?

papr 21 June, 2019, 11:01:38

@user-5c16b8 correct! Please use 192x192 in this case

user-5c16b8 21 June, 2019, 11:04:40

Ok I'll try it. The only issue with that approach is that we're also working on estimating torsion based on the eye images and that won't work with the lower spatial resolution. But for now it might be a solution. I'll let you know whether it works.

papr 21 June, 2019, 11:05:37

@user-5c16b8 I see! That will be likely a problem indeed.

user-5c16b8 21 June, 2019, 11:31:39

@papr It appears to track more stable with the lower spatial resolution, but we need to test it a bit more to be sure. And it's only a temporary solution, because we'd really like to include the torsion measurements

user-4bf830 21 June, 2019, 16:21:17

Hey, all! I'm brand spanking new to Pupil and I have no previous eye tracker experience. I'm trying to set up a Pupil Mobile headset for my researching professor but I can't seem to get the cameras anywhere near my eyes. It's like they're too low. I have tried the camera arm extenders and a bunch of different angles, but the only time the cameras even get my eye in frame is when I have the headband held up about an inch or more above my ears. Any suggestions?

user-20faa1 21 June, 2019, 18:40:33

Has anyone used the moto Z3 play with the Pupil Mobile with success? I know that previous versions of the moto Z* play have worked.

papr 21 June, 2019, 18:55:51

@user-20faa1 yes, if I am not mistaken we are shipping them as part of the bundle

papr 21 June, 2019, 18:56:04

@user-20faa1 you might need to enable the OTG settings

user-20faa1 21 June, 2019, 20:22:47

@papr Thanks! I thought I saw that the OnePlus 6 was shipping as part of the bundle, but I wasn't sure about the moto Z3 play.

user-14d189 22 June, 2019, 02:49:14

@user-20faa1 the Z3 works good.

user-14d189 22 June, 2019, 02:51:18

A question about the surface tracker: what's the minimum resolution it can be used at? If I set up the markers in the environment, how many pixels does each marker need to cover?

user-14d189 22 June, 2019, 02:52:27

The reason is we work with a very low-resolution cam, and over time I need to control for head movement.

user-14d189 22 June, 2019, 02:55:49

The cam resolution is only 224x171; a 10x10 pixel patch corresponds to 2.5 x 2.5 degrees.

user-ede53a 22 June, 2019, 17:45:51

Can the Pupil Labs team assist us with custom-made cameras if we order enough of them, and if yes, what will be the cost per camera?

wrp 23 June, 2019, 08:52:31

Hi @user-ede53a send us an email so we can get a better idea of what you need

papr 24 June, 2019, 08:41:57

@user-20faa1 Actually, I was mistaken. We started shipping the OnePlus 6 as part of the bundle. We used to ship the Moto Z3 Play though. Both should work fine with Pupil Mobile.

user-8779ef 24 June, 2019, 14:08:58

Hey folks - a few weeks ago you updated HMD eyes to use new 200 Hz cameras. Will the mobile tracker be redesigned to use them as well?

user-8779ef 24 June, 2019, 14:15:24

...and, no more educational discount on the webstore?!?!

user-1146c4 24 June, 2019, 15:19:06

Hey guys, I am currently using the Pupil Labs devices for a research project and I am pretty amazed and astonished so far. However, I would very much appreciate the opportunity to get to know some other users and exchange experiences concerning the setup and usability in certain situations. Would anybody be willing to chat and exchange experiences? PM me please in German or English (German-based research assistant). ✌

user-8fd8f6 24 June, 2019, 19:07:38

@papr Hi, I have a question about Player 1.13. The offline eye movement detector classifies fixations, saccades, and so on, and we also have the offline fixation detector (where we can define the fixation parameters). Do the two differ in the fixations they detect? What is the definition of a fixation in the offline eye movement detector?

user-f3a0e4 24 June, 2019, 19:47:49

Hi guys. I am trying to install the Pupil apps on my MacBook Air, but when attempting to open the downloaded files I get an error stating "β€œPupil Player” is damaged and can’t be opened. You should eject the disk image."

user-f3a0e4 24 June, 2019, 19:47:54

Anyone familiar with this?

user-8fd8f6 24 June, 2019, 21:21:07

@papr In Player 1.13, I think the frame numbers in the offline eye movement detector are wrong.

papr 25 June, 2019, 02:33:38

@user-8fd8f6 check out the NSLR paper for details on how the eye movement detector classifies eye movements.

papr 25 June, 2019, 02:34:34

@user-8fd8f6 also, could you please let @user-764f72 know what you mean by wrong frame numbers?

user-20faa1 25 June, 2019, 02:36:00

@papr Thanks! Has anyone run any battery tests with Pupil Mobile on the Z3 Play? How long did it last? If the battery does die, how does Pupil Mobile save the data? Is the full recording corrupted/lost if the battery dies?

papr 25 June, 2019, 02:37:38

@user-20faa1 I do not remember the numbers, but it was well over an hour. Generally, we recommend making multiple shorter (~20-minute) recordings.

user-20faa1 25 June, 2019, 02:42:56

Interesting. Our tasks are fairly long, with 1.5-2 hours of recordings per participant per visit. We might have to look at using multiple phones or the moto power pack mod.

papr 25 June, 2019, 03:50:48

@user-20faa1 I do not have the exact numbers in mind. I deliberately underestimated so as not to make further false statements πŸ˜‰ Maybe @user-755e9e knows the exact numbers.

user-755e9e 25 June, 2019, 06:56:48

@user-20faa1 if you are using a Moto Z3 Play connected to a Moto Mods Power Pack, the battery should last 2 to 2.5 hours.

user-f3a0e4 25 June, 2019, 07:42:47

@papr can anyone help me with installing the Pupil apps on a Mac? I have entered all the commands listed in the master docs into my terminal, but I still get a message that the apps are damaged and can't be opened. Surely someone else has come across this?

user-f3a0e4 25 June, 2019, 11:10:43

the main issue appears to be with downloading pyuvc

user-8fd8f6 25 June, 2019, 13:58:59

@user-764f72 Hi, I checked v1.13. The frame range has a problem: as you can see in the picture, the video is at frame 143, but the results show a frame range between 814 and 816.

Chat image

user-8fd8f6 25 June, 2019, 14:01:01

@papr Did you change the offline fixation detector in the new version too? The results are not the same as in the previous version.

papr 25 June, 2019, 14:01:45

@user-8fd8f6 The only thing we did was to adjust the default parameters

papr 25 June, 2019, 14:02:02

It did not change in functionality

user-8fd8f6 25 June, 2019, 14:05:23

The method in the previous version was based on pupil data; in the new version it is based on gaze data. Are they the same in functionality?

user-8fd8f6 25 June, 2019, 14:09:55

@papr

user-20faa1 25 June, 2019, 15:38:32

@user-755e9e Thanks! Have you tested what happens to recordings if recording locally to a phone via Pupil Mobile and the battery dies? Is the full recording corrupted/lost?

papr 25 June, 2019, 16:02:09

@user-8fd8f6 Previously, it would choose depending on the available data. In order to be consistent with the eye movement detector, we have decided to base both calculations on gaze, since that is applicable to both 2d and 3d pupil data.

Regarding the difference between the pupil- and gaze-based dispersion calculations:

- Pupil method: requires 3d pupil data with a well-fit eye model. One eye is chosen and the dispersion is calculated based on the eye model posture.
- Gaze method: requires calibrated gaze data. Uses the world camera intrinsics to deproject 2d gaze into 3d cyclopean gaze vectors and uses these for the dispersion calculation (see the sketch below).

papr 25 June, 2019, 16:21:35

@user-8fd8f6 short update: @user-764f72 was able to reproduce the issue. We will be working on a solution to it.

papr 25 June, 2019, 16:32:39

@user-20faa1 Currently, we do not have any explicit fail-safe for low-battery shutdowns. This is something we will work on. For now, we recommend checking the battery level regularly. Apologies for the inconvenience.

user-20faa1 25 June, 2019, 17:22:06

@papr Thanks for letting me know and being super helpful per usual!

user-6691cc 25 June, 2019, 17:34:30

Hi

user-6691cc 25 June, 2019, 17:36:01

I am a student at a university in Canada, and we are looking into getting your eye tracker for visual search tasks. Do you have any papers I can look at that have used your product for visual search?

papr 25 June, 2019, 17:37:22

@user-6691cc Welcome to the channel. Please see our citation list: https://docs.google.com/spreadsheets/u/2/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?usp=drive_web&ouid=115812273187120893285

user-6691cc 25 June, 2019, 17:38:20

Thank you

user-8779ef 25 June, 2019, 18:26:14

@wrp @mpk Guys, I'm not sure if this is intentional, but the academic discount for the Pupil Core / Invisible is not well advertised. It's only visible in the cart! If the goal is to incentivize more purchases from academia, well, you won't get that effect unless the discount is made more apparent.

wrp 25 June, 2019, 19:19:12

Ok, thanks for the feedback @user-8779ef - we will make it more explicit

user-02665a 26 June, 2019, 08:36:19

Congrats on the new eye movement detector!

user-02665a 26 June, 2019, 09:13:29

Just a small question: is there some way to do batch processing? I couldn't find much in the documentation.

user-adf88b 26 June, 2019, 17:07:22

Is there any way to fix this damage?

Chat image

user-adf88b 26 June, 2019, 17:07:48

I have soldering stuff but it looks like the wire frayed from the connection point

papr 26 June, 2019, 17:11:07

@user-adf88b please contact info@pupil-labs.com and attach the above image. Our support will come back to you. πŸ‘

papr 26 June, 2019, 17:13:27

@user-02665a There is no batch export built into Player, but there are scripts that can export recorded data from multiple recordings without starting Player. Please check the Pupil-community repository for links.
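
If only batch post-processing of already-exported data is needed, a simple loop over recording folders also works. The sketch below assumes each recording has already been exported once from Player so that an exports/<nnn>/gaze_positions.csv exists per recording; the root path, folder layout, and column names are assumptions to adapt.

```python
from pathlib import Path
import pandas as pd

# Sketch: aggregate already-exported gaze data across many recordings.
# Assumes each recording was exported once from Pupil Player, so that
# <recording>/exports/<nnn>/gaze_positions.csv exists. Adjust the root
# path, glob pattern, and column names to match your own exports.

recordings_root = Path("/path/to/recordings")   # hypothetical root folder

summaries = []
for csv_path in sorted(recordings_root.glob("*/exports/*/gaze_positions.csv")):
    df = pd.read_csv(csv_path)
    summaries.append({
        "recording": csv_path.parents[2].name,   # name of the recording folder
        "n_samples": len(df),
        "mean_confidence": df["confidence"].mean(),
    })

print(pd.DataFrame(summaries))
```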

user-adf88b 26 June, 2019, 17:20:28

Another question: I have an existing HoloLens project and was initially thinking of using hmd-eyes. Will I need to go through the getting started guide for Pupil too?

papr 26 June, 2019, 17:30:34

@user-adf88b You will need to run Pupil Capture/Service, if you are referring to that.

user-adf88b 26 June, 2019, 17:33:42

That's the thing I don't really understand: how is the HoloLens supposed to interact with the headset?

papr 26 June, 2019, 17:36:10

@user-adf88b The eye cameras are connected to the computer running Pupil Capture. Pupil Capture estimates gaze and publishes it in real time over the network API. This data can be accessed by applications running on the HoloLens to drive interactions.
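
For reference, receiving that real-time gaze stream on the client side typically looks like the sketch below, using the ZMQ/msgpack network interface that Pupil Capture exposes. The IP address is a placeholder; 50020 is the default Pupil Remote port, but check your own Capture settings.

```python
import zmq
import msgpack

# Minimal sketch: subscribe to real-time gaze data published by Pupil Capture
# over its network API (ZMQ + msgpack). Replace 127.0.0.1 with the address of
# the machine running Capture; 50020 is the default Pupil Remote port.

ctx = zmq.Context()

# Ask Pupil Remote for the port of the data publisher
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to all gaze topics
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.loads(payload, raw=False)
    # norm_pos is the gaze point in normalized world-image coordinates
    print(topic.decode(), gaze["norm_pos"], gaze["confidence"])
```

A client on the HoloLens side (e.g. via hmd-eyes) consumes the same stream over the network rather than from a local process.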

user-adf88b 26 June, 2019, 17:36:55

Ah, so for Pupil Labs to function with the HoloLens, an additional computer is needed. That makes sense, thanks.

user-28fba0 27 June, 2019, 02:36:29

What is the field of view of the 120 Hz eye camera?

papr 27 June, 2019, 14:14:37

@user-28fba0 I do not know that number by heart.

user-4bf830 27 June, 2019, 15:38:11

Hey all! I’m opening my eye 0 window in pupil capture and it’s suuuuuper slow and laggy. It keeps freezing and trying to close on me. Any suggestions?

papr 27 June, 2019, 15:45:31

@user-4bf830 Hi, which version of Capture and operating system do you use? Also, can you check if the eye camera is properly connected to the rest of the headset?

user-4bf830 27 June, 2019, 15:47:06

I’m on a Windows computer with the last pupil download that was posted, is that 1.13?

papr 27 June, 2019, 15:47:43

@user-4bf830 ok, thank you

user-4bf830 27 June, 2019, 15:47:45

Camera is attached, yes.

papr 27 June, 2019, 15:48:32

@user-4bf830 Could you share the capture.log file in the pupil_capture_settings folder?

user-4bf830 27 June, 2019, 15:49:22

Not at the moment, as my entire computer has launched into an update. πŸ˜‚ I’ll let this update download and then try and run it, you’ll hear from me here again if it’s still being fussy.

papr 27 June, 2019, 15:49:55

@user-4bf830 Good luck with that update πŸ™‚

user-4bf830 27 June, 2019, 15:50:01

Thanks!

user-76a6ff 28 June, 2019, 08:26:31

Hello, we noticed that in the new version (v1.13) of the Pupil Labs software, the remove surface button has disappeared. Does someone know how to remove surfaces now? Thank you

user-adf88b 28 June, 2019, 11:24:25

@papr Is it possible to use the current GitHub version of hmd-eyes with Unity 2018.3?

user-adf88b 28 June, 2019, 11:26:33

Or do I have to downgrade my Unity version?

user-6cbd99 28 June, 2019, 11:27:50

I'm using it with 2019.1 and it works fine.

user-adf88b 28 June, 2019, 12:14:12

Odd

user-adf88b 28 June, 2019, 12:14:23

Is it the HoloLens version?

user-6cbd99 28 June, 2019, 12:14:35

No, Vive.

user-adf88b 28 June, 2019, 12:14:46

Damn

user-adf88b 28 June, 2019, 13:16:03

The Vive version is up to date

user-60c798 28 June, 2019, 17:46:17

Anyone from Brazil here who has built the DIY headset from the guide?

user-14d189 29 June, 2019, 03:36:56

Congrats, Pupil Labs, on your new web outfit! Looks good!

End of June archive