πŸ‘ core


mpk 01 June, 2020, 06:39:36

@user-7d0b66 all you need to do is make sure that the world window from the app is not visible in the video.

user-370594 01 June, 2020, 11:49:36

Hi! Recently you told me that the function to count saccades and other eye movements was removed from the latest version. When are you going to add it back?

user-b0c902 01 June, 2020, 12:54:49

Hi, I was wondering if someone could help me: where can I find the gaze frames that are missing from the gaze report? E.g. the .npy file and the video have the same number of frames, but the gaze report doesn't.

user-7d0b66 01 June, 2020, 13:27:03

@mpk Yes, I tried to do that. I tried recording and then switching the world window to something else, but even then the tracking was very inconsistent, so I wasn't able to generate a proper heatmap. I don't know what I'm doing wrong.

user-a10852 01 June, 2020, 17:26:46

@user-c5fb8b changing the codec to MJPEG and reducing the size to under 400x400 did the trick, looks like it is working now with the exception that the video playback bar has a different length and frame index than the source eye video (not a huge issue - the offline analysis still works, maybe has something to do with the fake world file that is generated?) Thanks again for all the help!

user-c5fb8b 02 June, 2020, 06:55:39

Hi @user-7daa32

> I can find analysis plugin in pupil player.
Which analysis plugin are you looking for? Note that you might have to enable the plugin first from the Plugin Manager on the right.

> Please why is the config. graph covers with black color? the markers blinked with black color
The black text on the system graph is a bug caused by the blink detector. We are working on a fix for it. If you disable the blink detector, you should not experience this issue. I don't understand "the markers blinked with black color". Are you experiencing more black artifacts in another context?

user-7daa32 02 June, 2020, 06:58:46

> Hi @user-7daa32 Which analysis plugin are you looking for? Note that you might have to enable the plugin first from the Plugin Manager on the right.

> The black text on the system graph is a bug caused by the blink detector. We are working on a fix for it. If you disable the blink detector, you should not experience this issue. I don't understand "the markers blinked with black color". Are you experiencing more black artifacts in another context?
@user-c5fb8b Thank you. Once that has been fixed, I will download a new version of the Pupil software, right? I am trying to say that I can't see the analysis plugin. Someone told me I have to do data analysis with Python or Matlab.

user-c5fb8b 02 June, 2020, 07:01:53

Hi @user-370594 currently we don't have any plans for adding another eye movement detector/classifier to Pupil. From our understanding, the old one performed very poorly, and we don't have a better alternative available. If you want to give the old one a try, you can download Pupil v1.16, which was the last version with the plugin enabled.

user-c5fb8b 02 June, 2020, 07:05:12

@user-7daa32 I can only highly recommend that you read through the documentation for Pupil at https://docs.pupil-labs.com/core/ Pupil Player offers a set of common analysis plugins that are useful for standard eye tracking studies. You will find a list of all available plugins in the docs as well: https://docs.pupil-labs.com/core/software/pupil-player/#plugins If you need more elaborate data analysis, you can always export the raw data from Pupil Player and import it into the data analysis solution of your choice (Matlab, Python, Excel, ...). Please be aware that Pupil does not offer a general purpose data analysis platform.
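[Editor's note: as a minimal sketch of the export-then-analyze workflow described above, a Pupil Player gaze export can be read with the Python standard library alone. The CSV content below is a small synthetic stand-in; a real export lives under the recording's exports folder and contains many more columns.]

```python
# Sketch: load a gaze export and keep only confident samples.
# The CSV text is synthetic; a real file would be opened from disk instead.
import csv
import io

sample = io.StringIO(
    "world_index,confidence,norm_pos_x,norm_pos_y\n"
    "0,0.98,0.51,0.49\n"
    "0,0.95,0.52,0.50\n"
    "1,0.30,0.10,0.90\n"
)
rows = list(csv.DictReader(sample))

# A common first step: discard low-confidence samples before any analysis.
confident = [r for r in rows if float(r["confidence"]) > 0.6]
print(len(confident))  # 2 of the 3 samples survive the filter
```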

user-c5fb8b 02 June, 2020, 07:07:00

Hi @user-b0c902, which npy and video are you talking about? The eye video? Pupil Player always maps all eye data to the world video, so data before the first world frame or after the last world frame will be excluded from exports. Would that explain the "missing gaze frame" you are experiencing?

user-c5fb8b 02 June, 2020, 07:15:03

@user-a10852 Please try deleting the world lookup file. It might have been generated based on the wrong timing information from the old H.264 eye video. Other than that, the frame index won't match, since everything in Player is relative to the world video, not the eye video. Since a missing world video is not the common case, we currently just generate a fake world video with a fixed 30 FPS. Since your eye video has a higher FPS, multiple frames from the eye video will be mapped to a single frame in the world video. If you run Pupil from source, you could override the default framerate here: https://github.com/pupil-labs/pupil/blob/b3dda8cba54b2eff9503faa66e126bfce3c7b952/pupil_src/shared_modules/video_capture/file_backend.py#L287

user-b0c902 02 June, 2020, 10:09:51

@user-c5fb8b So I have looked at the number of frames in the world.mp4 video and its corresponding .npy file. Those two seem to have the same number of frames, which makes sense. However, once I process the recording in Pupil Player and export the gaze_positions.csv report, there are some world_index frames missing. If you look at the attachment, you can see that there are world indices 4-7 but not 8-9, and I'm curious about those missing world indices.

gaze_position.csv

user-c5fb8b 02 June, 2020, 10:12:22

@user-b0c902 well it seems there is no gaze available on world frames 8-9.
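[Editor's note: a quick way to list world frames that have no gaze samples at all, assuming the standard world_index column of the gaze export. The numbers below are a synthetic example matching this thread.]

```python
# Sketch: find world frame indices that have no gaze samples.
# gaze_world_indices would normally come from the world_index column of
# gaze_positions.csv; n_world_frames from the world video / timestamps file.
gaze_world_indices = [0, 1, 2, 3, 4, 5, 6, 7, 10, 11]  # synthetic
n_world_frames = 12

missing = sorted(set(range(n_world_frames)) - set(gaze_world_indices))
print(missing)  # [8, 9]
```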

user-c5fb8b 02 June, 2020, 10:12:45

@user-b0c902 can you share the pupil_positions.csv as well?

user-b0c902 02 June, 2020, 13:38:40

I have attached both - pupil position as well as blinks, just in case

OneDrive_1_02-06-2020.zip

user-c5fb8b 02 June, 2020, 13:46:11

@user-b0c902 there are a lot of world frames without any pupil data. Normally this shouldn't happen, is there any special setup that you are running for your recording? Can you share the entire recording with us? The best way would be to share it with data@pupil-labs.com via some file-sharing service like OneDrive/GoogleDrive/...

user-b0c902 02 June, 2020, 14:23:35

@user-c5fb8b We run our studies in the participants' home environment while they play with their children, so I can understand if the eye tracker loses the pupil at some points. That is why I was wondering if there was an exported file showing when it loses the pupil. I am not sure we can share the video recording because of the nature of the study, but I will check and send it to the above-mentioned email address if I can. Thanks @user-c5fb8b

user-c5fb8b 02 June, 2020, 14:41:44

@user-b0c902 it's not that the eye tracker loses the pupil, but that there is no data available. pupil_positions.csv contains all available data, including from when no pupil was found in the eye image (the confidence will just be close to 0). This seems to indicate that there were a lot of dropped frames from the eye cameras. If this only happened a couple of times, it might indicate a hardware disconnect, but in your recording there seems to be a systematic drop of eye video frames throughout the entire recording. My best guess would be that you are running Pupil on a computer with insufficient hardware. A couple of questions:

1. Do you know what hardware specs you have available there?
2. Are you experiencing this issue in all of your recordings consistently?
3. Can you observe the CPU graph in Pupil while recording?

user-b0c902 02 June, 2020, 14:50:08

Is the confidence being close to 0 the only indicator of dropped frames, or is there another way to determine it? Or do you count the frames in the world video and look at the number of timestamps? If so, is there any way to know which video images match up with which timestamps?

user-b0c902 02 June, 2020, 14:50:37

The answer to points 2 and 3 is yes. I will find out the hardware specs.

user-b0c902 02 June, 2020, 15:02:15

@user-c5fb8b I have attached an image for the specs

Chat image

papr 02 June, 2020, 15:03:38

@user-b0c902 Are you recording to the internal storage or to an external disk?

user-c5fb8b 02 June, 2020, 15:05:28

@user-b0c902 the confidence has nothing to do with frame drops. It's an indicator of how good the pupil detection was for an existing frame.

Normally you would expect a consistent number of pupil data for every world frame, since both eye and world run at fixed frame rates. E.g. if world runs at 30FPS and eye at 120FPS you should get 4 pupil data entries on average (per eye) per world frame. Since most of your world frames have ~4 matching frames and there are only a few parts without matching frames, I assume there were frames dropped.

Regarding matching timestamps to frame numbers: in the export file world_timestamps.csv you find a list of all timestamps. The index in this file corresponds to the frame number in the world video. If you enable the eye video exporter you will get a corresponding file for the eye video. Please note that the exported CSV files are always relative to the exported videos, which are truncated to the trim marks that you set in Player.
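[Editor's note: the frame-drop reasoning above can be sketched numerically. Since a camera at a fixed rate should tick at a fixed interval, gaps in the timestamp stream reveal dropped frames. The timestamp list below is synthetic, with frames 4-6 missing from a nominal 120 Hz stream.]

```python
# Sketch: detect dropped eye-camera frames from gaps between timestamps.
# Timestamps are in seconds; a 120 Hz camera should tick every 1/120 s.
nominal_fps = 120
expected_dt = 1.0 / nominal_fps

ts = [0 / 120, 1 / 120, 2 / 120, 3 / 120, 7 / 120, 8 / 120]  # synthetic

drops = []
for prev, cur in zip(ts, ts[1:]):
    gap = cur - prev
    if gap > 1.5 * expected_dt:
        # number of frames that should have arrived inside this gap
        drops.append(round(gap / expected_dt) - 1)
print(drops)  # [3] -> one gap, three frames missing
```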

user-c5fb8b 02 June, 2020, 15:07:25

@user-b0c902 also can you check the FPS of the eye video while recording?

user-b0c902 02 June, 2020, 15:09:08

@papr internal storage for both laptop and mobile phone

papr 02 June, 2020, 15:09:59

Mobile phone?

papr 02 June, 2020, 15:10:10

Is this a pupil mobile recording?

user-b0c902 02 June, 2020, 15:10:29

the gaze and pupil position reports are from the laptop

user-b0c902 02 June, 2020, 15:10:46

but we do record on the mobile phones as well and stream it on the laptop

papr 02 June, 2020, 15:11:20

Ok, but the recording in question is not such a recording that contains streamed data? Or is it?

papr 02 June, 2020, 15:11:41

Because that would explain the eye frame drops.

user-b0c902 02 June, 2020, 15:12:06

@user-c5fb8b I think the FPS for the eye was 120 and the world 30, but I will check again. If I am not mistaken, world_timestamps.csv is generated by a recent version of Pupil Player, correct?

user-b0c902 02 June, 2020, 15:12:49

@papr yes, it does contain streamed data.

papr 02 June, 2020, 15:17:16

@user-b0c902 In this case, it is very expected that you see frame drops as the video streaming does not guarantee you that all frames are being transmitted, especially if the wifi bandwidth is temporally limited by other traffic.

user-b0c902 02 June, 2020, 15:32:15

Fair enough. So we are basically trying to extract frame-by-frame data for each participant; it is a 10-minute recording. We are extracting frames of the world video and the gaze report to match up where they are looking, and the gaze report is not matching up to the frames in the 10-minute world video. Do you know what would be the best way to solve this for data analysis? Also, what does pts mean in world_timestamps.csv?

user-c5fb8b 02 June, 2020, 15:49:55

@user-b0c902 the more stable workflow would be to not record data that has been streamed, but record directly on the phone instead. This means your recording won't contain pupil or gaze data yet and you will have to do this with offline pupil detection and gaze mapping. For this you would need to include the calibration procedure in your recording (which we recommend in general).

You can still connect Pupil Capture over network in order to monitor e.g. the calibration procedure. You can also start the recording on the phone via Pupil Capture through the Remote Recorder plugin, which might be helpful.

If by any chance you also recorded on the phone so far, you can just pop that recording into Pupil Player and run the offline pipeline for pupil detection and gaze mapping. As I said, this will only work if you included the calibration procedure in your recording.

user-c5fb8b 02 June, 2020, 16:01:25

@user-b0c902 the "best way for data analysis" certainly depends a lot on what you are trying to accomplish. I'm afraid we cannot really help you with that. If you need gaze data for every world frame, then I'm afraid your recording does not offer this. Maybe you could think about some form of interpolation in order to compensate for missing data?

PTS are presentation timestamps and correspond to the time in the actual exported video file. In 99% of cases this should not be relevant for you.
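[Editor's note: one simple form of the interpolation suggested above is linear interpolation between the nearest world frames that do have gaze data. The numbers are purely illustrative; real gaze has both x and y, and interpolation is only sensible over short, high-confidence gaps.]

```python
# Sketch: linearly interpolate a gaze value for world frames without data.
# known maps world frame index -> gaze x in normalized coordinates (synthetic).
known = {7: 0.40, 10: 0.70}

def interp_gaze(frame, known):
    lo = max(i for i in known if i <= frame)  # nearest known frame before
    hi = min(i for i in known if i >= frame)  # nearest known frame after
    if lo == hi:  # frame has real data
        return known[lo]
    t = (frame - lo) / (hi - lo)
    return known[lo] + t * (known[hi] - known[lo])

print(interp_gaze(8, known))  # roughly 0.5
```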

user-06ed20 02 June, 2020, 16:38:23

Can someone please point me to where I can buy replacement lenses for the eye cameras? EU or Germany would be preferred. πŸ˜‡

papr 02 June, 2020, 16:41:23

@user-06ed20 I would recommend writing an email to info@pupil-labs.com including your question and your eye camera type in this regard.

user-7d0b66 02 June, 2020, 17:18:01

Hello, I'm still having trouble generating a heatmap as the tracking seems to be very inconsistent, it keeps blinking. I've tried having a different background on my screen but the problem persists. Does anyone know what's going on?

papr 02 June, 2020, 17:19:46

@user-7d0b66 Sorry if we have asked for it already: Were you able to share an example recording with [email removed] such that we can have a closer look?

papr 02 June, 2020, 17:21:07

Ideally, you could name a specific world frame in that recording in which you would expect the tracking to work but it does not.

user-7daa32 02 June, 2020, 17:37:07

Can we email a link to the pupil player video for you to check?

papr 02 June, 2020, 17:40:01

@user-7daa32 I had a look at the video. What should I look for specifically? What I noticed is the low recorded frame rate.

user-7daa32 02 June, 2020, 17:43:09

I'm only testing it; I have not started the research proper. I just want to understand it before doing a preliminary study. Can I send the whole recording? I don't really understand the results. The pupil diameter I got will just be for one subject, which can be compared to other subjects, right? What if I want to compare pupil dilations among different AOIs?

user-7daa32 02 June, 2020, 17:47:57

> @user-7daa32 I had a look at the video. What should I look for specifically? What I noticed is the low recorded frame rate.
@papr the heatmap for each surface is a bluish blank, with no color gradient or anything like that. I just want to know if that recording was okay.

papr 02 June, 2020, 17:49:46

@user-7daa32 At 0:03 you can see that the heatmap is not set up correctly. Click "edit surface" in the top right and adjust the handles to contain the complete display. Also, in the menu on the right, set a size (e.g. the resolution of your display) for the heatmap.

user-7daa32 02 June, 2020, 17:50:54

> @user-7daa32 At 0:03 you can see that the heatmap is not set up correctly. Click "edit surface" in the top right and adjust the handles to contain the complete display. Also, in the menu on the right, set a size (e.g. the resolution of your display) for the heatmap.
@papr I will try this now. Also, I am sending some data; please look at it and let me know your thoughts.

papr 02 June, 2020, 17:51:29

I won't have time to look at it this evening. We will follow up in the coming days. πŸ‘

user-7daa32 02 June, 2020, 17:52:07

Okay. Thanks

user-3ede08 03 June, 2020, 09:09:17

I recently bought a Pupil Core headset. I am trying to install the software on my MacBook Pro, but I get an error: Pupil Service cannot be opened because Apple cannot check it for malware. The same error occurs for Pupil Capture and Service. How can I handle it?

papr 03 June, 2020, 09:10:15

@user-3ede08 Please right click the application and click Open. Afterward, you should see a similar dialogue but with the option to open it anyway

user-3ede08 03 June, 2020, 09:17:52

@papr, thanks for your prompt response. I am new here, so sorry if my questions are trivial.

user-3ede08 03 June, 2020, 09:18:10

@papr I still get the same error

user-3ede08 03 June, 2020, 09:32:16

@papr I got it, thanks

user-7d0b66 03 June, 2020, 19:00:53

@papr I sent an e-mail to data@pupil-labs.com yesterday but I didn't get a response yet.

papr 03 June, 2020, 19:02:18

@user-7d0b66 Could you let me know in a private message which email address you sent this email from?

papr 03 June, 2020, 19:21:15

@user-7d0b66 It looks like we have not received your email. Could you please check if the email was sent correctly and maybe attempt to resend it?

papr 03 June, 2020, 19:23:34

Generally, we try to respond to all emails within 3 business days or less.

papr 03 June, 2020, 20:16:25

@user-7d0b66 I can confirm that we have received your email. We will follow up via email.

user-7d0b66 03 June, 2020, 21:46:02

@papr Great, thanks!

user-7daa32 03 June, 2020, 22:24:57

Hello,

> @user-7daa32 I had a look at the video. What should I look for specifically? What I noticed is the low recorded frame rate.

@papr Do I need a high frame rate to get a good heatmap? Is a surface the same as an AOI? Why is it difficult to set up more than one surface? The edit lines always scatter when I try to set up an additional surface. Please can you send me a screenshot of a stimulus with more than one surface set up? Thanks

user-c5fb8b 04 June, 2020, 06:53:57

Hi @user-7daa32 the demo recording for Pupil Player (found in the docs about Pupil Player: https://docs.pupil-labs.com/core/software/pupil-player/) shows an example of a multi-surface setup for surface tracking. Here's the direct link to the recording: https://drive.google.com/file/d/1nLbsrD0p5pEqQqa3V5J_lCmrGC1z4dsx/view?usp=sharing

user-c5fb8b 04 June, 2020, 06:57:26

here's a screenshot:

Chat image

user-b0c902 04 June, 2020, 07:46:21

@user-c5fb8b thanks! this is helpful.

user-ae4005 04 June, 2020, 10:20:03

Hi. We're trying to run mouse_control.py under pycharm and it says it needs module "windows". When we try to install the module PyCharm says file not found. Do we need the module? Is this a problem you can help with?

papr 04 June, 2020, 10:24:04

@user-ae4005 Please try installing https://pypi.org/project/PyUserInput/ Also make sure that PyCharm is actually using the python environment in which you have installed the module.

user-ae4005 04 June, 2020, 10:25:01

@papr Thank you for the quick response! I'll try right now

user-ae4005 04 June, 2020, 10:29:11

@papr I'm getting this error now: Could not find a version that satisfies the requirement pyHook (from PyUserInput) (from versions: ) No matching distribution found for pyHook (from PyUserInput) Any idea why that is?

papr 04 June, 2020, 10:31:06

Do you get this during install or when running the script? Also, how do you install it? Using pip or pip3? What is your output of pip -V if you are using pip

user-ae4005 04 June, 2020, 10:31:34

@papr Do you guys have much experience with pupil under PyCharm? Should we be using a different package manager / python install?

user-ae4005 04 June, 2020, 10:32:04

During install. PyCharm has a module / environment manager that manages installs for each project. It seems to use pip.

user-ae4005 04 June, 2020, 10:32:40

Chat image

papr 04 June, 2020, 10:33:29

I will be away from keyboard for a while. I might be able to help when I am back.

user-ae4005 04 June, 2020, 10:34:02

πŸ‘πŸ™

papr 04 June, 2020, 11:45:43

@user-ae4005 It looks like you are using a virtual environment based on the python.exe path in your screenshot. Please make sure that you are using Python 3.6.1 or higher

papr 04 June, 2020, 11:46:11

My guess is that you are using Python 2.7 and that one of the requirements is not available for Python 2.7 anymore.

user-ae4005 04 June, 2020, 13:24:19

I just checked.. I'm using Python 3.7

user-ae4005 04 June, 2020, 13:25:01

I'm using PyCharm; would Conda be better, or should that not make a difference?

papr 04 June, 2020, 13:28:48

@user-ae4005 I have no personal experience with this setup. But it looks like pyHook has not uploaded any concrete distribution files to PyPI, which is why its installation fails. Instead they link here: https://sourceforge.net/projects/pyhook/files/pyhook/1.5.1/

papr 04 June, 2020, 13:29:13

You should be able to download the zip and install it via pip install <path to zip file>

user-ae4005 04 June, 2020, 13:29:56

I'll try. Thanks again!

user-ae4005 04 June, 2020, 14:30:38

So I finally managed to install pyHook and the code runs. However, nothing happens. When I go through it in debug mode, it stops at line 74 (topic, msg = sub.recv_multipart()) of mouse_control.py. Do you know what the problem could be? I'm completely new to Python and I'm not sure how to handle this...

papr 04 June, 2020, 14:34:08

@user-ae4005 Is pupil capture running? And have you set up a surface in the surface tracker plugin?

papr 04 June, 2020, 14:34:33

The surface should have the name screen
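[Editor's note: the essential computation behind mouse_control.py is mapping gaze on the tracked surface to a screen position. A rough sketch of that mapping is below; the function name is illustrative, not the script's actual code. Pupil surface coordinates are normalized with the origin at the bottom-left, while most OS screen APIs put the origin at the top-left, so the y axis must be flipped.]

```python
# Sketch: convert gaze in normalized surface coordinates to screen pixels.
# Surface coords: (0, 0) = bottom-left, (1, 1) = top-right.
# Screen coords:  (0, 0) = top-left corner, in pixels.
def surface_to_screen(norm_x, norm_y, screen_w, screen_h):
    x = int(norm_x * screen_w)
    y = int((1.0 - norm_y) * screen_h)  # flip the y axis
    return x, y

print(surface_to_screen(0.5, 0.5, 1920, 1080))  # (960, 540)
```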

user-ae4005 04 June, 2020, 14:35:55

Pupil capture is running. I did add the plugin but I did not set up a surface...

user-ae4005 04 June, 2020, 14:36:05

That explains a lot. Will do it now!

user-ae4005 04 June, 2020, 14:36:08

Thanks again πŸ™‚

user-ae4005 04 June, 2020, 14:37:31

Hmm Pupil capture tells me that I cannot add a surface because it can't find markers in the image. What do I need to do to set up markers?

papr 04 June, 2020, 14:39:06

Checkout this very old video demoing the script https://www.youtube.com/watch?v=qHmfMxGST7A

Please also read the documentation on surface tracking https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking

user-ae4005 04 June, 2020, 14:41:35

Okay, I'll check it out! Thanks πŸ™‚

user-7daa32 05 June, 2020, 00:20:24

> Hi @user-7daa32 the demo recording for Pupil Player (found in the docs about Pupil Player: https://docs.pupil-labs.com/core/software/pupil-player/) shows an example of a multi-surface setup for surface tracking. Here's the direct link to the recording: https://drive.google.com/file/d/1nLbsrD0p5pEqQqa3V5J_lCmrGC1z4dsx/view?usp=sharing
@user-c5fb8b thank you, I got one done like this.

user-7daa32 05 June, 2020, 00:22:35

> @user-7daa32 At 0:03 you can see that the heatmap is not set up correctly. Click "edit surface" in the top right and adjust the handles to contain the complete display. Also, in the menu on the right, set a size (e.g. the resolution of your display) for the heatmap.
@papr thank you for this too

user-8eaeb3 05 June, 2020, 00:35:24

Hello, I'm trying to assemble a Pupil Core device following the documentation posted on the website and was wondering if anyone could provide clarity on the soldering for the eye-camera. I understand that the IR LEDs need to be soldered onto the PCB but the exact placement of LEDs is not very clear from the posted video. I was also wondering if there is any specific reasoning for using the Microsoft HD 6000 web camera for the eye camera. Would the same process of implementing IR image recording be possible using other web cameras?

wrp 05 June, 2020, 01:41:49

@user-8eaeb3 I see that you are assembling a Pupil DIY headset. It is possible to use other eye cameras; however, the eye cam mounts for the DIY headset are designed for the HD 6000. The eye camera mount design and frame design have not been updated for other cameras. The LEDs should be positioned so as to illuminate the eye region as uniformly as possible. Glints from the IR illuminators are not leveraged by Pupil Core software, so just try to get good illumination of the eye region. You can use other UVC-compliant cameras with Pupil Core software, but you will need to manually select the cameras in Pupil Capture.

wrp 05 June, 2020, 01:42:49

@user-8eaeb3 out of curiosity, what type of research/applications are you looking to do with Pupil DIY?

user-8eaeb3 05 June, 2020, 01:52:15

@wrp Thank you for your reply! Would choosing a different camera model and manually selecting the camera have any effect on the accuracy of the data? I am working on a project studying memory deficits after laser ablation in epileptic patients, and I was hoping to compare data from eye movements within an HMD VR system to eye movements when viewing stimulus images on a screen, using the eye tracking data from the Pupil DIY.

wrp 05 June, 2020, 01:59:29

@user-8eaeb3 Welcome! Regarding different eye cameras vs. accuracy: the first and most important part of accurate pupil detection is getting a good image of the eye(s). From a hardware standpoint there are too many variables to give an easy "yes" or "no" answer. Getting a good image of the eye depends on: position of the eye camera, lens distortion/quality, frame rate of the eye camera, resolution, sensor quality/noise, illumination, etc...

From a software standpoint, Pupil Core software should be able to support different UVC compatible sensors out of the box (or without too much additional work - @papr or @user-c5fb8b please jump in here when online if there is more that needs to be noted).

Pupil DIY is really intended for those that are prototyping and have the time to experiment both with hardware and software. We surely want to support those who are prototyping/learning/tinkering, but if you are time limited/want to ensure robust pupil detection, then you might want to consider getting Pupil Core.

user-8eaeb3 05 June, 2020, 02:11:19

@wrp Thank you! We will definitely consider getting a Pupil Core. We were hoping to work with some cameras we had on hand to gather some initial data and begin working with the Pupil Core software, but I do believe we will eventually purchase a Pupil Core for the project.

wrp 05 June, 2020, 02:27:53

@user-8eaeb3 Great! Looking forward to seeing what you are able to achieve with the DIY setup.

user-7d0b66 05 June, 2020, 18:27:08

@papr I've tried to send the (raw) recordings to data@pupil-labs.com but they're too large to be sent via e-mail. How can I share them then?

papr 05 June, 2020, 18:29:00

@user-7d0b66 we usually recommend using file sharing services like Google drive, Dropbox, WeTransfer etc

user-7d0b66 05 June, 2020, 19:04:21

@papr Do I just send you guys a shareable link via e-mail?

user-7d0b66 05 June, 2020, 19:04:34

so you can have access to it

papr 05 June, 2020, 19:07:15

@user-7d0b66 correct

user-499cde 06 June, 2020, 20:32:39

Hello @papr , Hope you are doing well. Is there any source where the weight of the pupil core eye camera is mentioned?

papr 06 June, 2020, 20:33:22

@user-499cde The Core headset weighs 22.75 g https://pupil-labs.com/products/core/tech-specs/

papr 06 June, 2020, 20:34:17

@user-499cde Do you need to know what a single eye camera weighs? I don't know if our website mentions that. Instead I have measured the eye camera of my personal headset which weighs 3.8g

user-499cde 06 June, 2020, 21:13:41

Oh yes, very well. Thank you very much

user-499cde 06 June, 2020, 21:14:05

I was looking for the weight of single eye cam

user-7d4a32 07 June, 2020, 09:44:26

Hello, I am using calibration markers for offline calibration. I am wondering whether it is possible to get the exact x,y coordinates of the markers (normalized or not, it doesn't matter) using Pupil Player?

papr 07 June, 2020, 09:46:07

@user-7d4a32 Do you mean if Player is able to detect the coordinates? Or do you mean if it is possible to read out the detection result?

user-7d4a32 07 June, 2020, 09:47:48

@papr Is it possible to know where the markers are with respect to x,y coordinates? So yes, if the Player is able to detect the coordinates but of the markers specifically

papr 07 June, 2020, 09:48:56

@user-7d4a32 Yes, in the offline calibration menu there is a button to detect the Reference Locations

user-7d4a32 07 June, 2020, 09:50:09

@papr Thank you! You just saved us from hours of googling.

papr 07 June, 2020, 09:51:53

@user-7d4a32 Have you seen our video tutorial on post-hoc pupil detection and calibration already? It explains the complete work flow πŸ™‚ https://www.youtube.com/watch?v=_Jnxi1OMMTc&list=PLi20Yl1k_57rlznaEfrXyqiF0sUtZMMLh

user-7d4a32 07 June, 2020, 10:03:58

@papr Thanks for the reference! We just checked it out. Out of curiosity, is it possible to compute gaze prediction accuracy after offline calibration? Meaning, to simulate clicking "T" in Pupil Player?

papr 07 June, 2020, 10:07:05

@user-7d4a32 Capture calculates the gaze accuracy after calibration (C) and validation (T). The difference is that the validation uses (or rather should use) other locations in the subject's field of view than the calibration, to test how accurately the calibration generalizes. In Player, you have to tell the validation in which time range it should be validating.

user-7d4a32 07 June, 2020, 10:17:42

@papr Thanks for everything!

user-331121 07 June, 2020, 17:02:49

Hi @papr, is there any way to export the video of the offline pupil detection 3D model (algorithm mode) directly from the Pupil Player software? Or an ability to play frame by frame? Currently we can play at 0.25x, but I wanted it to be even slower. Any suggestion would be great. Thanks

papr 07 June, 2020, 17:06:37

@user-331121 Did you know that you can pause the offline pupil detection from the menu? Unfortunately, the offline pupil detection does not have a frame-by-frame mode nor does the eye video exporter export the algorithm view. It can export the detection result visualization (red pupil ellipse + green eye model outline) though

user-331121 07 June, 2020, 17:21:15

I normally cannot pause on the frame I'm interested in, as the playback is fast. I wanted additional information beyond the 2D ellipse and 3D model. Anyway, thank you for your reply.

user-8f829f 08 June, 2020, 14:24:54

Hi, I shot a country driving video with Pupil Invisible; however, I find that I need to calibrate gaze for longer-distance viewing. I am a driving instructor, so accuracy about where the driver is looking is the key point of the video. Gaze at the centre and side mirrors is fine, but for country/highway driving it is important to show the driver looking further ahead (especially on corners), and the recorded video shows the gaze way up, as if I was looking at the sky. I tried to follow the YouTube video on post-hoc gaze mapping and validation, but I got stuck at the reference location step. I did not shoot a circular reference marker video before driving, so I tried manually editing reference locations, but the player runs forever when I click "Calculate all calibration and mapping" and I get a message in the status bar like "Not enough reference or pupil data available". The total video is about 40 mins. What is the best way to calibrate Pupil Invisible when a reasonable degree of accuracy is needed? Thanks for the help.

papr 08 June, 2020, 14:34:09

@user-8f829f Hi, the offline/post-hoc calibration in Pupil Player cannot be used for Pupil Invisible recordings as it requires pupillometry data which is not available for Pupil Invisible recordings.

user-8f829f 08 June, 2020, 14:36:41

Hmm okay, I see. Does sun glare affect the gaze accuracy? I had the sun straight up facing me during the video. Before recording, what practices could improve the overall video quality for my driving scene with Invisible if more gaze accuracy is required?

user-8f829f 08 June, 2020, 14:46:13

Just now I increased the Vis Circle, so it looks better on playback now; maybe it was a bit smaller before. The other question I have is that I put an anti-slip silicone nose-support patch on the glasses' nose bridge; does this cause much of an issue with Invisible's performance? Is my over-the-horizon gaze issue coming from this?

papr 08 June, 2020, 14:48:02

@user-8f829f Please be aware that this channel is dedicated for Pupil Core specific questions. I think my colleagues over at πŸ•Ά invisible are better suited to answer your questions. πŸ™‚

user-8f829f 08 June, 2020, 14:48:57

ah sorry about that πŸ˜…

papr 08 June, 2020, 14:49:11

Don't worry πŸ™‚

user-20b83c 09 June, 2020, 06:46:08

Hello :). I am using Core, and I am getting a problem with the frame rate. We have set the eye frame rate to 120 Hz, but we get the eye video with about a 30 Hz frame rate. Could you please tell me what the problem is? If you need anything, I will upload it. Thank you.

papr 09 June, 2020, 07:03:31

@user-20b83c hi, this is likely due to insufficient CPU resources. Which CPU does your computer use?

user-20b83c 09 June, 2020, 07:36:00

@papr we are using an Intel(R) Core(TM) i7 CPU [email removed] and 8 GB RAM

papr 09 June, 2020, 09:05:05

@user-20b83c 1.6 GHz is comparatively low. I fear that this is the cause of your low pupil detection frame rate.

user-20b83c 09 June, 2020, 12:41:49

@papr Then what is the minimum requirement for 120Hz pupil frame rate?

papr 09 June, 2020, 13:35:21

@user-20b83c It is difficult to tell what the bare minimum is, as we do not have the resources to test a bunch of different CPUs. But I can tell you that I am able to run Pupil at 200 Hz on my MacBook (early 2015) with a 2.7 GHz Dual-Core Intel Core i5.

papr 09 June, 2020, 13:36:46

Please be aware that the actual performance may vary depending on your system's current load, e.g. if you run a separate application.

user-c5fb8b 09 June, 2020, 13:39:06

@user-20b83c: also quick question: you are running with the headset connected to the computer via USB, correct? Not via some network streaming setup.

user-7daa32 09 June, 2020, 15:46:46

@user-20b83c It is difficult to tell what the bare minimum is as we do not have the resources to test a bunch of different CPUs. But I can tell you that I am able to run Pupil at 200 Hz on my MacBook (early 2015) with a 2.7 GHz Dual-Core Intel Core i5. @papr Hi, sorry, I am completely lost about the meaning of frame rate as related to the unit "hertz". I usually see the CPU graph running nonstop; is that what gives the frame rate in Hz? Is frame rate also related to the video range? I am using a Dell with an Intel Core i5, 8th Gen.

papr 09 June, 2020, 15:49:55

@user-7daa32 There is a second graph next to the CPU graph labeled FPS. FPS means frames per second. 30 FPS is equivalent to 30 Hz (hertz). Hertz basically also means "per second" but is used for frequencies in general.

papr 09 June, 2020, 16:01:02

is frame rate also related to the video range? Btw, I am unsure what you mean by video range.

user-7daa32 09 June, 2020, 16:02:46

@user-7daa32 There is a second graph next to the CPU graph labeled FPS. FPS means frames per second. 30 FPS is equivalent to 30 Hz (hertz). Hertz basically also means "per second" but is used for frequencies in general. @papr Ooh, thanks! I am familiar with Hz being "per second" from my physical chemistry class πŸ˜€. I just understood what FPS is and the factors that can affect it, e.g. the CPU. When I said video range, I was talking about video adjustment in Pupil Player (trimming).

user-7daa32 09 June, 2020, 16:20:13

Hello, I know there are resources on the Pupil Labs website for guidance. I want to ask if you can please outline a few points we should consider so that we can have confidence in our data. I have not started the actual research yet, just looking at objects on a screen. My primary goal is to be able to record people looking at different items and to develop a heatmap from their gaze pattern. My secondary goal is to be able to record and graph the pupil diameter information during the above process. I think surface tracking will be very important for pupillometry.

user-7d0b66 09 June, 2020, 22:36:58

Why are the png files of my heatmaps being generated like this?

Chat image

user-2be752 10 June, 2020, 04:11:15

Hi there! Has anybody dealt with annotation delays between PsychoPy and Pupil Labs? Thanks!

wrp 10 June, 2020, 04:12:51

@user-7d0b66 Please enter a Width and Height for each surface. This will determine the aspect ratio for each exported surface/png. See example screenshot of the surface tracker submenu.

Chat image

papr 10 June, 2020, 09:24:49

@user-2be752 There is some natural delay due to the network connection. We compensate for this delay by syncing clocks between the annotation creator (PsychoPy) and the receiver (Capture).

papr 10 June, 2020, 09:26:14

Therefore, the sending delay is only relevant if you are processing the annotations in realtime, not if you record them with Capture.

user-5b46cf 10 June, 2020, 14:19:12

Hi everyone, hoping somebody here might be able to help with a few questions: I'm looking to convert the 3D gaze outputs from Pupil Core into 2D pixel coordinates (for further processing), and understand I can do this with cv2.projectPoints(), but I'm getting what appear to be pretty unlikely outputs - hoping that by clarifying some things I can figure out where I'm going wrong.

  • First, am I correct in thinking that because the Pupil code models a 3D coordinate directly from 3D pupil coordinates, it doesn't produce a 2D coordinate at an earlier stage (that I could draw out instead of converting the 3D gaze position)?
  • When providing inputs to cv2.projectPoints, the rotation/translation matrices for each eye are located in the calibration information (eye_to_world matrices). The rotation matrix can be drawn from the first 3 values in the first 3 rows, and converted to a vector using cv2.Rodrigues(); the translation vector is made up of the final (4th) values in the first 3 rows. Assuming all of this is correct, how should I treat the matrices, given that we only have an average 3D gaze point, not one for each eye? Should I be averaging the vectors, or will applying the vectors individually allow me to calculate 2D pixel gaze coordinates for each eye?
  • I've found that omitting the distortion coefficients (setting them to 0) seems to result in more reasonable 2D gaze coordinates - is this related to the 3D camera space not including lens distortion? I wasn't sure whether to interpret that as lens distortion 'is' or 'is not' being controlled for?
  • In terms of the raw 3D outputs, am I correct in understanding that these are always relative to the center of the camera at the particular frame? There isn't an absolute tracking of distance?

Please excuse any stupid questions among the above - I can call myself an expert in neither coding nor computer vision. Any help would be appreciated.

Thanks, Sam.

papr 10 June, 2020, 14:21:07

@user-5b46cf Actually, the Pupil software projects the 3d gaze result back into normalised image coordinates (norm_pos). πŸ™‚ No need to do this yourself. The software uses the active camera intrinsics for that.
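For anyone who still wants pixel coordinates: norm_pos uses the Pupil convention documented in the terminology page (origin at the bottom left, (0.5, 0.5) at the image center), so converting to the usual top-left pixel convention is a small sketch like this (function name and frame size are illustrative):

```python
def norm_pos_to_pixels(norm_pos, frame_size):
    """Convert Pupil's normalized coordinates (origin bottom-left,
    (0.5, 0.5) = image center) to pixel coordinates in the usual
    image convention (origin top-left, y pointing down)."""
    width, height = frame_size
    x_norm, y_norm = norm_pos
    x_px = x_norm * width
    y_px = (1.0 - y_norm) * height  # flip the y-axis
    return x_px, y_px

# Example: the center of a 1280x720 world frame
print(norm_pos_to_pixels((0.5, 0.5), (1280, 720)))  # (640.0, 360.0)
```

Note that this ignores lens distortion; it simply rescales the already back-projected norm_pos values.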

user-7d0b66 10 June, 2020, 14:44:13

It seems like the heatmap is not responding to my gaze attention as it should; the colored spots are static and do not move according to where my visual attention is directed. Why is that?

Chat image

user-7d0b66 10 June, 2020, 14:44:27

Also, what does that big red triangle on the screen represent?

papr 10 June, 2020, 14:45:55

@user-7d0b66 Heatmaps in Player are static and aggregate all gaze that is available within the trim marks. The triangle shows the direction of the surface.

Chat image

user-c5fb8b 10 June, 2020, 14:50:12

@user-7d0b66 normally you want a static heatmap for your analysis, because a heatmap is essentially gaze data aggregated independently of the timestamps. It is also better suited for images in reports and similar. As @papr said, you can use the trim marks of the timeline to adjust the input data used for the calculation of the heatmaps.

user-5b46cf 10 June, 2020, 14:50:58

@papr Ah! That's great - so we get that at the pupil-level data. Am I correct in thinking that's in a coordinate system with the origin at the bottom left of the world view, with (0.5, 0.5) at the center?

user-c5fb8b 10 June, 2020, 14:51:54

@user-5b46cf please refer to the documentation in regards to the used coordinate systems :) https://docs.pupil-labs.com/core/terminology/#coordinate-system

user-c5fb8b 10 June, 2020, 14:52:32

But in short, yes this is correct in your case!

user-5b46cf 10 June, 2020, 14:53:46

@user-c5fb8b Thanks! Thanks for the help @papr

papr 10 June, 2020, 14:55:45

@user-5b46cf Please be aware of the difference between pupil and gaze data. They live in different camera coordinate systems

papr 10 June, 2020, 14:55:55

But both have a norm_pos field.

user-5b46cf 10 June, 2020, 15:16:51

@papr thanks for the reminder! So, for clarification and future use (it's not essential now), if we wanted to calculate a gaze position per eye, we would need to look at the pupil data and project this to the world camera ourselves? Sorry to be a hassle - I just want to make sure I understand what I'm doing before I mess up someone else's research!

papr 10 June, 2020, 15:19:56

@user-5b46cf You basically need to map the per-eye 3d pupil position into 3d gaze positions and then back-project them into the scene image plane

papr 10 June, 2020, 15:20:22

With our upcoming 2.0 release this will be easy to realise.

user-5b46cf 10 June, 2020, 15:21:48

@papr Great stuff! We were only intending to work with average gaze points on this occasion anyway, so i'll keep an eye out for that.

user-5b46cf 10 June, 2020, 15:22:00

Again, thanks for all the help!

user-2be752 10 June, 2020, 17:37:50

@papr thanks, I do not need annotations in real time; I have annotations sent through PsychoPy to mark image onsets and so on. You said there shouldn't be a lag because of the time sync - is this time sync something I should implement, or does it come automatically? Thanks!

papr 10 June, 2020, 17:47:49

No, you need to do this explicitly. The easiest way is by using the T command of Pupil Remote. Please check out our documentation on time sync for details.
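A minimal sketch of the T command over Pupil Remote, following the request/reply pattern used in the pupil-helpers examples (the helper names here are illustrative; Pupil Remote's default address tcp://127.0.0.1:50020 is assumed):

```python
def time_sync_command(local_time):
    """Build the Pupil Remote command string that sets Pupil Capture's
    clock to `local_time` (in seconds)."""
    return "T {}".format(local_time)


def send_time_sync(local_time, address="tcp://127.0.0.1:50020"):
    """Send the command to Pupil Remote. Requires pyzmq and a running
    Pupil Capture instance with the Pupil Remote plugin enabled."""
    import zmq  # imported lazily so the helper above stays dependency-free

    ctx = zmq.Context()
    socket = ctx.socket(zmq.REQ)
    socket.connect(address)
    socket.send_string(time_sync_command(local_time))
    return socket.recv_string()  # Capture acknowledges the new time base
```

For example, calling `send_time_sync(time.time())` once at experiment start aligns Capture's clock with the Unix epoch, so timestamps from both programs become directly comparable.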

user-2be752 10 June, 2020, 18:31:44

okay, I will do this from now on. Any ideas on how to fix this for already collected data? I would like not to lose this data πŸ€ͺ 😩

user-ae4005 11 June, 2020, 10:26:29

Hi there, I've been trying to add a surface in order to run the mouse_control.py script. Pupil Capture does not seem to recognize most of the markers and is very unstable, so I can't edit the surface or add markers either. I added a screenshot of my set-up for reference. Any ideas on what I'm doing wrong here?

Chat image

user-c5fb8b 11 June, 2020, 11:02:54

@here πŸ“£ Announcement πŸ“£ - We just released Pupil Core software v2.0. Download the apps here: https://github.com/pupil-labs/pupil/releases/tag/v2.0#user-content-downloads

With v2.0 and beyond, we will be focusing on making Pupil Core more core, making it easier to use and to extend for you and easier to maintain for us. There are a ton of exciting improvements and changes we made, please visit the release notes for more details and our product vision: https://github.com/pupil-labs/pupil/releases/tag/v2.0

We look forward to your feedback!

user-ae4005 11 June, 2020, 11:31:12

It's me again. I downloaded the new version and I was able to create a surface, edit it and add markers πŸ₯³ But I keep getting the same kind of error (also in other pupil_helpers scripts) and I'm not sure how to solve it. This is the line that causes the error: gaze_position = loads(msg, encoding="utf-8") and this is the error: TypeError: unpackb() got an unexpected keyword argument 'encoding'

Does anyone know what the problem is?

papr 11 June, 2020, 11:32:27

@user-ae4005 Which version of msgpack do you have installed? Please use 0.5.6. You can install it via pip install msgpack==0.5.6. (Please be aware that you might need to use pip3 depending on your setup.)

user-c5fb8b 11 June, 2020, 11:33:48

@user-ae4005 also since you said you downloaded the new version: are you running Pupil from bundle or from source?

user-ae4005 11 June, 2020, 11:42:09

@papr Thank you! I was using version 1.0 and have changed it now. It seems to be running, but the cursor is not accurately following my gaze yet; it might be a calibration issue though.

papr 11 June, 2020, 11:43:09

@user-ae4005 @user-c5fb8b will follow up with tips regarding surface tracking. Could you post another picture of the current state? The first image seems to be overexposed.

papr 11 June, 2020, 11:44:34

@user-2be752 You will potentially have to realign the recorded timestamps yourself. You can do that by creating a custom annotation (e.g. SYNC) in Player (it will always use the recorded Pupil time) at a point in time at which you know you should be seeing one of the recorded annotations.

So, for example, you have an experiment where you show a traffic light and your program sends annotations when the traffic light changes its state, e.g. red->green. Then you add the SYNC annotation at the moment in time where the traffic light changes from red to green in the video. This way you will have two annotations for the same event, but with different timestamps. Calculate the difference between these timestamps and apply it to all other recorded annotations. This way you can correct all annotations.
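The offset correction described above boils down to a single subtraction and shift. A sketch with made-up numbers (function and argument names are illustrative):

```python
def correct_annotation_timestamps(timestamps, sent_ts, sync_ts):
    """Shift annotation timestamps recorded with an unsynced clock.

    sent_ts: timestamp of the reference annotation as sent by the
             experiment script (e.g. the red->green change).
    sync_ts: timestamp of the manually placed SYNC annotation for the
             same event, in recorded Pupil time.
    """
    offset = sync_ts - sent_ts
    return [ts + offset for ts in timestamps]


# Example: the experiment script's clock is 2.5 s behind Pupil time
print(correct_annotation_timestamps([10.0, 12.0], sent_ts=10.0, sync_ts=12.5))
# [12.5, 14.5]
```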

user-ae4005 11 June, 2020, 11:45:04

@user-c5fb8b I'm not sure what you mean, sorry, still very new to all of this. I'm just using Pupil Capture and running the mouse_control code from Python.

user-ae4005 11 June, 2020, 11:45:20

@papr Great, thanks again!

papr 11 June, 2020, 11:45:57

@user-ae4005 Don't worry about his question. He was only concerned that your issue was due to the new release which it is not. πŸ™‚

user-ae4005 11 June, 2020, 11:47:41

@papr Okay πŸ™‚

user-ae4005 11 June, 2020, 11:48:44

This is what it looks like now

Chat image

user-c5fb8b 11 June, 2020, 11:54:38

@user-ae4005 it looks like your markers could be a bit small. Does the detection get more stable when you are closer to the markers?

The optimal marker size depends on the distance to the headset; you should try out different sizes and find one that works stably at the range you are using in your experiment.

Additionally: when you have Capture open and record the screen, you will see the same marker multiple times, which can interfere with marker detection. Optimally you would minimize Capture or move it to a second monitor while recording your target monitor.

And as papr said, the image seems a bit overexposed; please try adjusting the exposure settings in the Video Source menu, either by switching to auto exposure or by adjusting the exposure time.

user-ae4005 11 June, 2020, 12:15:31

@user-c5fb8b Yes, the detection seems more stable when I'm close to the markers so I'll try printing them out bigger. Thank you for all the other pointers, I'll adjust all of it and hope it'll work better then πŸ™‚

user-ae4005 11 June, 2020, 12:29:02

@papr @user-c5fb8b So it's all up and running now, thanks for the support! πŸ™‚ The mouse control still seems to be a bit off, though. It has a slight offset to the right, although it does fixate the start button perfectly when I look at it (which is on the bottom left). Does that have to do with my calibration?

papr 11 June, 2020, 12:31:08

@user-ae4005 Yes, there will always be some degree of error. Check out our best practices for a few tips in this regard https://docs.pupil-labs.com/core/best-practices/

user-ae4005 11 June, 2020, 12:32:09

@papr Will do, thanks again!

user-7daa32 11 June, 2020, 14:38:37

I feel like keeping the old version of the Pupil Core software. Can I have it together with the new one on my system?

papr 11 June, 2020, 14:39:16

@user-7daa32 Yes, you can πŸ™‚

user-7daa32 11 June, 2020, 14:46:41

After downloading it for Windows, here is what I got

Chat image

papr 11 June, 2020, 14:47:14

To open the RAR-archive on Windows, you will need a decompression software, e.g. WinRAR. https://www.win-rar.com/predownload.html?spV=true&subD=true&f=winrar-x64-590.exe

user-7daa32 11 June, 2020, 14:55:12

Thanks. Because I want to keep both the old and new files, what are the best options to select here?

Chat image

papr 11 June, 2020, 14:55:51

@user-7daa32 Are these WinRAR options?

papr 11 June, 2020, 14:57:45

After installing WinRAR, you just need to right click the downloaded Pupil release (ending on .rar), and unarchive/extract the file it includes (.msi). Double click the .msi file and follow the installer instructions.

papr 11 June, 2020, 14:58:21

Afterward you can delete both the .msi and the .rar file.

user-c5fb8b 11 June, 2020, 15:00:12

@user-7daa32 also from the release notes: https://github.com/pupil-labs/pupil/releases/tag/v2.0

Improved Installation Workflow on Windows - #1853

We have wrapped Pupil in a Windows Installer package (MSI) in order to simplify the Windows workflow.

By default, all 3 apps (Capture, Player, and Service) will be installed in C:\Program Files (x86)\Pupil-labs\Pupil v<version>. All apps will also get a start-menu entry, making it much easier to open Pupil. Installed versions of Pupil can be removed in the Windows Uninstall Settings.

New versions of Pupil will be installed alongside older versions and you can choose which to start. Switching versions will still overwrite your user settings as previously.

user-7daa32 11 June, 2020, 15:13:44

Thank you. I have all the apps installed automatically to the desktop. This means that I don't need to open a file to run the Pupil apps any more, just click the already installed ones.

user-c5fb8b 11 June, 2020, 15:15:18

correct! πŸ™‚

user-7daa32 11 June, 2020, 15:17:14

However, I don't know the importance of Pupil Service. Why is it not opening?

user-c5fb8b 11 June, 2020, 15:18:02

Do you need to use Pupil Service for anything?

user-7daa32 11 June, 2020, 15:19:33

I don't think so. Thanks

user-430fc1 12 June, 2020, 20:58:10

Hi all, is there any documentation on notifications and what can be done with them? Also, is it possible to set the "Auto Exposure" mode of the worldcam to "Manual" with a notification? If so, what subject / name / args would be required? Thanks in advance!

user-c5fb8b 15 June, 2020, 07:44:39

Hi @user-430fc1 I'm afraid we don't have documentation for the notifications. ~~There is currently no notification for changing UVC properties (which the exposure mode belongs to).~~ EDIT: see below

The general idea is that every plugin can send notifications and respond to notifications. You can also send notifications via the Network API, see this example: https://github.com/pupil-labs/pupil-helpers/blob/master/python/pupil_remote_control.py Every notification has a subject and can contain a payload. The payload data has to be serializable, so not every Python object will work. Very often we don't send any data and treat the notification as a single "event" that happened. Sending notifications can be done from anywhere within a plugin with:

self.notify_all({
  "subject": "your-notification-subject",
  # add more key-value pairs here for custom payload
})

This will broadcast the notification to all other plugins.

You can react to notifications by implementing

def on_notify(self, notification):
  # ...

in your custom plugin.

If you want to find out more, I'd recommend you open the codebase and search for .notify_all( and def on_notify( to find out which plugins send or receive notifications.

user-c5fb8b 15 June, 2020, 08:18:51

@user-430fc1 I have to correct myself, there is actually a way of setting the exposure mode via notification: You can start any plugin remotely with custom arguments via the notification:

{
  "subject": "start_plugin",
  "name": "<plugin-name>",
  "args": <dict-with-arguments>,
}

The UVC_Source plugin is the plugin handling the UVC connection and it accepts uvc_control as argument in __init__ for starting with custom UVC settings. With that you should be able to basically restart the UVC_Source with modified UVC settings. You will need to do some digging in the code to figure out the necessary startup args however.
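A hedged sketch of sending such a start_plugin notification over the Network API, following the serialization pattern in pupil-helpers' pupil_remote_control.py (helper names are illustrative, and the actual startup args for UVC_Source still need to be dug out of the codebase as described above):

```python
def start_plugin_notification(plugin_name, args=None):
    """Build the notification dict that asks Pupil to start a plugin
    with custom arguments."""
    return {
        "subject": "start_plugin",
        "name": plugin_name,
        "args": args or {},
    }


def send_notification(notification, address="tcp://127.0.0.1:50020"):
    """Send a notification through Pupil Remote. Requires pyzmq,
    msgpack, and a running Pupil Capture instance."""
    import msgpack  # imported lazily so the builder above stays dependency-free
    import zmq

    ctx = zmq.Context()
    socket = ctx.socket(zmq.REQ)
    socket.connect(address)
    # Notifications are sent as a two-frame message: topic, then payload
    topic = "notify." + notification["subject"]
    payload = msgpack.dumps(notification, use_bin_type=True)
    socket.send_string(topic, flags=zmq.SNDMORE)
    socket.send(payload)
    return socket.recv_string()
```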

user-151c9e 16 June, 2020, 03:33:41

Hello, does pupil work with a computer integrated webcam?

user-151c9e 16 June, 2020, 03:34:27

I am looking for eye tracker software that can work with a webcam

user-c5fb8b 16 June, 2020, 07:11:14

Hi @user-151c9e I assume you are looking for a remote eye-tracker, where the eye tracker is positioned at a computer? Unfortunately, Pupil only offers head-mounted eye-tracking, i.e. the eye-tracker has to be worn on the subject's head.

user-430fc1 16 June, 2020, 07:28:47

@user-c5fb8b thanks, that's very helpful. I'll get digging πŸ™‚ I have one more question, about latency. I have developed a routine for measuring the pupillary light reflex with Pupil Core. It uses the world cam to detect the onset of a light and sends an annotation with the associated Pupil timestamp to assist with calculating time-critical measures like latency to constrict, time-to-peak constriction, etc. It seems to work well, as the annotation is stamped on the first frame in Pupil Player where the light becomes visible during playback. I'm just wondering what sort of latency this might have relative to the actual event? Given that the Pupil Core tech specs quote 8 ms camera latency and >3 ms (depending on hardware) processing latency, and that I am using the world cam at 120 fps, am I right in thinking it would be somewhere in the region of 15 plus-or-minus 4 ms?

user-ab6a19 16 June, 2020, 08:04:41

Hello, I'm a researcher at NTU and we are interested in using this product in a wearable robotic system. I noticed that there is a configuration option to add a USB C mount. Is there a list of compatible cameras? Also, has anyone added a depth sensor to this setup? Thanks.

user-20b83c 16 June, 2020, 09:18:17

hello, could you please tell me the power of the IR light source for the eye camera? I want to add an IR light source of the same power for our experiments. thank you πŸ˜„

user-6bd380 16 June, 2020, 10:02:47

Hello, my fixations.csv file is coming out blank. Can anyone help me with why that is?

user-6bd380 16 June, 2020, 10:03:42

Thank you in advance

Chat image

user-6bd380 16 June, 2020, 10:03:56

Chat image

user-c5fb8b 16 June, 2020, 10:05:52

Hi @user-6bd380 did you calibrate before/while recording? The fixation detector works on calibrated gaze only.

user-c5fb8b 16 June, 2020, 11:26:23

Hi @user-ab6a19 Pupil Capture supports third-party USB cameras that fulfill the following criteria: 1. UVC compatible: http://www.cajunbot.com/wiki/images/8/85/USB_Video_Class_1.1.pdf (Chapters below refer to this document) 2. Support Video Interface Class Code 0x0E CC_VIDEO (see A.1) 3. Support Video Subclass Code 0x02 SC_VIDEOSTREAMING (see A.2) 4. Support for the UVC_VS_FRAME_MJPEG (0x07) video streaming interface descriptor subtype (A.6) 5. Support UVC_FRAME_FORMAT_COMPRESSED frame format

Regarding depth sensors: Previously, we supported multiple versions of depth cameras from the Intel RealSense family. Unfortunately, maintaining support for the required third-party libraries was not reasonable for us. Currently we do not officially support these cameras anymore. You might be able to integrate any depth sensor into Pupil by writing a custom plugin, but this will require some serious effort.

user-c5fb8b 16 June, 2020, 11:43:04

Hi @user-20b83c the IR light source we are using is the SFH 4050, you should be able to find the specs that you need in the official datasheet:

SFH_4050_EN.pdf

user-20b83c 16 June, 2020, 15:02:23

@user-c5fb8b Thank you so much!

user-6bd380 16 June, 2020, 23:33:32

@user-c5fb8b yes I did the calibration before start recording

user-292135 17 June, 2020, 00:19:39

Hi, I am new to core. I am looking for a way to record external mic audio via stereo mini plug of android phone with Pupil mobile. I cannot find a way to alter mic from built-in to external mic. Is there any way to do it? The only description of audio source on Pupil mobile I found is this. https://discordapp.com/channels/285728493612957698/285728493612957698/632110119588593664

user-6bd380 17 June, 2020, 06:47:39

@user-c5fb8b thank you for your concern about my problem; I managed to resolve it, and now it's fine.

user-051673 17 June, 2020, 09:23:55

@user-4fb664 Hi Feisal - just wondering how you went getting the Pupil Labs software running on Fedora? If you got it installed and working, what did you do? Any [email removed]

user-c5fb8b 17 June, 2020, 10:33:02

Hi @user-292135 Pupil Mobile will try to use your default microphone. Did you try configuring the default microphone in android? Which kind of external microphone are you using? Bluetooth or via cable?

user-292135 17 June, 2020, 10:44:30

Hi @user-c5fb8b, thanks for your reply. I cannot find a default microphone setting on my phone (Android 8 on a OnePlus 6, Pupil Mobile bundle). The Recorder app detects the external mic automatically, but Pupil Mobile does not. I am using white iPhone earphones with a mic for testing.

user-292135 17 June, 2020, 10:44:59

App version is 1.2.3

papr 17 June, 2020, 10:57:14

@user-292135 I had a look at the source code, and if I see this correctly, the app should detect and use wired headsets automatically (as they become the default microphone when connected to the phone). Please make sure to connect the headset before starting the app. I do not know if the app is able to handle mic changes while it is running/recording. You can stop the app in the settings.

user-292135 17 June, 2020, 11:24:21

I set up the headset, stopped the app from the settings, and retried, but it failed. Rebooting also failed. Any other suggestions? The Recorder app correctly captures the headset. Thanks

user-292135 17 June, 2020, 11:26:54

I am monitoring the audio input level by tapping an audio input and watching the change in width of the circle. Is this OK?

papr 17 June, 2020, 12:17:30

@user-292135 and tapping the built-in mic produces bigger circles than tapping on the external mic?

user-292135 17 June, 2020, 21:12:16

Yes.

user-6bd380 17 June, 2020, 23:52:24

hello,

user-6bd380 17 June, 2020, 23:55:58

I can't find the option for 'edit surface'. I am attaching an image here. Actually, I can't see any marker or line to adjust for defining my AOI. Is it because I didn't use the AprilTag markers on my screen? One more thing: I am doing it offline in Pupil Player.

Chat image

user-6bd380 17 June, 2020, 23:58:26

Any help regarding this would be very helpful.

user-c5fb8b 18 June, 2020, 06:51:41

@user-6bd380 Yes, you will have to place markers in your environment. Every surface needs to have multiple markers assigned. Please have a look at the example recording we provide in the docs for Pupil Player, it shows a setup with multiple surfaces defined. Although the example does not feature a screen-based setup, the idea is the same: you would place multiple markers around the screen. Example recording: https://drive.google.com/file/d/1nLbsrD0p5pEqQqa3V5J_lCmrGC1z4dsx/view?usp=sharing

user-03a2fe 18 June, 2020, 09:05:05

@user-c429de what Pupil Core hardware are you using? I assume that you are using the most recent version of Pupil Capture, correct? Also after doing sudo usermod -a -G plugdev $USER you will need to log out/log in for the changes to be applied IIRC. @wrp Thanks a lot! I had the same problem and this worked like a charm (Ubuntu 18, new V2.0 release, bundled version)

user-6bd380 18 June, 2020, 12:43:41

@user-c5fb8b Thank you for your reply. Since I have already made this mistake and collected data from many participants, my last hope/question is: is there any way I can still define an AOI? Otherwise all my data will go in the garbage and I will have to begin again from zero.

user-6bd380 18 June, 2020, 12:54:10

If there is any possible way, kindly suggest it. These data are very precious.

user-c5fb8b 18 June, 2020, 13:25:29

@user-6bd380 I'm afraid Pupil Player does currently not offer any surface analysis tools without the use of markers. Can you explain which metrics you are interested in? Maybe we can suggest alternative approaches for you.

user-6bd380 18 June, 2020, 15:04:54

@user-c5fb8b I didn't get in what context you are asking about metrics. What I understood (in terms of analysis): I am interested in quantitative metrics. Any possible approach would be helpful. I will give my best effort to get it done.

user-c5fb8b 18 June, 2020, 15:07:51

@user-6bd380 I meant to ask what analyses you had planned with the surface tracker.

user-6bd380 18 June, 2020, 15:15:55

@user-c5fb8b My primary aim/interest is calculating the fixation duration on AOIs. My secondary aim is to analyze saccades to find the sequence of eye movements on the surface.

user-14d189 19 June, 2020, 01:00:44

Hi all, I have a question about the accuracy visualization. After a calibration, the accuracy, precision, and number of data samples are presented on the monitor, and I would guess they are recorded in the log file of the calibration recording. Is that data somehow accessible later, after several other recordings with the same calibration? And if so, how could I easily access it?

user-c5fb8b 19 June, 2020, 06:18:46

@user-6bd380 can you give an example for the AOI you are interested in?

user-6bd380 19 June, 2020, 06:22:17

@user-c5fb8b I am presenting participants with images of grieving persons, either individually or in groups. I want to estimate the fixation duration only on the face. Within the face, I want to divide it into two parts: 1) the eyes and 2) the lips area.

user-c5fb8b 19 June, 2020, 06:24:24

@user-6bd380 ok, one more question: are you using a Pupil Core or Pupil Invisible headset?

user-6bd380 19 June, 2020, 06:24:24

In the second part of my analysis I want to estimate the fixation proportion for the human body versus the background, i.e. whether the number and duration of fixations is higher on the human body or on the background information (considering it as contextual information).

user-6bd380 19 June, 2020, 06:25:26

@user-c5fb8b Pupil core

user-c5fb8b 19 June, 2020, 06:32:01

@user-6bd380 I am afraid what you are trying to do is beyond Pupil's capabilities, even when using apriltag markers. If you want to do this analysis automatically, you would need some sort of face detection algorithm (or human pose detection for the second part). We do not offer any tools for this. With Pupil's surface tracker you can only track rectangular, planar surfaces. I will speak to our research team and ask if they have any recommendations for you.

user-2be752 19 June, 2020, 06:34:35

hi @user-6bd380 just chiming in that I've done this kind of analysis on the data I have collected from Pupil Labs. What I've done is use one of the many TensorFlow object-detection models (which can detect faces, eyes, mouths, etc.) on the world videos, and then you can map the fixations output by Pupil Player and see if they fall within the bounding boxes output by the object-detection model. I have found it works pretty nicely. I hope it helps πŸ™‚
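The mapping step described above reduces to a point-in-box test once fixations are in pixel coordinates. A minimal sketch (all names, boxes, and fixation values are made up for illustration; a real detector would supply the boxes per frame):

```python
def fixations_on_region(fixations, box):
    """Return the fixations whose (x, y) position falls inside a
    detector bounding box (left, top, right, bottom) in pixels.

    `fixations` is a list of (x, y, duration) tuples, e.g. derived from
    Pupil Player's fixations.csv after converting norm_pos to pixels.
    """
    left, top, right, bottom = box
    return [
        f for f in fixations
        if left <= f[0] <= right and top <= f[1] <= bottom
    ]


# Hypothetical face box and two fixations: only the first lands on the face
face_box = (100, 50, 300, 250)
fixations = [(150, 120, 0.4), (500, 400, 0.2)]
print(fixations_on_region(fixations, face_box))  # [(150, 120, 0.4)]
```

Summing the duration column of the returned fixations then gives the per-AOI fixation duration.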

user-6bd380 19 June, 2020, 06:42:38

@user-2be752 thank you so much for your suggestion. I am not good at writing codes but definitely I will try.

user-e70d87 19 June, 2020, 19:33:27

Hey πŸ˜„ Does anyone have experience running the Pupil software in Ubuntu 20?

I know Ubuntu 18 is recommended, but I am trying to figure out if it is worth rolling back

papr 19 June, 2020, 19:33:55

@user-e70d87 The bundle should work fine on Ubuntu 20

user-3ede08 19 June, 2020, 19:37:31

Hey! Why do the timestamp values start with high numbers (1337471.275137, 1337471.283206, ...)? What does the integer part of the number (1337471) stand for?

papr 19 June, 2020, 19:38:51

@user-3ede08 Read more about timing here https://docs.pupil-labs.com/core/terminology/#timing

user-e70d87 19 June, 2020, 19:40:38

It's epoch time, which counts the number of seconds since 1/1/1970 UTC.

It might seem silly, but it's my favorite time format because there is no need to consider time zones and you don't need to parse a string to calculate a duration (as you would with an HH:MM:SS.00 format)

papr 19 June, 2020, 19:41:06

@user-e70d87 Actually, it is not by default synced to the unix epoch. πŸ™‚ You are right about the number of seconds being the unit though.

user-e70d87 19 June, 2020, 19:42:28

Oh, right! Y'all use Boot time by default, right? i.e. time.time()?

(boot time is actually my LEAST favorite time format, because it is arbitrary and impossible to relate to other timecodes if you didn't log the boot time in another format when you made the recording πŸ˜› )

papr 19 June, 2020, 19:43:17

time.time() returns seconds since unix epoch

papr 19 June, 2020, 19:45:02

Pupil time does not have a specified start.

user-3ede08 19 June, 2020, 19:46:38

The current Unix timestamp is about 1592595772 seconds since Jan 01 1970 (UTC), which is a little bit different from 1337471.275137 πŸ€”

papr 19 June, 2020, 19:46:59

Please see our messages above πŸ™‚

user-3ede08 19 June, 2020, 19:49:15

ok, thanks.

user-e70d87 19 June, 2020, 19:56:19

@papr Respectfully, that's impossible. A timestamp will always have a Zero time. That Zero might be based on the time the recording started (record time), the time since the computer rebooted (boot time), time since 1/1/1970 in the UTC timezone (epoch time), or time since 1/1/1970 in the local time zone (this one shouldn't exist, but I've seen it).

I think you mean that Pupil time is based on record time, which is the most intuitive, but also problematic because if I am recording two things on the same computer, it is impossible to synchronize the time streams. If both systems are recording in Unix epoch time (based on the same system clock), synchronization is trivial. If one is using epoch time and the other is using record time, synchronization is impossible if you did not log the moment that the recording started in the epoch time format.

papr 19 June, 2020, 19:59:09

@user-e70d87 Sorry for the misunderstanding. Of course there is a zero time. I just wanted to say that the epoch (zero time) is not defined by a specific event, i.e. it is possible that it is seconds since boot but it is not guaranteed.

papr 19 June, 2020, 20:00:34

And no, it is definitively not recording time unless you change the time base to 0 before starting a recording. Pupil Capture will not change its time base unless you use Time Sync or set it via a plugin or the network API.

user-e70d87 19 June, 2020, 20:03:19

I think I did verify that it was Boot time a while back.

I also recall a conversation we had years back that ended in you adding a record of the time that the recording started using the system clock (and floating point precision for the seconds). I checked a while back and couldn't find that number, but I'm hoping y'all moved it to a different log file (it's been a while since I've dug deep into raw data. I'm still working to publish some recordings I gathered in Jan 2018 πŸ˜› )

papr 19 June, 2020, 20:05:18

It might be the case for a specific OS, but the monotonic clocks may differ by OS. The info file saved in every recording contains the start time in both the unix epoch and the pupil epoch. You can use it to calculate the difference between the two clocks and therefore sync the recording to the unix epoch after the fact.
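
To make that concrete, here is a sketch of the conversion. Note that the field names (start_time_system_s for unix epoch, start_time_synced_s for pupil epoch) are assumptions based on the v2.x info.player.json recording format, so verify them against your own recording:

```python
import json

def pupil_to_unix_offset(info_path):
    """Compute the pupil-clock -> unix-clock offset from a recording's info file.

    Assumes the v2.x recording format with 'start_time_system_s' (unix epoch)
    and 'start_time_synced_s' (pupil epoch) fields.
    """
    with open(info_path) as f:
        info = json.load(f)
    return info["start_time_system_s"] - info["start_time_synced_s"]

def to_unix(pupil_ts, offset):
    """Shift a pupil timestamp onto the unix epoch."""
    return pupil_ts + offset
```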

user-e70d87 19 June, 2020, 20:05:46

That's perfect, thanks!

user-370594 19 June, 2020, 20:42:47

Hey, does the newest version of the software (for Windows 10) have the audio recording function? In the General Settings I use Audio mode "sound only", but no audio file is produced. As I remember, I saw the plugin "Audio Capture" at some point (maybe in a previous version) but don't see it now.

papr 19 June, 2020, 21:07:18

@user-370594 The feature has been discontinued. You can find more information about it in our release notes https://github.com/pupil-labs/pupil/releases/tag/v2.0

user-e70d87 19 June, 2020, 21:31:40

Does anyone have advice on how to get Pupil Capture to actually see the tracker when you plug it in? It always seems to say "Camera not found" at first, and then I can usually get it to work after unplugging/replugging, restarting Pupil (with or without the tracker already plugged in), etc.

I can't find any reliable way to make it connect to the tracker every time. Do you have advice here? Is it preferred to start Pupil with the tracker plugged in, or start Pupil and then plug in the tracker? Do you have any advice on how to troubleshoot this issue?

This has been an issue with:
- Multiple trackers
- Multiple computers
- Multiple USB cables
- Multiple USB ports
- Every OS (Mac/Windows/Linux)

[Edit - I have just discovered that the 'lsusb' command in Linux is helpful for figuring out when my computer has seen the tracker, but that doesn't explain why it doesn't seem to connect every time I plug it in ...]

user-07222c 22 June, 2020, 03:07:33

Hi, I'm just starting out with getting a Pupil Core to run in our lab but am falling at the first hurdle: the Pupil Capture App simply won't open on my Mac. (Latest version of all three apps. MacBook Pro, macOS Mojave 10.14.6, 16 GB RAM). Pupil Player and Pupil Service both launch OK. I can see that Pupil Capture launches and that it creates three processes, but then Activity Monitor shows that one of them has become non-responsive, and no interface ever appears. Here is the console log for Pupil Capture:

2020-06-22 14:51:12,826 - MainProcess - [INFO] os_utils: Disabled idle sleep.
2020-06-22 14:51:13,366 - world - [INFO] numexpr.utils: NumExpr defaulting to 4 threads.
2020-06-22 14:51:13,391 - world - [INFO] launchables.world: Application Version: 2.0.175
2020-06-22 14:51:13,391 - world - [INFO] launchables.world: System Info: User: michael, Platform: Darwin, Machine: Persepolis-3.local, Release: 18.7.0, Version: Darwin Kernel Version 18.7.0: Tue Aug 20 16:57:14 PDT 2019; root:xnu-4903.271.2~2/RELEASE_X86_64
2020-06-22 14:51:13,391 - world - [DEBUG] launchables.world: Debug flag: False
2020-06-22 14:51:14,041 - world - [DEBUG] video_capture.ndsi_backend: Suppressing pyre debug logs (except zbeacon)
2020-06-22 14:51:14,089 - world - [DEBUG] remote_recorder: Suppressing pyre debug logs (except zbeacon)
2020-06-22 14:51:14,102 - world - [DEBUG] pupil_apriltags: Testing possible hit: /Applications/Pupil Capture.app/Contents/MacOS/pupil_apriltags/lib/libapriltag.3.dylib...
2020-06-22 14:51:14,103 - world - [DEBUG] pupil_apriltags: Found working clib at /Applications/Pupil Capture.app/Contents/MacOS/pupil_apriltags/lib/libapriltag.3.dylib
2020-06-22 14:51:14,143 - world - [DEBUG] file_methods: Session settings file '/Users/michael/pupil_capture_settings/user_settings_world' not found. Will make new one on exit.
2020-06-22 14:51:14,143 - world - [INFO] launchables.world: Session setting are from a different version of this app. I will not use those.

user-07222c 22 June, 2020, 03:30:41

So I just tried running the app from the command line, and some more specific information was shown in the Terminal:

Exception in thread Thread-2:
Traceback (most recent call last):
  File "PyInstaller/loader/pyiboot01_bootstrap.py", line 151, in __init__
  File "ctypes/__init__.py", line 348, in __init__
OSError: dlopen(libSystem.dylib, 6): image not found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "threading.py", line 916, in _bootstrap_inner
  File "threading.py", line 864, in run
  File "pyre/zactor.py", line 59, in run
  File "pyre/zbeacon.py", line 66, in __init__
  File "pyre/zbeacon.py", line 303, in run
  File "pyre/zbeacon.py", line 218, in handle_pipe
  File "pyre/zbeacon.py", line 201, in configure
  File "pyre/zbeacon.py", line 73, in prepare_udp
  File "pyre/zbeacon.py", line 137, in _prepare_socket
  File "pyre/zhelper.py", line 232, in get_ifaddrs
  File "PyInstaller/loader/pyiboot01_bootstrap.py", line 153, in __init__
PyInstallerImportError: Failed to load dynlib/dll 'libSystem.dylib'. Most probably this dynlib/dll was not found when the application was frozen.

My take on that is that the problem lies with the packaging of the application for distribution but I'd certainly appreciate any guidance or suggestions.

papr 22 June, 2020, 07:07:03

@user-07222c I will have a look at it today

user-3ede08 22 June, 2020, 08:42:34

Hey @papr I sent an email to info@pupil-labs.com on the week-end. Please have a look at it. It is about .pldata

papr 22 June, 2020, 08:43:24

@user-3ede08 We have received your email and will follow up via email with an example on how to read the files.

user-3ede08 22 June, 2020, 08:43:52

ohh, thanks a lot.

papr 22 June, 2020, 08:58:24

@user-07222c Let us use your Github issue for any communication in regards to this issue: https://github.com/pupil-labs/pupil/issues/1919

user-430fc1 22 June, 2020, 09:25:35

@user-c5fb8b thanks, that’s very helpful. I’ll get digging πŸ™‚ I have one more question... about latency. I have developed a routine for measuring the pupil’s light reflex with Pupil Core. It uses the worldcam to detect the onset of a light and sends an annotation with the associated pupil time stamp to assist with calculating time-critical measures like latency to constrict, time-to-peak constriction, etc. It seems to work well, as the annotation is stamped on the first frame in Pupil Player where the light becomes visible during playback. I’m just wondering what sort of latency this might have from the actual event? Given that the Pupil Core tech specs quote 8 ms camera latency and >3ms (depending on hardware) processing latency, and that I am using the world cam at 120 fps, am I right in thinking it would be somewhere in the region of 15 plus-or-minus 4 ms?

Hello again, I was just wondering if you have had any thoughts on this issue?

user-2143a5 22 June, 2020, 18:34:39

Hi Pupil team - thank you so much for maintaining these chats and being available to answer our questions. I am processing data captured using Pupil Core with audio recording and I have a question about the audio timestamps. For a 23 second recording, we have an audio file with 1,030,144 samples, an audio_timestamps.npy file with 1006 values, and a gaze_timestamps.npy file with 1123 values. I am trying to synchronize the audio with the tracking data, and am unsure how to correlate these three files... Could you point me in the right direction for any related documentation? My biggest question is: How do we use the audio_timestamps values to synchronize audio samples with gaze timestamps? Thank you so much for your help!

papr 22 June, 2020, 20:17:11

@user-2143a5 The audio file should have as many audio frames as timestamps. In your case, each frame holds 1024 samples. 1024 * 1006 = 1,030,144. The timestamp corresponds to the start of the audio frame, i.e. the first sample of the frame. Audio and gaze timestamps are generated from the same clock. Therefore, you can match audio and gaze by pairing gaze points to their closest audio frames.
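
A sketch of that pairing with numpy (assumes both timestamp arrays are sorted ascending, as they are in a recording):

```python
import numpy as np

def nearest_audio_frame(gaze_ts, audio_ts):
    """For each gaze timestamp, return the index of the closest audio frame.

    Both arrays must be sorted in ascending order.
    """
    idx = np.searchsorted(audio_ts, gaze_ts)
    idx = np.clip(idx, 1, len(audio_ts) - 1)
    # step back one index where the left neighbour is actually closer
    left, right = audio_ts[idx - 1], audio_ts[idx]
    idx -= (gaze_ts - left) < (right - gaze_ts)
    return idx
```

Each returned index i then covers samples i*1024 through i*1024 + 1023 of the decoded audio, per the frame size mentioned above.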

user-2143a5 22 June, 2020, 20:24:43

Thank you!!! I also noticed a difference between the first value in the gaze and audio timestamp files (approx. 1.505) - is there any special consideration for this, or can I simply add that fixed offset to the timestamps?

papr 22 June, 2020, 20:26:49

@user-2143a5 You should take the timestamp as they are. If there is a difference of 1.5 seconds it means that one sensor started recording 1.5 seconds earlier than the other

user-2143a5 22 June, 2020, 20:27:37

Excellent - thank you so much for your help!

user-1af2b3 23 June, 2020, 02:46:03

hey guys I just started using pupil labs recently, and I had a question about the capture software

user-1af2b3 23 June, 2020, 02:46:47

when the pupil camera displays come up, by default the left eye is right side up while the right eye is upside down. is that supposed to be the case, or is my hardware set up incorrectly? I am having some issues with accuracy and I wanted to figure out if that was the reason

user-1af2b3 23 June, 2020, 02:56:31

^ that's actually me who posted that question lol I was just about to say here that I think the solution posted fixes it. I didn't realize they updated it again so recently

user-1af2b3 23 June, 2020, 02:56:44

thank you Michael

user-07222c 23 June, 2020, 03:07:08

@user-1af2b3 Nice, glad it was useful (if it was, do mark the answer as correct on StackOverflow, which will help future readers). But I'm a brand-new user, so take anything I say with a grain of salt. PS I misread your StackOverflow question - I thought it was posted in 2018, not the 18th of this month, which is why I referred to it as of "historical interest". I have edited it now to remove that bit.

user-1af2b3 23 June, 2020, 03:35:11

haha no worries yeah i just started working with pupil labs recently

user-1af2b3 23 June, 2020, 03:35:28

trying to set up a unity app where i track people's vision while they watch a 360 video

user-1af2b3 23 June, 2020, 03:36:06

where can i find the new 2.0 unity package? it says it should be in the latest release but im having trouble finding it

user-1af2b3 23 June, 2020, 03:46:02

oh nvm i found it

user-3ede08 23 June, 2020, 07:52:51

Hey, I have noticed that each element of the array in the eye0_lookup.npy file is a tuple with 4 values. The first one is always 0; the second one starts from 0 and increases by 1; the third value is equal to the eye0_timestamps.npy values. I don't know what the fourth value does. So, my question is: what do the values in the eye0_lookup.npy file correspond to? Thanks

papr 23 June, 2020, 08:03:13

@user-3ede08 https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/video_capture/utils.py#L449-L458

These are the semantics of the file. It is a cache file that we use for efficient seeking within the video files. Keep in mind that Pupil Mobile and Invisible recordings can have multiple video parts for each camera.
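
To illustrate those semantics, here is a sketch that reconstructs the lookup layout with synthetic rows. The field names and dtype are assumptions based on the linked utils.py, so double-check them against the source:

```python
import numpy as np

# Assumed field names/layout, mirroring video_capture/utils.py:
lookup_dtype = np.dtype([
    ("container_idx", "<i8"),        # which video part the frame belongs to
                                     # (always 0 for single-part Capture recordings)
    ("container_frame_idx", "<i8"),  # frame index within that video part
    ("timestamp", "<f8"),            # pupil-time timestamp (matches eye0_timestamps.npy)
    ("pts", "<i8"),                  # the frame's presentation timestamp, used for seeking
])

# lookup = np.load("eye0_lookup.npy")  # the real file; synthetic rows for illustration:
lookup = np.array(
    [(0, 0, 1337471.275137, 0), (0, 1, 1337471.283206, 512)],
    dtype=lookup_dtype,
)
```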

user-3ede08 23 June, 2020, 08:07:22

ok, thanks @papr

user-e33d45 23 June, 2020, 10:11:20

Hi, is it possible to calculate the distance (e.g., in mm or pixel) between a target object (e.g., the center of a surface/area of interest) and the gaze position in a given moment? πŸ€”

user-c5fb8b 23 June, 2020, 13:17:33

Hi @user-e33d45, please note that computing this in mm vs pixels will give quite different values. With mm I assume you are referring to the distance based on the real-world dimensions of your surface; with pixels I assume you are referring to the distance in the camera image. Do I understand you correctly? If the surface changes its distance to the camera, the ratio between mm and pixels will change, so you need to keep this in mind.

That being said, you can compute the distance in mm easily when you define the real-world size of your surface in Capture/Player in mm as well. E.g. if your surface is a computer monitor with a size of 550x310mm, you would put Width: 550.0 and Height: 310.0 in your surface settings. Then when you export, in gaze_positions_on_surface_SURFACENAME.csv you will get scaled gaze positions for your surface. These will be relative to the surface corners, so you need to make sure that the corners of the surface in Pupil line up well with your surface in the real world. The x_scaled and y_scaled values in the export are then in mm relative to the surface origin, and you can easily compute the distance to the surface center.
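
For example, with the 550x310 mm monitor above, the distance computation could look like this (the CSV filename is hypothetical; x_scaled/y_scaled are the export columns and the surface size is whatever you configured):

```python
import numpy as np
import pandas as pd

# Surface size as configured in the surface settings (monitor example above):
SURFACE_W_MM, SURFACE_H_MM = 550.0, 310.0

def dist_to_center_mm(df, width_mm, height_mm):
    """Euclidean distance in mm from each gaze sample to the surface center.

    Expects the x_scaled / y_scaled columns of the
    gaze_positions_on_surface_<NAME>.csv export.
    """
    dx = df["x_scaled"] - width_mm / 2
    dy = df["y_scaled"] - height_mm / 2
    return np.hypot(dx, dy)

# df = pd.read_csv("gaze_positions_on_surface_Monitor.csv")  # hypothetical filename
df = pd.DataFrame({"x_scaled": [275.0, 0.0], "y_scaled": [155.0, 0.0]})
distances = dist_to_center_mm(df, SURFACE_W_MM, SURFACE_H_MM)
```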

user-c5fb8b 23 June, 2020, 13:24:28

@user-e33d45 the procedure I described above will only work for gaze that is already on the surface. We have also received your email to info@pupil-labs.com and will follow up with a more general method there.

user-3ede08 23 June, 2020, 13:43:10

Is there any difference between using pip or conda to install msgpack?

papr 23 June, 2020, 13:48:16

@user-3ede08 pip by itself only installs python modules from https://pypi.org/ into your system's python site-packages folder. If you use virtual environments, pip will install the modules into the virtual environment's site-packages folder. Conda is more than that. Conda creates virtual environments, too, but with the difference that it also isolates system library dependencies. Pip cannot do that.

Specifically, in the case of msgpack, if you are in a conda environment, you can use both.

user-3ede08 23 June, 2020, 13:53:07

Thanks @papr

papr 23 June, 2020, 14:23:28

@user-430fc1 Hey, apologies for the delayed response. I think I need to clarify how timestamps are generated in Pupil in order to respond to your question. We differentiate between hardware timestamps and software timestamps. Hardware timestamps are generated by the camera at the start of the frame exposure. The software timestamps are generated by pyuvc using the system's monotonic clock at the time when the frame has finished transferring from the camera to the computer. The difference between the software and hardware timestamps is what we call camera latency. Camera latency is dependent on frame resolution, as a higher frame resolution requires more data to be transferred from the camera to the computer.

Ideally, we would use hardware timestamps at all times. Unfortunately, we have noticed that the camera and system clocks are not necessarily synchronized at all times and on all OS. Especially on Windows, we have seen major discrepancies. This is very problematic, as each of the three cameras uses its own clock, and if they are not synchronized, pupil data cannot be matched and mapped to gaze properly.

This is why we use corrected software timestamps instead. These are software timestamps from which we subtract a fixed amount of time to approximately compensate for the camera latency.

In summary, the recorded timestamps should correspond to the actual time at which the frame was recorded. Therefore, the relevant questions are (1) how accurate the camera latency is approximated and (2) how much it varies. (Unfortunately, I cannot give representative values for this at the time as we were not able to measure the actual camera delay on Windows due to the desynchronized clocks.) Processing latency and camera frame rate do not play a role at all in this context as they do not affect hardware nor software timestamps.

user-430fc1 23 June, 2020, 15:17:07

@papr - Thanks so much for a highly informative response! So, it sounds like the camera latency is going to be shared by eye and world cameras. Can you clarify what you mean by "a fixed amount of time"? I.e. what is the value that gets subtracted? Also, do you have any 'ballpark' figures regarding the accuracy of approximation / variability for Unix? My testing has resulted in plausible values for PLRs (e.g. latency to 1% constriction = ~250 ms), but I'm keen to know if there are any ways of maximising my chances of getting good timing. For my application (detecting a flash of light), using the world camera at (320 x 240), 120 fps works just fine. Would you advise using the same settings for eye cameras? To what extent is the reliability of pupil size estimation improved by increasing the resolution?

papr 23 June, 2020, 15:22:20

camera latency is going to be shared by eye and world cameras This is only the case if you use the same resolution for both. I would highly recommend doing that in your case if possible.

Also, do you have any 'ballpark' figures regarding the accuracy of approximation / variability for Unix? Not at the moment. Let us gather some measurements to give you a more concrete answer. Once we have that, we can share the measurement script for you to replicate the measurement.

To what extent is the reliability of pupil size estimation improved by increasing the resolution? Unfortunately, I do not have a concrete number for that either. Are you analysing the 2d or 3d pupil diameter?

user-430fc1 23 June, 2020, 15:28:46

@papr Fantastic, I really appreciate your help with this, and very much look forward to the measurements / example script. Currently I'm focusing on 3d to avoid having to correct for perspective (and because mm units are more clinically relevant than pixels).

user-3ede08 23 June, 2020, 18:23:10

Hey, in the documentation you mention that "The payload, e.g. a pupil datum, encoded as msgpack ... is encoded twice". Both the first and the second encoded data are different. Which one should I use ?

user-3ede08 23 June, 2020, 18:35:26

I get None and nan in the head_pose data:

{ 'camera_extrinsics': None, 'camera_pose_matrix': None, 'camera_poses': (nan, nan, nan, nan, nan, nan), ... }

What could be the problem?

papr 23 June, 2020, 18:52:23

@user-3ede08 There are two types of files

The payload, e.g. a pupil datum, encoded as msgpack ... is encoded twice This only applies to .pldata files

papr 23 June, 2020, 18:52:49

The camera poses are likely nan because the camera pose could not be estimated for that world frame

user-3ede08 23 June, 2020, 18:59:29

Sorry, I was talking about the .pldata files. So, which one should I work with?

The camera poses are likely nan because the camera pose could not be estimated for that world frame @papr What should I take into account ?

papr 23 June, 2020, 19:01:56

@user-3ede08 You are looking at the head pose tracking data, correct? Have you set up a head pose tracking model yet? If not I would recommend looking at our youtube tutorial on how to set it up. You can also open the recording in Player, open the Head Pose tracker, hit export, and get the data as CSV. This saves you the hassle of loading the data manually

papr 23 June, 2020, 19:02:29

https://youtu.be/9x9h98tywFI

user-3ede08 23 June, 2020, 19:20:28

ok @papr I will look at it, thanks

user-3f2e42 24 June, 2020, 01:10:33

Hi, I have a question about the pupil-labs installation from the source.

user-3f2e42 24 June, 2020, 01:12:34

About two weeks ago there were no issues running the application from source after installing 'libuvc' from GitHub. But when I try to install from scratch today, there is an error "no module named 'uvc'", even though I already installed 'libuvc' from GitHub.

user-3f2e42 24 June, 2020, 01:13:14

So, has anyone faced this issue who could help me figure it out?

user-c5fb8b 24 June, 2020, 07:03:20

Hi @user-3f2e42, the error message means that uvc cannot be found, which is the python wrapper around libuvc. Which operating system are you on? Have you installed pyuvc in your Python environment?

user-3f2e42 24 June, 2020, 07:21:50

Hi @user-c5fb8b, currently I'm working on Ubuntu 16.04. And I installed pyuvc by cloning the repository from the Pupil Labs GitHub. (I followed the installation guideline there.)

user-3f2e42 24 June, 2020, 07:22:21

But the thing is, about 2 weeks ago it installed correctly and there were none of the errors I mentioned.

user-c5fb8b 24 June, 2020, 07:23:41

@user-3f2e42 what's your Python setup? Are you working in a virtual environment? Theoretically you shouldn't even need to clone pyuvc, running this in your Python environment should be sufficient:

user-3f2e42 24 June, 2020, 07:24:51

@user-c5fb8b Ah.. okay I will try it and let you know soon, thank you! πŸ˜„

user-3f2e42 24 June, 2020, 07:28:44

@user-c5fb8b I found that there is an error while installing 'pyuvc'

user-3f2e42 24 June, 2020, 07:29:03

It returns "Building wheel for uvc (PEP 517) ... error"

user-c5fb8b 24 June, 2020, 07:29:13

Is there any more output?

user-3f2e42 24 June, 2020, 07:29:49

there is 'error: command 'gcc' failed with exit status 1'

user-c5fb8b 24 June, 2020, 07:30:00

Did you add the udev rules as in the docs:

echo 'SUBSYSTEM=="usb",  ENV{DEVTYPE}=="usb_device", GROUP="plugdev", MODE="0664"' | sudo tee /etc/udev/rules.d/10-libuvc.rules > /dev/null
sudo udevadm trigger

user-c5fb8b 24 June, 2020, 07:30:04

?

user-3f2e42 24 June, 2020, 07:30:35

Ah, I don't think so,, I will try it..

user-3f2e42 24 June, 2020, 07:33:23

Yes, there are still the errors.

user-c5fb8b 24 June, 2020, 07:33:51

Please copy all the errors

user-3f2e42 24 June, 2020, 07:33:54

Should I close the terminal?

user-3f2e42 24 June, 2020, 07:34:03

Okay, please wait a second..

user-3f2e42 24 June, 2020, 07:34:36

Installation Error: pyuvc

message.txt

user-3f2e42 24 June, 2020, 07:34:56

Please check the note.

user-c5fb8b 24 June, 2020, 07:39:16

@user-3f2e42 If I understand this correctly, libuvc cannot link against your version of turbojpeg. This might be because turbojpeg was compiled with a different compiler. Did you upgrade your system between the installs? What's the reason for reinstalling, by the way? Did you run all the steps in the docs again? I would suggest you clean up all leftovers from the old installation and run everything again from a clean state.

user-3f2e42 24 June, 2020, 07:41:23

@user-c5fb8b Okay, I got it. I will try the installation after cleaning up all the previous files. If I face the same issue again, I will upload the notes in this chat. Thank you for your support πŸ˜„

user-c5fb8b 24 June, 2020, 07:41:30

As a side note I highly recommend upgrading to a newer LTS version of Ubuntu, as the setup is much easier there.

user-c5fb8b 24 June, 2020, 07:42:11

However, I know this might not always be possible πŸ™‚

user-3f2e42 24 June, 2020, 07:43:54

@user-c5fb8b Okay :D Since it is not the first time I have installed the Pupil source code (it succeeded several times before :D), I believe it will be fine even on Ubuntu 16.04 πŸ™‚

user-3ede08 24 June, 2020, 10:58:30

Hey Pupil Labs community, how can I calibrate the Pupil Core to get gaze, as well as head pose, while driving?

papr 24 June, 2020, 11:45:52

@user-3ede08 The head pose is independent of the gaze estimation. I would suggest using single marker calibration with a physical/printed marker in the car + apriltag markers in the car for the head pose estimation

user-3ede08 24 June, 2020, 13:42:45

Ok @papr thanks.

user-94f759 24 June, 2020, 14:14:38

Hey guys, is there any restriction on a plugin using CUDA when running from source?

papr 24 June, 2020, 14:21:20

@user-94f759 no. When running from source, you can use any Python module you like. This is specifically useful if you want to use PyTorch with CUDA support for example.

user-c5fb8b 24 June, 2020, 14:23:27

@user-94f759 however, keep in mind not to block the main thread for too long, or Pupil might become unresponsive. If you have longer-running computations, you might consider using a background process.

user-94f759 24 June, 2020, 14:24:19

thanks !

user-a98526 24 June, 2020, 14:29:14

@papr Hi, what other camera can I use with the USB-C mount now that the Realsense camera is no longer supported?

papr 24 June, 2020, 14:33:42

@user-a98526 if you want to use the built-in UVC backend, your camera needs to fulfill the following criteria (chapters refer to the document linked below):
1) UVC compatible [1]
2) Support Video Interface Class Code 0x0E CC_VIDEO (see A.1)
3) Support Video Subclass Code 0x02 SC_VIDEOSTREAMING (see A.2)
4) Support the UVC_VS_FRAME_MJPEG (0x07) video streaming interface descriptor subtype (see A.6)
5) Support the UVC_FRAME_FORMAT_COMPRESSED frame format

[1] http://www.cajunbot.com/wiki/images/8/85/USB_Video_Class_1.1.pdf

For other cameras, you have to write your own video backend.

user-a98526 24 June, 2020, 14:37:38

Can you recommend some cameras, thanks!

papr 24 June, 2020, 14:38:49

@user-a98526 Unfortunately, I do not have any personal experience with other cameras than the cameras that are already supported by Pupil Capture.

user-bbee68 24 June, 2020, 14:40:21

hey guys! i'm a developer in korea using Pupil in a medical device. i can't find any c or c++ native code. can i get it?

papr 24 June, 2020, 14:41:26

@user-bbee68 Pupil is mainly written in Python. You can access our network API (https://docs.pupil-labs.com/developer/core/network-api/) though using zmq for c or c++.

user-bbee68 24 June, 2020, 14:43:04

@user-bbee68 Pupil is mainly written in Python. You can access our network API (https://docs.pupil-labs.com/developer/core/network-api/) though using zmq for c or c++. @papr thanks! i read them!

user-a98526 24 June, 2020, 14:45:44

I wanted to ask about the cameras that Pupil already supports, because the Realsense I use doesn't work very well.

papr 24 June, 2020, 14:48:34

@user-a98526 By "the cameras that are already supported by Pupil Capture", I was referring to the "High speed camera" which is the default scene camera for the Pupil Core headset. https://pupil-labs.com/cart/?pupil_w120_e200b=1

So please let me clarify: I do not have any experience with other USB-C connected scene cameras.

user-a98526 24 June, 2020, 15:03:30

I think I need the information about this default camera, then I can buy and use it

user-a98526 24 June, 2020, 15:24:17

I mean the information about the "High speed camera" and where I can get it. Thanks for your help.

papr 24 June, 2020, 15:28:21

@user-a98526 I do not know if this camera can be purchased separately. I will forward your question to our sales team and come back to you.

user-a98526 24 June, 2020, 15:34:38

@papr thank you !

user-f3cfc3 24 June, 2020, 18:48:58

hi, i have a question about why my recording is a grey screen for the entire length of the experiment when i open it in pupil player. any ideas on how to fix it?

Chat image

papr 24 June, 2020, 18:50:11

@user-f3cfc3 it looks like your scene camera was not recorded correctly. Do you have a world.mp4 file in your recording?

user-f3cfc3 24 June, 2020, 18:56:55

hmm, it looks like the world.mp4 files are missing. i'll look into this further. thank you!

user-3ede08 24 June, 2020, 23:23:59

Hey pupil labs members, in the .pldata we have {'base_data':({'circle_3d':{'center':( .... Does the center parameter refer to the center of each pupil?

wrp 25 June, 2020, 01:36:32

@user-3ede08 this should be the center of the pupil as a 3d circle, in eye pinhole camera space units (x, y, z). @papr or @user-c5fb8b feel free to elaborate.

user-c5fb8b 25 June, 2020, 06:23:01

@user-3ede08: @wrp is correct here. The base data contains the pupil datums which were used to calculate the gaze data in your .pldata file. The content in base_data essentially has the same format as the pupil_positions.csv export (only hierarchical instead of flattened), you can find more information in the docs: https://docs.pupil-labs.com/core/software/pupil-player/#pupil-positions-csv

user-3ede08 25 June, 2020, 06:33:41

I have used both center values of the { 'base_data': ( { 'circle_3d': { '**center**': (..., ..., ...) ... in the gaze.pldata, and I thought I had calculated my interpupillary distance. I got about 35 mm. But wherever I look, the value is about 60 mm

Chat image

papr 25 June, 2020, 06:41:12

@user-3ede08 Pupil data is located in the eye camera coordinate system. Each eye camera has its own, unrelated coordinate system. During calibration we assume a fixed interpupillary distance (IPD) (or, more specifically, the location of the 3d eye ball centers in relation to the scene camera). Therefore, you cannot use the 3d eye ball positions to calculate the IPD.

user-3ede08 25 June, 2020, 06:54:29

ok, thanks to all of you.

user-bd800a 25 June, 2020, 09:30:13

Hi, I still have a problem with my Pupil Core. I had two cameras with the same name; I then changed the friendly name of one in the registry, but I still can't see it in Capture.

papr 25 June, 2020, 09:31:24

@user-bd800a ~~Where did you change the names? In Pupil Mobile?~~ I misread your question. Pupil Capture expects a specific set of camera names, i.e. Pupil Cam1 ID2, Pupil Cam2 ID0, and Pupil Cam2 ID1.

user-bd800a 25 June, 2020, 09:32:31

in the windows registry using the driver key

user-bd800a 25 June, 2020, 09:33:25

I have 2 sets of glasses, one worked normally, the other had two Cam2 ID0

user-bd800a 25 June, 2020, 09:33:29

I changed one

user-bd800a 25 June, 2020, 09:33:48

then switch back to the other glasses, now I also have again two Cam2 ID0

user-bd800a 25 June, 2020, 09:34:06

actually on the one that worked I have two Cam1 ID0

papr 25 June, 2020, 09:34:28

@user-bd800a We have never tested such a name change in the registry. So I do not know if that actually changes something for Capture.

user-bd800a 25 June, 2020, 09:35:37

It does not, I think, since I only see the world and one eye camera

papr 25 June, 2020, 09:35:46

Do you see the duplicated names in the device manager or in Capture?

user-bd800a 25 June, 2020, 09:35:53

device manager

papr 25 June, 2020, 09:36:16

ok, please ignore that as long as Capture correctly recognizes the camera names.

user-bd800a 25 June, 2020, 09:36:25

Capture does not

papr 25 June, 2020, 09:38:12

Could you please test that on a separate computer, ideally macOS or Linux, and verify the duplicated name appears there, too? Given that you made changes to the registry, I cannot tell if this is an issue with Capture, the drivers, or the device itself.

papr 25 June, 2020, 09:43:41

@user-bd800a The best way to verify it is by (1) downloading and running the most recent version of Pupil Capture, (2) go to the Video Source menu, (3) "Enable manual camera selection", and (4) check which cameras are listed in the "Activate Device" selector.

user-bd800a 25 June, 2020, 10:30:50

Ok, with the latest version it worked, and it also fixed the issue with the previous versions

user-bd800a 25 June, 2020, 10:32:16

thanks a lot

user-bbee68 25 June, 2020, 12:00:41

@papr my friend! can i use pupil for android device directly?

user-bbee68 25 June, 2020, 12:01:50

maybe pupil core on android? or ios?

papr 25 June, 2020, 12:03:56

@user-bbee68 You can record Pupil Core video on Android using Pupil Mobile. But you will have to transfer the recordings to a computer running Pupil Player for analysis https://docs.pupil-labs.com/core/software/pupil-mobile/#pupil-mobile

user-bbee68 25 June, 2020, 12:08:43

@user-bbee68 You can record Pupil Core video on Android using Pupil Mobile. But you will have to transfer the recordings to a computer running Pupil Player for analysis https://docs.pupil-labs.com/core/software/pupil-mobile/#pupil-mobile @papr oh thanks! One more thing: does it do gaze tracking on Android, e.g. in a Google Cardboard environment? Or can it only be used with a Vive or something similar?

papr 25 June, 2020, 12:11:54

@user-bbee68 Pupil Mobile does not do any live analysis. It only records video from the connected eye tracker. Therefore, it is not usable in any VR/AR-like environment.

user-bbee68 25 June, 2020, 12:16:47

@papr aha! thanks! I use it at a medical center for dizziness patients. They are mostly older people who are not good with computers, but they can use an Android phone, and we suggest it for some kind of rehabilitation of dizziness disease. Anyway, thanks!

papr 25 June, 2020, 12:21:40

@user-bbee68 What type of data are you looking for in regard to the dizziness disease? This sounds more like a use case for Pupil Invisible: https://pupil-labs.com/products/invisible/ Please be aware that Pupil Invisible does not provide any pupillometry data yet. If you have more questions in regard to that, please checkout πŸ•Ά invisible or mail your questions to info@pupil-labs.com

papr 25 June, 2020, 12:22:24

It comes with an Android phone and a Companion app that is very easy to use.

user-430fc1 25 June, 2020, 12:49:26

Does anyone know what might be causing this error when I load a recording into the new version of Pupil Player? I didn't encounter this with the older versions of the software

Chat image

papr 25 June, 2020, 12:50:03

@user-430fc1 This is a very short recording, correct?

user-430fc1 25 June, 2020, 12:50:35

@papr yes, only a few seconds testing

papr 25 June, 2020, 12:52:48

@user-430fc1 This is an issue with very short recordings and the drawing of the timeline. It does not influence your data or anything. You should not see this error in longer recordings. We will look into the cause of the error.

user-430fc1 25 June, 2020, 13:39:48

@papr FYI, I still encounter this error with ~2min recordings, but it does not seem to affect the data.

user-430fc1 25 June, 2020, 15:44:34

@user-430fc1 I have to correct myself, there is actually a way of setting the exposure mode via notification: You can start any plugin remotely with custom arguments via the notification:

{
    "subject": "start_plugin",
    "name": "<plugin-name>",
    "args": <dict-with-arguments>,
}

The UVC_Source plugin is the plugin handling the UVC connection, and it accepts uvc_control as an argument in __init__ for starting with custom UVC settings. With that you should be able to basically restart the UVC_Source with modified UVC settings. You will need to do some digging in the code to figure out the necessary startup args, however. @user-c5fb8b Thanks again for this. I've tried various ways of doing this with a notification, but I can't seem to find the right arguments, and Pupil Capture always becomes unresponsive. Here's what I've been doing: notify(pupil_remote, {"subject": "start_plugin", "name": "UVC_Source", "args": {"uvc_controls": {"exposure_mode": "manual"}}}). If you can offer any insights they would be greatly appreciated!

papr 25 June, 2020, 15:46:32

exposure_mode is not part of the "uvc_controls". https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/video_capture/uvc_backend.py#L59-L68

papr 25 June, 2020, 15:48:34

Try

notify(
    pupil_remote,   
    {
        "subject":"start_plugin",
        "name":"UVC_Source",
        "args": {
            "frame_size": (1920, 1080),
            "frame_rate": 30,
            "exposure_mode":"manual"
         }
    }
)
papr 25 June, 2020, 15:49:40

frame size and frame rate are non-optional

papr 25 June, 2020, 15:50:34

I would also recommend setting name or preferred_names

user-430fc1 25 June, 2020, 16:00:41

@papr Thanks - I sent the above notification and included "name":"World" - it didn't crash Capture, but it looks like it killed the world process. I'll play about with it a bit more. Thanks for your help. The reason I want to do this, by the way, is that my application requires specific camera settings, and it would be easiest to set them programmatically rather than remember to do it manually in Pupil Capture at startup.

papr 25 June, 2020, 16:01:05

Please try "Pupil Cam1 ID2" instead

papr 25 June, 2020, 16:02:23

See these default arguments as reference https://github.com/pupil-labs/pupil/blob/master/pupil_src/launchables/world.py#L291-L305

user-430fc1 25 June, 2020, 16:05:14

@papr Bingo. Thanks!
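The `notify(...)` helper used in this exchange is never defined in the thread. Below is a hedged sketch of such a helper based on Pupil's documented network API (a zmq REQ socket on Pupil Remote's default port 50020, topic prefix `notify.`, msgpack-encoded payload). The address, port, and plugin arguments are assumptions; verify them against your Capture instance and the linked `world.py` defaults.

```python
# Sketch of a notify() helper for Pupil Remote, following the Pupil Labs
# network API (zmq REQ socket, msgpack payloads). Port 50020 is the
# documented default; check Capture's Pupil Remote menu for yours.
import msgpack
import zmq


def notify(socket, notification):
    """Send a notification dict to Pupil Remote and return the reply."""
    topic = "notify." + notification["subject"]
    payload = msgpack.dumps(notification, use_bin_type=True)
    socket.send_string(topic, flags=zmq.SNDMORE)
    socket.send(payload)
    return socket.recv_string()


if __name__ == "__main__":
    ctx = zmq.Context()
    pupil_remote = ctx.socket(zmq.REQ)
    pupil_remote.connect("tcp://127.0.0.1:50020")  # assumed local Capture
    print(notify(pupil_remote, {
        "subject": "start_plugin",
        "name": "UVC_Source",
        "args": {
            "preferred_names": ["Pupil Cam1 ID2"],  # world camera
            "frame_size": (1920, 1080),  # non-optional, per the thread
            "frame_rate": 30,            # non-optional, per the thread
            "exposure_mode": "manual",
        },
    }))
```

The REQ/REP pattern means every notification gets a reply string from Capture; forgetting to `recv` it will block the next send.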

user-dcca8c 26 June, 2020, 21:24:33

Hi, I have a very basic question: when running Pupil Capture, should a window pop up with the eye camera view? I'm running on a Windows 10 computer and ran Pupil Capture as an administrator. A command window popped up and a window with the World camera view, but not an eye camera view. When I tried to calibrate, no data was recorded, so I think something might be wrong with the eye camera connection. I tried deleting the camera device drivers and unplugging/replugging the USB to reinstall the drivers, but that did not fix the problem.

user-220849 27 June, 2020, 00:25:07

I believe it should show the world view and then one window for each eye camera view. Does it show any error in the terminal/command window that pops up when you open pupil capture?

user-e70d87 28 June, 2020, 15:07:32

I'm having an issue where Pupil Player (latest version) doesn't seem to notice when I change the "Minimum Pupil Confidence" value. When I try to calculate the calibration, it still shows the message "##%% of Pupil data was discarded because confidence >.8"

user-0f5d40 29 June, 2020, 01:00:22

Hello, this is a basic question but I am trying to analyze my video recording for fixations at certain AOI's and I wanted to confirm if the yellow circle (which identifies a fixation exists at that frame) is where the person's eye gaze is or is the eye gaze still located at the purple-colored dot that normally represents the gaze position; as these are not matching up in the frame.

user-a98526 29 June, 2020, 04:31:38

@papr Hi, When I use USB to connect the pupil to the computer, my pupil capture software cannot open normally

Chat image

user-a98526 29 June, 2020, 04:31:40

Chat image

user-a98526 29 June, 2020, 04:49:58

If I don't connect the Pupil Core to my laptop, Pupil Capture opens normally.

user-c5fb8b 29 June, 2020, 06:41:10

Hi @user-a98526 unfortunately your screenshot of the terminal windows seems to be cut off on the right side, which prevents me from reading the full log. Can you try this again and send a screenshot of the entire terminal window?

user-c5fb8b 29 June, 2020, 06:45:28

Hi @user-dcca8c, yes there should be windows for the eye camera views. Is this a fresh installation of Pupil? Maybe the eye processes are disabled. Can you check the general settings menu on the top right? There should be two buttons to "Detect eye 0/1", are these enabled? If they are enabled and you still don't see any eye-window, please hit the button "restart with default settings" in the general settings window. One of these should hopefully solve your problem.

user-c5fb8b 29 June, 2020, 06:46:28

Hi @user-e70d87, this is indeed a bug in the latest version of Pupil Player. We are working on a fix.

user-c5fb8b 29 June, 2020, 06:53:50

Hi @user-0f5d40, can you share a screenshot of a frame where this appears to be an issue? I'm not sure what you refer to with "purple-colored dot", the gaze indicator is by default a green circle with a red dot in the center. Please be aware that a fixation is not several gaze points at the exact same position, but several gaze points with a small distance. This is what you control with the dispersion sliders in the fixation detector menu. Because of this, some of the gaze points that belong to a fixation will not be perfectly centered at the fixation.
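To make the dispersion idea above concrete, here is a minimal sketch of a dispersion check. This is not Pupil's actual fixation detector (which operates on gaze angles and durations); it only illustrates that a fixation is a group of gaze points whose maximum pairwise distance stays under a threshold, the quantity the dispersion sliders control.

```python
import numpy as np


def max_dispersion(points):
    """Maximum pairwise distance among 2D gaze points."""
    pts = np.asarray(points, dtype=float)
    # Pairwise difference vectors via broadcasting: (N, N, 2)
    diffs = pts[:, None, :] - pts[None, :, :]
    return np.sqrt((diffs ** 2).sum(axis=-1)).max()


def is_fixation(points, max_dispersion_thresh):
    """True if all gaze points lie within the dispersion threshold."""
    return max_dispersion(points) <= max_dispersion_thresh
```

A consequence of this definition: individual gaze points of a fixation can sit anywhere inside the dispersion radius, so they will rarely coincide exactly with the drawn fixation center.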

user-a98526 29 June, 2020, 07:06:39

This is the whole picture.

Chat image

user-c5fb8b 29 June, 2020, 07:26:59

@user-a98526 Pupil is having trouble installing the necessary drivers on your system. Were you able to run Pupil in the past, or is this your first run on this machine? I think the easiest solution is to try and uninstall/reinstall Pupil. Please uninstall Pupil via the official Windows "Add or remove programs" system settings page. Please notify us if the problem persists after reinstalling Pupil.

user-3ede08 29 June, 2020, 07:31:10

Hey, how long does it take to detect the markers, when tracking the head. I have clicked on start detection about 15 min ago to detect markers (running time = 25 s) and it is still running.

user-a98526 29 June, 2020, 07:34:31

@user-c5fb8b i can run pupil capture only when the pupil core is not connected to the laptop, I'll try reinstall Pupil. Thank you

user-c5fb8b 29 June, 2020, 07:35:11

@user-a98526 this is to be expected, Pupil will only try installing the USB drivers if you have a headset connected via USB.

user-3ede08 29 June, 2020, 07:41:02

The marker detection bar is completed, but the pupil player software is still running

Hey, how long does it take to detect the markers, when tracking the head. I have clicked on start detection about 15 min ago to detect markers (running time = 25 s) and it is still running. @user-3ede08

user-a98526 29 June, 2020, 07:51:36

Hi,@user-c5fb8b I reinstall Pupil, but the same problem occurred.

user-3ede08 29 June, 2020, 08:32:24

Hey, I have a problem concerning the head pose tracker. Each time I click on "start detection" in the head pose tracker, Pupil Player runs nonstop, and the only way to stop it is to force-quit it. No function is accessible, no buttons are working. How could I handle the problem?

papr 29 June, 2020, 08:36:27

@user-3ede08 Did you mean offline instead of online head pose tracker?

papr 29 June, 2020, 08:37:40

@user-3ede08 Please try again, and as soon as the application freezes (i.e. the buttons stop working) please copy the player.log file in the Home directory -> pupil_player_settings folder and share it with us.

user-c5fb8b 29 June, 2020, 08:43:54

Hi @user-a98526, are you running Capture as administrator? If not: do you see a popup in Windows requesting administrator access? Can you try right-clicking Capture and running it as administrator?

If this does not help: Pupil uses Microsoft Powershell to install the drivers in the background. Normally every Windows installation should have access to Powershell, but maybe for some reason your machine does not have it or Capture might be unable to access it. To test this, please open a command prompt window. You can do this by opening the Windows start menu and typing "Command Prompt". Please open this App. A black terminal window should open. In this window, please type:

powershell

and hit enter. Please copy and paste the text that appears afterwards, it should be something like:

Windows Powershell
Copyright (C) Microsoft Corporation. All rights reserved.

After this you can just close the Command Prompt window again.

user-3ede08 29 June, 2020, 08:44:59

In the Plugin Manager -> Head Pose Tracker -> (on the top) there is written : Offline Head Pose Tracker. I don't see online Head Pose.

@user-3ede08 Did you mean offline instead of online head pose tracker? @papr

papr 29 June, 2020, 08:46:07

@user-3ede08 ok, thank you. I was just asking because you wrote "online" before. But this is clarified now. If you could share that log file I would be able to check what the issue is.

papr 29 June, 2020, 08:54:46

@user-3ede08 I have just remembered that we have made an update to Pupil Player last Thursday that fixed an issue with the Head Pose Tracker in Player. Are you using version v2.0-182 already? If not, please download it here and try again: https://github.com/pupil-labs/pupil/releases/latest#user-content-downloads

user-a98526 29 June, 2020, 08:55:05

Hi @user-c5fb8b, Translation: 'powershell' is not an internal or external command, nor is it a running program or batch file.

Chat image

user-3ede08 29 June, 2020, 08:56:01

@user-3ede08 I have just remembered that we have made an update to Pupil Player last Thursday that fixed an issue with the Head Pose Tracker in Player. Are you using version v2.0-182 already? If not, please download it here and try again: https://github.com/pupil-labs/pupil/releases/latest#user-content-downloads @papr I am using the version you released about 2 weeks ago. Have you released a new one ?

papr 29 June, 2020, 08:56:18

@user-3ede08 Yes, we have updated it last Thursday.

user-a98526 29 June, 2020, 08:56:39

But I can start PowerShell like this: start powershell

user-c5fb8b 29 June, 2020, 08:57:57

@user-a98526 Ah okay! So powershell is installed, but it cannot be found. Have you made any custom modifications to your PATH environment variable on Windows?

user-a98526 29 June, 2020, 09:00:06

@user-c5fb8b Because many applications have added environment variables, I’m not sure if I modified some of them.

user-c5fb8b 29 June, 2020, 09:01:43

@user-a98526 can you open the window where you edit the environment variables

user-a98526 29 June, 2020, 09:03:57

This

Chat image

user-3ede08 29 June, 2020, 09:04:21

I mean using the head pose, so we can detect if someone is moving his head left/right, up/down, ...

Will it be possible to get something like this with the pupil core ? @user-3ede08

user-a98526 29 June, 2020, 09:04:31

This is a system variable

Chat image

user-c5fb8b 29 June, 2020, 09:04:36

@user-a98526 the PATH variable can be set either in the upper section (user settings) or bottom section (system settings). One of them should contain a path to the powershell directory, for me, the system PATH variable contains:

%SYSTEMROOT%\System32\WindowsPowerShell\v1.0\
user-c5fb8b 29 June, 2020, 09:04:52

You can double click on the Path variable to get a list of its content

user-c5fb8b 29 June, 2020, 09:06:07

@user-a98526 this is e.g. the content of my system's Path variable:

Chat image

user-c5fb8b 29 June, 2020, 09:06:18

I assume the WindowsPowerShell might be missing for you?

papr 29 June, 2020, 09:06:44

@user-3ede08 The head pose tracker exports the head pose as a 6-component vector for each scene frame. The first three components are the rotation vector in Rodrigues format [1] and the last three components are the translation vector. I am not sure if these rotation vectors correspond to yaw/pitch/roll, but if they do not, you should be able to transform them into your preferred format. My colleague who developed the plugin is currently on vacation. I will forward your question to her once she is back.

[1] https://en.wikipedia.org/wiki/Rodrigues%27_rotation_formula
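As a concrete illustration of [1]: the exported rotation vector can be converted to a rotation matrix and from there to yaw/pitch/roll. The sketch below assumes a ZYX (yaw-pitch-roll) Euler convention, which may not match the head pose tracker's axes; treat the convention and axis order as assumptions to verify against your own data.

```python
import numpy as np


def rodrigues_to_matrix(rvec):
    """Convert a Rodrigues rotation vector to a 3x3 rotation matrix."""
    rvec = np.asarray(rvec, dtype=float)
    theta = np.linalg.norm(rvec)  # rotation angle in radians
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta  # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])  # cross-product matrix of k
    # Rodrigues' rotation formula: R = I + sin(t) K + (1 - cos(t)) K^2
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)


def matrix_to_euler_zyx(R):
    """Extract yaw (Z), pitch (Y), roll (X) in radians, ZYX convention."""
    pitch = -np.arcsin(np.clip(R[2, 0], -1.0, 1.0))
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return yaw, pitch, roll
```

For example, a rotation vector of (0, 0, pi/2) yields a yaw of pi/2 with zero pitch and roll under this convention.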

user-3ede08 29 June, 2020, 09:08:58

Please, when will she come back ?

papr 29 June, 2020, 09:09:35

@user-3ede08 You can expect a response around the end of the week.

user-3ede08 29 June, 2020, 09:12:23

ok, thanks.

user-a98526 29 June, 2020, 09:17:35

@user-c5fb8b I think I am missing this environment variable

Chat image

user-c5fb8b 29 June, 2020, 09:18:05

@user-a98526 is this your user Path or system Path?

user-a98526 29 June, 2020, 09:19:49

This is the system Path, and the user Path is:

Chat image

papr 29 June, 2020, 09:20:39

@user-a98526 Is this a private computer or has this computer been provided to you by your employer/university?

user-a98526 29 June, 2020, 09:22:22

This is a private computer.

user-c5fb8b 29 June, 2020, 09:24:47

@user-a98526 I'm afraid you might have messed up your Path variable in the past. You can try just adding %SYSTEMROOT%\System32\WindowsPowerShell\v1.0\ to your system Path; maybe that fixes it. Please be aware that other functionality might also not work correctly with this setup.

I would actually recommend a clean reinstall of Windows in your case. Please never delete anything from the environment variables unless you are 100% sure that this is what you have to do, and try not to adjust the system variables.

user-3ede08 29 June, 2020, 09:30:39

@user-3ede08 I have just remembered that we have made an update to Pupil Player last Thursday that fixed an issue with the Head Pose Tracker in Player. Are you using version v2.0-182 already? If not, please download it here and try again: https://github.com/pupil-labs/pupil/releases/latest#user-content-downloads @papr It works with the new one, thanks @papr

user-39c8c4 29 June, 2020, 13:29:44

Hello, what is the meaning of the "Default Gaze Mapper" line in pupil Player ?

Chat image

papr 29 June, 2020, 13:31:19

@user-39c8c4 Have you checked out our post-hoc pupil detection and gaze mapping tutorial already? https://www.youtube.com/watch?v=_Jnxi1OMMTc&list=PLi20Yl1k_57rlznaEfrXyqiF0sUtZMMLh It explains the complete workflow and how the gaze mapper relates to it.

user-39c8c4 29 June, 2020, 13:38:03

@papr Thank you for your answer. I've already watched this video but I don't understand everything. I understand the white rectangles from the "Reference" line, but I don't understand the "Default gaze mapper" just bellow (the colored lines)

papr 29 June, 2020, 13:39:07

There are three colored lines: one refers to the calibration range, one to the mapping range, and one to the validation range. Each can be set in the corresponding sub menu.

user-39c8c4 29 June, 2020, 13:40:28

Oh yes thank you very much for your help !

user-c87ed5 29 June, 2020, 22:18:53

Hi everyone, I have trouble opening an old recording (made with Pupil Capture 0.7.3) in Pupil Player 0.7.1 and 0.4 on Ubuntu 14.04. I am getting the error: "info.csv is not valid". The files contained in the folder are in the image attached. Best, Pablo

Chat image

user-7daa32 29 June, 2020, 23:26:39

By the way please what is the major function of the headpose tracker plugin?

papr 30 June, 2020, 07:11:33

@user-7daa32 The idea is to set up an external coordinate system with markers in which the scene camera can be tracked. This allows you e.g. to map gaze from multiple participants into a common 3d space. Check out our tutorial on using the plugin https://www.youtube.com/watch?v=9x9h98tywFI

user-2ff80a 30 June, 2020, 10:49:23

Hi everyone! I have a problem with the timestamps in the recording. I get a negative timestamp. What could possibly be the reason for this? How can this be rectified?

papr 30 June, 2020, 11:03:58

@user-2ff80a we have seen this before on Windows, but as long as the timestamps are monotonically increasing, there is nothing to worry about

user-2ff80a 30 June, 2020, 11:05:43

@papr thanks for the reply. But I also have another issue regarding timestamps. Shouldn't this timestamp map to the real clock time? Or is it something else?

papr 30 June, 2020, 11:06:32

Unless you have synced it explicitly to the Unix epoch or another clock, no

papr 30 June, 2020, 11:07:15

I can link the corresponding documentation when I am back at my desk

user-2ff80a 30 June, 2020, 11:07:48

@papr okay that would be great! thank you

user-0f5d40 30 June, 2020, 13:58:50

@user-c5fb8b Hi, thank you for getting back to me. Attached are two images that show what I'm referring to. Also, is there a default frame rate for the world & eye cameras?

Chat image

user-0f5d40 30 June, 2020, 13:59:02

Chat image

papr 30 June, 2020, 13:59:15

@user-2ff80a https://docs.pupil-labs.com/core/terminology/#timing

user-2ff80a 30 June, 2020, 14:00:38

@papr Thanks alot! will check this.

user-0fcca1 30 June, 2020, 14:01:53

Hi ! I'm having a problem with Pupil Player. It used to work fine, but today nothing happens, i can't seem to open it. I tried to uninstall and re-install but it still doesn't work. Also, Pupil Capture and Pupil Service work just fine. (i'm working with Windows 10). Did anyone ever encounter the same issue ?

user-2ff80a 30 June, 2020, 14:03:51

@papr but it would also be great to know if there's any way that I can get rid of the negative timestamp issue, as I have to map this to the real clock time precisely in my project.

papr 30 June, 2020, 14:05:37

@user-2ff80a In this case, it does not matter whether they are negative or not. The important question is how to synchronize the two clocks after the fact. For this, check out info.player.json. It contains the recording start time in both Pupil time and system time. Calculate the difference and apply it to the exported timestamps in order to shift the exported epoch to the Unix epoch.
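A minimal sketch of that shift, assuming info.player.json contains the fields "start_time_synced_s" (Pupil clock) and "start_time_system_s" (Unix clock) as in current recording formats; check the file from your own recording before relying on these key names.

```python
import json


def pupil_to_unix_offset(info_player_json_path):
    """Offset that shifts Pupil timestamps onto the Unix epoch.

    Assumes the recording's info.player.json provides
    "start_time_synced_s" (Pupil clock) and "start_time_system_s"
    (Unix clock) for the same instant: the recording start.
    """
    with open(info_player_json_path) as f:
        info = json.load(f)
    return info["start_time_system_s"] - info["start_time_synced_s"]


def to_unix(pupil_timestamps, offset):
    """Apply the constant offset to a sequence of Pupil timestamps."""
    return [t + offset for t in pupil_timestamps]
```

Because both start times describe the same moment, their difference is a constant clock offset; adding it to every exported timestamp (negative or not) yields Unix-epoch times.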

papr 30 June, 2020, 14:06:59

@user-c87ed5 Hi, these are very old versions that you are using. Not sure if we will be able to reproduce this issue. But if you want you can share the info.csv with us and we can have a look.

user-2ff80a 30 June, 2020, 14:07:23

@papr okay thank you!

papr 30 June, 2020, 14:08:37

@user-0f5d40 are these screenshots from the exported world video?

user-c5fb8b 30 June, 2020, 14:14:59

Hi @user-0fcca1, can you share the log file with us? You can find it in your home folder > pupil_player_settings > player.log

user-c5fb8b 30 June, 2020, 14:17:02

@user-0fcca1 maybe it already helps if you just clear your player user settings: in your home folder > pupil_player_settings, delete all files starting with user_settings

user-0fcca1 30 June, 2020, 14:17:50

[email removed] it is !

player.log

user-0fcca1 30 June, 2020, 14:18:48

@user-c5fb8b i deleted the files and now it works ! thanks a lot ! have a good day !

user-0f5d40 30 June, 2020, 23:27:33

@papr yes, they are just for example.

End of June archive