πŸ‘ core


wrp 01 May, 2019, 14:15:06

Hi @user-e3dae5 it looks like the port is already in use. Please check the task manager to see if there are any other versions of Pupil Capture or Pupil Service running in the background. If so, stop the task and then restart Pupil Capture.

user-4fb664 02 May, 2019, 14:13:41

Can anyone guide me on how to get started with the software on Fedora?

user-e3dae5 02 May, 2019, 14:29:47

Thanks @wrp, but that wasn't the main error, I think. If I run pupil-service, it runs. But as soon as I run pupil-capture, it doesn't work. It seems to launch in "ghost mode"

user-e3dae5 02 May, 2019, 14:29:57

here is the output

Chat image

papr 02 May, 2019, 14:30:24

@user-e3dae5 Does it work on a different computer running Capture?

user-e3dae5 02 May, 2019, 14:32:29

I didn't try on another computer yet.

papr 02 May, 2019, 14:33:39

Please do. This way we can narrow down the issue.

user-2798d6 02 May, 2019, 22:22:36

Hello - I'm sorry to possibly repeat a previously asked question, but I am having issues with audio not lining up with the video. Pretty soon into the video, the audio is ahead of the visual. Is there a way to fix this? Thank you in advance!

user-2daf6d 03 May, 2019, 06:57:21

I'm interested in developing gaze interaction with Bokeh (python) information visualisations. Such as hover and tap, but with gaze. Has anyone tried something similar?

user-8779ef 03 May, 2019, 12:24:55

Guys, how does the 3D pupil ID algorithm compensate when the contour of the pupil is interrupted by a corneal reflection? Does it try and fit a model to each visible portion of the pupil contour? Does it do some sort of infilling, or interpolation of the pupil's circumference?

user-8779ef 03 May, 2019, 12:27:09

A few sentences would be helpful. With our current VR modification, the circle of LEDs in the HMD makes a ring of reflections on the cornea that are approximately the size of the pupil. If the ring also aligns with the edge of the pupil, then the pupil's contour can be broken in several places at once by the 5 reflections.

user-00d9e2 04 May, 2019, 12:46:10

Hi community! I'm new to using Pupil Labs for my project on autonomous vehicles and I have a problem with calibration. Can you all give me tips and tricks for it? It misses almost all of the targeted things.

user-9dbb42 05 May, 2019, 15:27:50

Upon clicking the Player to run, the black box is empty and the grey region for accepting files does not open. I downloaded the software again and the attempt to run the Player failed again.

Player worked one week ago with no problems.

Pupil capture works well. Running Windows 10

papr 05 May, 2019, 15:51:32

@user-9dbb42 Try deleting the user_settings_* files in the pupil_player_settings folder before starting Player

user-9dbb42 05 May, 2019, 15:55:29

thank you very much, it worked!! 😃

user-ca46a5 05 May, 2019, 17:50:52

Hello, I used to be able to play the world_viz.mp4 files on other computers when I wanted to demonstrate some things to people who do not have Pupil Player, but I cannot do that with the world.mp4 files since the new release. Do I need to do something different? Thank you very much

papr 05 May, 2019, 17:52:05

@user-ca46a5 Do you mean the exported world.mp4 in the exports/00X/ folder? or the world.mp4 in the top level folder of the recording?

user-ca46a5 05 May, 2019, 17:52:28

yeah, the exported one

papr 05 May, 2019, 17:53:27

That is unexpected since other than the file name nothing should have changed.

papr 05 May, 2019, 17:53:39

Which external player do you refer to?

user-ca46a5 05 May, 2019, 17:54:10

VLC media player; it says: "This file isn't playable. That might be because the file type is unsupported, the file extension is incorrect, or the file is corrupt."

user-ca46a5 05 May, 2019, 17:54:46

even though it's the file in the export folder

papr 05 May, 2019, 17:56:00

If I remember correctly, if the export is cancelled, the resulting file might not be playable. Can you try exporting again and make sure that the export finishes as expected?

user-ca46a5 05 May, 2019, 17:56:28

okay, let me try

user-ca46a5 05 May, 2019, 17:59:02

still not playable

user-ca46a5 05 May, 2019, 17:59:03

hmmm

papr 05 May, 2019, 17:59:50

Does this happen for all of your recordings or only this one? Also which OS and Player version do you use?

user-ca46a5 05 May, 2019, 18:03:10

yes, it's the same for all my recordings.

user-ca46a5 05 May, 2019, 18:04:03

I have Windows, and for Pupil Player I downloaded the pupil_v1.11-4-gb8870a2_windows_x64 one

user-ca46a5 05 May, 2019, 18:04:13

I have also tried to play it in Windows Media Player

user-ca46a5 05 May, 2019, 18:04:21

it also can not play it

papr 05 May, 2019, 18:05:10

Could you share one of the recordings with data@pupil-labs.com ? I will try to replicate the issue in the coming week.

user-ca46a5 05 May, 2019, 18:05:36

sure, thank you very much:)

user-b7c4e4 06 May, 2019, 01:23:47

Hello all, I am trying to detect pupils offline, and whenever I try to detect eye videos in a loop I get the following errors in my log and the video crashes. Need help. Thanks in advance.

Chat image

papr 06 May, 2019, 05:50:52

@user-b7c4e4 Please share the recording with data@pupil-labs.com and I will have a look in the coming week

papr 06 May, 2019, 11:32:30

@here We are pleased to announce the latest release of Pupil software v1.12! We highly recommend downloading the latest application bundles: https://github.com/pupil-labs/pupil/releases/tag/v1.12

user-8fd8f6 06 May, 2019, 15:03:44

@papr Hi, I have been able to use the Player software for Windows, but today I have a problem. This error comes up and I can't run the Player: "Install pyrealsense to use the Intel RealSense backend". What should I do?

papr 06 May, 2019, 15:04:37

This is not a relevant error. If Player does not work anymore, it is due to a different reason

user-8fd8f6 06 May, 2019, 15:06:27

@papr thank you for your answer. I can't drag the files into Player (it is minimized)

papr 06 May, 2019, 15:09:46

@user-8fd8f6 Try deleting the user_settings_* files in the pupil_player_settings folder before starting Player

user-8fd8f6 06 May, 2019, 15:21:37

@papr There is no "pupil_player_settings" folder

user-2798d6 06 May, 2019, 16:40:52

Hello - I'm sorry to possibly repeat a previously asked question, but I am having issues with audio not lining up with the video. Pretty soon into the video, the audio is ahead of the visual. Is there a way to fix this? Thank you in advance!

papr 06 May, 2019, 16:43:35

@user-2798d6 Is this regarding Pupil Mobile recordings?

user-2798d6 06 May, 2019, 16:56:06

No, running Capture on a MacBook Pro

user-bc5d02 06 May, 2019, 17:02:39

Hello! Has anybody used the Pupil eye-tracker for smartphone app research? Maybe somebody has tips and tricks on how to calibrate Pupil in the best way for this kind of research? I calibrated it using features around the smartphone and icons on the smartphone, but it is not calibrated very well.

papr 06 May, 2019, 17:34:17

@user-bc5d02 The problem here is that a phone is actually a very small target. You could try to use a very narrow lens, decreasing the effective field of view of the camera, but increasing the spatial resolution for gaze estimation (to some degree).

user-2798d6 06 May, 2019, 17:44:47

@papr, I've been running Capture on a MacBook Pro. Is there a computer setting I should change or is this something happening with the glasses and audio working together?

papr 06 May, 2019, 17:50:54

@user-2798d6 There are multiple audio related issues on macOS at the time. I did not have the time to look into this in detail, unfortunately.

user-2798d6 06 May, 2019, 17:53:06

@papr, so it is an issue with Mac rather than pupil?

papr 06 May, 2019, 19:31:19

@user-2798d6 it's an issue with Capture on Mac. Other OSes do not seem to have the same issue.

user-bc5d02 06 May, 2019, 19:31:52

@papr thanks a lot! Will try the lens.

user-2798d6 06 May, 2019, 19:49:42

Thanks, @papr.

user-20faa1 06 May, 2019, 21:48:35

I'm thinking about using Pupil Mobile for an active experiment where the participant will be moving around quite a bit. Can Pupil Mobile calibrate and record without streaming to a subscribing laptop running Pupil Capture?

wrp 06 May, 2019, 23:36:52
user-99bf85 07 May, 2019, 06:00:21

Hello everyone! Short question: does anyone have any recommendations for the threshold values for blink detection? The default values of 0.5 for onset and offset don't produce the best results for me. Thank you for your help!

user-14d189 07 May, 2019, 06:27:20

@user-99bf85 Hi, I guess it depends on what you get. I had trouble with too-long blink detections and lowered the offset confidence threshold to 0.3. I get a bit shorter and more blinks.

user-82488e 07 May, 2019, 06:38:45

Hi all I’m using Pupil-Labs in communication with MatLab using ZeroMQ to run an experiment measuring pupil diameter. However recently I’ve been continually getting the following error while measuring round trip delay:

How can I go about solving this issue? Is it to do with the ZeroMQ server or my own settings? Thanks

Chat image

papr 07 May, 2019, 06:40:24

@user-20faa1 there is no on-device calibration for pupil mobile. But you can do offline calibration after the effect.

papr 07 May, 2019, 06:42:18

@user-82488e please check in the Pupil Remote menu if the port is still 50020. It changes to a different one if 50020 is not available

user-82488e 07 May, 2019, 07:07:03

Thanks for the quick response @papr. How do I check that? Sorry, I'm new to Pupil Labs

papr 07 May, 2019, 07:13:32

@user-82488e In Capture, on the right side, there are icons for the different menus. One of them should say Pupil Remote. Click it to open the menu. The menu should include a text field indicating the current port number.
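
(For anyone who prefers to check this programmatically: below is a small sketch, assuming pyzmq is installed, that probes a port with Pupil Remote's REQ/REP protocol — sending 't' asks Capture for its current time. The helper name and timeout are made up for illustration; a silent timeout suggests the port has changed or Capture is not running.)

```python
import zmq

def pupil_remote_alive(host="127.0.0.1", port=50020, timeout_ms=1000):
    """Ask Pupil Remote for its current time; return None if no reply."""
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.setsockopt(zmq.RCVTIMEO, timeout_ms)  # don't block forever on a dead port
    req.setsockopt(zmq.LINGER, 0)
    req.connect("tcp://{}:{}".format(host, port))
    try:
        req.send_string("t")  # 't' = request current Pupil time
        return req.recv_string()
    except zmq.Again:
        return None  # wrong port, or Capture not running
    finally:
        req.close()

print(pupil_remote_alive() or "no reply - check the Pupil Remote menu for the actual port")
```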

user-82488e 07 May, 2019, 07:19:12

Thank you so much!

user-82488e 07 May, 2019, 07:19:28

It has indeed changed

papr 07 May, 2019, 07:19:48

@user-82488e just set it back to 50020

user-82488e 07 May, 2019, 07:21:21

Will do, thanks

user-1e8f1b 07 May, 2019, 18:57:47

Hello guys, I'm using Pupil Remote and I subscribe to surface information. I get the 'gaze_on_srf' and 'norm_pos', and sometimes the coords are negative. But in the recording, I was always looking inside the surface. What do the negative coords mean? Thanks for the help!

user-99bf85 07 May, 2019, 19:38:05

@user-14d189 Thank you for your response. Do you use 0.5 as onset still?

papr 07 May, 2019, 21:09:01

@user-1e8f1b Did you check the confidence? Is the confidence very low in these cases?

user-14d189 08 May, 2019, 03:24:04

@user-99bf85 yes onset still 0.5 and filter length 0.2.

user-99bf85 08 May, 2019, 06:03:15

Thanks a lot.

user-1e8f1b 08 May, 2019, 09:53:44

@papr Confidence does not go lower than 60% in the whole recording

papr 08 May, 2019, 09:54:46

@user-1e8f1b Mmh. Please share the recording with data@pupil-labs.com so that I can look for an explanation.

user-1e8f1b 08 May, 2019, 09:55:55

Sure i will, Thanks for the help!

user-009621 08 May, 2019, 10:53:43

Hi everyone, I could connect to Pupil when I used the holographic emulation app on the HoloLens. But when I build my app in Unity and open the project via Visual Studio, I cannot connect to Pupil anymore. Do I need to change something in the settings in Visual Studio?

user-41c874 08 May, 2019, 11:09:15

Hey, I have a query. Why does the world timestamp not inherit the pupil timestamp? I presume they are calculated after each pupil sample is collected. Or am I wrong?

user-41c874 08 May, 2019, 11:09:49

Let me know. Thanks! 😃

papr 08 May, 2019, 11:19:30

@user-41c874 pupil timestamps come from the eye cameras. World timestamps from the world camera. The cameras are not synchronized in hardware. Therefore we need to synchronize the generated data by time afterwards.

user-41c874 08 May, 2019, 11:51:27

Thanks! So, world timestamp doesn't provide any information about when the pupil sample was acquired. Am I understanding it correctly? (But, in principle if I interpolate all world timestamps to pupil0 or pupil1 timestamps, I should get approximate x, y coordinates at the time when pupil samples were acquired. Right?)

user-41c874 08 May, 2019, 11:55:21

I'm just trying to understand this because we are using Surface samples (which has the same timestamps as world gaze) and syncing the timing of this to another clock . And there was some non-uniformity in the world timestamps and the pupil timestamps were relatively more uniform . Any way, everything else works quite good ! And I have more or less only one more hurdle to solve (as of now) . Thanks a lot for your help!!!

papr 08 May, 2019, 12:25:51

@user-41c874 typically, multiple pupil datums are mapped to a single world frame. For each world frame we try to detect the current location of the surface. Then we map all gaze data belonging to this world frame to the surface. Therefore gaze on surface should have two timestamps: its original timestamp, and the world timestamp, which indicates which world frame it was correlated to
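
(To make that correlation concrete, here is a small sketch with made-up timestamps — not Pupil's actual implementation — of assigning each pupil datum to its nearest world frame:)

```python
import numpy as np

# made-up timestamps: a ~30 Hz world camera and a faster eye camera
world_ts = np.array([0.000, 0.033, 0.066, 0.100])
pupil_ts = np.array([0.005, 0.020, 0.040, 0.055, 0.070, 0.095])

# index of the nearest world frame for each pupil datum
idx = np.searchsorted(world_ts, pupil_ts)
idx = np.clip(idx, 1, len(world_ts) - 1)
nearer_left = (pupil_ts - world_ts[idx - 1]) < (world_ts[idx] - pupil_ts)
idx[nearer_left] -= 1

print(idx.tolist())  # several pupil datums share one world frame
```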

user-e2056a 08 May, 2019, 14:50:18

Hi @papr, we found a broken

Chat image

user-e2056a 08 May, 2019, 14:50:45

Hi @papr, we found a broken wire on one of the eye camera, how will that affect the data?

papr 08 May, 2019, 14:55:49

@user-e2056a This will likely result in the eye cam not being recognized correctly. Please write an email to info@pupil-labs.com with that picture.

user-e2056a 08 May, 2019, 15:03:36

ok

user-e2056a 08 May, 2019, 15:05:38

@papr, thanks, is gaze and fixation data still available with one eye?

papr 08 May, 2019, 15:06:12

@user-e2056a Yes, it is, but gaze will be less accurate if the subject looks to the right

user-e2056a 08 May, 2019, 15:10:44

@papr thanks

user-ce2b97 08 May, 2019, 17:22:43

Hello, I have a binocular 200Hz Pupil headset and I experience the same issue as @user-1603a2. It seems that the camera for the left eye is not as focused as the one for the right eye, and the image is way darker. I attached screenshots of the right and the left eye in 400x400px resolution. The right eye is focused and bright enough that the pupil detection works nicely. The left eye with default settings is too dark and the pupil is not detected reliably. If I adjust image postprocessing for the left camera to, for example, brightness: 8, gain: 3, then the pupil detection works somewhat better, but it's still less reliable than for the right eye.

The pupil detection works better when I record in 192x192 px instead of 400x400 px, but still, the confidence for the left eye varies more compared to the right eye (which is usually constant at 1, except when blinking).

Is there any chance to get better results with the left eye or does the camera need to be changed?

In the following link you can download short recordings of both scenarios: https://www.dropbox.com/s/ynjxp6av5jvv47m/2019_05_08.zip?dl=0

Image of the left eye with default settings.

Chat image

user-ce2b97 08 May, 2019, 17:23:07

Image of the left eye. Image post processing set to: brightness=8, gain=3

Chat image

user-ce2b97 08 May, 2019, 17:23:13

Image of the right eye.

Chat image

user-6f86f3 09 May, 2019, 02:50:03

Hi, I have a question about camera intrinsics. After I apply 'show undistorted image', the surface markers and surface cannot be detected anymore. I need to define a surface to extract a heatmap without fisheye distortion. What can I do about this?

papr 09 May, 2019, 07:02:53

@user-6f86f3 The show undistorted image view is just for verification of the estimated intrinsics and is layered on top. The actual surface detection still runs, but its visualization is hidden. Please understand that undistorting the whole image is expensive in terms of CPU and is not necessary for the functionality of the surface tracker.

user-e08fba 09 May, 2019, 07:37:36

Hello again. I've been trying to launch Capture using regular USB cameras. Capture's screen is gray. I know that you should have a Pupil headset, but I just have no time left... Can anybody guide me through?

user-e08fba 09 May, 2019, 07:37:56

I've also replaced drivers with libusbk

papr 09 May, 2019, 07:38:12

@user-e08fba Check the UVC Manager menu. Does your camera appear in the list of devices?

user-e08fba 09 May, 2019, 07:44:58

No, it does not

user-e08fba 09 May, 2019, 07:47:03

Any ideas?

user-e08fba 09 May, 2019, 07:48:51

Wait

user-e08fba 09 May, 2019, 07:49:09

Still no

user-019256 09 May, 2019, 08:20:51

Hey everyone, did anyone else have a problem with zigzag lines when plotting the binocular gaze data? When I just export the gaze on surface data recorded with capture (version 1.7.42) using pupil player and plot them in R, I get these weird zigzag lines, as if the gaze points would jump between left and right eye maybe? (the purple line in the graphs)

But when I recalibrate the data in player after changing it to dual_monocular calibration, the plot looks smooth and there are no more zigzag lines (the red line in the plots is the average of both eyes).

Also, when I recalibrate in binocular calibration mode, the zigzag lines reappear (the blue line).

Did anyone else have this problem with binocular calibration or can point me in a direction of possible errors?

Thanks very much in advance!

Chat image

user-019256 09 May, 2019, 08:20:56

Chat image

user-019256 09 May, 2019, 08:20:57

Chat image

user-c494ef 09 May, 2019, 11:41:40

hey, is there a best practice guide to position the eye cameras?

papr 09 May, 2019, 11:41:53

@user-019256 Could you please check which of the binocularly calibrated gaze points in your example are mapped binocularly and which are mapped monocularly?

papr 09 May, 2019, 11:42:28

@user-c494ef the pupil should be well visible at all times, i.e. it should not leave the video frame if the subject looks e.g. up

user-c494ef 09 May, 2019, 11:44:36

@papr I was wondering more about inside (close to the nose) or outside positioning, and the best working distance

papr 09 May, 2019, 11:55:43

@user-c494ef The position is not relevant as long as the camera's field of view is able to capture the pupil well

user-bc5d02 09 May, 2019, 15:15:26

Hello! I have a problem with the Natural Features calibration. After finishing calibration there is an error: "world: Not enough ref point or pupil data available for calibration". I found that the reason could be in Pupil Service ( https://github.com/pupil-labs/pupil/issues/1140 ) but I did not find a solution to the problem.

user-1e8f1b 09 May, 2019, 19:33:42

Can confidence drop because of running short of CPU?

papr 09 May, 2019, 20:02:45

@user-1e8f1b no. If you run out of CPU, frames will be dropped

user-6f86f3 09 May, 2019, 23:15:29

@papr I have tried to record video with Camera Intrinsics Estimation enabled, and it turns out I cannot extract the heatmap for a specific surface. So Camera Intrinsics Estimation is not designed for heatmaps?

user-6f86f3 09 May, 2019, 23:25:55

additionally, I want to extract the heatmap only for the monitor screen, but the marker-defined surface will include some part of the monitor edge (like the area in the blue line below). Do you have any idea how to extract a perfect heatmap?

Chat image

user-019256 10 May, 2019, 10:17:25

@papr in this plot I colored the data green if it was binocularly mapped and colored it red if it was monocularly mapped. For this I used the information given in base_data (1 or 2 eyes). As you can see, most of the data is mapped binocularly.

Chat image

wrp 10 May, 2019, 10:27:00

📣 Announcement for Pupil Mobile users 📣

As many of you may already know, there is a bug in Android OS v9 (aka Android Pie) that breaks USB device enumeration for many USB cameras. This bug also affects Pupil Labs hardware. Therefore, if you are using Pupil Mobile on a device running Android v8, we recommend that you do not upgrade to Android v9 yet. If you did update to Android v9 and the update broke compatibility with our hardware, fear not, we have a solution (actually two solutions!) 😄

Long term solution - We have identified the bug in Android and submitted a fix to the USB subsystem for Android OS. Our change has been accepted in the official Android repository! It's exciting to be able to fix a problem at the source, especially for such a widely used codebase! However, it will take some time for our fix to trickle through via Android OEM updates.

Short term workaround - We have also made changes to the Pupil Mobile source code that enable you to continue using Pupil Labs hardware on Android v9. The workaround is available in the most recent version of the Pupil Mobile beta release. You can access the beta releases by opting into the beta program here: https://play.google.com/apps/testing/com.pupillabs.pupilmobile

wrp 10 May, 2019, 11:06:01
user-9dbb42 12 May, 2019, 19:15:23

what CPU can handle a 200Hz eye camera?

user-9dbb42 12 May, 2019, 19:15:48

because I have an Intel i7 2.40GHz

user-9dbb42 12 May, 2019, 19:17:54

and the program can't handle the rate and always drops it

papr 12 May, 2019, 19:46:21

@user-9dbb42 do you need the Pupil detection in real time? Alternatively, you can disable real time detection and run it after the effect in Player.

user-af9864 12 May, 2019, 19:46:37

Hi! I bought a Pupil Labs HoloLens add-on but the wires to the eye cameras are too delicate. One of them broke 😦 Can I get a new connection plug?

papr 12 May, 2019, 19:54:03

@user-af9864 please contact info@pupil-labs.com in this case

user-9dbb42 13 May, 2019, 06:25:47

@papr thank you for the answer!! I will use what you have suggested!!

user-af87c8 13 May, 2019, 09:42:21

@user-019256 in our formal Pupil Labs vs EyeLink comparison we see similar things https://www.biorxiv.org/content/10.1101/536243v1 - just wanted to mention this, e.g. Figure 5 and Figure 9. Potentially this is related to the problem discussed in Figure 4 (the paper is btw accepted and will be published soon in PeerJ)

user-23fe58 14 May, 2019, 04:47:00

Hi. I'm trying to run the Pupil Labs software on a .mp4 video. On Ubuntu - the software immediately closes when I finish selecting the path to the video. On Windows, I can't seem to open the application - when I run the executables, it just opens a blank command prompt, and nothing happens... can anyone advise me? Thanks!

papr 14 May, 2019, 06:33:58

@user-23fe58 The software is not supposed to work with only an mp4 file. Please refer to the documentation on what the recording format looks like

user-23fe58 14 May, 2019, 11:46:58

@papr Thanks for the tip. According to the documentation, .mp4 files are a valid extension, but in addition, the .mp4 file should have a specific name, and, it must have a corresponding .npy file. If I have my own video file, how can I create a .npy file for it?

papr 14 May, 2019, 11:52:03

@user-23fe58 I have a jupyter notebook that shows how to generate a minimal recording from a single video. I can share it when I am back at the office on Thu. 👍 It uses the easiest approach: generate evenly spaced timestamps. The more correct way would be to look at the video's PTS and calculate a timestamp from that.
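
(The evenly spaced variant is tiny. Here is a hedged sketch, assuming the old-style recording format in which Player expects a world_timestamps.npy of float timestamps next to world.mp4; the frame count and rate below are made up for illustration.)

```python
import numpy as np

# fabricate one timestamp per frame at a fixed rate; filename and values
# are assumptions for a minimal Player-compatible recording
fps = 30.0
n_frames = 900          # e.g. a 30 s clip at 30 fps
timestamps = np.arange(n_frames) / fps
np.save("world_timestamps.npy", timestamps)
```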

user-23fe58 14 May, 2019, 11:54:15

@papr That would be great. Thank you!

user-0d187e 14 May, 2019, 13:58:47

Hi guys. I know the Pupil tracker doesn't do glint tracking, but does the 3D model that estimates the center of the eyeball in 3D compensate for device shifts relative to the head? I'm asking because the gaze data we get in a VR setup are very sensitive to device shifts

user-072005 14 May, 2019, 14:29:59

Has anyone created a pupillometry package for python compatible with pupil labs that can do things like adjust the pupil size reading for size differences from the pupil location, remove blinks, and convert the pixel readings for the pupil diameter to mm? I saw a link to a pupillometry package in the community github, but it was broken.

user-23fe58 15 May, 2019, 05:21:30

@papr An update to my attempt to use the Pupil labs software, that you were helping me with. I was able to create the info.csv and timestamps.npy file needed for a particular .mp4 file I wanted to test. However, the software doesn't show where the pupil is located in the video. My ultimate goal here is to input a video file of a recording of an eye looking around, and get the software to tell me the location of the pupil in the video. Is that possible?

papr 15 May, 2019, 07:06:39

@user-23fe58 In this case, you have to name the video eye0.mp4 and rename the timestamps file accordingly. Afterwards, you can run the Offline Pupil Detection in Player on the eye video.

papr 15 May, 2019, 07:07:13

@user-072005 I am not aware of such a package. Feel free to share it here when you have found it.

papr 15 May, 2019, 07:08:01

@user-0d187e That depends on which detection and mapping mode you use. The 2d mode is much more sensitive to slippage than the 3d mode.

user-78dc8f 15 May, 2019, 08:39:17

@papr @wrp : I am working on pulling out video frames from the head cameras that are sync'd across a parent and child (close to the same time stamp). The goal is to pull out the same number of sync'd video frames and then use tensor flow to recognize the objects in these frames. I've started by using matlab and ffmpeg; making progress, but this seems like something others might be working on. Any pointers toward useful plug-ins or otherwise that might be available from the pupil community?

user-78dc8f 15 May, 2019, 09:26:26

Hi All. I am working on pulling out video frames from the head cameras that are sync'd across a parent and child (close to the same time stamp). The goal is to pull out the same number of sync'd video frames and then use tensor flow to recognize the objects in these frames. I've started by using matlab and ffmpeg; making progress, but this seems like something others might be working on. Any pointers toward useful plug-ins or otherwise that might be available from the pupil community?
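
(Not aware of an official plugin for this either, but the timestamp pairing itself is a few lines of NumPy. The sketch below uses fabricated 30 Hz clocks with an offset and a half-frame tolerance; real code would load each recording's world_timestamps.npy instead of the made-up arrays.)

```python
import numpy as np

# fabricated clocks: parent and child world cameras at 30 Hz on a shared
# clock (e.g. after time sync), child recording starting 0.4 s later
parent_ts = np.arange(0, 10, 1 / 30.0)
child_ts = np.arange(0.4, 10, 1 / 30.0) + 0.002  # small constant offset

# for each parent frame, index of the nearest child frame
j = np.searchsorted(child_ts, parent_ts)
j = np.clip(j, 1, len(child_ts) - 1)
nearer_left = np.abs(parent_ts - child_ts[j - 1]) < np.abs(child_ts[j] - parent_ts)
j[nearer_left] -= 1

# keep only pairs closer than half a frame, i.e. truly "the same moment"
good = np.abs(parent_ts - child_ts[j]) < 0.5 / 30.0
pairs = np.column_stack([np.nonzero(good)[0], j[good]])  # (parent_idx, child_idx)
```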

user-23fe58 15 May, 2019, 11:15:55

@papr Thank you for the help! I got it working. There is one glitch left though - even though Pupil Player and Pupil Service both work for me, Pupil Capture closes as soon as it opens. I have attached a screenshot of the command window's output. Anyway, even if it is not possible to fix, I can just use the other two applications. Thanks again!

Chat image

papr 15 May, 2019, 11:29:14

@user-23fe58 Are you not using Pupil Player? Capture is for processing real time video, e.g. from a camera.

user-019256 15 May, 2019, 11:43:01

@user-af87c8 thank you very much, this is helpful!

user-23fe58 15 May, 2019, 12:13:46

@papr ah, that would explain it. I can use Pupil Player - it works for my saved video. Thanks again for all the help!

user-0d187e 15 May, 2019, 13:18:40

@papr Is the 3D mode meant to compensate for device shift since it detects the eye ball center relative to the camera? Do you consider that as an alternative to using glint?

papr 15 May, 2019, 13:27:56

@user-0d187e Slippage compensation is achieved by re-estimating the eyeball center relative to the eye camera, yes. And yes, we consider this an alternative method.

user-0d187e 15 May, 2019, 13:28:50

thx

user-67c6ed 15 May, 2019, 16:35:42

hi

user-67c6ed 15 May, 2019, 16:36:17

I'm looking for some guidance on how to build the apps for mac from source

user-67c6ed 15 May, 2019, 16:38:34

none of the built ones run

user-67c6ed 15 May, 2019, 16:48:10

they all give : "Illegal instruction: 4"

wrp 15 May, 2019, 23:12:42

@user-67c6ed I have not had any issues with Pupil app bundles on macOS. What version of macOS are you using and what are the machine specs (specifically the CPU)?

papr 16 May, 2019, 09:11:33

@user-23fe58 Just for completion: This is the notebook that implements one of the many ways to generate a Player-compatible recording from a single video file: https://gist.github.com/papr/d3e9d3863b934d1d4893e91b3f935ed1

papr 16 May, 2019, 14:32:24

@user-358ea2 I have an update for you regarding audio recordings on macOS:

  1. Line-in and built-in microphones seem to work well (as mentioned earlier)
  2. Some USB microphones only record static noise (e.g. Logitech Webcam mic)
  3. There are some microphones that work correctly (e.g. [1]) if set up correctly. Unfortunately, this is not necessarily consistent across devices. In the case of the Jabra Speak 810 [1], I had to select it as the system-wide input device in the system settings, as well as select the built-in (!) mic in Capture, to make it work.

[1] https://www.jabra.com/business/speakerphones/jabra-speak-series/jabra-speak-810

user-67c6ed 17 May, 2019, 07:05:46

Hi @wrp, we finally found the compilation guide on the website, and the problem was solved by compiling the apps ourselves. By the way, it's a Core 2 Quad CPU; I don't remember the exact reference. It's not an "official" Mac, but it's unlikely to be related to the CPU: I've never seen such a thing happen with any app in almost 15 years of using OSX with all kinds of exotic hardware, except at the OS kernel level. The apps are not giving any log; is there one somewhere, or a way to enable a logging/verbose mode?

user-67c6ed 17 May, 2019, 07:12:06

or maybe the apps just don't support the Core 2 family, it's getting a bit old

user-67c6ed 17 May, 2019, 07:13:47

tested Sierra and High Sierra

papr 17 May, 2019, 07:14:36

@user-67c6ed If you say "compiling the apps ourselves", do you mean to run them from source, i.e. running python3 main.py?

papr 17 May, 2019, 07:15:44

If so, there should be a capture.log file in the capture_settings folder within the cloned repository.

user-67c6ed 17 May, 2019, 07:15:47

yes @papr

papr 17 May, 2019, 07:16:27

Please be aware that it is being overwritten every time you start Capture.

user-67c6ed 17 May, 2019, 07:18:46

it's actually working with a version we cloned from the git repo

user-67c6ed 17 May, 2019, 07:19:10

and this one does work, so the log wouldn't give us much info

user-67c6ed 17 May, 2019, 07:19:18

is it possible to do the same with the bundled app?

papr 17 May, 2019, 07:20:00

The bundled apps store the log file to the pupil_capture_settings folder in the user's home folder.

user-67c6ed 17 May, 2019, 07:20:24

ok, I'll check that

papr 17 May, 2019, 07:23:14

Please be aware that the bundle is not compatible with all Intel CPUs. To my knowledge, the bundle does not work e.g. on Intel Xeon processors.

user-67c6ed 17 May, 2019, 07:24:15

ok, thank you for the tips 😃

user-67c6ed 17 May, 2019, 09:48:12

hi @papr, I just checked, the only entry in the log is : MainProcess - [INFO] os_utils: Disabled idle sleep.

papr 17 May, 2019, 10:53:08

@user-67c6ed Sorry, this makes it very difficult to debug. I can only recommend running from source then. 😕

user-67c6ed 17 May, 2019, 11:03:50

ok, no problem, I was just being curious

user-817474 18 May, 2019, 19:03:19

Are there any Pupil management/executives in here or would it be best to email with an inquiry?

user-9a064b 18 May, 2019, 20:10:19

Hello to all! I ran into a problem when I first opened the latest version of the Player. The phrase written was: player - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend. Maybe someone has faced something similar and can tell me what the problem is. Thank you!

papr 19 May, 2019, 09:52:48

@user-817474 Please contact info@pupil-labs.com with your inquiry.

user-817474 19 May, 2019, 09:53:10

Thanks!

papr 19 May, 2019, 09:54:10

@user-9a064b please delete the user_settings_* files in the pupil_player_settings folder and try starting Player again.

user-07d4db 19 May, 2019, 10:51:47

Here is an example of an Excel file data output that I received for the calculated amount and duration of fixations for an AOI called "excercise"

fixations_on_surface_excercise.txt

user-07d4db 19 May, 2019, 10:54:05

Hey! I have another question concerning the Excel files I received with the data output for my calculations. The dependent variables I want to measure are the mean number and duration of fixations within each AOI. Therefore I am using the file "fixations on surface XY". I attached an Excel file to this thread. When having a look at the data output, I wasn't completely sure about the data produced. It would be great if you could give me a short answer to that. - For the mean duration of fixations within the AOI I had a look at the column "duration". Surprisingly, it was showing exactly the same number for nearly every fixation. Do I have to change something in the parameter settings in Pupil Player? It would be great if I could see the durations in ms per fixation here. Another option could be to take the difference between start and end frame, right? But here another inconsistency occurred: when having a look at the start and end frames, I realised that there is a gap between the end frame of the previous fixation and the start frame of the following fixation. What happens in this time gap? - For the number of fixations I would simply count the number of times it says "true" for the relevant AOI. Do you agree with that? - When I looked at the column "id" (showing the number/index of each fixation), I noticed that the numbering is not consistent in some places. That means it jumps to a higher number in some places. However, this is not constant between the Excel outputs of different AOIs, so the different "fixations on surface XY" files show a different total number of fixations. Illustrating this, in the attached file it "jumps" directly from 33 to 36. Could you explain this to me? - Furthermore, I wasn't sure about the columns "x-norm/y-norm" and "x-scaled/y-scaled", as they show the same data. An answer from you regarding these questions would be very helpful! Thank you!

user-07d4db 19 May, 2019, 20:21:02

Do I have to change something in the dispersion algorithm settings (or further software settings) in order to see the correct values for the duration of a fixation on a defined surface? 😃

user-0a2ebc 20 May, 2019, 09:18:05

Chat image

user-0a2ebc 20 May, 2019, 09:18:37

Can anyone help me please? Why can my eye tracker suddenly not connect to the PC?

papr 20 May, 2019, 09:20:33

@user-0a2ebc The camera drivers are not installed anymore. This is likely due to a Windows Update. Please restart Capture with administrator rights.

user-0a2ebc 20 May, 2019, 09:24:24

Unfortunately it is still not working

user-0a2ebc 20 May, 2019, 09:25:12

How can I install the camera drivers again pls ?

papr 20 May, 2019, 09:32:16

@user-0a2ebc I've sent you a personal message with the full instructions

user-ae7c30 20 May, 2019, 11:45:46

Hello, I am trying to get the fixation norm position in real time, can anybody give me a tip? I am currently using cmd! Thank you 😃

papr 20 May, 2019, 11:56:09

@user-ae7c30 Have you checked out our example on how to access data in real time? https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_messages.py#L23 You will have to change line 23 to sub.setsockopt_string(zmq.SUBSCRIBE, 'fixation') in order to receive fixation data. Also, do not forget to turn on the fixation detector.

user-ae7c30 20 May, 2019, 11:58:34

yes! thank you I am using the same file with 'fixations'

user-ae7c30 20 May, 2019, 11:59:07

but how can I parse the norm_pos out of the fixation? The message is formatted as below

user-ae7c30 20 May, 2019, 11:59:08

fixations: {'topic': 'fixations', 'norm_pos': [0.24963392804467022, 0.2768624991848711], 'dispersion': 0.0, 'method': 'pupil', 'base_data': [['gaze.2d.0.', 61075.5142], ['gaze.2d.0.', 61075.538407], ['gaze.2d.0.', 61075.659443]], 'timestamp': 61075.5142, 'duration': 145.24299999902723, 'confidence': 0.7997475862503074, 'gaze_point_3d': nan, 'id': 28}

papr 20 May, 2019, 12:01:13

Ah, nice! Use

# `sub` is the ZMQ SUB socket and `loads` is msgpack.loads,
# both set up as in filter_messages.py linked above
while True:
    try:
        topic = sub.recv_string()  # topic frame, e.g. 'fixations'
        msg = sub.recv()           # payload frame, msgpack-encoded
        msg = loads(msg, encoding='utf-8')
        x, y = msg["norm_pos"]
        print("\nnorm_pos x/y: {} {}".format(x, y))
    except KeyboardInterrupt:
        break
papr 20 May, 2019, 12:01:29

@user-ae7c30 The important line is x, y = msg["norm_pos"]

user-ae7c30 20 May, 2019, 12:01:29

oh wow.. I am so grateful

user-ae7c30 20 May, 2019, 12:01:31

thank you!!

user-072005 20 May, 2019, 12:46:58

Just checking, the gaze positions are normalized in the exported data to go 0-1 from left to right and bottom to top? So the left bottom corner would be 0,0 and top right 1,1?

papr 20 May, 2019, 12:51:15

@user-072005 correct
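[Editor's note] Mapping these normalized coordinates onto the exported world video requires flipping the y-axis, because image pixel coordinates have their origin at the top-left. A minimal sketch (the 1280x720 frame size is just an example, and `norm_to_pixel` is a hypothetical helper, not a Pupil API):

```python
def norm_to_pixel(norm_x, norm_y, frame_width, frame_height):
    """Convert Pupil's normalized gaze coordinates (origin bottom-left,
    range 0-1) to image pixel coordinates (origin top-left)."""
    pixel_x = norm_x * frame_width
    pixel_y = (1.0 - norm_y) * frame_height  # flip the y-axis
    return pixel_x, pixel_y

# bottom-left corner (0, 0) maps to pixel (0, 720) in a 1280x720 frame
print(norm_to_pixel(0.0, 0.0, 1280, 720))  # → (0.0, 720.0)
```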

user-072005 20 May, 2019, 12:53:31

Thanks. Also, the timestamps... I'm looking at the csv in excel. There doesn't seem to be a format in which they make sense. Are they not in the same time zone as the phone I recorded on?

papr 20 May, 2019, 12:56:54

@user-072005 The time unit is seconds. The epoch/start of the clock is arbitrary, unless you used the Pupil Capture Time Sync plugin to sync the Pupil Mobile clock to Capture's clock.
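[Editor's note] One common way to relate Pupil timestamps to wall-clock time after the fact is to measure the offset between the two clocks once (e.g. by querying Capture's current time via Pupil Remote's 't' command and comparing it to `time.time()`) and add it to every timestamp. A sketch with a made-up offset value; `pupil_to_unix` is a hypothetical helper:

```python
def pupil_to_unix(pupil_ts, clock_offset):
    """Convert a Pupil timestamp (seconds, arbitrary epoch) to Unix time,
    given a previously measured offset: unix_now - pupil_now."""
    return pupil_ts + clock_offset

# hypothetical offset, measured once at recording time
offset = 1558353600.0
print(pupil_to_unix(61075.5142, offset))
```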

user-072005 20 May, 2019, 12:58:01

Ok that explains it. Great, thanks

user-07d4db 20 May, 2019, 16:28:27

Hey! My data output always shows me the same duration for the fixations on surface_name output. What can I do in order to receive the appropriate parameters? Thank you!

Chat image

papr 20 May, 2019, 16:32:42

@user-07d4db This is probably a bug.

user-07d4db 20 May, 2019, 16:33:30

Yes! Do you know what I can do in order to solve this?

papr 20 May, 2019, 16:33:47

@user-07d4db No, since I was not able to find the cause yet

user-07d4db 20 May, 2019, 16:35:57

Okay thank you! I tried to find out whether there is probably something wrong in my software settings, but I couldn't find the reason so far...

papr 20 May, 2019, 16:36:15

@user-07d4db No, they should not be the reason

user-88b704 21 May, 2019, 17:51:42

Hi, I am new to Pupil Capture. On a Mac computer, I am trying to record audio. When I select the audio source to be the built-in microphone, it gives me an error 5, saying input/output error 'none0'. I already gave Pupil Capture access to the system microphone. Any idea why?

papr 21 May, 2019, 20:11:27

@user-88b704 The audio recording situation on macOS is currently a bit difficult. Please leave the built-in mic selected and restart Capture. Do you get the same error? Is the mic still selected? If not, do you get the error if you select it?

user-07d4db 22 May, 2019, 05:57:34

Hey! Here you see the produced data output fixations_on Surface_name for four different AOIs that I recorded simultaneously. Can you explain why they show a different number of detected fixations? Are fixations only registered if the frame defining the AOI with the surface tracker plugin is visible in the world camera?

Chat image

user-88f7b8 24 May, 2019, 04:21:26

Has anyone ever used the Pupil headset with EEG devices? I'm trying to eye track while recording EEG, but because EEG is sensitive to electricity, the EEG data is distorted by the Pupil headset. If anyone has solved this problem, or has tried blocking the electrical signal from the headset, please help me.

user-26fef5 24 May, 2019, 05:25:48

@user-88f7b8 Just a general thought. Did you by any chance look at the frequency spectrum of the EEG signals a) with the headset and b) without? That should in theory let you design a proper filter to get rid of the disturbances. (We have not done EEG and eye tracking simultaneously yet but were thinking about it.)

user-88f7b8 24 May, 2019, 06:40:22

@user-26fef5 Thank you for replying! I haven't done a spectrum analysis yet because I can see the actual effect of the headset with my own eyes; the with- and without-headset conditions differ significantly. As you suggest, I should check the frequency spectrum and find a proper band filter. Thank you.

papr 24 May, 2019, 08:31:08

@user-07d4db That is correct. The surface tracker exports fixations roughly like this:

for each defined surface S:
    create a fixations_on_surface.csv file
    for each exported world frame index F:
        check if S was detected in F:
            map fixations belonging to F to S
    remove fixations that were mapped multiple times
    write fixations to csv file
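[Editor's note] The dedup step in that pseudocode is also why fixation ids can "jump" in the fixations_on_surface files: a fixation spanning several frames is written only once, and fixations that never coincide with a surface detection are dropped entirely. A rough, self-contained sketch of that logic (all names are hypothetical, this is not Pupil's actual implementation):

```python
def fixations_on_surface(frame_indices, surface_detected, fixations_by_frame):
    """Collect the fixations mapped to one surface.

    surface_detected: frame index -> bool (surface found in that frame?)
    fixations_by_frame: frame index -> list of fixation dicts with an 'id' key
    """
    seen_ids = set()
    rows = []
    for frame in frame_indices:
        if not surface_detected.get(frame, False):
            continue  # surface markers not visible -> no fixations mapped
        for fix in fixations_by_frame.get(frame, []):
            if fix["id"] in seen_ids:
                continue  # fixation already mapped via an earlier frame
            seen_ids.add(fix["id"])
            rows.append(fix)
    return rows

# toy data: fixation 34 falls on a frame without a surface detection,
# fixation 33 spans two frames but is written only once
detected = {0: True, 1: False, 2: True}
fixations = {0: [{"id": 33}], 1: [{"id": 34}], 2: [{"id": 33}, {"id": 36}]}
rows = fixations_on_surface([0, 1, 2], detected, fixations)
print([f["id"] for f in rows])  # → [33, 36], the id "jumps" past 34
```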
user-5df199 24 May, 2019, 16:51:13

hello hello. it seems i've lost an eye

user-5df199 24 May, 2019, 16:52:54

unable to get pupil service or capture to display the video from pupilcam id1 (I'm using the hmd addon hardware)

user-5df199 24 May, 2019, 16:53:12

it was working yesterday

user-5df199 24 May, 2019, 16:54:29

when i use "activate source" to switch to id1, it just closes the window and doesn't open another one

user-5df199 24 May, 2019, 16:57:14

the window for eye1 just opens then immediately closes

user-5df199 24 May, 2019, 16:59:55

This is also true for the main pupil capture window, if I set it to source id1, it crashes to desktop

user-5df199 24 May, 2019, 17:02:51

restarted my PC and they both now work

user-07d4db 25 May, 2019, 10:07:30

Thank you papr! Now I understand how the excel files are created

user-a6cc45 27 May, 2019, 20:19:32

Hello, quick question: what are the main differences between Pupil Capture and Pupil Service? Pupil Service is just Pupil Capture for VR/AR?

papr 27 May, 2019, 20:23:50

@user-a6cc45 Main difference: Capture's event loop is bound to the world camera's frame rate: ~1 world frame -> ~1 event loop iteration. This results in Capture buffering multiple pupil data points before mapping them to gaze data and publishing them. In comparison, Service does not support a world camera and is able to map pupil to gaze data as soon as it arrives, resulting in lower gaze processing latency.

papr 27 May, 2019, 20:24:20

Service has further limitations, e.g. there are multiple plugins that are only supported in Capture.

user-0a2ebc 28 May, 2019, 07:01:46

Can anyone help me? Why is the picture not displayed in full (I already adjusted the pixel size)...

Chat image

user-0a2ebc 28 May, 2019, 07:02:11

And it seems that heatmap is not generated properly on a surface (image)

user-0a2ebc 28 May, 2019, 07:02:41

I followed the script for heatmap tutorial

user-0a2ebc 28 May, 2019, 07:02:49

Thanks in advance for the help

papr 28 May, 2019, 07:06:02

@user-0a2ebc it might be possible that jupyter notebook scales down the image visually to fit it into the cell.

papr 28 May, 2019, 07:08:04

@user-0a2ebc also, you might need to adjust the figsize

user-0a2ebc 28 May, 2019, 07:09:30

I did adjust but it still shows the same result

papr 28 May, 2019, 07:13:54

@user-0a2ebc Do you have OpenCV installed? You could save the image directly to disk with cv2.imwrite() and open it in a proper image viewer

papr 28 May, 2019, 07:15:20

@user-0a2ebc since the image is vertical: Keep in mind that image matrices are described by Height x Width instead of the usual Width x Height
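[Editor's note] The Height x Width convention can be illustrated with a minimal pure-Python sketch (NumPy/OpenCV arrays follow the same rows-first layout):

```python
# a tiny "image" for a vertical 720x1280 photo: rows first, then columns
height, width = 720, 1280
img = [[0] * width for _ in range(height)]

# index as img[row][col], i.e. img[y][x]
img[10][20] = 255

# note that matplotlib's figsize is the other way around: (width, height),
# in inches, so it must be swapped relative to the array shape
print(len(img), len(img[0]))  # number of rows, number of columns
```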

user-0a2ebc 28 May, 2019, 07:16:12

@papr I haven't installed OpenCV yet... the image is already on my disk. Do I still need the module?

user-0a2ebc 28 May, 2019, 07:16:44

@papr ah I see ..Hx W...let me try

user-0a2ebc 28 May, 2019, 07:17:27

Any advice about the best pixel size to be displayed for heatmap or gazeplot ?

papr 28 May, 2019, 07:18:15

@user-0a2ebc depends on the size of the image on which you want to overlay the heat map.

papr 28 May, 2019, 07:18:45

You only need a library like OpenCV or Pillow if you want to save modified images.

user-e91538 29 May, 2019, 11:33:31

I have a pretty straightforward question: I calibrate the eye tracker, then record some data. Looking at the raw gaze data of the trial (not the calibration), I get the normalized eye vector to work with. At which value is the calibration origin? (I hoped it would be at 0,0,1 after calibration?!)

papr 29 May, 2019, 12:24:08

@user-e91538 Just for clarification of the terms used in the Pupil project:
Pupil data = pupil position in eye camera coordinates
Gaze data = pupil positions that have been mapped to the world coordinate system (requires calibration)

https://docs.pupil-labs.com/#data-format See the Coordinate Systems subsection in the docs for details on the different coordinate systems that each camera can have.

user-d3153d 29 May, 2019, 16:06:07

Excuse me. I am trying to do eye tracking research data analysis in autonomous driving, and the link address of "Python pupillometry analysis scripts to analyize pupil_data." on GitHub is not available (404). Does anyone know where I can find those, or recommend some other analysis code? Thank you so much.

user-5df199 29 May, 2019, 21:03:31

@user-d3153d it appears that the owner of that code removed it from github, see https://github.com/pupil-labs/pupil-community/issues/8

user-d3153d 30 May, 2019, 09:18:38

@user-5df199 thank you for your help!

user-d3153d 30 May, 2019, 12:52:28

I am trying to use CHAP https://in.bgu.ac.il/en/Labs/CNL/chap/default.aspx to analyze the data from Pupil Labs. But CHAP's input for Pupil Labs data should be a .plsd file, which is not an output file from pupil_capture or pupil_player. Does anyone know where to find the .plsd file? Thank you!

user-780603 30 May, 2019, 12:53:17

Hi! Can I somehow use the fixation plugin on .csv or .pldata files? I got a little lost in the variables in fixation_detector.py. What are the main functions I should pay attention to? It looks like detect_fixations does a lot of the work; is its gaze_data some sort of gaze.pldata? Or maybe there is a cmd tool or some easier method?

papr 30 May, 2019, 12:54:53

@user-780603 pldata files are intermediate recording files. Just open the recording in Player, run the plugins and hit export to get the data exported to a csv file

user-780603 30 May, 2019, 12:56:05

yeah, I understand this, but I have about 60 sessions. Is there another way?

papr 30 May, 2019, 12:56:55

In this case you were on the correct track. You will have to call the source code directly from a script

user-e7102b 30 May, 2019, 15:15:46

Hi, I'm interested in using a USB webcam as an input into pupil capture, so that I can record accurately timestamped/synced video of participants performing an action (throwing a ball) from a side view. I have a couple of questions. 1) If I buy a Logitech C615 webcam (as suggested in the DIY section of the pupil docs) and plug it into a USB port on my machine, will it be detected as a source by pupil capture? 2) If this is the case, could I then substitute one of the eye camera source inputs for the webcam, so that when I'm wearing the pupil labs headset I'm simultaneously recording video from the forward facing world camera, the eye camera facing my pupil, and the webcam? Thanks!

papr 30 May, 2019, 16:41:51

@user-e7102b Alternatively, I can recommend the Logitech Webcam C930e.

My recommended setup would be two separate Capture instances, one connected to the headset, one connected to the webcam. You can use the Time Sync plugin to sync time between the instances and the Pupil Groups plugin to start/stop the recording on both instances at the same time.

Afterwards, you can open the headset recording in Player, open the new Video Overlay plugin, and overlay the webcam video.

user-923f12 30 May, 2019, 20:05:05

Hello, we just received the Pupil device with the D415 RealSense and it works great. Though, all the recordings are .bag files and we can't figure out how to play the recordings again, nor how to extract the information into data we can work with (we want to extract the distance information and do some statistical analysis).

Can you please help?

papr 30 May, 2019, 20:08:06

@user-923f12 bag files are usually recorded on ROS. Are you using ROS?

user-923f12 30 May, 2019, 20:08:43

I'm not familiar with ROS...

papr 30 May, 2019, 20:10:53

@user-923f12 so you did a recording with Pupil Capture and it includes bag files? I am wondering if I am missing something here 🤔

user-923f12 30 May, 2019, 20:12:27

It saved the recordings from Pupil Capture as bag files. Isn't it supposed to be like that?

papr 30 May, 2019, 20:15:20

I've just checked. The realsense backend does not save anything as bag file. And the recordings by Pupil Capture are usually folders with multiple files, incl info.csv, world.mp4, etc

user-923f12 30 May, 2019, 20:22:06

Maybe there is a package that needs to be installed, or something else I missed?

papr 30 May, 2019, 20:22:41

@user-923f12 could you post a link to the software that you installed/used?

papr 30 May, 2019, 20:23:20

@user-923f12 and just to confirm: You get only a single bag file per recording?

user-923f12 30 May, 2019, 20:25:29

I'm not with the computer we're working on right now... I'd be glad to send it to you when I get to it. About the recording file - yes, we got one bag file per recording

user-e7102b 30 May, 2019, 21:07:54

@papr Thanks for the advice! I didn't realize running two instances of Capture simultaneously was an option. Would you recommend trying this on Windows or Mac OS?

papr 30 May, 2019, 21:09:00

@user-e7102b One of the instances can be replaced by a Pupil Mobile instance, btw. Generally, I would recommend to use two different devices, one per instance.

user-e7102b 30 May, 2019, 21:14:38

Sure, two different devices makes sense. I'll try it out with a couple of mac minis.

papr 30 May, 2019, 21:15:14

@user-e7102b The time sync is the important step! Else the overlay will not work correctly

user-20faa1 31 May, 2019, 16:01:47

Which supported or unsupported Android USB-C phones do you use for Pupil Mobile? Have you had any issues?

End of May archive