👁 core


user-908b50 01 May, 2020, 02:34:35

Is there a way to automate the drag and drop feature in Pupil Player? I have been trying to take a go at it. Could use a few tips!

user-020426 01 May, 2020, 11:18:02

Hi there, I'm not sure if this is the right place to ask, but I'll try anyway.

I'm going through the hmd-eyes documentation. I can run the complete executable demo with successful calibration and eye gaze working with Pupil Service open. When in the Unity editor version of the demo, I get a message saying I've successfully connected to Pupil and it returns my current version (1.23.10); however, the status text still says not connected, and when I press C to begin calibration I get an error saying 'Calibration not possible: not connected!', leading back to the subsCtrl.IsConnected check being false in the CalibrationController script.

Any help or information would be greatly appreciated.

user-88b704 02 May, 2020, 16:03:25

Does anyone know if Pupil Mobile works with the Google Pixel 4, which runs on Android 10?

user-88b704 02 May, 2020, 16:03:53

Is there a complete list of pupil mobile compatible phones?

user-b37f66 03 May, 2020, 19:43:36

Hi, I'm trying to run Pupil Capture from source on Ubuntu 18.04. When I'm choosing my RealSense D415 camera as the video source, I'm getting an error that "The selected camera is already in use or blocked". I ran ps and there is no other process using the RealSense camera while it's running. In addition, when I'm starting Pupil Capture in the regular way, everything works fine. I would really appreciate any help on the subject. Yogev

user-905228 04 May, 2020, 05:14:39

@papr I've got the GPS data successfully. Could you tell me the definition of "accuracy" and "bearing"? And what is the unit of speed? Thank you.

user-c629df 04 May, 2020, 05:16:00

@papr If I want to plot the average scan path across trials in an experiment, while each trial has a different number of timestamps in the fixation data file, how could I find the average scan path instead of plotting it for every single trial?

user-c5fb8b 04 May, 2020, 07:18:59

@user-908b50 Hey, what exactly do you mean by "automating the drag and drop feature"? If you just want to programmatically open a recording with Player, you can just append the recording path when starting Pupil Player over the command line. Depending on whether you run from bundle or source, this will look slightly different, but the idea is the same. Example with source:

python main.py player /path/to/my/recording/folder
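
If you run from the bundle, the idea is the same: pass the recording path to the bundled executable, e.g. on Linux presumably something like (the executable name can differ per platform):

pupil_player /path/to/my/recording/folder
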
user-c5fb8b 04 May, 2020, 07:25:56

@user-b37f66 which version of Pupil Capture are you using?

user-c5fb8b 04 May, 2020, 07:32:17

Hi @user-88b704, I can run Pupil Mobile without problems on a Google Pixel 3a with Android 10. So I assume a Pixel 4 should be good to go as well.

papr 04 May, 2020, 07:34:17

@user-894e55 if I see this correctly, you would have to find a way to attach eye cameras to the device. Afterward, you could use the VR-calibration-approach implemented in our unity integration. This way you would not need to integrate scene-camera access into Pupil Capture. This approach highly depends on what data is being made available by the m300xl in realtime.

papr 04 May, 2020, 07:38:26

@user-b37f66 In addition to @user-c5fb8b 's question: How do you start Capture in this case if not "in the regular way"?

user-b37f66 04 May, 2020, 21:13:50

Hi, @user-c5fb8b 1.23.10. @papr As described in the https://github.com/pupil-labs/pupil manual, I'm running "python main.py capture" from the terminal in the "pupil_src" directory.

user-c5fb8b 05 May, 2020, 07:24:42

@user-b37f66 are you using the custom RealSense backend plugin? Please note that we discontinued official support for the RealSense cameras in v1.22. You can read about the reasoning behind this decision in the release notes: https://github.com/pupil-labs/pupil/releases/tag/v1.22 To make transitioning easier, we took the existing code and wrapped it into an external plugin that you can use to interface with the RealSense cameras. You can find the link in the release notes or here: https://gist.github.com/pfaion/080ef0d5bc3c556dd0c3cccf93ac2d11

papr 05 May, 2020, 07:38:40

@user-905228 The location data is the output of the Android location API https://developer.android.com/reference/android/location/Location Just check out the corresponding getter function documentation, e.g. speed -> getSpeed() (which, per the Android docs, is reported in meters per second).

papr 05 May, 2020, 07:49:14

@user-c629df You will have to interpolate the location for a given set of timestamps.
1. Calculate relative timestamps by subtracting the reference timestamp (e.g. the timestamp of the first gaze datum that is on the AOI) from all following timestamps. This makes the timestamps comparable.
2. Generate a set of interpolation timestamps, e.g. using numpy.arange(0.0, duration_in_seconds, time_difference_between_interpolation_points_in_seconds).
3. Use an interpolation function to estimate the gaze positions at the interpolation timestamps, e.g. by feeding the surface gaze norm pos into https://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.interp1d.html (do it twice, once for x, once for y).
This last step will give you estimated gaze locations at the same relative timepoints, which can therefore be averaged.
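
A minimal Python sketch of these three steps, assuming surface-mapped gaze for one trial as numpy arrays (all names and values here are illustrative):

import numpy as np
from scipy.interpolate import interp1d

# Illustrative data for one trial: timestamps and surface-mapped gaze norm_pos
ts = np.array([12.00, 12.05, 12.13, 12.20, 12.31])
x = np.array([0.50, 0.52, 0.55, 0.61, 0.64])
y = np.array([0.40, 0.41, 0.45, 0.44, 0.48])

rel_ts = ts - ts[0]                    # 1) make timestamps comparable
interp_ts = np.arange(0.0, 0.3, 0.01)  # 2) common interpolation timepoints
x_i = interp1d(rel_ts, x)(interp_ts)   # 3) interpolate x...
y_i = interp1d(rel_ts, y)(interp_ts)   #    ...and y separately
# x_i/y_i from different trials now share timepoints and can be averaged elementwise.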

user-905228 05 May, 2020, 08:17:42

@papr Got it! Thank you.

user-ab28f5 05 May, 2020, 09:47:11

I have a problem: when I start my experiment, the confidence value is low (lower than 0.5), and then it suddenly becomes NaN. Does anyone know what's wrong?

Thank you!

user-c5fb8b 05 May, 2020, 09:49:06

Hi @user-ab28f5 what calibration method did you use and what are the reported errors when running a validation afterwards?

user-ab28f5 05 May, 2020, 09:50:37

I used the Screen Marker Calibration

papr 05 May, 2020, 09:51:13

Could you share a screenshot of your eye windows? Gaze confidence depends on the pupil detection confidence.

user-ab28f5 05 May, 2020, 09:59:41

Chat image

papr 05 May, 2020, 10:04:38

@user-ab28f5 I have not seen that before. Could you share the recording with data@pupil-labs.com such that we can try to reproduce this?

user-c629df 05 May, 2020, 16:58:01

@papr Thanks for the detailed reply! 1) In this type of analysis, which data file is most appropriate to use (pupil_positions, gaze_positions, or fixation_positions)? 2) What might be an appropriate duration for the interpolation timestamps? Should it be the same as the time length of a single trial in the experiment? Thanks!

papr 05 May, 2020, 17:01:14

@user-c629df 1) you will need a common reference frame for this to work, i.e. surface mapped gaze or fixations. 2) the duration depends on your experiment. The trial duration is a good starting point, though.

user-6e1b0b 05 May, 2020, 17:22:22

Hello! I was wondering if it was possible for Core (or another of Pupil's products, though I believe Core is what I want) to detect an eye other than a human eye (e.g. a cat or dog eye). I wasn't sure if the algorithm used for tracking eyes was specifically designed to pick up characteristics of the human eye, like the relative size of the pupil/iris, or requiring a completely visible sclera.

user-c629df 05 May, 2020, 17:27:04

@papr Oh I see, thank you so much! What if each trial has a slightly different duration; should I use the average duration to set up the interpolation timestamps? This might sacrifice certain gaze positions in some trials, but will the interpolation function be able to compensate for this error?

papr 05 May, 2020, 18:45:33

@user-c629df It really depends on what you are trying to research. E.g. if you want to show that the gaze pattern is the same when the subject looks at the AOI for the first time in comparison to the second time, then it would make sense to take all samples between "gaze enters surface" and "gaze exits surface". This would be an event-based approach. If you want to simply show what the gaze pattern looks like in the first ten seconds after the gaze has entered the surface, then a fixed duration of 10 seconds would be appropriate.

user-c629df 05 May, 2020, 18:46:37

@papr Makes sense! Thanks so much for your help!

user-c629df 05 May, 2020, 18:47:42

@papr May I also seek your suggestions on finding saccade data in the exported data files? I couldn't find it there, nor among the Pupil Player plugins. I'm wondering if the data is embedded in some other data files?

papr 05 May, 2020, 18:48:37

Pupil Player does not currently perform saccade detection.

user-c629df 05 May, 2020, 18:49:23

I see. Would gaze positions be an effective alternative to that?

papr 05 May, 2020, 18:51:26

@user-c629df Gaze positions are the raw data on which eye movement detection is based. An alternative would be to use the fixation detection and assume that all gaze that does not belong to a fixation is a saccade. This is not completely true though, since there are other eye movement types, like smooth pursuit, which you would then interpret as a saccade.

user-c629df 05 May, 2020, 18:59:09

@papr Thanks for the clarification! A follow-up question: if I want to calculate the time interval between moving the eyes away from the fixation point (the center of the screen) to the target location on the screen within a trial of an experiment, which data file should I use to achieve that?

papr 05 May, 2020, 19:02:27

Is there a stimulus change that triggers the saccade? Usually, you would save a trigger/annotation with the timestamp of this event, effectively starting a timer. Then look at the gaze data after that event. As soon as there is significant movement, you can stop your timer.
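
A rough sketch of that timer logic in Python (the names, values, and movement threshold here are illustrative, not part of the export format):

import numpy as np

# Hypothetical inputs: annotation/trigger timestamp and the gaze samples after it
trigger_ts = 100.00
gaze_ts = np.array([100.00, 100.01, 100.02, 100.03, 100.04])
gaze_x = np.array([0.50, 0.50, 0.51, 0.58, 0.65])  # norm_pos_x of each gaze datum

baseline = gaze_x[0]
threshold = 0.05  # what counts as "significant movement"; tune for your setup
moved = np.abs(gaze_x - baseline) > threshold
latency = gaze_ts[moved][0] - trigger_ts if moved.any() else None
print(latency)  # ~0.03 s in this toy example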

user-c629df 05 May, 2020, 19:03:24

@papr I see. Thanks for the explanation! I will start with gaze_positions.csv then!

user-c629df 06 May, 2020, 01:04:00

@papr What is the unit of the timestamps? For instance, if there are two timestamps, 1588133997.135 and 1588133948.176, would the difference between the two numbers, which is 48.959, represent the actual time difference between them in seconds? It seems that in the surface data file, the timestamps don't match the real time length of my experiment. For instance, if a certain set of trials takes 5 minutes, the difference between the min and max timestamps in the surface file is only 48.959, presumably in seconds, which is drastically different from the real time length.

papr 06 May, 2020, 07:06:20

@user-c629df Timestamps are in seconds. Please be aware that gaze is actually only mapped to the surface if the surface was detected in the scene video. If you want, you can share the recording with [email removed] and I can have a look.

user-78130d 06 May, 2020, 10:22:41

Hi everyone, I've been using the Pupil 👁 Core for scientific purposes for a while now. At the moment we would like to combine a motion capture analysis with eye tracking data. I found that you can integrate/combine the eye tracking data from Ergoneers eye trackers with the Qualisys Motion Capture System. Is there any possibility to combine/integrate the Pupil Core with a motion capture system? Thanks a lot for any answer in advance!

papr 06 May, 2020, 15:46:30

@user-c629df We have received your recording. We will come back to you via email tomorrow.

user-c629df 06 May, 2020, 15:46:55

@papr Awesome! Thanks so much for your help!

user-26fef5 07 May, 2020, 08:12:24

@user-78130d Hi there. We did exactly this. We are using ROS to sync the data streams from Core and the Qualisys QTM client. This is based on the message_filters package for ROS (timestamp-based synchronisation). We have 3D-printed a lightweight marker tree and attached it to the Core frame. We used the C++ ROS API since we are more familiar with that, but the Python API should be fine as well (there is a Python ROS package for Pupil Core already, and a Qualisys QTM ROS wrapper).

user-26fef5 07 May, 2020, 08:13:08

Hope that this helps a bit

user-78130d 07 May, 2020, 09:23:43

@user-26fef5 Thanks a lot! I will check this out.

papr 07 May, 2020, 12:26:14

@user-c629df I did not realise that we have been in contact via email already. Would you mind sharing the scene video with [email removed] too? Without it, it is difficult to judge what is causing the shorter than expected surface data.

user-6e1b0b 07 May, 2020, 17:24:13

Hello! I was wondering if it was possible for Core (or another of Pupil's products, though I believe Core is what I want) to detect an eye other than a human eye (e.g. a cat or dog eye). I wasn't sure if the algorithm used for tracking eyes was specifically designed to pick up characteristics of the human eye, like the relative size of the pupil/iris, or requiring a completely visible sclera.

Sorry, I think my message from a few days ago may have been overlooked. Just reposting it.

wrp 08 May, 2020, 06:41:18

hi @user-6e1b0b the pupil detection algorithm is designed to be used with humans. However, there have been some projects that use modified versions of Pupil Core hardware for canines and non-human primates. I do not have insight into the results of these projects and/or what modifications were made to software for pupil detection.

user-c629df 08 May, 2020, 19:02:53

@papr Thanks! I just emailed it to you!

user-b37f66 10 May, 2020, 20:47:07

Hi, I'm using the Pupil Core headset with a RealSense D415 camera on Ubuntu 18.04. I'm trying to run Pupil Capture from source, but the program is running terribly slow and the screen barely changes. I tried versions 1.21 and 1.23 (with the RealSense plugin) and it happened in both of them. When I'm starting Pupil Capture in the regular way, everything works just fine. I also checked the memory and CPU usage and there is nothing unusual there. I would really appreciate any help in this matter. Yogev

user-20b83c 11 May, 2020, 04:07:53

Hi, I see the minimum requirements (8 GB RAM, i5 CPU). Can I get maximum performance (such as the highest frame rate, etc.) with this?

user-48e99b 11 May, 2020, 04:54:20

Hi, has anyone faced an issue using a RealSense D435 camera with Pupil Capture from source? In my case, the camera is recognized (it is listed in 'video_capture.uvc_backend'), but when I click the device in the application, it returns the error "The selected camera is already in use or blocked". Ah, I tested on Linux 16.04 😄

user-c5fb8b 11 May, 2020, 07:09:06

@user-b37f66 you do not need to run Pupil from source in order to use the RealSense plugin. You can also use the prebuilt bundle and just add the plugin to: your home folder > pupil_capture_settings > plugins. You might need to create the plugins folder first. Please give this a try to test whether this runs more smoothly for you!

user-c5fb8b 11 May, 2020, 07:11:09

@user-48e99b Which version of Pupil are you using? Please note that we discontinued official support for the RealSense cameras in v1.22. You can read about the reasoning behind this decision in the release notes: https://github.com/pupil-labs/pupil/releases/tag/v1.22 To make transitioning easier, we took the existing code and wrapped it into an external plugin that you can use to interface with the RealSense cameras. You can find the link in the release notes or here: https://gist.github.com/pfaion/080ef0d5bc3c556dd0c3cccf93ac2d11

user-01d553 11 May, 2020, 07:34:23

Hi, does anybody have experience with Pupil Core and VICON Nexus software? I am trying to send a trigger from VICON to the Pupil Core, using Pupil Remote, with no luck.

user-48e99b 11 May, 2020, 07:58:40

@user-c5fb8b Currently, I run Pupil Capture from source. And I put the file 'pyrealsense2_backend.py' into the 'pupil_capture_settings > plugins' folder. But the application still returns the error 'The selected camera is already in use or blocked', even though it also returns the 'video_capture.uvc_backend: Found device. Intel(R) RealSense(TM) 435' message continuously in the terminal. Could you let me know how I can fix it? Thank you! (The Pupil source's version is v1.23)

papr 11 May, 2020, 08:03:05

@user-01d553 Am I correct that you were in contact with us via email? If this is correct, I will come back to you via email to avoid two parallel conversations.

user-01d553 11 May, 2020, 08:11:47

OK... I thought this was another channel... sorry... 👍

user-c5fb8b 11 May, 2020, 08:12:56

@user-48e99b (maybe also interesting for @user-b37f66) The plugin we created with the removed RealSense integration code depends on pyrealsense2, which are the Python bindings for librealsense. If you run from source, you will have to install this as well into your Python environment. A simple pip install pyrealsense2 should work if you're on Linux. See the official docs for more info: https://github.com/IntelRealSense/librealsense/tree/master/wrappers/python#python-wrapper

user-48e99b 11 May, 2020, 08:28:26

@user-c5fb8b I already installed 'pyrealsense2' via 'pip install pyrealsense2'. To make sure, I reinstalled and checked that the terminal returns 'Requirement already satisfied: pyrealsense2 in ...'. But the errors I mentioned are still there: 'The selected camera is already in use or blocked' in the application window, and the 'video_capture.uvc_backend: Found device. Intel(R) RealSense(TM) 435' message in the terminal (it keeps showing). Is there any reason this is happening? Thank you!

user-c5fb8b 11 May, 2020, 08:29:14

@user-48e99b can you share the log output? From pupil_capture_settings/capture.log

user-48e99b 11 May, 2020, 08:32:36

Michael Jo Pupil-D435 test log

capture.log

user-c5fb8b 11 May, 2020, 08:32:49

@user-48e99b also, did you enable the plugin? In the plugin manager menu there should be a Realsense2 Source entry

user-48e99b 11 May, 2020, 08:34:18

@user-c5fb8b Ah.. I didn't enable the plugin. It shows the image correctly! Thank you very much 😄

user-c5fb8b 11 May, 2020, 08:34:58

🙂 No worries, glad everything works!

user-48e99b 11 May, 2020, 08:44:35

Thank you again! 🙂

user-d9a2e5 11 May, 2020, 09:33:23

Hello, I want to be sure about something: Capture is the one calculating the pupil size, gaze, etc., and recording the video? If yes, can I turn off the pupil size calculation in it? And on the other hand, Pupil Player is what transforms the data to Excel? And if I have outside video data, can it calculate pupil size and gaze by itself in offline mode?

user-c5fb8b 11 May, 2020, 09:43:25

Hi @user-d9a2e5, I hope I understand you correctly: Pupil Capture records all video streams (eye and world) and by default also runs online pupil detection and gaze mapping, which will be included in the recording. Pupil Player also offers offline analysis options, including re-calculating pupil and gaze data from the recorded video.

1) The pupil size is a by-product of the pupil detection pipeline. You can disable the whole pipeline while recording, which will get rid of the pupil size, but also of the other pupil information as well as the gaze mapping (as no pupil data is available to map). You can do this in the general settings of Capture, by selecting detection & mapping mode: disabled. A common use-case for this is e.g. to reduce CPU load while recording on machines with low hardware specs.

2) If you have a recording without pupil and gaze information, you can still run offline pupil detection and offline gaze mapping in Pupil Player (as with any other recording).

3) Pupil Player does indeed export the data, but not necessarily "to excel". Instead we export all data in a general-purpose format (CSV), which can be opened from many different applications, but yes, also from Microsoft Excel.

I hope this answers your questions?

user-d9a2e5 11 May, 2020, 09:57:37

Wow, you mega-answered my question, thanks!!! You are great guys, always helping me 🙂. Have a good day!

user-ae6127 11 May, 2020, 12:31:04

Hi, I'm using Pupil Service to get data via ZMQ in a C++ application. I have an issue where the data does not seem to reflect what I see in the Capture program. Initially I thought it was a coordinate mapping issue, but I realize it is not, as my data does not even relatively correspond to the movement in Capture, nor is it mirrored/inverted. I scale the normalized data with the screen resolution. I am suspecting that I might be seeing old data. Is there a buffer that can fill up if I don't request information fast enough? Any other ideas what could be causing it? At the moment I'm simply trying to get a reliable reaction to looking up, down, left, and right.

user-ae6127 11 May, 2020, 12:32:53

here is how I connect and receive data: https://github.com/mrbichel/eye-read-01/blob/master/src/ofApp.cpp#L21

papr 11 May, 2020, 12:33:46

@user-ae6127 You are subscribing to pupil data, i.e. data in the coordinate system of the right eye camera

papr 11 May, 2020, 12:34:36

please be aware that the right eye camera is physically flipped. Therefore, by default, bigger norm_pos_y values map to a downward movement of the pupil position

user-ae6127 11 May, 2020, 12:34:44

Yep, I tried both of these topics: "pupil.0", "gaze.3d.1."

papr 11 May, 2020, 12:35:02

Gaze is mapped pupil data in scene coordinates and requires calibration. Did you calibrate already?

user-ae6127 11 May, 2020, 12:35:53

Yep, I did the calibration, and I did a recording looking around the edges of the screen with that calibration, which is of good quality. I use a chin rest and immediately tested with the same calibration in my own software.

user-ae6127 11 May, 2020, 12:36:07

I'm aware of the flipped y-axis.

papr 11 May, 2020, 12:36:40

Ok, great! There is indeed the possibility that you are receiving old data if you do not process the data quickly enough.

user-ae6127 11 May, 2020, 12:37:42

Is there a way to make sure you always receive the newest data, and that older packages are simply dropped? It was actually my understanding of ZMQ_SUB that you always get the most recent data when requesting?

papr 11 May, 2020, 12:39:16

https://rfc.zeromq.org/spec/29/

The SUB Socket Type ... For processing incoming messages:
- SHALL silently discard messages if the queue for a publisher is full.
- SHALL receive incoming messages from its publishers using a fair-queuing strategy.
- SHALL not modify incoming messages in any way.
- MAY, depending on the transport, filter messages according to subscriptions, using a prefix match algorithm.
- SHALL deliver messages to its calling application.

papr 11 May, 2020, 12:39:52

New packages will be dropped if the sub queue is full

papr 11 May, 2020, 12:40:56

I recommend recv()-ing all available data first, before processing the data points one by one. One processing step could be to drop all but the most recent package.

user-ae6127 11 May, 2020, 12:41:38

Ok I see, so basically each time I run my receive method I will currently get the oldest message in the queue?

papr 11 May, 2020, 12:41:45

This is how you can check if data is available: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/zmq_tools.py#L128

papr 11 May, 2020, 12:42:00

@user-ae6127 correct

papr 11 May, 2020, 12:42:31

It is on you to empty/process the queue.

user-ae6127 11 May, 2020, 12:43:00

Ok I see, that is probably my issue then. Any chance you know of a C++ reference / example?

papr 11 May, 2020, 12:44:23

Yeah, having a look at your code, you call recv once every app cycle. This is probably 60 Hz or less. This is not sufficient if you run pupil detection at 120 or 200 Hz.

user-ae6127 11 May, 2020, 12:45:20

Yep, I tried to increase my framerate to accommodate, but my graphics processing in the application cannot keep up with that.

papr 11 May, 2020, 12:45:22

Basically you need to modify ofApp::update() to

while pupilZmq.hasData() {
    pupilZmq.receive()
}
user-ae6127 11 May, 2020, 12:45:38

ok

papr 11 May, 2020, 12:46:19

The frequency of your gui should not be dependent on your subscription input frequency

user-ae6127 11 May, 2020, 12:47:10

cppzmq, which I use, does not seem to have a hasData method.

papr 11 May, 2020, 12:47:59

Neither does the python implementation. 🙂 This was more of an abstract example to clarify what needs to change 🙂

papr 11 May, 2020, 12:48:22

Check this Python equivalent on how to implement a hasData() function: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/zmq_tools.py#L128
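
For reference, a minimal Python sketch of that pattern (the address is illustrative; query Pupil Remote for the actual SUB port):

import zmq

ctx = zmq.Context()
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:50021")  # illustrative; use the SUB_PORT reported by Pupil Remote
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

def has_data(socket):
    # Same check as the linked Msg_Receiver: POLLIN flag set -> a message is queued
    return socket.get(zmq.EVENTS) & zmq.POLLIN

def recv_latest(socket):
    # Drain the queue, keeping only the most recent message
    latest = None
    while has_data(socket):
        latest = socket.recv_multipart()  # [topic frame, msgpack payload frame]
    return latest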

user-ae6127 11 May, 2020, 12:52:33

Hmm ok, so it seems I can just keep running recv until it returns false?

papr 11 May, 2020, 12:57:19

@user-ae6127 No, this won't work, as recv() is a blocking function by default

papr 11 May, 2020, 12:59:00

@user-ae6127 btw, it might be easier to use the higher-level http://zeromq.github.io/zmqpp/ over cppzmq, which is quite low-level

papr 11 May, 2020, 12:59:35

You can even use an event based approach with http://zeromq.github.io/zmqpp/classzmqpp_1_1reactor.html

user-ae6127 11 May, 2020, 13:02:37

Yeah, might be a good idea. I have a bunch of code written with cppzmq now, though.

user-ae6127 11 May, 2020, 13:02:51

something like
"while( socket.get( ZMQ_EVENTS ) == ZMQ_POLLIN ) { socket.recv(&frame1); socket.recv(&frame2); }"

papr 11 May, 2020, 13:03:51

This looks about right. I have no C++ experience though. If you already have a bunch of code and got used to cppzmq, then stick with it. There is nothing wrong with using it.

user-ae6127 11 May, 2020, 13:17:30

thanks for the guidance

user-d9a2e5 11 May, 2020, 16:09:32

Hey, I have another question. Do you have any suggestions on interpolating gaze data? I'm doing the following: first deleting all data with confidence less than 0.8, and all data which is not on the AOI; then I make sure all data is between 0 and 1, and then I use an interpolator. Could you tell me if I'm doing something off the charts? I'm asking because I see that my data, which is interpolated to the same frequency as the movie, is not the same as shown in Pupil Player. And is the gaze in Pupil Player raw data, or is it interpolated too?

user-50974c 11 May, 2020, 17:32:54

is it possible to do 3d calibration with one eye by doing multiple 2d calibrations?

user-50974c 11 May, 2020, 17:33:37

the idea is to have many calibration profiles and switch to the right distance one (i know the distance they'll be looking at)

papr 11 May, 2020, 17:34:18

@user-50974c No, simply run the 3d calibration to get 3d gaze mapping. There is no simple way to switch between multiple calibrations, either.

user-50974c 11 May, 2020, 17:35:09

thanks for the quick response 👍

user-50974c 11 May, 2020, 17:46:12

Hmm, but how would the Pupil Capture software properly calibrate in 3D with only one eye? I was considering adding an extra calibration layer in my own Python script so that swapping (between multiple calibrations) would be easy.

papr 11 May, 2020, 17:50:31

The 3d calibration works via bundle adjustment. For this procedure, it does not matter if you have one or two eye cameras. There is one difference though: The gaze_point_3d will show a fixed depth. But you should not rely on this value anyway as it is quite noisy.

user-b37f66 12 May, 2020, 04:51:41

@user-c5fb8b Hi, I know that the RealSense plugin can be loaded in the prebuilt bundle, and everything works great when I'm loading the plugin that way. I need to run Pupil Capture from source with the RealSense plugin in order to load and use another plugin that I developed, which imports Python packages that aren't available in the prebuilt version. The problem, as I said, is that when I'm running Pupil Capture from source with the RealSense plugin, the program runs very slowly. In addition, I already installed pyrealsense2 in order to load the RealSense plugin from source. Hope the problem is clearer now.

user-c629df 12 May, 2020, 06:09:37

@papr May I ask for your suggestions on how to plot pupil diameter graphs more accurately? I followed the coding tutorials on GitHub and it turns out that the plot does not exclude dips, as shown below, which influence the overall pattern. Thanks!

user-c629df 12 May, 2020, 06:09:55

Chat image

papr 12 May, 2020, 06:14:02

@user-b37f66 can you check which frame rate/s is/are selected in the pyrealsense plugin? You should select a frame rate of 30 or higher to make the UI run smoothly. Please also check the FPS graph in the top left for the effective FPS numbers that you are receiving.

papr 12 May, 2020, 06:15:33

@user-c629df Are you aware that you should be filtering data by confidence? If so, which threshold have you chosen?

user-7fa523 12 May, 2020, 06:37:50

Hello, I try to find out where exactly the image plane (physical sensor) of the world camera is located. Is there any specific data on that? Or is it possible to tell me the actual sensor measurements, so that I can deduce the focal length in mm from the intrinsic camera calibration?

user-c629df 12 May, 2020, 07:00:44

Chat image

user-c629df 12 May, 2020, 07:01:30

@papr Yes, you are correct! When I choose 0.85 as the threshold, the graph looks like the one above. Thanks!

user-c629df 12 May, 2020, 07:01:57

A follow-up question is how to explain the many variations within 3 s? Also, how come one's diameter can go above 160 mm?

user-c629df 12 May, 2020, 07:02:51

A typical graph that I see in the literature is like the one below, which has a much smoother curve:

Chat image

user-c5fb8b 12 May, 2020, 07:04:11

@user-d9a2e5 I'm not sure what you want to achieve? Why are you interpolating the data at all?

user-908b50 12 May, 2020, 07:36:38

@user-c5fb8b Thanks, that's good to know! I am at the very early stages of data analysis and learning right now. Basically, I want to get exports for each of the recording folders simultaneously instead of getting these exports individually. If I can do the above in Python, I should be able to use for loops? I had been building a bundle in a conda env over the past week, but I realized Python 3.8 won't work, so I'm starting over in a 3.6 virtual env.

papr 12 May, 2020, 07:38:44

@user-c629df What are you plotting exactly? diameter or diameter_3d? Such high values speak for an ill-fit eye model. The eye model needs to be fit well in order to produce consistent and good data.

user-c5fb8b 12 May, 2020, 08:21:49

@user-908b50 Unfortunately it is currently not possible to automatically open and export multiple recordings

papr 12 May, 2020, 08:31:59

@user-908b50 There is a pupil-community script that extracts prerecorded data from recordings that you could give a try https://github.com/tombullock/batchExportPupilLabs

user-d9a2e5 12 May, 2020, 08:41:26

@user-c5fb8b I want to check gaze data, and I need the frequency to be the same as the movie, and I need to delete all non-relevant data, no?

papr 12 May, 2020, 08:43:13

@user-d9a2e5 From your previous question: The gaze displayed in Pupil Player is not interpolated.

Also, could you elaborate on what you are referring to by "movie"?

user-c5fb8b 12 May, 2020, 08:44:38

@user-d9a2e5 do you want to match gaze data to world video frames?

user-d9a2e5 12 May, 2020, 09:28:58

@user-c5fb8b Yes, and the pupil data too.

user-d9a2e5 12 May, 2020, 09:30:52

@papr I am trying to see which movies affect pupil dilation, and to combine that with gaze data for my project.

user-d9a2e5 12 May, 2020, 09:35:46

And I saw that when I am checking the raw data, its frequency histogram is not what I chose in Pupil Capture. I chose 200, and I get some of them even bigger than 200 Hz. Did I do something wrong?

user-c5fb8b 12 May, 2020, 09:47:07

@user-d9a2e5 the exact frame-rate of the eye data can jitter a bit due to transmission lags etc.

How do you analyse the data? If you export your data with Pupil Player into CSV files, they will already contain the matching information, e.g. in pupil_positions.csv you'll find for every pupil datum the corresponding world video frame index. If you remove your outliers here (confidence filter), you might indeed end up with some world frames where you do not have any corresponding pupil data. Is interpolation here really a good idea? This might actually introduce bad data. Maybe you can think rather about discarding world frames that do not have any pupil data available for your analysis?
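
For instance, a minimal pandas sketch of this matching (the file path is illustrative; world_index, confidence, and diameter are columns of the Player export):

import pandas as pd

df = pd.read_csv("exports/000/pupil_positions.csv")  # illustrative path to a Player export
df = df[df["confidence"] >= 0.8]                     # drop low-confidence outliers
# world_index ties each pupil datum to a world video frame:
per_frame = df.groupby("world_index")["diameter"].mean()
print(per_frame.head())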

Please tell me if what I'm saying makes any sense, as I'm not aware of your specific experimental setup 🙂

user-d9a2e5 12 May, 2020, 09:59:22

@user-c5fb8b Basically what I do: I have a video with AOI markers; I watch the movie, then use Player to extract the data. I'm using the pupil toolbox that was made by someone whose name I'm not sure of, plus my own gaze data interpolator. That's basically all I did. Then I tried to make it the same size, and now I am trying to put it into the movie I used, gaze and pupil data together. Sorry for my bad English :C

user-d9a2e5 12 May, 2020, 10:01:03

Am I asking too many questions?

user-c5fb8b 12 May, 2020, 10:01:07

@user-d9a2e5 I understand. But doesn't it make sense to not display any gaze data if there is no high confidence gaze data?

user-d9a2e5 12 May, 2020, 10:04:52

@user-c5fb8b It does, but the time is not the same as in my movie video, so I have to make them match :C, so I kind of feel like I have to do it. By the way, which data is better: gaze time, or taking the mean over the same world frame?

user-c5fb8b 12 May, 2020, 10:05:41

@user-d9a2e5 are you matching the gaze data by time or by index?

user-d9a2e5 12 May, 2020, 10:06:29

@user-c5fb8b I think I'm matching by index

user-c5fb8b 12 May, 2020, 10:13:46

@user-d9a2e5 are you using the exported surface data to match this? Since you seem to be interested only in the gaze on your surface, correct? I'm not entirely sure what the cause of your problems is. My guess would be that you are matching the wrong indices/timestamps somehow, if Pupil Player shows everything correctly. But I fear I can't help you much more on this, especially since you are using a toolbox that someone else wrote. In theory this should be an easy task.

user-d9a2e5 12 May, 2020, 10:28:04

@user-c5fb8b I am using https://github.com/ElioS-S/pupil-size - do you know their code?

user-c5fb8b 12 May, 2020, 10:40:55

@user-d9a2e5 unfortunately we cannot offer any support for external code/tools. Could you share some screenshots of what you are trying to achieve with your current implementation? Maybe we can understand the underlying issue better from this.

user-d9a2e5 12 May, 2020, 10:47:52

@user-c5fb8b Okay 🙂, I will send it in the upcoming days. Thank you for the help, and sorry for the many questions, haha, oops.

user-370594 12 May, 2020, 13:24:47

Hi! I want to buy a Windows tablet to use with the Core eye tracker. Unfortunately, I couldn't find any information about the technical requirements for laptops and tablets.

user-0a5591 12 May, 2020, 14:20:19

Hi, I have a quick sales question. If we buy the Core eye-tracker with the high speed camera, can we still change the configuration to use it with the RealSense camera in the future?

user-c629df 12 May, 2020, 17:02:22

@papr I'm plotting the diameter of the pupil in image pixels, as observed in the eye image frame. Should I plot diameter_3d instead for a better eye model?

papr 12 May, 2020, 17:19:25

@user-c629df ah OK, then the value range makes more sense as it's pixels, not mm. If you want mm, you need to plot diameter_3d

user-c828f5 12 May, 2020, 17:34:36

Hi @papr, I just wanted to quickly confirm my understanding of the PL Core algorithm (Swirski model). When we select 3D pupil mode, the software generates a 3D model based on detected 2D ellipses in the eye imagery, yes? In order to calculate norm_pos in the scene video, PL uses the 2D ellipse center and calibration data to map the 2D ellipse center onto a 2D norm_pos value in the scene video, yes?

user-c828f5 12 May, 2020, 17:35:06

The 3D model does not update the 2D pupil center value right? Rather, it is used to derive a 3D vector?

papr 12 May, 2020, 17:41:31

If you select 3d detection and mapping mode, the gaze mapping will map the pupil circle_3d normals to the gaze_normals and try to intersect them, resulting in the gaze_point_3d. This 3d point is backprojected onto the scene image, resulting in the gaze norm_pos.

The pupil center (pupil norm_pos) is only used in 2d gaze mapping using polynomial regression.

In 3d mode, the 2d ellipse will be projected onto the 3d model. Then the ellipse will be adjusted such that it is a circle that is tangential to the eye model. Afterward, this circle (circle_3d) is backprojected onto the eye image (ellipse). This actually overwrites the original 2d ellipse values. The better the backprojection fits the original value, the higher the model confidence will be.

papr 12 May, 2020, 17:42:13

In other words, the 3d model does update the 2d pupil center.

user-c828f5 12 May, 2020, 17:42:23

Perfect. That's what I wanted to know.

user-c828f5 12 May, 2020, 17:42:33

Now, has this always been the behavior?

user-c828f5 12 May, 2020, 17:42:44

Or was this behavior updated post a particular version?

papr 12 May, 2020, 17:46:48

"Now, has this always been the behavior?" Yes, as far as I can remember.

In our upcoming 2.0 release, we will store 2d and 3d data, such that no data gets lost. This allows nice visualizations showing 2d and 3d ellipse at the same time. You can use this to check if your eye model is fit well.

user-c828f5 12 May, 2020, 18:00:46

I'm a little surprised by the information that it has always been the case... So does this mean that a single natural features calibration point would calibrate the system? Or do you do a polynomial mapping on gaze norm_pos after natural features calibration (or any calibration for that matter)?

papr 12 May, 2020, 18:04:40
  • 2d calibration uses polynomial regression based on pupil norm_pos as input and the norm_pos of the reference target
  • 3d calibration uses bundle adjustment to find the physical rotation of the eye cameras in relation to the scene camera.

Both methods need multiple reference locations that are spread across the scene camera's field of view in order to work well.

user-c828f5 12 May, 2020, 18:06:30

Thank you, this cleared a lot of information for me. Could you point me to a link or webpage where this process is highlighted?

papr 12 May, 2020, 18:08:10

There is a note about this in https://docs.pupil-labs.com/core/software/pupil-player/#gaze-data-and-post-hoc-calibration

Mapping Method: 2d uses polynomial regression, or 3d uses bundle adjustment calibration. The terms are not explained further as they are common/public algorithms.

user-c828f5 12 May, 2020, 18:14:00

Ok thanks @papr

user-b37f66 12 May, 2020, 20:05:52

@papr The frame rate is 30 by default and I didn't change it. I have noticed that for the first few seconds after activating the RealSense plugin, the program runs fine, and just after that it becomes slow. The same thing happens when I try to run an earlier version of Pupil Capture (1.21) from source, with the former way of activating the RealSense World/Depth camera.

papr 13 May, 2020, 12:17:47

@user-b37f66 Unfortunately, it is very difficult for us to judge what is causing this issue as we do not have access to a realsense camera right now in order to reproduce the issue.

user-7daa32 14 May, 2020, 03:46:46

Hello

user-7daa32 14 May, 2020, 03:49:15

Hello, I am a first-year graduate student and a novice in the use of the Pupil Labs Core headset and software. Please can someone help me with how to get started? The headset has two eye cameras.

user-c5fb8b 14 May, 2020, 06:23:22

Hi @user-7daa32, I would start by reading through the documentation here: https://docs.pupil-labs.com/core/ probably the Getting Started section as well as the User Guide. This should help you get an idea of what Pupil can do and what to watch out for. Especially the User Guide > Best Practices section might be interesting.

user-7daa32 14 May, 2020, 06:39:38

@user-c5fb8b Done already.

user-7daa32 14 May, 2020, 06:43:03

@user-c5fb8b I don't know why it's getting difficult to do pupil detection with a confidence of 0.9 or 1 every day. One day I will get good confidence; at another time, I will find it difficult to detect the pupil.

user-7daa32 14 May, 2020, 06:44:54

It's pretty easy to detect it in one eye but hard to do that for both eyes

user-c5fb8b 14 May, 2020, 06:45:40

@user-7daa32 there are a couple of reasons for why pupil detection might be hard. Generally you want the pupil to be clearly visible with a good contrast. If you have any examples for the pupil detection failing or producing low confidence values only, you can share a screenshot of the eye window with us and we might be able to give you more specific tips on how to improve your setup.

user-7daa32 14 May, 2020, 06:55:17

@user-c5fb8b Thank you so much. I will screenshot them during the day today. I actually meant to say that it's hard positioning the pupil at the center of the eye window in order to obtain good confidence. This got me frustrated, and most times I tend to twist the frame incorrectly. I have a recording file containing spreadsheet data, video data, and data that I don't know the meaning of. I am aware of parameters like fixation, pupil diameter (I saw this in the Excel file), heatmap, and scanpath. I think I have to work on pupil detection and calibration for now.

user-c5fb8b 14 May, 2020, 06:59:46

@user-7daa32 ok. Just in case, here are the links to how you can adjust your eye cameras:
- sliding: https://docs.pupil-labs.com/core/hardware/#slide-eye-camera
- rotating: https://docs.pupil-labs.com/core/hardware/#rotate-eye-camera
- switching the extenders: https://docs.pupil-labs.com/core/hardware/#eye-camera-arm-extender

user-7daa32 14 May, 2020, 07:06:27

@user-c5fb8b Thanks again. I have gone through these several times. I'm trying to say that with the headset fixed on my head, I was unable to detect the pupil. I adjusted the movable parts and still couldn't detect the pupils. I have read most of the guides on the Pupil Labs website. I will send pictures later today. Thanks.

user-ab0622 14 May, 2020, 08:45:48

Hey there ^_^ Is this the right place to ask questions about the Vive Pro Eye?

user-ab0622 14 May, 2020, 08:50:48

I am having trouble configuring the Vive Pro Eye ^_^ because I am using it in a lab that speaks a foreign language, so it's hard to ask around if I am doing something wrong. If this is the wrong place, please direct me to the right one.

papr 14 May, 2020, 08:59:30

@user-ab0622 Hi, the Vive Pro Eye uses eye tracking from Tobii. Unfortunately, we cannot help you with that here. I would recommend contacting their support directly.

papr 14 May, 2020, 08:59:59

Nonetheless, you can let us know your questions. Maybe, we can answer them partially.

user-ab0622 14 May, 2020, 09:00:28

well

user-ab0622 14 May, 2020, 09:00:41

I have a Vive with a tracker and a plug. I am not sure if it's from Pupil Labs or not.

user-ab0622 14 May, 2020, 09:00:52

I'll show you the picture

user-ab0622 14 May, 2020, 09:02:07

Chat image

user-ab0622 14 May, 2020, 09:02:35

Chat image

user-ab0622 14 May, 2020, 09:02:55

Chat image

user-ab0622 14 May, 2020, 09:02:56

It displays errors as follows

papr 14 May, 2020, 09:03:04

@user-ab0622 This is indeed our product. 🙂

user-ab0622 14 May, 2020, 09:04:42

^_^ I thought so, that's why I came here. I thought it was a normal Vive Pro Eye, but I think it was assembled. First, there's the USB; I am supposed to connect it to the computer, right? And second, do you think the reason why eye tracking isn't working is that I am installing the Tobii eye tracking software? Like, how am I supposed to operate it?

papr 14 May, 2020, 09:07:39

@user-ab0622 Please checkout our getting started section for the add-on https://docs.pupil-labs.com/vr-ar/htc-vive/#install-the-vive-pro-add-on Afterward, checkout the getting started section for our software: https://docs.pupil-labs.com/core/#_1-put-on-pupil-core

papr 14 May, 2020, 09:08:34

Also, checkout the hmd-eyes project which includes a plugin for Unity https://github.com/pupil-labs/hmd-eyes/#hmd-eyes

user-ab0622 14 May, 2020, 09:09:11

Oh nice, man, I didn't know there were so many options. I used to think the Vive Pro Eye had many options, but this is an add-on to a normal Vive Pro, right? Also...

user-ab0622 14 May, 2020, 09:09:34

Do you recommend having the Vive installation do all the work, or is it better to do my own custom setup with the software? I'll definitely check all the links.

user-7daa32 14 May, 2020, 13:19:07

Please, why is the world window screen blurry? As soon as I progress, I will send screenshots.

user-7daa32 14 May, 2020, 14:32:02

Hi, please, where do I click to upload pics here?

papr 14 May, 2020, 14:33:19

@user-7daa32 If you are using Discord on a desktop machine there should be a + sign on the left side of the text field

user-c5fb8b 14 May, 2020, 14:34:11

@user-7daa32 you can also just drag-and-drop them

user-7daa32 14 May, 2020, 14:34:40

@user-c5fb8b Thanks so much.

user-7daa32 14 May, 2020, 14:43:58

Here are the screenshots of pupil detection. The confidence values, especially for eye 0, tend to move sharply from 1 to below 0.5 while trying to look at the wall (the supposed stimulus). I didn't use the arm extenders in this case. I have not done the algorithm settings or calibrated, just pupil detection. The eye 1 image was flipped. Is it not bad when the confidence values are changing like that?

Chat image

user-c5fb8b 14 May, 2020, 15:02:23

@user-7daa32 in this screenshot the confidence seems to be fine? Do you have an example where the confidence drops? Additionally, the pupil might be partially obstructed by the eyelashes here, which will make detection harder. Optimally, the cameras should record the eye from slightly below, which makes it less likely that the eyelashes obstruct the image. Maybe you can give the extenders a try?

user-7daa32 14 May, 2020, 15:20:24

I tried the extenders but I kept getting the pupils out of detection. Right now the screen calibration marker is taking long to be detected.

papr 14 May, 2020, 16:16:58

@user-7daa32 Please make sure that the screen is centered in the field of view of the scene camera. If the screenmarker calibration shows a red dot in the middle this means that the calibration marker is not being detected.

user-7daa32 14 May, 2020, 16:32:46

@papr I am still unable to calibrate with the screen positioned at the center of the scene camera's view. Attached is a screenshot of the world camera window.

Chat image

user-7daa32 14 May, 2020, 16:33:51

@papr I don't know why I am getting those things I circled.

papr 14 May, 2020, 16:36:51

It looks like there is a smudge on your lens. Please use a microfiber cloth to clean it. If the image is still blurry, please try to rotate the lens carefully. If you turn it too far outwards it will detach.

user-7daa32 14 May, 2020, 16:56:10

@papr Thank you so much. I think I am improving. Why is the calibration area very small?

user-7daa32 14 May, 2020, 16:57:03

Also, there are these words, "dismissing 33%". What does that mean?

Chat image

papr 14 May, 2020, 17:05:55

I would say the best way to help you would be if you could start a recording, make a calibration, stop the recording, and send it to data@pupil-labs.com. @user-c5fb8b can have a look at it tomorrow.

user-7daa32 14 May, 2020, 19:08:49

@papr Okay, thanks.

user-7daa32 14 May, 2020, 19:11:21

Pupil Player refused to open; only this appeared, without showing the green booting text. I reinstalled and am still having the issue. Please, do you know anything I could do to resolve it? Thanks.

Chat image

papr 14 May, 2020, 19:12:03

Please delete the user_settings files in the pupil_player_settings folder

papr 14 May, 2020, 19:12:09

and try again

user-6779be 15 May, 2020, 10:46:52

Hello, I am trying to find information about head pose tracking and found very limited content on the website. Has anyone worked with real-time head pose tracking with Pupil Core? Also, is there a way to get the data into Unity3D?

user-c5fb8b 15 May, 2020, 11:13:41

Hi @user-6779be, do you have any specific questions regarding the headpose tracking?

user-6779be 15 May, 2020, 11:26:11

@user-c5fb8b I would like to know how to subscribe to the head pose data while streaming over Pupil Remote, and what the format of the data is

user-c5fb8b 15 May, 2020, 11:47:45

@user-6779be I realized that we unfortunately don't have any documentation on the online head pose tracker in Pupil Capture. However, it works very similar to the offline head pose tracker in Pupil Player. You can find the documentation here: https://docs.pupil-labs.com/core/software/pupil-player/#analysis-plugins The online head-pose tracker also publishes the data every frame via our normal network interface. You can subscribe to the topic head_pose to receive the data. Here's an example dump for a single frame:

{
    'camera_extrinsics': [-2.549196720123291,
                          -0.05451478064060211,
                          -0.7083187103271484,
                          -2.4842631816864014,
                          -8.648696899414062,
                          18.81245231628418],
    'camera_pose_matrix': [[0.8645263314247131,
                            -0.08990409970283508,
                            0.49448105692863464,
                            -7.932243824005127],
                           [0.16451121866703033,
                            -0.879048764705658,
                            -0.44744759798049927,
                            1.223649024963379],
                           [0.4749003052711487,
                            0.4681779146194458,
                            -0.745170533657074,
                            19.247390747070312],
                           [0.0, 0.0, 0.0, 1.0]],
    'camera_poses': [2.549196720123291,
                     0.05451478064060211,
                     0.7083187103271484,
                     -7.932243824005127,
                     1.223649024963379,
                     19.247390747070312],
    'camera_trace': [-7.932243824005127, 1.223649024963379, 19.247390747070312],
    'timestamp': 230.674514,
    'topic': 'head_pose'
}
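
A minimal Python subscription sketch for this topic (assuming Pupil Remote is running on its default port 50020):

import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote for the SUB port of the IPC backbone
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

# Subscribe to head pose data
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "head_pose")

topic, payload = sub.recv_multipart()
print(topic.decode(), msgpack.unpackb(payload))  # a dict like the dump above
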
user-6779be 15 May, 2020, 11:49:00

@user-c5fb8b Thanks will check it out

user-7daa32 15 May, 2020, 18:36:59

@user-c5fb8b @papr I can't find the user_settings files. Please, what do they look like? Thanks.

user-7daa32 18 May, 2020, 01:01:21

I have read that while calibrating we should try to keep the head still. Please, is this also what we should do during recording? I'm glad for the privilege I got as a member of this community. I wish to apologize in advance that I will be asking a lot of novice-like questions.

user-c5fb8b 18 May, 2020, 06:26:02

@user-7daa32, you can find the user_settings in: your home folder > pupil_player_settings > user_settings Regarding head movement: once you are calibrated, slippage of the headset can reduce the accuracy of the gaze prediction. As you can read in the docs, this effect is minimized when using the 3D detection and mapping pipeline, but you will still get reduced accuracy if the accumulated slippage becomes too large. No need to apologize, this is the place to ask your questions! 🙂

user-6e9a97 18 May, 2020, 17:26:16

Hi, I am having a problem from the very beginning, namely I cannot find a proper position for the Core headset on my face. Is there any tutorial on that?

wrp 19 May, 2020, 02:16:46

Hi @user-6e9a97 Have you already checked: https://docs.pupil-labs.com/core/#_3-check-pupil-detection ?

wrp 19 May, 2020, 02:17:23

Are you familiar with all the adjustments that the eye cameras can make? https://docs.pupil-labs.com/core/hardware/#headset-adjustments ?

wrp 19 May, 2020, 02:18:08

If you are able to share a small video of your eye or screenshot, we might be able to provide some suggestions on how to improve the setup 😸

user-7fa523 19 May, 2020, 06:16:56

Hello, I am trying to do positional tracking with the world camera of the Pupil Core eye tracker. In order to evaluate my results accurately, I would need the exact projection center. Are there any dimensions I could possibly use for this? Any help would be much appreciated.

user-c5fb8b 19 May, 2020, 06:43:40

Hi @user-7fa523, did you take a look at the head pose tracking plugin? Maybe this is already something similar to what you need? Otherwise, what do you mean by projection center? What's your setup and what exactly do you track?

user-7fa523 19 May, 2020, 07:40:28

@user-c5fb8b Thank you very much for the hint about the plugin. This is also interesting for me, but currently not exactly what I am looking for. My setup: I use a second camera with depth information and estimate the head pose through feature matching in both frames (depth camera and world camera of the eye tracker). I want to evaluate my results with an external camera system (Krypton K600). I track the origin of the world camera's coordinate system, assuming the pinhole camera model. So by projection center I mean that origin. In reality, I suppose it would be on the image sensor. I am interested in physical measurements so that I can define from the outside where the center is inside the camera.

papr 19 May, 2020, 07:42:49

@user-7fa523 What you are looking for are the camera intrinsics. Check out our prerecorded camera intrinsics here: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/camera_models.py#L27-L76

If you want more accurate results it is recommended to run the camera intrinsics estimation procedure for your camera https://docs.pupil-labs.com/core/software/pupil-capture/#camera-intrinsics-estimation

user-499cde 19 May, 2020, 14:35:08

Hello @papr , I had a question regarding the specifications of the Pupil Core cameras. Does the Pupil Core eye camera have a global or rolling shutter?

papr 19 May, 2020, 14:37:00

@user-499cde The 200hz eye cams use a global shutter, as far as I know.

user-499cde 19 May, 2020, 14:40:02

yes! Thank you for the information

user-7fa523 19 May, 2020, 15:01:06

Thank you @papr! I did a camera intrinsic estimation procedure. But this gives me the measurements in pixel units. I need the position of the optical center in metric units. The camera in question is “Pupil Cam1 ID2”. The physical measurements of the sensor would be helpful to transform the values of the intrinsic camera matrix. Do you have any information about that? However, this calculation could become imprecise. Do you maybe have any measured values or a data sheet for this camera in which the position of the optical center is described as a physical value?
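
To illustrate what I mean, here is a toy example of the conversion I have in mind (the pixel pitch and intrinsic values below are made-up placeholders, not measurements of this sensor):

import numpy as np

# Placeholder pixel pitch; the real value would have to come
# from the sensor's data sheet.
pixel_pitch_mm = 0.003

# Intrinsic matrix in pixel units (placeholder values).
K_px = np.array([
    [830.0,   0.0, 640.0],
    [  0.0, 830.0, 360.0],
    [  0.0,   0.0,   1.0],
])

# Focal length and principal point in mm on the sensor.
f_mm = K_px[0, 0] * pixel_pitch_mm
c_mm = K_px[:2, 2] * pixel_pitch_mm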

user-b11159 20 May, 2020, 06:51:03

Hey guys, I am launching pupil_capture from the terminal, but every time I try to select one of the eye cams, I encounter this error message: ImportError: /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.22' not found (required by /opt/pupil_capture/pupil_detectors/detector_3d/detector_3d.cpython-36m-x86_64-linux-gnu.so). Could anybody help me out here? I am really stuck because of this and do not know what causes it; I thought it might be a problem with the version!?

papr 20 May, 2020, 07:00:59

@user-b11159 Hey, which Ubuntu version are you running?

user-b11159 20 May, 2020, 07:24:26

@papr that would be Ubuntu 16.04.

papr 20 May, 2020, 07:49:25

@user-b11159 Could you try running this: sudo apt-get install libstdc++6

user-b11159 20 May, 2020, 08:05:02

@papr yes I tried that; it reports that libstdc++6 is already the newest version (5.4.0-6ubuntu1~16.04.12). And it still crashes with the same error.

papr 20 May, 2020, 08:06:03

@user-b11159 maybe you can try to reinstall. It looks like one of its dependencies is not correctly installed.

user-b11159 20 May, 2020, 08:53:11

@papr I have reinstalled the whole thing now, but it still throws the same error at me.

papr 20 May, 2020, 08:59:21

@user-b11159 What is your output of sudo apt show libstdc++6?

user-b11159 20 May, 2020, 09:42:30

@papr That would be the content of this file

Output.txt

user-ab28f5 20 May, 2020, 11:33:17

Sorry, I want to ask a question~

I have a problem! When I open Pupil Player, it shows something like this.

What happened?

Chat image

user-370594 20 May, 2020, 14:19:57

Hi! I don't see the plugin to count eye movements (fixations, saccades, etc.); I only have the fixation detector here. I'm sure I had it on my previous laptop. Has anything changed in the new version of the software?

Chat image

user-c5fb8b 20 May, 2020, 14:25:59

Hi @user-ab28f5, please try resetting your user settings: go to your home folder > pupil_capture_settings and delete all user_settings files

user-c5fb8b 20 May, 2020, 14:27:54

Hi @user-370594, we found that the external library we used for eye movement classification did not perform well enough to be useful, so we removed it from Pupil Player and Capture in v1.17.

user-370594 20 May, 2020, 14:28:43

Ok, I see. Thanks!

user-ab28f5 20 May, 2020, 14:50:42

@user-c5fb8b ok, I see thank you

user-7daa32 20 May, 2020, 20:14:38

Hello Scholars

I am having trouble understanding the meaning of most terminologies used in Pupil Labs Core technology, most of which always appear here. I read about them on the website and still don't understand; I also need a scaffolding tutor to understand these terms. In addition, I need help on how to get pupil dilation and gaze mapping data.

user-7daa32 20 May, 2020, 20:19:01

We are doing chemistry education research, and it involves asking students to solve chemical problems while looking at chemistry concepts. We need fixation data, scan paths, pupil dilation and heatmaps.

user-1a449f 21 May, 2020, 06:11:44

Hello. I would like to know if the pupil core works as a marketing research tool

user-6e9a97 21 May, 2020, 19:29:57

Hi @user-6e9a97 Have you already checked: https://docs.pupil-labs.com/core/#_3-check-pupil-detection ? @wrp Dear wrp, thanks for your reply. Here's a link with some pics of my setup... this way I'm able to calibrate with about 30% of missing data due to pupil < 1; however, when I visually inspect my recordings the precision appears to be very low: https://photos.app.goo.gl/z6EdB9SUVj6LXWjk8 I've followed the instructions provided on your website, but I still don't feel comfortable with data recording. Thank you in advance for any hints!

user-a10852 21 May, 2020, 21:58:44

Hello, I'm trying to create a recording from an external eye video file. I wasn't successful when I generated evenly spaced timestamps (the video file's frame rate is constant), so I'm just wondering how the timestamps are usually generated? When looking at some recordings from my Pupil Core, I notice there is a difference between the timestamps and the frames in the video file, but I'm not sure what the significance is. Thanks!

user-ec60c7 22 May, 2020, 01:59:49

Hi, I am a freshman. When I follow the steps at https://github.com/pupil-labs/pupil/blob/master/docs/dependencies-windows.md, I get:

C:\Users\Administrator>pip install git+https://github.com/zeromq/pyre
Collecting git+https://github.com/zeromq/pyre
  Cloning https://github.com/zeromq/pyre to c:\users\admini~1\appdata\local\temp\pip-req-build-fr2df3zp
  Running command git clone -q https://github.com/zeromq/pyre 'C:\Users\ADMINI~1\AppData\Local\Temp\pip-req-build-fr2df3zp'
ERROR: Error [WinError 2] The system cannot find the file specified. while executing command git clone -q https://github.com/zeromq/pyre 'C:\Users\ADMINI~1\AppData\Local\Temp\pip-req-build-fr2df3zp'
ERROR: Cannot find command 'git' - do you have 'git' installed and in your PATH?

I don't know what's wrong or what to do next. Can someone help me?

user-7daa32 22 May, 2020, 07:05:46

Please, I would like to ask this: I have data in my recording folder; how can one analyze it?

My primary goal is to be able to record people looking at different items and develop a heatmap from their gaze patterns. A secondary goal is to be able to record and graph pupil diameter during the above process.

user-c5fb8b 25 May, 2020, 06:25:01

@user-7daa32 regarding your previous message:

I am having trouble understanding the meaning of most terminologies used in Pupil Labs Core technology, most of which always appear here. I read about them on the website and still don't understand. Also, I need a scaffolding tutor to understand these terms, and help on how to get pupil dilation and gaze mapping data. We are doing chemistry education research, and it involves asking students to solve chemical problems while looking at chemistry concepts. We need fixation data, scan paths, pupil dilation and heatmaps.

Which terms specifically are unclear to you? Please feel free to always ask here if anything is unclear! Regarding the points you mentioned specifically:

- fixations: you can enable the Fixation Detector plugin to work with fixations.
- scan path: Pupil previously had a scan path plugin using third-party technology, which unfortunately performed rather poorly, so we removed it again. Since v1.22 we offer a gaze history visualization that works much better. You can enable it in the Vis Polyline plugin.
- pupil dilation: when using the 2D pipeline you get the pupil diameter in pixels (of the eye image), which can be useful for relative comparisons. If you need mm, you have to use the 3D pipeline. Please read up on the trade-offs between 2D and 3D in https://docs.pupil-labs.com/core/best-practices/#choose-the-right-pipeline
- heatmaps: you can automatically generate gaze heatmaps with the Surface Tracker plugin. This assumes that you have an area of interest marked with apriltag markers. Please see: https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking

Also please note that we expect very basic knowledge about eye tracking from our users. We can assist with technical or software-related questions, but if you need a complete guide on study/experiment design, we also offer dedicated support packages: https://pupil-labs.com/products/support/

user-c5fb8b 25 May, 2020, 06:27:47

@user-7daa32 Regarding you second message:

Please I will like to ask this. I got data in my recording file, how can one analyze them?

My primary goal is to be able to record people looking at different items and developing a heat map from their gaze patterns. A secondary goal is to be able to record and graph the pupil diameter information during the above process.

You can open the recording in Pupil Player for common analysis tasks. You can export the data from all enabled plugins with the export function. You will see an exports folder inside the recording's folder. The data is in CSV format, which you can e.g. open with Microsoft Excel or any other data analysis tool. Please read: https://docs.pupil-labs.com/core/software/pupil-player/#export

Please see my previous comment on notes about heatmaps.
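
For the pupil diameter graphing part, here is a minimal sketch of how you could process the exported CSV (assuming pandas and matplotlib are installed; the path and column names follow recent Pupil Player exports, so please double-check them against your own export):

import pandas as pd
import matplotlib.pyplot as plt

# Example path; adjust to your recording's exports folder.
df = pd.read_csv("recording/exports/000/pupil_positions.csv")

# Keep 3d-pipeline data, which includes the diameter in mm.
df_3d = df[df["method"].str.contains("3d")]

plt.plot(df_3d["pupil_timestamp"], df_3d["diameter_3d"])
plt.xlabel("pupil timestamp [s]")
plt.ylabel("pupil diameter [mm]")
plt.show()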

user-c5fb8b 25 May, 2020, 06:37:40

Hi @user-a10852, what version of Pupil are you using? In older versions, the eye videos were written with fixed intervals between frames, which did not necessarily correspond to the real timing (so video frame time intervals did not correspond to the intervals in the timestamp file). Since v1.16 the eye videos are recorded with the same timing as the underlying timestamps. Depending on your operating system, the timestamps are either generated directly on the hardware at the moment of frame capture, or in software upon receiving the data, with some additional logic to compensate for transmission delays.

Either way you should be fine with mocking a timestamps file with evenly spaced timestamps. How are you generating the file? What exactly does not work?

user-c5fb8b 25 May, 2020, 06:39:58

Hi @user-ec60c7, you are following the instructions for running Pupil from source. I'd recommend you try running our pre-built bundles instead. Please see the download button at the very top of: https://docs.pupil-labs.com/core/ You only need to run Pupil from source if you want to make modifications to the Pupil source code.

user-b11159 25 May, 2020, 06:40:41

Hello guys, I posted a question here on Wednesday and I am still stuck with it 😦 Every time I launch pupil_capture from the terminal and try to select one of the eye cams, I encounter this error message: ImportError: /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.22' not found (required by /opt/pupil_capture/pupil_detectors/detector_3d/detector_3d.cpython-36m-x86_64-linux-gnu.so)

user-b11159 25 May, 2020, 06:41:09

I tried reinstalling as @papr suggested, but that did not do the trick ...

user-b11159 25 May, 2020, 06:41:58

Next I checked the output of sudo apt show libstdc++6 also as @papr suggested

user-b11159 25 May, 2020, 06:42:39

The output was this very long terminal message:

user-b11159 25 May, 2020, 06:43:07

message.txt

user-b11159 25 May, 2020, 06:44:02

But that is as far as I have gotten so far; I am still stuck. Could somebody please look into this again?

user-c5fb8b 25 May, 2020, 06:45:38

@user-b11159 can you try running sudo apt show -a libstdc++6

user-c5fb8b 25 May, 2020, 07:02:52

@user-b11159 also please try upgrading your gcc version with sudo apt update && sudo apt install gcc

user-b11159 25 May, 2020, 07:07:58

@user-b11159 can you try running sudo apt show -a libstdc++6 @user-c5fb8b I did and this is the output:

user-b11159 25 May, 2020, 07:08:10

message.txt

user-b11159 25 May, 2020, 07:10:25

@user-b11159 also please try upgrading your gcc version with sudo apt update && sudo apt install gcc @user-c5fb8b I did, but it tells me that gcc is already up to date with the newest version

user-b11159 25 May, 2020, 07:11:27

In addition, I forgot to mention again that I am running Ubuntu 16.04.

user-c5fb8b 25 May, 2020, 07:15:45

@user-b11159 When googling quickly, I found a post where a user had a similar problem because his Anaconda installation used a different libstdc++ version than his system and interfered. Are you using Anaconda? Maybe an older version? Also please check if your system version of libstdc++ supports GLIBCXX_3.4.22 by listing the supported versions with: strings /usr/lib/x86_64-linux-gnu/libstdc++.so.6 | grep GLIBCXX

user-b11159 25 May, 2020, 08:21:21

@user-c5fb8b I used it in the past, but currently anaconda is not installed.

user-b11159 25 May, 2020, 08:21:46

I checked the output of the command, and it seems the support stops at 21

user-b11159 25 May, 2020, 08:21:56

The last lines are these:

user-b11159 25 May, 2020, 08:22:27

GLIBCXX_3.4.19 GLIBCXX_3.4.20 GLIBCXX_3.4.21 GLIBCXX_DEBUG_MESSAGE_LENGTH

user-c5fb8b 25 May, 2020, 08:23:27

@user-b11159 ok, so it's indeed the system version. Can you check your version of gcc? gcc --version

user-b11159 25 May, 2020, 08:24:52

That would be:

user-b11159 25 May, 2020, 08:24:58

gcc (Ubuntu 5.4.0-6ubuntu1~16.04.12) 5.4.0 20160609 Copyright (C) 2015 Free Software Foundation, Inc.

user-b11159 25 May, 2020, 08:25:15

@user-c5fb8b

papr 25 May, 2020, 08:30:16

@user-1a449f Hi, yes, it can be and has been used for this use case. You can look for previous publications on that topic on our website: https://pupil-labs.com/publications/

user-c5fb8b 25 May, 2020, 08:42:02

@user-b11159 It appears GLIBCXX_3.4.22 is only included in gcc 6 and newer. Please try installing gcc-6 with

sudo add-apt-repository ppa:ubuntu-toolchain-r/test
sudo apt update
sudo apt install gcc-6
papr 25 May, 2020, 08:43:38

Btw, we explicitly delete libstdc++.so from the bundle.

Otherwise NVIDIA OpenGL drivers will fail to load.

user-b11159 25 May, 2020, 09:01:40

@user-c5fb8b So I did run these commands to install gcc-6 and it seemed to work:

user-b11159 25 May, 2020, 09:02:21

[email removed] gcc --version
gcc (Ubuntu 5.4.0-6ubuntu1~16.04.12) 5.4.0 20160609
Copyright (C) 2015 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

[email removed] gcc-6 --version
gcc-6 (Ubuntu 6.5.0-2ubuntu1~16.04) 6.5.0 20181026
Copyright (C) 2017 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

user-b11159 25 May, 2020, 09:03:26

But when trying to select one of the eye cameras for pupil detection, the process still crashes with the same error as before :/

user-b11159 25 May, 2020, 09:04:03

eye0 - [ERROR] launchables.eye: Process Eye0 crashed with trace:
Traceback (most recent call last):
  File "launchables/eye.py", line 150, in eye
  File "/home/pupil-labs/.pyenv/versions/3.6.0/envs/general/lib/python3.6/site-packages/PyInstaller/loader/pyimod03_importers.py", line 627, in exec_module
  File "shared_modules/pupil_detector_plugins/__init__.py", line 14, in <module>
  File "/home/pupil-labs/.pyenv/versions/3.6.0/envs/general/lib/python3.6/site-packages/PyInstaller/loader/pyimod03_importers.py", line 627, in exec_module
  File "shared_modules/pupil_detector_plugins/detector_2d_plugin.py", line 13, in <module>
  File "/home/pupil-labs/.pyenv/versions/3.6.0/envs/general/lib/python3.6/site-packages/PyInstaller/loader/pyimod03_importers.py", line 627, in exec_module
  File "pupil_detectors/__init__.py", line 28, in <module>
  File "/home/pupil-labs/.pyenv/versions/3.6.0/envs/general/lib/python3.6/site-packages/PyInstaller/loader/pyimod03_importers.py", line 627, in exec_module
  File "pupil_detectors/detector_3d/__init__.py", line 12, in <module>
ImportError: /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.22' not found (required by /opt/pupil_capture/pupil_detectors/detector_3d/detector_3d.cpython-36m-x86_64-linux-gnu.so)

user-c5fb8b 25 May, 2020, 09:05:51

@user-b11159 What's the output of strings /usr/lib/x86_64-linux-gnu/libstdc++.so.6 | grep GLIBCXX now?

user-b11159 25 May, 2020, 09:15:17

@user-c5fb8b It seems to be still the same:

user-b11159 25 May, 2020, 09:15:30

GLIBCXX_3.4 GLIBCXX_3.4.1 GLIBCXX_3.4.2 GLIBCXX_3.4.3 GLIBCXX_3.4.4 GLIBCXX_3.4.5 GLIBCXX_3.4.6 GLIBCXX_3.4.7 GLIBCXX_3.4.8 GLIBCXX_3.4.9 GLIBCXX_3.4.10 GLIBCXX_3.4.11 GLIBCXX_3.4.12 GLIBCXX_3.4.13 GLIBCXX_3.4.14 GLIBCXX_3.4.15 GLIBCXX_3.4.16 GLIBCXX_3.4.17 GLIBCXX_3.4.18 GLIBCXX_3.4.19 GLIBCXX_3.4.20 GLIBCXX_3.4.21 GLIBCXX_DEBUG_MESSAGE_LENGTH

papr 25 May, 2020, 09:22:02

Could you please reboot and check if it is still the same?

user-b11159 25 May, 2020, 09:29:30

Yes I did a reboot and it is still the same 😦

user-b11159 25 May, 2020, 09:29:55

[email removed] strings /usr/lib/x86_64-linux-gnu/libstdc++.so.6 | grep GLIBCXX GLIBCXX_3.4 GLIBCXX_3.4.1 GLIBCXX_3.4.2 GLIBCXX_3.4.3 GLIBCXX_3.4.4 GLIBCXX_3.4.5 GLIBCXX_3.4.6 GLIBCXX_3.4.7 GLIBCXX_3.4.8 GLIBCXX_3.4.9 GLIBCXX_3.4.10 GLIBCXX_3.4.11 GLIBCXX_3.4.12 GLIBCXX_3.4.13 GLIBCXX_3.4.14 GLIBCXX_3.4.15 GLIBCXX_3.4.16 GLIBCXX_3.4.17 GLIBCXX_3.4.18 GLIBCXX_3.4.19 GLIBCXX_3.4.20 GLIBCXX_3.4.21 GLIBCXX_DEBUG_MESSAGE_LENGTH

user-b11159 25 May, 2020, 09:31:21

So does that mean it is just not compatible with my current setup, because support for the 22 version is missing?

user-c5fb8b 25 May, 2020, 09:34:27

@user-b11159 No, you should be able to upgrade to the correct version of libstdc++. Sorry for poking around in the dark here, but I don't have an Ubuntu 16 available to test how to do this properly. I suspect you might have to upgrade libstdc++ again as well. Please try sudo apt install libstdc++6 once more, now that you have the correct gcc available.

user-b11159 25 May, 2020, 11:05:35

Thank you very much, that resolved the issue 🙂 !

user-c5fb8b 25 May, 2020, 11:06:48

@user-b11159 great to hear! Sorry for the trouble, normally you should be able to run the bundle without any setup steps. We will look into why this was necessary!

user-a10852 25 May, 2020, 20:32:00

Thanks for the response @user-c5fb8b, I'm generating an evenly-spaced timestamps file with:

eye_timestamps = np.arange(0, eye_vid.duration, f)
np.save("eye0_timestamps.npy", eye_timestamps)

I'm able to open the newly created folder in Pupil Player; however, the video does not play correctly (the FPS is changed to 30 as well) and I receive the error notification: "Advancing frame iterator went past the target frame".

user-c5fb8b 26 May, 2020, 06:36:09

@user-a10852 Ah, it's very important that the number of timestamps matches the number of frames in the video; every frame has a timestamp, basically. We don't do any resampling or similar. I assume the combination of your eye video duration and f (probably your fixed frame rate?) does not yield exactly the same number of frames? Please check that. Also, Pupil Player will generate a lookup table for faster access upon loading the video for the first time. You will probably have to delete it (eye0_lookup.npy) after you modify your timestamps file in order for the changes to have any effect.
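
For example, a sketch of how you could generate a timestamps file that is guaranteed to match the frame count (this counts frames with PyAV; the filenames and frame rate value are just examples):

import av
import numpy as np

container = av.open("eye0.mp4")
stream = container.streams.video[0]

# Count frames by decoding them; stream.frames is not always reliable.
num_frames = sum(1 for _ in container.decode(stream))

fps = 120.0  # example: the video's fixed frame rate
timestamps = np.arange(num_frames, dtype=np.float64) / fps
np.save("eye0_timestamps.npy", timestamps)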

user-141bcd 26 May, 2020, 10:22:02

Hi, for some of my recordings I get a ValueError : Each element in 'data' requires a corresponding timestamp in 'data_ts' (Traceback below) when trying to feed the folder to Pupil Player. Any ideas on what went wrong and whether this is fixable post-hoc? The data folder can be downloaded here (170MB): https://keeper.mpdl.mpg.de/d/167b191b2dd3473f9e69/

papr 26 May, 2020, 10:22:44

@user-141bcd which version of Pupil Player are you using?

user-141bcd 26 May, 2020, 10:23:33

@papr v 1.22

papr 26 May, 2020, 10:23:45

Thank you.

papr 26 May, 2020, 10:24:20

@user-141bcd We will have a look and come back to you in this regard.

user-141bcd 26 May, 2020, 10:24:20

this is the traceback:

Chat image

user-141bcd 26 May, 2020, 10:25:03

data was originally recorded with hmd-eyes + pupil capture

user-141bcd 26 May, 2020, 10:27:26

There are no problems with the majority of the recordings. I had it now for the 2nd data folder of ~200 that I processed this far. @papr thanks already!

papr 26 May, 2020, 10:27:40

Thank you for providing this information. This is very helpful!

papr 26 May, 2020, 11:54:00

@user-141bcd I cannot reproduce the issue with your recording. I called the corresponding code manually on the gaze files instead of opening the complete recording. Could you please check that you did not upload the wrong recording by accident? If it was incorrect, you only need to upload the gaze.* files. If it was correct, I will download the complete recording and give it another try.

user-141bcd 26 May, 2020, 14:19:13

@papr pardon! When checking back, I realized that my local copy of the gaze.pldata file must have been corrupted when I originally downloaded it from our data server/cloud. The files I shared with you are the ones on the server (there the .pldata is about twice the size, with everything else being equal). After downloading again it works perfectly fine. I totally didn't see that this was a possibility. Thanks for looking into it and sorry for wasting your time!

papr 26 May, 2020, 14:41:19

Don't worry. I am happy to hear that the problem could be fixed that easily :)

user-a10852 26 May, 2020, 17:15:01

Hi @user-c5fb8b , The timestamp file and the video file have the same number of frames (and the video file has the same fixed framerate "f"). In Pupil Player (v1.19), I'm still getting the same error message, with the video not playing correctly, and I notice that the time range and index range do not match those of the source eye video file. Thanks again

user-c5fb8b 27 May, 2020, 06:19:04

@user-a10852 If you open it in Player... do you have an existing recording in which you replaced the eye video, or are you mocking up an entire recording?

user-c5fb8b 27 May, 2020, 06:51:38

@user-a10852 Also, when you say "the time ranges and index range do not match...", are you referring to the ranges of the world video? I feel like there might be some confusion, as Pupil Player always operates relative to the world video. The full time range being displayed will be the time range of the world video. If your generated timestamps for the eye video lie outside of that range, you won't be able to see your eye video, since all of its frames will be squashed into the first or last world frame.

papr 27 May, 2020, 06:54:16

Alternatively, you can delete all world files and Player will try to generate artificial world timestamps based on the available eye timestamps.

user-a10852 27 May, 2020, 18:23:10

@user-c5fb8b, I am mocking up an entire recording with just an eye video (so there is no world video). Sorry for any confusion; by "time range and [frame] index range" I mean what I see in Pupil Player (e.g. the video playback bar/scrubber and the time range export settings), which doesn't match the frames of the source eye video and timestamp file.

user-7d0b66 28 May, 2020, 00:48:41

Hi, I'm trying to generate a heatmap in Pupil Capture, but I keep getting the "cannot add a new surface: no markers found in the image" error message, even though I have placed markers in each corner of my screen. It doesn't seem to recognize them, though I managed to generate a heatmap a couple of weeks ago following the exact same steps, as far as I can remember, and it was actually pretty quick; only this time it's not working. Any ideas what might be happening?

user-c5fb8b 28 May, 2020, 06:29:47

@user-a10852 did you mock an info file for the recording? E.g. info.player.json? Can you post the content?

user-c5fb8b 28 May, 2020, 06:31:18

@user-7d0b66 are the markers recognized? They should have a green transparent overlay. Maybe they are too small/too far away from the camera? Can you post a screenshot of your view in Capture with the markers visible?

user-a10852 28 May, 2020, 07:03:13

@user-c5fb8b here is the mocked info file

info.player.json

user-a10852 28 May, 2020, 07:03:38

and the timestamps

eye0_timestamps.npy

user-c5fb8b 28 May, 2020, 07:10:34

@user-a10852 hm, from what I can see this should work. Maybe you encountered a bug in Player here. Can you please share the entire recording folder (including the eye video) with [email removed] For example via google drive or any other file sharing service.

user-b11159 28 May, 2020, 08:52:36

Hello guys, I would like to convert image coordinates that I am retrieving from a "gaze on surface" detection into world coordinates. The easiest way to do this would be using the extrinsic/intrinsic parameters of the world camera, I think. I am not sure, however, how to obtain them. Could you please tell me how to obtain them after calibration?

user-670c64 28 May, 2020, 10:10:43

Hi, a question about using the Core with an Android smartphone: I calibrate the Core connected to the PC using Capture. Then I record with the Core connected to the Android smartphone using the Mobile app. After copying the data to the PC and opening it in Player, does it use the last calibration automatically? Thanks!

user-7d0b66 28 May, 2020, 13:25:16

@user-c5fb8b here it is

Chat image

papr 28 May, 2020, 13:27:29

@user-b11159 You can also just subscribe to gaze and gaze on surface, and match them based on their timestamps; normal gaze is already in world coordinates. Alternatively, you can map the gaze back into world coordinates using the transform matrices that come with each surface detection.
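
For the second route, a rough sketch of how the back-projection could look, assuming the surface datum carries a surf_to_img_trans homography (as recent Pupil versions publish) and that numpy and opencv-python are installed:

import numpy as np
import cv2

def surface_to_image(gaze_on_surface, surface_datum):
    # Normalized surface coordinates of the gaze point.
    x, y = gaze_on_surface["norm_pos"]
    point = np.array([[[x, y]]], dtype=np.float32)

    # 3x3 homography published with each surface detection.
    trans = np.array(surface_datum["surf_to_img_trans"], dtype=np.float32)

    # Returns pixel coordinates in the (undistorted) world image.
    return cv2.perspectiveTransform(point, trans)[0, 0]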

user-7d0b66 28 May, 2020, 13:32:27

Oh and for the record, I'm using the markers that were made available on the company's website.

user-c5fb8b 28 May, 2020, 13:38:26

@user-7d0b66 As I assumed, these are a bit too small. When you get closer to the markers, you will notice that they get a green transparent overlay in Capture, signaling that they were detected. I recommend you print them a bit larger. You will have to experiment with how large they need to be to be recognized consistently in your setup. We also added another PDF to our docs that shows only a single marker per page. You can use this to print the markers as big as you need them (e.g. by printing only 6 per page or similar). Here's the link: https://github.com/pupil-labs/pupil-helpers/blob/master/markers_stickersheet/tag36h11_full.pdf?raw=True

Please also always make sure that you have a large enough white border around the marker. The width of the white border should be about twice the width of one of the blocks/pixels of the apriltag. Otherwise detection might also be less stable.

user-a10852 28 May, 2020, 15:49:21

@user-c5fb8b, sent! Thanks

user-c5fb8b 28 May, 2020, 15:51:28

@user-a10852 I got it, I'll come back to you tomorrow!

user-a10852 28 May, 2020, 15:53:17

@PFA great, thanks!

user-7d0b66 29 May, 2020, 01:46:21

@user-c5fb8b Thanks! Changing the size of the markers worked (partially). I'm still having some trouble generating the heatmap, as the detection of the markers seems to be very unstable; it goes on and off repeatedly. Any other tips on how I can fix this? Here's how it looks now:

Chat image

mpk 29 May, 2020, 06:55:52

@user-7d0b66 Also, you can't have the markers show up again in the image of the world on the screen. This 'echo' breaks the tracking.

user-c5fb8b 29 May, 2020, 08:14:00

@user-a10852 I identified the problem: Your eye video was not recorded with Pupil and has an incompatible encoding. You might be able to fix this by converting or re-encoding it. Potentially you might already be able to solve it by converting it from H.264 to something like MJPEG. The specific problem is that Pupil can only handle streams where packet PTS equal frame PTS, which is not necessarily the case for H.264 encoded videos. While we also work with H.264 in some places, we ensure that these videos get created with matching PTS.

Additionally I noticed that your eye video is very large (500x700 px or something). If you intend to use our offline analysis tools, you will have to shrink the image, as the pipeline is only optimized for images of max. 400x400 pixels.
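
If it helps, here is a rough, untested sketch of such a conversion with PyAV; the MJPEG codec, yuvj422p pixel format, 400x400 target size, frame rate and filenames are all assumptions you may need to adapt:

import av

inp = av.open("eye0_original.mp4")
in_stream = inp.streams.video[0]

out = av.open("eye0.mp4", "w")
out_stream = out.add_stream("mjpeg", rate=120)
out_stream.width = 400
out_stream.height = 400
out_stream.pix_fmt = "yuvj422p"

for frame in inp.decode(in_stream):
    # Shrink to a supported size; let the encoder assign fresh PTS.
    small = frame.reformat(width=400, height=400)
    small.pts = None
    for packet in out_stream.encode(small):
        out.mux(packet)

# Flush any buffered packets.
for packet in out_stream.encode():
    out.mux(packet)

out.close()
inp.close()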

user-fa8a06 29 May, 2020, 08:56:20

Hi, I'm totally new to Pupil Core (it arrived yesterday). I have some trouble with pupil detection: in every position of the eye cameras, I fail to get my eyes (pupils) centered in the image at a good resolution and focus.

user-fa8a06 29 May, 2020, 08:57:03

I already tried to change some settings but it didn't work

user-c5fb8b 29 May, 2020, 09:01:18

Hi @user-fa8a06 can you share a screenshot of the eye video?

user-fa8a06 29 May, 2020, 09:06:05

Chat image

user-fa8a06 29 May, 2020, 09:06:36

Chat image

papr 29 May, 2020, 09:10:05

@user-fa8a06 Have you seen this part of the docs yet? https://docs.pupil-labs.com/core/hardware/#headset-adjustments

user-fa8a06 29 May, 2020, 09:11:52

Yes, I've tried all the possible configurations, with and without the extension arms. The current screenshot is the best image I've obtained, with the camera arms extended to the maximum.

papr 29 May, 2020, 09:13:52

You should be able to rotate each eye camera upwards along its long axis

wrp 29 May, 2020, 09:14:42

(a twisting motion)

user-c5fb8b 29 May, 2020, 09:16:14

@user-fa8a06 The video Rotate Eye Camera shows possible motions that you can do: https://docs.pupil-labs.com/core/hardware/#rotate-eye-camera

user-fa8a06 29 May, 2020, 09:19:00

Thank you for your answers. My Pupil Core is slightly different from the one shown in the videos, and the movement of the lenses is more limited. Is it a different version, or am I doing something wrong?

user-c5fb8b 29 May, 2020, 09:19:50

The headset in the video is an older model, but you can basically move, twist and turn the camera in any direction. Let me try to show an example:

papr 29 May, 2020, 09:20:39

We are also actively working on updating the animations for our newer models.

user-fa8a06 29 May, 2020, 09:27:14

ok, thank you. I'll try to rotate the cameras more than I already did. Is there a way to change the zoom of the cameras and/or their resolution?

user-c5fb8b 29 May, 2020, 09:28:28

@user-fa8a06 This is the movement that you have to do to center the pupil vertically in the image

user-c5fb8b 29 May, 2020, 09:31:14

You can choose between two different resolutions in the settings (you can see it in the second screenshot you sent). But you can also move the camera closer to or further away from the eye by sliding it: https://docs.pupil-labs.com/core/hardware/#slide-eye-camera

user-fa8a06 29 May, 2020, 09:38:48

now it's working quite well, thanks!

user-c5fb8b 29 May, 2020, 09:39:13

@user-fa8a06 glad we could help! 🙂

user-fa8a06 29 May, 2020, 09:50:28

Another question, if you can help me: I can't install the application on my macOS version 10.11.13. Is it a version problem?

papr 29 May, 2020, 09:55:27

@user-fa8a06 Yes, currently, the bundles are only supported on macOS 10.12 or higher. All future releases require macOS 10.13 or higher.

papr 29 May, 2020, 09:56:18

macOS 10.12 has reached its end-of-life and it has become too difficult for us to maintain the bundles on this and older versions of macOS.

user-7d0b66 29 May, 2020, 18:19:48

@user-7d0b66 also you can't have the markers show up again in the image of the world on the screen. This 'echo' breaks the tracking. @mpk So basically, in order for it to work properly, I'd have to already have an image displayed on my screen so that the markers can be tracked without echoing, is that correct? This might be a silly question, but how can I add an image onto it, then? I need to generate heatmaps for a specific set of images, but I'm not sure exactly how. Could you clarify this for me?

user-7daa32 29 May, 2020, 21:10:28

I can find the analysis plugins in Pupil Player. Do we really have those on the Pupil Core headset?

user-7daa32 30 May, 2020, 05:28:57

Please, why is the config graph covered with black? The markers blinked black.

Chat image

End of May archive