πŸ‘ core


user-f122fa 01 March, 2021, 07:59:02

Hi, is there a possibility to use Unity without using the HTC VIVE glasses? I want to make a simple arcade-like project and am looking for suitable algorithms; if you have any suggestions and tips, I will be more than glad to learn them.

user-f22600 01 March, 2021, 09:18:52

Can you tell me the command line argument to do that?

papr 01 March, 2021, 10:13:22

It is as simple as in your first message: <path to pupil_player.exe> <path to recording>

user-82b4f1 01 March, 2021, 09:42:59

Dear @papr, I explain again: I understood that in order to get my own recordings read by pupil_capture or pupil_service, I would need the corresponding _timestamps files. I made them, but they don't seem to be enough. I get warning messages in the console and in the main "world" window about "No camera intrinsics ... Loading dummy intrinsics ...; Consider running Intrinsic Estimation". In the smaller eye windows, it says it couldn't read either the file or the timestamps. How do I check that the video I supplied is in the correct format, and that the timestamps are not off by one (for a video of N frames, I made an N-long np.array in the timestamps file)? Do I need anything else (intrinsics, lookup, ???), and what should I put in those (if indeed possible)? Or am I wrong and shouldn't be trying to do it? Please just let me know if you are going to help. [At the moment I am reading the code about intrinsics and it seems not so complex, except I can't find the method file_method.save_object that is used to create the file.] I can post or pastebin one eye video and the corresponding timestamps if you want.

papr 01 March, 2021, 10:19:53

Regarding "for a video of N frames, I made an N-long np.array in the timestamps file": that is correct. Regarding "Do I need anything else": the lookup is generated automatically. The intrinsics are not required for playback; in the eye process they are used for accurate eye model fitting.

I feel like your approach is a workaround for something that I did not fully understand. Maybe you can elaborate on your goal and I can give a recommendation on the best approach?
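
For reference, a minimal sketch of creating such a timestamps file with numpy (frame count, frame rate, and file names here are made-up examples; Capture/Player expect one monotonically increasing timestamp per video frame):

import numpy as np

# Hypothetical example: an eye0.mp4 with 12000 frames recorded at 120 Hz
n_frames = 12000
fps = 120.0
timestamps = np.arange(n_frames) / fps  # seconds, one entry per frame

# Save next to the video as eye0_timestamps.npy
np.save("recording/eye0_timestamps.npy", timestamps)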

user-98789c 01 March, 2021, 13:43:24

Can we say that the duration of a "single" trip from MATLAB to Pupil Capture is almost half the "round" trip measured here?

roundtrip = tic; % Measure round trip delay
zmq.core.send(socket, uint8('t'));
result = zmq.core.recv(socket);
fprintf('%s\n', char(result));
fprintf('Round trip command delay: %s\n', toc(roundtrip))

papr 01 March, 2021, 13:58:02

Yes, but I recommend putting the toc as close to the recv as possible.

  roundtrip = tic; % Measure round trip delay
  zmq.core.send(socket, uint8('t'));
  result = zmq.core.recv(socket);
+ roundtrip = toc(roundtrip)
  fprintf('%s\n', char(result));
- fprintf('Round trip command delay: %s\n', toc(roundtrip))
+ fprintf('Round trip command delay: %s\n', roundtrip)

user-98789c 01 March, 2021, 15:48:02

I have changed my annotation timestamp into an array:

keySet = {'topic','label','timestamp','duration'};
topic = 'annotation.duration';
label = strcat('duration',num2str(t2));
timestamp = {(toc(seqtime)) (toc(trialtime) - singletrip)}
duration = 1;

Recording is done correctly, and annotations are sent and received correctly (I see them being generated correctly both on MATLAB and on Capture while recording), but Player won't open the recording folder; it goes into "not responding" mode. I tried it several times. Is there anything I can do to fix it?

user-82b4f1 01 March, 2021, 15:59:40

So @papr, here you go: my goal is to use Pupil as much as possible as it is (instead of reinventing the wheel), to get x,y timestamped eye tracking data and elaborate on them, without a world video (the person is following an external stimulus). So far I am confident that I can apply a script like this on recorded data like those you gave me in this message about apriltags, and I might play with those a bit. Sooner or later I will have files obtained via pupil_capture, or will be able to work in real time, but so far I only have a few recorded videos, and I am trying to see whether I can run them through pupil_capture or pupil_service in the meantime. And there I am stuck with the warning/error messages mentioned above. Here are data about my video files.

papr 01 March, 2021, 16:57:05

If you want, share an example eye video + timestamps to [email removed]. For now, you can ignore the intrinsics warnings. Please also specify the error/warning regarding the timestamps.

Regarding "and that the timestamps are not off by one": not sure what you meant by that.

user-82b4f1 01 March, 2021, 17:48:49

thank you, email sent.

user-f1dd79 01 March, 2021, 21:49:07

I think I am doing something wrong, but the Microsoft HD-6000 camera is not being recognized by Pupil Service ("local USB: camera disconnected!"). I ran it as administrator. I'm trying without doing the soldering, just to see if it can connect; not sure if this is the problem. I notice that when Capture is opened, the light on the camera does not turn on, so it seems like it is not being activated at all. Thanks for any help!

user-f1dd79 01 March, 2021, 23:16:14

Manual selection also does not work

user-f1dd79 01 March, 2021, 21:49:14

Chat image

user-f1dd79 01 March, 2021, 23:29:22

Found a pyuvc setup requirement - going to try that

user-f1dd79 01 March, 2021, 23:45:06

Worked!

user-94f03a 02 March, 2021, 04:42:40

Hi, when using Pupil Player with the annotations plugin, is there a way to display the annotation events in the timeline (a bit like how it looks with the surface markers)? Thanks!

papr 02 March, 2021, 09:27:03

Currently, this is not implemented.

user-98789c 02 March, 2021, 11:17:31

tried it also like this, by adding another key to the container.Map, instead of the array:

keySet = {'topic','label','sequencetimestamp','trialtimestamp','duration'};
topic = 'annotation.duration';
label = strcat('duration',num2str(t2));
sequencetimestamp = toc(seqtime) - singletrip;
trialtimestamp = toc(trialtime) - singletrip;
duration = 1;

This way, Pupil Capture also crashes and freezes.

Is it because my system is not fast/strong enough, or will Pupil Capture/Player just not work this way? Both ways, the annotations are correctly generated by MATLAB. The first way, Capture works fine and records, but Player crashes; the second way, Capture crashes as well.

papr 02 March, 2021, 13:03:49

I think it always requires a timestamp field, and if I interpret your code correctly, you replaced it with two different ones. I think this is why Player freezes. The extra fields should be supported by both Capture and Player.

user-98789c 02 March, 2021, 12:21:55

ok gave up and will send two sets of annotations instead of all in one!

user-98789c 02 March, 2021, 13:20:01

Any idea where this error suddenly comes from?

C:\Program Files (x86)\Pupil-Labs\Pupil v3.0.7\Pupil Capture v3.0.7\sklearn\base.py:334: UserWarning: Trying to unpickle estimator Pipeline from version 0.22.2.post1 when using version 0.23.1. This might lead to breaking code or invalid results. Use at your own risk.

papr 02 March, 2021, 13:22:05

It is not an error but a warning. The bundled sklearn version (0.23.1) is different from the version (0.22.2.post1) used to write some other bundled files. This is nothing to worry about in this case and has been fixed in our 3.1 release.

user-7daa32 03 March, 2021, 00:34:28

We should have a channel here for people to ask questions about coding.

papr 03 March, 2021, 08:16:24

We do have such a channel. Check out 💻 software-dev

user-94f03a 03 March, 2021, 01:53:11

Thanks. Is there any other way to correct a misplaced event? (pressing the wrong button)

papr 03 March, 2021, 10:37:45

At the moment, this is not implemented either.

user-1bfcbc 03 March, 2021, 11:55:26

Hi,

We are using the single marker choreography. The calibration marker included with the Pupil Core is sized for calibration distance at 1-2.5 meters. I was wondering if there is any information regarding changing the size for a smaller distance, say 0.5 meters.

nmt 03 March, 2021, 13:38:27

Hi @user-1bfcbc, the calibration marker will work at a distance of half a metre. As a general rule, try to calibrate at the viewing distance you will be assessing: https://docs.pupil-labs.com/core/best-practices/#calibrating

user-6cdb90 03 March, 2021, 21:43:32

Hi, I have a question regarding the fixation time and maximum dispersion. When the maximum dispersion is 1°, how can I find the area, as well as the circle, that includes all gaze locations in that specific fixation? Does the yellow fixation ring represent this area, or do I need to do some calculations to find the radius of this circle based on the maximum dispersion? And if that is the solution, how could I calculate the radius from the maximum dispersion without knowing the observer's distance to the objects? Any suggestions would be appreciated.

user-98789c 04 March, 2021, 10:33:38

Is there a reason that every time I open Pupil Player, I have to wait for the Fixations plugin to fully load?

Chat image

papr 04 March, 2021, 14:51:05

every time I open Pupil Player You mean every time you open a recording that had its fixations detected already, correct? This is not about unprocessed recordings, correct?

papr 04 March, 2021, 10:35:54

Recent Pupil Player versions should cache the fixation results once they were calculated successfully. Fixations are recalculated automatically if gaze changes, e.g. due to recalculating a post-hoc gaze mapping.

Which version of Pupil Player do you use?

user-98789c 04 March, 2021, 10:36:46

it's v3.0.7

user-6cdb90 04 March, 2021, 14:48:43

Hi, I have a question regarding the fixation time and maximum dispersion. When the maximum dispersion is 1°, how can I find the area, as well as the circle, that includes all gaze locations in that specific fixation? Does the yellow fixation ring represent this area, or do I need to do some calculations to find the radius of this circle based on the maximum dispersion? And if that is the solution, how could I calculate the radius from the maximum dispersion without knowing the observer's distance to the objects? Any suggestions would be appreciated.

papr 04 March, 2021, 14:56:25

Hey, apologies, I thought I had replied to your question already but it looks like I did not send it in the end. There is no need to repost the question. It is sufficient to reply to the existing message if you have the feeling that it was overlooked.

The yellow ring has a fixed size. The center is calculated as the average of the gaze locations that belong to the fixation. The maximum dispersion is not necessarily the radius (or rather, the diameter) of the circle. The dispersion is calculated as the maximum pairwise distance between the gaze samples belonging to the fixation.

The circle is just an approximation of the gaze distribution within the fixation; the distribution is not guaranteed to be even.

user-6cdb90 04 March, 2021, 15:01:10

So, I wonder how can I estimate the area including all fixations. Is there any way?

user-6cdb90 04 March, 2021, 14:59:13

Thank you very much for your response! Sure, I will reply to my message.

papr 04 March, 2021, 15:01:15

No worries. This was just meant as a recommendation should this happen again in the future.

papr 04 March, 2021, 15:02:17

You mean the gaze area of a single fixation, correct? Or do you want to calculate the area of all fixations?

user-6cdb90 04 March, 2021, 15:03:26

Yes, I mean the gaze area for each fixation.

papr 04 March, 2021, 15:15:09

Ok, I think there are two possible solutions with varying precisions. Both use the idea of fitting a geometric figure around the points (a proxy for the actual data) and calculating its area

1) Fit a simple geometric figure with a known area calculation method, e.g. a rectangle, ellipse, or circle, to the data (Note: the difference to your previous idea is that the fixation center is not necessarily the center of your fitted geometric figure)

2) Calculate the convex hull around the data points https://docs.scipy.org/doc/scipy/reference/generated/scipy.spatial.ConvexHull.html and access the area (the volume attribute in the case of scipy.spatial.ConvexHull)

I think I would recommend 2), as it fits tightly around your data points and there is an existing fitting function + area calculation. See the sketch below.
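
For illustration, a minimal sketch of option 2) with scipy (the gaze points are made up; in practice, take the gaze samples belonging to one fixation):

import numpy as np
from scipy.spatial import ConvexHull

# Made-up gaze samples (normalized coordinates) belonging to one fixation
gaze_points = np.array([
    [0.48, 0.51],
    [0.50, 0.49],
    [0.52, 0.52],
    [0.49, 0.53],
    [0.51, 0.50],
])

hull = ConvexHull(gaze_points)
# For 2d input, ConvexHull.volume is the enclosed area
# (ConvexHull.area would be the perimeter)
print("fixation area:", hull.volume)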

user-6cdb90 04 March, 2021, 15:23:14

Thanks for the solutions. I will go with the second option. I also have a very basic question that has me confused: I could not understand the exact meaning of the dispersion angle as a metric for distance. I do not understand the meaning of 1° as a distance.

papr 04 March, 2021, 15:25:40

The fixation detector takes the 2d gaze points, unprojects them into 3d cyclopean viewing directions in the 3d camera coordinate system, and calculates the dispersion as the angle between pairs of these vectors.
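As a rough illustration of that last step (a sketch, not the detector's actual implementation), the dispersion of a set of viewing directions could be computed like this:

import numpy as np

def max_pairwise_angle_deg(directions):
    # directions: (N, 3) array of 3d viewing directions in camera coordinates
    unit = directions / np.linalg.norm(directions, axis=1, keepdims=True)
    cosines = np.clip(unit @ unit.T, -1.0, 1.0)  # pairwise dot products
    return np.degrees(np.arccos(cosines).max())  # largest pairwise angle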

user-6cdb90 04 March, 2021, 15:33:38

I got it. Thank you very much!

user-98789c 04 March, 2021, 17:30:43

no it actually happens for the recordings that I'm dragging into Player for the first time, right after recording, without prior processing

papr 04 March, 2021, 17:44:23

In this case, this is to be expected. The fixation detector processes the recorded gaze into fixations post hoc. It does not read the fixations recorded in real time.

user-6cdb90 04 March, 2021, 22:45:28

Hi! Currently, I am doing research including an experiment that I need to know the fixation time for objects in the scene; so, I have two main questions:

1- To know the fixation time for each object, I look at the fixation data and the video from Pupil Player at the same time to identify which fixation is related to which object. For example, fixation number 189 is on the screen, and then I accumulate all fixation times on the screen to know the total fixation time for the screen. Since it is a time-consuming process, I would appreciate it if you could give me some advice for an easier and more accurate fixation time calculation.

2- I need your help with fixation time calculations regarding the gaze mapping error. I attached two screenshots to this message. The first image is from the beginning of the experiment, where I asked the participant to focus on the center of the marker, and it can be seen that there is an error in the gaze mapping. The second image shows a sample fixation that is on the screen. Since I know there is an error, I am not sure that the participant is exactly looking at the screen; due to the error, he is probably looking at the painting and this is a fixation on the painting. So I am wondering how I could match the error with the fixation detection.

Chat image

user-6cdb90 05 March, 2021, 14:54:01

Hello, I would appreciate it if you could help me find the fixation time accurately. Thanks!

user-6cdb90 04 March, 2021, 22:45:30

Chat image

user-da621d 05 March, 2021, 08:22:13

what's wrong with my eye tracking hardware? How can I fix it

Chat image

papr 05 March, 2021, 09:51:23

It is possible that the drivers are no longer installed; Windows updates tend to purge them. Check out the Video Source menu, enable manual camera selection, and open the "Activate Camera" selector. Do you see any entries? Do they say "unknown" three times?

user-da621d 05 March, 2021, 08:27:11

pupil capture 0 window is black

user-da621d 05 March, 2021, 08:27:28

No image

papr 05 March, 2021, 09:52:03

I have seen that you posted the same questions via email to [email removed]. We will come back to you via email.

user-6cdb90 05 March, 2021, 16:17:18

Hi, sorry, I did not notice your message. I am waiting for your email. Thank you very much!

user-da621d 05 March, 2021, 10:04:06

I clicked "activate device" and no devices were found. What's wrong?

Chat image

papr 05 March, 2021, 10:05:54

Please enable manual camera selection below the selection menu and try again

user-da621d 05 March, 2021, 11:24:13

hello, do you know how to replace this cable?

Chat image

user-da621d 05 March, 2021, 11:25:10

I have a new cable, but I don't know how to change it

Chat image

papr 05 March, 2021, 11:26:50

Please contact [email removed] in this regard.

user-da621d 05 March, 2021, 11:28:06

Is this an email address?

papr 05 March, 2021, 11:28:23

Yes, please contact this address via email.

user-da621d 05 March, 2021, 11:29:12

thank you

user-7bd058 05 March, 2021, 15:44:57

Hello, I would appreciate if you could help me with this error

user-7bd058 05 March, 2021, 15:48:00

When I try to add surfaces, Pupil Player crashes. The videos are quite long, about half an hour, and marked surfaces are visible 90% of the time. I suspect this could be part of the problem. Note: "No markers found in the image" is just because I tried to add surfaces when the image had no markers in it, so this error is fine. But "process player crashed with trace" is the issue. Thank you for your effort.

Edit: The main problem was that I added the surfaces while the data was still on an external hard drive. Now, on the internal hard drive, it works at least without crashing, but it is still very slow. Every step takes a few seconds; for example, when I try to adjust the surface, I have to wait about 5 seconds until the step is saved and I can go on with other steps. My laptop has 16 GB RAM, an AMD Ryzen 7 4700U octa-core, and shared AMD Radeon RX Vega 7 graphics, so it's not the worst technical base.

Chat image

user-b772cc 06 March, 2021, 15:20:29

Hi , the Pupil Capture world video window failed to appear quite frequently. How do I fix this issue on Mac? Thank you

user-c22e3a 06 March, 2021, 17:06:20

Hi, so I'm using Pupil Core hardware for a real-world navigation experiment, and I was wondering: is there any direct SLAM implementation out there for tracking the camera position within the environment that takes into consideration the fisheye distortion of the Pupil Core? It might be worth mentioning that the environment is the same for all recordings, and there is also the possibility of having a simplified mesh of the environment, in case there are some particle filter implementations out there.

user-a98526 08 March, 2021, 01:37:30

Hi @papr, when I use Pupil Capture, there is a problem with the eye process and there is no eye image. This is the error message:

eye0 - [INFO] numexpr.utils: Note: NumExpr detected 12 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
eye0 - [INFO] numexpr.utils: NumExpr defaulting to 8 threads.
eye0 - [INFO] root: Refraction corrected 3D pupil detector not available
Running PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
Running PupilDrvInst.exe --vid 1443 --pid 37425
OPT: VID number 1443
OPT: PID number 37425
Running PupilDrvInst.exe --vid 1443 --pid 37426
OPT: VID number 1443
OPT: PID number 37426
Running PupilDrvInst.exe --vid 1133 --pid 2115
OPT: VID number 1133
OPT: PID number 2115
Running PupilDrvInst.exe --vid 6127 --pid 18447
OPT: VID number 6127
OPT: PID number 18447
Running PupilDrvInst.exe --vid 3141 --pid 25771
OPT: VID number 3141
OPT: PID number 25771
eye0 - [INFO] camera_models: No user calibration found for camera Pupil Cam2 ID0 at resolution (400, 400)
eye0 - [INFO] camera_models: No pre-recorded calibration available
eye0 - [WARNING] camera_models: Loading dummy calibration
eye0 - [WARNING] uvc: Could not set Value. 'Backlight Compensation'.
eye0 - [WARNING] launchables.eye: Process started.
Estimated / selected altsetting bandwith : 617 / 800.
!!!!Packets per transfer = 32 frameInterval = 83263

papr 08 March, 2021, 09:21:12

Please see this conversation for reference https://discord.com/channels/285728493612957698/285728493612957698/817333467124727810 What are responses to these questions in your case?

user-59c143 08 March, 2021, 07:35:45

My "old" Pupil Pro got a small modification today. Because I was concerned about damage to the free-hanging wires from the camera to the frame (which are constantly in contact with the human head), I made a new pair of adjustable sliders which now include a clip holding the wire in place. Excess wire is now kept outside the frame, away from the wearer.

Chat image

user-59c143 08 March, 2021, 07:35:50

Chat image

papr 08 March, 2021, 09:19:22

Could you try to reproduce the issue and share a copy of the ~/pupil_capture_settings/capture.log please? The window not appearing indicates an issue on startup. The log file should contain more information.

user-b772cc 10 March, 2021, 01:43:12

@papr thank you for your reply. I re-downloaded the software and it seems to work so far. I will share the log if the same issue resurfaces. Many thanks.

user-98789c 08 March, 2021, 11:12:54

How can this warning be resolved: "could not set value: Backlight Compensation"

papr 08 March, 2021, 11:17:55

Some cameras expose this UVC control but do not actually allow setting it. This warning does not affect the eye tracking and can be ignored.

user-98789c 08 March, 2021, 11:13:03

and, do you know of a plugin to reject blink/artifact data and interpolate the remaining data?

papr 08 March, 2021, 11:16:55

The blink detector calculates the blink sections but does not discard them from the raw data export. You need to remove rejected data during your post-processing pipeline. The same applies to data interpolation.
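
For example, a minimal post-processing sketch with pandas, assuming a Player raw data export containing pupil_positions.csv and blinks.csv (column names as in recent export versions; verify against your own export):

import pandas as pd

pupils = pd.read_csv("exports/000/pupil_positions.csv")
blinks = pd.read_csv("exports/000/blinks.csv")

# Reject samples that fall within detected blink intervals
for _, blink in blinks.iterrows():
    mask = pupils["pupil_timestamp"].between(
        blink["start_timestamp"], blink["end_timestamp"]
    )
    pupils.loc[mask, "diameter_3d"] = None

# Linearly interpolate over the rejected gaps
pupils["diameter_3d"] = pupils["diameter_3d"].interpolate()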

user-98789c 08 March, 2021, 11:23:02

Does the duration of annotations have any effect on the recorded data? How can the duration be used in data processing, anyway?

papr 08 March, 2021, 11:24:48

no, it is an optional field. You can add other optional fields when sending annotations remotely.

papr 08 March, 2021, 11:27:03

You could use it, for example, by sending a single annotation at the end of your trial that includes the duration of the trial. Another example of optional data is dynamically generated content, e.g. a user's key stroke. A sketch of this follows below.
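
A minimal Python sketch of sending such an annotation with extra fields over the network API (addresses, topic, and the extra field names are example assumptions; see the pupil-helpers remote_annotations.py script for the full version):

import time
import msgpack
import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")  # default Pupil Remote address

pupil_remote.send_string("PUB_PORT")  # query the publisher port
pub_port = pupil_remote.recv_string()
pupil_remote.send_string("t")  # query the current Pupil time
pupil_time = float(pupil_remote.recv_string())

pub_socket = ctx.socket(zmq.PUB)
pub_socket.connect(f"tcp://127.0.0.1:{pub_port}")
time.sleep(1.0)  # give the subscription a moment to register

# Required fields: topic, label, timestamp, duration.
# "trial" and "key" are freely chosen optional extras.
annotation = {
    "topic": "annotation.trial_end",
    "label": "trial_end",
    "timestamp": pupil_time,
    "duration": 30.0,
    "trial": 7,
    "key": "space",
}
pub_socket.send_string(annotation["topic"], flags=zmq.SNDMORE)
pub_socket.send(msgpack.packb(annotation, use_bin_type=True))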

user-98789c 08 March, 2021, 11:27:52

Thanks 👍

user-90a55d 08 March, 2021, 14:00:10

Hello, I want to synchronize the recording of pupil dilation with an audio file. The glasses are expected to start recording together with the beginning of the audio file. Currently, I have an uneven delay between the start of the glasses and the audio file (each time a different delay).

Has anyone done this before? Any recommendations on how to perform?

user-e91538 08 March, 2021, 14:07:40

You could assume the delay, and just sync everything on start with a "calibration" blink

user-90a55d 08 March, 2021, 14:11:00

I tried to insert a delay in the code, but each time it starts and stops the glasses' recording at different times, regardless of the delay I set in the code.

nmt 08 March, 2021, 14:47:28

@user-90a55d the cameras are controlled by different processes, and it is not guaranteed that all processes start recording at the exact same time. This is likely why you have a random delay. I assume that you are initiating your audio with a script that concurrently starts a Pupil Capture recording? If so, you could modify your script to begin a capture first, wait for several seconds, and then initiate the audio. You could also use the annotation plugin to make events in the recording that correspond to audio events. This script demonstrates how you can send remote annotations over the network: https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py
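
A sketch of that ordering over Pupil Remote (play_audio is a hypothetical placeholder for your playback code):

import time
import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

pupil_remote.send_string("R")  # start recording
print(pupil_remote.recv_string())

time.sleep(5.0)  # give all camera processes time to start recording

play_audio("stimulus.wav")  # hypothetical audio playback function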

user-90a55d 08 March, 2021, 14:52:25

Thanks I will try

user-6eec86 09 March, 2021, 01:32:57

Hi. I'm new here. Is anybody using the Core product with a smartphone for data capture instead of a computer for portability ?

user-0dec4c 09 March, 2021, 07:51:52

Hey, using Pupil Capture 3.1.16 and the latest LSL plugin, the glasses' streams are detected in LabRecorder but are empty. Any idea how to solve this issue? Best, Julius

papr 09 March, 2021, 08:25:47

The lsl relay only streams gaze data. Gaze requires a valid calibration since Pupil v2.0. Please calibrate and try again.

user-0dec4c 10 March, 2021, 14:05:06

Solved the Issue. Thanks!

user-82b4f1 09 March, 2021, 11:10:50

Hello, when I run pupil_capture (or _service, or _player), it creates a pupil_capture_settings folder in my home directory. Can I have these folders created in a directory relative to the installation path (or to the cwd when launching the program)? It's just to avoid cluttering the home directory with three more folders.

papr 09 March, 2021, 11:13:55

If you run from source, it will use directories relative to the repository clone

papr 09 March, 2021, 11:13:19

Hi, this is currently not supported.

user-82b4f1 09 March, 2021, 11:14:51

so there is no PUPIL_HOME environment variable or any such mechanism? (like a config file?)

papr 09 March, 2021, 11:16:57

No, there is not. This is the code for creating and setting up the user directory https://github.com/pupil-labs/pupil/blob/master/pupil_src/main.py#L54-L83

user-82b4f1 09 March, 2021, 11:21:41

And how do I launch e.g. "pupil_service"? I just tried $ python launchables/service.py but it exited after 2 seconds.

papr 09 March, 2021, 11:23:06

Everything is started via main.py. See python pupil_src/main.py --help

user-a98526 09 March, 2021, 11:23:01

Hello @papr, can you provide a 3D model of the Pupil Invisible? I am trying to design a connecting part to add another camera to the Pupil Invisible.

papr 09 March, 2021, 11:23:38

I am not sure if we will be able to do that. Please contact [email removed] in this regard.

user-a98526 09 March, 2021, 11:24:29

OK, thank you.

user-98789c 09 March, 2021, 13:08:10

Can you think of a reason why fixations are only present around timestamp zero? I used https://github.com/pupil-labs/pupil-tutorials/blob/master/01_load_exported_data_and_visualize_pupillometry.ipynb on my data

Chat image

papr 09 March, 2021, 13:16:52

I think this is just a visualisation issue. Look at the x-axis. It goes from 0 to 40k seconds (11 hours)

user-430fc1 09 March, 2021, 15:35:29

Hello, how can I programmatically set the dispersion and duration parameters of the fixation detector? Does it have to be done when starting the plugin?

papr 09 March, 2021, 16:11:02

Yes, you need to pass these arguments as plugin init arguments. See the sketch below.
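
For example, via a start_plugin notification over the network API. Note: the plugin class name and argument names below are assumptions for illustration; verify them against the source before use:

import msgpack
import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

notification = {
    "subject": "start_plugin",
    "name": "Fixation_Detector",  # assumed plugin class name
    "args": {
        "max_dispersion": 1.5,  # degrees (assumed argument name)
        "min_duration": 80,     # ms (assumed argument name)
        "max_duration": 600,    # ms (assumed argument name)
    },
}
topic = "notify." + notification["subject"]
pupil_remote.send_string(topic, flags=zmq.SNDMORE)
pupil_remote.send(msgpack.packb(notification, use_bin_type=True))
print(pupil_remote.recv_string())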

user-f1866e 09 March, 2021, 17:54:07

I remotely triggered recording in Pupil Core with the Pupil Mobile app. It worked before, but today Pupil Capture complained about time sync in Pupil Mobile and aborted the recording. It keeps saying to enable time sync on Pupil Mobile, but I can't find any such option in Pupil Mobile. It worked before and suddenly started causing trouble. Can anybody please help me?

papr 09 March, 2021, 17:55:04

You need to enable time sync in Capture. Pupil Mobile attempts to sync automatically.

user-f1866e 09 March, 2021, 17:56:18

Hi papr, could you tell me where I can find the option to enable it?

papr 09 March, 2021, 17:56:43

Start Pupil Capture -> Select the plugin manager menu on the right -> enable Time Sync

user-f1866e 09 March, 2021, 17:57:12

Yes I did and I can see my ONEPLUS phone in the 'sync group members'

papr 09 March, 2021, 17:57:42

ok, great.

user-f1866e 09 March, 2021, 17:57:23

Actually, it is grayed out.

user-f1866e 09 March, 2021, 17:58:15

But I still can't trigger the recording, with the same reason given.

papr 09 March, 2021, 17:59:32

ok, please stop Pupil Capture, open Pupil Mobile -> Settings (three dots bottom right) -> Stop background service and terminate app -> Restart both applications

user-f1866e 09 March, 2021, 18:02:28

I rebooted the phone and restarted Pupil Capture, but still the same problem.

user-f1866e 09 March, 2021, 18:05:40

I tried again and same issue

papr 09 March, 2021, 18:06:29

Did you change anything between the last time it was successful and now?

user-f1866e 09 March, 2021, 18:06:06

Do I have to upgrade the phone to the Invisible app instead of Pupil Mobile?

papr 09 March, 2021, 18:06:58

This will most definitely not work. Pupil Core does not work with the Pupil Invisible app.

user-10fa94 09 March, 2021, 19:47:15

Hi, does setting "auto exposure priority" impact the exposure time if the "auto exposure mode" is set to manual? thank you for your help

papr 09 March, 2021, 21:36:41

Sorry, I must have overlooked that one. To answer your question: No it does not.

user-10fa94 09 March, 2021, 21:33:20

Apologies for the followup message, I am struggling to find reference to this information in the documentation. Thank you very much for your help

user-f2c1b3 09 March, 2021, 20:44:22

Hi, my pupil player app suddenly stopped working on my PC. When I open it, the command window opens, but then nothing happens indefinitely. Anybody have suggestions for troubleshooting the issue? It was working fine up until now and nothing on the computer has changed as far as I know.

papr 09 March, 2021, 20:45:50

Please delete the user_settings_* files in the pupil_player_settings folder. Also, please make sure to use a recent version of Pupil Player to avoid this issue in the future.

user-f2c1b3 09 March, 2021, 20:46:49

I'll try that, thanks!

user-f2c1b3 09 March, 2021, 20:48:55

just kidding, there is no pupil_player_settings folder that I can see

papr 09 March, 2021, 20:49:20

It should be in your home folder. You can try searching for it, too

user-f2c1b3 09 March, 2021, 20:50:48

hmm, I also don't see a home folder nor does it come up in a search

user-f2c1b3 09 March, 2021, 20:51:17

do i have a weird version? it says v 1.23

user-f2c1b3 09 March, 2021, 20:52:03

nevermind, found pupil_player_settings

papr 09 March, 2021, 20:52:04

That is an older version, but the folder should be there anyway (if it started correctly before)

user-f2c1b3 09 March, 2021, 20:53:17

great, that worked! thanks so much

papr 09 March, 2021, 21:37:13

Auto exposure can be enabled by setting "Auto exposure mode" to "aperture priority".

user-10fa94 09 March, 2021, 21:40:00

Thank you. If auto exposure mode is set to aperture priority, and "auto exposure priority" is disabled, does this mean that the frame rate will remain constant? Or does "aperture priority" encompass dynamic frame rate + exposure time?

For context, I am building a plugin that programmatically updates the controls shown in this file: https://github.com/pupil-labs/pyuvc/blob/eb2c0a04caafcb5c2f349aa76d188b098571d982/controls.pxi

papr 09 March, 2021, 21:44:01

I am not 100% sure on that, to be honest. Please refer to the UVC standard for how it should behave (the actual camera behavior might differ, though). http://www.cajunbot.com/wiki/images/8/85/USB_Video_Class_1.1.pdf

user-10fa94 09 March, 2021, 21:46:34

Thank you, I'll take a look through the documentation, sincerely appreciate the help!

user-f1866e 09 March, 2021, 22:38:17

I still have this problem: 'world - [ERROR] recorder: Pupil Mobile stream is not in sync. Aborting recording. Enable the Time Sync plugin and try again.'

user-f1866e 09 March, 2021, 22:38:52

Is this message related to this issue "pyre.pyre_node: Group default-time_sync-v1 not found." ?

user-10fa94 10 March, 2021, 04:16:14

Thank you @papr, this document was very helpful. It mentions "Aperture Priority Mode – auto Exposure Time, manual Iris". Do you happen to know if the wide-angle world camera that comes with the Pupil Core headset maintains a fixed aperture, with only exposure time varying, under aperture priority mode?

papr 18 March, 2021, 15:34:43

Sorry for the delayed response. I wanted to confirm the correctness of the answer internally before posting it. The scene camera uses a fixed aperture.

user-98789c 10 March, 2021, 13:43:42

These warnings show up in Player after exporting my recording: "no 2D data for pupil visualization found" "no 3D data for pupil visualization found" how can I resolve them?

papr 10 March, 2021, 13:45:18

Can you remind me of your setup? Pupil Core; exporting recorded data? Or do you use post-hoc detected data?

user-98789c 10 March, 2021, 13:51:04

Yes, recording in Pupil Core and then exporting from Pupil Player into .csv files. The exported data seems to be fine and includes all that it should. I just thought these warnings might be indicating something serious?

papr 10 March, 2021, 13:53:01

The error message comes from the eye video exporter, correct?

user-98789c 10 March, 2021, 13:56:54

yes

Chat image

papr 10 March, 2021, 14:04:59

Btw, that is a prime example of how a confidence signal should look!

papr 10 March, 2021, 13:59:02

That message means that there is at least one frame for which there was no pupil data recorded. Please check the exported video. Does it include the expected pupil data visualization?

user-98789c 10 March, 2021, 14:05:00

yes there are some instances in the video where there's no data:

Chat image

user-98789c 10 March, 2021, 14:05:02

Chat image

user-98789c 10 March, 2021, 14:07:56

oh good! maybe it's because I recorded in a dark room?

papr 10 March, 2021, 14:13:51

I think it is mostly the strong contrast between pupil and iris.

user-690703 10 March, 2021, 14:18:28

Hello! I am aware that Pupil Mobile is no longer maintained; however, I decided to use it in an experiment. Do you know if there is a way to send offline triggers to 2 different mobile + Pupil Core setups, based on which the data can later be synchronized? E.g. signaling the start of the experiment. The point is to be as independent of the desktop software as possible during the recording phase. Is zmq good for this?

papr 10 March, 2021, 14:31:24

Pupil Mobile does not have the possibility to receive triggers. The only way to sync two Pupil Mobile instances is through a common Capture instance in the same wifi. Enable Time Sync in Capture and the Pupil Mobile instances will start following its clock. Since you are running Capture at that point anyway, you can send the remote triggers to Capture, and record them in a third recording.

Alternatively, you should be able to sync time post hoc to Unix time using https://github.com/pupil-labs/pupil-tutorials/blob/master/08_post_hoc_time_sync.ipynb. Then you can store your triggers independently of Pupil Mobile/Capture (e.g. in your experiment software) and correlate them post hoc with the recordings. Make sure that the devices were recently time-synced via NTP. A sketch of the conversion follows below.
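
A sketch of the core idea, assuming a Core recording whose info.player.json contains the start_time_system_s (Unix clock) and start_time_synced_s (Pupil clock) fields (verify the field names against your recording format):

import json

with open("recording/info.player.json") as f:
    info = json.load(f)

# Offset between the Unix clock and the Pupil clock at recording start
offset = info["start_time_system_s"] - info["start_time_synced_s"]

def pupil_to_unix(pupil_ts):
    # Convert a Pupil timestamp to Unix time to correlate with your triggers
    return pupil_ts + offset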

papr 10 March, 2021, 14:23:15

Ah, ok, looks like there is a gap in the eye video recording, which Player automatically fills with artificial frames. It is expected that it can't find any pupil data for these frames. These gaps can happen e.g. due to camera disconnects.

user-98789c 10 March, 2021, 14:53:52

I noticed some recordings have so many gaps in them that the video player says the file is corrupt and cannot open it. Is there some way to avoid the gaps?

user-690703 10 March, 2021, 14:34:38

@papr Thank you for your help! I will give it a try.

user-98789c 10 March, 2021, 14:47:20

I see, so it works better for lighter-colored irises.

papr 10 March, 2021, 14:48:03

At least lighter-colored irises in IR light. This does not necessarily translate to visible light.

papr 10 March, 2021, 14:49:39

The darkness helps to evenly light the eye region as well. There are no other dark areas which could be mistaken for the pupil. Also, your 3d eye models are well fit which you can see from the highly correlated 3d diameter graph.

user-98789c 10 March, 2021, 14:52:29

perfect! good to know that 👍

papr 10 March, 2021, 14:54:53

Which video player? Pupil Player? Also, how do you know that there are many gaps if the video does not open? Was it just a guess?

user-98789c 10 March, 2021, 14:55:35

yeah it was just a guess, my laptop's video (.mp4) player won't open them.

papr 10 March, 2021, 14:55:53

A lot of video players cannot handle the intermediate video format. The exported video should work well though.

papr 10 March, 2021, 14:56:24

The interpolation of gaps in the exported video should never lead to a corrupted video.

user-98789c 10 March, 2021, 14:57:10

then maybe I have not exported eye videos. I'll look into it.

papr 10 March, 2021, 14:57:40

Also, if the export is interrupted, the resulting video will be corrupted.

user-98789c 10 March, 2021, 14:58:05

aha good to know 👍

user-f1866e 10 March, 2021, 16:24:49

Is this message related to the time sync (mobile) issue "pyre.pyre_node: Group default-time_sync-v1 not found." ?

papr 10 March, 2021, 17:29:12

That message means that there was no other node in that group yet. Groups are created and destroyed automatically. This message is expected if you start Capture and there is no other Time sync device already present

user-f1866e 10 March, 2021, 17:31:02

So it is not a message about failing time sync in Mobile?

papr 10 March, 2021, 17:31:53

Not necessarily. If the Pupil Mobile client appears in the list, it should work. Are you streaming all three cameras?

user-f1866e 10 March, 2021, 17:31:23

I am still having trouble with time sync with Mobile.

user-f1866e 10 March, 2021, 17:32:06

Yes.

user-f1866e 10 March, 2021, 17:32:51

Yes I can see my phone in the Sync Group Members

user-f1866e 10 March, 2021, 17:33:53

But whenever I trigger recording remotely, Capture complains that the Mobile stream is not time-synced.

papr 10 March, 2021, 17:34:47

I understand the symptom of the issue. This is the code that triggers it https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/recorder.py#L279-L298 I do not have any ideas what might be the cause if time sync is already enabled.

user-f1866e 10 March, 2021, 17:36:00

However, the phone shows the right time and date.

papr 10 March, 2021, 17:36:24

The displayed time is independent of the internal clock used by Pupil Mobile/Capture

papr 10 March, 2021, 17:36:04

Could you share the capture.log file after attempting to init a recording? You can find it in the pupil_capture_settings folder.

user-f1866e 10 March, 2021, 17:46:05

OK, I found the problem. I sent a notification message and then sent 'R' to start recording. This caused the time sync issue with the Mobile stream. I changed it to send 'R' first and then the notification. Now it starts recording remotely without the time-sync error message.

papr 10 March, 2021, 17:52:19

Ok. And you can reproduce this behavior reliably, shutting Pupil Mobile down via the settings in between? I do not see a reason why the order of these notifications should play a role.

papr 10 March, 2021, 17:49:09

Do you start Capture manually and wait for the Mobile client to appear in the time sync menu before starting the recording? Or is Capture started from the external script as well?

papr 10 March, 2021, 17:46:42

What is the other notification?

user-f1866e 10 March, 2021, 17:46:31

Any explanation?

user-f1866e 10 March, 2021, 17:47:17

label = "TimeStamp_Optitrack" tempTrigger = new_trigger(label, globs.rBodyTS, 0.0) send_trigger(tempTrigger)

user-f1866e 10 March, 2021, 17:47:46

globs.rBodyTS is the timestamp from the OptiTrack 3D tracking system.

user-f1866e 10 March, 2021, 17:49:43

I started Capture manually, after starting Pupil Mobile.

papr 10 March, 2021, 17:52:54

Maybe Pupil Mobile just needed time to adjust its clock? I cannot tell for sure. But I am happy that you got it working.

user-f1866e 10 March, 2021, 17:55:33

Yes, I just tested it, and sending the notification before 'R' caused the problem.

user-f1866e 10 March, 2021, 17:56:13

Once it happened, I had to restart Capture; then sending 'R' first did not cause the error and recording started.

user-430fc1 11 March, 2021, 14:23:17

Is it possible that remote annotations coming in at the same time from different threads could be causing Pupil Capture to crash? If so, could I get around the problem by storing them all and sending them before ending the recording?

user-82e5bd 11 March, 2021, 14:38:54

I have a question.

custom_offset_1.py

user-82e5bd 11 March, 2021, 14:39:49

I want to use this and put it in a folder called pupil_player_settings and then plugins

user-82e5bd 11 March, 2021, 14:40:06

but I cannot find it in my Pupil Player workspace

papr 11 March, 2021, 14:54:20

You will need to launch Player once to create these folders. Afterward, searching for pupil_player_settings should work. The folder is in your home folder.

Please note that this is an outdated version of the plugin. You need to make this change to get the UI working properly:

-        offset_x=0,
-        offset_y=0,
+        offset_x=0.0,
+        offset_y=0.0,

user-d90133 11 March, 2021, 18:18:49

@papr So I'm relatively new to the Pupil Core as well as the Pupil Capture program. Can anyone tell me what an optimal angular precision (RMS of angular offset/distance) value should be for calibration using a single marker?

user-8acec5 12 March, 2021, 10:48:31

I am very curious about this too, as I'm unable to find any information on what a 'good' calibration is

user-60acd1 11 March, 2021, 19:08:04

I am trying to load a recording into Pupil Player (v3.1.16) to do post-hoc gaze calibration and pupil detection, but instead of opening the normal window, only the eye videos are displayed and no post-hoc analysis can be done. I have tried using an older version (v2.6.19), but the world video will not display, and when I attempt a post-hoc gaze calibration the app shuts itself down. Is there any way to fix this problem?

papr 15 March, 2021, 09:06:14

Could you please try reproducing this issue again in v3.1 and share the pupil_player_settings folder -> player.log file with us?

user-82e5bd 11 March, 2021, 19:25:48

Still got the question (I am really new to the program): where can I find the new custom offset (because I have no Python, only R) and how can I launch the Player again?

papr 15 March, 2021, 09:01:48

You should be able to select it from the "Gaze Data" menu in Player, if it is installed correctly.

user-7e9436 11 March, 2021, 22:04:32

Hi there, I'm trying to assemble the DIY kit and I'm on the step where you replace the IR filter, but I can't access the screws to unmount the lens holder because the two circuit boards are sandwiched together. What should I do?

wrp 12 March, 2021, 03:46:48

Hi @user-7e9436, it looks like you also sent [email removed] an email asking about DIY setup. I think it is best that we continue the discussion here, if that is OK with you.

https://vimeo.com/53005603 - shows the disassembly of the HD-6000 camera circa 2013 😸. The design of the camera might be somewhat different in 2021, but I would conjecture that the two boards _should_ still be separable, as shown here in a 2019 document: https://www.thingiverse.com/thing:3337827

user-7e9436 12 March, 2021, 04:04:46

Hi wrp, thank you for the reply, sounds good, we can continue the discussion on here. Okay, judging from the photos in the link, it may just be a component with pins that connects the two boards together, in which case I will try to "pry" the two boards apart as gently as possible using a screwdriver or something. Also, with regards to the soldering of the IR emitters, could you confirm if the "black dot/product marking/angled cut" side is the + side? I called Digi-Key today and the fellow said it is the anode/positive side, but after googling online I read that the anode actually means the negative side, so I'm not sure if what he said is correct. Edit: Okay, so I've managed to pull them apart, replace the IR filter, and solder them back together. Now how do you sandwich the two circuit boards back together so that they can be mounted onto the camera holder?

wrp 17 March, 2021, 08:43:58

@user-755e9e can you provide any tips regarding orientation of IR emitters?

user-82e5bd 15 March, 2021, 12:27:22

So, is it possible to only move the gaze to the left or to the right, or do I need to change everything with the post-hoc calibration? Because it doesn't work yet.

papr 15 March, 2021, 12:29:16

No, the plugin is independent of the post-hoc calibration. It should be listed in the same drop-down menu though. Were you able to find the plugin folder?

user-bf78bf 15 March, 2021, 13:24:51

is there a way to map norm_pos_x and norm_pos_y coordinates onto a 1920x1080 monitor?

papr 15 March, 2021, 13:25:38

Yes, read more about it here https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking
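
Once a surface matching the monitor is defined, the mapping itself is straightforward. A sketch, assuming surface-normalized gaze coordinates from the surface tracker (check the origin convention against the docs):

SCREEN_W, SCREEN_H = 1920, 1080

def surface_norm_to_pixels(x_norm, y_norm):
    # Surface coordinates have their origin at the bottom left;
    # screen pixel coordinates usually start at the top left, hence the flip
    return x_norm * SCREEN_W, (1.0 - y_norm) * SCREEN_H

print(surface_norm_to_pixels(0.5, 0.5))  # -> (960.0, 540.0)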

user-bf78bf 15 March, 2021, 13:26:17

note that i havent used surface tracking/april tags for my data collection

papr 15 March, 2021, 13:27:52

In this case, there is no automated way of mapping the scene camera coordinates to screen coordinates.

user-bf78bf 15 March, 2021, 13:29:25

I did observe, after calibration, a surface approximately equal to the experiment monitor. If that is my default surface after calibration, would I be able to map the coordinates?

papr 15 March, 2021, 13:34:05

You would have to assume that there was no relative movement between scene camera and AOI/surface/display. Any scene camera movement invalidates the mapping. That is why surface tracking requires the markers. They can be detected automatically and the relative position between scene camera and surface can be calculated for each frame, without having to assume this relationship to be fixed.

user-bf78bf 15 March, 2021, 13:51:13

Thanks a lot for the help @papr

user-98789c 16 March, 2021, 10:48:06

When I specify an exact timestamp for my annotation to be saved with the recorded data, as in:

keySet = {'topic','label','timestamp','duration'};
topic = 'annotation.sizeSEQ';
label = strcat(name,num2str(j));
timestamp = toc(runtime)-t2/1000;
duration = t2/1000;

will the singletrip from MATLAB to Capture still affect this timestamp and change it?

papr 16 March, 2021, 10:49:54

Capture will save the exact timestamp that comes within the annotation. Capture does not make any assumptions about its correctness and therefore does not try to modify it in any way.

user-98789c 16 March, 2021, 10:50:55

perfect, thanks 👍

user-82e5bd 16 March, 2021, 13:24:13

Unfortunately not. I need this for a deeper calibration, or is there another way to do this? I use the Invisible for my thesis project, but sometimes the gaze shifts a little bit to the right or the left, so I want to fix that. The custom plugin seems like a good idea, but how can I make it work on my computer?

papr 16 March, 2021, 13:30:19

The first step is to find the pupil_player_settings folder. You can either search for it or get to it via the windows explorer: This Pc -> Drive C -> Users -> your user name -> pupil_player_settings

user-82e5bd 16 March, 2021, 13:30:51

I found it and put the custom file in.

papr 16 March, 2021, 13:39:44

In the plugins folder, correct?

user-82e5bd 16 March, 2021, 13:40:34

yes

user-82e5bd 16 March, 2021, 13:40:52

yes, in the plugins folder

papr 16 March, 2021, 13:42:28

Now, after starting Player, go to Gaze Data and you should see three options in the drop-down menu, instead of two.

user-82e5bd 16 March, 2021, 13:52:32

Chat image

papr 16 March, 2021, 13:53:00

Could you please share the player.log file that is in the pupil_player_settings folder?

user-82e5bd 16 March, 2021, 13:55:21

There is

Chat image

papr 16 March, 2021, 13:55:42

The player file. Windows is hiding the .log extension

user-82e5bd 16 March, 2021, 13:59:15

player.log

user-82e5bd 16 March, 2021, 15:07:26

So, can you help me based on the player log?

papr 16 March, 2021, 16:09:08

Looks like your version of the plugin contains more incompatibilities with the current Player version. Please use this one instead: https://gist.github.com/papr/d3ec18dd40899353bb52b506e3cfb433

user-2de265 16 March, 2021, 15:14:19

Hi! Not sure this is the right place to ask (if not, I am sorry), but I will give it a try: I am setting up a study in which I will capture pupil size with the Pupil Core system, using pye3d pupil detection. People will always look at a cross on the screen (at least that is what I tell them to do) and listen to sentences, which they repeat after a short interval. Now my questions are: 1) For an accurate measurement of pupil size with the 3D model (or even 2D), is it absolutely necessary to also calibrate the eye gaze? Meaning, does it matter for the precision of the pupil size estimation resulting from the eye-camera data whether I have calibrated to establish a reference to the world camera or not? I know from the literature that eye-gaze position does matter, but if we assume that my participants all behave perfectly and only fixate the cross, would I be fine without calibration? 2) If I do not calibrate for eye gaze, will I have issues with slippage, i.e., will slippage affect pupil size estimation? Or is that a totally independent issue?

papr 16 March, 2021, 16:28:47

1) No need to calibrate gaze if you only need pupil size. Only make sure that the 2d detection works well and the 3d model is fit accurately. The idea of pye3d is that it is independent of the gaze angle (on the population level). Also, if your gaze angle does not change anyway, there is no need to worry about it. 2) Slippage is an issue independent of calibration. It affects both gaze and pupil size estimation. Even though pye3d attempts to readjust its model location continuously, it requires data from different viewing angles to do so well. If you are looking mostly forward, I recommend freezing your eye model after fitting. This avoids discontinuities during the measurement, but it makes the measurement susceptible to slippage. It is therefore important to monitor the model fitness and refit the model after slippage. You can also un/freeze the model programmatically if you want to integrate eye model fitting phases into your experiment.

user-2de265 16 March, 2021, 21:21:11

Great! That helps a lot! Thank you!

user-82e5bd 16 March, 2021, 21:17:18

Can I fix all those problems in some way, so that I use the right one with no errors? Thanks in advance.

papr 16 March, 2021, 21:20:12

Delete the old one, download the new one, and put it in the plugins folder.

user-82e5bd 16 March, 2021, 21:24:49

Do you have the right link for the new version? Because I think I have the same problem with the re-install.

papr 16 March, 2021, 21:27:47

If it still does not appear, please share the log file again. It will be updated

papr 16 March, 2021, 21:27:06

This is the link of the correct version https://discord.com/channels/285728493612957698/285728493612957698/821414795327963176

user-1a6a43 16 March, 2021, 22:40:05

Hey, I have a quick question. I'm using the Pupil Labs Core with Python and was wondering whether I should use the API or run from source. I plan to use information that is processed and output from the device live in other processes that are running in parallel.

user-1a6a43 16 March, 2021, 22:40:51

So, as someone with decent knowledge of Python but not much experience with multiprocessing/running multiple processes at once with information being directly connected between them, should I use the API or should I run from source?

papr 17 March, 2021, 08:44:16

You can read more about the network API here https://docs.pupil-labs.com/developer/core/network-api/
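
A minimal subscriber sketch against that API (default address assumed; here subscribing to pupil data, which includes e.g. diameter_3d):

import msgpack
import zmq

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "pupil.")  # or "gaze." after calibration

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.unpackb(payload, raw=False)
    print(topic.decode(), datum["timestamp"], datum.get("diameter_3d"))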

user-1a6a43 16 March, 2021, 22:41:34

And how do you recommend I connect these parallel processes with regard to the flow of information?

user-1a6a43 16 March, 2021, 22:45:56

Thank you!

user-1a6a43 17 March, 2021, 00:20:00

Also, when I try installing dependencies in my virtual environment set to Python 3.6, which I understand is the version used in the source code, I get this:

user-1a6a43 17 March, 2021, 00:20:01

ERROR: Could not find a version that satisfies the requirement requirements.txt
ERROR: No matching distribution found for requirements.txt

user-1a6a43 17 March, 2021, 01:59:29
Scanning dependencies of target uvc
[ 11%] Building C object CMakeFiles/uvc.dir/src/ctrl.c.o
[ 22%] Building C object CMakeFiles/uvc.dir/src/ctrl-gen.c.o
[ 33%] Building C object CMakeFiles/uvc.dir/src/device.c.o
[ 44%] Building C object CMakeFiles/uvc.dir/src/diag.c.o
[ 55%] Building C object CMakeFiles/uvc.dir/src/frame.c.o
[ 66%] Building C object CMakeFiles/uvc.dir/src/init.c.o
[ 77%] Building C object CMakeFiles/uvc.dir/src/stream.c.o
/Users/Karim/Desktop/CochlearityPython/libuvc/src/stream.c:479:1: warning: non-void function does not return a value [-Wreturn-type]
}
^
/Users/Karim/Desktop/CochlearityPython/libuvc/src/stream.c:570:56: warning: format specifies type 'unsigned long' but the argument has type 'uint64_t'
      (aka 'unsigned long long') [-Wformat]
                                        printf("*** Correcting clock frequency to %lu\n", strmh->corrected_clock_freq );
                                                                                  ~~~     ^~~~~~~~~~~~~~~~~~~~~~~~~~~
                                                                                  %llu
2 warnings generated.
[ 88%] Building C object CMakeFiles/uvc.dir/src/misc.c.o
[100%] Linking C shared library libuvc.dylib
[100%] Built target uvc
[100%] Built target uvc
Install the project...
-- Install configuration: "Release"
CMake Error at cmake_install.cmake:49 (file):
  file cannot create directory: /usr/local/lib.  Maybe need administrative
  privileges.

papr 17 March, 2021, 17:30:04

The issue here seems to be missing permissions though

user-1a6a43 17 March, 2021, 03:44:51

Is Pupil Labs software even supported on M1 chip Macs/Big Sur?

papr 17 March, 2021, 08:43:47

Hi, I recommend using the bundled application and processing the data in your other processes via the network API. The bundle is supported on Big Sur/M1 Macs since v3.1

user-1a6a43 17 March, 2021, 17:16:22

Thanks! I have another quick question: is it possible to run from source using Big Sur and the M1 chip?

user-755e9e 17 March, 2021, 09:05:33

Hi, as you mentioned, the side of the LED with the black marking should be the negative (cathode) side.

user-7e9436 18 March, 2021, 18:29:51

Okay, got it, thank you so much. And regarding how to "re-sandwich" the two boards together so that it is actually functional (and can get electricity through, obviously), how would that be done?

user-1a6a43 17 March, 2021, 17:24:48

I tried installing all the dependencies like the tutorial asked, but some failed

papr 17 March, 2021, 17:29:49

Yeah, running from source is very tricky at the moment because the dependencies are currently very convoluted regarding m1-emulation and native support. Technically, it is possible, but our support in this regard is limited.

user-2b98c7 17 March, 2021, 17:30:10

Is there a way to change the calibration background to a black screen instead of the white default?

papr 17 March, 2021, 17:32:25

Not built-in. It is possible by creating a custom plugin that subclasses https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/calibration_choreography/screen_marker_plugin.py

user-1a6a43 17 March, 2021, 17:30:34

I fixed that issue by using sudo twice, once for each of the commands joined by &&.

user-1a6a43 17 March, 2021, 17:31:04

but there are several files that just can't be installed from what I can tell

user-1a6a43 17 March, 2021, 17:31:59

for example: ERROR: Could not build wheels for numpy which use PEP 517 and cannot be installed directly

user-2b98c7 17 March, 2021, 17:36:06

Got it. Do you know what property within the class changes the background color?

papr 17 March, 2021, 17:40:42

I think one has to call glClearColor. But I am not sure. I will have to check next week when I am back at work.

papr 17 March, 2021, 17:39:13

I will have to check. I think it will be a bit more complicated than overwriting a single attribute though.

user-2b98c7 17 March, 2021, 17:40:12

Thank you for checking!

user-2b98c7 17 March, 2021, 17:41:29

okay thanks!

papr 21 March, 2021, 17:44:35

Hey, I checked, and unfortunately it is not possible to change the background via subclassing in the current implementation. As a workaround, I suggest printing the calibration marker, putting it on a dark surface, and using Single Marker Calibration in physical mode. It might be necessary to leave an additional white border around the outer black ring for detection purposes. You can read more about single marker calibration here: https://docs.pupil-labs.com/core/software/pupil-capture/#single-marker-calibration-choreography

user-1a6a43 17 March, 2021, 17:47:54

And as a follow-up to this: can the applications run properly without having first installed all the dependencies? There should be some support for people who use M1 Macs.

papr 17 March, 2021, 18:06:54

The dependencies are quite specific and some are complex to build (on all platforms). This is why we provide the bundle which works out-of-the-box. We support Big Sur users fully by providing the bundled application.

Of course, we try to support people running from source as much as possible. But in this case, numpy, one of our main dependencies, is not providing prebuilt wheels for your specific platform. If you wanted to continue, you would have to install numpy from source. Please understand, that this is something I cannot help you with, as it exceeds my expertise.

I am not sure how you installed Python, but I recommend using the Python 3.9.1 macOS 64-bit Intel installer https://www.python.org/downloads/release/python-391/ I think this is the version that will most likely lead to success.

papr 17 March, 2021, 18:09:39

Also, I would like you to note that running from source is very often not necessary. Your use case should work with the bundled application. If you think that my assessment is wrong, I would be happy to discuss details.

user-1a6a43 17 March, 2021, 18:10:06

sure, thank you for your advice! let me give some more background to what I want to use pupil core for

user-1a6a43 17 March, 2021, 18:11:46

I want to be able to track the eye movements of someone live as they are put in specific listening situations. Their pupil dilation and the azimuth of their eyes should be sent live, with as little latency as possible, to other processes that are running in parallel. The eye movements and angle will be used to determine which sounds are amplified

papr 17 March, 2021, 18:12:32

What is your latency constraint?

user-1a6a43 17 March, 2021, 18:12:36

because we are dealing with sound and having that sound directly output live to the user via a headset, the latency, again, needs to be minimal

user-1a6a43 17 March, 2021, 18:13:16

optimally it'd be 20 ms, as that's roughly how long it takes the mind to notice discrepancies between what it is seeing and what it is hearing

user-1a6a43 17 March, 2021, 18:13:34

but we're looking to minimize it as much as possible

user-1a6a43 17 March, 2021, 18:13:40

that's why I wanted to run from source

papr 17 March, 2021, 18:14:55

Since version 2.6 we have the possibility to run custom pupil detectors as plugins in the bundle. https://github.com/pupil-labs/pupil/releases/tag/v2.6

These run in the eye process in the same way as if you ran from modified source code.

You can write a custom "pupil detector" that does the necessary processing.
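
For orientation, a heavily simplified sketch of what such a plugin could look like; the import path and the detect() signature below are assumptions based on the linked release notes, so please check them against the current plugin API before use:

    # Sketch only: base class path and detect() signature are assumptions.
    from pupil_detector_plugins.detector_base_plugin import PupilDetectorPlugin


    class CustomDetector(PupilDetectorPlugin):
        label = "Custom detector"  # shown in the eye window UI

        def detect(self, frame, **kwargs):
            gray = frame.gray  # eye image as a numpy array
            # ... run your own processing here ...
            return {
                "topic": "pupil",
                "timestamp": frame.timestamp,
                "norm_pos": (0.5, 0.5),  # placeholder result
                "diameter": 0.0,
                "confidence": 0.0,
            }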

user-1a6a43 17 March, 2021, 18:16:55

that's awesome

user-1a6a43 17 March, 2021, 18:18:11

as for the rest

user-1a6a43 17 March, 2021, 18:18:16

what do you recommend I do

user-1a6a43 17 March, 2021, 18:18:22

for minimizing the latency

papr 17 March, 2021, 18:20:08

Do you have a proof-of-concept that shows that the current implementation requires lower latency?

papr 17 March, 2021, 18:25:09

If not, I suggest building a prototype in the simplest way possible (network API) and testing how much latency you get. Afterward, you can switch to implementing it as a plugin. If this is still not quick enough, one would have to evaluate what the largest contributor to the latency is.
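
As a starting point, a minimal receiver sketch, assuming Capture's defaults (Pupil Remote on port 50020 of the same machine):

    import msgpack
    import zmq

    ctx = zmq.Context()

    # Ask Pupil Remote for the port of the data publisher.
    pupil_remote = ctx.socket(zmq.REQ)
    pupil_remote.connect("tcp://127.0.0.1:50020")
    pupil_remote.send_string("SUB_PORT")
    sub_port = pupil_remote.recv_string()

    # Subscribe to 3d pupil data from eye 0.
    subscriber = ctx.socket(zmq.SUB)
    subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
    subscriber.subscribe("pupil.0.3d")

    while True:
        topic, payload = subscriber.recv_multipart()
        datum = msgpack.loads(payload, raw=False)
        # diameter_3d (mm) and the eye model angles phi/theta
        print(datum["timestamp"], datum["diameter_3d"], datum["phi"], datum["theta"])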

papr 17 March, 2021, 18:31:23

Specifically, image transport from the camera to the computer + pupil detection already use some of your latency budget. I do not know the exact numbers for M1 macs though.

user-1a6a43 17 March, 2021, 19:36:21

thank you! I will test this tomorrow after I finish with my finals

user-1a6a43 17 March, 2021, 19:36:24

one more question

user-1a6a43 17 March, 2021, 19:37:28

the device produced by the project should be one that can work entirely independently, which means that it has to work independently of a computer

user-1a6a43 17 March, 2021, 19:37:38

or a wired computer, full pc. we're thinking of using a mini computer in the device itself for the processing, which probably can't run the applications

user-1a6a43 17 March, 2021, 19:37:57

does that mean that, at that stage of the project, we would need to start running it from source

papr 17 March, 2021, 19:39:35

In this stage, I would recommend not using Capture anymore but creating a custom Python script that uses the necessary dependencies:
- pyuvc for camera access https://github.com/pupil-labs/pyuvc
- 2d pupil detector https://github.com/pupil-labs/pupil-detectors/
- pye3d https://github.com/pupil-labs/pye3d-detector/
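
To give an impression, a stripped-down sketch using only pyuvc and the 2d detector; camera selection and error handling are omitted, and you may need to set a frame mode on the capture first. Feeding the result into pye3d additionally requires a camera model, see the pye3d repository:

    import uvc
    from pupil_detectors import Detector2D

    # Pick the first UVC camera; a real script would select the eye camera by name.
    devices = uvc.device_list()
    cap = uvc.Capture(devices[0]["uid"])
    detector = Detector2D()

    while True:
        frame = cap.get_frame_robust()        # grab an eye image
        result = detector.detect(frame.gray)  # 2d pupil detection on the gray image
        print(result["confidence"], result["ellipse"]["center"])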

user-1a6a43 17 March, 2021, 19:39:51

thank you

user-fb397c 18 March, 2021, 08:01:24

Hello all,

I am currently learning the system and have a question regarding the calibration. If I choose one of the calibration methods (e.g., screen marker, natural feature) and the participant needs to look at the calibration points on an external monitor, can they move their head towards the points during the calibration, or should the head stay directed at the center of the screen so that the calibration points are fixated with the eyes only (without head movement)?

Thank you in advance!

user-fb397c 18 March, 2021, 08:30:51

And, a follow-up question: should the calibration distance be the same as the stimulus distance? For example, when the stimulus (e.g., an object) is about 100 cm away, should the calibration (e.g., on an external monitor) then be done at the same distance?

Thank you!

nmt 18 March, 2021, 09:52:25

Hi @user-fb397c. With the 5-point screen-based calibration, the participant should keep their head still whilst following the marker with their gaze. For the single screen-based marker, the participants should gaze at the marker whilst slowly moving their head (e.g. in a spiral pattern). The 5-point calibration can be useful when the experiment involves just looking at the screen. The single marker (either physical or screen-based) is a good option if you want to investigate eye movements that will require larger rotations, as it allows you to calibrate at those rotation angles and cover a large range of your field of view. In terms of distance, yes, it is best practice to calibrate at the viewing distance you will be assessing. Read the calibration docs here: https://docs.pupil-labs.com/core/software/pupil-capture/#calibration

user-b772cc 18 March, 2021, 10:00:04

Hi, is there a way to enhance detection of the printed surface markers? I noted that not all the markers pasted on the computer monitor were detected throughout the experiment. Thank you.

papr 18 March, 2021, 10:09:58

Thanks for sharing a picture of the setup via DM. Please increase the width of the outer white border to improve detection. See the picture below for the reference of the white border width (white area between gray cut lines and black border).

Chat image

papr 18 March, 2021, 10:00:34

Could you share a picture of the setup with us?

user-b772cc 18 March, 2021, 10:11:36

@papr Noted with thanks !!

user-98789c 18 March, 2021, 11:32:02

Is there a way to test a script to remotely control Core (that contains recording, annotations, etc.), without having the device plugged in?

papr 18 March, 2021, 11:35:52

The short answer is: Not really πŸ˜• You can use videos recorded with Capture as video input by dragging them onto the world/eye windows. But they will generate data using the recorded timestamps, which will not be in sync with Capture's clock, i.e. the pupil data will not be in sync with your annotations

user-98789c 18 March, 2021, 11:37:38

it's alright, hopefully in the near future πŸ˜‰

papr 18 March, 2021, 11:40:37

Please note, this is not a feature that we will work on in the near future. Capture's architecture is not designed for coordinated playback of recordings. For that use case, we recommend using Pupil Player.

user-98789c 18 March, 2021, 11:41:52

Ah ok, noted πŸ‘

user-7bd058 18 March, 2021, 15:35:12

Hello! How could I transfer marked AOIs or Surfaces from one recording to another? I have several similar trials and I worked with markers. I would like to transfer them from one video to another

papr 18 March, 2021, 15:37:51

You can define the surfaces in one recording, close Player, and copy the surface_definitions_v01 file to your other recordings.
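
If you have many recordings, a small script can do the copying; a sketch with hypothetical folder names:

    import shutil
    from pathlib import Path

    # Hypothetical layout: one recording with the defined surfaces, many targets.
    template = Path("recordings/reference_recording/surface_definitions_v01")
    for rec in Path("recordings").glob("trial_*"):
        shutil.copy(template, rec / "surface_definitions_v01")
        print("copied surface definitions to", rec)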

user-7bd058 18 March, 2021, 15:46:36

thank you very much it worked perfectly

user-3d024c 18 March, 2021, 16:45:51

Hi, is there any way to salvage data if the recording wasn't stopped before Pupil Capture was closed (i.e. the program was quit without hitting R first)? If I cannibalize info files from another recording, Pupil Player will launch the directory but not actually display data.

Chat image

papr 18 March, 2021, 17:53:28

The videos will likely not be recoverable. The remaining data can be salvaged. What data are you depending on in your experiment?

user-f1866e 18 March, 2021, 16:50:04

I got an error message about sensor permissions. I have already enabled all permissions for Pupil Mobile.

user-f1866e 18 March, 2021, 16:50:25

Anybody, please?

user-3d024c 18 March, 2021, 17:57:01

blinks, pupil diameter, and ideally fixations

papr 18 March, 2021, 17:59:28

Blinks can be calculated post-hoc based on the recorded pupil confidence. The pupil data is recoverable. The fixations can be calculated post-hoc based on the gaze data, which is also recoverable. Unfortunately, without scene video, there will be no visual context.

papr 18 March, 2021, 17:58:05

Regarding the fixations, do you need them to be placed in a visual context (scene video)?

papr 18 March, 2021, 18:01:16

You can share the pldata files from all affected recordings with [email removed] I can generate the missing timestamp files from them on Monday.

user-3d024c 18 March, 2021, 18:02:19

Awesome, thank you!

wrp 19 March, 2021, 04:36:42

@user-7e9436 can you reverse the steps you took to separate the boards? The connector between the two boards should just be able to be plugged back in again to work.

user-7e9436 19 March, 2021, 17:15:00

That doesn't seem to be possible, as the connector that I pried apart needs to be stuck back onto the chipboard somehow and was not a "socket/plug" type connector. Here is a photo showing the components; the blue circled area is where it used to be connected: https://pasteboard.co/JTmD6NV.jpg

user-98789c 19 March, 2021, 12:31:07

how does the calibration sample duration affect the calibration and the recorded data? does it have something to do with the speed of calibration?

papr 22 March, 2021, 11:05:38

This parameter is indeed related to the duration of the calibration. It is the number of frames in which the calibration marker needs to be detected before proceeding to the next marker location. There is a sweet spot for that value: if you set it too small, the subjects will have trouble tracking the marker; the longer the duration, the more likely it is that the subject will not fixate the marker accurately the whole time.

user-98789c 22 March, 2021, 10:59:33

any guidance on this question? πŸ™‚

wrp 19 March, 2021, 23:03:37

Looks like it is a pluggable connector. But you pulled the "male" connector off from the surface of the board. This will require re-soldering the male connector or starting fresh with a new camera.

user-7e9436 20 March, 2021, 00:51:39

i see, uh oh, i knew something like that happened but they were stuck together quite a bit. Would you happen to have a set of this camera already made that you can sell to me instead? That would make things so much easier, because trying to order another one and go through that whole process again might just end up with the same result.

wrp 20 March, 2021, 01:04:38

If we had any hd-6000 cameras we would be happy to offer them, but we do not. The only thing we can offer is Pupil Core eye cameras - https://pupil-labs.com/products/core/accessories - but the connection geometry for this eye cam uses a triangular rail cross section vs DIY headset's circular rail cross section. Also cost for the 200hz Core eye cam might be outside of a diy project budget.

user-7e9436 20 March, 2021, 02:07:12

Okay, I will have to give it some consideration then. Perhaps taking the camera to a local electronics shop to have it put together might be the best option. Thank you for your time and patience in answering all of my questions!

user-b772cc 20 March, 2021, 01:53:15

Can I specify the timestamps for which I want to export the fixation data from a recording via Player? How can I do that? Thank you

wrp 20 March, 2021, 02:14:52

Yes! Use the trim marks in Player to set the exported range.

user-b772cc 20 March, 2021, 03:16:10

@wrp many thanks! So I can trim many parts and export those parts at one time?

wrp 20 March, 2021, 03:17:13

In Pupil Player you set the trim marks and all plugins that export data will export for the range set. By default the trim marks are set to start and end of the recording

wrp 20 March, 2021, 03:17:53

There are only 2 trim marks.

wrp 20 March, 2021, 03:18:34

So if you want multiple segments you will need to set the trim marks and export for each segment

user-b772cc 20 March, 2021, 03:19:33

@wrp noted with thanks!!

user-52cbe2 20 March, 2021, 18:39:39

Hello everyone, I have a question about latency. I have a unit that I bought 3 years ago. The refresh rate on the eye camera is 120 Hz. What I'm wondering about is the latency of the 3d model, in particular diameter_3d: the time lag between the moment you get the message and the time in the past when the pupil actually had that diameter.

papr 20 March, 2021, 18:40:47

If you are talking about the realtime delay, the response depends on your setup and may vary.

user-52cbe2 20 March, 2021, 18:45:35

I had my monitor cycle from black to white in a sine wave at 1 Hz (i.e., a full period of 1 second). I simultaneously measured diameter_3d. By calculating the offset between the luminance and pupil curves, I measured the latency of the pupil signal. I found 530 ms in one subject, and 580 ms in another.

papr 20 March, 2021, 18:46:32

I suggest talking to @user-430fc1 regarding PLR measurements.

user-52cbe2 20 March, 2021, 18:46:38

If you subtract the latency of my monitor, which I haven't measured but which is usually around 20-50 ms, the latency works out to around 500 ms.

user-52cbe2 20 March, 2021, 18:47:51

Just to finish: physiologically, the latency should be around 300 ms, so that gives a tracker + model delay of about 200 ms.

papr 20 March, 2021, 18:51:20

The way to calculate the eye tracking + transmission delay is to sync the clock of your receiver script with Pupil Capture and to subtract the data timestamp from the reception time. See the docs regarding the clocks used by Capture https://docs.pupil-labs.com/core/terminology/#timing
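
As a sketch of that procedure (again assuming Pupil Remote on its default port 50020; see the linked docs for the details of the T command):

    import time

    import msgpack
    import zmq

    ctx = zmq.Context()
    pupil_remote = ctx.socket(zmq.REQ)
    pupil_remote.connect("tcp://127.0.0.1:50020")

    # Sync Capture's clock to this script's clock.
    pupil_remote.send_string(f"T {time.perf_counter()}")
    pupil_remote.recv_string()

    pupil_remote.send_string("SUB_PORT")
    sub_port = pupil_remote.recv_string()

    subscriber = ctx.socket(zmq.SUB)
    subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
    subscriber.subscribe("pupil.")  # all pupil data

    while True:
        topic, payload = subscriber.recv_multipart()
        datum = msgpack.loads(payload, raw=False)
        # reception time minus data timestamp = eye tracking + transmission delay
        delay = time.perf_counter() - datum["timestamp"]
        print(f"delay: {delay * 1000:.1f} ms")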

user-52cbe2 20 March, 2021, 18:49:30

Do you find that a delay of 200 ms is plausible?

papr 20 March, 2021, 18:54:01

I cannot judge this value as it depends heavily on your setup and how you measure the delay. @user-430fc1 has been working on measuring the system's delay post-hoc for his PLR research. They will be able to give you an unbiased evaluation.

user-52cbe2 20 March, 2021, 19:05:25

OK, thank you very much. I'll take a look at the sync procedure and get in touch with [email removed]

user-f22600 22 March, 2021, 12:55:58

Hi, I'm trying to run Pupil Core from source on an ODROID N2. When I run Player, I get the following error message:

player - [INFO] numexpr.utils: NumExpr defaulting to 6 threads.
player - [INFO] launchables.player: Session setting are from a different version of this app. I will not use those.
player - [ERROR] launchables.player: Process player_drop crashed with trace:
Traceback (most recent call last):
  File "/home/odroid/src/pupil/pupil_src/launchables/player.py", line 887, in player_drop
    window = glfw.create_window(w, h, "Pupil Player", None, None)
  File "/home/odroid/src/pupil_venv/lib/python3.8/site-packages/glfw/__init__.py", line 1180, in create_window
    return _glfw.glfwCreateWindow(width, height, _to_char_p(title),
  File "/home/odroid/src/pupil_venv/lib/python3.8/site-packages/glfw/__init__.py", line 632, in errcheck
    _reraise(exc[1], exc[2])
  File "/home/odroid/src/pupil_venv/lib/python3.8/site-packages/glfw/__init__.py", line 54, in _reraise
    raise exception.with_traceback(traceback)
  File "/home/odroid/src/pupil_venv/lib/python3.8/site-packages/glfw/__init__.py", line 611, in callback_wrapper
    return func(*args, **kwargs)
  File "/home/odroid/src/pupil_venv/lib/python3.8/site-packages/glfw/__init__.py", line 832, in _handle_glfw_errors
    raise GLFWError(message, error_code=error_code)
glfw.GLFWError: (65542) b'GLX: No GLXFBConfigs returned'

However, when I launch python and run create_window directly, it runs with no issue. So I don't think it is an OpenGL ES issue.

Has anyone tried running Pupil Core on a Raspberry Pi or ODROID?

What I eventually want to do is run a recording with the player on ODROID and compare the performance. So I don't need a window to be opened.

papr 22 March, 2021, 15:41:03

Maybe one more word regarding your goal. Usually, people try to run Pupil Capture on the RPI as a mobile recording platform. But this is not what you are trying to do, correct? You are trying to run Pupil Player (without a window), correct? Please note that Player's primary purpose is the visualization and export of the recorded data. If you do not need the former, it might be easier to implement the latter in a specialised script, without the complexity that comes from running a GUI application.

papr 22 March, 2021, 13:37:14

Running Pupil on RPI requires changes to the source code as the glfw.SCALE_TO_MONITOR window hint is not supported on RPI https://github.com/pupil-labs/pupil/blob/master/pupil_src/launchables/player.py#L885-L886

There are previous discussions on this topic here on Discord and in our Github issues https://github.com/pupil-labs/pupil/issues

user-10cdee 22 March, 2021, 13:12:15

Hi, we are having issues with the eye tracking glasses. When we connect the eye trackers to the computer, the left eye camera does not show anything. We have tried three different computers and also tried 3 different eye trackers, but the issue remains. Does someone know how to fix this issue?

papr 22 March, 2021, 13:22:50

Please contact [email removed] in this regard.

user-10cdee 22 March, 2021, 13:23:55

Thank you, will do that.

papr 22 March, 2021, 13:32:11

Hi, just to clarify:

"However, when I launch python and run create_window, it runs with no issue." Does this mean there is no warning being displayed?

Please note that Pupil raises all glfw errors (with one exception) as they usually cause unexpected issues down the line. You can ignore specific glfw errors by adding their error code (65542 in your case) here: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/gl_utils/utils.py#L375-L379

user-f22600 22 March, 2021, 15:19:19

Yes, no warning was displayed when running python in a terminal: glfw.init() followed by glfw.create_window(1280, 720, "Pupil Player", None, None)

user-f22600 22 March, 2021, 15:34:33

rpi

user-f22600 22 March, 2021, 15:50:18

Exactly. I'm trying to use Player to 1. profile the software on the RPI and 2. run the same dataset to see the impact of changes I make in the code, say in pye3d. So I don't need the visualisation; I need the CPU profile data and a log of eye gaze locations.

papr 22 March, 2021, 15:54:35

Out of personal curiosity, what is your motivation to run the software on a RPI/ODROID instead of a desktop computer with one of the supported operating systems? You do not need to respond if you are not ready to reveal this type of information about your project.

user-a09f5d 22 March, 2021, 16:50:04

Hi, I have recently finished an experiment and need to export the raw data from the Pupil Core eye tracker for analysis. However, when I open the recordings in Pupil Player, the data visualiser at the bottom does not show anything, and the blink detector plugin does not seem to be working either (see screenshot). Do you know why this is? For context, I am only interested in the raw pupil position values (i.e. from the 3D model) and not gaze position (as it was not possible to run a calibration). I also need to know when the subject blinks. For the experiment I used the Pupil Remote plugin to start and stop the recording from a custom Python script and to send remote annotations.

Chat image

papr 22 March, 2021, 16:52:07

Hey, is it possible that you synced Pupil time to Unix epoch during your experiment?

user-a09f5d 22 March, 2021, 16:52:42

Sorry, not sure what you mean?

papr 22 March, 2021, 16:52:59

Did you use any type of time sync during the recording?

user-a09f5d 22 March, 2021, 17:04:00

This is the code I used to set up the connection with the eye tracker. I followed the instruction I found on here https://docs.pupil-labs.com/developer/core/network-api/#pupil-remote

Chat image

papr 22 March, 2021, 17:07:51

Ok, thanks, this confirms my initial hunch. time.time() timestamps are only accurate when using 64-bit floats. For technical reasons, the timeline visualisation only uses 32-bit floats, which have decreased precision. As a result, multiple data points are visualized at the same location in the timeline. This issue only affects the visualization. Your recorded timestamps remain accurate to their original precision.
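
A quick way to see the magnitude of the effect (sketch; the exact error depends on the timestamp value):

    import time

    import numpy as np

    t = time.time()        # Unix epoch, on the order of 1.6e9 seconds
    t32 = np.float32(t)    # ~7 significant decimal digits
    print(float(t32) - t)  # can be off by up to ~64 s at this magnitude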

papr 22 March, 2021, 17:08:36

I am looking at the source code for the relevant sections right now.

user-a09f5d 22 March, 2021, 17:14:52

Okay, thanks for that. So this means the timeline visualisation does not work for recordings made using pupil remote?

papr 22 March, 2021, 17:15:47

It does not work well for recordings synced to Unix time time.time(), correct. This is independent of the usage of Pupil Remote.

user-a09f5d 22 March, 2021, 17:17:33

Okay! So long as it is just superficial I can live with it. Thanks

What about the issue I am having with the blink detector? For some, but not all, of my recordings the blink detector plugin does not seem to be detecting blinks.

user-a09f5d 22 March, 2021, 17:18:54

It says that no blinks were detected (when there are clearly blinks).

Chat image

papr 23 March, 2021, 14:34:49

It looks like this recording includes timestamps from before the clock adjustment, i.e. there is a huge timestamp gap at the beginning of your recording. This causes the blink detector to use an incorrect filter width. As a result, no blinks can be detected.

How many of your recordings show this issue?
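
For reference, one way to check a recording for such a gap is to look at the differences between consecutive timestamps (the file path below is a placeholder):

    import numpy as np

    # e.g. the eye0 timestamps inside the recording folder
    ts = np.load("path/to/recording/eye0_timestamps.npy")
    gaps = np.diff(ts)
    # a gap of many seconds at the start reveals data from before the clock adjustment
    print("largest gap between samples:", gaps.max(), "seconds")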

papr 22 March, 2021, 17:20:06

The blink detector is very reliant on the pupil detection quality. The better the quality the better it works. It is possible that the detection is bad for these specific recordings. If you want you can share them with [email removed] and we can have a look. In the provided screenshot the detection looks good though! πŸ€”

user-a09f5d 22 March, 2021, 17:23:03

Absolutely, thanks. I will try to email you the recording now.

user-7d2ccc 22 March, 2021, 17:37:10

Does anyone know the power requirements for Core?

papr 22 March, 2021, 17:37:36

Are you referring to electrical power?

user-7d2ccc 22 March, 2021, 17:37:44

yes

papr 22 March, 2021, 17:38:15

I will forward your question to our hardware team.

user-7d2ccc 23 March, 2021, 19:31:53

is there any update on this?

user-7d2ccc 22 March, 2021, 17:39:09

thanks

user-a09f5d 22 March, 2021, 17:42:39

@papr I have just sent you an email to data@pupil-labs.com with the recording that has the problem with the blink detector. Thanks a lot for your help.

papr 22 March, 2021, 17:43:07

Great, thank you. I will have a look tomorrow and come back to you via email.

user-a09f5d 22 March, 2021, 17:43:45

Great! Thanks! Please let me know if you can't download the file via the attached link.

user-10fa94 22 March, 2021, 19:17:36

Thank you very much! This is very helpful. I was also wondering if there is a way to make the world camera sit more solidly in the headset. According to the docs (https://docs.pupil-labs.com/core/hardware/#focus-world-camera) the focus of the camera included with the Core headset can be set, but in previous posts these cameras are described as fixed focal length cameras (which I see, as there is usually only one setting at which the camera is focused for any depth away from the camera). However, this setting requires me to rotate the camera out of its holder quite a bit, and I have noticed that the camera sits quite loosely in the fixture. Is this looseness expected, or is it possible that there is something going on with the headset?

papr 22 March, 2021, 19:20:40

Please contact [email removed] in this regard.

user-a09f5d 23 March, 2021, 15:57:21

Thanks for investigating. How odd! I have identified two recordings (out of about 90, so not the end of the world, but still a loss of data/confounded data) that have this issue. Interestingly, both recordings were at the start of the experiment (i.e. the first recording made after opening Pupil Capture/starting my Python script). Do you think that could have something to do with it?

papr 23 March, 2021, 15:58:37

Yeah, that makes sense. The first time you start Capture it uses its original clock. Afterward, your script adjusts the clock and Capture keeps it adjusted until shutdown.

user-a09f5d 23 March, 2021, 15:59:55

Ah, makes sense. Interesting that this did not happen to every recording at the start of the experiment though.

papr 23 March, 2021, 16:01:28

It is a race condition. If you add a time.sleep() between the clock adjustment and recording start, you can get rid of it
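
In code, the fix could look like this sketch (assuming the usual Pupil Remote REQ socket on port 50020):

    import time

    import zmq

    ctx = zmq.Context()
    pupil_remote = ctx.socket(zmq.REQ)
    pupil_remote.connect("tcp://127.0.0.1:50020")

    # Adjust Capture's clock ...
    pupil_remote.send_string(f"T {time.time()}")
    pupil_remote.recv_string()

    time.sleep(1.0)  # ... and give all processes time to pick up the new clock

    pupil_remote.send_string("R")  # only then start the recording
    pupil_remote.recv_string()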

papr 23 March, 2021, 16:02:30

Alternatively, you do not perform the clock adjustment before the experiment but post-hoc https://github.com/pupil-labs/pupil-tutorials/blob/master/08_post_hoc_time_sync.ipynb

user-a09f5d 23 March, 2021, 16:02:33

Is there any way to resolve it post hoc or can I just not use blink detector for those recordings? Moving forward it might be worth adding a small recording at the start of the experiment (that I later delete) to reset the clock.

papr 23 March, 2021, 16:06:20

I can check tomorrow. Not sure how successful it will be.

user-a09f5d 23 March, 2021, 16:03:05

Oh right.. I sent that before reading your time.sleep() suggestion

user-a09f5d 23 March, 2021, 16:09:54

I use time.sleep() in other parts of my code for pauses during the experiment. Does this mean it will affect the clock time? Wondering if I should take out time.sleep() and use core.wait() instead.

papr 23 March, 2021, 16:15:51

That is fine. Sending the T command is all about syncing the clocks, such that the applications (Capture and experiment) can record data independently of each other.

user-a09f5d 23 March, 2021, 16:23:01

Cool. Thanks. I have added time.sleep(1) after setting up the clock as you suggested.

Chat image

papr 24 March, 2021, 20:15:07

So, there are 24-30 data points with timestamps before the recording start. Sleeping for 1 second should be plenty.

papr 23 March, 2021, 16:30:29

Can you share the second recording as well? I can have a look at how much data was recorded before the clock adjustment kicked in. This can inform the required sleep duration.

user-a09f5d 24 March, 2021, 17:39:41

Hi @papr Have you had a chance to look at the second recording? No worries if you haven't had a chance yet.

user-a09f5d 23 March, 2021, 16:59:40

Sure! Do you mean the recording made after the one I sent you or do you mean the other recording that the blink detection did not work?

papr 23 March, 2021, 16:59:54

The latter

user-a09f5d 23 March, 2021, 17:00:34

I'll do that now.

papr 23 March, 2021, 19:34:49

"max 5V and 900mA as it is the standard USB 3.0 power" This is what I got as a preliminary response. Is that sufficient for your use case?

user-7d2ccc 23 March, 2021, 19:41:24

I have only one USB port on my laptop and I have 2 Pupil Cores. I am using a USB hub to connect both Cores. 5 of the cameras (2 main + 3 IR cams) are working, but the 4th IR cam is not. When I use a different PC which has multiple USB ports, the Cores work well. We think that we don't get enough power over the USB hub.

user-f195c6 23 March, 2021, 22:17:15

Hello! 😊 Is it possible to know the visual field in degrees?

papr 24 March, 2021, 08:46:28

Do you mean the scene cameras field of view? See the Scene camera FOV section at https://pupil-labs.com/products/core/tech-specs/

user-a9b74c 24 March, 2021, 07:11:31

hi ~ my Pupil Core normally shows fixations while recording in the shade (left), but there aren't any fixations in the output videos recorded when walking out from the shade into the sun (right). Is there any way to solve this problem?

Chat image

papr 24 March, 2021, 08:50:24

Hi, Player only shows high-confidence data and it is likely that the sunshine is generating a lot of IR reflections in your subject's eyes, degrading the pupil detection performance (an issue that many mobile eye trackers face). I suggest enabling the eye video overlay to check the pupil detection. If you feel that the detection is good enough for your purpose, you can decrease the Minimum data confidence threshold in Player's general settings to show lower confidence data.

user-a9b74c 24 March, 2021, 16:44:52

thank you very much, I'll try this out !!

user-f195c6 24 March, 2021, 09:10:33

Hi! Thanks! I meant during an experiment. Is it possible to know the angle corresponding to the movement of the eye?

papr 24 March, 2021, 09:15:08

You can calculate the angle between gaze_point_3d values using the cosine distance. See https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/fixation_detector.py#L131-L134 for an implementation.
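
In short, the computation looks like this (the vector values below are made up for illustration):

    import numpy as np

    def angle_deg(v0, v1):
        """Angular difference between two gaze_point_3d vectors, in degrees."""
        cos = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1))
        return np.rad2deg(np.arccos(np.clip(cos, -1.0, 1.0)))

    # Hypothetical gaze_point_3d values from two gaze datums:
    print(angle_deg([20.0, 10.0, 400.0], [60.0, 15.0, 390.0]))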

papr 24 March, 2021, 09:13:56

It indeed looks like there is not enough power and/or bus capacity (data transfer bandwidth). We highly recommend using a dedicated USB port per headset, without intermediate devices such as USB hubs or passive USB extensions.

user-7d2ccc 25 March, 2021, 04:15:27

thank you πŸ™‚

papr 24 March, 2021, 17:42:08

No, not yet. I thought it might make sense to move that "work package" to your time zone in case I had any spontaneous questions/issues.

user-a09f5d 24 March, 2021, 17:45:36

Thanks @papr. Much appreciated.

papr 24 March, 2021, 21:25:07

I sent you the filtered files via email. This is the code I used to correct the files https://nbviewer.jupyter.org/gist/papr/96f35bebcfea4e9b9552590868ca7f54

You can skip to the last line for instructions.

user-a09f5d 25 March, 2021, 01:02:37

Fantastic! Thanks very much @papr.

user-7890df 24 March, 2021, 22:35:31

Any suggestions on how to diagnose occasional lost frames (up to lost seconds) on the world camera only (appears as a grey screen in Capture and Player)?

user-06ae45 26 March, 2021, 20:36:44

Hi there - could anyone tell me the FOV of the 200Hz eye cameras?

papr 26 March, 2021, 20:38:26

Actually, I do not know if we have up-to-date numbers for that. Let me come back to you on Monday.

user-06ae45 26 March, 2021, 20:38:48

@papr Great, thanks so much!

user-82b4f1 27 March, 2021, 08:29:35

Hello, running pupil_service from source and playing with window resizing, I ended up with the right band too narrow to use the controls (namely the "loop" control). How can I adjust or restore the correct width?

Chat image

papr 27 March, 2021, 08:30:39

Do you see the 3 vertical bars in the top left? You can use them to change the menu's width.

user-82b4f1 27 March, 2021, 08:31:25

Oh! I was clicking on them and nothing happened, now I see I have to drag them. Thank you!

user-be2dad 27 March, 2021, 22:41:16

Hi all, I'm here from the pupil labs website. Is anyone using this tech for individual self-study of attention/focus/concentration or vigilance/cognitive fatigue?

user-189029 28 March, 2021, 14:50:09

Hello, I am a novice and I have a question about surfaces. In my experiment participants read a printed text (on a sheet of paper). There are four AprilTags placed in the corners of the sheet. But in Pupil Player there are a lot of frames where no markers are detected at all. For the rest of the frames only one marker (out of four) is detected. Are there any ways to create the surface for all frames?

papr 28 March, 2021, 16:04:42

It is likely that you are missing sufficient white border around the tags. But to be sure, could you please share a picture of the paper?

user-b772cc 29 March, 2021, 03:13:43

Hi, what causes some fixation ids to be missing from the data exported from Pupil Player? Is it due to the maximum dispersion and duration settings under Fixation Detector? I tried setting them to the max, which seems to work, but the fixation ids change. Or is it that I must scroll right to the end before exporting? But shouldn't the trim marks suffice? Your advice is much appreciated. Thank you

user-b772cc 29 March, 2021, 03:40:14

Any issue with changing the setting to this? Thank you.

Chat image

papr 29 March, 2021, 08:53:58

Pupil Core Eye Cam1 (120 Hz)

FOVinDeg(resolution=(320, 240), horizontal=51, vertical=39, diagonal_main=61, diagonal_off=61)
FOVinDeg(resolution=(640, 480), horizontal=51, vertical=39, diagonal_main=62, diagonal_off=62)

Pupil Core Eye Cam2 (200 Hz)

FOVinDeg(resolution=(192, 192), horizontal=37, vertical=37, diagonal_main=51, diagonal_off=51)
FOVinDeg(resolution=(400, 400), horizontal=39, vertical=39, diagonal_main=53, diagonal_off=53)

Pupil Core Eye Cam3 (200 Hz, HTC Vive add-on)

FOVinDeg(resolution=(192, 192), horizontal=69, vertical=69, diagonal_main=88, diagonal_off=88)
FOVinDeg(resolution=(400, 400), horizontal=71, vertical=71, diagonal_main=91, diagonal_off=91)

user-06ae45 29 March, 2021, 16:32:02

Thanks so much!

papr 29 March, 2021, 08:57:44

What do you mean the fixation id is missing? Does the exported fixations.csv have data but no id entries?

Changing the parameters changes the number of detected fixations, and therefore the ids as well. Read more about them here https://docs.pupil-labs.com/core/best-practices/#fixation-filter-thresholds

user-b772cc 29 March, 2021, 10:07:57

Thanks @papr. What I meant was that some fixations were missing/skipped. Is it something about the trim marks? It seems like if I don't scroll right to the end of the trim mark before exporting, only limited fixations get exported.

user-189029 29 March, 2021, 09:51:38

Yes, you are right. But I also want to analyse the recordings with such a technical error. Is it possible somehow? Also, please add information about the sufficient white border to the manual.

papr 29 March, 2021, 09:55:50

Unfortunately, there is not much one can do regarding marker detection if they are missing the white border.

I will add the requested note to the docs.

papr 29 March, 2021, 10:30:03

Fixations require a bit to be detected. There is a progress indicator around the fixation detector's menu icon. You should wait until the detection has completed before exporting.

user-b772cc 29 March, 2021, 12:38:16

Thanks. Will take note.

user-c4d5ef 29 March, 2021, 10:40:42

Hello. I'm also fairly new to the Pupil Core glasses. I noticed two different colours for pupil detection in Pupil Player, one red and one blue. The blue one seems to be more accurate. Is there a way to disregard the faulty detection and use the blue one instead? The player used the gaze data from the recording. See screenshot (is this just a visual artefact or does it affect the data (x/y coordinates)?). Thank you very much.

Chat image

papr 29 March, 2021, 12:02:50

Please see this message for reference https://discord.com/channels/285728493612957698/285728493612957698/766385521291034636

user-320f46 29 March, 2021, 10:40:51

Hi everyone! I'm new to pupil labs. Is there a way to analyze saccades (number, length, velocity) with pupil labs?

papr 29 March, 2021, 12:04:48

Hi, the software does not offer built-in saccade analysis, but there are community projects that have that implemented, e.g. by @user-2be752 https://github.com/teresa-canasbajo/bdd-driveratt/tree/master/eye_tracking/preprocessing

user-320f46 29 March, 2021, 12:38:27

Hi thanks, I'll check it out

user-c4d5ef 29 March, 2021, 13:18:47

Thank you. I will try to use the 2D identification then.

nmt 29 March, 2021, 13:38:20

The 2D and 3D detectors run at the same time. You can improve the 3D model by rolling the eyes, like in this video: https://docs.pupil-labs.com/core/#_3-check-pupil-detection

user-430fc1 29 March, 2021, 14:27:41

Hello, I'm wondering what factors can lead to variability in diameter_3d. I'm guessing camera distance... Are there any others?

papr 29 March, 2021, 14:34:29

Actually, camera distance should not be one of them (in theory). The 3d eye model estimates the position of the eyes in relation to the eye camera. Model refits can cause sudden jumps in diameter values. Slippage without model refits can cause incorrect values, too. There is a gaze-angle dependency due to the distortion of the cornea. pye3d is able to correct that at the population level.

papr 29 March, 2021, 14:36:09

The correct model estimation also depends on knowing the correct focal length of the eye camera

user-430fc1 29 March, 2021, 14:34:47

Just to show what I mean. Far left is a NeurOptics pupillometer

Chat image

papr 29 March, 2021, 14:35:31

Were these measured at the same time?

user-430fc1 29 March, 2021, 14:35:59

20 trials each, alternating, with spectrally matched 1 s pulses of white light

papr 29 March, 2021, 14:38:45

Usually, when analysing pupil responses one would subtract trial-baseline/pre-stimulus values from the trial values

MathΓ΄t, S., Fabius, J., Van Heusden, E., & Van der Stigchel, S. (2018). Safe and sensible preprocessing and baseline correction of pupil-size data. Behavior Research Methods, 50(1), 94–106. https://doi.org/10.3758/s13428-017-1007-2
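
As a minimal illustration of subtractive baseline correction (the sample values are made up; see the cited paper for the full recommendations):

    import numpy as np

    # Hypothetical diameter samples (mm): pre-stimulus baseline and trial period.
    baseline = np.array([3.10, 3.12, 3.11])  # e.g. last 200 ms before stimulus onset
    trial = np.array([3.20, 3.61, 3.90, 3.72])

    corrected = trial - baseline.mean()  # subtractive baseline correction
    print(corrected)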

user-430fc1 29 March, 2021, 14:48:39

I guess there must have been some slippage of the camera as these data were collected in a dark room so some of the lower estimates don't seem too consistent πŸ€”

papr 29 March, 2021, 14:39:13

Baseline correction is useful in experiments that investigate the effect of some experimental manipulation on pupil size. In such experiments, baseline correction improves statistical power by taking into account random fluctuations in pupil size over time

papr 29 March, 2021, 14:40:42

See also the Spontaneous fluctuations in pupil size section in MathΓ΄t, S. (2018). Pupillometry: Psychology, Physiology, and Function. Journal of Cognition, 1(1), 1–23. https://doi.org/10.5334/joc.18

papr 29 March, 2021, 14:51:07

This review says that a typical pupil size range is 2-8mm. Values smaller than that indicate an inaccurate 3d eye model, yes. Especially if the data is as smooth as in your graph.

papr 29 March, 2021, 14:52:12

The smoothness indicates that the model is at least stable. i.e. the change in pupil size might be scaled incorrectly, but the shape of the curve should be correct nonetheless.

user-430fc1 29 March, 2021, 14:55:08

Yes of course, it looks very stable and repeatable. I was thinking in terms of comparability with other systems, but I suppose that will always be wishful thinking given the variability between hardware, measurement principles, etc.

papr 29 March, 2021, 14:56:58

Yeah, the only way to truly compare these is by recording them in parallel. And it should be done, even though there is no real ground truth. πŸ˜•

user-430fc1 29 March, 2021, 14:55:17

Thanks!

user-292135 30 March, 2021, 04:11:46

Hi, I have a long Pupil Mobile recording. I could stop the Pupil Mobile recording properly and the folder structure seems fine. However, Pupil Player stops during the initial loading. The player.log contains the following messages:

user-292135 30 March, 2021, 04:11:52

2021-03-30 12:59:59,839 - MainProcess - [INFO] os_utils: Disabled idle sleep.
2021-03-30 13:00:00,187 - player - [INFO] numexpr.utils: Note: NumExpr detected 16 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
2021-03-30 13:00:00,187 - player - [INFO] numexpr.utils: NumExpr defaulting to 8 threads.
2021-03-30 13:00:10,420 - player - [INFO] launchables.player: Starting new session with '/Volumes/PupilMobileData/20210324140539137'
2021-03-30 13:00:10,438 - player - [INFO] pupil_recording.update.new_style: Checking for world-less recording...
2021-03-30 13:01:48,044 - player - [ERROR] libav.mjpeg: error dc
2021-03-30 13:01:48,044 - player - [ERROR] libav.mjpeg: error y=52 x=2
2021-03-30 13:01:51,671 - player - [ERROR] libav.mjpeg: error dc
2021-03-30 13:01:51,672 - player - [ERROR] libav.mjpeg: error y=46 x=1
2021-03-30 13:02:06,378 - player - [ERROR] libav.mjpeg: error dc
2021-03-30 13:02:06,379 - player - [ERROR] libav.mjpeg: error y=75 x=1
2021-03-30 13:06:35,654 - player - [ERROR] libav.mp2: Header missing
2021-03-30 13:06:36,118 - MainProcess - [INFO] os_utils: Re-enabled idle sleep.

user-292135 30 March, 2021, 04:12:20

Any help with this issue? Thanks!

user-7890df 30 March, 2021, 04:23:01

@papr Hi! Re: "freezing" the model during post-hoc pupil detection and then pressing "redetect": are the outcomes of the 3d model/gaze essentially the same between v2.6 and v3.1, because the model is 'frozen' and is not continuously updated based on past history? If we want very precise fixation information, should we always freeze or rely on v2.6? Thanks!

papr 30 March, 2021, 09:49:10

The outcome is not the same, as v3 ships with pye3d, which is able to correct for corneal distortion. The 3d detector in v2 and earlier does not have that feature. The question of freezing is not related to the software version, but to whether your recording includes slippage or not. I suggest freezing if there is no slippage. If there is slippage, I recommend not freezing the model.

user-3f2e42 30 March, 2021, 08:47:33

Hello, can I get some help with an issue related to an Intel RealSense D435 camera? The main issue is a 'free(): invalid pointer' error when I try to activate the RealSense plugin (realsense2_backend.py) in the Pupil application. Thanks in advance.

papr 30 March, 2021, 09:50:35

This looks like an issue that is related to the librealsense software dependency that the backend is using. I do not think that we will be able to help with that. πŸ˜•

user-3f2e42 30 March, 2021, 10:13:15

@papr Okay, thanks for the reply.

user-430fc1 30 March, 2021, 11:03:42

Returning to what you said about model re-fits... I think I have observed these jumps in diameter_3d in my data, and occasionally the 'jumps' survive a <.95 confidence filter. For a trial-based experiment, would freezing the model before a period of interest make sense, and then unfreezing during rest periods?

papr 31 March, 2021, 08:31:12

would freezing the model before a period of interest make sense, and then unfreezing during rest periods? 100%. I would even ask the users to roll their eyes during the resting period to get good eye model fitting data.

user-430fc1 30 March, 2021, 11:05:34

Do these look like the kind of 'jumps' you were referring to?

Chat image

papr 31 March, 2021, 08:32:52

No, this looks more like false-positive high-confidence 2d data. If you follow https://doi.org/10.3758/s13428-018-1075-y you should be able to get rid of that noise.
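
A rough sketch in the spirit of those guidelines (not the paper's exact algorithm): keep only high-confidence samples, then drop samples adjacent to implausibly fast diameter changes, here via a median-absolute-deviation threshold on dilation speed:

    import numpy as np

    # Toy stand-ins for the timestamp/diameter_3d/confidence columns of pupil_positions.csv.
    ts = np.arange(7) * 0.005
    diameter = np.array([3.50, 3.52, 3.53, 5.90, 3.55, 3.54, 3.56])  # one spike
    confidence = np.array([0.99, 0.98, 0.99, 0.97, 0.99, 0.98, 0.99])

    keep = confidence >= 0.95
    d, t = diameter[keep], ts[keep]

    speed = np.abs(np.diff(d) / np.diff(t))  # dilation speed between samples
    med = np.median(speed)
    mad = np.median(np.abs(speed - med))
    too_fast = speed > med + 3 * mad

    bad = np.zeros(d.shape, dtype=bool)
    bad[:-1] |= too_fast  # flag both samples around an implausible jump
    bad[1:] |= too_fast
    print(d[~bad])  # the spike and its immediate neighbours are removed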

user-7bd058 31 March, 2021, 08:29:45

Hello! When I want to jump from one fixation to another: is there a way to use the keyboard instead of clicking "next fixation"?

papr 31 March, 2021, 08:33:44

The hotkeys are f for next fixation and F (shift + f) for previous fixation

user-7bd058 31 March, 2021, 08:35:33

thank you! I couldn't find previous fixation

user-7bd058 31 March, 2021, 08:39:19

When I work with annotations in Player instead of the Surface Tracker in order to mark areas of interest: is there any possibility to see the marked fixations in the display bar?

papr 31 March, 2021, 08:41:02

Currently, this is not implemented.

user-7bd058 31 March, 2021, 08:39:50

I press the hotkey button and it seems to work fine, but I can't see the fixation I've marked

user-7bd058 31 March, 2021, 08:40:32

meanwhile, using the Surface Tracker, the display bar shows when these areas were focused

user-7bd058 31 March, 2021, 08:41:32

and there is no possibility to correct it?

papr 31 March, 2021, 08:42:13

Unfortunately, this is not implemented either, at the moment.

user-82b4f1 31 March, 2021, 10:29:55

Hello, I am using pupil_service, running it from source, and reading a recorded file. I had a problem with the loop option today. At the beginning the loop didn't work, although the button was colored, so I dragged the recorded file into the eye0 window multiple times, until I realized that this makes a new camera icon and menu appear for every newly dragged file (even if it is the same file). Why are there many if only the last dragged file is actually played/playable? If that is not the case, how can I tell which one is being played, and how do I direct pupil_service to play the desired one? By the way, the loop option seems to be working now.

papr 01 April, 2021, 08:00:13

The multiple video source instances are indeed an issue with Pupil Service. It works as expected in Pupil Capture. We will look into it. The videos play back correctly for me, though. I used an existing eye video recording made with Pupil Capture.

user-d90133 31 March, 2021, 18:30:06

Hello, in Pupil Capture is there a way to save single-marker calibrations for post-hoc analysis? Also, if calibrating multiple times, are the previous calibrations overridden? And if so, is there a way to access previous calibrations that are optimal?

nmt 01 April, 2021, 07:50:09

Hi @user-d90133. Indeed you can use single markers for post-hoc calibration. If you record multiple calibration choreographies, you can also validate and choose the most optimal. You can read more about that in the docs: https://docs.pupil-labs.com/core/software/pupil-player/#pupil-data-and-post-hoc-detection. The videos on that page show you how to perform a basic post-hoc calibration, transfer calibrations between recordings, and validate calibrations. I suggest that you try to replicate the videos to fully understand the processes.

End of March archive