πŸ‘ core


user-f99202 01 November, 2023, 00:00:56

Hello, I followed the instructions on the website to set up the Pupil Core DIY. I have the hardware set up: both cameras turn on when I try them with my computer's camera app. I installed the Pupil Capture 3.5 software. However, the cameras are not discoverable to the software. Every time I try, it returns "camera is in use or blocked." How do I make the hardware discoverable to the software?

Also, the Microsoft HD-6000 image is blurry. It became blurry after I inserted the IR filter under the lens. How do I adjust it to focus?

NB: I have windows OS

Thanks.

user-cdcab0 01 November, 2023, 09:18:39

Can you try running Pupil Capture as an administrator?

user-e1d04f 22 July, 2024, 03:18:21

Did you solve it? How can I find the devices?

user-fafdae 01 November, 2023, 06:22:04

Hello! After connecting the Pupil Core device to my personal computer, the system cannot recognize the USB device, so I cannot use Pupil Capture and the other programs. I tried updating the driver and other solutions, but it still cannot be recognized. May I ask how else I can solve this problem? My computer is running Windows 11, version 22H2. 😭

user-870276 01 November, 2023, 16:25:51

Hello! Is there any way I can avoid those gaze point distortions during eye blinks? https://drive.google.com/file/d/1MN8QynRLq856oyMH2l48EHfde_PhUid6/view?usp=drive_link

user-ca5273 02 November, 2023, 15:07:20

hey everyone! I am just getting started with analyzing pupil data.. so this might be a beginner question. Is there a way to get the annotation labels from annotations.csv into fixations.csv other than going in manually?

user-d407c1 02 November, 2023, 15:28:59

Hi @user-ca5273 ! Have you seen our python tutorials to work with the data? https://github.com/pupil-labs/pupil-tutorials/blob/master/01_load_exported_data_and_visualize_pupillometry.ipynb https://github.com/pupil-labs/pupil-tutorials/blob/master/06_fixation_pupil_diameter.ipynb They demo how you can merge different parameters and could be a great source to show you how you can achieve what you want.

user-ca4e2e 02 November, 2023, 15:31:49

What are the power requirements of the individual Pupil Core eye cameras?

user-d407c1 02 November, 2023, 15:36:55

Hi @user-ca4e2e πŸ‘‹ ! Have a look at the previous answer from my colleague. https://discord.com/channels/285728493612957698/285728493612957698/1168794380019179600

user-ca5273 02 November, 2023, 15:33:50

Hi @user-d407c1 , yes, I was looking at these notebooks. Definitely a handy resource.. but didn't quite help me in what I want to achieve. I want to get the world labels in annotations into fixations. and the tutorials don't really cover that.. unless I am missing it.

I was thinking I could join the two files using the world index columns but they aren't exactly aligned in terms of row count

user-d407c1 02 November, 2023, 15:41:41

annotations and fixations both contain timestamps that you can merge similarly as shown here https://github.com/pupil-labs/pupil-tutorials/blob/master/10_merge_fixation_and_blink_ids_into_gaze_dataframe.ipynb

user-ca5273 02 November, 2023, 15:51:54

@user-d407c1 thanks! Two more questions:

  1. I considered the timestamps and they don't exactly align between the two files. That's my main challenge and why I didn't go that route. In this case, do I just look at the start and end timestamps and join whatever is in between those?

  2. My annotation file is ~35k rows while my fixation file is ~3k rows. The weird thing is that the timestamps are within approximate distances of each other, so joining in this case is a challenge since I might lose some labels. Any ideas for doing that? That's another reason why I didn't go down the timestamp or index route.

user-d407c1 02 November, 2023, 16:57:14

Just to be 100% sure, you want to know the fixation id per annotation or the annotations within that fixation?

user-7f68a7 02 November, 2023, 15:59:48

Hi there, I am collecting real-time gaze data using the pyplr module in Python, and I am using the function 'pupil_grabber'. It returns a dictionary of different values, and I am using the value under b'ellipse' as the gaze location. Is that correct? If so, could anyone tell me what units it is in?

user-ca5273 02 November, 2023, 17:08:05

@user-d407c1 see snapshot of the two files I exported. What I want are columns AOI_* and Event_name in annotations to be joined to fixations by either timestamps or world index. We are interested in the AOI columns and the fixations data, and want to join the two somehow

Chat image Chat image

user-d407c1 03 November, 2023, 08:49:08

Hi @user-ca5273 ! To achieve that you can do something along these lines:

import pandas as pd

annotations = pd.read_csv('annotations.csv')  # change these to the appropriate paths
fixations = pd.read_csv('fixations.csv')

def find_matching_annotations(row):
    # collect all annotations whose timestamp falls within this fixation
    matching_annotations = annotations[
        (annotations['timestamp'] >= row['start_timestamp']) &
        (annotations['timestamp'] <= row['end_timestamp'])
    ]
    # keep the unique AOI and event labels seen during the fixation
    unique_aoi = matching_annotations['AOI_name'].unique()
    unique_events = matching_annotations['Event_name'].unique()
    combined_unique_values = list(unique_aoi) + list(unique_events)
    return combined_unique_values

# one list of matching labels per fixation row
fixations['AOI_Event_Matches'] = fixations.apply(find_matching_annotations, axis=1)
print(fixations.head())
fixations.to_csv('fixations_with_annotations.csv', index=False)

Kindly note that I haven't tested it, so you might need to finetune the code. Also, you will still need to place the proper path to each file when reading it, and if you do not have end_timestamp, you may have to use start_timestamp and add the duration to it.
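
If your fixations export only has start_timestamp and duration, a minimal sketch of that last step could look like this (assuming, as is typical for the Pupil Player fixation export, that duration is given in milliseconds; double-check your file):

fixations['end_timestamp'] = fixations['start_timestamp'] + fixations['duration'] / 1000.0  # duration assumed to be in ms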

user-7f68a7 02 November, 2023, 18:32:45

Hi Pupil lab, how do I get the real time gaze data in degrees in python?

user-d407c1 03 November, 2023, 09:39:51

Hi @user-7f68a7 ! I assume you are using Pupil Core, is that right? With the Network API https://docs.pupil-labs.com/developer/core/network-api/ and ZMQ you can subscribe to any topic, including gaze data in degrees. Depending on what you are looking for, you can for example subscribe to phi and theta or to the 3D gaze normal.

import msgpack
import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)  # Pupil Remote request socket

ip = "localhost"
port = 50020

pupil_remote.connect(f"tcp://{ip}:{port}")
pupil_remote.send_string("SUB_PORT")  # ask Capture for the subscription port
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://{ip}:{sub_port}")
subscriber.subscribe("gaze.3d.")  # receive 3D gaze data

try:
    print("Listening for 'gaze.3d.' messages...")
    while True:
        topic, payload = subscriber.recv_multipart()
        message = msgpack.loads(payload, raw=False)
        # eye id 0 corresponds to the right eye, 1 to the left eye
        eye = "right" if "gaze.3d.0." in str(topic) else "left"
        print(
            f"Eye {eye}:\n"
            f"  gaze_normal_3d: {message['gaze_normal_3d']}\n"
            f"  phi: {message['base_data'][0]['phi']}\n"
            f"  theta: {message['base_data'][0]['theta']}\n"
        )

except KeyboardInterrupt:
    print("Stopped listening for 'gaze.3d.' messages.")
finally:
    subscriber.close()
    ctx.term()

user-870276 02 November, 2023, 19:31:57

Hello! Is there any way I can avoid those gaze point distortions during eye blinks? https://drive.google.com/file/d/1MN8QynRLq856oyMH2l48EHfde_PhUid6/view?usp=drive_link

user-d407c1 03 November, 2023, 09:44:34

Hi @user-870276 ! Usually one would remove the blink data when analysing. As you can see, the confidence drops when there is a blink, so if you remove the low-confidence data you should get rid of this.
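
As a rough sketch of that filtering step (the 0.6 cut-off is only a common starting point, not a fixed rule, and the file/column names assume the standard Pupil Player export):

import pandas as pd

pupil = pd.read_csv('pupil_positions.csv')      # or gaze_positions.csv, depending on your analysis
high_conf = pupil[pupil['confidence'] >= 0.6]   # drop low-confidence samples, e.g. around blinks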

user-5346ee 03 November, 2023, 02:49:21

Pupil Player causing Desktop Window Manager high GPU usage? I'm using Windows 11 on a laptop with a discrete GPU. When Pupil Player is running, DWM also starts to use the GPU. Interestingly, if I force Pupil Player to use GPU 1, DWM will still use GPU 0. I don't see this problem occurring on my office computer, which is a desktop.

Chat image

user-d407c1 03 November, 2023, 09:58:10

Hi @user-5346ee πŸ‘‹ ! Could you check if your drivers are up to date?

user-7f68a7 03 November, 2023, 14:09:07

Hello everyone, can some one please tell me what "norm_pos" data indicate?

user-d407c1 03 November, 2023, 14:14:44

Hi @user-7f68a7 ! Those refer to the position of the gaze in the scene camera in normalised coord. You can find this and the rest of parameters here: https://docs.pupil-labs.com/core/software/pupil-player/#gaze-positions-csv

user-7f68a7 03 November, 2023, 14:22:28

thanks Miguel, so the [0,0] in the norm_pos is it in center of the screen or left bottom of the screen ?

user-d407c1 03 November, 2023, 14:28:47

The origin of 2D norm space is at the bottom left, you can find this here: https://docs.pupil-labs.com/core/terminology/#coordinate-system
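
If you want to work in image (pixel) coordinates, where the origin is the top left, a small conversion sketch (the frame size here is an assumption; use your actual scene camera resolution):

frame_width, frame_height = 1280, 720      # example scene camera resolution
x_px = norm_pos_x * frame_width
y_px = (1 - norm_pos_y) * frame_height     # flip y: norm origin is bottom left, image origin is top left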

user-956f89 03 November, 2023, 17:22:57

hi

user-956f89 03 November, 2023, 17:23:33

How do I convert the timestamps in my Excel file from a recording to computer time?

nmt 03 November, 2023, 17:47:41

Hi @user-956f89 πŸ‘‹. Check out this section of the docs for an overview of timestamps and how to convert them: https://docs.pupil-labs.com/core/terminology/#timestamps

user-956f89 03 November, 2023, 17:48:15

My goal is to represent the data as 3D pupil size against the real time of the computer.

world_timestamps.csv Pupil_positionskw_importedkw_1.xlsx

user-956f89 03 November, 2023, 17:48:45

Hi Neil

user-956f89 03 November, 2023, 17:48:55

thanks for the reply

user-956f89 03 November, 2023, 17:49:06

is it only via python

user-956f89 03 November, 2023, 17:49:21

to convert between two times

user-956f89 03 November, 2023, 17:49:30

?

user-956f89 03 November, 2023, 17:50:30

it is complicated and i get stuck with my data now

user-956f89 03 November, 2023, 17:50:40

i recorded for 2 minutes

user-956f89 03 November, 2023, 17:51:22

several participants

user-956f89 03 November, 2023, 17:51:40

and now unable to process my data

nmt 03 November, 2023, 18:15:52

Converting the timestamps can also be done in spreadsheet software. It's essentially just a case of working out the temporal offset and adding it to the timestamps. The more important question is why do you want to convert to system time?
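
In Python, the same offset idea looks roughly like this (a sketch assuming your recording's info.player.json contains the usual start_time_system_s and start_time_synced_s fields; verify the key names in your own file):

import json
import pandas as pd

with open('info.player.json') as f:
    info = json.load(f)

# offset between the wall clock (Unix time) and Pupil time at recording start
offset = info['start_time_system_s'] - info['start_time_synced_s']

pupil = pd.read_csv('pupil_positions.csv')
pupil['timestamp_unix'] = pupil['pupil_timestamp'] + offset
pupil['timestamp_datetime'] = pd.to_datetime(pupil['timestamp_unix'], unit='s')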

user-956f89 03 November, 2023, 18:17:30

in our study we want to represent the data according to the circadian time of the day. so we need to have a real time representation

nmt 03 November, 2023, 18:21:04

Ah ok. That's a valid motivation. If you were doing it for sync purposes there are certain caveats (outlined in the link I shared). In that case, I recommend implementing the steps on our docs but using spreadsheet software, if you're unable to run the Python example. Each step is explained in the code comments πŸ™‚

user-956f89 03 November, 2023, 18:18:37

we record in different times during the day from morning to evening

user-956f89 03 November, 2023, 18:19:20

the idea is to have a circadian graph or plot of pupil size at specific time of the day

user-956f89 03 November, 2023, 18:22:38

but i can not open my csv file with python from my recording folder and i do not know how to fix it.

user-956f89 03 November, 2023, 18:22:58

so i am looking for any help or alternative solution

nmt 03 November, 2023, 18:25:21

You can open your csv files with any spreadsheet software, such as excel or libreoffice

user-956f89 03 November, 2023, 18:26:40

yes i did with excel

user-956f89 03 November, 2023, 18:35:31

how to map 70,000 observations onto 2 minutes of recording is the challenge i have

user-c37528 03 November, 2023, 19:15:01

hello, can I run Pupil Capture on a Raspberry Pi, which has an ARM CPU?

user-8619fb 04 November, 2023, 18:47:44

Hey! What are indicators that a calibration would not be good? I mean if the circles are dark blue on each eye, the idconf on the top left show 1.00 most spots, and the red and yellow color don't flicker too much, why would it still have low confidence?

user-f76a69 06 November, 2023, 14:44:08

Hi, I am a PhD psych student, so please do forgive me if some of this is obvious to fix. I have been looking to install the MATLAB code through the pupil-helpers GitHub repository, however I am running into significant problems with the prerequisite zmq master installation. I am running Matlab2023b on an M2 MacBook Air. I keep getting the following main error: "Error using mex ld: warning: -undefined error is deprecated ld: warning: -undefined error is deprecated ld: Undefined symbols: _zmq_version, referenced from: _mexFunction in version.o clang: error: linker command failed with exit code 1 (use -v to see invocation)". I have installed a compiler for mex and ran a setup command, however this error keeps recurring. Any advice would be greatly appreciated 🙂 I am hoping to install this so that we can use the remote API to control Pupil via Matlab on a PC separate from our stimulus. 🙏

user-b2d077 06 November, 2023, 20:50:44

Hi, I wonder if anyone can confirm my understanding. It appears from the documentation that one can gather data from the Pupil Invisible in Matlab, but for the Pupil Core you need to use Python. Is that right?

user-91d7b2 07 November, 2023, 00:29:56

I did a recording but unfortunately it says the recording contains no scene video... I'm assuming this is a permissions issue, but there's no chance of getting it at this point, correct?

nmt 07 November, 2023, 06:38:16

Hi @user-b2d077 πŸ‘‹. There are ways to communicate with Pupil Core using Matlab. Although our example code snippets are a bit more limited when compared to our Python examples. Check out this page for reference: https://github.com/pupil-labs/pupil-helpers/blob/master/matlab/README.md

user-956f89 07 November, 2023, 15:18:30

please can someone help me to solve this message when trying to load my data

user-d407c1 07 November, 2023, 15:35:16

Hi @user-956f89 ! It seems like you are trying to run a Jupyter notebook, but you do not have your environment properly configured. This resource can help you get started with Jupyter notebooks: https://www.youtube.com/watch?v=h1sAzPojKMg&t=220s

user-956f89 07 November, 2023, 15:18:58

from IPython.display import display Traceback (most recent call last): File "<stdin>", line 1, in <module> ModuleNotFoundError: No module named 'IPython'

user-956f89 07 November, 2023, 15:29:19

Chat image

user-956f89 07 November, 2023, 15:34:51

also how to open info.player.json file

user-956f89 07 November, 2023, 15:52:57

ok

user-956f89 07 November, 2023, 15:53:02

thks

user-b5484c 07 November, 2023, 17:42:05

Hello,

user-b5484c 07 November, 2023, 17:44:48

I'm sending annotations to Pupil Capture to sync data from the eye cameras with other devices. Every 2 seconds I send an annotation to Pupil Capture. I get the right number of annotations in the annotation and annotation_timestamps files, but the timestamps I get are all the same (=2.14). Any idea why this is the case?

user-d407c1 08 November, 2023, 10:09:32

Hi @user-b5484c ! It is hard to know without further context; would you mind sharing how you create those annotations? Regarding sync with other devices, have you seen https://docs.pupil-labs.com/core/best-practices/#synchronization ?
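
In case it is useful, here is a rough sketch of sending an annotation that is stamped with Capture's own clock instead of a fixed value (this assumes the Annotation plugin is enabled in Pupil Capture and follows the remote-annotation pattern from the pupil-helpers repo; adapt the label and IP to your setup):

import time
import msgpack
import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

# request the PUB port so we can publish annotations over the IPC backbone
pupil_remote.send_string("PUB_PORT")
pub_port = pupil_remote.recv_string()
pub_socket = ctx.socket(zmq.PUB)
pub_socket.connect(f"tcp://127.0.0.1:{pub_port}")
time.sleep(1.0)  # give the PUB socket a moment to connect before publishing

# ask Capture for its current Pupil time and use it as the annotation timestamp
pupil_remote.send_string("t")
pupil_time = float(pupil_remote.recv_string())

annotation = {
    "topic": "annotation",
    "label": "myclock",
    "timestamp": pupil_time,
    "duration": 0.0,
}
pub_socket.send_string(annotation["topic"], flags=zmq.SNDMORE)
pub_socket.send(msgpack.dumps(annotation, use_bin_type=True))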

user-51bca2 07 November, 2023, 23:10:40

Hello, I'm currently engaged in research that involves the development of smartphone-based pupillometry for applications in neurofeedback training, real-time cognitive load assessment, and preliminary screening for ADHD, Autism, and Parkinson's disease. Given the extensive groundwork that already exists in this domain, I am exploring potential collaborations from the computer vision and neuropsychology domains that could enhance the precision and efficacy of my project. My research focuses not only on the measurement of pupil diameter but also on how these measurements can inform us about various neural and cognitive processes. With this in mind, I am keen on understanding how Pupil Labs' technology could be leveraged in my work, especially in terms of data collection and processing capabilities, as well as the algorithmic aspect of measuring pupil size with precision without the need for auxiliary equipment. The main challenges I foresee with smartphone camera integration involve accounting for variables such as gaze angle discrepancies and ambient lighting conditions, which could potentially interfere with the near-infrared (NIR) camera's ability to capture precise pupillary data. Given these constraints, I'm curious to learn about any existing or developing solutions that Pupil Labs may offer to address these issues, particularly for NIR camera usage without a focal lens.

user-aa03ac 08 November, 2023, 09:40:39

Hello! We have never before used our Pupil Core for saccades estimation. We would like to reduce the weight of the output files as much as possible, since the experiment will be very long. Is it possible for us not to record eye video or is it better to do this in order to later extract some missing coordinates from the video? We will be very grateful for guiding us

user-d407c1 08 November, 2023, 10:07:19

Hi @user-aa03ac ! We highly advise retaining the eye-tracking video data for any subsequent analysis. For optimal results, especially during lengthy experiments, we also recommend splitting your session into blocks not exceeding 20 minutes each and asking the participants to roll their eyes to readjust the 3D eye model at the end of each segment, followed by a calibration. This practice is crucial to maintain the accuracy of the eye-tracking data, throughout the whole session and ensures that the recordings can be opened in Pupil Player later without issues (independent of your hardware). https://docs.pupil-labs.com/core/best-practices/#split-your-experiment-into-blocks

user-6cf287 08 November, 2023, 11:07:03

Hi team, I searched for this error "pyre.pyre_node: Group default-time_sync-v1 not found." here, but my case is slightly different, I think. I am using Pupil Capture 3.5.1 and I am running it from my laptop. Today I also noticed another error saying pupil group not found. What could cause this problem?

user-d407c1 08 November, 2023, 11:55:13

Hi @user-6cf287 πŸ‘‹ ! Is the Pupil Groups plugin activated on your Pupil Capture instance, and are you utilising it?

Regarding the potential impact on data quality, it is challenging to provide a definitive answer, as this often varies with the specific task and individual differences or even with each recording. While generalisations are not feasible, our recommended workflow aims to mitigate any decrease in data quality. That said, it may be acceptable to omit the instruction for subjects to roll their eyes if:

  • The 3D model remains accurately fitted, as per the guidelines found at Pupil Labs Best Practices https://docs.pupil-labs.com/core/best-practices/#pye3d-model
  • You are employing the 2D Gaze pipeline and your research does not depend on pupil size measurements or other factors reliant on the 3D model.

In relation to the eye tracker's performance with diverse populations, our experience indicates that it has been effectively used across various groups. Nevertheless, certain groups may present more challenges; for instance, individuals wearing heavy eye makeup, such as mascara or eyeliner, might complicate pupil detection. Similarly, the facial structure commonly found in Asian ethnicities could pose difficulties as the eyelid tends to obscure the pupil. Is there anything else I can help you with?

user-10bc7b 09 November, 2023, 03:07:48

Hi team, I searched for this error "pkg-config is required for building PyAV", but my case is different. Even after successfully executing the command "python3.8 -m pip install pkgconfig", the error won't go away when executing "python3.8 -m pip install -r requirements.txt". The OS platform is Ubuntu 18.04. Any suggestions?

user-fafdae 09 November, 2023, 07:11:35

Hi! I ran into a stubborn problem: when I am compiling the Pupil Core program, Python reports the error "FileNotFoundError: Could not find module 'C:\ProgramData\Anaconda3\envs....\pupil_apriltags\lib\apriltags.dll'". I have tried adding 'add_dll_directory' before using the 'pupil_apriltags' library, but it still doesn't work. Do you have any suggestions?

nmt 09 November, 2023, 09:02:46

Hi @user-fafdae πŸ‘‹. May I ask why you don't wish to use our pre-compiled bundles?

user-da5552 09 November, 2023, 08:39:13

When I try to calibrate, I get this error message. What should I do?

Chat image

nmt 09 November, 2023, 09:17:19

Hi @user-da5552! This message means that there are no pupil detection data. To calibrate, you'll need to ensure the eye cameras are positioned optimally and pupil detection confidence is high. Check out the getting started guide and let me know if you run into any further issues: https://docs.pupil-labs.com/core/#_3-check-pupil-detection

user-b5484c 09 November, 2023, 11:24:04

Hi Neil...

Thank you. When I use the Player, my exported annotations.csv file is empty (only the header is there).

To clarify: for my annotations I'm always sending the same string {"topic":"annotation", "label":"myclock", "timestamp": 2.14} every two seconds. In Pupil Capture, my annotations clearly arrive and have appropriate timestamps (see CaptureAnnotationSaving.png); I'm not sure why they are negative, but they are clearly ~2 s apart.

Now when I load my capture directory into PupilPlayer I can also see that the right number of annotations is loaded, but as you can see in PlayerAnnotationExporting.png they all have the same frame number, which already indicates that there is something wrong with the data I'm saving. Indeed when I play the data, all annotations are shown at the end of the session, and all at once. Then when I export the data the annotation.csv file is empty.

The only things I need are the timestamps that appear in the terminal of Pupil Capture (as in CaptureAnnotationsSaving.png). Is this possible?

PS: I saw in one of your synch examples that you go back and forth getting timestamps from the pupil capture and estimating clock offsets between your device and PupilCapture, but I would rather avoid this solution; especially if I can have directly the timestamps being saved at the time the annotations arrive.

Thank you in advance.

Chat image Chat image

user-908b50 09 November, 2023, 20:06:45

Hi, I wanted to confirm: the vector dispersion formula in the code outputs degrees of dispersion, right?

user-908b50 09 November, 2023, 20:07:19

Based on Blignaut's paper, i identified 1.11 degree as the appropriate parameter for Salvucci's I-DT using their regression model. I want to bounce it off with someone here.

nmt 10 November, 2023, 10:47:36

Correct. It's degrees of visual angle.

nmt 10 November, 2023, 10:44:40

Getting started w. Core

user-13078d 10 November, 2023, 14:10:59

Hello, I'm trying to get the raw data from one eye in real time, without having to record. I'm using this code in Python:

pgr_future = p.pupil_grabber(topic='frame.eye.0', seconds=0.01)
data = pgr_future.result()
frame_bytes = data[0]['raw_data'][0]
frame = [int(byte) for byte in frame_bytes]
print(frame)

This way I'm getting some bytes from the eye camera. However, the number of values in the variable 'frame' is smaller than the set frame dimensions which is 129x129. I don't know what I'm doing wrong... How could I get the signal captured by the eye camera?

nmt 10 November, 2023, 14:17:56

This script shows how you can grab eye video frames: https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames_with_visualization.py
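
The core of that script boils down to something like the following sketch (untested here; it assumes Pupil Capture is running locally with the Network API enabled and uses the Frame Publisher plugin to stream decoded BGR frames):

import msgpack
import numpy as np
import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

# ask Capture to start the Frame Publisher plugin with raw BGR output
notification = {"subject": "start_plugin", "name": "Frame_Publisher", "args": {"format": "bgr"}}
pupil_remote.send_string("notify." + notification["subject"], flags=zmq.SNDMORE)
pupil_remote.send(msgpack.dumps(notification, use_bin_type=True))
pupil_remote.recv_string()

pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.subscribe("frame.eye.0")  # use "frame.world" for the scene camera

while True:
    # multipart message: [topic, metadata, raw pixel buffer]
    topic, payload, *frames = subscriber.recv_multipart()
    meta = msgpack.loads(payload, raw=False)
    img = np.frombuffer(frames[0], dtype=np.uint8).reshape(meta["height"], meta["width"], 3)
    print(topic, img.shape)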

user-a6c91e 10 November, 2023, 17:01:57

Hello! I'm a newbie here. Could you help please? I've recorded some trials on visual tracking of moving objects in the real world (so the head may move and so on). Opening the data in "offline_data\gaze-mapping" results in negative norm_pos values. Same in gaze_positions.csv (obviously).

How should I set up the experiment so this problem won't occur too frequently (https://discord.com/channels/285728493612957698/998187988943110257/998480510571528203)? I've done calibrations several times during each trial with the special marker, as demonstrated here: https://www.youtube.com/watch?v=aPLnqu26tWI&ab_channel=PupilLabs

Also, can head position monitoring take into account the shift in the main camera view?

nmt 11 November, 2023, 17:31:10

Hi @user-a6c91e πŸ‘‹. Negative values can occur when gaze leaves the FoV of the scene camera. But generally speaking, this coincides with poor quality data. It's difficult to say why you may encounter this without seeing a recording. Can you share a screen capture of your calibration choreography + some of the recording such that we can take a look?

user-5adff6 11 November, 2023, 10:06:30

Hello, I am currently using Pupil Core for an experiment and I have been successfully sending trial information online from MATLAB to Pupil Capture. However, I would also like to send gaze information from the Pupil Labs recording to MATLAB. I want to implement a fixation break rule such that the trial restarts when fixation is broken. But I would like to use a specific fixation rule from a previous publication, so I want to get gaze location data and then implement fixation break rule on MATLAB (so I don't want to use the fixation detection from Pupil Capture, and I am also using AprilTags so surface tracker plug-in is also available). The MATLAB computer and the Pupil Labs computer are separate, and they currently communicate with an ethernet cable connection. Can I get some advice on the steps to follow? Can I stream the gaze location on the surface defined by AprilTags? I have zero prior experience with LSLs but I need this for my project.

user-cdcab0 11 November, 2023, 11:55:05

Are you already using LSL? If so, here's a fork of the LSL plugin that streams surface gazes. It's still a work in progress and not well tested, but it may work for you (and feedback is welcome!) https://github.com/domstoppable/App-PupilLabs
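
If a small Python relay on the network is an option, surface-mapped gaze can also be read straight from the Network API; here is a rough sketch (it assumes the Surface Tracker plugin is running in Capture, and the IP address is a placeholder for the Capture machine's address):

import msgpack
import zmq

ip = "192.168.1.2"  # replace with the IP of the computer running Pupil Capture

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect(f"tcp://{ip}:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://{ip}:{sub_port}")
subscriber.subscribe("surfaces.")  # gaze mapped onto your AprilTag-defined surfaces

while True:
    topic, payload = subscriber.recv_multipart()
    msg = msgpack.loads(payload, raw=False)
    for gaze in msg["gaze_on_surfaces"]:
        if gaze["on_surf"] and gaze["confidence"] > 0.6:
            print(gaze["norm_pos"])  # gaze position in surface-normalised coordinates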

user-870276 11 November, 2023, 16:59:44

Hey group! When I'm doing eye tracking for patients with abnormally aligned eyes, the gaze is not being detected properly. Is there any way to resolve this issue?

nmt 11 November, 2023, 17:38:33

Hey @user-870276! Can you share a screencapture showing the calibration and some of the recording such that we can provide concrete feedback?

user-870276 11 November, 2023, 17:46:56

Also, just curious: should the eye camera be sideways like image 1, as mentioned on your website, to get the best gaze-detection accuracy? Or like image 2? Which one is best for getting higher accuracy?

Chat image Chat image

user-d407c1 13 November, 2023, 07:50:18

Hi @user-870276 👋 ! Positioning the camera to the side can prevent obstructing the subject's line of sight. However, for some individuals this might not be feasible, due to corneal reflections or pupil obstruction.

Ultimately, the placement of the camera largely depends on the unique facial features of each individual, such as eyelid coverage, iris colour, etc. I would recommend finding the most suitable camera position for each person before starting.

Regarding the first video you mentioned, it appears there's an iris coloboma in one of the eyes. This condition could cause the pupil detection system to struggle with recognising these atypically shaped pupils, in combination with the eyelid partially covering the pupil. Therefore, I suggest either trying different pupil detection parameters/detectors https://github.com/pupil-labs/pupil-community?tab=readme-ov-file#pupil-detector-plugins or focusing solely on the unaffected eye. Independently, you can benefit from fitting the 3D model by asking the subject to rotate their eyes around, like this: https://youtu.be/_1ZRgfLJ3hc

For the second video, where the pupils are notably small, adjusting the minimum size parameter in the pupil detector would be advisable to ensure accurate detection (see the screenshot)

Chat image

user-870276 11 November, 2023, 18:59:13

Added to this, I used the screen calibration method; do you suggest any other calibration methods? Angular accuracy is between 3 and 4 degrees and angular precision is between 0.2 and 0.4 degrees.

user-005de5 13 November, 2023, 14:20:54

Is it possible to extend recording times of the core? We used the invisible and the limiting factor was the battery of the phone, if this is still the bottleneck is it possible to recharge the phone while recording with a power bank?

nmt 13 November, 2023, 15:18:42

Hi @user-005de5 πŸ‘‹. Can you confirm which eye tracker your enquiry is about? Pupil Core or Invisible?

user-1caf5c 13 November, 2023, 14:38:28

Hey, I'm still fairly new at using pupil core though I'm trying to set up the needed plugins for pupillometry. I found the github link above but can't seem to get the plugins to work right. Could anyone help or point me towards a resource detailing the process?

nmt 13 November, 2023, 15:20:32

Hey @user-1caf5c! Pupil Core records Pupil diameter both in pixels and mm. You don't need any plugins for this, it happens automatically. Recommend reading our best practices for pupillometry: https://docs.pupil-labs.com/core/best-practices/#pupillometry

user-005de5 13 November, 2023, 15:19:56

I just noticed my mistake I meant the neon, sorry

nmt 13 November, 2023, 15:23:33

Np! Neon already records for longer than Invisible, up to 4 hours depending on which settings you have enabled in-app. You can also extend the autonomy should you wish with a USB hub: https://docs.pupil-labs.com/neon/glasses-and-companion/companion-device/#using-a-usb-c-hub

user-be0bae 14 November, 2023, 07:59:45

When I open the Pupil Player app to open eye-tracking files, it reports the error: Bad file descriptor (c:\projects\libzmq\src\epoll.cpp:100). HELP!

Chat image

user-d407c1 14 November, 2023, 08:40:47

Hi @user-be0bae ! May I ask a couple of questions to help you? What version of Windows are you using? Did you install the bundle with admin rights?

Here are a few things you can try to solve this issue:

  1. Ensure that the file epoll.cpp exists at the specified path and that you have permission to access it.

  2. Start Pupil Player with admin rights.

  3. It seems like you have multiple drives. Could you try installing the bundle at "C:" and/or avoiding non-unicode characters on the path?

user-6cf287 15 November, 2023, 13:10:14

Hi team, I keep getting this warning: uvc: Could not set Value. 'Backlight Compensation'. What does this mean, and would it affect the eye measurements? Thank you

nmt 16 November, 2023, 02:37:44

Hi @user-6cf287 πŸ‘‹. Please see this message for reference: https://discord.com/channels/285728493612957698/285728493612957698/818442406159843349

user-ffe6c5 15 November, 2023, 13:13:25

hi everyone, i have a question: Pupil Core uses the dark pupil method, in which the reflected IR light from the pupil is not captured by the eye cameras. What do the cameras capture then?

nmt 16 November, 2023, 02:39:50

The cameras capture IR images of the pupils, which are segmented with a pupil detection algorithm. You can read more about that here: https://arxiv.org/abs/1405.0006

user-c9af80 16 November, 2023, 09:11:38

How and where do I set the mapper?

user-d407c1 16 November, 2023, 09:41:18

Hi @user-c9af80 ! Would you mind elaborating on your question? Which mapper are you looking to set up, and what product are you using (is it Pupil Core)? Have you seen our guide on choosing the right gaze mapper? https://docs.pupil-labs.com/core/best-practices/#choose-the-right-gaze-mapping-pipeline

user-956f89 17 November, 2023, 09:13:59

Hi, I did my plotting successfully following your coding instructions, but I would like some support on how to change the code to get "pupil_timestamp_datetime" on the X axis instead of "pupil_timestamp". What should I change in this script to plot diameter_3d against pupil_timestamp_datetime?

import matplotlib.pyplot as plt

plt.figure(figsize=(16, 5))
plt.plot(eye0_df['pupil_timestamp'], eye0_df['diameter_3d'])
plt.plot(eye1_df['pupil_timestamp'], eye1_df['diameter_3d'])
plt.legend(['eye0', 'eye1'])
plt.xlabel('Timestamps [s]')
plt.ylabel('Diameter [mm]')
plt.title('Pupil Diameter')

user-d407c1 17 November, 2023, 09:21:33

Hi @user-956f89 ! The line plt.plot(eye0_df['pupil_timestamp'], eye0_df['diameter_3d']) is in essence plot(x,y), so, if you change the first parameter you will get what you are looking for.

Dataframes are kind of like tables; if you want to know what columns you have in your eye0_df "table", you can use print(eye0_df.columns) https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.columns.html

user-956f89 17 November, 2023, 09:26:24

Hi, thank you for your answer

user-956f89 17 November, 2023, 09:26:43

I already did that before but was not successful

user-956f89 17 November, 2023, 09:26:45

import matplotlib.pyplot as plt

plt.figure(figsize=(16, 5))
plt.plot(eye0_df['pupil_timestamp_datetime'], eye0_df['diameter_3d'])
plt.plot(eye1_df['pupil_timestamp_datetime'], eye1_df['diameter_3d'])
plt.legend(['eye0', 'eye1'])
plt.xlabel('Timestamps [s]')
plt.ylabel('Diameter [mm]')
plt.title('Pupil Diameter')
plt.show()

user-956f89 17 November, 2023, 09:27:21

can you tell if i have written something wrong in the script

user-d407c1 17 November, 2023, 09:29:59

Without seeing the output or the error it is hard to know. Can you add the print columns command beforehand, to check what columns you have there?

user-956f89 17 November, 2023, 09:30:57

ok

user-956f89 17 November, 2023, 09:33:02

print(eye0_df.columns) Index(['Unnamed: 0', 'pupil_timestamp', 'world_index', 'eye_id', 'confidence', 'norm_pos_x', 'norm_pos_y', 'diameter', 'method', 'ellipse_center_x', 'ellipse_center_y', 'ellipse_axis_a', 'ellipse_axis_b', 'ellipse_angle', 'diameter_3d', 'model_confidence', 'model_id', 'sphere_center_x', 'sphere_center_y', 'sphere_center_z', 'sphere_radius', 'circle_3d_center_x', 'circle_3d_center_y', 'circle_3d_center_z', 'circle_3d_normal_x', 'circle_3d_normal_y', 'circle_3d_normal_z', 'circle_3d_radius', 'theta', 'phi', 'projected_sphere_center_x', 'projected_sphere_center_y', 'projected_sphere_axis_a', 'projected_sphere_axis_b', 'projected_sphere_angle', 'pupil_timestamp_unix', 'pupil_timestamp_datetime'], dtype='object')

user-956f89 17 November, 2023, 09:34:55

this is the out put

Chat image

user-d407c1 17 November, 2023, 09:48:21

I see, so the issue is not that it is not plotting , but rather that you can not read the labels, right? you can use plt.xticks(rotation=90) https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.xticks.html to rotate the labels

user-956f89 17 November, 2023, 09:35:15

as you see no datetime in the X axis

user-956f89 17 November, 2023, 09:49:39

where should i write plt.xticks(rotation=90)

user-956f89 17 November, 2023, 09:50:04

before writing plt.show()?

user-d407c1 17 November, 2023, 09:52:24

yes

user-956f89 17 November, 2023, 09:59:48

Chat image

user-956f89 17 November, 2023, 09:59:56

same thing

user-956f89 17 November, 2023, 10:00:33

i think because of high number of values in the axis

user-956f89 17 November, 2023, 10:09:07

how can i reduce the axis values to every 10 seconds for example instead of plotting every second.

user-d407c1 17 November, 2023, 10:12:04

You can use the xaxis.set_major_locator() function to define that.
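
A small sketch of how that could look for ticks every 10 seconds (assuming pupil_timestamp_datetime is, or can be parsed as, a proper datetime column):

import matplotlib.dates as mdates
import matplotlib.pyplot as plt
import pandas as pd

eye0_df['pupil_timestamp_datetime'] = pd.to_datetime(eye0_df['pupil_timestamp_datetime'])

fig, ax = plt.subplots(figsize=(16, 5))
ax.plot(eye0_df['pupil_timestamp_datetime'], eye0_df['diameter_3d'])
ax.xaxis.set_major_locator(mdates.SecondLocator(interval=10))   # one tick every 10 s
ax.xaxis.set_major_formatter(mdates.DateFormatter('%H:%M:%S'))
plt.xticks(rotation=90)
plt.ylabel('Diameter [mm]')
plt.show()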

user-956f89 17 November, 2023, 10:23:03

sorry to bother you Mgg

user-956f89 17 November, 2023, 10:23:47

please help me how to write it according to the table i have written

user-956f89 17 November, 2023, 10:23:49

      pupil_timestamp_datetime         eye_id  confidence  norm_pos_x  norm_pos_y  diameter_3d
1238  2023-06-24 14:53:00.981393408    0       0.819       0.458       0.615       1.214
1240  2023-06-24 14:53:00.992414208    0       0.915       0.459       0.614       1.216
1241  2023-06-24 14:53:01.001674240    0       0.820       0.460       0.613       1.236
1243  2023-06-24 14:53:01.012230144    0       0.890       0.458       0.615       1.191
1245  2023-06-24 14:53:01.019725312    0       0.994       0.458       0.615       1.207
1250  2023-06-24 14:53:01.030388224    0       0.901       0.459       0.615       1.187
1252  2023-06-24 14:53:01.038635264    0       0.731       0.459       0.615       1.189
1255  2023-06-24 14:53:01.047466240    0       1.000       0.459       0.615       1.205
1257  2023-06-24 14:53:01.059594240    0       0.971       0.458       0.615       1.201
1260  2023-06-24 14:53:01.067557376    0       1.000       0.459       0.615       1.200

user-956f89 17 November, 2023, 10:24:34

i want to plot this table

user-956f89 17 November, 2023, 10:24:52

and not all the value

user-4c48eb 18 November, 2023, 21:41:30

Hello, I'm setting up an experiment where the subjects should:
  1. Follow a dot moving sinusoidally
  2. "Find Waldo"
  3. Find the differences in an image

I am having some problems with the last two. For testing purposes, I'm just following the dot and looking at the edge of the images, but especially in trial 2 I get an offset which I can't really explain. Could someone help?

EDIT: I'm using the surface tracker and analyzing the gaze positions. I got this by recording the video and then using Pupil Player. The calibration consistently shows an accuracy of >2 degrees (is that a lot?). Don't mind the legend on the graph: fixations are the red dots and samples the blue stars.

Chat image

user-4c48eb 18 November, 2023, 21:43:49

Chat image

user-cdcab0 19 November, 2023, 03:50:38

Have you inspected surface_tracker/gui.py? Have a look at this class: https://github.com/pupil-labs/pupil/blob/e9bf7ef1a4c5f2bf6a48a8821a846c5ce7dccac3/pupil_src/shared_modules/surface_tracker/gui.py#L620

user-6072e0 19 November, 2023, 10:03:49

From what I see in that function, I don't think you're exposing the image data directly. You create a 4x4 transformation matrix stored in the trans_mat variable and apply it using the world_tex.draw() function, maybe? Correct me if I'm wrong πŸ˜… . Can you tell me what each component represents in the 4x4 OpenGL transformation matrix? Or maybe how to convert the OpenGL matrix to a 3x3 OpenCV matrix so that I can do the transformation and get the surface tracker image?

nmt 20 November, 2023, 02:57:11

Hi @user-4c48eb! Usually an unexpected offset indicates a sub-optimal calibration and/or pupil detection, but it's difficult to say for sure without seeing those things. Can you share a screen capture of your calibration choreograpy?

user-4c48eb 20 November, 2023, 08:16:52

Unluckily I don't have one 😦 I'll do a new one ASAP and if the problem persists I'll come back to you. Thanks for the reply!

user-cdcab0 20 November, 2023, 05:27:59

Something like this should work for you. As a quick-and-dirty test, I just put it in Surface_Window.gl_display_in_window, so that I could easily turn it on and off by opening the surface window and closing it. You'll definitely find a better spot for it though

import cv2
import numpy as np

output_size = 512

# get the corners of the surface in the original image
norm_corners = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=np.float32)
img_corners = self.surface.map_from_surf(
    norm_corners, self.tracker.camera_model, compensate_distortion=False
)

# define the corners of our output image
output_corners = np.array(
    [[0, 0], [output_size, 0], [output_size, output_size], [0, output_size]],
    dtype=np.float32,
)

# calculate the perspective transformation from scene-image corners to output corners
output_trans_mat = cv2.getPerspectiveTransform(img_corners, output_corners)

# apply the transformation to the current scene frame
surface_image = cv2.warpPerspective(
    self.tracker.current_frame.img, output_trans_mat, (output_size, output_size)
)

# save the rectified surface image
cv2.imwrite(f'surface-image-{self.tracker.current_frame.index}.png', surface_image)

user-6072e0 20 November, 2023, 14:00:02

Ok thank you for the answer πŸ‘πŸ‘Œ

user-1aeacd 20 November, 2023, 10:52:59

Dear people, as you can see in the screenshot, the brightness of the captured eyes is way too high. Does someone know how I can change that? Thank you for your time :)

Chat image

user-d407c1 20 November, 2023, 10:55:40

Hi @user-1aeacd ! Click on the camera icon at the sidebar, there you will find the options to change the exposure and gain.

user-1aeacd 20 November, 2023, 11:01:05

Perfect It works!! Thank you!!:)

user-1aeacd 20 November, 2023, 11:16:48

When I play with the absolute exposure time of the world-view camera, it seems to give good results at 126. However, I would like to set it to auto, but when I try to, an error message displays saying "WORLD: Could not set value. Auto exposure mode".

user-1aeacd 20 November, 2023, 11:18:11

Chat image

user-8825ab 20 November, 2023, 11:28:42

Hi I am trying to upload all the videos but it seems like I cannot do it because it keeps loading and after 3 days it's still 0%

nmt 20 November, 2023, 13:03:42

You'll want to choose aperture priority mode for auto adjustments (apologies for the confusing naming scheme)

user-1aeacd 22 November, 2023, 19:31:56

Thank you:)

user-908b50 20 November, 2023, 20:07:01

Is there a guide to pupillometry analyses that someone can share?

user-6cf287 21 November, 2023, 09:59:46

Hi team, I noticed that the pupil timestamp and gaze timestamp are recorded differently, as shown. I would like to understand the reason for these different timestamps, and is it possible to synchronize them? For the pupil timestamps there are duplicates, and I understand that this is due to the method being used, either 2D or 3D. Please ignore the different time format; it is somehow getting inconsistent, due to the European number format I think.

Chat image

nmt 22 November, 2023, 02:36:50

Hi @user-6cf287 πŸ‘‹. Contained within the pupil data are left and right eye streams. The eye cameras operate independently and are not perfectly synchronised. Gaze data are generated using pupil data - we utilise a data matching algorithm to match pairs of eye images for these gaze estimations. You can read more about the entire process here: https://docs.pupil-labs.com/developer/core/overview/#pupil-data-matching

Additionally, you may want to alter the formatting in your spreadsheet software: https://discord.com/channels/285728493612957698/1149399816149925888/1149717927944278128

user-6cf287 22 November, 2023, 13:07:57

Hi Neil thanks for the explanation but I don't understand why the Pupil export started saving the timestamp as integers instead of decimals. Is this because of some settings in the Pupil player export? As I have downloaded the data before and did not face this issue but it is happening now. Also the person did not explain how he/she changed the formatting, any tips on that? thanks!

user-be0bae 22 November, 2023, 11:34:55

Hi! Could you please explain how the coordinates for the gaze position are generated? I'm working on creating heatmaps and would like to understand what the x_norm, y_norm, x_scaled, y_scaled coordinates represent.

user-be0bae 22 November, 2023, 11:35:39

Chat image

user-d407c1 22 November, 2023, 12:17:47

Hi @user-be0bae ! Are you using the surface tracker? I think this tutorial does exactly what you are looking for. https://github.com/pupil-labs/pupil-tutorials/blob/master/02_load_exported_surfaces_and_visualize_aggregate_heatmap.ipynb

The norm values are normalised to the surface and the scaled values are relative to the size of the surface. Have a look at the coordinate system of Pupil Core here: https://docs.pupil-labs.com/core/terminology/#coordinate-system
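
For reference, the heatmap in that tutorial is essentially built like this (a sketch; the csv file name depends on your surface name, and the on_surf filter and bin count are choices you may want to adjust):

import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
from scipy.ndimage import gaussian_filter

gaze = pd.read_csv('gaze_positions_on_surface_Surface1.csv')   # exported by the Surface Tracker
gaze = gaze[gaze['on_surf'] == True]                           # keep only gaze that landed on the surface

hist, _, _ = np.histogram2d(gaze['y_norm'], gaze['x_norm'], bins=50, range=[[0, 1], [0, 1]])
heatmap = gaussian_filter(hist, sigma=2)

plt.imshow(heatmap, origin='lower', extent=[0, 1, 0, 1], cmap='jet')  # origin='lower': norm origin is bottom left
plt.show()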

user-be0bae 22 November, 2023, 12:42:40

Thanks for your reply and materials.

nmt 22 November, 2023, 13:25:19

It's nothing to do with Player. Rather, Excel is known to misinterpret .csv data depending on Excel's language settings. For example, 2.5 (2,5 in German notation) is interpreted as 2500 because the . in German is the separator for large numbers. This discussion has some tips on how to solve it: https://stackoverflow.com/questions/11421260/csv-decimal-dot-in-excel

user-6cf287 22 November, 2023, 15:20:35

Thanks, the issue was somehow resolved after re-exporting the data. I have no explanation πŸ˜„

user-fa19c6 22 November, 2023, 14:45:53

Hello, my company is interested in eye-tracking system solutions and we have some questions about your Core product. Is it possible to chat ? πŸ™‚

user-d407c1 22 November, 2023, 14:50:07

Hi @user-fa19c6 ! Sure! What would you like to know? To request an online demo, please contact info@pupil-labs.com

user-fa19c6 22 November, 2023, 14:51:40

Hello @user-d407c1 , thanks!
  • Does the Core product have an integrated microphone?
  • What analysis software is available for this product?
  • Is it possible to have a real-time analysis solution?
  • Is it possible to record analysis sessions? If so, are they saved to an SD card or in the cloud?

user-d407c1 22 November, 2023, 15:07:48

Thanks for following up! Pupil Core (https://pupil-labs.com/products/core) does need to be tethered to a computer, and it is on this computer that the data is stored (so no SD card or cloud).

After recording, you can load the recording into Pupil Player (https://docs.pupil-labs.com/core/software/pupil-player/) for analysis.

It does not have a built-in microphone, although it can be synced with an external audio source using LSL.

Regarding real-time analysis, there are many parameters that can be accessed in real time. Could you specify which parameters you're particularly interested in? This will help us provide more targeted information.

If you are looking for something more portable and with stereo microphones built in, you may want to check out Neon https://pupil-labs.com/products/neon

user-1aeacd 22 November, 2023, 19:37:23

Dear Community,

I am currently developing an "Intelligent User Interface with Head-mounted Eye Tracker." To enhance its functionality, I need to implement a feature that streams images with eye gaze markers, similar to those seen in Pupil Capture, to other parts of my application.

Could you please advise if there exists any existing helper function or utility within the Pupil Capture ecosystem that facilitates this? Or would it be necessary for me to custom-develop this functionality?

I appreciate your time and assistance:) Thank you!

user-cdcab0 30 November, 2023, 11:56:34

Hi, @user-1aeacd - did you find what you need? Your project sounds very interesting and I'd love to hear more about it

user-fa19c6 23 November, 2023, 08:40:23

Thank you! It could be used in different cases: indoor without natural light, indoor with natural light, outdoor... Our needs are varied, but we need to analyze what the user sees, where their gaze focuses, and for how long, with real-time verbatims if possible. The scenarios could involve the user looking at screens or at physical products.

user-d407c1 23 November, 2023, 09:48:46

Hi @user-fa19c6 ! thanks for following up, I think it can be beneficial to set up a demo and Q&A video call to discuss these cases and how it could fit. Please send an email to info@pupil-labs.com and we can look for a time that suits you.

user-c39646 23 November, 2023, 12:39:46

Hello, thank you for the opportunity to ask you questions! I would like to buy the software and the eye tracker that goes with it... I have a few questions:

user-d407c1 23 November, 2023, 12:52:53

Hi @user-c39646 ! Which eye-tracker are you interested most in? Pupil Core? The software to record and analyse is included in the price, no matter which, and there are no subscriptions.

We do not offer software for stimulus presentation directly, but you can use our products with PsychoPy https://psychopy.org/api/iohub/device/eyetracker_interface/PupilLabs_Core_Implementation_Notes.html or present with Psychtoolbox and send annotations https://docs.pupil-labs.com/neon/real-time-api/track-your-experiment-in-matlab/

user-c39646 23 November, 2023, 12:40:31

Is there any shared code for use with Python or MATLAB that allows me to show visual stimuli?

user-c39646 23 November, 2023, 13:55:41

Thank you very much! I'm interested in the Vive Pro full kit, with which I will have to record pupil dilation.

user-c39646 23 November, 2023, 13:56:12

Could you also tell me where I can find the specifications and requirements for a computer to ensure the best performance of the software, please?

nmt 27 November, 2023, 03:05:37

@user-d407c1 responded here: https://discord.com/channels/285728493612957698/285728635267186688/1177248889351446668

user-956f89 24 November, 2023, 12:04:48

mgg

user-04c6a4 27 November, 2023, 09:50:02

May I ask if there are any learning materials for the code of this project? I would like to call the gaze estimation module and annotate the gaze direction. How can I modify the code

Chat image

nmt 28 November, 2023, 11:38:18

Unfortunately we don't have documentation for this. May I ask why you need to run pye3d standalone?

user-a09f5d 27 November, 2023, 15:26:23

Hi, I am planning to use the Pupil Core eye tracker to investigate smooth pursuit movements as a subject follows a moving target in 3D space. For example, the target may start in the bottom left corner of the field of view at a distance of 50 cm from the observer, and then move to the top right corner of the field of view at a distance of 400 cm from the observer. I was wondering if you had any tips on the best way to perform the calibration to ensure accurate gaze estimation? I plan to use a physical printed calibration marker.

user-f76a69 27 November, 2023, 16:04:39

Hi, I've been running into an error when trying to connect to the Core glasses via Matlab. I've installed zmq, zmq master, and msgpack as pointed out in pupil-helpers. After plenty of troubleshooting I finally got the mex files for the above compiled. I'm now running into a new error when trying to run pupil remote: error using recv "resource temporarily unavailable". Any guidance would be greatly appreciated.

user-cdcab0 30 November, 2023, 06:00:09

Hi, @user-f76a69 - is there more of the Matlab output you can share? Also, what version of Matlab are you using?

user-870276 27 November, 2023, 20:10:51

Hi, I'm doing the experiment on a kitchen top, which involves slight head and body movements to reach the objects. For this setting, does using the surface tracker and head-pose tracking increase the accuracy of gaze and fixations?

nmt 28 November, 2023, 11:44:06

It won't improve the gaze accuracy, unfortunately, as they are separate entities. Good quality pupil detection and calibration are key here.

nmt 28 November, 2023, 11:40:05

Hey @user-a09f5d! The main consideration really is ensuring good pupil detection, and then performing a calibration that covers the field of view of what you want to measure. Have you already collected some pilot data? We'd be happy to take a look at your recording and provide some feedback.

user-a09f5d 29 November, 2023, 16:22:50

Hi @nmt The experiment is still in development but I am hoping to have some pilot data soon, at which point I would be happy to share a sample with you to check the calibration is as good as it can be. That would be fantastic. Thank you!

user-a09f5d 07 December, 2023, 15:36:28

Hi @nmt If the offer still stand, what is your preferred method (I assume email) for me to send you an example recording?

user-a09f5d 11 January, 2024, 21:05:37

Hi @nmt Following up on this message from a few months ago. I now have some pilot recordings and unfortunately the gaze estimation is not as good as we hoped/need. This might be in part due to the method of calibration (using a physical printed calibration target). If the offer still stands it would be very helpful if you or your team are able to offer some feedback on a sample recording? If so, what is the best way to send the recording (it is a large file)?

nmt 28 November, 2023, 11:40:57

Which example specifically from the Pupil Helpers repo are you trying to run?

user-f76a69 29 November, 2023, 09:17:13

I was trying to run pupil_remote_control, as I'm hoping to be able to remotely start a recording from a separate PC.

nmt 28 November, 2023, 11:43:20

Check out this message for reference: https://discord.com/channels/285728493612957698/285728493612957698/1145994078161485824

user-6cf287 28 November, 2023, 12:21:51

Hi team, I would like to know if there are any tips on syncing the eye tracker timestamps with the timestamps from another device B. Device B has a lower update frequency than the eye tracker. My plan is to post-process the collected data and keep only a resolution of 1 second for every data point. In this case, I am wondering whether I should use the mean of the eye tracker recording or just take the closest data point that matches device B's timestamp? Thanks

nmt 29 November, 2023, 10:30:01

Hi @user-6cf287! My first question here: what clock did device B collect its timestamps from? Pupil Core uses its own clock to produce 'Pupil Time', so it might not be as simple to correlate the two as you think.

user-1f606a 28 November, 2023, 16:35:58

Hi, I am currently trying to access the world camera frames of the Pupil Labs Core in Python and display them with OpenCV in real time (as you would with a webcam). My intention is to use the world camera frames for some object recognition tasks. I came across the Network API, but unfortunately I couldn't find anything helpful regarding accessing the world camera data. Maybe I missed something? Another solution is implementing a plugin with the Plugin API. However, my question is whether there is a simpler solution. Thanks in advance.

user-480f4c 28 November, 2023, 17:30:03

Hi @user-1f606a πŸ‘‹πŸ½ ! OpenCV is known to be unreliable, often leading to frame drops. We recommend pyav (https://pyav.org/docs/stable/) instead. We also have some tutorials and helpers that might be helpful:

user-6cf287 29 November, 2023, 11:51:37

Hi Neil, device B is connected to the computer as well and the data is also recorded using unix timestamp

nmt 30 November, 2023, 03:45:33

In that case, you'd first need to convert Pupil time to unix/system time: https://docs.pupil-labs.com/core/developer/#convert-pupil-time-to-system-time. It would then be feasible, as you suggest, to find the nearest timestamps in each dataset.

That said, system/unix time is known to drift. For the best accuracy, you might want to look at our Lab Streaming Layer plugin: https://github.com/labstreaminglayer/App-PupilLabs/tree/master/pupil_capture
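
As a sketch of the nearest-timestamp matching with pandas (column and file names here are assumptions, and both tables are assumed to already be converted to Unix time):

import pandas as pd

eye = pd.read_csv('pupil_positions_unix.csv').sort_values('timestamp_unix')
dev_b = pd.read_csv('device_b.csv').sort_values('timestamp_unix')

# for every device-B sample, pick the nearest eye-tracker sample in time
merged = pd.merge_asof(
    dev_b, eye,
    on='timestamp_unix',
    direction='nearest',
    tolerance=0.5,   # seconds; samples further apart than this stay unmatched
)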

user-b0efbb 29 November, 2023, 12:55:11

Greetings 👋. My apologies if this question has been asked many times. I have an annotation for each relevant fixation in Pupil Player. However, on export the csv file does not offer a fixation duration column. Is there a way to export both the annotation and the fixation duration in one csv file? This would save me a lot of time over manually adding them from the fixations csv!

user-b0efbb 29 November, 2023, 17:14:40

Never mind, I have managed to get it done through JupyterLab! πŸ˜„

user-f76a69 30 November, 2023, 08:57:49

Hi @user-cdcab0 , I'm using 2023b on Windows 11. The rest of the output looks like this:

Error using recv
Resource temporarily unavailable.
Error in pupil_remote_control (line 33)
Result = zmq.core.recv(socket);

user-cdcab0 30 November, 2023, 09:03:17

Thanks! And, just to be sure, you have the Network API plugin enabled and the port numbers match?

user-f76a69 30 November, 2023, 09:07:09

@user-cdcab0 I do πŸ™‚

user-cdcab0 30 November, 2023, 11:53:55

You mentioned in your first post that you are using the ZMQ packages referenced in our Pupil Helpers repo, but other users have reported that those do not work with Matlab 2023. Have a look here for a potential solution: https://discord.com/channels/285728493612957698/446977689690177536/1123432094526361753

user-6cf287 30 November, 2023, 10:13:12

thank you. yes we already converted the pupil time using the code snippet in the link you provided. If i understood correctly the LSL plugin should be included in pupil capture before recording right? as we have already done the recordings, it seems like we have to go with choosing the nearest point for time matching...

nmt 30 November, 2023, 10:40:09

Yes, if the recordings are already made then you'll have to use the matching approach

user-4c48eb 30 November, 2023, 11:05:21

Hello, I am trying to track gaze and fixations on a surface. With the 3D gaze model I had a lot of error (approx. 3 deg), so I switched to 2D. The problem is that the post-analysis of the Pupil Player data shows some gaze points where I was not looking. You can see the spikes in the bottom left corner. Do you know how to fix this?

Chat image

user-4c48eb 30 November, 2023, 11:06:07

What's interesting is that in the video the gaze dot doesn't go there

user-f76a69 30 November, 2023, 11:55:26

I actually used that fork mentioned in that channel in order to get the mex compiling process for zmq working.

user-cdcab0 30 November, 2023, 12:01:43

Are you running Matlab and Pupil Capture on the same PC? If not, tell me about your network configuration

Also, can you share your code?

user-f76a69 30 November, 2023, 12:04:36

I am running Matlab on one PC and Pupil Capture on a separate laptop, we created a local network via an ethernet cable. We are using the code as is on the Pupil Helpers repo. It's important for our experiment setup to run Matlab on one PC and Pupil capture on a separate laptop.

user-cdcab0 30 November, 2023, 12:06:23

Aw, that code as-is assumes both are running on the same PC. You need to modify the endpoint so that it matches the IP address of the computer running Pupil Capture

user-1aeacd 30 November, 2023, 12:06:01

Hi @user-cdcab0 πŸ™‚ No I have not found what I need yet, do you have any suggestions?:)

user-cdcab0 30 November, 2023, 12:12:46
user-1aeacd 30 November, 2023, 12:24:29

@user-cdcab0 Thank you very much for the help, will try it right away:)

End of November archive