👁 core


user-741d8d 01 December, 2023, 08:21:05

Hello everyone! Can anyone tell me whether it's possible to set, via the network API, which monitor the calibration is displayed on? (I have two monitors and want to start calibration on the second as soon as the one on the main screen is finished.) Thanks in advance!

user-2b1780 02 December, 2023, 10:02:36

Hi, all! I'm new to the whole game. First time using any eye-tracking system, and first time using Pupil Labs Core. I managed to run some test trials and export pupil/gaze position estimations (.csv) using the GUI apps. I have no idea what each variable means beyond wild guesses based on the variable names and a basic understanding (it tries to fit a sphere [eyeball] and a circle on the sphere [pupil]). I'm interested in calculating metrics, preferably in physical units (e.g., saccade amplitude in visual angle). Could someone kindly point me to some technical details on what those variables mean and how they are computed? Thanks! 🤓

nmt 04 December, 2023, 05:18:26

Hey @user-2b1780 👋. You can read an overview of the data made available here: https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter - I'd also recommend reading the rest of the Core docs to learn more about how everything is computed. If you can't find specific details of something in particular, make sure to search for its name here on Discord, as there's a wealth of information in the chat history 🙂
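For anyone computing saccade amplitudes in visual angle from the exported data: the gaze_point_3d_x/_y/_z columns give a 3D point in the scene camera's coordinate system, so the angle between two such points, as seen from the camera origin, is a reasonable amplitude estimate. A minimal sketch - the column names come from the raw-data export docs, but treat this as an illustration, not the official method:

```python
import math

def visual_angle_deg(v1, v2):
    """Angle (degrees) between two 3D gaze points expressed in the
    scene-camera coordinate system (e.g. the gaze_point_3d_x/y/z columns
    of gaze_positions.csv). Useful as a saccade-amplitude estimate."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos_ang = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_ang))

# Example: two gaze points 90 degrees apart as seen from the camera
print(visual_angle_deg((0.0, 0.0, 1.0), (1.0, 0.0, 0.0)))  # ≈ 90.0
```

Applying this between the last sample before a saccade and the first sample after it gives the amplitude in degrees, independent of viewing distance.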

user-7d8d41 03 December, 2023, 19:59:24

I am using the LSL plugin with Core and am able to collect XDF files from LabRecorder. I want to use Python and am trying to use the pyxdf package. Does Python code exist to do such processing? The code I have found for LSL references an LSL CSV file, but I am unsure how to convert/access this since I am using LabRecorder (https://github.com/pupil-labs/pupil-helpers/blob/master/LabStreamingLayer/lsl_inlet.py).

nmt 04 December, 2023, 05:19:37

Hi @user-7d8d41. You'll definitely want to check out this: https://discord.com/channels/285728493612957698/285728493612957698/1070056032304386098
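In case it helps, loading an XDF file from LabRecorder with the pyxdf package is fairly direct. A hedged sketch - the stream name "pupil_capture" in the usage comment is an assumption, so check what name your LSL outlet actually declared:

```python
def load_xdf_streams(xdf_path):
    """Load all streams from an XDF file recorded with LabRecorder.
    Requires the pyxdf package (pip install pyxdf)."""
    import pyxdf
    streams, header = pyxdf.load_xdf(xdf_path)
    return streams

def find_stream(streams, name):
    """Return the first stream whose declared LSL name matches `name`.
    pyxdf stores XML metadata as lists of strings, hence the [0]."""
    for stream in streams:
        if stream["info"]["name"][0] == name:
            return stream
    raise KeyError(f"no stream named {name!r}")

# Usage (the stream name is an assumption -- check your LSL setup):
# streams = load_xdf_streams("recording.xdf")
# gaze = find_stream(streams, "pupil_capture")
# samples, times = gaze["time_series"], gaze["time_stamps"]
```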

nmt 04 December, 2023, 05:12:10

Hi @user-741d8d! This is not possible using the network API. May I ask why you want to calibrate on both screens?

user-741d8d 04 December, 2023, 08:23:44

Thanks for the answer! I developed a system that recognizes which monitor the user is looking at and activates the corresponding screen. Since the documentation recommends calibrating with points that resemble the experimental conditions, I figured I might get better results if I calibrate on both screens.
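For what it's worth: although the network API can't pick the calibration monitor, it can start and stop calibration remotely via Pupil Remote's ZeroMQ REQ socket, which at least lets a script drive two back-to-back calibrations while the marker window is moved between screens by hand. A sketch assuming the default Pupil Remote port 50020 and the documented single-letter commands:

```python
# Single-letter commands documented for Pupil Remote
COMMANDS = {
    "start_recording": "R",
    "stop_recording": "r",
    "start_calibration": "C",
    "stop_calibration": "c",
}

def send_pupil_remote(command, addr="tcp://127.0.0.1:50020", timeout_ms=2000):
    """Send one Pupil Remote command and return Capture's reply string.
    Requires pyzmq (pip install pyzmq) and a running Pupil Capture."""
    import zmq
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.REQ)
    sock.RCVTIMEO = timeout_ms
    sock.connect(addr)
    sock.send_string(COMMANDS.get(command, command))
    reply = sock.recv_string()
    sock.close()
    return reply

# e.g. send_pupil_remote("start_calibration")
```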

user-873c0a 04 December, 2023, 06:51:51

Hello, I have some problems using the head pose tracker function. I have attached multiple AprilTags to the environment. These AprilTags are detected well, but in the 3D modelling process only one AprilTag is incorporated into the model, and the camera poses only surround this one AprilTag; camera poses relative to the other tags cannot be obtained. Could you tell me why?

Chat image

user-cdcab0 04 December, 2023, 07:18:35

Hey, @user-873c0a - are all of your AprilTag markers the same physical size?

user-873c0a 04 December, 2023, 07:24:11

Hi @user-cdcab0, I did use AprilTags of the same size and made sure none of them were duplicates.

user-cdcab0 04 December, 2023, 07:27:36

Are you doing it live with Pupil Capture or on a recording with Pupil Player? Could you share a recording with us?

user-873c0a 04 December, 2023, 07:38:05

I recorded a video and then used Pupil Player to perform the offline head pose tracking. How do I share my recordings with you?

user-cdcab0 04 December, 2023, 07:44:12

Are there multiple markers visible at all times? As the video progresses, the model is constantly refined by re-calculating the relative positions of the AprilTag markers with each other - so marker visibilities need to overlap temporally. If you have that covered as well, you can send your recording to data@pupil-labs.com and I'll take a look

user-873c0a 04 December, 2023, 07:57:34

I have sent recordings

user-873c0a 04 December, 2023, 07:48:38

I think I followed the requirement to ensure that 2-8 AprilTags are visible in the world camera at any given time. I will send a recording to this email address - please take a look, and sorry for the trouble.

user-cdcab0 04 December, 2023, 08:16:59

I looked at your recording and it looks like you need to capture more angles of the markers (with multiple in view simultaneously) before the markers can be successfully incorporated into the model

user-873c0a 04 December, 2023, 08:24:57

If by more angles you mean I need to move around the scene repeatedly, I'll experiment further.

user-cdcab0 04 December, 2023, 08:35:59

Not so much repeatedly, but from a greater variety of angles. In your recording, you mostly move horizontally. In addition to that, you should also move the camera up (while pointing it downward at the markers) and down (while pointing it up at the markers).

Even better than just purely horizontal or purely vertical movement, try moving the camera in large circles as you move around the scene to capture as many unique angles as possible

user-873c0a 04 December, 2023, 08:48:48

Thank you very much for your guidance, I will try further

user-1f8b45 04 December, 2023, 10:57:37

Hi! I am having some issues with the Pupil Core package on my MacBook M1 (Apple silicon). It takes a good long while for any of the apps to open... I haven't found a version of the apps that is specifically compatible with Apple silicon - I've only found the one that is compatible with macOS... any help would be appreciated, thanks!

user-cdcab0 04 December, 2023, 19:35:24

The first time loading each app can be slower than subsequent loads. Are you still experiencing slow load times when you open the app the 2nd, 3rd, etc. time?

There is no M1/M2 native version. The app has a number of lower-level dependencies that are not built for Apple silicon, so it's not really feasible. Furthermore, Apple's Rosetta provides a pretty satisfactory solution.

user-2cc535 04 December, 2023, 13:17:45

Hi, I need to buy the Pupil Core for my Ph.D. project in Milan and could use some help with information. I would also like to participate in your workshops. Can you point me to the right contact for my questions?

user-480f4c 04 December, 2023, 14:02:29

Hi @user-2cc535 👋🏽 ! I've already replied to your email, providing more information on your specific questions. I hope this helps!

user-8a525f 04 December, 2023, 17:43:41

Is pupillometry now working in Pupil Cloud?

user-cdcab0 04 December, 2023, 19:41:48

Yes - see the announcement here: https://discord.com/channels/285728493612957698/733230031228370956/1177542261551140896. If you have further questions about collecting and using data with your Neon, feel free to ask in the 👓 neon channel

user-1f8b45 04 December, 2023, 20:02:00

I will download Apple's Rosetta and try again! Thank you 🙂

user-1f8b45 04 December, 2023, 20:45:35

Ok, I see. In that case I am not too sure how to run the apps through Rosetta... do you have any guide I can refer to? Thank you for all the help.

user-cdcab0 04 December, 2023, 21:00:20

There is nothing you need to do (other than have Rosetta installed). If an Intel-architecture app runs at all on your Apple silicon Mac, it's running through Rosetta.

user-1f8b45 04 December, 2023, 21:30:16

Ok, that still doesn't work unfortunately... I reverted to see if checking that box would help after noticing that the apps didn't work once Rosetta was installed.

user-cdcab0 05 December, 2023, 17:00:14

Not working at all? They were working before, right - just slow to load?

user-c1bd23 05 December, 2023, 12:36:09

Hello. Is anyone using Pupil Cloud having issues with accessing their workspace?

user-c1bd23 05 December, 2023, 12:36:46

I logged in the morning and it's giving me this error when I try to access a project: "Project not found. This project does not exist."

user-c1bd23 05 December, 2023, 12:37:07

I've used incognito mode, cleared cookies etc, deleted and re-created projects

user-c1bd23 05 December, 2023, 12:37:26

My uploaded videos etc are still available in the workspace, I just can't put them into projects

user-c1bd23 05 December, 2023, 12:37:38

or access the projects

nmt 06 December, 2023, 07:25:12

Hi @user-c1bd23! I'm sorry to hear about that - do you mean that even when you create a new project, you're unable to access it at all? If so, could you please reach out to info@pupil-labs.com with a screenshot of the said behaviour, along with the ID of one of the recordings that you can't add to a project?

user-c39646 05 December, 2023, 15:29:37

Hi, I am back here with another question... I hope you can help me. I am using a Vive Cosmos device with Pupil Labs hardware, and I want to set up my pupillometry protocol with Python. Is it possible to display the images I need on the Vive Cosmos goggles using only Python, or do I need to use the Cosmos software? Note that I only have Linux.

nmt 06 December, 2023, 02:50:35

Hey @user-c39646! We have some documentation for presenting stimuli in Unity3D (search for 'hmd-eyes' on here), but not directly with Python. Also, be sure to read our pupillometry best practices: https://docs.pupil-labs.com/core/best-practices/#pupillometry

user-1f8b45 05 December, 2023, 17:37:37

Yes, the apps are still slow to load, and when I tried to use them (I tried the Player) they crashed without giving any error. I had never tried to use the apps before because they were so slow to load (they would take 15-20 min). With Rosetta, they open after a few minutes (5-6 min) - a bit faster - but then crash.

user-cdcab0 05 December, 2023, 18:06:43

In your home folder you should find a folder named pupil_player_settings. Inside that folder you should see a player.log file. Can you share that here?

user-23899a 06 December, 2023, 07:43:31

Good morning 🙂 We have a project with 57 videos and 7 enrichments. We downloaded the enrichment data, so we have 7 folders, one per enrichment. We are working with the fixation files. The problem is that every fixation file in our 7 folders has a different number of fixations - we think the number should be the same, with True or False for each particular enrichment. We need help finding the reason for this.

nmt 06 December, 2023, 07:55:38

Hi @user-23899a 👋. I'll need a little more information to fully grasp the situation. Could you describe in more detail which kind of enrichments you ran, on what data, and why you think the fixations should be equivalent?

user-23899a 06 December, 2023, 07:59:32

Hello Neil, I will try to describe the situation in a more understandable way.

user-23899a 06 December, 2023, 08:00:51

I recorded 57 participants using Pupil Invisible (short ride on the tram sim). Then, in the Pupil Cloud environment, I defined 7 enrichments based on markers. Now, I need to check how many fixations each participant had in each area designated by the markers.

user-23899a 06 December, 2023, 08:03:14

The fixations file contains all the fixations of my participants. I have 7 enrichments, so I have 7 files. I believe each file should contain the same number of fixations. The difference should lie only in the column with either true or false - depending on whether the fixation was inside that area or not. But the total number of fixations should be the same.

user-23899a 07 December, 2023, 06:35:35

I would be glad for an answer.

user-873c0a 06 December, 2023, 09:12:15

Hello, I would like to ask some questions about the Surface Tracker. When defining a monitor screen, we usually use 4 AprilTags, but the documentation for this feature says that two or more AprilTags can define a surface. I would like to ask how 2 AprilTags can define a plane, and how this differs from defining a plane with more AprilTags.

user-cdcab0 06 December, 2023, 09:23:34

Sometimes image noise, weird lighting/shadows, extreme angles, or other effects can cause an AprilTag marker to not be detected. The more tags that define the surface and are detectable by the camera/algorithm, the more accurately the exact location of the surface in the scene can be determined.

So having many AprilTag markers will help ensure more robust surface tracking

user-873c0a 06 December, 2023, 11:53:37

Thank you very much, very clear.

user-01bb59 06 December, 2023, 18:56:29

Is it possible to download Pupil Capture on Windows 11?

nmt 07 December, 2023, 02:44:34

Pupil Capture should work with Windows 11 - be sure to install it with admin rights.

nmt 07 December, 2023, 02:42:50

Pupillometry in hmd

user-1f8b45 07 December, 2023, 11:38:06

Hi, @user-cdcab0 sorry for the delay! Here is the player.log file.

player.log

user-1f8b45 07 December, 2023, 11:42:12

This is the log after I dragged a recording into the Player. It basically disappears after a few minutes and doesn't bring up the window with the video. Sometimes it does save an export folder, but the whole process is incredibly slow; other times the Player crashes.

player.log

user-cdcab0 07 December, 2023, 16:48:13

Hm. Let's try a fresh start. Make sure the apps are closed and then rename the pupil_player_settings folder to pupil_player_settings_backup, launch the app again and try to load a recording

user-ca96f7 07 December, 2023, 15:46:17

Hello, another question: does anybody know if the glasses are CE marked? (Neon, Core, and Invisible)

user-d407c1 07 December, 2023, 17:01:49

Yes! All of them are!

user-4c4fec 08 December, 2023, 15:57:19

Hello everyone, I'm working with the Pupil Core and using Pupil Player to extract data on eye movement. I'm thinking of putting the data through a MATLAB script that generates a saliency map for me. However, I'm confused about the difference between fixations.csv and fixations_on_surface.csv. Is someone able to explain it to me? Thank you very much for the help.

user-4c4fec 08 December, 2023, 15:58:41

To clarify my question: I see more values in fixations_on_surface.csv than I do in fixations.csv, with multiple entries for the same fixation_id in fixations_on_surface.csv. How does this work?

user-cdcab0 08 December, 2023, 17:37:21

What if your eyes do not move but the surface does? In that case, it makes sense to me that the fixation ID stays the same (since the eyes haven't moved), but surface gazes need multiple entries (since the gaze position on the surface has changed)
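To collapse fixations_on_surface.csv back to one row per fixation (e.g. for a saliency map that only needs a single location per fixation), you can group by fixation_id. A sketch, assuming each row is a dict keyed by the export's fixation_id and surface-normalized position columns:

```python
from collections import defaultdict

def one_row_per_fixation(surface_rows):
    """Collapse fixations_on_surface rows to a single entry per fixation
    by averaging the surface-normalized position across duplicate rows."""
    grouped = defaultdict(list)
    for row in surface_rows:
        grouped[row["fixation_id"]].append(row)
    collapsed = []
    for fid, rows in grouped.items():
        collapsed.append({
            "fixation_id": fid,
            "norm_pos_x": sum(r["norm_pos_x"] for r in rows) / len(rows),
            "norm_pos_y": sum(r["norm_pos_y"] for r in rows) / len(rows),
        })
    return collapsed
```

Averaging is just one choice; taking the first or last row per fixation is equally defensible depending on how much the surface moved.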

user-c075ce 12 December, 2023, 13:21:38

Hello everyone,

I would like to know if there is an android application for smartphone that works with pupil labs glasses core version? Thanks in advance.

Best regards, Azamat

user-480f4c 12 December, 2023, 14:46:02

Hi @user-c075ce 👋🏽 ! Pupil Core connects to a laptop and runs the Pupil Core software (https://docs.pupil-labs.com/core/getting-started/). What prompted your interest in whether Pupil Core connects to a smartphone? Are you exploring different options or do you have specific requirements that a smartphone-connected eye tracker would better meet?

user-301f9a 12 December, 2023, 15:01:40

Heya people, where can I find Pupil Capture to download? Every link leads to the source code for the software, but I want to download it.

user-480f4c 12 December, 2023, 15:11:13

Hi @user-301f9a ! To download it, go to this link https://github.com/pupil-labs/pupil/releases/tag/v3.5#downloads, scroll down to Assets, and there you'll find the different bundles for macOS, Windows, and Linux.

user-6cf287 12 December, 2023, 15:51:00

Hi team, I have multiple recordings and I would like to export the data. Is there a way to run the export script without having to drag and drop each folder into Pupil Player before exporting? Thanks.

user-480f4c 13 December, 2023, 06:44:53

Hi @user-6cf287 👋🏽 ! There is no batch export built-in to Player. However, there is a community contributed tool for exporting recorded data from multiple recordings without starting Player. You can find it here: https://github.com/tombullock/batchExportPupilLabs

user-0aefc0 13 December, 2023, 12:07:31

Hi, I am currently working on experiments in our lab using Pupil Neon, specifically focusing on typical field operator tasks. While exploring the exported data, I couldn't locate information on the 3D gaze vector and its origin in any of the files.

Could you kindly provide guidance on accessing or extracting the 3D gaze vector and its origin data from the Pupil Neon export files?

user-d407c1 13 December, 2023, 12:51:41

Hi @user-0aefc0 ! 3D eye states, as well as pupillometry, are currently only available in Cloud. https://docs.pupil-labs.com/neon/data-collection/data-streams/#_3d-eye-states

From there, you can download the CSV files and refer to https://docs.pupil-labs.com/neon/data-collection/data-format/#_3d-eye-states-csv

May I ask, are you using the data exported from the phone?

user-6586ca 13 December, 2023, 13:30:45

Hi Pupil Team. I hope this message finds you well!

We are conducting a study exploring visual search behavior, specifically focusing on participants' gaze spatial distribution. We would like to work with the "gaze_positions" file for our analysis (we don't have surfaces in this study). However, we are uncertain about which columns hold the X and Y coordinates of the gazes, and specifically about the distinction between the norm_pos_x/_y and gaze_point_3d_x/_y/_z columns.

In essence, my question revolves around discerning the nuanced disparities between the terms "position in the world image frame" (relating to the "norm_pos_x/_y" columns) and "point in the world camera coordinate system" (relating to the "gaze_point_3d_x/_y/_z" columns) that you use in your user guide.

Additionally, my second question concerns the point of reference for these X and Y coordinates. For example, is it the center of the calibration zone? This question arises from our observation of negative values in the mentioned columns.

Thank you in advance for your consideration of our inquiries.

user-cdcab0 13 December, 2023, 18:40:08

Hi, @user-6586ca - sorry for the confusion. Have you seen this page in the documentation? https://docs.pupil-labs.com/core/terminology/#coordinate-system
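As a short summary of that page: norm_pos_x/_y are 2D positions in the world image with the origin at the bottom-left (values fall outside 0-1 when gaze leaves the frame), while gaze_point_3d_x/_y/_z is a 3D point in millimetres in the scene camera's coordinate system, with the origin at the camera itself - which is why negative values are perfectly normal. A small conversion sketch for plotting gaze over world-video frames:

```python
def norm_pos_to_pixels(norm_x, norm_y, frame_width, frame_height):
    """Convert Pupil Core normalized image coordinates (origin at the
    bottom-left of the world frame) to pixel coordinates (origin at the
    top-left, as most image tools expect). Values outside [0, 1] simply
    map to pixels outside the frame."""
    return norm_x * frame_width, (1.0 - norm_y) * frame_height

print(norm_pos_to_pixels(0.5, 0.5, 1280, 720))  # → (640.0, 360.0)
```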

user-e518ed 13 December, 2023, 15:09:37

Hi Pupil Team, I just got the Pupil Core but I can't find where to download the Pupil Capture software (https://docs.pupil-labs.com/core/software/pupil-capture/). Could you help me? (I am so sorry, but this must be the dumbest question you have ever received.)

user-d407c1 13 December, 2023, 15:21:47

Hi @user-e518ed ! You can download it here: https://github.com/pupil-labs/pupil/releases/tag/v3.5

It's not a stupid question - in fact, it seems the download links went missing in the docs update. We will fix this ASAP.

user-7ff310 13 December, 2023, 18:09:22

Hello! I have a problem with one eye camera on my Pupil Core. The image looks blurry and pupil detection isn't working. This happened suddenly, after it had worked fine for months. Deleting the settings didn't help. I am attaching an image from this eye camera. The computer is a MacBook Pro M1. Can anyone help me with this? Many thanks!

Chat image

nmt 14 December, 2023, 14:31:14

Hi @user-7ff310. That image looks quite over-exposed. Have you tried changing the eye camera exposure settings?

user-73f31c 14 December, 2023, 16:15:12

Hello Pupil team, after about one year of using the Pupil Core, I have encountered a problem for which I may need help. Recently, I get an error message when Pupil Capture launches:

world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
File "launchables\world.py", line 671, in world
File "plugin.py", line 409, in init
File "plugin.py", line 432, in add
File "video_capture\uvc_backend.py", line 75, in init
AssertionError

In the worst case, I'd like to be able to run Capture even without the world camera, if it is indeed the cause of the error. Thanks in advance for any help you can give.

user-cdcab0 14 December, 2023, 16:48:15

What operating system are you running and what version number of Pupil Capture? It would be great if you could share the log file from ~/pupil_capture_settings/capture.log

user-73f31c 14 December, 2023, 17:20:27

I am running Windows 10 with Pupil Capture 3.5.1. Here is the log:

2023-12-14 19:29:26,847 - MainProcess - [DEBUG] os_utils: Disabling idle sleep not supported on this OS version.
2023-12-14 19:29:27,191 - world - [DEBUG] launchables.world: Application Version: 3.5.1
2023-12-14 19:29:27,191 - world - [DEBUG] launchables.world: System Info: User: Olfaction humaine, Platform: Windows, Machine: DESKTOP-04JA75C, Release: 10, Version: 10.0.19041
2023-12-14 19:29:27,191 - world - [DEBUG] launchables.world: Debug flag: False
2023-12-14 19:29:27,521 - world - [DEBUG] video_capture.ndsi_backend: Suppressing pyre debug logs (except zbeacon)
2023-12-14 19:29:27,561 - world - [DEBUG] remote_recorder: Suppressing pyre debug logs (except zbeacon)
2023-12-14 19:29:27,571 - world - [DEBUG] pupil_apriltags: Testing possible hit: C:\Program Files (x86)\Pupil-Labs\Pupil v3.5.1\Pupil Capture v3.5.1\pupil_apriltags\lib\apriltag.dll...
2023-12-14 19:29:27,571 - world - [DEBUG] pupil_apriltags: Found working clib at C:\Program Files (x86)\Pupil-Labs\Pupil v3.5.1\Pupil Capture v3.5.1\pupil_apriltags\lib\apriltag.dll
2023-12-14 19:29:27,584 - world - [DEBUG] plugin: Scanning: fix_error_message.py
2023-12-14 19:29:28,054 - world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
File "launchables\world.py", line 671, in world
File "plugin.py", line 409, in init
File "plugin.py", line 432, in add
File "video_capture\uvc_backend.py", line 75, in init
AssertionError

user-cdcab0 14 December, 2023, 17:36:25

Can you try renaming your pupil_capture_settings folder to pupil_capture_settings_backup, and then launching the app? This will give it a "fresh start", so to speak.

If that loads, you can then try enabling each of the plugins you need one-by-one

user-7daa32 14 December, 2023, 19:48:49

Hello

user-cdcab0 14 December, 2023, 21:53:52

Hi, @user-7daa32 - if you put on the Pupil Core headset first and then eyeglasses on top of it, that can work for some people with some adjustment of the eye cameras. It's not ideal, but depending on the person and the glasses, it does work.

Contact lenses definitely work with Core

user-7daa32 14 December, 2023, 19:49:38

Can we use eyeglasses or contact lenses with the Pupil Labs Core eye tracker?

nmt 15 December, 2023, 08:34:02

Low contrast eye images

user-8bce45 15 December, 2023, 10:49:25

Hello, I managed to integrate the gaze tracker (Pupil Capture) into my Unity project. I receive gaze data as .npy and .pldata files, but I cannot find anywhere how to read these data files.

My goal is to have data about where my participants are gazing within the Unity project. Does anybody know where I can find information on how to do this? Thanks 🙂

user-d407c1 15 December, 2023, 10:59:10

Hi @user-8bce45 ! Is this project using the VR Add-ons and an XR headset, or is it using Pupil Core and Unity to project data on some screen?

user-b02f36 18 December, 2023, 09:07:10

Hello! I have a problem with the world camera in Pupil Core when using other cameras. The image is attached below, and pupil detection hasn't been working for several months. This problem appears continuously, apart from the first, successful run. In fact, it only works when I turn on Pupil Service; it doesn't work when I switch to Pupil Capture. The world GUI fails to initialize the gaze-tracking cameras and causes my system to crash. The computer is an ASUS TUF laptop running Windows 11. The cameras are an OV9281 as the world camera and two cameras labelled HBVCAM-12M2221 V22 with IR LEDs for gaze tracking. Can anyone help me with this? Many thanks!

Chat image

nmt 18 December, 2023, 10:51:21

Hi @user-b02f36! So the cameras you have work with Pupil Service? In that case, please try restarting Pupil Capture with default settings.

user-480f4c 18 December, 2023, 12:27:30

Hi @user-e3da49! Since this is a Neon discussion, let's move it to the 👓 neon channel.

user-4514c3 18 December, 2023, 14:29:32

Hello, is there a way to validate after calibration using the single marker? Thank you!

nmt 18 December, 2023, 20:27:55

There is. You can adopt similar head (or marker) movements as you did during the calibration, just during the validation instead.

user-f2b05d 19 December, 2023, 00:35:49

Hello!! Is there any method to convert data from iMotions exports back into the original Pupil format? Unfortunately, my hard drive crashed, and I lost all the data except for the iMotions exports.

user-d407c1 19 December, 2023, 07:35:16

Hi @user-f2b05d ! Here is the iMotions Exporter - you would need to reverse what is done there. That said, you won't be able to get back all the data, as not everything is exported/supported by iMotions.

user-f792d5 19 December, 2023, 04:27:32

Hello. Sorry if this is the wrong place to ask a question.

https://docs.pupil-labs.com/core/getting-started/ I downloaded Pupil Capture v3.3.0 based on the above. The application started, but when I connect the Pupil Core to my PC, the camera image does not show up. What could be the cause? I will provide any necessary information. Please help. Pupil Capture v1.14 worked fine.

user-d407c1 19 December, 2023, 07:37:53

Hi @user-f792d5 👋 ! Is there any reason why you opted for an old release? Please try installing version 3.5, and refer to the OS-specific troubleshooting steps.

user-6586ca 19 December, 2023, 11:00:29

Hi Pupil Team. I hope this message finds you well!

As part of our study, we are working with fixation data, specifically the gaze and fixation files. However, upon closer examination we have identified an inconsistency between the timestamps and indexes in these two files. To give a specific example, we have encountered a discrepancy where the start_timestamp in the fixation file and the corresponding timestamp in the gaze file do not share the same index. In these cases, the index in the fixation file is greater than the index in the gaze file for the same timestamp.

We are currently exploring potential solutions to rectify this issue and ensure the accuracy of our analyses. If there are any recommendations or specific steps you suggest we take to address this index misalignment, we would greatly appreciate your guidance.

Thank you for your help.

nmt 19 December, 2023, 11:48:26

Hi @user-6586ca 👋. When you say index, are you referring to the fixation id, or the fixation start and end index/world index? This is an important detail, because the two are different entities - e.g., a fixation with a given ID might span multiple world indices.

user-8bce45 19 December, 2023, 13:45:28

Hi, I'm using Pupil Labs plugins (Gaze Tracker with Screen Cast Camera) within my Unity project. I managed to collect data about the gaze positions within the Unity world, which is perfect. Now I was wondering if it might be possible to collect data about the velocity around each point of gaze. For instance, whether it would be possible to see within the data whether the participants are gazing at a fast-moving environment or at a slow-moving environment.

nmt 20 December, 2023, 08:22:15

Hi @user-8bce45 👋. We don't have the functionality built into our demo scenes. However, it would be somewhat straightforward to compute it yourself based on the velocity of your stimuli in Unity and the gaze signal.
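Building on that suggestion: one simple approach is to compare the speed of the gaze point through Unity world space with the speed of the stimulus over the same interval. A sketch with hypothetical inputs (3D positions in world units plus timestamps in seconds, which you would log from Unity yourself):

```python
import math

def speed_units_per_s(p_start, p_end, t_start, t_end):
    """Speed of a 3D point (e.g. the Unity-world gaze point or a moving
    stimulus) between two samples, in world units per second."""
    dist = math.sqrt(sum((b - a) ** 2 for a, b in zip(p_start, p_end)))
    return dist / (t_end - t_start)

def relative_gaze_speed(gaze_a, gaze_b, stim_a, stim_b, t_a, t_b):
    """How fast gaze moves relative to the stimulus over the same
    interval: values near 0 suggest the participant is tracking the
    object, large values suggest gaze on a differently-moving region."""
    return abs(speed_units_per_s(gaze_a, gaze_b, t_a, t_b)
               - speed_units_per_s(stim_a, stim_b, t_a, t_b))
```

Thresholding the relative speed would then give a fast-moving vs slow-moving classification per gaze sample.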

user-4bc389 20 December, 2023, 08:55:58

Hi! When using Pupil Core to export data, I found that there are duplicate fixation IDs. Does the number of fixation points include the duplicates, or is each counted only once?

user-4bc389 20 December, 2023, 09:00:23

Chat image

user-d407c1 20 December, 2023, 10:52:13

Hi @user-4bc389 ! That looks like the fixations_on_surface file, am I right?

It is normal to have the same fixation ID appear multiple times there, as the position on the surface may have varied. So it also depends on the surface detection.

Have a look on how gaze and fixations are remapped to the surface here

Depending on what you want to achieve, you may or may not want to filter by unique IDs. If your intention is to plot over the surface, you may want to keep the duplicates, while if you simply want metrics such as duration, you can filter them out.
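That filtering step can be sketched like this - keeping only the first surface row per fixation id when all you need are per-fixation metrics such as duration (the fixation_id and duration keys mirror the CSV export's column names; adjust to your actual headers):

```python
def first_row_per_fixation(surface_rows):
    """Keep only the first fixations_on_surface row for each fixation id,
    e.g. before summing per-fixation durations."""
    seen = set()
    unique = []
    for row in surface_rows:
        if row["fixation_id"] not in seen:
            seen.add(row["fixation_id"])
            unique.append(row)
    return unique

def total_duration_ms(surface_rows):
    """Total fixation time on the surface, counting each fixation once."""
    return sum(r["duration"] for r in first_row_per_fixation(surface_rows))
```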

user-7daa32 20 December, 2023, 20:12:54

Would you recommend including participants who wear glasses or contact lenses, even though it is not ideal?

user-071a54 21 December, 2023, 15:54:20

I have the core eye tracker, the four wires to the eye cameras have disconnected. Can you provide a wiring diagram so I can properly reinsert the correct wire into the 4 pins on each side?

nmt 22 December, 2023, 13:13:17

Hi @user-071a54 👋. Please reach out to info@pupil-labs.com and someone will assist you with the HW issue from there

user-908b50 22 December, 2023, 02:22:25

Hi again, I have three questions. One, how do I confirm the intrinsics of my camera? Two, do you suggest correcting for sampling error during pre-processing (e.g., adding 4.166 ms to gaze timestamps)? Three, pupil size appears larger when the eye is in line with the viewing angle - is that corrected for by the software?

nmt 22 December, 2023, 13:21:59

Hi @user-908b50!
1. We have an intrinsics estimation plugin that can be used for this purpose.
2. I'm not sure I understand what you mean by "correct for sampling error". Could you elaborate?
3. Which data stream are you examining? Pupil size is reported in pixels as observed in the eye image, and in mm as output by our 3D eye model. The latter is corrected for perspective.

End of December archive