core


user-183822 01 August, 2022, 08:37:34

Thank you so much 🙂

user-183822 01 August, 2022, 13:44:15

One quick (possibly stupid) question: to run "temporal-correctness.ipynb", do I need to install PsychoPy?

papr 01 August, 2022, 13:44:51

Check out https://jupyter.org/

user-2bd5c9 01 August, 2022, 16:33:05

Hi, I've got a problem with displaying data from Pupil Invisible in Pupil Player. I can see some notifications but don't know how to fix the problem. Can someone help? These are the notifications that I got.

user-2bd5c9 01 August, 2022, 16:49:41

Chat image

nmt 01 August, 2022, 17:09:42

Hi @user-2bd5c9 👋. Loading recordings from a shared drive account can be problematic. Could you move it to a local drive and try again?

user-e8411e 01 August, 2022, 16:59:50

Hi. We are running a research lab using Pupil Labs equipment and we ran into some problems that we were hoping you could help with. Problem 1: our Pupil Player is not working. Only the black screen shows up, not the gray screen onto which we drag files to play the eye tracking data. Secondly, we recorded a session and Pupil Capture had a glitch, so when we went to save the data there were missing files in the recording folders. However, there are mp4 files for the eye and world videos, and we would really like to salvage those videos. How could we do that? Thanks so much!

nmt 01 August, 2022, 17:13:39

Hi @user-e8411e 👋. 1. Please try deleting the user_settings_* files in your pupil_player_settings folder. 2. If possible, please share the recording with [email removed] and we can take a look!

user-374bb4 01 August, 2022, 17:28:39

Hello, we think our Pupil Core headset has a hardware issue. The right eye camera repeatedly disconnects. This has been going on for a few months and only affects the right eye. We tried downloading the newest version of the software but are still having issues. Is there a way to send it in for repairs?

nmt 01 August, 2022, 19:00:17

Please reach out to info@pupil-labs.com with a description of the issue 🙂

user-2e1368 01 August, 2022, 17:37:57

Hey there, I was trying to get the gaze position data. Could you please help me?

Chat image

nmt 02 August, 2022, 07:25:50

Hi @user-2e1368 👋. In order to get gaze data, you'll need to calibrate prior to making a recording. Further details here: https://docs.pupil-labs.com/core/software/pupil-capture/#calibration

user-488d6d 02 August, 2022, 18:47:49

Hello Pupil Labs, I am looking to buy Pupil Core glasses but I would like to chat with an expert before that. I already sent an e-mail to the team, but I am posting on the discord chat because I only have until Thursday to decide on the Eye Tracker that I want. Could someone contact me? Thanks!

nmt 02 August, 2022, 19:16:45

Hi @user-488d6d 👋. We've just responded via email!

user-990e57 02 August, 2022, 20:57:11

Hi, in comparing the fixations on surface csv files between subjects run using an earlier version of Pupil Player and the newer version, there is a difference. The older csv files list 1 fixation per row, with the "id" column running from 1 to however many fixations the participant had. The newer version lists fixations by some combination of fixation number and world frame, and "fixation_id" can span multiple rows for the same fixation. Can you please explain this difference? In playing around with copies of newer subject files, I was able to make the csv file look like the older format by removing the surface tracker plugin before running the fixation plugin, if that's of any help in describing what I'm talking about. Thank you for any guidance you can provide!

user-9429ba 03 August, 2022, 11:22:03

Hi @user-990e57 👋 The fixations_on_surface.csv will contain one row per world frame for each fixation. A long fixation, for example, will span multiple world frames but retain its individual fixation id. The position of the fixation relative to the surface could, however, change on consecutive frames; see norm_pos_x/y. Could you be referring to the fixations.csv export? That file will only contain one row per fixation id, but fixations are not mapped to the surface in this case.
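
For illustration, a minimal pandas sketch that collapses a fixations_on_surface export back to one row per fixation id (the file path and surface name here are assumptions; column names follow the documented export format):

import pandas as pd

df = pd.read_csv("exports/000/surfaces/fixations_on_surface_Surface 1.csv")
# one row per world frame per fixation; keep the first row of each fixation id
per_fixation = df.groupby("fixation_id", as_index=False).first()
print(len(df), "rows collapsed to", len(per_fixation), "fixations")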

user-23177e 03 August, 2022, 13:11:40

Hi there Pupil, I've got a question regarding video encoding. I noticed the world videos are either mjpeg or h263 encoded. Is the latter done via Quick Sync, or do you use a different way of encoding? What kind of CPU is recommended if we decide to record in h263?

papr 03 August, 2022, 13:15:14

We use ffmpeg python bindings for the video encoding. For performance reasons, I would recommend recording the mjpeg video. I don't have any explicit CPU recommendations in this regard.

papr 03 August, 2022, 13:15:32

And hi, welcome to the community! 👋

user-23177e 03 August, 2022, 13:16:51

thanks, I appreciate your reply.

user-23177e 03 August, 2022, 13:19:08

we will be doing recordings of 30-60 minutes per session, resulting in rather large files. Is the impact on performance big enough to justify this?

papr 03 August, 2022, 13:24:49

Recording mjpeg instead of h263 saves you performance in real time. It may have an influence on the recorded frame rate. I doubt that the ffmpeg shipped with the bundle has Quick Sync support. You would need to install a custom pyav (the Python ffmpeg binding) as well as run Pupil Core from source.

Recordings of that length have a post-hoc performance impact in Pupil Player. I would recommend splitting the recording into chunks of 15-20 minutes.

user-23177e 03 August, 2022, 13:21:55

Would it be possible for Pupil to utilize Intel Quick Sync via the ffmpeg encoding? Of course, this would require a supported CPU on the client side.

user-23177e 03 August, 2022, 13:26:02

ok, great. Thanks for the help and the tips, much appreciated.

user-6ec20c 03 August, 2022, 18:07:14

Hi, I am having some issues with the Pupil Core glasses: the software says the image files coming from both eye cameras and the scene camera are corrupted, which is preventing me from recording. Do you know what may be causing this issue?

papr 03 August, 2022, 18:08:36

Could you please share a screenshot of the error?

user-23177e 04 August, 2022, 13:48:27

Quick question: can I run a 3 m USB-C cable with a Pupil Core without issues? What's the transfer rate of the included cable?

papr 04 August, 2022, 13:50:10

Longer USB cables might cause transmission errors. I recommend using "active" USB extension cables that have their own power source. The included cable is able to transmit the full frame rate. If you are seeing lower fps than expected, the cause might be insufficient CPU resources.

user-23177e 04 August, 2022, 13:50:51

ok, thanks!

user-89d824 04 August, 2022, 21:02:33

Hi again, I have some questions about tracking fixations, but to fully explain what my project intends to do, I'd have to share a video privately (it's a YouTube link). May I know who I can share the video with (and ask loads of further questions)?

user-9429ba 05 August, 2022, 07:01:14

Hi 👋 Please send the files to [email removed] for review.

user-25da3f 05 August, 2022, 13:45:21

Hi there, I was wondering if anyone can help me. I used MATLAB to run an experiment with the Core. Now I have the raw data on graphs; the data is plotted as x = time, y = pupil dilation (mm). The numbers are far too large to be in mm, and I was wondering if there was a scaling factor issue I was missing. My supervisor developed the experiment code in MATLAB and he is not sure - I am hoping it's not a programming error! Any help would be greatly appreciated 🙂

user-9429ba 05 August, 2022, 16:29:03

Hi @user-25da3f 👋 Are you sure you are using diameter_3d, which would be in millimetres as provided by the 3d eye model? diameter is in pixels, as observed in the eye videos. You can read more about these exports in eye camera coordinates in our documentation: https://docs.pupil-labs.com/core/software/pupil-player/#pupil-positions-csv

user-266adf 06 August, 2022, 18:00:07

Hi, I have my gaze_positions.csv data and would like to calculate the number of macrosaccades and microsaccades. I have been using the formula angle = arccos[(xa·xb + ya·yb) / (√(xa² + ya²) · √(xb² + yb²))] to calculate the angular difference between gaze positions (e.g., x = gaze point x, y = gaze point y, etc.). I got the degrees, but I wonder if this is the correct way to classify microsaccades (less than 1 degree) and macrosaccades (more than 1 degree)?
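
For illustration, the formula above as a minimal numpy sketch (whether it yields true visual angle depends on the coordinate system the gaze vectors are expressed in; the sample vectors are made up):

import numpy as np

def angle_deg(a, b):
    # arccos of the normalized dot product between two gaze vectors
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

print(angle_deg(np.array([0.48, 0.51]), np.array([0.49, 0.52])))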

user-266adf 06 August, 2022, 18:01:42

Does anyone have experience investigating saccades?

user-53a74a 08 August, 2022, 08:37:38

Hello, I am interested in using Pupil Core for my Ph.D. project, and I plan to buy a compatible motherboard (plus, possibly, an extension cable) for my setup. Could you let me know what the recommended spec of the USB connection is? For example, what is the USB version of the USB cables included with the glasses?

user-9429ba 08 August, 2022, 14:32:56

Hi! Pupil Core ships with a 2 meter USB-C to USB-A cable. See this message for reference: https://discord.com/channels/285728493612957698/285728493612957698/1004748082346463364

user-eb6164 08 August, 2022, 11:39:22

Hello, is there any tutorial or guide on how to plot gaze and fixation positions in MATLAB? I tried to do it, but the data obtained doesn't look correct. I mean, I recorded myself looking in a rectangular pattern across a screen, but when I plotted the norm x/y values, the scanning pattern was not correct. All gazes are shown around one point in the coordinates.

user-9429ba 08 August, 2022, 15:08:18

Hi @user-eb6164 Which data are you using exactly? You can see a full breakdown of the gaze_positions.csv export here: https://docs.pupil-labs.com/core/software/pupil-player/#gaze-positions-csv

marc 08 August, 2022, 13:34:55

@papr forwarding a message from @user-183822 from the invisible channel:

Hello everyone; for the analysis of pupil data I tried running the code from https://pyplr.github.io/cvd_pupillometry/04d_analysis.html but I am not able to run it. I am getting an error in PLR processing and I don't know what to do. I want to preprocess the pupil diameter but am not able to. Does anyone have a solution for this?

user-219de4 09 August, 2022, 00:36:09

Hello! Is there anywhere we could manually change the video settings when exporting in Pupil Player? I set 640x480 at 120 fps under the sensor settings in Pupil Capture, but the exported file has a sampling rate of only around 49 fps. I am using both programs at the latest version, 3.5.1. Thank you! (P.S. Could the eye video be exported alone?)

user-9429ba 09 August, 2022, 08:12:41

Hi 👋 FPS can fluctuate depending on the specs of your computer and other processes you have running - it may not be able to reach the full 120 Hz. What machine are you using?

You can export the eye videos alone. Just disable the other plugins in Pupil Player, but enable the Eye Overlay.

user-9429ba 09 August, 2022, 07:51:03

Hi @user-183822 👋 Have you tried this tutorial on working with pupil_positions data? https://github.com/pupil-labs/pupil-tutorials/blob/master/01_load_exported_data_and_visualize_pupillometry.ipynb Note the difference between diameter in the .csv export and diameter_3d. The former is in eye camera pixel coordinates, while the latter is in millimeters from the eye model. It may also be useful for you to know that we have a new plugin (not yet in the official release) for editing blinks to remove them from the export: https://gist.github.com/papr/b7e5f86bdad8eb9ee98723f8d5053f5f
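
For illustration, a minimal pandas sketch of pulling diameter_3d from such an export (the file path is an assumption; column names follow the documented pupil_positions.csv format):

import pandas as pd

df = pd.read_csv("exports/000/pupil_positions.csv")
# keep rows from the 3d detector only; diameter_3d is in millimeters
df3d = df[df["method"].str.contains("3d")]
print(df3d[["pupil_timestamp", "eye_id", "diameter_3d"]].head())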

user-b9005d 09 August, 2022, 12:28:01

Hello! My Pupil Core headset doesn't seem to give me the option to record audio. When I navigate the menus, the only option is 'sound only', and with that on, we don't get a sound output file.

user-9429ba 09 August, 2022, 12:39:08

Hi @user-b9005d Pupil Core does not record audio. We made the decision to remove audio capture when we released Pupil Capture v2.0. Unfortunately, this feature was too difficult to maintain across our supported operating systems. It is possible to collect audio using the Lab Streaming Layer (LSL) framework. This would provide accurate synchronisation between audio and gaze data, but takes more steps to set up.

user-3e8682 09 August, 2022, 21:33:40

I would like to track my gaze on my computer screen in order to make clicking/navigating around easier, e.g. when editing code in my IDE. Is this a use case Pupil Core would do well in? Essentially, I'd like to be able to code on a 13-15 inch laptop screen as normal, with the addition of eye tracking to help me navigate through code and the like.

user-9429ba 10 August, 2022, 10:55:23

Hi @user-3e8682 👋 There are some existing implementations similar to this which I'm sure you will find useful.
Cursor Control: https://github.com/emendir/PupilCore-CursorControl#readme
Mouse Control: https://github.com/pupil-labs/pupil-helpers/blob/master/python/mouse_control.py and https://github.com/trishume/PolyMouse (a bit old but still helpful). There is also an accompanying blog article: https://thume.ca/2017/11/10/eye-tracking-mouse-control-ideas/
Surface Tracker: if you add AprilTag markers to the corners of a monitor, you can map gaze to screen coordinates using our Surface Tracker plugin. More details here: https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking
PsychoPy Integration: we have a software integration for PsychoPy which uses Surface Tracking: https://psychopy.org/api/iohub/device/eyetracker_interface/PupilLabs_Core_Implementation_Notes.html There is an Eye Tracking Feature Demo in PsychoPy's Builder view menu which you can use with Pupil Core to control the cursor, or simulate with a mouse.

user-eb6164 10 August, 2022, 15:30:52

Hello guys, I have a question: do I need to use markers for AOIs to extract data (such as gaze point, fixations, etc.), or can I record without them?

user-9429ba 11 August, 2022, 13:34:00

Hi Reem! If you make recordings without using fiducial markers to define surfaces, gaze data will be in scene camera coordinates only. Markers are needed to map gaze to surfaces/AOIs - it is important that the markers are present in the recording, as they cannot be added post-hoc.

user-219de4 12 August, 2022, 00:36:15

Also, we still have some issues related to image input: we tried another two Windows devices, and both received the error message "video_capture.uvc_backend: could not connect to device. No image will be supplied". Any troubleshooting ideas? Thanks! P.S. We tried unplugging/replugging, checked the libusbk driver, etc.

user-9429ba 12 August, 2022, 10:30:26

Hi! If you search the error message video_capture.uvc_backend: could not connect to device directly in Discord, there are plenty of previous messages on this 🙂 e.g. see this message for reference: https://discord.com/channels/285728493612957698/285728635267186688/974253545287204874 It could be that the cameras have become physically disconnected.

user-9429ba 12 August, 2022, 10:18:09

Hi @user-219de4 fps can fluctuate during recordings depending on the CPU resources available. Details on Pupil Time can be found in our documentation: https://docs.pupil-labs.com/core/terminology/#_2-pupil-time The cameras are free running, and received frames are assigned a Pupil timestamp. There's a tutorial on frame identification here: https://github.com/pupil-labs/pupil-tutorials/blob/master/09_frame_identification.ipynb

user-b9005d 12 August, 2022, 17:11:49

I'm currently messing around with your camera intrinsics estimation plugin. Is there a way to have the undistorted image be the default recorded image? When I hit record, it records with the typical fisheye distortion.

papr 15 August, 2022, 08:03:44

Hi, this is not possible. Undistorting and reencoding the video is too taxing on the CPU. You can use the iMotions exporter plugin to undistort the video and gaze data post-hoc.

user-f93379 13 August, 2022, 12:30:56

Hello, colleagues.

I have two questions: 1) Can you tell me if it is possible to give a different name to a recording folder, instead of "000"? How can this be changed? Is it possible to achieve a synchronization point by naming a file according to the system time of the computer? How asynchronous would such a time be relative to the real situation? 2) Are there any markers other than the square (QR-like) ones? They are not very convenient to use for a laptop screen, since the lower markers don't fit. And an additional question: what is the best way to align the monitor screen recorded by the glasses with the screen capture recorded on the computer? Are the square markers positioned in the center?

papr 15 August, 2022, 08:10:33

Hi, there is no option to avoid the numbered folders. They ensure that each recording is stored in a unique folder.

Creating a file on disk is a comparably slow action. You might want to read out the "system start time" field in the recording's info.json file instead. It contains the recording start time in system time (Unix epoch).

What do you mean by "the lower markers don't fit"? Did you know that you can place the markers in any location co-planar with the screen and later adjust the surface definition to fit your screen? https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking
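
For illustration, a minimal sketch of reading that start time (assuming the info.player.json file and field names of recent recording formats):

import json

with open("000/info.player.json") as f:
    info = json.load(f)

# recording start on the system clock, in Unix epoch seconds
print(info["start_time_system_s"])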

user-53a74a 13 August, 2022, 17:11:17

Hello folks, I have a question about calibration. In my current setup, I place AprilTags at each corner of my monitor frame for surface tracking, but I would like to know whether it is possible to place them inside my monitor (i.e., my monitor will display them all the time). While this sounded like an easy implementation to me, I soon realized that during the calibration phase, Pupil Capture places markers at each corner in a temporary full-screen window. This makes my inside-monitor AprilTags disappear during the calibration, but does this affect the accuracy of calibration/surface tracking? Or, as long as calibration and surface tracking are successfully conducted independently, will this not affect performance?

user-6e1219 14 August, 2022, 08:21:41

Hello, is there a unit that you use to represent pupil diameter?

papr 15 August, 2022, 08:15:08

Hi, see https://docs.pupil-labs.com/core/best-practices/#pupillometry

papr 15 August, 2022, 08:14:01

@user-53a74a hi! Gaze estimation (eye to scene) and surface tracking (scene to surface) are two independent mappings. It is OK for the surface markers to disappear during the calibration; it will not affect performance. When displaying the AprilTags, make sure to include a sufficiently large white margin around them (required for detection).

user-b9005d 15 August, 2022, 17:34:26

In the 3d gaze export, how is the program calculating a position in space if there is only one eye present? Ultimately, we are interested in calculating the visual angle of certain eye movements and are wondering which export variable would help us make that calculation.

user-b9005d 16 August, 2022, 16:44:14

Hey @papr , just following up on this one

papr 16 August, 2022, 16:52:34

It assumes a specific distance (which is hard-coded). You might be better off looking at differences between consecutive circle_3d_normals in the pupil positions export.
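
For illustration, a minimal sketch of computing angular differences between consecutive circle_3d_normal values (the file path is an assumption; column names follow the documented pupil_positions.csv format):

import numpy as np
import pandas as pd

df = pd.read_csv("exports/000/pupil_positions.csv")
df = df[(df["eye_id"] == 0) & (df["method"].str.contains("3d"))]

normals = df[["circle_3d_normal_x", "circle_3d_normal_y", "circle_3d_normal_z"]].to_numpy()
# angle between consecutive unit normals, in degrees
cos = np.clip(np.sum(normals[:-1] * normals[1:], axis=1), -1.0, 1.0)
angles = np.degrees(np.arccos(cos))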

user-26b243 15 August, 2022, 19:54:16

Hi @papr, I was wondering what purpose leaving a gap between the glasses and the user's forehead serves? We have been advised to do this in response to our wanting to improve our calibration method.

papr 15 August, 2022, 19:58:44

If the headset touches the forehead, slippage of the headset is more likely e.g. when the subject raises their eyebrows

user-75df7c 16 August, 2022, 12:44:07

Is anyone else having a strange issue where Pupil Player just closes when you drag a folder onto it to open it? It's happening on both 3.3.0 and 3.5.1. Not sure if the folder is too big (it's a 30-minute recording).

papr 16 August, 2022, 12:45:04

Hi 👋 Could you share the player.log file after having attempted to open the recording in 3.5.1? You can find it in the pupil_player_settings folder.

user-75df7c 16 August, 2022, 12:47:13

player.log

papr 16 August, 2022, 12:51:44

Thank you. There seems to be an irregularity with your recording. Could you share the list of files that are contained in it?

papr 16 August, 2022, 12:53:12

I just noticed that the recording is located on OneDrive. We have received multiple issue reports related to that in the past. Please either move the recording out of the OneDrive folder and/or disable OneDrive sync.

user-61fa47 16 August, 2022, 15:15:46

Hi, we ran a test experiment with Pupil Invisible glasses. Now I am in another country, and I was trying to review the videos directly from Pupil Cloud, but when I started a preview on the site, it opened the window with a black screen and the video didn't start. Any help?

papr 16 August, 2022, 15:37:54

Hi! This looks like a connection issue. Do you continue experiencing it?

user-61fa47 16 August, 2022, 15:16:07

Chat image

user-61fa47 16 August, 2022, 15:38:54

Yes, from different networks too. I am downloading the raw data and the download speed doesn't reach 200 kb/s.

user-26b243 16 August, 2022, 19:54:09

Hi @papr, I saw in the chat that the unit of "gaze_point_3d_xyz" is mm; is that also true for the norm x and norm y values of the gaze position data?

papr 16 August, 2022, 19:59:40

No, norm_pos is in distorted 2d coordinates. The unit is the field of view of the scene camera: (0,0) is the bottom left and (1,1) the top right.
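
For illustration, a minimal sketch of converting norm_pos to scene-image pixel coordinates (the 1280x720 resolution is an assumption; note the flipped y axis):

width, height = 1280, 720  # assumed scene camera resolution

def norm_to_pixels(norm_x, norm_y):
    # norm_pos origin is bottom left; image origin is top left
    return norm_x * width, (1.0 - norm_y) * height

print(norm_to_pixels(0.5, 0.5))  # image center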

user-7fde53 17 August, 2022, 08:26:51

Hello dear community, I noticed the last time I used the Pupil Labs Invisible glasses that they got pretty warm by the earpiece. The usage time was about 14 min. Is it normal that they get so warm? Thanks for the help.

papr 17 August, 2022, 16:02:31

It is normal that the glasses get warm on the right side, as this is where the electronics lie.

user-8247cf 17 August, 2022, 15:52:05

Hello @papr I have some questions. I was wondering, will this eye tracking still work with a computer's built-in camera, for example for someone who doesn't have the resources and money to get the tracking hardware? Also, if so, will it be applicable at 5x12, 5x12 resolution, say when a doctor has to mark the coordinates of gazes on an MRI? Can you please help me answer these questions? Thank you 😊

papr 17 August, 2022, 16:03:16

Hi, remote eye trackers, like the laptop cameras, are not supported.

user-ced35b 17 August, 2022, 17:37:24

Hi, if we are mainly interested in measuring pupil size (alongside gaze position), do you think it's best if we freeze the pye3d model? Thank you!

nmt 18 August, 2022, 10:35:43

If you can minimise headset slippage, then this is advisable. If you haven't already, check out the pupillometry best practices: https://docs.pupil-labs.com/core/best-practices/#pupillometry

user-ced35b 17 August, 2022, 18:51:31

Also, is there a way to open files with pupil player from google drive?

nmt 18 August, 2022, 10:36:24

You'll have to download them to a local drive first

user-4bad8e 18 August, 2022, 07:35:53

Hi. Are there any documents about Pupil Core's measurement uncertainty for fixation durations and fixation counts? I guess it depends on confidence, but I'm wondering how to describe it in research articles.

nmt 18 August, 2022, 10:49:56

Fixations are only computed using high-confidence data by default. When deciding on 'uncertainty' about fixation duration and number, it would be more important to consider the duration and dispersion thresholds selected by the user. They must be appropriate for the task you're recording. You can read more about that here: https://docs.pupil-labs.com/core/best-practices/#fixation-filter-thresholds. You can also read more about the fixation algorithm here (with link to publication): https://docs.pupil-labs.com/core/terminology/#fixations

user-c800fc 18 August, 2022, 09:44:45

Hello, I noticed this error occurred, so I followed this (https://github.com/pupil-labs/pupil/issues/2088) and found out there was no uvc.dll. How can I get uvc.dll? I already installed uvc 0.14. Any help is appreciated.

Chat image Chat image Chat image

papr 18 August, 2022, 09:45:34

hey, which python version are you using?

user-c800fc 18 August, 2022, 09:45:47

I'm using python 3.6.8

papr 18 August, 2022, 09:46:08

And how did you install pyuvc?

user-c800fc 18 August, 2022, 09:46:38

I installed pyuvc using pip install -r requirements.txt

papr 18 August, 2022, 09:51:37

let me try to reproduce the issue

user-c800fc 18 August, 2022, 09:56:54

When I first used DependenciesGui.exe, there was no turbojpeg.dll, but I installed it later.

papr 18 August, 2022, 10:01:01

Please download and install this wheel using the --force-reinstall flag.

uvc-0.15.0-cp36-cp36m-win_amd64.whl

user-c800fc 18 August, 2022, 10:08:33

Thanks to you, the problem has been solved. Thank you so much!

Chat image Chat image

papr 18 August, 2022, 10:14:36

Thanks for testing and the feedback!

user-057596 18 August, 2022, 10:20:56

Hi, we are presently testing with one of our computers and we are getting a notice on the world view that there is corrupt JPEG data. Fortunately we have a spare device, but how do we rectify this problem? Thanks, Gary

nmt 18 August, 2022, 10:54:46

Hi @user-057596 👋. Can you double check that all of the wired connections (PC to headset USB) are securely fastened, and try again? If the error is still showing, reach out to [email removed]

user-057596 18 August, 2022, 10:21:28

Chat image

user-057596 18 August, 2022, 10:56:09

Will do, Neil. We are currently testing at the moment, but we will reconnect the laptop at the end of testing and try again. Thanks.

user-2e1368 18 August, 2022, 17:48:48

Hi, how can I get the gaze position data? I calibrated and exported, and I got all the data except the gaze position data. Thank you!

nmt 19 August, 2022, 06:14:48

If you calibrated successfully, gaze data should be in gaze_positions.csv. Are you sure that file is empty?

user-ced35b 18 August, 2022, 18:54:51

I'm looking at the pupil timestamp and the eye_id column (exported pupil_positions file). I'm interested in knowing the pupil size of both eyes at the exact same timestamp (i.e., what is the right eye doing while the left eye is fixating on something else, at the same time). How do I figure this out? I understand Pupil time is arbitrary, but it seems like each pair of identical timestamps is for one eye only.

nmt 19 August, 2022, 06:12:12

Hi @user-ced35b 👋. Pupil Core's eye cameras are free running, which means two pupil samples (one from each eye) may not share identical timestamps. That said, both eye cameras use the same clock (Pupil time), and so you just need to find the closest temporal match for a given pair.
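
For illustration, a minimal pandas sketch of that nearest-match step (the file path is an assumption; column names follow the documented pupil_positions.csv format):

import pandas as pd

df = pd.read_csv("exports/000/pupil_positions.csv")
eye0 = df[df["eye_id"] == 0].sort_values("pupil_timestamp")
eye1 = df[df["eye_id"] == 1].sort_values("pupil_timestamp")

# for each eye0 sample, find the temporally closest eye1 sample
matched = pd.merge_asof(eye0, eye1, on="pupil_timestamp",
                        direction="nearest", suffixes=("_eye0", "_eye1"))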

user-345aa1 19 August, 2022, 03:28:53

Hi, can anyone help me? I'm trying to run pyuvc's example.py program, but I'm having the following problem and I don't know how to solve it (I'm using Linux Mint) --> module 'uvc' has no attribute 'device_list'

papr 19 August, 2022, 06:35:27

Hey, it is very likely that you did not install Pupil Labs's custom pyuvc version but a pre-built version from a different fork/source. Please make sure to follow the instructions in the repository.

papr 19 August, 2022, 06:33:16

@user-2e1368 In addition to the above, make sure that you do not have the Post-hoc calibration enabled in Pupil Player. You need to have "Gaze from Recording" (see Gaze data menu) enabled to export the gaze data that was estimated during the recording.

user-2e1368 22 August, 2022, 17:41:51

Thank you for your response. I enabled the post-hoc gaze calibration in Pupil Player and I got a message: "Fixation detection: No data available to find fixations".

user-f93379 19 August, 2022, 18:52:25

Hi colleagues! I had a recording crash. How can I recover the files?

Chat image

papr 22 August, 2022, 12:41:32

Unfortunately, it is not possible to recover this recording. It is missing crucial external timestamp information to align the video streams.

user-183822 21 August, 2022, 14:11:22

Hello, what is the difference between diameter and diameter_3d, and which one is better for analysis?

user-183822 21 August, 2022, 14:14:49

My motive is to check pupil changes, and I have found some code that uses diameter_3d; that's why I want to know if diameter_3d is better than diameter for finding out about pupil dilation.

user-6338f3 22 August, 2022, 00:48:26

Hello, I am researching gaze processing in pediatric patients using Pupil Core. With Pupil Core, is there a way to display the gaze point (the green dot shown in Pupil Capture) on a computer monitor?

papr 22 August, 2022, 12:43:29

For that you need surface tracking https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking The easiest way to get the visualization working is to use our PsychoPy integration https://psychopy.org/api/iohub/device/eyetracker_interface/PupilLabs_Core_Implementation_Notes.html#pupil-labs-core

user-0b9182 22 August, 2022, 09:06:35

Hello. I am studying the data in gaze_positions.csv and pupil_positions.csv.

I understand that the 3d gaze and norm_pos of gaze_positions are generated based on the data in pupil_positions, and I wonder what mathematical formula norm_pos is obtained by.

papr 22 August, 2022, 13:20:36

The pipeline looks like this at a high level:
1. Use 3d pupil data as input.
2. Transform the circle_3d_normal and sphere_center into scene coordinates (using the transformation calculated during calibration).
3. Find the approximate intersection of the gaze rays (eye_center0/1 + gaze_normal0/1), resulting in gaze_point_3d.
4. gaze_point_3d is then projected into the image plane and normalised.
https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/gaze_mapping/gazer_3d/gazer_headset.py#L171-L179

user-edaf68 22 August, 2022, 15:26:28

Hi, I know this is a very basic question, but I'm new to using Pupil Core and just started setting it up to experiment and learn how it works. I'm using it on a MacBook Air, and I can't seem to get any video to appear in the Pupil Capture windows. The world and eye windows are both grey. Could you please help me find out what I am doing wrong? Thank you so much!

Chat image

nmt 22 August, 2022, 15:30:10

Hi @user-edaf68 👋. You'll need to start Capture with admin rights. See this message for reference: https://discord.com/channels/285728493612957698/285728493612957698/995974229071777873

user-f641a4 22 August, 2022, 15:32:58

Hi, everyone. I am thinking about using Pupil Labs hardware together with a Microsoft Kinect in the same environment. Will the IR sensors of these two devices interfere with one another? Does anyone have experience doing this? Thank you in advance.

papr 22 August, 2022, 16:16:02

Hey! As long as there isn't an IR reflection directly on the pupil edge, you should be fine.

user-dd1963 22 August, 2022, 16:11:01

Hi, could you give me the Pupil Core user manual as a PDF file? I cannot find it, but the user manual is visible on the website.

papr 22 August, 2022, 16:12:17

Hi, we don't have a PDF version of the documentation, sorry. But you can get a copy of the docs and browse them offline from here: https://github.com/pupil-labs/pupil-docs/

user-ced35b 22 August, 2022, 16:56:56

Hello, looking at my pupil_positions.csv file, what is the difference between the diameter column (which gives me very high values ranging from 24-40) and the diameter_3d column (which gives me more realistic values for pupil diameter)? I thought both columns were in mm when looking at the pye3d method. Do I use the diameter_3d column if I am interested in pupil diameter? Thanks in advance!

papr 22 August, 2022, 17:48:25

https://docs.pupil-labs.com/core/best-practices/#pupillometry please see the docs

user-89d824 22 August, 2022, 17:06:10

Hi, I was using Pupil Player just fine a while ago, but it suddenly stopped working -- it crashes every time I try to load a recording (the same recording).

I've tried restarting the laptop a few times and even reinstalled the Pupil Labs software, but the problem persists. Any idea what's going on?

player - [ERROR] launchables.player: Process Player crashed with trace:
Traceback (most recent call last):
  File "launchables\player.py", line 624, in player
  File "plugin.py", line 409, in __init__
  File "plugin.py", line 441, in add
  File "observable.py", line 368, in __call__
  File "gaze_producer\gaze_from_offline_calibration.py", line 201, in init_ui
  File "gaze_producer\ui\select_and_refresh_menu.py", line 65, in render
  File "gaze_producer\ui\select_and_refresh_menu.py", line 83, in _render_item_selector_and_current_item
  File "gaze_producer\ui\select_and_refresh_menu.py", line 90, in _on_change_current_item
  File "gaze_producer\ui\storage_edit_menu.py", line 81, in render_item
  File "gaze_producer\ui\gaze_mapper_menu.py", line 66, in _render_custom_ui
  File "gaze_producer\ui\gaze_mapper_menu.py", line 114, in _create_mapping_range_selector
  File "gaze_producer\gaze_from_offline_calibration.py", line 224, in _index_range_as_str
  File "gaze_producer\gaze_from_offline_calibration.py", line 229, in _index_time_as_str
IndexError: index 5941 is out of bounds for axis 0 with size 5813

player - [INFO] launchables.player: Process shutting down.

papr 22 August, 2022, 17:47:57

I will look into it tomorrow

user-2e1368 22 August, 2022, 18:09:54

Yes, I will. Thank you so much!

user-6e1219 22 August, 2022, 19:05:06

Hello team, I have already used your "annotations.py" file to do remote annotations on my eye data, but is there any way to do so with MATLAB? Is there an annotations file for MATLAB?

user-219de4 23 August, 2022, 01:33:50

Hello! My study design aims to use Pupil Core at 120 fps. We currently see fps fluctuations during recording and reach about 100 fps on average. Do you have any system recommendations for optimal recording? Thank you.

papr 23 August, 2022, 07:19:33

The M1 Macs perform best. Otherwise, try to get a CPU with high clock speeds (> 2.8 GHz).

user-8556bf 23 August, 2022, 02:38:32

Hello everyone, is there a sales team in China here? I am a hardware engineer at MiHoYo (the company that developed Genshin Impact), and we hope to get a live demonstration and support from the sales team in Shanghai. Is this possible?

papr 23 August, 2022, 07:17:01

Please contact sales@pupil-labs.com in this regard

user-eb55e3 23 August, 2022, 08:17:09

Hello, I am using a Pupil Labs camera for eye tracking and would like to know if there's a way to synchronise the timestamps from the 2 videos (left eye and right eye). My other issue is that I would like to open the post-hoc mp4 video in MATLAB to look at it frame by frame, but I did not find a way. Any solution? Thanks

papr 23 August, 2022, 08:19:49

Hi! Each video has external timestamps that can be used to synchronize the frames between videos.

If MATLAB does not have an accurate way to extract frames, you can convert the video to a sequence of images using ffmpeg. See this tutorial: https://github.com/pupil-labs/pupil-tutorials/blob/master/07_frame_extraction.ipynb
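
For example, a single ffmpeg invocation along these lines dumps every frame as a numbered image (file names assumed; the frames directory must exist beforehand):

ffmpeg -i world.mp4 frames/world_%05d.png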

user-eb55e3 23 August, 2022, 08:33:59

Thanks for your answer. Is there a timestamp synchronisation option in the Pupil Capture software?

papr 23 August, 2022, 08:36:27

Do you mean with another non-Pupil-Capture clock? Yes. You can either adjust the realtime clock or perform the sync to Unix epoch post-hoc

user-eb55e3 23 August, 2022, 08:34:27

Does this ffmpeg approach also work on Windows?

papr 23 August, 2022, 08:36:31

yes

user-eb55e3 23 August, 2022, 08:55:38

I mean, is there any way that Pupil Capture could already synchronize the time of each frame between the 2 videos?

papr 23 August, 2022, 08:57:18

Ah, you mean whether it is possible for the nth frame in each video to have the same timestamp as in the other video? This is not possible because the cameras run independently. There is no hardware synchronization between them.

user-eb55e3 23 August, 2022, 12:16:01

Not necessarily. What I wanted to do is find the closest frames to each other in the 2 videos. If there's no possibility to do this through the Pupil Capture software, then I will do it using the external timestamps of each video. Thanks for your help!

papr 23 August, 2022, 12:27:50

After calibrating, Pupil Capture tries to find matching pupil data. You can find that information in the gaze data's base_data field. But that matching is not as accurate as matching timestamps explicitly post-hoc. I recommend you go with the second approach.
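
For illustration, a minimal numpy sketch of the post-hoc approach, matching each eye0 video frame to its temporally nearest eye1 frame via the timestamp files found in a Pupil Core recording:

import numpy as np

ts0 = np.load("eye0_timestamps.npy")
ts1 = np.load("eye1_timestamps.npy")

# index of the nearest eye1 timestamp for every eye0 frame
idx = np.searchsorted(ts1, ts0).clip(1, len(ts1) - 1)
nearest = np.where(ts0 - ts1[idx - 1] < ts1[idx] - ts0, idx - 1, idx)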

user-ba8bd3 23 August, 2022, 15:15:20

Hi! I have a quick question regarding confidence. In pupil_positions.csv, there are two columns: β€œconfidence” and β€œmodel_confidence.” Am I correct to assume that the β€œconfidence” column applies to 2D pupil detection and β€œmodel_confidence” applies to 3D pupil detection? I am just trying to filter out low confidence values but I’m wondering if values under the β€œconfidence” column are even relevant/accurate for 3D pupil data (such as the β€œdiameter_3d” column). Thanks in advance!

papr 23 August, 2022, 15:18:34

Hey, your assumptions are correct, and the confidence values are relevant. Low confidence will likely lead to low model_confidence, but does not have to. Likewise, model_confidence can be low while confidence is high. I recommend removing any samples with low confidence and/or model_confidence.
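
For illustration, a minimal pandas sketch of that filtering step (the file path is an assumption; 0.6 is the threshold commonly used in the Pupil tutorials):

import pandas as pd

df = pd.read_csv("exports/000/pupil_positions.csv")
# drop samples where either confidence measure is low
df = df[(df["confidence"] >= 0.6) & (df["model_confidence"] >= 0.6)]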

user-ba8bd3 23 August, 2022, 15:19:24

Ok sounds good! Thank you for the quick reply

user-ef3ca7 23 August, 2022, 17:23:47

Hello, I have a Pupil Core. Can I use this code with it? If so, what do I need to change in the code? https://github.com/pupil-labs/realtime-matlab-experiment

nmt 23 August, 2022, 19:05:12

Hi @user-ef3ca7. You'll need to use the Core Network API. Check out our Matlab helpers for Core here: https://github.com/pupil-labs/pupil-helpers/tree/master/matlab

user-eb6164 23 August, 2022, 18:07:28

Hello guys, I have a question about the exported surface fixation csv file. After doing my analysis, I found that my data looked off. Then I realized that inside the csv file there are some fixations below 100 ms. When I checked the settings in the Pupil software, I saw that the minimum duration for fixations is set to 300 ms (which I should change). So why do I have results for fixations below 300 ms if the minimum duration is set to 300?

papr 23 August, 2022, 21:24:42

Could you share this recording such that we can reproduce this issue?

user-2bd5c9 24 August, 2022, 14:49:12

@papr You notified me that my right eye camera is not working. Is there a quick way to fix this, since I have multiple measurements today?

papr 24 August, 2022, 14:57:14

Unfortunately not. We believe this to be a hardware issue. Our hardware support team will be following up as soon as possible. I am very sorry.

user-2bd5c9 24 August, 2022, 15:07:25

All right, no worries. Thanks for the reply

user-e7f2e2 25 August, 2022, 00:46:12

Hi all, I have a question regarding eye-tracking data that I have collected with Pupil Core (glasses) and eye-tracking metrics (using Pupil Player). We downloaded the fixations.csv file and recorded the fixation when an AOI appeared and the fixation when the AOI disappeared. We see a big difference (43.8%) when we calculate the duration of the AOI being shown based on 1) the timestamps of the first and last fixation vs. 2) the sum of the durations of the fixations.

I am trying to better understand what this part (43.8%) is that is not a fixation. Therefore, my question is what this 43.8% could consist of. Is it, for example, missing data, gazes that are too short to count as a fixation (the shortest seems to be 80.69 ms; btw, is this used as the default threshold?), saccades, or anything else?

papr 25 August, 2022, 07:23:24

Could you specify which data you use as input for your calculations? Does this ratio stay roughly the same for all your recordings?

user-789ddb 25 August, 2022, 01:39:12

Hi all,

I am completing my PhD at WSU. I am trying to integrate the Pupil Labs Core with an AR headset (HoloLens 2).

I can't seem to install the zmq dependency... I have pip installed it and it shows up in the pip list, but when I run the (pre-existing) program, I get an error: Module not found.

What are some possible causes?

I was thinking it could be that the system runs in a virtual environment, so installing into the base Python will not work?

Sorry about the vague question... I am still learning how to program 🙂

papr 25 August, 2022, 07:20:57

Yes, you need to activate the virtual environment first and install the dependencies into it.

user-15e3bc 15 October, 2023, 07:32:43

May I ask how you integrated Pupil Core with the HoloLens 2? I am currently facing this problem.

user-4f5ef1 25 August, 2022, 04:09:18

Hi all, I have two types of Pupil Core. One has a pair of smaller eye cameras and looks the same as the illustration in the documentation. The other one looks the same as the advertising picture. Do they work differently? Is there any difference in spec between them?

Chat image Chat image

papr 25 August, 2022, 07:19:50

The only difference is their available resolutions and frame rates.

user-6e1219 25 August, 2022, 07:50:41

Hello, I am designing an experiment in PsychoPy where I have integrated the Pupil Labs eye tracker. I am going to present a couple of pictures (stimuli) through PsychoPy and want to monitor the reaction of the eyes when looking at each stimulus.

I have added a picture where I can get the start and end time of the stimulus using the PsychoPy-generated CSV file, but what I want is to find the corresponding timestamps in the pupil_positions.csv file. That is, I want to know which part of pupil_positions.csv falls between the first stimulus's start and end times (annotations for both start and end time).

It would be really helpful if you could let me know.

Chat image

papr 25 August, 2022, 07:58:42

Hi! Did you use the iohub integration or your own?

user-6e1219 25 August, 2022, 08:01:20

I have used the ioHub integration

papr 25 August, 2022, 08:03:53

This integration performs time sync in realtime and applies it to the data that it receives. Pupil Capture is not aware of PsychoPy. I recommend enabling the "Save HDF5" option in the "Data" tab of your experiment settings. This will store the gaze and pupillometry data as part of your PsychoPy data, in PsychoPy time and space.

user-6e1219 25 August, 2022, 08:34:38

Thank you so much

user-d72f39 25 August, 2022, 10:52:01

hello

user-d72f39 25 August, 2022, 10:52:15

I need help with exporting the files I recorded

user-d72f39 25 August, 2022, 10:52:48

on the website it says that the files are supposed to be saved in a folder named Exports

user-d72f39 25 August, 2022, 10:52:57

Where is that folder exactly?

user-d72f39 25 August, 2022, 10:53:52

Also when I open the player and type "e" I get the following message: " player - [INFO] launchables.player: Session setting are from a different version of this app. I will not use those."

user-d72f39 25 August, 2022, 10:53:59

can someone help please?

user-d72f39 25 August, 2022, 10:54:22

@wrp

wrp 25 August, 2022, 11:00:11

@user-d72f39 within the recording folder will be a folder named exports (see screenshot for example)

Chat image

user-d72f39 25 August, 2022, 12:15:35

is that from the player or the recorder please?

papr 25 August, 2022, 12:33:02

This screenshot shows a "recording" folder that was created with Pupil Capture and processed with Pupil Player.

user-26b243 25 August, 2022, 20:46:12

Hi @papr, I am collecting gaze position data for a lot of trials and creating surface tracking boundaries to let me know when/how many times the participant is not looking at the designated area. Is there a way to copy/standardize the location of the surface tracking boundary across all the other trials so that I do not have to create a surface for every trial?

papr 26 August, 2022, 06:52:55

Yes, set up the surface definitions on one recording and then copy the surface_definitions_v01 file over to the other recordings.

user-26b243 25 August, 2022, 22:48:31

I have another question regarding blink detection. Is there a way for data that is classified as a blink through blink detection to be excluded from the exported data files (specifically, gaze_positions_on_surface_...)?

papr 26 August, 2022, 06:54:13

There is no way to exclude that data before the export. But this tutorial shows how you can merge blink data with other exports https://github.com/pupil-labs/pupil-tutorials/blob/master/10_merge_fixation_and_blink_ids_into_gaze_dataframe.ipynb Once merged, you can easily discard the blink data

user-a2b8af 26 August, 2022, 02:02:54

Hi. Is the Pupil Mobile app now unavailable on Google Play? Is there anywhere I can download the APK?

papr 26 August, 2022, 06:55:45

Pupil Mobile has been discontinued for a while now. We recommend running Pupil Capture on a tablet instead.

user-598f9d 26 August, 2022, 09:20:33

Hi, I'm in Korea and I want to buy an eye camera; will I have any customs clearance issues?

papr 26 August, 2022, 09:21:07

Please contact info@pupil-labs.com in this regard

user-292135 26 August, 2022, 22:24:36

Hi, has anyone tried portable gaming PCs like the Ayaneo with Core, for portability? https://www.ayaneo.com/

user-2e1368 28 August, 2022, 21:35:05

Hey, please see the screenshot of the blinks file. I have 22 blinks, and there is a start time, duration, and end time for each. My question is: how can I adjust the start time so that it counts from the first second instead of 2214.4019? Thank you!

Chat image

papr 29 August, 2022, 06:03:23

Check out the info.player.json file. It contains the "start_time_synced_s" field. Subtract that value from your "start_timestamp" and "end_timestamp" columns to get the corresponding times since recording start.
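
For illustration, a minimal sketch of that subtraction (file locations assumed):

import json
import pandas as pd

with open("000/info.player.json") as f:
    start = json.load(f)["start_time_synced_s"]

blinks = pd.read_csv("000/exports/000/blinks.csv")
blinks["start_timestamp"] -= start
blinks["end_timestamp"] -= start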

user-2e1368 29 August, 2022, 14:16:10

Please see the screenshot of the info.player.json file. I changed the start time to 0 and then exported, and it still gives me the same start time, 2214.403. Thank you.

Chat image

papr 29 August, 2022, 14:19:04

There has been a misunderstanding. You must not set this value to 0. Instead, you need to copy the original value and subtract it from the timestamp values in your Excel application.

user-e7df9b 29 August, 2022, 14:41:08

Hello everyone!

I wanted to ask a couple of questions and get some advice from the community:

1. Calibration: The participants in an on-road study I'm doing will use a Tesla in both automatic and manual mode. Participant workload will be assessed using eye metrics (eye blinks, fixations, total eyes-off-road time, etc.). I've been calibrating the eye tracker with a single physical marker, but it appears that isn't the most effective approach in this situation. The eye tracker will be worn by the participants while they are driving. What method would, in your opinion, be the most effective for calibrating the eye tracker to cover a greater area?

2. Exported Data Analysis: To understand the exported data, I need to know where to start reading. I want to check whether my data is accurate, but I don't know where to begin. What do you suggest I start with (besides the documentation) to understand my data a little bit better?

Thank you in advance for your patience and help.

user-90ba8c 29 August, 2022, 15:25:06

Where might I find information on how gaze mapping is calculated? I can see lots of information about the pye3d model and how it detects the eyeball and pupil, however I can't find any information about how this data is used to map gaze into the world camera footage. Can anyone advise? Thanks!

papr 29 August, 2022, 15:26:11

That information is indeed not as well documented as pye3d 😕 What level of information are you looking for?

user-90ba8c 29 August, 2022, 15:28:27

Nothing too in-depth - it's for an MSc dissertation, so more a checklist like "Pupil Labs utilise the pye3d model for pupil detection, which works by..." (which I've got plenty of info on), and likewise I need to note something like "For the gaze mapping, such and such is done".

papr 29 August, 2022, 15:29:23

Are you using 2d or 3d calibration?

user-90ba8c 29 August, 2022, 15:29:29

3d

papr 29 August, 2022, 15:38:31

During 3d calibration, Pupil Core software uses "bundle adjustment" to estimate the physical relationship between the eye cameras and the scene camera. To be more specific, it assumes a specific location of the eye model centers in scene camera coordinates and then rotates the eye camera coordinate systems such that the circle_3d_normal values point at the collected calibration target locations. The results are two transformation matrices that can be used to map new circle_3d_normal values from eye camera to scene camera coordinate systems (gaze rays).

During gaze estimation, the software tries to find the point that is closest to both gaze rays (gaze_point_3d) and projects it into 2d normalized coordinates (gaze norm_pos).

The implementation can be found here https://github.com/pupil-labs/pupil/tree/v3.5/pupil_src/shared_modules/gaze_mapping/gazer_3d

user-90ba8c 29 August, 2022, 15:47:22

Thanks very much! Super helpful!

user-b9005d 29 August, 2022, 17:19:36

Is there a way to take the exported mp4 world videos and put them back into Pupil Player? We are trying to undistort the world video and then use that version to calibrate gaze positions.

papr 30 August, 2022, 07:38:48

Yes, but you might need to delete the _lookup.npy file and replace the _timestamps.npy file with the exported one, too. Also note, that you will probably need to adjust the world.intrinsics file, too. May I ask what you are trying to achieve with this setup?

user-934716 30 August, 2022, 17:10:40

Hey! As of today I have a problem using Pupil Capture. After approx. 1 minute, the main world view window becomes unresponsive and closes itself. There is no error message in the Pupil Capture console. According to the Windows error logs, there is a problem with the libusbK driver, and on some occasions I get a blue screen with a "MULTIPLE IRP COMPLETE REQUESTS" error message. The problem is independent of the host PC and Pupil Capture version. I hope there is someone who can give me some advice. Thanks!!

papr 30 August, 2022, 18:22:13

Has anything changed? Have you installed any new hardware, updated any drivers, or performed Windows updates? Could you share the logs that claim libusbK is at fault? I have seen MULTIPLE IRP COMPLETE REQUESTS as an issue once before, but I was not able to find it yet.

user-26b243 30 August, 2022, 17:56:40

Hi, I am trying to exclude blink data from the gaze_on_surface data, and I am using the blinks.csv file to determine the start and stop times of a blink. It is listed as start_timestamp. Do you know if this lines up with the world_timestamp or the gaze_timestamp?

papr 30 August, 2022, 18:19:34

Blinks are based on pupil data, i.e. the start_timestamp is not guaranteed to be part of the gaze_timestamps. But all these files use the same clock. You can discard all gaze samples that fulfill this condition:

start_timestamp <= gaze_on_surface.timestamp <= start_timestamp + duration
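
For illustration, a minimal pandas sketch applying that condition to discard gaze-on-surface samples that fall within any blink (file names are assumptions; column names follow the documented exports):

import pandas as pd

gaze = pd.read_csv("exports/000/surfaces/gaze_positions_on_surface_Surface 1.csv")
blinks = pd.read_csv("exports/000/blinks.csv")

# mark every gaze sample whose timestamp falls inside a blink interval
mask = pd.Series(False, index=gaze.index)
for blink in blinks.itertuples():
    mask |= gaze["gaze_timestamp"].between(
        blink.start_timestamp, blink.start_timestamp + blink.duration)

gaze_without_blinks = gaze[~mask]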

user-ae76c9 30 August, 2022, 21:38:07

Hi team, we are doing post-hoc calibration using recordings from the Core. Is there a way to adjust any of the eye camera parameters to get better pupil detection? Currently, Pupil Capture is unable to detect the pupil for a majority of the recording, despite a pretty clear eye image.

papr 31 August, 2022, 06:13:45

You can adjust pupil detection parameters by running the Post-hoc pupil detection in Player. Feel free to share an example recording with data@pupil-labs.com for concrete recommendations

End of August archive