πŸ‘ core


user-38b270 01 May, 2023, 09:40:30

What is the difference between Pupil Core and Pupil Neon?

user-cdcab0 01 May, 2023, 10:08:09

Hi, @user-38b270! πŸ‘‹πŸ½

Both Pupil Core and NEON are wearable eye-tracking solutions that we offer (along with Pupil Invisible). Pupil Core was our original product. It's ideal for studies demanding high accuracy or pupillometry. It is fully open-source and can be extended and customized to meet research aims. A wealth of raw data is exposed to the user. Pupil Core requires calibration and a controlled environment such as a lab to achieve the best results. It connects to a laptop/desktop via USB. For more details see https://pupil-labs.com/products/core/

NEON is our newest eye-tracking product. It has a modular design that fits a variety of headsets and utilizes deep learning for calibration-free and slippage-invariant gaze estimation. This allows for long recordings in any environment, with no decrease in data quality. You can even take the glasses off and on without needing to re-calibrate. For more details, see https://pupil-labs.com/products/neon/

Pupil Core has been a reliable tool for researchers since 2014, providing accurate gaze, eye state, and pupillometric measurements. However, to achieve the best results, Core requires a controlled environment such as a lab. Neon's deep learning-powered gaze accuracy matches Core's and even surpasses it in certain conditions. This versatility makes Neon extremely powerful.

user-38b270 01 May, 2023, 10:22:33

And what is the price difference?

user-cdcab0 01 May, 2023, 10:42:54

The prices are listed at the top of each product page. Right now, NEON is listed at €5900 and Core is €3440. Academic discounts are available if you're affiliated with an academic institution.

It should be noted that some import fees, tariffs, taxes, or other charges may be applicable depending on your region. You'll want to reach out to us via email at info@pupil-labs.com if you need more specifics about that for your region.

user-38b270 01 May, 2023, 10:44:03

Does that include the software and license? What would be the total for each?

user-cdcab0 01 May, 2023, 10:52:36

NEON comes with a companion device (included in the price) and uses an app that we make available for free. Recordings and data can be exported from the app and/or uploaded to our Pupil Cloud service (included).

Pupil Core requires an active connection to a PC running our Pupil Core software. We do not provide the PC, but we do provide the software which is free and open source.

user-6a684e 01 May, 2023, 12:45:39

You are right, AOIs will be a major part of my research interest but I still would like to see, for example, their first fixation duration within one of these AOIs. πŸ™‚

user-cdcab0 01 May, 2023, 12:51:14

Ah, I see. I believe there is data/research showing that 4000 ms is something of an upper limit for fixation duration. I don't have any references readily available, but I think you may find it worthwhile to review the literature in this regard.

user-4334c3 02 May, 2023, 11:48:51

Hi, I'm using Pupil Invisible for an eye-tracking project. The goal is to have a stream of the video, as well as the x and y coordinates of the gaze, in ROS as two individual messages. The best way to do this would be to have an Android application on the phone that is connected to the glasses, which already sends this data in the right format. Are there any resources from your side that could assist me in developing such an app? Thanks in advance!

user-cdcab0 02 May, 2023, 12:52:48

Hi @user-4334c3 πŸ‘‹ - since you're using Invisible, do you mind posting this to the πŸ•Ά invisible channel instead?

user-ded122 02 May, 2023, 14:05:45

Hello, is it possible to retrieve the real-time video stream from Pupil Core using the Network API only? Or do I need to tinker with the source files here: https://github.com/pupil-labs/pupil/tree/master/pupil_src/launchables

user-cdcab0 03 May, 2023, 12:40:54

Hi, @user-ded122 - yes, you can stream video with the Network API. Here's an example: https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py
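
For reference, here's a minimal sketch of what that example does, assuming Pupil Capture is running locally with Pupil Remote on its default port (50020) and the Frame Publisher plugin active; see the linked script for the complete, canonical version:

    # Receive scene-camera frames over the Network API (sketch)
    import zmq
    import msgpack
    import numpy as np

    ctx = zmq.Context()

    # Ask Pupil Remote for the SUB port
    pupil_remote = ctx.socket(zmq.REQ)
    pupil_remote.connect("tcp://127.0.0.1:50020")
    pupil_remote.send_string("SUB_PORT")
    sub_port = pupil_remote.recv_string()

    # Subscribe to world-camera frames
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://127.0.0.1:{sub_port}")
    sub.subscribe("frame.world")

    while True:
        topic, payload, frame_bytes = sub.recv_multipart()
        meta = msgpack.unpackb(payload)
        # With the "bgr" frame format, the raw bytes decode directly into an image
        img = np.frombuffer(frame_bytes, dtype=np.uint8).reshape(
            meta["height"], meta["width"], 3
        )
        print(topic.decode(), img.shape)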

user-074e3a 02 May, 2023, 14:46:04

Hey there, I am using Pupil Core for a quantitative research study and I am still quite inexperienced with the software, so apologies in advance if the questions are unclear. Within my study I am interested in the number of fixations on a certain AOI, the duration of these fixations, and the pupil dilation/diameter of the participants. Participants will be divided into two groups, and they need to watch a one-minute advertisement where the AOI remains constant. My struggle right now is 1) creating these AOIs using the AprilTags, and 2) exporting the data. Since I will have at least 60 participants, my plan was to compare the average values of the number of fixations, duration of fixations, and pupil diameter to examine possible differences between the two groups. However, I am unsure how to obtain these average values based on the raw extracted data that Pupil Player provides.

user-cdcab0 03 May, 2023, 12:45:30

Hi, @user-074e3a πŸ‘‹ - let's tackle your challenges one at a time. I believe you first mentioned struggling with defining AOIs using AprilTags. Have you followed the guide at https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking ? If not, please take a look. If you have followed that and are still struggling, can you tell us more about which part of the surface tracking process you're stuck at?

user-786f38 03 May, 2023, 12:33:46

Hi, I am Stefano Lasaponara from Sapienza University. I purchased Pupil Core a couple of months ago. Today, for ethics purposes, I need a certificate of conformity with EU standards - does anybody know where to find it?

user-13fa38 03 May, 2023, 16:24:17

Another question: do you know which CPU on the market runs best with the eye trackers? Disregarding cost, we want to achieve 200 Hz performance if possible.

nmt 03 May, 2023, 19:45:44

We prefer Apple silicon - e.g. an M1 with 16 GB unified memory - we easily max out the sampling rate of Core with these.

user-1ed146 03 May, 2023, 18:13:56

Hi there, I have some questions about data analysis. We have collected gaze data using Pupil Core from 10 participants. We want a heat map on a reference image for each participant's data, like what the reference image mapper enrichment does. I found this example here: https://docs.pupil-labs.com/alpha-lab/multiple-rim/, but it is provided with Pupil Invisible. So can you please point me to any repositories/examples if possible?

nmt 03 May, 2023, 19:44:22

Hi @user-1ed146 πŸ‘‹. Reference Image Mapper is only available for Invisible and Neon recordings, unfortunately as it runs in Cloud. For Core recordings, you can of course use the Surface Tracker Plugin + AprilTag markers to generate heatmaps: https://docs.pupil-labs.com/core/software/pupil-player/#surface-tracker

user-eeecc7 05 May, 2023, 08:31:42

Hi @nmt @papr Can you please let me know where I can find the reasoning behind why the bivariate polynomial is used? Is there a paper or study that I could refer to for this?

user-9e9b86 05 May, 2023, 09:06:20

Hi @user-d407c1 @nmt - we want to use the glasses in shop-alongs and soon after sit down with the participant to just play back / show them where/what they were looking at. Is it easy to play this back instantly? Or do we need to create a scanpath following the steps listed on the website?

user-c2d375 05 May, 2023, 09:44:02

Hi @user-9e9b86 πŸ‘‹ I just replied to you by email. Please feel free to reach out if you have any further questions.

user-89d824 05 May, 2023, 10:17:05

Hi,

This is a new problem I've encountered in Unity - I'm unable to start the recording from Unity (last week, I still could) and I'm wondering if this could be the culprit. I haven't touched any of my scripts in my projects for the past two months, so it couldn't be that.

Please could you help me out with this? Thanks!

Chat image

user-c828f5 05 May, 2023, 20:23:24

Hi all, I want to know if the AddOns (especially the HoloLens AddOns) have been spec'ed for gaze-tracking accuracy and 3D pupil location accuracy? I understand that we don't have a "ground truth" 3D pupil location on real imagery, but has anyone attempted to spec it out on CGI images? I also posted this in vr-ar, but since the platform is Core, I didn't know which channel to post this question to.

user-b2f430 07 May, 2023, 02:28:45

Hi, I have a problem using the camera in pupil_src. I can't use the camera to detect the image and calibrate the point of view. Please help me. Thank you so much!

user-b2f430 07 May, 2023, 05:42:57

ERROR eye0 - video_capture.uvc_backend: Could not connect to device! No images will be supplied. (uvc_backend.py:133) Meanwhile, my device has UVC support and is not being used by any other software.

user-93ff01 07 May, 2023, 05:11:33

I didn't get an answer previously, so here are my questions again:

user-93ff01 07 May, 2023, 05:11:35

Hi all, we just got some uncased Pupil Core systems that we want to adapt for use in non-human primates. We need to custom-build our own helmet, and some 3D models of the eye and world cameras would be helpful for prototyping. Is that possible? Secondly, the pye3d model must have some constants based on human eye anatomy; would it matter given the substantially smaller eye diameters of non-human primates like macaques? Could we tune these parameters somehow? The interpupil distance is also much smaller in monkeys, and I suspect this may affect some of the 3D estimations that use binocular data? And finally, as a user of EyeLink and Tobii Spectrum Pro hardware, I really appreciate the beautiful hardware and great software design coming from Pupil Labs, congratulations!!!

user-d407c1 08 May, 2023, 07:55:48

Hi @user-93ff01 πŸ‘‹ ! Thanks for the positive feedback! And sorry that we missed the first time you asked. Here are the 3D models to help you prototype your own custom mount: https://github.com/pupil-labs/pupil-geometry

Yes! pye3d is open source, so you can fork the repository at https://github.com/pupil-labs/pye3d-detector and modify it to your needs. As a small guide, you can find the physiological parameters, such as the ranges for the location of the eyeball, the eyeball diameter, phi and theta, etc., at https://github.com/pupil-labs/pye3d-detector/blob/master/pye3d/detector_3d.py#L575

You would probably not need to modify the pupil diameter range, but rather the sphere diameter (_EYE_RADIUS_DEFAULT) and potentially the circle_3d_center (x, y, z) values at which the eyeball can be expected to be.

This matters especially if you are interested in binocular data, as the algorithms estimate the distance from the eyeball's centre and the gaze vector to infer the binocular data.

Alternatively, and depending on what data you are looking for, you can use the 2D detector, although it would not be able to correct for slippage.

In short: yes to all the questions. πŸ˜‰
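
For illustration, the kind of edit involved in such a fork might look like the following sketch. The macaque numbers here are placeholders that show the mechanics, not validated parameters - check the physiology literature for your species:

    # Hypothetical constants for a pye3d fork targeting macaques.
    # Upstream, _EYE_RADIUS_DEFAULT encodes the assumed human eyeball
    # radius in mm; a macaque eye is substantially smaller.
    HUMAN_EYE_RADIUS_MM = 10.39          # approximate upstream human default
    MACAQUE_EYE_DIAMETER_MM = 18.5       # placeholder literature value
    MACAQUE_EYE_RADIUS_MM = MACAQUE_EYE_DIAMETER_MM / 2

    # In the fork, you would replace the constant in pye3d's source, e.g.:
    # _EYE_RADIUS_DEFAULT = MACAQUE_EYE_RADIUS_MM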

user-93ff01 07 May, 2023, 05:13:35

TL;DR: 1) Do you have 3D models of the eye and world cameras that we can use to build a custom headset? 2) Can the pye3d model apply to non-human subjects, given that the constants are derived from human eyes? 3) Does inter-pupil distance have any impact on your algorithms?

user-6cf287 07 May, 2023, 13:59:34

Hi there, I understand that the Core tracker needs to be connected to a PC or laptop via USB, but is there any possibility for it to be connected to a data logger that would allow it to be mobile? Do you have any recommendations? Thank you.

user-cdcab0 08 May, 2023, 06:12:27

Hi, @user-6cf287 - if you're looking for a truly mobile solution, you really should consider Neon over Core

user-93ff01 08 May, 2023, 10:35:35

We may be able to get away with the 2D detector, but monkeys can be like kids and move quite a bit, and since we are building a prototype helmet we may certainly get some slippage, which is why the 3D detector looked more robust, if applicable to non-human primates.

user-93ff01 08 May, 2023, 10:42:22

I am still a bit unsure which parameter aligns with the inter-pupil (or inter-center) distance.

user-dd1963 08 May, 2023, 18:30:04

I am using Pupil Core. Currently, the eyes cannot be detected. Are you familiar with this issue, and is there any solution for it? πŸ‘ core @user-cdcab0 @user-92dca7 @marc

Chat image Chat image Chat image

user-cdcab0 08 May, 2023, 19:45:55

Hi @user-dd1963 πŸ‘‹πŸ½ - are you using a Core headset purchased from Pupil Labs or a DIY unit?

If it's a unit from us, it seems the drivers weren't installed properly. There are some troubleshooting steps here: https://docs.pupil-labs.com/core/software/pupil-capture/#windows

If it's a DIY unit, can you tell us more about the cameras you're using? The first two images indicate an issue with the drivers you're using or the hardware itself. That will need to be resolved before it'll ever work with Pupil Capture

user-1ed146 08 May, 2023, 20:43:08

Hi there, I have a question on the exported fixation data. In the data frame, I have values larger than one and values less than zero in the "norm_pos_y" column. However, the bounds should be x: [0, 1] and y: [0, 1] in the 2D normalized coordinate space. Can you please help me figure out what is going on here?

nmt 09 May, 2023, 05:46:51

Gaze/fixation coordinates can exceed the bounds of the scene camera's FoV, since the eye cameras are able to capture a bigger range of eye movements. Note that such data can also be of low quality, so it's worth checking the associated confidence values.
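
If it helps, here's a minimal pandas sketch for cleaning such an export, assuming the standard fixations.csv columns (norm_pos_x, norm_pos_y, confidence); the 0.8 threshold is a common rule of thumb, not a fixed requirement:

    import pandas as pd

    df = pd.read_csv("exports/000/fixations.csv")

    # Drop low-confidence samples, then keep only in-bounds coordinates
    high_conf = df[df["confidence"] >= 0.8]
    in_bounds = high_conf[
        high_conf["norm_pos_x"].between(0, 1)
        & high_conf["norm_pos_y"].between(0, 1)
    ]
    print(f"kept {len(in_bounds)} of {len(df)} fixations")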

user-93ff01 09 May, 2023, 09:04:40

Hi, some of our experiments will take place in a dim room: does the world camera have an IR-cut filter, or could we use an IR illuminator in the room to help the world camera? Any other advice?

nmt 09 May, 2023, 16:44:01

The scene camera records visible light. What are your participants going to be looking at and how low is the ambient illumination?

user-93ff01 09 May, 2023, 09:21:52

Question 2: With the EyeLink 1000 we start and stop recording on every trial (with additional trial/sync markers to time-sync properly); with our Tobii Spectrum Pro we were recommended to keep recording for the whole block and only send trial sync markers. For Pupil Core, what is the preference? Is it OK to start/stop recording before/after each trial, giving the same session name (I didn't see a pause/unpause command), or should we record the whole session and use trial markers? Given that we can record all camera data raw, the latter might grow the files with lots of useless inter-trial video, etc.

nmt 09 May, 2023, 16:34:17

How long are your trials going to be? We recommend splitting your Pupil Core experiment into chunks to avoid slippage errors accumulating, and to manage the size of recordings, as you mention. Read more about that here: https://docs.pupil-labs.com/core/best-practices/#split-your-experiment-into-blocks

user-b2f430 09 May, 2023, 12:32:11

Hi QG TrαΊ§n TiαΊΏn 1665 πŸ‘‹πŸ½ were there any

user-6b1efe 09 May, 2023, 13:14:24

@nmt Hi! If I want to define a rectangular area of interest, should the tags added to the four corners of the rectangle be the same or different?

user-c2d375 09 May, 2023, 13:42:41

Hi @user-6b1efe πŸ‘‹ it is crucial to use unique AprilTag markers, so please ensure you add different markers to each corner of the area of interest.

nmt 09 May, 2023, 16:41:02

For synchronisation, I'd recommend sending events over the Network API. This example script shows how you can do that using Python: https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py. Or use our LSL relay: https://github.com/labstreaminglayer/App-PupilLabs/tree/master/pupil_capture#pupil-capture-lsl-plugins
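
For a sense of what that looks like, here's a minimal sketch following the linked remote_annotations.py pattern, assuming Pupil Capture runs locally with Pupil Remote on its default port and the Annotation plugin enabled; the event label is hypothetical:

    import time
    import zmq
    import msgpack

    ctx = zmq.Context()
    pupil_remote = ctx.socket(zmq.REQ)
    pupil_remote.connect("tcp://127.0.0.1:50020")

    # Annotations are published through the PUB port
    pupil_remote.send_string("PUB_PORT")
    pub_port = pupil_remote.recv_string()
    pub = ctx.socket(zmq.PUB)
    pub.connect(f"tcp://127.0.0.1:{pub_port}")
    time.sleep(1)  # give the PUB socket a moment to connect (slow joiner)

    # Query current Pupil time so the event lands on the recording's clock
    pupil_remote.send_string("t")
    pupil_time = float(pupil_remote.recv_string())

    label = "trial_start"  # hypothetical event label
    event = {
        "topic": f"annotation.{label}",
        "label": label,
        "timestamp": pupil_time,
        "duration": 0.0,
    }
    pub.send_multipart([event["topic"].encode(), msgpack.packb(event)])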

nmt 09 May, 2023, 16:45:42

I've responded to your question in the software-dev channel (it'd be great not to duplicate Qs in future πŸ™‚)

user-93ff01 10 May, 2023, 00:35:25

Using IR illuminators would solve this problem for us, but we are worried about the world camera (and any possible side effects on the eye cameras?)

user-660f48 10 May, 2023, 17:23:24

Hi Pupil Labs, just checking if this problem has been encountered by anyone: https://discord.com/channels/285728493612957698/1093713310261723217/1104725246440902658 Thanks in advance:)

user-f77049 10 May, 2023, 20:17:35

Hello, I'm trying to use a Core device with Pupil Capture 3.5.1, but I need to add audio and sync it with my pupillometry. I found in an old request that this is possible using LSL. I'm really not familiar with coding, etc. (I'm in my medical fellowship), so I'm trying to follow every step... Unfortunately, the plugin seems to be installed correctly in Pupil Capture, but I always get the same problem, which is: see the screenshot. I tried to comment out the flush function with "#" (sorry, first time for me); the error is caught from the resample function and all audio frames are still dropped.

I think I missed something... If you can help me: did someone succeed in recording audio?

Thanks a lot

Chat image

user-fcd05e 11 May, 2023, 00:35:57

Hi, I am working with pupil data that was recorded using Pupil Labs hardware about 8 years ago. Does Pupil Player allow me to load recordings from the past, or does the recording have to be recent/live for me to analyze pupil dilation? If that is the case, does anyone have any recommendations for software that allows analysis of pupils based on past video recordings?

nmt 11 May, 2023, 08:55:21

Hey @user-fcd05e! It's a bit difficult to say without knowing the specific version the recordings were made with. In the first instance, I would make a backup of the recordings, and then just try to open them in Pupil Player. If it doesn't work, feel free to report back here πŸ™‚

user-d407c1 11 May, 2023, 09:04:55

To expand on @nmt's answer: over the last few years, there have been a lot of updates and improvements to Pupil Capture/Player. If your recording was made with an old version of Capture, it might not be supported by the latest Player versions.

To get all the latest fixes and improvements, you will need to upgrade the recording format. To do this, please follow the steps below. Note that it's worth trying this with only one recording first to see if it works.

  1. Back-up data: To avoid any data loss, create a copy of the recording(s)
  2. Upgrade recording format: this operation is only possible by opening the recordings in Pupil Player 1.16 (see release notes at the link for further information)
  3. Get the upgraded recording folder; you should now be able to open it in recent versions of Player

user-59397b 11 May, 2023, 10:09:14

Hello everyone,

I've broken a piece (see photo) Do you know where I can order another or where I can find the file for the 3D printer?

Thanks

Chat image

user-cdcab0 11 May, 2023, 10:16:28

Hi, @user-59397b - I believe you'll find those here: https://github.com/pupil-labs/pupil-geometry

user-934716 11 May, 2023, 10:18:01

Hey there!

I want to use the export functionalities of the Pupil Player in a script for batch processing. I need to include marker/surface detection and gaze on surface mapping. Does anyone have any suggestions on what could be a good approach to build this pipeline using the Pupil repository? I'd also be happy if anyone has some scripts to share.

nmt 11 May, 2023, 11:33:19

Hey @user-934716! There is a community-contributed batch exporter, which looks like it handles existing surface data: https://github.com/tombullock/batchExportPupilLabs

user-52c504 11 May, 2023, 20:28:57

Hi, I used Pupil Core to collect data during sight interpreting tasks (in which participants read text printed on paper and interpret orally at the same time) and want to calculate saccade lengths moving from one fixation to another, to see if there are different reading processes, e.g., scanning with relatively longer saccades versus typical reading with relatively shorter saccades. I searched the chat history and found a similar discussion. I'm not sure if the velocity-based approach would be fit for purpose, so I intend to use the equation attached. I wonder if head movements will affect the results and if there's a way to account for them? Many thanks!

Chat image

user-480f4c 12 May, 2023, 15:03:49

Hey @user-52c504 πŸ‘‹. This equation calculates the distance between consecutive 2D coordinates. If you are not interested in angular metrics (e.g., angular displacement, velocity, etc.), this is fine. Now, regarding your question about the effects of head movements, this depends on whether you have surface-mapped fixations or not:

  • If you do have surface-mapped fixations, using this equation could be useful, even in a head-free condition.
  • If you do not have surface-mapped fixations, and instead the fixations are only in scene camera coordinates, the distance between consecutive fixations calculated using this equation will be more sensitive to head movements/VOR and smooth pursuit.

I hope this helps!
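
As a minimal sketch of that equation applied to surface-mapped data - assuming a standard fixations_on_surface_<name>.csv export (the file name below is illustrative); distances come out in normalized surface units unless you rescale by the surface's physical size:

    import numpy as np
    import pandas as pd

    df = pd.read_csv("exports/000/surfaces/fixations_on_surface_Paper.csv")
    # Each fixation repeats once per overlapping world frame; deduplicate
    df = df.drop_duplicates("fixation_id").sort_values("start_timestamp")

    # Euclidean distance between consecutive fixations (first row is NaN)
    dx = df["norm_pos_x"].diff()
    dy = df["norm_pos_y"].diff()
    df["saccade_length"] = np.sqrt(dx**2 + dy**2)
    print(df["saccade_length"].describe())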

user-f77049 12 May, 2023, 16:45:19

Hey, anyone using LSL can help me ? https://discord.com/channels/285728493612957698/285728493612957698/1105951795756421272 thank you πŸ™‚

nmt 12 May, 2023, 20:10:05

Hey @user-f77049 πŸ‘‹. Apologies, your message slipped off the radar!

It looks like you're using the audio-rec branch of the Pupil Capture LSL repo. Unfortunately, this is a dev branch and may or may not work - in your case it's the latter, and I don't have an obvious fix.

One workaround is as follows:

  1. Run the 'pupil_capture_lsl_relay' plugin from the master branch: https://github.com/labstreaminglayer/App-PupilLabs/tree/master/pupil_capture - this will relay pupillometry over the LSL network
  2. Use the LSL AudioCapture app to capture and stream audio data over the LSL network: https://github.com/labstreaminglayer/App-AudioCapture
  3. Record everything in LSL LabRecorder: https://github.com/labstreaminglayer/App-LabRecorder

That method is a bit cumbersome, but will get you a unified collection of a raw audio waveform and pupil size. Let us know how you get on!

user-e2fd67 15 May, 2023, 11:44:13

Hello everybody! I want to use the Core tracker for my research project; however, I'm facing a problem. When I open the Pupil Capture software and try to record, I'm not getting any recordings - not from the world camera and not from the eye cameras. The software says that it has recognized the world camera and the pupil cameras. When I record, it says that no world video has been captured (and I can't find any pupil recordings in the folder that was created automatically). It's probably a very basic problem or even something hardware-related, but I'm completely new to the system and would love some help from you guys. Thank you in advance πŸ™‚

nmt 15 May, 2023, 12:24:57

Hey @user-e2fd67! What operating system are you using on your computer running Pupil Capture?

user-1c31f4 15 May, 2023, 20:47:53

Hey folks! Tomorrow I'm starting a project that will use a Pupil Core to control some servomotors. I will need to get the gaze position in real time and send it to an Arduino. I just want to make sure I'm starting in the right place -- should I be using the Network API (https://docs.pupil-labs.com/developer/core/network-api/)?

user-cdcab0 15 May, 2023, 20:55:22

Hi, @user-1c31f4 πŸ‘‹πŸ½! That sounds like a pretty neat project! The Pupil Core Network API which you linked to is exactly what you want for that πŸ™‚

nmt 16 May, 2023, 09:22:36

Glad to hear you're up and running now!

user-1c31f4 16 May, 2023, 21:20:32

Hey again folks -- is it possible to run the pupil core software on a Raspberry Pi 3?

user-cdcab0 16 May, 2023, 21:28:20

Hi, @user-1c31f4 - yes, we have had users report success running Core on Raspberry Pis. You can find various discussions using Discord's search feature.

user-1c31f4 16 May, 2023, 21:21:02

Or at least, receive the data stream from the Core Glasses?

user-cdcab0 17 May, 2023, 09:28:58

Another option is to use the Raspberry Pi to stream the video to another PC where Pupil Capture is running. Check out https://github.com/Lifestohack/pupil-video-backend

user-37b6ea 18 May, 2023, 16:40:50

Hello, I am new here. ☺️ Glad to see you! I hope someone is willing to help me with a problem I am having with Pupil Core. I'm doing some aviation research, and I made a recording with the eye tracker on an air traffic controller. Now I want to transfer the data to the Blickshift software for a detailed analysis. But at the time of export I have no fixations, and I don't know how to fix this problem. Can anyone help me with some information or anything else?

nmt 19 May, 2023, 07:26:35

Greetings @user-37b6ea πŸ‘‹. Make sure you've enabled the fixation detector plugin in Pupil Player before you export: https://docs.pupil-labs.com/core/software/pupil-player/#fixation-detector

user-00cc6a 18 May, 2023, 17:08:55

Can we use the Reference Image Mapper with Pupil Core data?

user-00cc6a 19 May, 2023, 06:02:10

@user-cdcab0

user-c5ca5f 18 May, 2023, 23:40:54

Hi, so I am using the Pupil Mobile app and Pupil Player. I want to know what the sampling frequency would be for each camera for data collected with Pupil Mobile. I suspect it is 30 fps, but is there a way to increase it?

user-cdcab0 19 May, 2023, 08:12:21

Hi, @user-c5ca5f - it looks like you can change the framerate on Pupil Mobile for each camera. Open the pupil cam, then hit the hamburger button ☰ at the top left. Scroll down to frame rate and tap on it to see the available options.

user-2dcf9f 20 May, 2023, 22:33:42

Sorry for a basic question, but is this something I would be able to use to build an eye-tracker-to-speech system, or eye-tracked computer control, for my aunt with locked-in syndrome? Her insurance is denying the one she needs, so I'm seeing if I can use something open source.

user-d407c1 22 May, 2023, 06:59:30

Hi @user-2dcf9f πŸ‘‹ ! Yes, you might be able to use Pupil Core for that, but you would need to implement an additional layer of software yourself to do some of these things.

Other users have written some routines to use the computer's cursor with their gaze https://github.com/emendir/PupilCore-CursorControl or to communicate using simple signals.

Nonetheless, it might be preferable to go for a calibration-free system like Neon to avoid the nuisance of frequent calibration for your aunt.

If you need further information, do not hesitate to contact info@pupil-labs.com

PS. We will soon release an article on our blog that might be of interest to you.

user-8e2030 22 May, 2023, 13:50:21

Hello, I would like to know the differences between Neon and Tobii Pro Glasses 3 with a 100Hz sampling rate. Specifically, what kind of data can be captured, whether there are any paid software for analyzing the data, and any other relevant information. Thank you.

nmt 22 May, 2023, 16:26:01

Responding in πŸ‘“ neon πŸ™‚

user-1c31f4 22 May, 2023, 19:15:17

Hey guys, I'm having a very bad time trying to run from source on a raspberry pi. I've worked through a bunch of 'collecting packages' errors, but now I'm getting errors following the 'Building wheels for collected packages: ndsi, pupil-detectors, pyglui, opencv-python, pupil-labs-uvc' stage.

user-1c31f4 22 May, 2023, 19:15:27

Here is the pastebin of the errors I'm getting: https://pastebin.com/SZs9nf2x

user-cdcab0 22 May, 2023, 21:12:50

Pip requires some build dependencies to install some of those packages. You probably just need to run sudo apt install libglew-dev libeigen3-dev libturbojpeg0-dev to get what you need.

Alternatively, you may be able to find pre-built wheels for those packages for Raspberry Pi

user-1c31f4 23 May, 2023, 05:50:14

This was it. Still no luck getting it running on the pi, but that fixed those errors!!

user-1c31f4 22 May, 2023, 19:15:57

If anybody could point me in the right direction, it would be very helpful. This is the first time I've tried to build any program from source, so these errors seem quite impossible to me.

user-04c6a4 23 May, 2023, 09:21:15

Dear author

I keep getting this error while running the code, even though my environment is configured correctly. Is there any way to solve this problem?

Chat image

wrp 23 May, 2023, 10:19:40

Looks like this was responded to here: https://discord.com/channels/285728493612957698/446977689690177536/1110509528082026566

user-04c6a4 23 May, 2023, 09:26:18

@user-d407c1

user-ded122 23 May, 2023, 11:47:52

Hello, is there a way to access the pupil core API without keeping the Pupil Capture GUI application running in the background?

user-d407c1 23 May, 2023, 11:53:33

Hi @user-ded122 ! That's what Pupil Service is meant for, depending on your needs, as it does not have a scene video feed: https://docs.pupil-labs.com/core/software/pupil-service/

nmt 23 May, 2023, 15:27:46

Hey @user-ded122! Just to expand on Miguel's response, Pupil Service has no functionality for scene video, so you won't be able to calibrate and obtain gaze data (unless you're doing calibration in some VR scene). If you're just interested in pupillometry and eye state obtained from the eye cameras, Service will record those things.

user-1c31f4 23 May, 2023, 21:16:36

I hope that's helpful to people in the future!

user-cdcab0 23 May, 2023, 21:17:56

Keep us posted on how your efforts go!

user-1c31f4 23 May, 2023, 21:18:32

Will do! Has anybody here had success running pupil capture from source on an ARM chip?

user-93ff01 24 May, 2023, 02:35:57

No, but we are probably going to buy some https://wiki.radxa.com/Rock5/5B (or other RK3588 boards) soon, as the CPU+GPU are way faster than the RPi 4 and hopefully have Mesa support, so we can test and give feedback.

user-93ff01 24 May, 2023, 02:36:27

LattePanda sigma looks great too, but much more expensive...

user-6e2996 24 May, 2023, 07:35:44

Hi Pupil Labs, why can't I find the "forgot password" button on the current Pupil Cloud login page? Is there any solution for this? Thanks.

nmt 24 May, 2023, 09:13:45

Looks like this was responded to in https://discord.com/channels/285728493612957698/633564003846717444/1110839688383692820

user-b83651 24 May, 2023, 13:47:27

Hello. I bought a Core product. I want to directly receive and output data from the IR camera using Python on Ubuntu. However, even after consulting related references, it is difficult to find a way to connect to a serial port to receive input data, convert it, and output it. Where can I check this?

user-cdcab0 24 May, 2023, 19:08:46

Hi, @user-b83651 - have you tried using the Network API? https://docs.pupil-labs.com/developer/core/network-api/

You may find some helpful samples in this repo as well: https://github.com/pupil-labs/pupil-helpers/tree/master/python
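
As a starting point, here's a minimal sketch of subscribing to real-time gaze data, assuming Pupil Capture is running locally with Pupil Remote on the default port (note that data arrives over TCP via ZeroMQ, not a serial port):

    import zmq
    import msgpack

    ctx = zmq.Context()
    pupil_remote = ctx.socket(zmq.REQ)
    pupil_remote.connect("tcp://127.0.0.1:50020")
    pupil_remote.send_string("SUB_PORT")
    sub_port = pupil_remote.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://127.0.0.1:{sub_port}")
    sub.subscribe("gaze.")  # or "pupil." for raw pupil data from the eye cameras

    while True:
        topic, payload = sub.recv_multipart()
        gaze = msgpack.unpackb(payload)
        print(topic.decode(), gaze["norm_pos"], gaze["confidence"])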

user-9a0705 25 May, 2023, 00:21:25

Hi everyone, I'm having an issue while running the code; it seems there is a compatibility problem with my Python version, as I keep getting this error: AttributeError: module 'packaging.version' has no attribute 'LegacyVersion'. The code in question is the fixation detector I downloaded from the GitHub repository associated with Pupil Labs: https://github.com/pupil-labs/pupil/tree/318e1439b9fdde96bb92a57e02ef719e89f4b7b0/pupil_src/shared_modules I'm wondering if I can fix this by downgrading my Python version - which Python version was used when writing this code?

user-eb6164 25 May, 2023, 16:10:30

Hi, I have a question. I was reading about best practices for using the markers, and it was mentioned that they should always be oriented upward - is that correct? I am using several markers to create a surface on the TV screens so that I can capture everything inside. I assigned some markers pointing downwards and some to the left - is that an issue?

user-eb6164 25 May, 2023, 16:11:19

Chat image

user-cdcab0 26 May, 2023, 03:42:47

Hi, @user-eb6164 - no, I don't think the orientation of the markers makes any difference. You can arrange and orient them however you want - even mix and match, as long as you don't change that after you define the surface. If you do change the orientation or position of any of the markers, you'll want to delete and redefine the surface

user-9f7f1b 25 May, 2023, 16:13:57

Hello, team! I have a question about 3D calibration. 2D gaze calibration/mapping works via polynomial regression, and 3D calibration is calculated using bundle adjustment. As we can see, the pye3d model outputs circle_3d_normals - why not map them to 3D ref points (we can fix the ref points' depth, so we can map [x, y, z] normals to [X, Y] ref points) instead of using the complex bundle adjustment algorithm?

user-a7d513 25 May, 2023, 17:22:38

Hi! I was wondering how can I configure a surface and obtain its related data (fixations_on_surface, gaze_positions_on_surface, etc.) via API or without directly using Pupil Capture and Pupil Player. Thanks a lot!

user-cdcab0 26 May, 2023, 03:50:04

Hi, @user-a7d513, I'm actually working on a project right now which has accomplished this. In short, the Surface Tracker class can be used without Pupil Capture, but the API isn't really made to be used stand-alone, so getting it done is a little bit hack-ish. I'll be wrapping up some of the harder bits into a separate Python package and we'll have a couple of demos to show how to use it.

We're still probably a couple of weeks out from making it publicly available if you can wait just a bit. Otherwise if you have more specific questions I can probably help answer them sooner.

user-b49db3 25 May, 2023, 23:13:01

Hi there. We have managed to break a wire that goes into the eye camera plug. Is it possible/easy to do a DIY fix? E.g., is the white plug a standard size/type? Any help appreciated - this is a Pupil Core headset.

user-93ff01 26 May, 2023, 02:29:10

Have there been any updates regarding running Pupil Core natively on arm64 Macs? The instructions suggest using an Intel Python, which requires quite a bit of extra setup (I use pyenv or conda-forge, and getting them to build an Intel binary isn't straightforward).

user-1aa180 26 May, 2023, 09:09:18

Hello, I am experiencing frequent crashes in the Pupil Capture app. I am using the Pupil Core device, and during recording the application often gives me the warnings in the screenshot. Furthermore, the eye camera windows often freeze - in this case, I can 'unfreeze' them by manually selecting the camera source; if I don't do that in a timely manner, the window crashes and the Capture app will not let me open it again. I thought the issue could be related to high CPU usage (I'm running the app on an 11th gen i5 laptop), but it also occurs when the CPU is at low usage. Previous messages in this server hint at a low-quality cable causing similar issues, but we are using the original cable that came with the glasses. Do you have any suggestions to prevent this issue?

Chat image

user-1aa180 29 May, 2023, 09:27:38

Hello, I am still experiencing this issue. Could anyone please help me out?

user-d99595 26 May, 2023, 10:51:26

Hi everyone, for my thesis we are using the Pupil Core. I have two questions regarding something my thesis buddy and I can't seem to figure out:

  1. Does the Core also record audio, and if so, which setting do I enable? In Pupil Capture I have selected 'sound only' in the general settings. However, I cannot hear any audio when I play back the recordings.

  2. We're specifically interested in pupil size measurements. After creating an export we're looking into the pupil_positions CSV file. After separating the text into columns we get a whole bunch of data. Is pupil size among these columns?

Sorry in advance lol... we're complete eye tracking noobs and my supervisor has no clue either πŸ˜…

user-d407c1 26 May, 2023, 10:58:58

Hi @user-d99595 πŸ‘‹ !

  1. Pupil Core has no microphone, so it can't record audio by default. However, if you need audio, you can use the iMotions software or the Lab Streaming Layer framework to record audio and sync it with the data.

  2. Yes, you should look for the diameter_3d column. Check out this link for a description of all fields on pupil_positions.csv https://docs.pupil-labs.com/core/software/pupil-player/#pupil-positions-csv

user-d99595 26 May, 2023, 11:23:33

Hi Miguel,

thanks for your swift response. So just to clarify, the diameters for both eyes are displayed in the same column?

user-d407c1 26 May, 2023, 11:57:26

You would need to filter by the eye_id column - where 0 stands for the right eye and 1 for the left eye.
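
A minimal pandas sketch of that filtering, assuming the standard pupil_positions.csv export (diameter_3d is in mm and comes from the 3D detector; the 0.8 confidence threshold is a common rule of thumb, not a fixed requirement):

    import pandas as pd

    df = pd.read_csv("exports/000/pupil_positions.csv")
    df = df[df["confidence"] >= 0.8]  # drop low-confidence samples first

    for eye_id, label in [(0, "right"), (1, "left")]:
        eye = df[df["eye_id"] == eye_id]
        print(f"{label} eye: mean diameter {eye['diameter_3d'].mean():.2f} mm")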

user-a246bc 28 May, 2023, 17:56:47

Hello, does Pupil Labs work for DIY headset units that only have one camera (eye camera with IR reflections)?

user-d407c1 29 May, 2023, 08:52:53

Hi @user-a246bc ! It depends - how do you plan to relate (calibrate) it to the environment without a scene camera?

user-a246bc 28 May, 2023, 17:57:13

I understand that would assume a static head position, but I would like to experiment nonetheless.

user-9f7f1b 29 May, 2023, 05:14:22

Hi, team! Does anyone have time to reply to this question? "Hello, team! I have a question about 3D calibration. 2D gaze calibration/mapping works via polynomial regression, and 3D calibration is calculated using bundle adjustment. As we can see, the pye3d model outputs circle_3d_normals - why not map them to 3D ref points (we can fix the ref points' depth, so we can map [x, y, z] normals to [X, Y] ref points) instead of using the complex bundle adjustment algorithm?"

user-d407c1 29 May, 2023, 09:07:54

Hi @user-9f7f1b, I just want to make sure I understand what you mean: do you want to map the circle_3d_normal output of pye3d onto a sphere of fixed diameter (at a given depth) using, for example, a polynomial regression? Is that right? Note that this approach may have issues, as it will be limited to only one distance, and polynomial mappers may not work so well with slippage.

To estimate gaze using the 3D mapper, we use a geometric method that involves intersecting visual axes. This requires knowing the eyeball positions and gaze normals in SCENE camera coordinates. Bundle adjustment is used to determine the relative poses of the eye cameras with respect to the SCENE camera, since on Pupil Core they can be adjusted.
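
For intuition only (this is not Pupil's actual implementation), the geometric core of that idea can be sketched as finding the point nearest to both gaze rays in scene-camera coordinates:

    import numpy as np

    def nearest_point_to_rays(o0, d0, o1, d1):
        # Least-squares point closest to two rays (origin o, unit direction d)
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for o, d in ((o0, d0), (o1, d1)):
            P = np.eye(3) - np.outer(d, d)  # projects onto the plane normal to d
            A += P
            b += P @ o
        return np.linalg.solve(A, b)

    # Toy example: eye centers 30 mm apart, both looking at a point 500 mm ahead
    target = np.array([0.0, 0.0, 500.0])
    o0, o1 = np.array([-15.0, 0.0, 0.0]), np.array([15.0, 0.0, 0.0])
    d0 = (target - o0) / np.linalg.norm(target - o0)
    d1 = (target - o1) / np.linalg.norm(target - o1)
    print(nearest_point_to_rays(o0, d0, o1, d1))  # approximately [0, 0, 500]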

user-1aa180 29 May, 2023, 09:36:57

thank you!

user-eb6164 29 May, 2023, 20:28:42

I am having an issue with Pupil Capture today: it is not allowing me to set the fixation threshold. When I watched the recorded eye-tracking video, everything I fixated on was wrong, and I discovered that the minimum fixation duration seems empty or too low, and I cannot change it. I do not want to reinstall the software because I have already defined my surfaces. What can be done? I attached an image.

user-eb6164 29 May, 2023, 20:31:08

Chat image

user-eb6164 29 May, 2023, 20:35:08

Also, if I need to reinstall everything, is there a way I can back up my surfaces and use them again? Any clear guide for this?

user-cdcab0 30 May, 2023, 04:33:28

You'll find your settings in C:\Users\<username>\pupil_capture_settings, including surface definitions. You can back up those files if you want to be safe

user-eb6164 30 May, 2023, 20:06:46

thank you

user-5fbe70 30 May, 2023, 08:17:33

pupil-tutorials

user-74c615 30 May, 2023, 10:12:50

Hi!! I am wondering if Pupil Invisible can record pupil size data? I can't find this data in the CSVs. Thank you!!!!!

user-d407c1 30 May, 2023, 10:15:27

Hi @user-74c615 ! Pupil Invisible does not perform pupillometry. The eye cameras are off-axis, and for some gaze angles the pupils aren't visible. This is not a problem for the gaze estimation pipeline, as the neural network leverages higher-level features in the eye images, but it does make it difficult to do pupillometry robustly.

user-74c615 30 May, 2023, 10:13:05

πŸ‘€ πŸ‘

user-74c615 30 May, 2023, 10:13:28

@user-cdcab0

user-74c615 30 May, 2023, 10:22:00

thank you for your reply!!!

user-74c615 30 May, 2023, 10:31:27

I have another question: does Core have this ability?

user-d407c1 30 May, 2023, 10:33:51

Yes, Pupil Core and Neon (in the future) can do pupillometry.

user-74c615 30 May, 2023, 10:35:20

ok ! thank you !

user-740d7c 30 May, 2023, 12:59:36

Hi everyone, I'm pretty new to Unity and Pupil Core, so I'd be glad if someone could help me! Is it possible to use a totally custom-made calibration instead of the one that pops up when I connect my Pupil Core glasses and start the Unity project? How can I separate the calibration from all the other scripts that I need? Is there an efficient way to do it? Thanks in advance!

nmt 30 May, 2023, 19:12:57

Hi @user-740d7c! The demo calibration scene can be modified, i.e. it's possible to change the number and positions of the reference targets: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#calibration-settings-and-targets. Would that be sufficient for your needs?

user-744321 30 May, 2023, 13:16:18

Hello, I'm trying to find out the accuracy of the calibration for a recording, but I can't find it in the data. Can I access it directly? Thank you!

nmt 30 May, 2023, 19:19:04

Hey @user-744321! You'll need to recompute the accuracy and precision values. Here's a tool you can use for this purpose: https://github.com/papr/pupil-core-pipeline

user-5fbe70 30 May, 2023, 13:17:42

Hi @papr! This is a set of experimental data. I am using 'Aggregating Fixation Metrics Inside AOIs' - how do I export the sections.csv file? Thank you!

Chat image

nmt 30 May, 2023, 19:22:39

Hi @user-5fbe70! Would you be able to clarify what Aggregating Fixation Metrics Inside AOIs is? Are you referring to this Alpha Lab content: https://docs.pupil-labs.com/alpha-lab/gaze-metrics-in-aois/#define-aois-and-calculate-gaze-metrics ?

user-908b50 30 May, 2023, 21:51:48

What do negative values in the absolute time column mean? This happened for 4 of my participants. I am using those values to calculate durations in my code.

nmt 31 May, 2023, 10:32:38

Pupil Time can indeed be negative. Read more about that here: https://docs.pupil-labs.com/core/terminology/#timestamps
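
Since Pupil Time has an arbitrary starting point, only differences between timestamps are meaningful, and durations are unaffected by the sign. A toy example with made-up values:

    # Negative Pupil timestamps are valid; durations still come out positive
    start_ts, end_ts = -132.6, -131.1  # seconds on the Pupil clock
    duration = end_ts - start_ts
    print(duration)  # 1.5, same as with positive timestamps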

user-908b50 30 May, 2023, 22:03:32

For these participants, the gaze timestamps, base data, and gaze point (x and y), as well as eye centre (xyz) coordinates are all negative.

user-5fbe70 31 May, 2023, 01:56:02

@nmt Yes! I found that 'Define AOIs and Calculate Gaze Metrics' only works with the products listed: Pupil Invisible, Neon, and Cloud. I'm using Pupil Core and can't get the sections.csv file. 😭

nmt 31 May, 2023, 10:35:15

Yes, that tutorial is meant for post-Reference Image Mapper enrichments using Neon or Invisible. That said, you can use the Surface Tracker plugin in Pupil Core to map your gaze to flat surfaces in the environment: https://docs.pupil-labs.com/core/software/pupil-player/#surface-tracker. The plugin gives you the option to generate different AOIs. The metrics would then need to be computed yourself, but it shouldn't be too difficult πŸ™‚
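
For instance, a minimal sketch of such metrics - fixation count and mean duration per surface/AOI - assuming one fixations_on_surface_<name>.csv per AOI in a Pupil Player export (paths illustrative):

    from pathlib import Path
    import pandas as pd

    for csv in Path("exports/000/surfaces").glob("fixations_on_surface_*.csv"):
        df = pd.read_csv(csv)
        # Fixations repeat once per overlapping world frame; deduplicate,
        # and keep only those actually on the surface
        df = df.drop_duplicates("fixation_id")
        df = df[df["on_surf"] == True]
        print(f"{csv.stem}: {len(df)} fixations, "
              f"mean duration {df['duration'].mean():.0f} ms")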

user-7a517c 31 May, 2023, 11:10:12

Hello! Under the support webpage, I see that an Onboarding Workshop is included with Core / Invisible. May I ask if anyone knows how to register for this workshop? Many thanks πŸ™‚

nmt 31 May, 2023, 11:12:52

Hey @user-7a517c! Please reach out to info@pupil-labs.com with your original order id and someone will assist you with getting that scheduled!

user-7a517c 31 May, 2023, 11:18:21

Thank you @nmt πŸ™‚

user-6cf287 31 May, 2023, 15:09:52

Hi, I have a question regarding the eye tracker calibration. We run studies where the screen is projected onto a wall about 2-2.5 meters away from the participant. How should we calibrate the eye tracker for this situation? Do we just use the card, or do we also have to place the calibration points on the screen digitally or manually? The second question is: do we need to calibrate the eye tracker for every participant, or is it enough to calibrate it once and save the settings? Thank you.

nmt 31 May, 2023, 16:38:49

Hey @user-6cf287 πŸ‘‹. Either would work. The key point is to make sure you capture sufficient gaze angles to cover the screen as part of the calibration choreography. You'll need to calibrate each wearer. We recommend a new calibration whenever they put on the headset.

user-1c31f4 31 May, 2023, 17:24:07

Hey folks, very quick question -- what version of Python do you recommend I use with the Network API without running into dependency issues?

user-cdcab0 01 June, 2023, 01:34:55

Are there specific dependencies you're concerned with? pyzmq requires at least Python 3.3, but it looks like msgpack recently dropped support for Python 3.6. Generally I'd recommend using the latest stable Python release you can, but it seems you need to be at least on 3.7 to use the Network API. If you need to use Python 3.6 or earlier, it seems you'll need to install an older version of msgpack as well.

End of May archive