core


user-f641a4 02 September, 2022, 08:07:38

Hi, is it possible to subscribe to the normalized fixation points on a surface directly using a zmq filter such as "surfaces.surface_one..."? If not, are there any reference implementations of this filtering in C++? Thanks

papr 02 September, 2022, 08:36:55

Hi, you will need to subscribe to surfaces.<surface name>. It will include the surface-mapped fixations if the realtime fixation detector is running. The field is named fixations_on_surfaces instead of gaze_on_surfaces.

user-f641a4 02 September, 2022, 08:49:02

So, if I want to subscribe to fixations on a surface named a, it would be

surfaces.a.fixations_on_surfaces?

papr 02 September, 2022, 09:13:56

No, the fixation data is not published in a message of its own. You need to subscribe to surfaces.a and then extract the fixations from the message (see the fixations_on_surfaces field).
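
A minimal sketch of that subscription (assuming a surface named "a", Pupil Remote on its default port 50020, and msgpack >= 1.0):

import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote for the subscription port
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://localhost:50020')
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

# Subscribe to all messages of the surface named "a"
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f'tcp://localhost:{sub_port}')
subscriber.subscribe('surfaces.a')

while True:
    topic, payload = subscriber.recv_multipart()
    message = msgpack.loads(payload)
    # Only populated while the realtime fixation detector is running
    for fixation in message.get('fixations_on_surfaces', []):
        print(fixation['norm_pos'])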

user-8697b3 02 September, 2022, 11:14:23

Hi, is the pupil core MRI compatible?

nmt 02 September, 2022, 11:30:15

Hi @user-8697b3 👋. Would you be able to elaborate a bit on your proposed use of eye tracking and MRI compatibility?

user-632640 02 September, 2022, 21:59:21

Hi! I’m a beginner to this and haven’t messed with any code. I’m just trying to use the Downloadable software pupil capture for macOS and I’m having trouble. When I switch to local USB it’s telling me that the cameras are in use or blocked. How do I fix this?

papr 03 September, 2022, 07:57:48

Please see https://github.com/pupil-labs/pupil/issues/2240

user-6e1219 03 September, 2022, 06:58:05

Hello, how can I connect and use my eye tracker with a smartphone?

papr 03 September, 2022, 07:58:57

Pupil Core requires a tablet or computer running Pupil Capture. 🙂

user-b7599b 04 September, 2022, 11:35:07

hi, I wrote some code to extract gaze and pupil data from Unity; however, for some reason it only works in conjunction with the old Pupil Capture software (1.0). When using Pupil Capture 3.5.1, it keeps giving me the error message "notification without timestamp will not be saved". Does anyone have any idea how I can fix this?

papr 05 September, 2022, 06:22:06

Hi 🙂 I am not sure if the two things are related. Could you explain what your code does in more detail?

user-484421 05 September, 2022, 10:05:17

We have a loose wire (the red one) on the eye-tracking camera of our Pupil Core. How do we go about getting it fixed?

papr 05 September, 2022, 10:05:41

Hi! Please contact info@pupil-labs.com in this regard

user-90ba8c 05 September, 2022, 16:53:47

Can anyone tell me what is meant by Outlier Threshold in the validation section of pupil player and why its default is 5 degrees?

user-9429ba 07 September, 2022, 12:23:01

Hi 👋 For post-hoc calibration, Pupil Player removes samples with an angular error greater than the outlier threshold from its calculation. The default is 5.0 degrees, but this can be adjusted. If too many samples are disregarded from the result, this could indicate a problem with the camera setup. You can check out this explainer video for more information: https://www.youtube.com/watch?v=_Jnxi1OMMTc&feature=emb_rel_end

user-80123a 06 September, 2022, 13:40:56

Hi team, I'm working with Pupil. I tried a few lines of code to extract the position of the eyes and the gaze direction. I received a 3D position (x, y, z). What I couldn't find is the position of the origin. Can anyone help me with that? Thanks in advance 🙂

papr 06 September, 2022, 13:49:03

Hi! What file are you using as input?

user-59c06b 06 September, 2022, 15:10:32

Hi, My question is probably answered somewhere but I can't find it. Does pupil labs use a mix of bright and dark pupil methods?

papr 06 September, 2022, 15:11:54

No, Pupil Core only looks for the dark pupil. Reflections are not used.

user-80123a 07 September, 2022, 12:04:24

Hi all, could anyone help me with launching the Python source code of Pupil - eye tracking platform? I followed the two instructions: 1) "> python -m pip install --upgrade pip wheel", 2) "> pip install -r requirements.txt". During the installation, I got one error with setuptools>61.0. And when I launched the application with "> python ./main", I got an error calling git. Thanks in advance

papr 07 September, 2022, 12:07:28

Hi! Please note that you need to clone the source code. It is not sufficient to download the zipped archive.

Which operating system and Python version are you using?

user-969151 08 September, 2022, 01:28:26

Hi, I am learning to use Pupil Core for the first time and have a question about pupil detection and calibration. I followed the Pupil Core Quick Start guide, but during pupil detection the circles around my eyes are blue, not green. Any suggestions on how to resolve this? Also, after calibration, I had a thin green rectangle with orange figures in each corner and a red circle in the middle. Refer to the image below. Does anyone know what this is?

Chat image

nmt 08 September, 2022, 09:58:38

Hi @user-969151 👋. No worries about the blue circle (we updated the colour scheme in our software) – check out the legend in the pye3d detector settings by clicking on the 3d icon in one of the eye windows. For a description of what the green and orange visualisations represent, open the 'Calibration' menu by clicking on the little target icon on the right of the Capture window. The red circle is your gaze point 🙂

user-2ab654 08 September, 2022, 10:20:51

Hi ✌️ I am trying to calculate head position angles via the Head Pose Tracker plugin while looking at a 2D screen. I put 3 AprilTags on my screen, but the calculation of the 3D model uses just one of them. I think this is because the other two tags are not visible all the time, but I don't see why this would be a problem. Is there a way to manually select the markers, or is this no problem as I am just looking at a 2D window anyway? Thanks in advance 😃

Chat image Chat image

nmt 08 September, 2022, 13:30:42

Hi @user-2ab654! It looks like your markers are being detected, as shown by the green semi-opaque overlay. What you'll need to do is build up the 3D model by recording the markers from more angles and perspectives. Do that enough and the markers will appear red. Note that you can then use that model for other recordings (as long as the same markers are present of course). Relevant docs for reference: https://docs.pupil-labs.com/core/software/pupil-player/#head-pose-tracking

user-80123a 08 September, 2022, 13:51:22

Hi all, I want to add Pupil Core to an existing Unity application. So I added a button in the Unity application to launch Pupil Capture (DID IT). Now I want Pupil Capture to run in the background with no graphical interface. In other words, is there a clean way to remove the UI from Pupil Capture? The goal of putting it in the background is to allow me to collect data. Thanks in advance 🙂

papr 08 September, 2022, 13:52:39

There is a --hide-ui flag that you can pass when you call the exe. This will create a hidden window.
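
For example, from Python (a sketch; the executable path below is hypothetical and machine-specific):

import subprocess

# Adjust to wherever Pupil Capture is installed on your machine
capture_exe = r"C:\Program Files (x86)\Pupil-Labs\Pupil v3.5.1\Pupil Capture v3.5.1\pupil_capture.exe"

# --hide-ui starts Capture with a hidden window; recording and the
# Network API keep working as usual
subprocess.Popen([capture_exe, "--hide-ui"])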

user-9fa187 09 September, 2022, 07:35:36

Hi, how are you? I came to Germany from Mongolia and wrote an e-mail to get acquainted with your company and products, but I can't get in touch. What should I do?

wrp 09 September, 2022, 07:39:42

Hi 👋 - I apologize that you haven't received a response yet. Please send us another email to info@pupil-labs.com and reference your discord user name. We are also happy to continue discussion here if relevant to the channel 😸

user-80123a 09 September, 2022, 11:20:07

Hi all, I encountered difficulties running Pupil Core from the source code. 1) When I export the data, I do not find any data in gaze_positions, but the rest of the data is OK (pupil_positions, world_timestamps, ...). 2) I do not receive the stream data from subscribe('gaze'). These two problems do not exist when I run the app from the installed program. Would anyone have any idea? Thank you in advance 🙂

nmt 09 September, 2022, 12:02:13

Hi @user-80123a! The behaviour you describe is usually indicative of no calibration. Did you calibrate the eye tracker when running from source?

user-80123a 09 September, 2022, 12:06:32

I forgot, I cannot use calibration because the eye tracker I am using right now does not have a frontal (scene) camera

nmt 09 September, 2022, 12:11:06

To get gaze data, it is necessary to calibrate. Check out the docs for future reference: https://docs.pupil-labs.com/core/#_4-calibration Indeed, a scene camera is also necessary 🙂

user-80123a 09 September, 2022, 12:09:04

But I will use a new eye tracker soon, and I hope my code will be ready when the new eye tracker arrives

user-ff97c9 12 September, 2022, 08:32:12

Hello, best productive days to everybody 🙂

I'm going through the documentation for a recent query in our lab regarding the fixation export file, fixations.csv.

Does the frame index of the video begin with 0 or 1? That is, if the first fixation is, say, at frame index 4, is it at the 5th or the 4th frame of the video?

We're developing a frame-time-critical approach in our present work, so a single frame is very important for us. Hence the basic question, but it is not written explicitly in the documentation.

papr 12 September, 2022, 08:39:55

It starts at zero 🙂 You can verify it in Pupil Player. Player displays the current world frame index in the timeline, next to the relative world video time, as well as information about the current fixation (incl the fixation id) in the fixation detector menu.

papr 12 September, 2022, 08:49:47

May I suggest an alternative approach that may resolve the variance you are experiencing?

Instead of comparing frames, use the fixation start timestamp. Fixations are based on gaze data, which has a higher temporal resolution than the scene camera. Assuming an event at scene video frame index J and time T, do not calculate the number of frames until the next fixation (fixation->start_frame_index - J) but the duration in seconds until this fixation (fixation->start_timestamp - T)
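
A small sketch of that computation on an exported fixations.csv (the event timestamp and export path are hypothetical):

import csv

EVENT_TIMESTAMP = 1234.567  # pupil time of your event, in seconds

with open('exports/000/fixations.csv') as f:
    fixations = list(csv.DictReader(f))

# First fixation that starts after the event, and the latency in seconds
next_fix = next(
    fx for fx in fixations if float(fx['start_timestamp']) > EVENT_TIMESTAMP
)
latency_s = float(next_fix['start_timestamp']) - EVENT_TIMESTAMP
print(f"fixation {next_fix['id']} starts {latency_s:.3f} s after the event")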

user-ff97c9 12 September, 2022, 08:44:46

Great, thank you very much for your swift support as always 🙂

The variance in frame beginning and end points in the fixations output file has somehow become inconsistent in our current data set of about 40 participants. I'll most likely follow up soon with more detailed questions, which I hope will benefit everybody.

Best wishes.

papr 12 September, 2022, 08:45:25

What do you refer to by "frame beginning"?

user-d72f39 12 September, 2022, 10:54:45

Hello

user-d72f39 12 September, 2022, 10:55:27

Can you please provide me with a step by step instructions of how to export pupil size data from my recording in an excel format?

user-9429ba 12 September, 2022, 13:00:07

Hi! To export pupil size data: load your recording in Pupil Player, make sure you have the Raw Data Exporter plugin enabled from the settings menu, and click on the Export button (down arrow) on the far left-hand side of the Player window. This will export .csv files you can open in Excel. Full breakdown of the pupil_positions.csv export here: https://docs.pupil-labs.com/core/software/pupil-player/#pupil-positions-csv
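
A minimal post-export sketch for reading that file (pandas assumed; the export path depends on your recording):

import pandas as pd

df = pd.read_csv('exports/000/pupil_positions.csv')

# 'diameter' is in eye-image pixels; 'diameter_3d' (pye3d) is in mm.
# Low-confidence samples are likely inaccurate and usually discarded.
high_conf = df[df['confidence'] >= 0.6]
print(high_conf[['pupil_timestamp', 'eye_id', 'diameter', 'diameter_3d']].head())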

user-d72f39 12 September, 2022, 10:56:01

I used the core device to do the recording

user-219de4 12 September, 2022, 20:50:34

Hello! Where can I find the audio setting in Pupil Capture? I rendered the videos, and they have no audio input or audio.timestamp files.

papr 13 September, 2022, 06:47:20

Hi, we removed audio-recording capabilities from Capture ~3-4 years ago. The feature did not work stably and reliably enough. I would recommend recording the audio with an external audio recording app.

user-219de4 13 September, 2022, 14:25:44

Our study aims to observe children's gaze in response to a partner's verbal labels during interaction, so we need the full transcript to further identify the key timings (they are not fixed as in other experimental paradigms). Do you know of any audio program that can receive external triggers?

papr 13 September, 2022, 14:33:27

I think the easiest option (if your experimental design allows it) would be to record a clearly visible and audible "clap", e.g. with something like a clapperboard. Then it is just a matter of finding the clap time within the audio recording and subtracting it from the transcript timestamps. This shifts the relative time starting point from file-beginning to the clap event.

Next, find the frame index during which the clap happens and extract its absolute timestamp. Add the absolute time to the relative transcript time, shifting the transcript time to pupil recording time.
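
In numbers (all values hypothetical):

# Clap event, measured once in each time base
clap_in_audio = 12.40        # seconds from audio-file start
clap_in_recording = 2054.88  # absolute pupil timestamp of the clap frame

utterance_in_audio = 95.20   # transcript time of some verbal label

# 1) shift the transcript's reference point from file start to the clap,
# 2) then shift into pupil recording time
utterance_pupil_time = clap_in_recording + (utterance_in_audio - clap_in_audio)
print(utterance_pupil_time)  # 2137.68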

user-b9005d 13 September, 2022, 17:42:55

In your pupil positions export, are your theta and phi outputs using the physics or mathematics convention? We were hoping to use your sphere radius output in combination with these two to generate a calculation for degrees of eye movements

papr 13 September, 2022, 18:30:48

The sphere radius is a constant value. Phi and theta do not follow either convention; they are rotated by a specific amount. Let me look up the link.

papr 13 September, 2022, 18:31:39

https://docs.pupil-labs.com/core/terminology/#coordinate-system see the eye model section

user-80123a 15 September, 2022, 08:25:01

Hi all, is there a metric to identify if the calibration went well or not? Similar to the confidence parameter in the csv data after exporting with Pupil Player, but one that can be used before starting to record. Thanks in advance.

papr 15 September, 2022, 08:26:08

Hi! The Accuracy Visualizer calculates the accuracy in angular error. That should be suitable.

user-80123a 15 September, 2022, 09:01:40

Hi again, another question: can I subscribe to a topic other than "gaze"? For example, if I want different information like pupil coordinates or angular error precision... Is there a list of topics to subscribe to? Thanks in advance

papr 15 September, 2022, 09:06:15

Yes, subscribing to other topics is possible. If you subscribe to the empty string, you will receive everything. Use it to find the data that you are interested in and then use a more specific subscription string later
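
A minimal discovery sketch along those lines (default Pupil Remote port assumed):

import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://localhost:50020')
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f'tcp://localhost:{sub_port}')
subscriber.subscribe('')  # empty string: receive every topic

# Print each distinct topic once to see what is available
seen = set()
while True:
    topic, *_ = subscriber.recv_multipart()
    if topic not in seen:
        seen.add(topic)
        print(topic.decode())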

user-6971cf 15 September, 2022, 13:57:07

Sorry, I am new to this project. Does this software require an infrared camera, or would a regular desktop camera be sufficient to use the eye tracking technology?

papr 15 September, 2022, 13:58:56

Hi! Welcome to the community. Yes, an IR camera is required. It is also required that it is head-mounted. A camera on the display/table in front of the subject (remote eye tracking) will not work with our software.

user-6971cf 15 September, 2022, 13:57:18

also hi everyone!

user-6971cf 15 September, 2022, 14:14:29

Ok thank you!

user-52dfe7 15 September, 2022, 14:57:26

Hello and congratulations on the very nice work! I am interested in the Pupil Core system. I was wondering if it can be used with an ARM-based/embedded system (e.g. a ZedBoard) instead of a normal desktop/laptop?

papr 15 September, 2022, 14:58:37

Hi, the pre-built software is only compiled for x86_64. You would need to install the dependencies and run from source. Note that the board is likely not powerful enough to run the pupil detection in real time.

user-52dfe7 15 September, 2022, 15:00:57

I see, thanks for the answer - is all the source code available? Speaking about power, does the system require a GPU?

papr 15 September, 2022, 15:03:07

https://github.com/pupil-labs/pupil. No GPU is needed for the calculations; I am referring to CPU power in this case.

Note that the software needs to be able to open a window. So running it on an embedded system without a screen might not work.

Raspberry Pi users have used this tool https://github.com/Lifestohack/pupil-video-backend/ in the past to stream the video from the RPi to a laptop running the software

user-52dfe7 15 September, 2022, 15:06:31

OK, thanks a lot for the answers and the links. I'll check them to get a better idea - it's a very interesting system!

papr 15 September, 2022, 15:07:13

Would you mind sharing a bit more about what you are trying to accomplish / your use case?

user-52dfe7 15 September, 2022, 15:11:02

Broadly speaking, I would like to understand how easy it would be to use it in an embedded/ARM-based system with a camera and a projecting screen

papr 15 September, 2022, 15:12:48

What type of camera are we talking about here? Something like a webcam?

user-52dfe7 15 September, 2022, 15:14:13

more like a wearable device - I am afraid I cannot go into more details for the moment

papr 15 September, 2022, 15:15:12

ok, just wanted to make sure that you were not attempting to perform remote eye tracking, which is not possible with our software. Good luck with your project!

user-b10192 16 September, 2022, 01:57:44

Hi there, from Pupil Service I subscribe to the frame topic and am trying to get the image so I can save it myself:

import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
ip = 'localhost'
port = 50020
pupil_remote.connect(f'tcp://{ip}:{port}')

# Request 'SUB_PORT' for reading data
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

# Assumes sub_port to be set to the current subscription port
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f'tcp://{ip}:{sub_port}')
subscriber.subscribe('frame')

while True:
    array = subscriber.recv_multipart()
    item = msgpack.loads(array[2])
    print(item)
    break

I get an array which contains these 3 elements: [b'frame.eye.0', [email removed] b'\xff\xd8\xff\xc0\x00\x11\x.........................']

How can I translate this into an image or numpy array that I can save as an mp4 video? I am receiving this error when I try to load the 3rd element with msgpack:

item = msgpack.loads(array[2])
File "msgpack_unpacker.pyx", line 202, in msgpack._cmsgpack.unpackb
msgpack.exceptions.ExtraData: unpack(b) received extra data.

papr 16 September, 2022, 07:00:08

Hi, check out this example https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py
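
In the same spirit as that helper, a minimal sketch (assuming the Frame Publisher plugin's format is set to BGR; your b'\xff\xd8...' payload indicates JPEG, so either switch the format in Capture or decode with cv2.imdecode instead):

import zmq
import msgpack
import numpy as np

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://localhost:50020')
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f'tcp://localhost:{sub_port}')
subscriber.subscribe('frame.eye.0')

topic, payload, pixels = subscriber.recv_multipart()
meta = msgpack.loads(payload)  # the second part is msgpack metadata

# With format 'bgr', the third message part is the raw pixel buffer
img = np.frombuffer(pixels, dtype=np.uint8).reshape(
    meta['height'], meta['width'], 3
)
print(img.shape)  # frames like this can then be passed to a video writer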

user-80123a 16 September, 2022, 09:31:56

Hi all, is it possible to use the Network API from a different programming language? In C#, for example.

papr 16 September, 2022, 09:32:57

Yes, that is possible. https://github.com/pupil-labs/hmd-eyes/ has a c# implementation of the api

user-b10192 16 September, 2022, 17:36:42

Hi all, thanks to @papr I can now get images from the Network API. But I have another problem: I cannot get gaze information anymore. I was previously receiving gaze data by using subscriber.subscribe('gaze'). Now I am only receiving the following topics: pupil.0.2d, pupil.0.3d, frame.world, frame.eye.0. How can I get the gaze topic back? Do I need to change anything in Pupil Capture?

papr 16 September, 2022, 17:43:29

Hi, you need to calibrate to receive gaze data.

user-219de4 16 September, 2022, 23:31:19

Hello, what kind of material do you use to print the camera extender?

user-53a74a 19 September, 2022, 15:21:55

Hi Pupil Labs community! I'm thinking to use a chin rest along with Pupil Core. Are there any commercially available chin rests that you folks recommend?

user-fcda8b 19 September, 2022, 15:34:12

Hi, I’m getting a lot of messages like this from Pupil Capture:

2022-09-19 16:47:33,981 - eye0 - [DEBUG] video_capture.uvc_backend: Received non-monotonic timestamps from UVC! Dropping frame. Last: 19869.178274, current: 19865.838207

How can this be and is there anything that can be done about it? It seems to disrupt pupil recognition quite severely. (I’m running from source on Mac OS Monterey in native (arm64) mode.)

papr 19 September, 2022, 16:25:49

Hey, this usually happens after a reconnect. You might have a loose connection

user-fcda8b 19 September, 2022, 16:42:00

Thanks for your quick reply. But there is no external connection, everything is running locally on the same machine. How could a local connection become loose?

papr 19 September, 2022, 16:51:03

I am talking about the physical connection of the cameras. Sometimes the connectors become loose. I recommend checking the headset

user-fcda8b 19 September, 2022, 17:00:01

Ah, I see. Thanks, I'll check.

user-219de4 19 September, 2022, 18:02:13

Hello! I am following up on my earlier question regarding the headset extender. I found the geometry file, but I'm not sure of the appropriate material for printing. PLA, ABS, nylon? Do you have any advice?

user-fcda8b 19 September, 2022, 18:06:30

I recently printed it in PLA. Works nicely!

user-219de4 19 September, 2022, 19:52:21

great to know! thanks for sharing! 😀

user-632640 20 September, 2022, 22:41:33

Hi! Wondering what type of machine/laptop everyone is using for the Pupil Core software, and what works best to avoid the overloading/crashing that I've been noticing on my Mac laptop.

papr 21 September, 2022, 09:04:51

The M1 macs work best in terms of performance. There is a known issue on Unix systems that may cause a crash if the hardware disconnects unexpectedly. If you are encountering the crash often, you might have a loose connection in your hardware.

I am working on debugging the mentioned software issue as we speak.

user-2ab654 21 September, 2022, 08:57:32

Hi guys ✌️
I have recordings that are about 25 minutes long and am trying to do marker detection to extract the head pose with the plugin. The problem is that Pupil Player keeps crashing. Does anyone know if there is a way to resolve it? The problem shouldn't be CPU, GPU or RAM, as it uses only 5-10% of these. Have a great day!

papr 21 September, 2022, 09:05:56

Hey! Could you share the player.log file after having attempted the marker detection and reproducing the crash? You can find it in the pupil_player_settings folder.

user-2ff80a 21 September, 2022, 13:04:26

Hello, a wire to one of our pupil cams broke, and we would like to fix it. I figured out that I have to get it crimped. Do you maybe have the serial numbers of the connector parts, or do you even sell replacements? Thank you.

Chat image

papr 21 September, 2022, 13:05:14

Please contact info@pupil-labs.com in this regard

user-2ff80a 21 September, 2022, 13:57:53

thanks

user-193e84 21 September, 2022, 18:31:58

Copying this to the correct thread: https://discord.com/channels/285728493612957698/633564003846717444/1022212302986031218

papr 22 September, 2022, 06:08:18

Hi, this sounds like a hardware issue with the headset. Can you make sure the small connector at the right eye camera is correctly plugged in?

user-ac1446 22 September, 2022, 15:31:01

Is the stated discrepancy in accuracy/precision between the VR/AR add-ons and Core true in real-world scenarios? If so, is there a technical reason for it?

user-fcda8b 24 September, 2022, 15:40:35

I noticed that the pupil detection parameters in the eye windows (such as "Pupil intensity range") are not persistent across sessions, unlike the plugin parameters for the video source, for example. Is there a way to change that?

papr 26 September, 2022, 11:14:29

I am able to reproduce the issue. Let me check why that is.

papr 26 September, 2022, 11:47:35

Hi, good catch! The issue will be fixed in the next Pupil Core release. Meanwhile, you can use this plugin to fix the issue https://gist.github.com/papr/9b3ba71b227bc6f092088077b334cf9c

user-80123a 26 September, 2022, 08:26:39

Hello everyone, when I use Pupil Player, I can see a video of my gaze on the screen (in 2D). But when I export and open the gaze_positions.csv file, I see a 3D gaze position (X, Y, Z). My question is: is the gaze position shown in the Pupil Player video simply the (Z and X) or (Z and Y) coordinates of the gaze position? Or is there a geometric transformation to translate the 3D coordinates into 2D coordinates? My goal is only to get the 2D coordinates of the gaze position. Thanks in advance 🙂

papr 26 September, 2022, 08:39:52

There is a geometric transformation involved that corrects for the lens distortion https://docs.pupil-labs.com/core/terminology/#coordinate-system

papr 26 September, 2022, 08:43:13

The norm_pos_* fields are what you are looking for
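
A minimal sketch of reading those fields and mapping them into pixel coordinates (pandas assumed; the resolution and export path are assumptions):

import pandas as pd

df = pd.read_csv('exports/000/gaze_positions.csv')

# norm_pos_* are normalized scene-image coordinates:
# origin bottom-left, (1, 1) top-right
WIDTH, HEIGHT = 1280, 720  # assumed world-camera resolution

df['x_px'] = df['norm_pos_x'] * WIDTH
df['y_px'] = (1.0 - df['norm_pos_y']) * HEIGHT  # flip y for image convention
print(df[['gaze_timestamp', 'x_px', 'y_px']].head())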

user-c8ad1f 26 September, 2022, 11:26:51

Hello, new user here. I wanted to know if there is an existing script to stream live data into MATLAB. Thanks!

papr 26 September, 2022, 11:27:29

Hi, welcome! Check out https://github.com/pupil-labs/pupil-helpers/tree/master/matlab

user-fcda8b 26 September, 2022, 12:39:26

It happens irregularly, but on average every 10 minutes. Sometimes it recovers after a while, but often I have to restart.

papr 26 September, 2022, 12:42:42

One dropped frame every 10 minutes, do I understand this correctly?

user-fcda8b 26 September, 2022, 12:49:40

No, many dropped frames. And when the dropping of frames stops, the cameras are (and stay) out of sync.

papr 26 September, 2022, 12:51:00

out of sync, as in if you blink with both eyes at the same time, the blink appears in one eye windows first and then in the other?

user-fcda8b 26 September, 2022, 13:13:42

That I couldn’t say (hard to see what’s happening during a blink...). But I keep getting "Resetting history" messages from the blink plugin, and the red circle in the world window splits into two independent ones, presumably one for each eye.

papr 26 September, 2022, 13:20:48

Just realised that I missed an important detail in your initial message. The timestamp would be zero on reconnect. Yours is just inconsistent. I am fairly sure that there is something going wrong with the handling of the camera's hardware timestamps. But I can't tell what it is right now. This will require more time to investigate.

As a workaround, you can switch to software timestamps. These are a bit less precise but more consistent over time.

diff --git a/pupil_src/shared_modules/video_capture/uvc_backend.py b/pupil_src/shared_modules/video_capture/uvc_backend.py
index f376eadf5..945b168cf 100644
--- a/pupil_src/shared_modules/video_capture/uvc_backend.py
+++ b/pupil_src/shared_modules/video_capture/uvc_backend.py
@@ -272,7 +272,7 @@ class UVC_Source(Base_Source):
     def configure_capture(self, frame_size, frame_rate, uvc_controls):
         # Set camera defaults. Override with previous settings afterwards
         if "Pupil Cam" in self.uvc_capture.name:
-            if platform.system() == "Windows":
+            if platform.system() in ("Windows", "Darwin"):
                 # NOTE: Hardware timestamps seem to be broken on windows. Needs further
                 # investigation! Disabling for now.
                 # TODO: Find accurate offsets for different resolutions!

user-fcda8b 26 September, 2022, 13:23:58

Thanks a lot, I’ll give it a try!

user-fcda8b 26 September, 2022, 14:25:30

A first test indicates that the problem may be fixed completely by this workaround!

user-632640 26 September, 2022, 14:33:56

so it seems all new MacBook Pros have the M2 chip. My question is whether that may cause any trouble. The M1 is still in the MacBook Air, but that has an 8-core CPU and is comparable with an Intel i5 (maybe i7), which Pupil Labs lists as the minimum requirement. Does anyone have any experience with M2 chips? And if not, what concrete non-Apple laptops are people using without hiccups?

papr 26 September, 2022, 14:35:07

Pupil Core performs much better on M1 than any Intel CPU I know 🙂 M2 shouldn't be any issue either.

user-632640 26 September, 2022, 14:47:28

25+ minute recordings might be likely for the experiments we plan to do. Do you have a specific recommendation of a laptop for this?

papr 26 September, 2022, 14:53:21

Usually, we recommend splitting your recordings into blocks of no longer than 20-30 minutes, each with their own calibration. This has mostly two reasons: over time, there usually is headset slippage, reducing the gaze estimation accuracy; and depending on your planned analysis, Pupil Player might need more RAM than what your system has to offer. The longer the recording, the more RAM your system should have.

Note that I cannot give any specific recommendations as to the amount of RAM or the system model. I can speak from personal experience with the M1 MBA, and it works fine for my personal use cases (development of the Pupil Core software).

user-2798d6 27 September, 2022, 22:42:08

found device

user-736bf7 28 September, 2022, 08:28:27

Hello! I am trying to tag different windows of a software application for a study. My test persons will sit between 80 and 100 cm away from the screen. Unfortunately, the tags (tag36h11) seem to need to be quite large, at 260 x 260 pixels, for the surface to be robustly detected. Are there ways to reduce the size of the tags and still have good tracking of the areas of interest?

papr 28 September, 2022, 09:37:07

Note that the markers require sufficient white border to be recognized. Is that included in the 260x260 pixel area?

Also, would printing the markers and attaching them to the outside of the screen be an alternative for you?

user-430fc1 28 September, 2022, 09:32:54

Hi, when I click freeze model in Pupil Capture the red ellipse suddenly changes shape, becomes more round, and appears not to fit the pupil as well as when the model was not frozen. Is this expected behavior?

user-80123a 29 September, 2022, 07:57:05

Hello everyone, is it possible to make Pupil Player work without drag and drop, using command-line parameters for example? My goal is to develop an application that uses Pupil Capture and Pupil Player. Inside the application, I can: (1) launch Pupil Capture, (2) launch some experiments, (3) start recording (using the Network API), (4) stop recording (using the Network API),
(5) export the data. I am stuck at step (5), because I have to quit the application and launch Pupil Player to export the data.
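
For step (5), one option is to point Pupil Player at the recording from the command line instead of drag and drop; a hedged sketch (paths hypothetical; passing the recording directory as an argument works at least when running from source via "python main.py player <recording>"):

import subprocess

# Adjust both paths to your installation and recording
player_exe = r"C:\Program Files (x86)\Pupil-Labs\Pupil v3.5.1\Pupil Player v3.5.1\pupil_player.exe"
recording = r"C:\recordings\2022_09_29\001"

# Opens the recording directly, without drag and drop
subprocess.Popen([player_exe, recording])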

papr 29 September, 2022, 08:02:58

What kind of data are you looking to export? Are you performing any kind of post-processing? e.g. blinks or fixations?

user-80123a 29 September, 2022, 14:35:27

Hello again, is it normal to have coordinate values outside of [0, 0] and [1, 1] for x_norm and y_norm in the file gaze_positions_on_surface_<surface_name>.csv?

papr 29 September, 2022, 14:36:25

Yes, that happens when the gaze was outside of the surface. Note that it is recommended to remove low-confidence gaze values, as they are likely inaccurate.
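
A minimal filtering sketch along those lines (pandas assumed; the surface name and confidence threshold are placeholders):

import pandas as pd

df = pd.read_csv('exports/000/surfaces/gaze_positions_on_surface_Screen.csv')

# Keep only gaze that is on the surface and of reasonable confidence
on_surface = df[df['on_surf'] & (df['confidence'] >= 0.6)]
print(on_surface[['gaze_timestamp', 'x_norm', 'y_norm']].head())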

user-53a74a 29 September, 2022, 15:15:24

Hello! I'm using Pupil Core and the pupil detector keeps jumping across the eye camera's FOV instead of staying on the user's pupil. I tried changing the angle/position of the camera, the framerate, and the brightness of the room; I even rebooted the PC and turned off one of the right/left eye cameras, but nothing improved the situation. Is this a common issue with Pupil Core?

papr 29 September, 2022, 15:17:00

Could you create a Pupil Capture recording of you rolling the eyes (capturing different eye positions) and share it with data@pupil-labs.com for concrete feedback?

user-bbd687 29 September, 2022, 15:56:18

how to solve this problem? win10

Chat image

user-62c153 29 September, 2022, 21:09:07

You could try this process:

Windows: videos do not appear in Pupil Capture. This could mean that drivers were not automatically installed when you first ran Pupil Capture. You should first try to run Pupil Capture as administrator (right click pupil_capture.exe > Run as administrator). If that does not work, follow the troubleshooting steps below:

1. In Device Manager (System > Device Manager), View > Show Hidden Devices.
2. Expand the libUSBK Usb Devices, Cameras, and Imaging Devices categories.
3. For each Pupil Cam device (even hidden devices), click Uninstall and check the box agreeing to "Delete the driver software for this device", then press OK.
4. Unplug the Pupil headset (if plugged in) and plug it back in.
5. Right click on pupil_capture.exe > Run as administrator. This should install drivers automatically.

found here: https://docs.pupil-labs.com/core/software/pupil-capture/

I was having issues getting my cameras to detect and this fixed it for me

user-bbd687 29 September, 2022, 15:57:44

i have a usb cam

Chat image

user-bbd687 29 September, 2022, 15:58:04

@papr

user-01c0ae 29 September, 2022, 19:20:09

Hello everyone, the Pupil Capture program does not display video from the glasses. Laptop model: MacBook Pro (13-inch, 2019, four Thunderbolt 3 ports), 2.4 GHz quad-core Intel Core i5 processor. Glasses model: Pupil Core. Please tell me what the problem might be; thank you in advance.

user-d407c1 30 September, 2022, 06:47:30

Hi @user-01c0ae ! Which version of MacOS are you using? If you are using Monterey (12) or above, please check out this note https://github.com/pupil-labs/pupil/issues/2240

user-62c153 29 September, 2022, 20:51:43

Hi, I'm working with an older Pupil Pro binocular rev 037 headset and Pupil Capture v3.5.1, but am having some issues with the setup: I can't detect both pupil cameras simultaneously, and I get occasional program crashes when selecting camera sources, plus blue-screen crashes. Is there some older software more suited to this hardware that I can use on a Win10 machine?

user-62c153 03 October, 2022, 14:07:03

Hi Pupil Labs, Just following up on this request. Is there any support for older Pupil Pro Binocular Rev 037 models? Thanks.

user-f93379 30 September, 2022, 06:59:24

Hi all! When setting up the camera, an error occurs and the program crashes if the camera gets too close to the field of dots.

Chat image

papr 30 September, 2022, 07:50:20

Hi, this is a known issue https://github.com/pupil-labs/pupil/pull/2248 Use the linked user plugin to avoid it.

Note: The camera calibration is usually not necessary and is not to be confused with the gaze calibration, which is necessary.

user-f93379 30 September, 2022, 07:20:53

Colleagues, tell me, what is the difference in calibration with the "show undistorted image" flag on and off? When calibrating, you can see that the calibration area changes.

How does enabling the flag affect the calibration?

papr 30 September, 2022, 07:56:12

Pupil Core estimates gaze in two coordinate systems: the distorted 2D image and the undistorted 3D camera coordinate system. https://docs.pupil-labs.com/core/terminology/#coordinate-system Instead of undistorting the image, which requires a lot of CPU resources, Pupil Capture uses a simplified mathematical representation of the distortion (the camera intrinsics) to transform selected points between the two coordinate systems.

The "show undistorted image" option uses the current intrinsics to create a preview of how well the intrinsics represent the actual distortion. If the intrinsics fit well, any straight edge in real life will also appear straight in the undistorted image. Inaccurate intrinsics will have a negative effect on the gaze estimation accuracy.

Note, that this is just a preview. The undistorted image is not being used otherwise.
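
For illustration, this is the kind of point transformation involved; a hedged OpenCV sketch with made-up intrinsics (the real intrinsics are estimated per camera):

import cv2
import numpy as np

# Hypothetical pinhole camera matrix and distortion coefficients
K = np.array([[700.0, 0.0, 640.0],
              [0.0, 700.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.array([-0.25, 0.10, 0.0, 0.0, 0.0])

# Map a distorted 2D image point into the undistorted camera
# coordinate system (here as normalized x/y on the z=1 plane)
point = np.array([[[800.0, 400.0]]], dtype=np.float32)
undistorted = cv2.undistortPoints(point, K, dist)
print(undistorted)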

user-f93379 30 September, 2022, 07:40:43

How do I start GPU acceleration for pupil capture?

papr 30 September, 2022, 07:56:35

Pupil Capture does not support GPU acceleration. It purely relies on the CPU.

user-bbd687 30 September, 2022, 08:08:51

I still can't use this cam.

Chat image Chat image Chat image

user-bbd687 30 September, 2022, 08:09:01

in win10

user-bbd687 30 September, 2022, 08:09:30

I can't get the image from the cam

user-bbd687 30 September, 2022, 08:10:08

please help me @papr

user-bbd687 30 September, 2022, 08:11:22

I already did what you asked, but it didn't work

user-bbd687 30 September, 2022, 08:14:16

If I open the app that came with Windows 10, I can get videos.

papr 30 September, 2022, 08:18:19

That means that the drivers are not correctly installed. Try installing them manually using steps 1-7 from these instructions https://github.com/pupil-labs/pyuvc/blob/master/WINDOWS_USER.md

user-bbd687 30 September, 2022, 08:14:33

so, I think

user-bbd687 30 September, 2022, 08:14:56

I need your help

user-bbd687 30 September, 2022, 08:16:30

there is some trouble with Pupil Capture v3.5.1

user-bbd687 30 September, 2022, 08:21:28

Thanks for your help. Now I will do it the same way.

user-bbd687 30 September, 2022, 08:34:50

Could you please show a picture of what a successful driver replacement looks like?

Chat image

papr 30 September, 2022, 08:41:48

I don't use Windows at the moment. It might be easiest to jump into the #pupil-voice channel and do a quick screen share to get this issue fixed.

user-bbd687 30 September, 2022, 08:34:57

@papr

user-bbd687 30 September, 2022, 08:45:17

can you give me a doc or a video about this operation?

papr 30 September, 2022, 08:45:45

No, the linked documentation is the only one we have.

user-bbd687 30 September, 2022, 08:50:54

I don't know how to use this code.

Chat image

papr 30 September, 2022, 08:51:09

Step 8 does not need to be performed

user-bbd687 30 September, 2022, 08:51:43

ok

user-869b8d 30 September, 2022, 12:06:45

Hi, I would like to know if any of you can help me with information on integrating PsychoPy and Pupil experiments with LSL. UnU

nmt 30 September, 2022, 12:22:24

Hi @user-869b8d 👋. We have an official PsychoPy integration: https://psychopy.org/api/iohub/device/eyetracker_interface/PupilLabs_Core_Implementation_Notes.html Have you seen that already?

user-869b8d 30 September, 2022, 12:22:43

not yet! Thank you ❤️

user-869b8d 30 September, 2022, 12:22:59

If I have any doubts, I'll let you know

user-632640 30 September, 2022, 16:34:22

What is the consensus on using the Pupil Core software with 3rd-party recording devices, like a web camera, instead of the eye tracker?

nmt 01 October, 2022, 12:44:51

Hi @user-632640 👋. Do you mean in the context of remote eye tracking, i.e. having the cam mounted on or near a monitor, or head-mounted, i.e. close to the eye?

End of September archive