👁 core


user-e91538 02 September, 2022, 08:07:38

Hi, is it possible to subscribe to the normalized fixation points on a surface directly using a zmq filter such as "surfaces.surface_one..."? If not, are there any reference implementations of this filtering in C++? Thanks!

user-e3f20f 02 September, 2022, 08:36:55

Hi, you will need to subscribe to surfaces.&lt;surface name&gt;. The messages will include the surface-mapped fixations if the realtime fixation detector is running. The field is named fixations_on_surfaces instead of gaze_on_surfaces

user-e91538 02 September, 2022, 08:49:02

So, if I want to subscribe to fixations on a surface named a, it would be

surfaces.a.fixations_on_surfaces?

user-e3f20f 02 September, 2022, 09:13:56

No, the fixation data is not published in a message of its own. You need to subscribe to surfaces.a and then extract the fixations from the message (see the fixations_on_surfaces field)
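Since there is no dedicated fixation topic, this pattern can be sketched in Python as follows. It assumes Pupil Capture is running locally with Pupil Remote on its default port 50020 and that a surface named "a" has been defined (both assumptions); the helper function name is illustrative.

```python
import zmq
import msgpack

def fixations_from_surface_msg(payload):
    """Pull the surface-mapped fixations out of one surfaces.<name> message."""
    return payload.get("fixations_on_surfaces", [])

def watch_surface_fixations(surface="a", ip="127.0.0.1", remote_port=50020):
    ctx = zmq.Context.instance()
    remote = ctx.socket(zmq.REQ)
    remote.connect(f"tcp://{ip}:{remote_port}")
    remote.send_string("SUB_PORT")      # ask Pupil Remote for the SUB port
    sub_port = remote.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{ip}:{sub_port}")
    sub.subscribe(f"surfaces.{surface}")  # one message per scene frame

    while True:
        topic, payload = sub.recv_multipart()
        msg = msgpack.unpackb(payload)
        for fixation in fixations_from_surface_msg(msg):
            print(fixation)

# watch_surface_fixations("a")  # run with Capture and the surface tracker active
```

The list may be empty for frames without fixations, hence the `.get` with a default.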

user-8697b3 02 September, 2022, 11:14:23

Hi, is the Pupil Core MRI-compatible?

user-4c21e5 02 September, 2022, 11:30:15

Hi @user-8697b3 👋. Would you be able to elaborate a bit on your proposed use of eye tracking and MRI compatibility?

user-632640 02 September, 2022, 21:59:21

Hi! I’m a beginner to this and haven’t messed with any code. I’m just trying to use the Downloadable software pupil capture for macOS and I’m having trouble. When I switch to local USB it’s telling me that the cameras are in use or blocked. How do I fix this?

user-e3f20f 03 September, 2022, 07:57:48

Please see https://github.com/pupil-labs/pupil/issues/2240

user-6e1219 03 September, 2022, 06:58:05

Hello, how can I connect and use my eye tracker with a smartphone?

user-e3f20f 03 September, 2022, 07:58:57

Pupil Core requires a tablet or computer running Pupil Capture. 🙂

user-6e1219 03 September, 2022, 08:05:25

But there is this Pupil Remote app right? Is that for tablet only?

user-e3f20f 03 September, 2022, 08:08:16

If you are referring to Pupil Mobile, that app was deprecated a few years ago and has been removed from the Play Store. If you need to be mobile, we recommend either using our Pupil Invisible product or running Pupil Capture on a compatible tablet, e.g. a Microsoft Surface.

user-b7599b 04 September, 2022, 11:35:07

Hi, I wrote some code to extract gaze and pupil data from Unity; however, for some reason it only works in conjunction with the old Pupil Capture software (1.0). When using Pupil Capture 3.51 it keeps giving me the error message "notification without timestamp will not be saved". Does anyone have any idea how I can fix this?

user-e3f20f 05 September, 2022, 06:22:06

Hi 🙂 I am not sure if the two things are related. Could you explain what your code does in more detail?

user-e91538 05 September, 2022, 10:05:17

We have a loose wire (the red one) on the eye-tracking camera of our Pupil Core. How do we go about getting it fixed?

user-e3f20f 05 September, 2022, 10:05:41

Hi! Please contact info@pupil-labs.com in this regard

user-90ba8c 05 September, 2022, 16:53:47

Can anyone tell me what is meant by Outlier Threshold in the validation section of pupil player and why its default is 5 degrees?

user-e91538 07 September, 2022, 12:23:01

Hi 👋 For post-hoc calibration, Pupil Player removes samples with an angular error greater than the outlier threshold from its calculation. The default is 5.0 degrees, but this can be adjusted. If too many samples are disregarded, this could indicate a problem with the camera setup. You can check out this explainer video for more information: https://www.youtube.com/watch?v=_Jnxi1OMMTc&feature=emb_rel_end

user-80123a 06 September, 2022, 13:40:56

Hi team, I'm working with Pupil Core. I tried a few lines of code to extract the position of the eyes and the gaze direction. I received a 3D position (x, y, z). What I couldn't find is the position of the origin. Can anyone help me with that? Thanks in advance 🙂

user-e3f20f 06 September, 2022, 13:49:03

Hi! What file are you using as input?

user-80123a 06 September, 2022, 13:52:45

First I used Pupil Player, and then I used a script with "subscriber.subscribe('gaze.')".

user-e3f20f 06 September, 2022, 13:57:00

Are you just playing around to see what works or do you have a specific analysis (realtime vs post-hoc) in mind?

In any case, the origin of gaze coordinate system is the scene camera origin

user-e3f20f 06 September, 2022, 13:55:58

Just to clarify: with Pupil Capture you make recordings, which you can then export with Pupil Player. If you subscribe, you connect to Capture's realtime data stream. This might not be suitable for your analysis

user-80123a 06 September, 2022, 14:56:22

Thank you. Why might subscribe not be suitable for the analysis? I have to integrate the code into Unity. My goal is to project an image, then capture the gaze of the subject perceiving the image.

user-e3f20f 06 September, 2022, 14:57:24

Do you need to know where the subject is looking at in real time? Or is it sufficient to know where they were looking at after the recording?

user-80123a 06 September, 2022, 15:01:21

It is preferable to know where the subject is looking in real-time. I have to compare the position of the image and the direction of the gaze.

user-e3f20f 06 September, 2022, 15:02:25

In this case, subscribing to gaze is very suitable.

user-80123a 06 September, 2022, 15:08:50

Ok thank you. I will use subscribe then

user-e91538 06 September, 2022, 15:10:32

Hi, my question has probably been answered somewhere, but I can't find it. Does Pupil Labs use a mix of bright- and dark-pupil methods?

user-e3f20f 06 September, 2022, 15:11:54

No, Pupil Core only looks for the dark pupil. Reflections are not used.

user-e91538 06 September, 2022, 15:12:43

Thank you. Could you please let me know where I can read more in depth about the methods Pupil Labs has used?

user-e3f20f 06 September, 2022, 15:14:10

https://arxiv.org/abs/1405.0006

user-e91538 06 September, 2022, 15:16:14

Thank you

user-26b243 06 September, 2022, 17:20:15

Thank you for your response! I was also wondering: how much of the green dot with the yellow rim and the center red dot (which reflects where the user is looking) has to be outside of a defined surface for that data point to be classified as FALSE for gaze on surface?

user-e3f20f 06 September, 2022, 17:27:35

The circle center needs to be outside. Note that the displayed surface outline is just an approximation of the true outline (which lives in undistorted 3D space to compensate for lens distortion)

user-80123a 07 September, 2022, 12:04:24

Hi all, could anyone help me with launching the Python source code of Pupil - eye tracking platform? I followed the two instructions: 1) "> python -m pip install --upgrade pip wheel" 2) "> pip install -r requirements.txt". During the installation, I got one error with setuptools>61.0. And when I launched the application with "> python ./main", I got an error calling git. Thanks in advance

user-e3f20f 07 September, 2022, 12:07:28

Hi! Please note that you need to clone the source code. It is not sufficient to download the zipped archive.

Which operating system and Python version are you using?

user-80123a 07 September, 2022, 12:12:59

I'm using windows and python 3.6. Ok I will clone the source code then

user-80123a 07 September, 2022, 14:26:46

Hi, sorry to come back with the same problem with launching the Python source code of Pupil Core. I found the full description of the installation. But in the end, when I execute the command "> python main.py capture", I receive the error message "ValueError: Version Error"

user-80123a 07 September, 2022, 15:29:57

Problem solved 🙂

user-969151 08 September, 2022, 01:28:26

Hi, I am learning to use Pupil Core for the first time and have a question about the pupil detection and calibration. I followed the Pupil Core Quick Start guide but during the pupil detection the circles around my eyes are blue not green. Any suggestions on how to resolve this? Also, after calibration, I had a thin green rectangle with orange figures in each corner and a red circle in the middle. Refer to image below. Does anyone know what this is?

Chat image

user-4c21e5 08 September, 2022, 09:58:38

Hi @user-969151 👋. No worries about the blue circle (we updated the colour scheme in our software). Check out the legend in the pye3d detector settings by clicking on the 3d icon in one of the eye windows. For a description of what the green and orange visualisations represent, open the 'Calibration' menu by clicking on the little target icon on the right of the Capture window. The red circle is your gaze point 🙂

user-2ab654 08 September, 2022, 10:20:51

Hi ✌️ I am trying to calculate head position angles via the Head Pose Tracker plugin while looking at a 2D screen. I put 3 AprilTags on my screen, but the calculation of the 3D model uses just one of them. I think this is because the other two tags are not visible all the time, but I don't see why this would be a problem. Is there a way to manually select the markers, or is this no problem as I am just looking at a 2D window anyway? Thanks in advance 😃

Chat image Chat image

user-4c21e5 08 September, 2022, 13:30:42

Hi @user-2ab654! It looks like your markers are being detected, as shown by the green semi-opaque overlay. What you'll need to do is build up the 3D model by recording the markers from more angles and perspectives. Do that enough and the markers will appear red. Note that you can then use that model for other recordings (as long as the same markers are present of course). Relevant docs for reference: https://docs.pupil-labs.com/core/software/pupil-player/#head-pose-tracking

user-80123a 08 September, 2022, 13:51:22

Hi all, I want to add the Pupil Core application to an existing Unity application. So I added a button in the Unity application to launch Pupil Core (DID IT). Now I want Pupil Core to run in the background with no graphical interface. In other words, is there a clean way to remove the UI from Pupil Core? The goal of putting it in the background is to allow me to collect data. Thanks in advance 🙂

user-e3f20f 08 September, 2022, 13:52:39

There is a --hide-ui flag that you can pass when you call the exe. This will create a hidden window.
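For example (the executable name and path depend on your installation; this is an illustrative invocation only):

```shell
# Start Pupil Capture with its windows hidden (path/version are install-specific).
pupil_capture.exe --hide-ui
```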

user-80123a 08 September, 2022, 13:57:20

Thanks, it's working 🙂

user-9fa187 09 September, 2022, 07:35:36

Hi, how are you? I came to Germany from Mongolia and wrote an e-mail to get acquainted with your company and products, but I can't get in touch. What should I do?

wrp 09 September, 2022, 07:39:42

Hi 👋 - I apologize that you haven't received a response yet. Please send us another email to info@pupil-labs.com and reference your discord user name. We are also happy to continue the discussion here if relevant to the channel 😸

user-80123a 09 September, 2022, 11:20:07

Hi all, I encountered difficulties running Pupil Core from the source code. 1) When I export the data, I do not find any data in gaze_positions, but the rest of the data is OK (pupil_positions, world_timestamps, ...). 2) I do not receive the stream data from subscribe('gaze'). These two problems do not exist when I run the app from the installed program. Would anyone have any idea? Thank you in advance 🙂

user-4c21e5 09 September, 2022, 12:02:13

Hi @user-80123a! The behaviour you describe is usually indicative of no calibration. Did you calibrate the eye tracker when running from source?

user-80123a 09 September, 2022, 12:04:08

I did not use calibration for the two scenarios, I was thinking of using calibration when the code works. I will try with calibration then

user-80123a 09 September, 2022, 12:06:32

I forgot, I cannot use calibration because the Eye Tracking I use right now does not have a frontal camera

user-80123a 09 September, 2022, 12:09:04

But I will use a new eye tracker soon, and I hope my code will be ready when the new eye tracker arrives

user-4c21e5 09 September, 2022, 12:11:06

To get gaze data, it is necessary to calibrate. Check out the docs for future reference: https://docs.pupil-labs.com/core/#_4-calibration Indeed, a scene camera is also necessary 🙂

user-80123a 09 September, 2022, 12:16:08

But I receive some gaze data when I use the installed one. I know it might not be the right data, but it works. Ok then, I will wait for the new eye tracker. Thank you. Another last question, is it necessary to calibrate every time the software is used? Or just the first time?

user-4c21e5 09 September, 2022, 12:28:35

The installed software will use the last-run calibration. This won't be accurate in all likelihood, unless the headset went back onto the same wearer and into exactly the same position (highly unlikely). The calibration is wearer-specific. If the headset is put onto a new wearer, or removed and re-worn, we recommend re-calibrating. Read more about calibration best-practices here: https://docs.pupil-labs.com/core/best-practices/#avoiding-slippage

user-80123a 09 September, 2022, 12:41:44

Ok, understood, thank you

user-ff97c9 12 September, 2022, 08:32:12

Hello, best productive days to everybody 🙂

I'm going through the documentation for a recent query in our lab regarding the fixation export file, fixation.csv.

Does the frame index of the video begin with 0 or 1; in that, if the first fixation is, say, at the 4th frame, is it then the 5th or the 4th frame of the video?

We're developing a frame-time-critical approach in our present work; a single frame is very important for us as a result. Hence the basic question, but it is not written explicitly in the documentation.

user-e3f20f 12 September, 2022, 08:49:47

May I suggest an alternative approach that may resolve the variance you experienced?

Instead of comparing frames, use the fixation start timestamp. Fixations are based on gaze data, which has a higher temporal resolution than the scene camera. Assuming an event at scene video frame index J and time T, do not calculate the number of frames until the next fixation (fixation->start_frame_index - J) but the duration in seconds until this fixation (fixation->start_timestamp - T)
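Sketched in Python, with hypothetical values (real per-frame timestamps come from world_timestamps.csv, and fixation start timestamps from the fixations export):

```python
def seconds_until_fixation(world_timestamps, event_frame_index, fixation_start_timestamp):
    """Time in seconds from an event, identified by its scene-video frame
    index, to a fixation start. Timestamp-based, so it does not assume a
    constant frame rate."""
    event_time = world_timestamps[event_frame_index]  # T in the example above
    return fixation_start_timestamp - event_time

# Hypothetical per-frame timestamps (roughly, but not exactly, 33 ms apart):
world_timestamps = [100.000, 100.033, 100.067, 100.099]
delta = seconds_until_fixation(world_timestamps, 0, 100.20)  # ~0.2 s
```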

user-e3f20f 12 September, 2022, 08:39:55

It starts at zero 🙂 You can verify it in Pupil Player. Player displays the current world frame index in the timeline, next to the relative world video time, as well as information about the current fixation (incl. the fixation id) in the fixation detector menu.

user-ff97c9 12 September, 2022, 08:44:46

Great, thank you very much for your swift support as always 🙂

The variance in frame beginning and end points in the fixations output file has somehow become inconsistent in our current data set of about 40 participants. I'll most likely follow up soon with more detailed questions, which I hope will benefit everybody.

Best wishes.

user-e3f20f 12 September, 2022, 08:45:25

What do you refer to by "frame beginning"?

user-ff97c9 12 September, 2022, 08:54:59

In the fixations.csv file there is a column named "start_frame_index", which refers to the first frame of the video export at which the indexed fixation begins. At least so we assumed.

user-e3f20f 12 September, 2022, 08:56:00

But to be exact: It is the index of the frame that is closest in time to the start_timestamp

user-e3f20f 12 September, 2022, 08:55:20

That is correct

user-ff97c9 12 September, 2022, 08:59:07

From what I can see the frame durations in the video are not constant, or in other words not all frames are 33.33ms periods. So calculating the frame we should target based on the duration hasn't worked for us so far. This is why we're using the frame indexes.

Unfortunately what we do right now is dependent on the world camera video, so we are constrained by the temporal resolution the video is recorded at.

user-e3f20f 12 September, 2022, 09:09:49

"From what I can see the frame durations in the video are not constant, or in other words not all frames are 33.33ms periods" Correct. See the world_timestamps.csv file exported by the World Video Exporter plugin. It contains the timestamps for each frame. This is where you can look up the variable T from the example above.

"Unfortunately what we do right now is dependent on the world camera video, so we are constrained by the temporal resolution the video is recorded at" Yes, I assume to annotate events like in the example above. I understand that subtracting timestamps did not work well for you, given the 33.33 ms/frame assumption. But given accurate frame timestamps, comparing timestamps should be superior to comparing frame indices, e.g.

J = 15
T = 100.00

start_frame_index = 15
start_timestamp = 100.20

# start_frame_index - J = 0
# start_timestamp - T = 0.20

Your statistic should gain in precision by using timestamps.

user-d72f39 12 September, 2022, 10:54:45

Hello

user-d72f39 12 September, 2022, 10:55:27

Can you please provide me with a step by step instructions of how to export pupil size data from my recording in an excel format?

user-e91538 12 September, 2022, 13:00:07

Hi! To export pupil size data: load your recording in Pupil Player, make sure you have the Raw Data Exporter plugin enabled in the settings menu, and click on the Export button (down arrow) on the far left-hand side of the Player window. This will export .csv files you can open in Excel. Full breakdown of the pupil_positions.csv export here: https://docs.pupil-labs.com/core/software/pupil-player/#pupil-positions-csv

user-d72f39 12 September, 2022, 10:56:01

I used the core device to do the recording

user-219de4 12 September, 2022, 20:50:34

Hello! Where can I find the audio setting in Pupil Capture? I rendered the videos, and they have no audio input or audio.timestamp files.

user-e3f20f 13 September, 2022, 06:47:20

Hi, we removed audio-recording capabilities from Capture ~3-4 years ago. The feature did not work stably and reliably enough. I would recommend recording the audio with an external audio recording app.

user-219de4 13 September, 2022, 14:12:06

Thanks for your answer. Do you have any advice for further synch between audio and video files if they exported from different sources?

user-e3f20f 13 September, 2022, 14:14:29

What do you need the audio for?

user-219de4 13 September, 2022, 14:25:44

Our study aims to observe children’s gaze in response to a partner’s verbal labels during interaction, so we need the full transcript to further identify the key timings (they are not fixed as in other experimental paradigms). Do you know any audio program that can receive external triggers?

user-e3f20f 13 September, 2022, 14:33:27

I think the easiest option (if your experimental design allows it) would be to record a clearly visible and audible "clap", e.g. with something like a clapperboard. Then it is just a matter of finding the clap time within the audio recording and subtracting it from the transcript timestamps. This shifts the relative time starting point from the file beginning to the clap event.

Next, find the frame index during which the clap happens and extract its absolute timestamp. Add that absolute time to the relative transcript time, shifting the transcript time to pupil recording time.
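The two shifts amount to one line of arithmetic; variable names here are illustrative:

```python
def transcript_to_recording_time(transcript_ts, clap_transcript_ts, clap_recording_ts):
    """Map a transcript timestamp onto the pupil recording clock.

    transcript_ts:       time of an event within the audio file (seconds)
    clap_transcript_ts:  time of the clap within the audio file
    clap_recording_ts:   absolute pupil timestamp of the video frame with the clap
    """
    return (transcript_ts - clap_transcript_ts) + clap_recording_ts

# A label spoken 10 s after the clap, with the clap at pupil time 1000.0:
t = transcript_to_recording_time(12.5, 2.5, 1000.0)  # 1010.0
```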

user-219de4 13 September, 2022, 14:36:10

Will try it. Thanks for your advice!

user-b9005d 13 September, 2022, 17:42:55

In your pupil positions export, are your theta and phi outputs using the physics or mathematics convention? We were hoping to use your sphere radius output in combination with these two to generate a calculation for degrees of eye movements

user-e3f20f 13 September, 2022, 18:31:39

https://docs.pupil-labs.com/core/terminology/#coordinate-system see the eye model section

user-e3f20f 13 September, 2022, 18:30:48

The sphere radius is a constant value. phi and theta do not follow either convention exactly; they are rotated by a specific amount. Let me look up the link

user-80123a 15 September, 2022, 08:25:01

Hi all, is there a metric to identify if the calibration went well or not? Similar to the confidence parameter metric in the csv data, after exporting with pupil player, but can be used before starting to record. Thanks in advance.

user-e3f20f 15 September, 2022, 08:26:08

Hi! The Accuracy Visualizer calculates the accuracy in angular error. That should be suitable.

user-80123a 15 September, 2022, 08:27:32

Thanks, where can I find the angular error?

user-e3f20f 15 September, 2022, 08:28:11

It is displayed as a log message as well as in the accuracy visualizer menu

user-80123a 15 September, 2022, 09:01:40

Hi again, another question: can I subscribe to a topic other than "gaze"? For example, if I want different information like pupil coordinates or angular error precision. Is there a list of topics to subscribe to? Thanks in advance

user-e3f20f 15 September, 2022, 09:06:15

Yes, subscribing to other topics is possible. If you subscribe to the empty string, you will receive everything. Use it to find the data that you are interested in and then use a more specific subscription string later
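A sketch of that discovery step, assuming Pupil Capture runs locally with Pupil Remote on its default port 50020; the function names are illustrative:

```python
import zmq

def unique_topics(messages):
    """Deduplicate the topic frame (first multipart element) of raw messages."""
    return sorted({parts[0].decode() for parts in messages})

def discover_topics(ip="127.0.0.1", remote_port=50020, n_messages=200):
    ctx = zmq.Context.instance()
    remote = ctx.socket(zmq.REQ)
    remote.connect(f"tcp://{ip}:{remote_port}")
    remote.send_string("SUB_PORT")
    sub_port = remote.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{ip}:{sub_port}")
    sub.subscribe("")  # empty string = receive every topic

    messages = [sub.recv_multipart() for _ in range(n_messages)]
    return unique_topics(messages)

# print(discover_topics())  # with Capture running; then narrow the subscription
```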

user-80123a 15 September, 2022, 09:08:04

Ok, interesting, thanks

user-6971cf 15 September, 2022, 13:57:07

Sorry, I am new to this project. Does this software require an infrared camera, or would a regular desktop camera be sufficient to use the eye tracking technology?

user-e3f20f 15 September, 2022, 13:58:56

Hi! Welcome to the community. Yes, an IR camera is required, and it must be head-mounted. A camera on the display/table in front of the subject (remote eye tracking) will not work with our software.

user-6971cf 15 September, 2022, 13:57:18

also hi everyone!

user-6971cf 15 September, 2022, 14:14:29

Ok thank you!

user-e91538 15 September, 2022, 14:57:26

Hello and congratulations on the very nice work! I am interested in the Pupil Core system. I was wondering if it can be used with an ARM-based/embedded system (e.g. a ZedBoard) instead of a normal desktop/laptop?

user-e3f20f 15 September, 2022, 14:58:37

Hi, the pre-built software is only compiled for x86_64. You would need to install the dependencies and run from source. Note that the board is likely not powerful enough to run the pupil detection in real time

user-e91538 15 September, 2022, 15:00:57

I see thanks for the answer - is all the source code available? Speaking about power, does the system require a GPU?

user-e3f20f 15 September, 2022, 15:03:07

https://github.com/pupil-labs/pupil No GPU is needed for the calculations. I am referring to CPU power in this case.

Note that the software needs to be able to open a window. So running it on an embedded system without a screen might not work.

Raspberry Pi users have used this tool https://github.com/Lifestohack/pupil-video-backend/ in the past to stream the video from the RPi to a laptop running the software

user-e91538 15 September, 2022, 15:06:31

ok thanks a lot for the answers and the links I'll check them to get a better idea - it's a very interesting system!

user-e3f20f 15 September, 2022, 15:07:13

Would you mind sharing a bit more about what you are trying to accomplish / your use case?

user-e91538 15 September, 2022, 15:11:02

Broadly speaking, I would like to understand how easy would it be to use it in an embedded/ARM based-system with a camera and a projecting screen

user-e3f20f 15 September, 2022, 15:12:48

What type of camera are we talking about here? Something like a webcam?

user-e91538 15 September, 2022, 15:14:13

more like a wearable device - I am afraid I cannot go into more details for the moment

user-e3f20f 15 September, 2022, 15:15:12

ok, just wanted to make sure that you were not attempting to perform remote eye tracking, which is not possible with our software. Good luck with your project!

user-b10192 16 September, 2022, 01:57:44

Hi there, from Pupil Service I subscribe to the frame topic and am trying to get the image so I can save it myself:

import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
ip = 'localhost'
port2 = 50020
pupil_remote.connect(f'tcp://{ip}:{port2}')

# Request 'SUB_PORT' for reading data
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

# Assumes sub_port to be set to the current subscription port
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f'tcp://{ip}:{sub_port}')
subscriber.subscribe('frame')

while True:
    array = subscriber.recv_multipart()
    item = msgpack.loads(array[2])
    print(item)
    break

I get an array which contains these 3 elements:

[b'frame.eye.0', [email removed] b'\xff\xd8\xff\xc0\x00\x11\x.........................']

How can I translate this into an image or numpy array that I can save as an mp4 video? I receive this error when I try to load the 3rd element with msgpack:

item = msgpack.loads(array[2])
  File "msgpack_unpacker.pyx", line 202, in msgpack._cmsgpack.unpackb
msgpack.exceptions.ExtraData: unpack(b) received extra data.

user-e3f20f 16 September, 2022, 07:00:08

Hi, check out this example https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py
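For reference, the ExtraData error above comes from calling msgpack.loads on the wrong element: in a frame message, the second element is msgpack-encoded metadata, while the third is the raw image payload, which is not msgpack at all. A sketch of the split, modeled loosely on the linked helper (the function name is illustrative):

```python
import msgpack

def parse_frame_message(parts):
    """Split one 3-part frame message from the Network API.

    parts[0]: topic (e.g. b"frame.eye.0")
    parts[1]: msgpack-encoded metadata (format, width, height, timestamp, ...)
    parts[2]: the raw image payload itself; NOT msgpack, which is why
              msgpack.loads(parts[2]) raises ExtraData.
    """
    topic = parts[0].decode()
    meta = msgpack.unpackb(parts[1])
    raw = parts[2]
    return topic, meta, raw

# With the default "jpeg" frame format, `raw` holds JPEG bytes (note the
# b"\xff\xd8" magic in the message above). They can be written straight to a
# .jpg file, or decoded to a numpy array (e.g. with cv2.imdecode) for
# further processing or video writing.
```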

user-b10192 16 September, 2022, 16:17:46

Thanks.

user-80123a 16 September, 2022, 09:31:56

Hi all, is it possible to use the Network API from a different programming language? In C#, for example.

user-e3f20f 16 September, 2022, 09:32:57

Yes, that is possible. https://github.com/pupil-labs/hmd-eyes/ has a C# implementation of the API

user-80123a 16 September, 2022, 09:35:04

Thank you for the link 🙂

user-b10192 16 September, 2022, 17:36:42

Hi all, thanks to @user-e3f20f I can now get images from the Network API. But I have another problem: I can no longer get gaze information. I was previously receiving gaze data from the network by using subscriber.subscribe('gaze'). Now I am only receiving the following topics: pupil.0.2d, pupil.0.3d, frame.world, frame.eye.0. How can I get the gaze topic back? Do I need to change anything in Pupil Capture?

user-e3f20f 16 September, 2022, 17:43:29

Hi, you need to calibrate to receive gaze data.

user-b10192 16 September, 2022, 17:44:50

Thanks

user-219de4 16 September, 2022, 23:31:19

Hello, what kind of material do you use to print the camera extender?

user-53a74a 19 September, 2022, 15:21:55

Hi Pupil Labs community! I'm thinking to use a chin rest along with Pupil Core. Are there any commercially available chin rests that you folks recommend?

user-e91538 19 September, 2022, 15:34:12

Hi, I’m getting a lot of messages like this from Pupil Capture:

2022-09-19 16:47:33,981 - eye0 - [DEBUG] video_capture.uvc_backend: Received non-monotonic timestamps from UVC! Dropping frame. Last: 19869.178274, current: 19865.838207

How can this be and is there anything that can be done about it? It seems to disrupt pupil recognition quite severely. (I’m running from source on Mac OS Monterey in native (arm64) mode.)

user-e3f20f 19 September, 2022, 16:25:49

Hey, this usually happens after a reconnect. You might have a loose connection

user-e91538 19 September, 2022, 16:42:00

Thanks for your quick reply. But there is no external connection, everything is running locally on the same machine. How could a local connection become loose?

user-e3f20f 19 September, 2022, 16:51:03

I am talking about the physical connection of the cameras. Sometimes the connectors become loose. I recommend checking the headset

user-e91538 19 September, 2022, 17:00:01

Ah, I see. Thanks, I'll check.

user-219de4 19 September, 2022, 18:02:13

Hello! I am following up on my questions regarding the headset extender. I found the geometry file, but I am not sure of the appropriate material for printing. PLA, ABS, nylon? Do you have any advice?

user-e91538 19 September, 2022, 18:06:30

I recently printed it in PLA. Works nicely!

user-219de4 19 September, 2022, 19:52:21

great to know! thanks for sharing! 😀

user-632640 20 September, 2022, 22:41:33

Hi! Wondering what type of machine/laptop everyone is using for the Pupil Core software, and what works best to avoid the overloading/crashing I’ve been noticing on my Mac laptop.

user-e3f20f 21 September, 2022, 09:04:51

The M1 macs work best in terms of performance. There is a known issue on Unix systems that may cause a crash if the hardware disconnects unexpectedly. If you are encountering the crash often, you might have a loose connection in your hardware.

I am working on debugging the mentioned software issue as we speak.

user-2ab654 21 September, 2022, 08:57:32

Hi guys ✌️
I have recordings that are about 25 minutes long, and I am trying to do marker detection to extract the head pose with the plugin. The problem is that Pupil Player keeps crashing. Does anyone know if there is a way to resolve it? The problem shouldn't be CPU, GPU, or RAM, as it uses only 5-10% of these. Have a great day!

user-e3f20f 21 September, 2022, 09:05:56

Hey! Could you share the player.log file after having attempted the marker detection and reproducing the crash? You can find it in the pupil_player_settings folder.

user-2ab654 21 September, 2022, 09:40:20

This is a copy of the player.log

player_copy.txt

user-632640 21 September, 2022, 09:12:03

Is there a Windows/Linux laptop you might recommend as well? My MacBook Pro is crashing with it, and I’ve checked the hardware multiple times; the logs don’t have much of any information regarding what happened. It is a 2018 MacBook Pro with Monterey, though, so could that be an issue?

user-e3f20f 21 September, 2022, 09:16:50

"the logs don’t have much of any information regarding what happened" This speaks for the issue that I was referring to. It is a C-level crash, i.e. one that cannot be caught at the Python level, making it a very difficult issue to handle. "I’ve checked the hardware multiple times" Such loose connections are not always visible to the naked eye and might only trigger with very specific movements.

I can't recommend any specific Ubuntu/Windows device. But if you are determined to switch, I recommend using Windows, as the bug does not seem to happen on that OS. The most important feature to look for in the new hardware is CPU speed.

user-2ab654 21 September, 2022, 09:43:09

the pupil player immediately gives no response

user-e3f20f 21 September, 2022, 09:47:15

Thank you for clarifying this. Could you share the recording with data@pupil-labs.com so that we can try to reproduce the issue?

user-2ab654 21 September, 2022, 09:59:32

I can, but it happens with every recording after a little time. The marker detection runs for about 3-4 minutes and then the Player gives no response

user-e3f20f 21 September, 2022, 10:00:52

Any recording causing the issue would help 🙂

user-e91538 21 September, 2022, 13:04:26

Hello, a wire to one of our pupil cams broke, and we would like to fix it. I figured out that I have to get it crimped. Do you maybe have the part numbers of the connector parts, or do you even sell replacements? Thank you.

Chat image

user-e3f20f 21 September, 2022, 13:05:14

Please contact info@pupil-labs.com in this regard

user-e91538 21 September, 2022, 13:57:53

thanks

user-193e84 21 September, 2022, 18:31:58

Copying this to the correct thread: https://discord.com/channels/285728493612957698/633564003846717444/1022212302986031218

user-e3f20f 22 September, 2022, 06:08:18

Hi, this sounds like a hardware issue with the headset. Can you make sure the small connector at the right eye camera is correctly plugged in?

user-ac1446 22 September, 2022, 15:31:01

Is the stated discrepancy in accuracy/precision between VR/AR and Core true in real-world scenarios? If so, is there a technical reason for it?

user-e91538 24 September, 2022, 15:40:35

I noticed that the pupil detection parameters in the eye windows (such as "Pupil intensity range") are not persistent across sessions, unlike the plugin parameters for the video source, for example. Is there a way to change that?

user-e3f20f 26 September, 2022, 11:14:29

I am able to reproduce the issue. Let me check why that is.

user-80123a 26 September, 2022, 08:26:39

Hello everyone, when I use Pupil Player, I can see a video of my gaze on the screen (in 2D). But when I export and open the gaze_positions.csv file, I see a 3D gaze position (X, Y, Z). My question is: is the gaze position shown in the Pupil Player video simply the (Z and X) or (Z and Y) coordinates of the gaze position? Or is there a geometric transformation to translate the 3D coordinates into 2D coordinates? My goal is only to get the 2D coordinates of the gaze position. Thanks in advance πŸ™‚

user-e3f20f 26 September, 2022, 08:43:13

The norm_pos_* fields are what you are looking for

user-e3f20f 26 September, 2022, 08:39:52

There is a geometric transformation involved that corrects for the lens distortion https://docs.pupil-labs.com/core/terminology/#coordinate-system

user-80123a 26 September, 2022, 08:45:41

Thanks, I saw the norm_pos_* fields, but can I have for example norm_pos_0 and norm_pos_1 for the two different eyes?

user-e3f20f 26 September, 2022, 08:47:01

For that, you need to use a dual-monocular gazer https://gist.github.com/papr/5e1f0fc9ef464691588b3f3e0e95f350 The default gazer estimates binocular gaze whenever possible.
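For orientation, the per-eye split can also be recovered from the export's base_data field, which lists the contributing pupil datums as space-separated "<timestamp>-<eye_id>" pairs (0 = right eye, 1 = left eye). A sketch under that assumption, with an inline sample mimicking gaze_positions.csv:

```python
import csv
import io

# Sketch: split exported gaze rows by contributing eye. Column names follow
# gaze_positions.csv; base_data is assumed to hold "<timestamp>-<eye_id>" pairs.
SAMPLE = """gaze_timestamp,confidence,norm_pos_x,norm_pos_y,base_data
100.01,0.98,0.51,0.49,100.01-0
100.02,0.95,0.48,0.52,100.02-1
100.03,0.97,0.50,0.50,100.03-0 100.03-1
"""

def eyes_of(row):
    """Eye ids that produced this gaze datum."""
    return {int(part.rsplit("-", 1)[1]) for part in row["base_data"].split()}

def split_by_eye(csv_text):
    per_eye = {0: [], 1: []}
    for row in csv.DictReader(io.StringIO(csv_text)):
        pos = (float(row["norm_pos_x"]), float(row["norm_pos_y"]))
        for eye in eyes_of(row):
            per_eye[eye].append(pos)
    return per_eye

per_eye = split_by_eye(SAMPLE)
print(len(per_eye[0]), len(per_eye[1]))  # binocular rows count toward both eyes
```

With the dual-monocular gazer from the gist, each row's base_data should contain a single pupil datum, so every gaze sample maps to exactly one eye.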

user-e3f20f 26 September, 2022, 08:47:42

Out of curiosity, may I ask about what your use case is?

user-80123a 26 September, 2022, 08:50:23

I want to display on the same screen both the position of the gaze and the position of an object that I ask the subject to look at.

user-e3f20f 26 September, 2022, 08:51:43

Are you aware that you will need to transform the gaze position from scene camera to screen coordinates? And that you can use the built-in surface tracking feature for that?
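Once the Surface Tracker has mapped gaze into surface-normalized coordinates, the remaining step is a small conversion. A sketch, assuming the surface was defined to coincide with the screen: surface coordinates have their origin at the bottom left, while screen pixels conventionally start at the top left.

```python
def surface_to_screen(x_norm, y_norm, width_px, height_px):
    # Flip the y axis: surface origin is bottom-left, screen origin is top-left.
    return x_norm * width_px, (1.0 - y_norm) * height_px

print(surface_to_screen(0.5, 0.5, 1920, 1080))  # screen center
print(surface_to_screen(0.0, 1.0, 1920, 1080))  # top-left corner
```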

user-80123a 26 September, 2022, 08:56:58

I will use a video projector and project some objects with it, then capture the gaze. I am aware that I need a scene camera. I have one.

user-e3f20f 26 September, 2022, 08:52:26

Also, do I understand it correctly that the goal is to do that in realtime?

user-80123a 26 September, 2022, 08:52:49

Not necessarily in realtime

user-e3f20f 26 September, 2022, 08:54:21

Ah, then I misunderstood. The visualization is for analysis only then? Not for an interaction with the subject?

user-80123a 26 September, 2022, 08:58:04

The visualisation is for analysis after the experiment

user-e91538 26 September, 2022, 11:26:51

Hello, new user here. I wanted to know if there is an existing script to stream live data into MATLAB. Thanks

user-e3f20f 26 September, 2022, 11:27:29

Hi, welcome! Check out https://github.com/pupil-labs/pupil-helpers/tree/master/matlab
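For reference, the Network API's subscription filters (in MATLAB just as in Python) are plain byte-prefix matches on the message topic, which is why subscribing to surfaces.<name> delivers the whole surface datum (including fixations_on_surfaces) rather than individual fields. A minimal illustration of the matching rule:

```python
# Hypothetical helper mirroring ZMQ's SUB-side filtering: a message is
# delivered iff its topic frame starts with a subscribed byte string.
def matches(subscription: bytes, topic: bytes) -> bool:
    return topic.startswith(subscription)

print(matches(b"surfaces.", b"surfaces.a"))   # all surfaces -> True
print(matches(b"surfaces.a", b"surfaces.a"))  # one surface  -> True
# Fixations live inside the surface datum, not under their own topic:
print(matches(b"surfaces.a.fixations_on_surfaces", b"surfaces.a"))  # False
```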

user-e3f20f 26 September, 2022, 11:47:35

Hi, good catch! The issue will be fixed in the next Pupil Core release. Meanwhile, you can use this plugin to fix the issue https://gist.github.com/papr/9b3ba71b227bc6f092088077b334cf9c

user-e91538 26 September, 2022, 12:00:54

Splendid! Thank you very much for the quick resolution of the issue.

user-e91538 26 September, 2022, 12:30:07

I checked the headset, there are no visible defects. Also, there are never any disconnect or reconnect messages. The dropping of frames starts out of the blue for any or all of the three cameras.

user-e3f20f 26 September, 2022, 12:32:05

When it happens, does it happen repeatedly? If so, can you estimate the frequency?

user-e91538 26 September, 2022, 12:39:26

It happens irregularly, but on average every 10 minutes. Sometimes it recovers after a while, but often I have to restart.

user-e3f20f 26 September, 2022, 12:42:42

One dropped frame every 10 minutes, do I understand this correctly?

user-e91538 26 September, 2022, 12:49:40

No, many dropped frames. And when the dropping of frames stops, the cameras are (and stay) out of sync.

user-e3f20f 26 September, 2022, 12:51:00

Out of sync, as in: if you blink with both eyes at the same time, the blink appears in one eye window first and then in the other?

user-e91538 26 September, 2022, 13:13:42

That I couldn’t say (hard to see what’s happening during a blink...). But I keep getting "Resetting history" messages from the blink plugin, and the red circle in the world window splits into two independent ones, presumably one for each eye.

user-e3f20f 26 September, 2022, 13:20:48

Just realised that I missed an important detail in your initial message. The timestamp would be zero on reconnect. Yours is just inconsistent. I am fairly sure that there is something going wrong with the handling of the camera's hardware timestamps. But I can't tell what it is right now. This will require more time to investigate.

As a workaround, you can switch to software timestamps. These are a bit less precise but more consistent over time.

diff --git a/pupil_src/shared_modules/video_capture/uvc_backend.py b/pupil_src/shared_modules/video_capture/uvc_backend.py
index f376eadf5..945b168cf 100644
--- a/pupil_src/shared_modules/video_capture/uvc_backend.py
+++ b/pupil_src/shared_modules/video_capture/uvc_backend.py
@@ -272,7 +272,7 @@ class UVC_Source(Base_Source):
     def configure_capture(self, frame_size, frame_rate, uvc_controls):
         # Set camera defaults. Override with previous settings afterwards
         if "Pupil Cam" in self.uvc_capture.name:
-            if platform.system() == "Windows":
+            if platform.system() in ("Windows", "Darwin"):
                 # NOTE: Hardware timestamps seem to be broken on windows. Needs further
                 # investigation! Disabling for now.
                 # TODO: Find accurate offsets for different resolutions!
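To quantify how severe the frame dropping is, one can scan the recorded timestamps for gaps larger than the expected frame period. A small sketch, assuming timestamps in seconds (e.g. from an exported world_timestamps.csv):

```python
# Flag likely dropped-frame spans: any gap between consecutive timestamps
# larger than `tolerance` frame periods is reported with an estimate of
# how many frames were lost.
def dropped_frame_spans(timestamps, fps, tolerance=1.5):
    period = 1.0 / fps
    gaps = []
    for a, b in zip(timestamps, timestamps[1:]):
        if b - a > tolerance * period:
            gaps.append((a, b, round((b - a) / period) - 1))  # approx. frames lost
    return gaps

ts = [0.000, 0.033, 0.066, 0.200, 0.233]  # ~30 fps with one gap
print(dropped_frame_spans(ts, fps=30))
```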
user-e91538 27 September, 2022, 07:57:13

Brilliant! It works flawlessly now.

user-e91538 26 September, 2022, 13:23:58

Thanks a lot, I’ll give it a try!

user-e91538 26 September, 2022, 14:25:30

A first test indicates that the problem may be fixed completely by this workaround!

user-632640 26 September, 2022, 14:33:56

So it seems all new MacBook Pros have the M2 chip. My question is whether that may cause any trouble. The M1 is still in the MacBook Air, but that has an 8-core CPU and is comparable with an Intel i5 (maybe i7), which Pupil Labs lists as the minimum requirement. Does anyone have any experience with M2 chips? And if not, what concrete non-Apple laptops are people using without hiccups?

user-e3f20f 26 September, 2022, 14:35:07

Pupil Core performs much better on M1 than any Intel CPU I know πŸ™‚ M2 shouldn't be any issue either.

user-632640 26 September, 2022, 14:45:02

Would you say that the Mac book air with 8-core CPU is sufficient? Or should I opt for better CPU over the M1 chip?

user-e3f20f 26 September, 2022, 14:46:20

Depending on your recording lengths, you might want to invest into the higher-RAM option

user-e3f20f 26 September, 2022, 14:45:23

M1 is very much sufficient (I use it myself)

user-632640 26 September, 2022, 14:47:28

25+ minute recordings might be likely for the experiments we plan to do. Do you have a specific recommendation of a laptop for this?

user-e3f20f 26 September, 2022, 14:53:21

Usually, we recommend splitting your recordings into blocks of no longer than 20-30 minutes, each with its own calibration. There are two main reasons: First, headset slippage accumulates over time, reducing gaze estimation accuracy. Second, depending on your planned analysis, Pupil Player might need more RAM than your system has to offer. The longer the recordings, the more RAM your system should have.

Note that I cannot give any specific recommendations as to the amount of RAM or the system model. I can only speak from personal experience with the M1 MacBook Air, which works fine for my use cases (development of the Pupil Core software).

user-2798d6 27 September, 2022, 22:42:08

found device

user-e91538 28 September, 2022, 08:28:27

Hello! I am trying to tag different windows of a software application for a study. My test persons will sit between 80 and 100 cm away from the screen. Unfortunately, the tags (tag36h11) seem to need to be quite large (260 x 260 pixels) for the surface to be robustly detected. Are there ways to reduce the size of the tags and still get good tracking of the areas of interest?

user-e3f20f 28 September, 2022, 09:37:07

Note that the markers require sufficient white border to be recognized. Is that included in the 260x260 pixel area?

Also, would printing the markers and attaching them to the outside of the screen be an alternative for you?
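As a rough way to reason about marker size: what matters for detection is how many scene-camera pixels the marker spans, which shrinks with viewing distance. A back-of-envelope sketch (the resolution, field of view, and pixel threshold below are illustrative assumptions, not Pupil Labs specifications):

```python
import math

# Illustrative numbers only: a scene camera with 1280 px across a ~100 deg
# horizontal field of view, and a hypothetical rule of thumb that robust
# detection wants a marker spanning at least ~30 px in the image.
H_RES_PX = 1280
H_FOV_DEG = 100.0
MIN_MARKER_PX = 30

def marker_pixels(size_m, distance_m):
    """Approximate scene-camera pixels subtended by a marker of edge length
    size_m viewed head-on from distance_m."""
    ang_deg = math.degrees(2 * math.atan(size_m / (2 * distance_m)))
    return ang_deg * (H_RES_PX / H_FOV_DEG)

px = marker_pixels(0.05, 1.0)  # a 5 cm marker viewed from 1 m
print(round(px, 1), px >= MIN_MARKER_PX)
```

The same formula shows why moving the markers closer to the wearer (or printing them larger) helps: pixel span scales roughly inversely with distance.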

user-430fc1 28 September, 2022, 09:32:54

Hi, when I click freeze model in Pupil Capture the red ellipse suddenly changes shape, becomes more round, and appears not to fit the pupil as well as when the model was not frozen. Is this expected behavior?

user-80123a 29 September, 2022, 07:57:05

Hello everyone, is it possible to make Pupil Player work without drag and drop, e.g. via Python parameters? My goal is to develop an application that uses Pupil Core and Pupil Player. Inside the application, I can: (1) launch Pupil Core, (2) launch some experiments, (3) start recording (using the Network API), (4) stop recording (using the Network API),
(5) export the data. I am stuck at step (5), because I have to quit the application and launch Pupil Player to export the data.

user-e3f20f 29 September, 2022, 08:02:58

What kind of data are you looking to export? Are you performing any kind of post-processing? e.g. blinks or fixations?

user-80123a 29 September, 2022, 08:06:19

I work on comparing the position of the gaze on the surface with the position of an object on the surface. I use the Surface Tracker to find the coordinates of the gaze on the surface.

user-e3f20f 29 September, 2022, 08:06:46

And do you perform the surface tracking in realtime/Capture?

user-80123a 29 September, 2022, 08:07:49

Yes, I use the surface tracking in realtime

user-e3f20f 29 September, 2022, 08:08:19

Then this script will allow you to extract the surface-mapped gaze data directly from the recording without running Player https://gist.github.com/N-M-T/b7221ace2e7acf0c0c836773a3b4cf7c
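The essence of that extraction, once the surface datums are decoded (the linked gist handles reading surfaces.pldata with msgpack), is flattening the nested gaze_on_surfaces entries into flat rows. A sketch on synthetic datums, assuming the Surface Tracker's published field names (name, timestamp, gaze_on_surfaces with norm_pos / confidence / on_surf):

```python
# Flatten surface datums into csv-like rows, dropping low-confidence samples.
# In a real recording, `datums` would come from decoding surfaces.pldata.
def flatten_surface_gaze(datums, min_confidence=0.6):
    rows = []
    for datum in datums:
        for gaze in datum["gaze_on_surfaces"]:
            if gaze["confidence"] < min_confidence:
                continue
            x, y = gaze["norm_pos"]
            rows.append((datum["name"], datum["timestamp"], x, y, gaze["on_surf"]))
    return rows

datums = [{
    "name": "a",
    "timestamp": 100.0,
    "gaze_on_surfaces": [
        {"norm_pos": (0.4, 0.6), "confidence": 0.97, "on_surf": True},
        {"norm_pos": (1.3, -0.1), "confidence": 0.12, "on_surf": False},  # low conf
    ],
}]
rows = flatten_surface_gaze(datums)
print(rows)
```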

user-80123a 29 September, 2022, 14:35:27

Hello again, is it normal to have coordinate values outside the [0, 0] to [1, 1] range for x_norm and y_norm in the gaze_positions_on_surface_<surface_name>.csv file?

user-e3f20f 29 September, 2022, 14:36:25

Yes, those are samples where the gaze was outside of the surface. Note that it is recommended to remove low-confidence gaze values, as they are likely inaccurate
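In practice that means two filters on the exported rows: drop low-confidence samples and, if only on-surface gaze is wanted, keep coordinates inside the unit square. For example:

```python
def on_surface(x_norm, y_norm):
    """Surface-normalized gaze lies in [0, 1] x [0, 1] when it is on the surface."""
    return 0.0 <= x_norm <= 1.0 and 0.0 <= y_norm <= 1.0

# (x_norm, y_norm, confidence) samples; 0.6 is a commonly used confidence
# cutoff, adjust to your data quality.
samples = [(0.2, 0.8, 0.95), (1.4, 0.5, 0.10), (0.7, 0.3, 0.90)]
kept = [(x, y) for x, y, conf in samples if conf >= 0.6 and on_surface(x, y)]
print(kept)
```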

user-80123a 29 September, 2022, 14:41:05

You are right, after removing the low-confidence values, most values are now inside the [0, 0] to [1, 1] range. Thanks πŸ™‚

user-53a74a 29 September, 2022, 15:15:24

Hello! I'm using Pupil Core and I've been experiencing the pupil detector jumping across the eye camera's FOV and not staying on the user's pupil correctly. I tried changing the angle/position of the camera, the framerate, and the brightness of the room, and even rebooted the PC and turned off one of the right/left eye cameras, but none of it improved the situation. Is this a common issue with Pupil Core?

user-e3f20f 29 September, 2022, 15:17:00

Could you create a Pupil Capture recording of you rolling the eyes (capturing different eye positions) and share it with data@pupil-labs.com for concrete feedback?

user-53a74a 29 September, 2022, 15:17:57

Thx! Will do!

user-bbd687 29 September, 2022, 15:56:18

how to solve this problem? win10

Chat image

user-bbd687 29 September, 2022, 15:57:44

i have a usb cam

Chat image

user-bbd687 29 September, 2022, 15:58:04

@user-e3f20f

user-01c0ae 29 September, 2022, 19:20:09

Hello everyone, the Pupil Capture program does not display video from the glasses. Laptop model: MacBook Pro (13-inch, 2019, Four Thunderbolt 3 ports), 2.4 GHz Quad-Core Intel Core i5 processor. Glasses model: Pupil Core. Please tell me what the problem might be; thank you in advance

user-d407c1 30 September, 2022, 06:47:30

Hi @user-01c0ae ! Which version of MacOS are you using? If you are using Monterey (12) or above, please check out this note https://github.com/pupil-labs/pupil/issues/2240

user-62c153 29 September, 2022, 20:51:43

Hi, I'm working with an older Pupil Pro Binocular rev 037 and Pupil Capture v3.5.1 but am having some issues with the setup: I can't detect both pupil cameras simultaneously, and I get occasional program crashes when selecting camera sources as well as blue-screen crashes. Is there some older software more suited to this hardware that I can use on a Win10 machine?

user-62c153 03 October, 2022, 14:07:03

Hi Pupil Labs, Just following up on this request. Is there any support for older Pupil Pro Binocular Rev 037 models? Thanks.

user-62c153 29 September, 2022, 21:09:07

You could try this process:

Windows: videos do not appear in Pupil Capture. This could mean that drivers were not automatically installed when you ran Pupil Capture as administrator. You should first try to run Pupil Capture as administrator (right-click pupil_capture.exe > Run as administrator). If that does not work, follow the troubleshooting steps below:

1. In Device Manager (System > Device Manager), select View > Show Hidden Devices.
2. Expand the libUSBK Usb Devices, Cameras, and Imaging Devices categories.
3. For each Pupil Cam device (even hidden devices), click Uninstall, check the box agreeing to Delete the driver software for this device, and press OK.
4. Unplug the Pupil headset (if plugged in) and plug it back in.
5. Right-click pupil_capture.exe > Run as administrator. This should install the drivers automatically.

found here: https://docs.pupil-labs.com/core/software/pupil-capture/

I was having issues getting my cameras to detect and this fixed it for me

user-e3f20f 30 September, 2022, 08:09:56

@user-bbd687 check out this message

user-f93379 30 September, 2022, 06:59:24

Hi all! When calibrating the camera, an error occurs and the program crashes if the camera gets too close to the field of dots

Chat image

user-e3f20f 30 September, 2022, 07:50:20

Hi, this is a known issue https://github.com/pupil-labs/pupil/pull/2248 Use the linked user plugin to avoid it.

Note: The camera calibration is usually not necessary and is not to be confused with the gaze calibration, which is necessary.

user-f93379 30 September, 2022, 07:20:53

Colleagues, tell me: what is the difference in calibration with the "show undistorted image" flag on and off? When calibrating, you can see that the calibration area changes.

How does enabling the flag affect the calibration?

user-e3f20f 30 September, 2022, 07:56:12

Pupil Core estimates gaze in two coordinate systems: the distorted 2d image and the undistorted 3d camera coordinate system. https://docs.pupil-labs.com/core/terminology/#coordinate-system Instead of undistorting the image, which requires a lot of CPU resources, Pupil Capture uses a simplified mathematical representation of the distortion (the camera intrinsics) to transform selected points between the two coordinate systems.

The "show undistorted image" option uses the current intrinsics to create a preview of how well the intrinsics represent the actual distortion. If the intrinsics fit well, any straight edge in real life will also appear straight in the undistorted image. Inaccurate intrinsics will have a negative effect on gaze estimation accuracy.

Note that this is just a preview. The undistorted image is not used otherwise.
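To see why straight edges are the test, here is a toy illustration: under a simple radial distortion model x_d = x(1 + k1 r^2) (an illustrative stand-in, not Pupil's actual intrinsics model), points that are collinear in the undistorted view no longer line up:

```python
# Apply a toy radial distortion to normalized camera coordinates.
def distort(x, y, k1=-0.3):
    r2 = x * x + y * y
    factor = 1 + k1 * r2
    return x * factor, y * factor

# Three points on the vertical line x = 0.5.
line = [(0.5, -0.5), (0.5, 0.0), (0.5, 0.5)]
distorted = [distort(x, y) for x, y in line]
xs = [p[0] for p in distorted]
print(xs)  # the x coordinates now differ, so the line appears curved
```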

user-f93379 30 September, 2022, 07:40:43

How do I start GPU acceleration for pupil capture?

user-e3f20f 30 September, 2022, 07:56:35

Pupil Capture does not support GPU acceleration. It purely relies on the CPU.

user-bbd687 30 September, 2022, 08:08:51

I still can't use this cam.

Chat image Chat image Chat image

user-bbd687 30 September, 2022, 08:09:01

in win10

user-bbd687 30 September, 2022, 08:09:30

I can't get the image from the cam

user-bbd687 30 September, 2022, 08:10:08

please help me @user-e3f20f

user-bbd687 30 September, 2022, 08:11:22

I already did what you asked, but it didn't work

user-bbd687 30 September, 2022, 08:14:16

If I open the app that came with Windows 10, I can get videos.

user-e3f20f 30 September, 2022, 08:18:19

That means that the drivers are not correctly installed. Try installing them manually using steps 1-7 from these instructions https://github.com/pupil-labs/pyuvc/blob/master/WINDOWS_USER.md

user-bbd687 30 September, 2022, 08:14:33

so, I think

user-bbd687 30 September, 2022, 08:14:56

I need your help

user-bbd687 30 September, 2022, 08:16:30

There is some trouble with Pupil Capture v3.5.1

user-bbd687 30 September, 2022, 08:21:28

Thanks for your help. Now I will do it the same way.

user-bbd687 30 September, 2022, 08:34:50

Could you please show a picture of what it looks like when the driver has been successfully replaced?

Chat image

user-e3f20f 30 September, 2022, 08:41:48

I don't use Windows at the moment. It might be easiest to jump into the #pupil-voice channel and do a quick screen share to get this issue fixed

user-bbd687 30 September, 2022, 08:34:57

@user-e3f20f

user-bbd687 30 September, 2022, 08:45:17

can you give me a doc or a video about this operation?

user-e3f20f 30 September, 2022, 08:45:45

No, the linked documentation is the only one we have.

user-bbd687 30 September, 2022, 08:50:54

I don't know how to use this code.

Chat image

user-e3f20f 30 September, 2022, 08:51:09

Step 8 does not need to be performed

user-bbd687 30 September, 2022, 08:51:43

ok

user-869b8d 30 September, 2022, 12:06:45

Hi, I would like to know if any of you can help me with information on integrating PsychoPy and Pupil experiments with LSL. UnU

user-4c21e5 30 September, 2022, 12:22:24

Hi @user-869b8d πŸ‘‹. We have an official PsychoPy integration: https://psychopy.org/api/iohub/device/eyetracker_interface/PupilLabs_Core_Implementation_Notes.html Have you seen that already?

user-869b8d 30 September, 2022, 12:22:43

not yet! Thank you ❀️

user-869b8d 30 September, 2022, 12:22:59

If I have any doubts, I'll tell you

user-632640 30 September, 2022, 16:34:22

What is the consensus on using the Pupil Core software with third-party recording devices, like a web camera, instead of the eye tracker?

user-4c21e5 01 October, 2022, 12:44:51

Hi @user-632640 πŸ‘‹. Do you mean in the context of remote eye tracking, i.e. having the cam mounted on or near a monitor, or head-mounted, i.e. close to the eye?

End of September archive