πŸ‘ core


user-789ddb 01 June, 2023, 08:24:25

Hey guys!

I'm wondering if the Pupil Labs Core equipment is capable of dealing with radiation?

user-1aa180 01 June, 2023, 13:45:05

Another thing- the application won't let me select the "shutter priority mode" or the "auto mode" - only the "manual mode" and "aperture priority mode" are selectable. Maybe that could be related?

nmt 01 June, 2023, 13:50:37

It's also worth restarting Capture with default settings to see if that solves it - you can do that from general settings. Re. the different exposure settings, unfortunately the naming conventions are a bit confusing. 'Aperture priority mode' is actually the auto-exposure mode

user-5fbe70 01 June, 2023, 13:53:37

@nmt Thank youπŸ˜ƒ

user-908b50 01 June, 2023, 17:48:01

Hi all, I am familiar with the dispersion-based algorithm used in Pupil Player to detect fixations post hoc. I have been reading the code and trying to understand it. So my questions are: 1) does it take into account the sampling frequency of the gaze data (frame size?); 2) also, when calculating duration, is missing data considered? If there are more than a few missing timestamps, how is the end time of the fixation then calculated? Thanks!

user-cdcab0 01 June, 2023, 20:20:10

Hey @user-908b50 - the search for the end of a fixation starts at line 207 in shared_modules/fixation_detector.py (https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/fixation_detector.py#L207).

Starting from the gaze datum that achieves the minimum duration within the maximum dispersion (call this "A"), the detector iterates sample-by-sample until it finds gaze data whose timestamp exceeds the maximum duration (call this "B"). The samples in the range between A and B (but not including B) are searched to determine which one first exceeds the maximum dispersion.

I think that addresses both of your questions, but let me know if you need more clarification
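To make that search concrete, here is a minimal sketch, assuming a time-sorted list gaze of dicts with a 'timestamp' key and a hypothetical dispersion() helper that returns the angular spread of a slice (illustrative only, not the actual detector code):

def find_fixation_end(gaze, start, a, max_dispersion, max_duration):
    # `a` is the index of the datum that achieves the minimum duration
    # within the maximum dispersion (point "A" above)
    t0 = gaze[start]['timestamp']

    # "B": iterate sample-by-sample until the timestamp exceeds max duration
    b = a
    while b < len(gaze) and gaze[b]['timestamp'] - t0 <= max_duration:
        b += 1

    # search the range [A, B) for the first sample to break the dispersion limit
    for i in range(a, b):
        if dispersion(gaze[start:i + 1]) > max_dispersion:
            return i - 1  # last sample still within the dispersion limit
    return b - 1  # dispersion never exceeded; fixation capped by max duration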

user-ffe6c5 02 June, 2023, 11:18:37

Hi Pupil Labs team. We need to calibrate the Pupil Core at 1.5 m distance. This works perfectly when using the calibration marker that is sent together with the eye tracker, which is sized for 1-2.5 m distance and has a diameter of about 6 cm. We were wondering if making the calibration marker slightly bigger (printing the marker from the docs on DIN A4 results in a diameter of 8 cm instead of 6 cm) affects the calibration results (accuracy and precision)? A quick test showed no difference in calibration results; do you have any experience here?

user-d407c1 02 June, 2023, 13:30:09

Hi, it should not make a difference. The only issue would be if you print it too small and it could not be detected in the scene camera.

user-fa2527 02 June, 2023, 12:31:51

Hello, Is the screen marker calibration method only for recording gaze on the screen?

user-d407c1 02 June, 2023, 13:37:45

Hi @user-fa2527 ! You can use it outside the screen too. It is more convenient for the screen, as the calibrated area would match, but it also works elsewhere.

user-fa2527 02 June, 2023, 12:58:11

I'm a brand new user desperately trying to generate some gaze data. I'm using Mac OS. My recordings show real world video but not gaze. Is that because Pupil Capture was not started with administrative privileges?

user-d407c1 02 June, 2023, 13:44:14

Admin rights are required to get the camera streams, so you would need them to run Pupil Capture. If you grant them, you should be able to make a recording and play it in Pupil Player. If you recently bought Pupil Core, you would be eligible for an onboarding session. You can ask for it at info@pupil-labs.com

user-fa2527 02 June, 2023, 13:46:35

Wow! Thank you for letting me know about the onboarding. That should really help. Question: in the online documentation, it says to copy the Pupil Capture application to the /Applications folder. Is that the same as the Applications folder?

user-d407c1 02 June, 2023, 13:50:56

The procedure does not differ from installing any other app on macOS; it will ask you to drag it into the Applications folder (usually at Macintosh HD - Applications). Once there, you can open the terminal on your Mac and use the command described here: https://docs.pupil-labs.com/core/software/pupil-capture/#macos-12-monterey-and-newer

user-fa2527 02 June, 2023, 15:55:54

I am working on a Macbook Pro. Can I plug the glasses directly into it without the USB adapter?

user-fa2527 02 June, 2023, 15:58:01

I changed the administrative permissions as recommended and still do not get gaze data. Here is what Pupil Capture looks like - apparently no data from right eye:

Chat image

user-cdcab0 04 June, 2023, 19:58:09

From your screenshot it looks like you are receiving right-eye data. In the confidence graph, the left- and right-eye values overlap (the right-eye values are behind the left-eye values)

user-00cc6a 04 June, 2023, 15:23:06

Hello @nmt. I am using a plugin developed by jesseweisberg (https://github.com/jesseweisberg/pupil) to automatically detect objects from the scene camera. I have placed plugin files in the pupil capture settings folder, and when I am trying to start pupil capture software I received the following warnings. Moreover, I cannot see the object detector plugin in the plugin manager (Pupil capture application). Is it the correct procedure for loading a custom plugin, and if so, why is this warning showing? So how can I use this plugin with a pupil core headset?

eye0 - [WARNING] plugin: Failed to load 'object_detector_app'. Reason: 'cannot import name 'Visualizer_Plugin_Base''
eye1 - [WARNING] plugin: Failed to load 'object_detector_app'. Reason: 'cannot import name 'Visualizer_Plugin_Base''

user-cdcab0 04 June, 2023, 20:14:01

Hi, @user-00cc6a - at first glance, that repo was forked a long time ago and hasn't kept up with changes to the main project in several years. Digging a little deeper, it looks like that fork includes changes to code beyond just the addition of the plugin. In short, you are not going to be able to run that plugin on an up-to-date version of the Core software without significant effort. You will probably have a much easier time running that fork from source, although I suspect you will also have to hand-pick dependency versions from that time frame, given how long it has been since the fork was updated.

user-74c615 05 June, 2023, 12:13:46

hi everyone! This is my first time using Core to connect to my computer (macOS) and I'm having trouble with it. I am wondering where I can get some help? Thank you!

user-c2d375 05 June, 2023, 12:19:09

Hi @user-74c615 πŸ‘‹ Do you have trouble accessing the Pupil Core video streams from Pupil Capture on macOS? If yes, please take a look here (https://docs.pupil-labs.com/core/software/pupil-capture/#macos-12-monterey-and-newer) and let me know how it goes

user-74c615 05 June, 2023, 12:20:07

okok thank you!

nmt 05 June, 2023, 18:16:09

Getting started

user-1c31f4 05 June, 2023, 18:46:01

Hey folks, I'm having a little trouble with processing gaze data.

For reference, here is the structure of the gaze datum:

{
    # monocular gaze datum
    'topic': 'gaze.3d.1.',
    'confidence': 1.0,  # [0, 1]
    'norm_pos': [x, y],  # norm space, [0, 1]
    'timestamp': ts,  # time, unit: seconds

    # 3D space, unit: mm
    'gaze_normal_3d': [x, y, z],
    'eye_center_3d': [x, y, z],
    'gaze_point_3d': [x, y, z],
    'base_data': [<pupil datum>]  # list of pupil data used to calculate gaze
}

And here is the 'base_data':

{
    ### pupil datum required fields

    'id': 0,  # eye id, 0 or 1
    'topic': 'pupil.0',
    'method': '3d c++',
    'norm_pos': [0.5, 0.5],  # norm space, [0, 1]
    'diameter': 0.0,  # 2D image space, unit: pixel
    'timestamp': 535741.715303987,  # time, unit: seconds
    'confidence': 0.0,  # [0, 1]

    ### 2D model data

    # 2D ellipse of the pupil in image coordinates
    'ellipse': {  # image space, unit: pixel
        'angle': 90.0,  # unit: degrees
        'center': [320.0, 240.0],
        'axes': [0.0, 0.0],
    },

    ### 3D model data

    # Fixed to 1.0 in  pye3d v0.0.4.
    'model_confidence': 1.0,

    # pupil polar coordinates on 3D eye model. The model assumes a fixed
    # eye ball size. Therefore there is no `radius` key
    'theta': 0,
    'phi': 0,

    # 3D pupil ellipse
    'circle_3d': {  # 3D space, unit: mm
        'normal': [0.0, -0.0, 0.0],
        'radius': 0.0,
        'center': [0.0, -0.0, 0.0],
    },
    'diameter_3d': 0.0,  # 3D space, unit: mm

    # 3D eye ball sphere
    'sphere': {  # 3D space, unit: mm
        'radius': 0.0,
        'center': [0.0, -0.0, 0.0],
    },
    'projected_sphere': {  # image space, unit: pixel
        'angle': 90.0,
        'center': [0, 0],
        'axes': [0, 0],
    },
}

nmt 06 June, 2023, 07:18:12

Hi @user-1c31f4 πŸ‘‹. Check out this page of our docs for a description of each variable: https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter and this page for the coordinate systems: https://docs.pupil-labs.com/core/terminology/#coordinate-system (description of theta and phi under the 3d eye model header)

user-1c31f4 05 June, 2023, 18:46:11

First, I'm curious what the field 'gaze_normal_3d' is.

user-1c31f4 05 June, 2023, 18:47:29

Second, I am wondering how 'theta' and 'phi' for each eye are established. I.e., would (theta, phi) = (0, 0) represent the user looking straight ahead?

user-1c31f4 05 June, 2023, 18:47:37

Thanks for any help!

user-d4e38a 07 June, 2023, 02:26:46

I have the phone buzzing, and I get a message that says Recording Error. When I unplug and reconnect the wire it seems to work again, but this is pretty annoying. Also, the live stream works 75% of the time and does not work the rest of the time. The page takes a long time to load (even with the IP address of the Companion specified). Sometimes it comes up, and other times it struggles.

nmt 07 June, 2023, 09:24:19

Responded to in πŸ•Ά invisible

user-74c615 07 June, 2023, 07:45:44

hi everyone! When using Core, I find that it cannot connect to Pupil Capture. The screen no longer shows the picture of my eyes, and restarting does not fix the problem. Does anyone know what to do? thank you!

user-74c615 07 June, 2023, 07:47:05

hi, I restarted with default settings and the problem is solved!!!

user-6eeda2 07 June, 2023, 09:48:01

Hello! I have a Core headset and I'm creating a new plugin where I read the gaze position of the eyes. As I would like to use it on a Raspberry Pi 4b, I don't want to show the eye windows or the world camera window. I run "python main.py capture --hide-ui", but sometimes the eye cameras don't open. If I don't use --hide-ui, I can click the buttons to activate "Detect eye 0" and "Detect eye 1", but I want to run it without showing/using any windows. Is there any way to program it so that it opens the cameras through the Python file of the custom plugin? I tried to look for information on the Pupil website and this channel but I couldn't find it. Thanks!

user-1c31f4 07 June, 2023, 15:13:55

I have a similar question to this. My project is going to be embedded, so any interaction with the pupil capture software using its GUI must be avoided. About 5% of the time, the camera feeds for eye 0 and eye 1 don't open, and I have to click the buttons to show them.

user-4514c3 07 June, 2023, 11:54:12

Good morning, can you recommend a computer that works well according to your experience with Pupil-Labs? Thank you so much

nmt 07 June, 2023, 15:32:20

Hi @user-4514c3 πŸ‘‹. We prefer Apple computers with M1 or M2 chips (+ 16GB unified memory). These get great performance out of Core systems!

nmt 07 June, 2023, 15:30:51

Hi @user-6eeda2! If I could ask, what's your reasoning for trying to run Core on a Raspberry Pi? It has pretty limited processing power and you'll likely not be able to do real-time gaze estimation on it.

user-6eeda2 08 June, 2023, 09:38:07

I was thinking of using a Raspberry Pi because its small size and weight would let me process the gaze position while the user moves, without needing to restrict the user's movement.

user-9a0705 07 June, 2023, 22:24:39

Hi guys, I'm looking for the mouse controller in this repo: https://github.com/pupil-labs/pupil. I didn't find it; googling, I found this repo: https://github.com/pupil-labs/pupil-helpers with the specific library mentioned previously. When running the code it shows "ModuleNotFoundError: No module named 'windows'", a package that I have already installed via conda and pip. Any ideas how to solve it? My current Python version is 3.11.3

user-cdcab0 08 June, 2023, 04:20:15

Hi, @user-9a0705 - it looks like our mouse controller is using a now deprecated PyUserInput package. It shouldn't be too hard to replace it though. I'll see if I can whip something up
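In the meantime, a rough sketch of the same idea using pynput as the replacement could look like this (the confidence cutoff and the screen resolution are assumptions to adjust, and it assumes a surface has been defined in the Surface Tracker plugin):

import msgpack
import zmq
from pynput.mouse import Controller  # maintained alternative to PyUserInput

SCREEN_W, SCREEN_H = 1920, 1080  # assumed screen resolution

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://127.0.0.1:50020')  # default Pupil Remote port
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f'tcp://127.0.0.1:{sub_port}')
subscriber.subscribe('surfaces.')  # gaze mapped onto a tracked surface

mouse = Controller()
while True:
    topic, payload = subscriber.recv_multipart()
    for datum in msgpack.loads(payload).get('gaze_on_surfaces', []):
        if datum['on_surf'] and datum['confidence'] > 0.6:
            x, y = datum['norm_pos']
            # surface norm space has a bottom-left origin; screens are top-left
            mouse.position = (int(x * SCREEN_W), int((1 - y) * SCREEN_H))

Mapping through a surface (rather than raw gaze) keeps the cursor aligned with the monitor instead of the scene camera image.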

user-00cc6a 08 June, 2023, 05:12:05

When using the surface tracker plugin, do we need to draw surfaces in the Pupil Capture application before starting the experiment, or can we draw them in the Pupil Player application afterwards and get fixation data on the defined surfaces?

user-4bc389 08 June, 2023, 08:50:14

Hi, on the right is my Pupil Capture software, but the LSL relay option is not displayed. How can I solve this? Thank you

Chat image

user-cdcab0 08 June, 2023, 08:58:22

Hi, @user-4bc389 - the LSL relay for Pupil Capture has to be installed separately. You can find instructions here: https://github.com/labstreaminglayer/App-PupilLabs/tree/master/pupil_capture

nmt 08 June, 2023, 09:47:08

@user-6eeda2 this is another potential (more hacky) solution: https://discord.com/channels/285728493612957698/285728493612957698/1108325281464332338

user-d407c1 08 June, 2023, 10:13:02

@user-6eeda2 If you run it from source you can use the flag --hide-ui https://github.com/pupil-labs/pupil#command-line-arguments, although it would only hide the window, the gui would still be there

nmt 08 June, 2023, 10:17:07

@user-6eeda2 my understanding is you would like a stripped-down version of Capture software to reduce computational load, is that correct?

user-6eeda2 08 June, 2023, 10:19:43

It could be nice to have that version.

user-4bc389 08 June, 2023, 11:34:47

Hi, may I ask what caused the problem in the video, which resulted in the loss of pupil capture?

user-d407c1 08 June, 2023, 11:42:01

Hi @user-4bc389 ! Would you mind elaborating on what you mean by loss of Pupil Capture? Did it crash? Regarding the video, I see one issue: the pye3d model of the eye seems poorly fitted, and it also looks like you froze the 3D model. I would suggest unfreezing it, moving the eye around until the model fits the eye better, and then freezing it again if you need to, for example to obtain pupillometry.

user-4bc389 08 June, 2023, 11:47:13

Sorry, my description may not be accurate. Thank you for your suggestion. I will give it a try

user-d407c1 08 June, 2023, 13:19:43

Hi @user-4bc389 ! It's me again. I've looked at the video again in a larger window, and the model does look like it is changing (so you probably did not freeze it); apologies for the confusion, you can scratch that. Coming back to the loss of Pupil Capture: could you be referring to the pupil detection, and the moments where it is in an erroneous location? In that video, this occurs due to eyelid occlusion preventing the pupil from being properly detected; in fact, if you look at the confidence value you should see a drop. You can minimise this by positioning the eye cameras to capture the eye from a lower angle, providing a clearer view of the pupil at instances where the eyelid partially occludes it.

user-d99595 08 June, 2023, 13:30:42

Hi!

Short question: I notice that before my recording I get a good pye3d model for pupil size detection if I let my participants look around. So once I have a nice pye3d model, I fixate the head and ask the participants not to move their eyes during trials. I keep close track of how the pye3d model changes during my experiment. Yesterday I came across a participant who had a good pye3d model, but once I ran the post-hoc pupil detection in Pupil Player, the algorithm struggled to stabilize to a fitting model.

I started recording when the head and eyes were already stationary. Next time, should I let participants look around after starting the recording so that the post-hoc pupil detection can create a fitting pye3d model?

I hope this is a sufficient explanation; I might be very vague, sorry in advance.

Cheers

nmt 09 June, 2023, 07:21:16

Hi @user-d99595! Yes, we would definitely recommend recording the model-fitting process (i.e. when the wearer samples different gaze angles). We would also recommend freezing the model if you have a controlled environment. Have you already read our best practices for pupillometry? If not, it's worth a read: https://docs.pupil-labs.com/core/best-practices/#pupillometry

user-d99595 08 June, 2023, 13:45:31

Basically, my question is: should I start the recording before or after obtaining a good pye3d model if I want to use post-hoc pupil detection? I'm primarily interested in pupillometry (fixed head, eyes don't move during trials).

user-fa2527 08 June, 2023, 13:56:49

When I change the color of the Vis Fixation plugin circle and dot, it remains red. Is this normal?

user-00cc6a 09 June, 2023, 17:34:45

Hello, @user-cdcab0. I am using the surface tracker plugin to capture gaze patterns on the surfaces defined in Pupil Player. I am drawing multiple AOIs with the help of AprilTags, and during this process some AOIs overlap one another. So, with the existing plugin, is it possible to deactivate a particular AOI and keep the other AOIs active?

user-cdcab0 09 June, 2023, 21:44:56

No, you can't deactivate a surface/AOI. You can delete one, but then you'd have to recreate it when you need it again

user-eb6164 12 June, 2023, 03:01:13

hi, a quick question: I was looking into the fixation files and saw that the confidence level is recorded for gazes. What about fixations?

nmt 12 June, 2023, 06:56:22

Hey @user-eb6164 πŸ‘‹. Confidence for each fixation is reported in the export. It's the mean of the base gaze data used to generate a fixation. By default, values <0.6 are discarded from fixation detection.
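As a quick example of working with that value, low-confidence fixations can be dropped from a Player export with pandas (the export path is the default location; the 0.6 cutoff mirrors the detector default):

import pandas as pd

fixations = pd.read_csv('exports/000/fixations.csv')
# 'confidence' is the mean of the base gaze data for each fixation
reliable = fixations[fixations['confidence'] >= 0.6]
print(f'{len(reliable)} of {len(fixations)} fixations kept')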

user-4bc389 12 June, 2023, 07:19:11

Hi, when I set up the eye video, I found that when my eyes look down, the pupils often cannot be tracked well, leading to unstable gaze points and drift. I have tried the common methods, but the problem still occurs. Is there any other way to solve this?

Chat image

nmt 12 June, 2023, 07:51:50

Hi @user-4bc389! Thanks for sharing your video. Pupil obscuration like this can be a challenge and is often dependent on the wearer's eye appearance and/or facial characteristics. When the pupil is obscured by the upper eyelid, it's usually possible to find a more optimal camera position. When the pupil is occluded by the lower eyelid, the best course of action would be to tailor the experiment such that large downward shifts of gaze aren't necessary. E.g. keep the stimuli in a more neutral position.

user-6ce1ed 13 June, 2023, 01:42:49

Hi, I installed Pupil Capture on Ubuntu 22.04, and when I start the calibration in fullscreen mode, the target display is extremely slow, making it impossible to complete the calibration. I don't have this problem without fullscreen mode, nor on Windows 11 (dual boot). I have not found anything about a similar problem; has someone experienced the same issue? Thanks

user-cdcab0 13 June, 2023, 04:08:01

Hi, @user-6ce1ed πŸ‘‹πŸ½ - what kind of laptop is that? Depending on your hardware, Ubuntu usually requires an extra post-installation step for graphics drivers. Are your graphics drivers installed and up to date?

user-6ce1ed 13 June, 2023, 04:48:03

Thank you for your answer. It's a Lenovo IdeaPad 5 Pro with the integrated graphics of a Ryzen 7 5800U CPU. I don't think I have installed any graphics driver after installing Ubuntu; is it needed even without a dedicated GPU?

user-cdcab0 13 June, 2023, 06:45:55

Yes, some integrated graphics (or APUs as AMD likes to refer to them) may still require kernel modules/drivers. First though, can you confirm that you have pupil detection enabled and the eye cameras turned on?

user-6ce1ed 13 June, 2023, 06:54:45

Ok, I will check, thank you. Yes, they are turned on, and the calibration goes smoothly when I don't run it in fullscreen mode

user-cdcab0 13 June, 2023, 06:58:29

Oh - it's only slow like that when you're running in fullscreen mode? That's interesting and unusual - sorry, I didn't understand that from your earlier message. It's still a hint to me that it may be a driver issue.

user-6ce1ed 14 June, 2023, 02:07:45

After driver installation, it works fine, thank you !

user-584956 13 June, 2023, 08:51:47

hi guys, I'm new to eye tracking. I'm just about to start using the Pupil Labs Core for a study that's taking place outside. Unfortunately, the sun's rays interfere so much that the pupil calibration isn't really working. Does anyone have ideas on what I can do about this?

nmt 13 June, 2023, 09:05:23

The first thing to check when outside is the eye camera exposure. If the images are overexposed (washed out) that will hinder pupil detection

nmt 13 June, 2023, 09:05:41

You can adjust this setting in the eye windows

user-584956 13 June, 2023, 09:39:52

that didn't do much. the detection rate of the pupil then jumps between 0.4 and 0.7

nmt 13 June, 2023, 09:49:15

Here are some additional steps you can try:
1. Ask the participant to wear a cap to block direct sunlight
2. Set the ROI to only include the pupil. Note that it is important not to set it too small (watch to the end): https://drive.google.com/file/d/1tr1KQ7QFmFUZQjN9aYtSzpMcaybRnuqi/view
3. Modify the 2D detector settings: https://docs.pupil-labs.com/core/software/pupil-capture/#pupil-detector-2d-settings
4. Adjust gain, brightness, and contrast: in the eye window, click Video Source > Image Post Processing

user-75df7c 13 June, 2023, 12:43:25

hi! I'm trying to control Pupil Core with TCP messages, but having trouble. Do I just need to have Pupil Capture open and then send messages to the configured IP:port?

user-d407c1 13 June, 2023, 12:48:32

Hi @user-75df7c ! To programmatically control Core, have a look at our network API: https://docs.pupil-labs.com/developer/core/network-api/ It uses the ZeroMQ protocol.

user-75df7c 13 June, 2023, 12:49:41

thanks! I saw that, but I can't use Python. We have a custom system to run my experiment, and with it I can send TCP messages; is that not enough?

user-d407c1 13 June, 2023, 12:59:32

What language do you use? Perhaps there are already ZMQ bindings for it.
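For reference, the handshake is small; this Python sketch (commands per the Pupil Remote docs, localhost assumed) translates directly to any language with ZMQ bindings. Note that raw TCP writes will not work, because ZeroMQ adds its own framing on top of TCP:

import zmq

ctx = zmq.Context()
socket = ctx.socket(zmq.REQ)  # Pupil Remote is a ZMQ REQ/REP pair
socket.connect('tcp://127.0.0.1:50020')  # default Pupil Remote address

socket.send_string('R')      # start a recording ('r' stops it)
print(socket.recv_string())  # Capture replies to every request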

user-443c5a 13 June, 2023, 18:03:38

Dear Team Pupil Labs, I am a new kid on the block here & have an enquiry about what might be the best Pupil Labs system for me to purchase. I am interested in recording portable EEG/ExG [I plan to purchase a 32-channel mBrainTrain Smarting Pro system]. As I understand it, I could use LSL to integrate the eye tracking stream from one of your systems with their EEG/ExG data acquisition. I do not have the Smarting Pro system yet - I am making enquiries & trying to work out what would work best overall in terms of eye tracking systems [Core, Invisible or Neon]. My application would be indoors - people walking around an art museum looking at artworks - so there will not be a huge ballistic motion component. I would like to have some future flexibility to also do some task-activated recordings - we use PsychoPy to deliver stimuli in our lab to an existing tethered EEG system. [We perform EEG data analysis using MNE-Python. I would want to record EEG/ExG data using BDF format [24 bits]]. I am not wedded to any eye tracking analysis software at this stage, but my main issue is making sure that the data acquisition of neuro & eye tracking data is properly synced. I am not a beginner with EEG/ExG, but am a newbie with eye tracking. Thanks for your consideration, Aina. πŸ€“

user-d407c1 14 June, 2023, 05:53:45

Hi @user-443c5a ! That sounds really interesting. I think Neon, together with our Reference Image Mapper and our PsychoPy integration, could fit you perfectly, but why don't you write us at sales@pupil-labs.com and request a demo so we can make sure it's the right fit.

user-443c5a 14 June, 2023, 12:04:48

@user-d407c1 : thank you for your response. I will do exactly that. It will be good to see the system demo. πŸ”

user-6127f0 14 June, 2023, 09:38:07

Hi, my research team and I were doing a test with Pupil Invisible, and once we made the recordings, we got an error and could not create a project. Do you know what the problem could be?

Chat image

user-7ab893 14 June, 2023, 09:39:31

Hello. I want to do the camera intrinsics estimation using the narrow-angle lens. I do see the screen with the dots, but when I press "i" on my keyboard nothing happens. When I press the circular "I" button in the world window, the pattern disappears. Can you please tell me how to fix this? Many thanks

user-cdcab0 14 June, 2023, 09:58:12

Hi, @user-7ab893 πŸ‘‹πŸ½ - when you're pressing i on the keyboard you have to be sure that the main Pupil Capture window (and not the window with the circles) has keyboard focus. Can you give that a shot?

user-7ab893 14 June, 2023, 10:06:51

Thanks. How exactly do I do this? The cursor is already on the Pupil Capture window, but still nothing happens when I press "i". When I press the circular "I" button in the world window, the window with the circles still disappears

user-cdcab0 14 June, 2023, 10:11:16

After showing the circles, click back on the main window in a blank space (not on the circled-I or on any other graphical element). Then try pressing i on your keyboard.

If you're using a single monitor, you may need to resize the window so that the circles and main window both fit on the screen.

user-23177e 14 June, 2023, 12:04:50

Hi all, I have a quick question regarding the recorded videos from the eye tracker - the 'world' videos that combine the views of the cameras. We would like to sync these videos with recordings made with GoPro cameras. I notice, however, that the recorded framerate is 28.980 (MediaInfo). Is this something that can be adjusted? And is this a variable framerate? Thanks!

nmt 15 June, 2023, 06:23:14

Hi @user-23177e πŸ‘‹. We do have a plugin that enables synchronisation of Core recordings with third-party video sources, like from a GoPro. It takes a little bit of work to set up, but functions really well. Further details here: https://discord.com/channels/285728493612957698/446977689690177536/903634566345015377 @user-2e75f4 has very recently used this with an Azure video!

user-7ab893 14 June, 2023, 12:26:11

thanks. now it is working. However, when we press i, we get the message "World capture 10 calibration patterns". This appears 10 times, but nothing happens. We don't think a pattern is being captured and detected.

user-cdcab0 14 June, 2023, 12:44:48

The circled "I" should turn blue, and it'll have a counter next to it. The counter should go down by one each time the camera captures an image of the circles. If the number stays at 10, that means it hasn't captured the circles.

You may need to manually adjust the camera exposure down - I have found that a white background on a computer monitor creates a strong bloom effect and washes out nearby edges, making them impossible to detect.

user-7ab893 14 June, 2023, 12:26:35

and no data is displayed

user-6eeda2 14 June, 2023, 13:03:33

Hi! I want to estimate the gaze position of the eyes in millimeters with respect to the scene camera using the Core glasses. I don't want to use the Pupil Capture software to obtain the gaze, since I don't want to display any GUI, and I want to run the code on a Raspberry Pi. What I am doing now is accessing the Pupil Labs cameras with pyuvc, and then with pye3d I obtain for each eye the center of the sphere, projected sphere, phi, theta, etc. I don't know how to estimate the gaze with respect to the scene camera from the data that pye3d gives me. I tried to look at the code of pupil-labs/pupil but I get lost in all the files. How can I obtain the gaze? Thank you in advance!

user-584956 14 June, 2023, 15:11:52

Hey guys! Do you know what kind of infrared lights are on the cameras for recognizing pupils, and what wavelength spectrum do they have?

user-eb6164 15 June, 2023, 02:09:22

Hello, I am having an issue with the export process in Pupil Player. I have to repeat the process many times before I get the surfaces folder and the files inside it. I am worried that even after getting them (after many tries), some data are not exported from surfaces. What might be the issue?

user-def465 15 June, 2023, 02:30:33

Hello @user-eb6164 , just a quick question, do you get an error when you try to export ? Something like player - [ERROR] surface_tracker.surface_tracker_offline: Marker detection not finished. No data will be exported. ?

user-eb6164 15 June, 2023, 02:31:48

you mean when downloading the file on the player gui?

user-956845 16 June, 2023, 13:54:45

Hi team. I just have a question about data filtering. I am trying to analyze the gaze data. For each gaze datum, I get a value like "on_surface_confidence" which presents the probability that the corresponding gaze exists. I wonder if there is an official threshold for this value? For example, should I only keep data that has more than 0.8 confidence, or do I need to decide that myself? Thanks for considering my question!

user-d407c1 19 June, 2023, 06:41:01

Hi @user-956845 ! First, let me clarify that in the gaze positions on surface export, you get an on_surf column, which is a boolean telling you whether the gaze point is on the surface, and a confidence value of the gaze, which relates to the gaze itself and not to the surface.

There is no official threshold for this value, as it depends on the specific use case and the level of accuracy required for your analysis.

However, a common approach is to discard data points with a confidence level lower than 0.6, as it is a trade-off between the accuracy of the data and the amount of data that is discarded.

Ultimately, the threshold you choose will depend on your specific experiment and the level of accuracy required for your analysis. If you have clean data, you can afford to increase the threshold. If your data is noisy, you might need to reduce the threshold to have any data left.

I hope this helps!
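To illustrate that filtering, assuming the standard surface export (the surface name 'Screen' is a placeholder):

import pandas as pd

df = pd.read_csv('exports/000/surfaces/gaze_positions_on_surface_Screen.csv')
# keep gaze that is on the surface and above the chosen confidence cutoff
kept = df[(df['on_surf'] == True) & (df['confidence'] >= 0.6)]
print(f'kept {len(kept)} of {len(df)} gaze samples')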

user-eb6164 16 June, 2023, 17:14:51

Hello... any tips on how to make the process of adjusting the eye tracker and calibrating easier? It is taking me a long time for each participant. I am using the single marker because I need a wider area covered on the screens, but I always get bad calibration, and sometimes it takes me more than 15 minutes to adjust. Especially the eye camera part: I always have to adjust it several times to fit the user, and I always get bad confidence.

user-d407c1 19 June, 2023, 06:45:43

Hi @user-eb6164 ! It's a bit hard to provide feedback without further context; perhaps you can share an example recording with us at data@pupil-labs.com where you record the calibration choreography as well? That would help us give you more concrete help.

That said, have you tried the standard on screen calibration? What results do you get?

user-eb6164 16 June, 2023, 17:15:31

are physical markers on a whiteboard better than a single one?

user-5c56d0 17 June, 2023, 13:15:28

Thank you for your management. Could you please answer the following questions? Below are questions for Pupil Core.

Q1. I have a csv file named "blinks" in Pupil Core. As shown in the attached image (left side), line F is marked as index. What does this index mean?

Q2 In the csv named "blinks," is there an index for the blink fold point (the point where the eye is completely closed)? The blink fold point is the point where the eye is completely closed between the start and the end of a blink, as shown in the right figure in the attached image.

Q3 In the csv named "blinks", there are start_frame and end_frame. For example, which of the following is start_frame? (1) the frame at the moment the eyelid closes (i.e., the blink is recognized from the eyelid border), or (2) the frame at the moment the pupil begins to be hidden by the eyelid.

Q4 Which of the following is the blink recognition method? (1) Recognizing the blink from the eyelid border. (2) Recognizing the blink from whether the pupil is hidden or not.

Q5 Does the frame at start_time in line B of the csv mean the start_frame in line E of the csv? The two values did not exactly match when these correlations were calculated.

Chat image

user-d407c1 19 June, 2023, 07:07:13

Hi @user-5c56d0 πŸ‘‹ ! Pupil Core has two ways to obtain blink events, depending on whether you require them in real time or post hoc. Most of your questions are already answered in the documentation, which you can find here: for Capture, https://docs.pupil-labs.com/core/software/pupil-capture/#blink-detector For Player, https://docs.pupil-labs.com/core/software/pupil-player/#blink-detector

I would also recommend reading our best practices about blink detection with Core https://docs.pupil-labs.com/core/best-practices/#blink-detector-thresholds

But generally speaking, the blink detector uses the confidence of the pupil detection and a filter over it (here is the post-hoc filter: https://github.com/pupil-labs/pupil/blob/eb8c2324f3fd558858ce33f3816972d93e02fcc6/pupil_src/shared_modules/blink_detection.py#L360). On a blink event, there is normally a sudden drop in confidence followed by a quick recovery, which is used as a signal.

  • Q1: The index, as noted in the docs, is the median frame of the world camera on that blink event.
  • Q2: We do not provide PERCLOS, but you can work around by defining a proper threshold and using the median of the blink event.
  • Q3: Please have a look at https://docs.pupil-labs.com/core/software/pupil-player/#blink-detector as all fields have a description.
  • Q4: See my answer above on how the blink detection works in Pupil Core.
  • Q5: Start time refers to the start of the blink event, while the start_frame_index is the world frame (scene camera) at which the blink event starts.
user-5c56d0 19 June, 2023, 09:04:10

Thank you for your reply. I am sorry to bother you, but could you also answer the following?

What does 'the median of the blink event.' mean? Does this mean the point where the eye is completely closed between the start and the end of a blink, as shown in the attached image?

Chat image

user-d407c1 19 June, 2023, 09:29:06

As described previously, the blink detector uses the confidence of the pupil detection; this signal is convolved with a filter, and the response is named "activity".

Assuming good pupil detection, on a blink, as the eyelid closes and occludes the pupil, the confidence of the pupil detection rapidly decreases and the "activity" signal increases.

Here the onset threshold comes into play: it is the value at which the activity response signal starts peaking, i.e., roughly when the confidence of the pupil detection starts decreasing. You can define that threshold in the settings. Then, as the confidence of the pupil detection starts increasing again, the activity signal falls, and there you have the offset threshold, which determines how far the signal should fall to be considered the end of a blink.

Although the activity response is directly linked to the pupil detection confidence and highly correlated with the openness of the eyelid, this is not a 1:1 relationship, and therefore there is no fold point. You can greatly increase the onset value and lower the offset one to filter out blink events where the eyelid may not have fully closed, but again, nothing guarantees that the confidence drop was caused entirely by the eyelid fully closing.

The index is the median of the world frame indexes that appear between the onset event and the offset, in other words, between the start of the blink event as per the rise in the activity signal and its end when the signal falls under the offset threshold.
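As a conceptual sketch of that onset/offset logic (the real implementation in shared_modules/blink_detection.py differs in detail; the filter length and thresholds here are assumptions):

import numpy as np

confidence = np.ones(100)
confidence[40:55] = 0.0  # simulated blink: confidence drops, then recovers

# step-shaped kernel: a rapid confidence drop yields a positive "activity"
# peak, a rapid recovery a negative one
n = 10
kernel = np.concatenate([-np.ones(n), np.ones(n)]) / n
activity = np.convolve(confidence, kernel, mode='same')

onset_threshold, offset_threshold = 0.5, 0.5  # user-configurable in Capture
onsets = np.flatnonzero(activity > onset_threshold)     # blink starts
offsets = np.flatnonzero(activity < -offset_threshold)  # blink ends
print(onsets.min(), offsets.max())  # roughly brackets the simulated blink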

user-5c56d0 23 June, 2023, 07:00:21

Thank you very much [email removed]

user-eb6164 19 June, 2023, 16:18:07

I have tried the standard screen calibration; it is better than the single physical marker. I discovered that there is an issue with the eye tracker, so I will send an email to the team about that, because I prefer the single marker since I have a large area to cover. I have another question: in the gaze output, how can we differentiate between unique gazes? Is it the world_index?

user-91a92d 20 June, 2023, 09:25:17

Hello, I broke the connector of the eye0 camera on my Pupil Core headset. I would like to resolder it, but I could not find the reference of the connector. Does anyone know it?

nmt 20 June, 2023, 17:50:45

Hey @user-91a92d πŸ‘‹. Please reach out to info@pupil-labs.com and a member of the hardware team will help you out!

nmt 20 June, 2023, 17:49:27

Hey @user-eb6164! In gaze_positions.csv, each row represents a unique gaze datum. The world_index tracks the scene cam sampling rate, so you'll find multiple gaze datums per world index.
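For example, with pandas (standard Player export path assumed), the many-to-one relationship is easy to see:

import pandas as pd

gaze = pd.read_csv('exports/000/gaze_positions.csv')
# several gaze rows typically share one scene-camera frame, since the eye
# cameras sample faster than the world camera
rows_per_frame = gaze.groupby('world_index').size()
print(rows_per_frame.describe())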

user-eb6164 20 June, 2023, 17:57:17

oh thank you, that clears things up. So for each surface, I just have to count the number of rows to get the number of gazes

user-2e75f4 21 June, 2023, 09:55:30

On the pupil player is there a way to chop up the recording/trim the recording down?

user-d407c1 21 June, 2023, 10:16:31

Hi @user-2e75f4 ! Sure! You can click on the general settings (the icon that looks like a gear wheel) and there you can select the relative time range or frame index range to export

user-2e75f4 21 June, 2023, 10:24:59

Thanks πŸ™‚

user-3874a1 21 June, 2023, 18:05:27

Hi there! I am starting a new project using Pupil Core and am excited to learn more about this eye tracker! I have been playing with Pupil Player and the downloadable sample data, and had three quick questions: 1) what are the units for gaze_point_3d_x? 2) should gaze_point_3d_x basically be the same as norm_pos_x (both from gaze_positions.csv), except that the norm is scaled? 3) if I look at something in the real physical world, keep my eyes there, and then pitch my head down a little, what would that look like in the gaze_positions.csv file? Maybe another way to put it: do we get eyes-in-head or eyes-in-real-world coordinates in the end? Thank you very very very much for your time!!

user-cdcab0 23 June, 2023, 10:52:36

Hi, @user-3874a1 - I think you'll find answers to your questions (and more!) in the official documentation. Here are a couple of links that I recommend reviewing: https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter https://docs.pupil-labs.com/core/terminology/#coordinate-system

To answer your question about fixating on an object while pitching your head up and down, you would see gaze Y coordinates change

user-6bf52d 21 June, 2023, 21:35:08

Hi! I am working on a computer vision project with the Pupil Core. For my work, I am trying to record a few clips with the world camera (for analysis), but for some reason the recording is being saved at a lower resolution than what I set in the settings. Is this a common problem? Can anyone who has faced this issue please help me?

user-cdcab0 23 June, 2023, 10:57:07

Hi, @user-6bf52d πŸ‘‹πŸ½ - that's unusual! What resolution are you trying to record at and what resolution are you seeing? Have you tried more than one resolution? It might also help to know what version of the software you're using and on what operating system

user-e66c8c 22 June, 2023, 12:05:19

hello! I have a problem importing files from Pupil Labs to iMotions. Previously I did it by downloading Pupil Player Data, but it is no longer possible to download it from Pupil Cloud. Can you tell me how to import files to iMotions now? There have been some changes in the software, and the help center hasn't been updated yet...

user-480f4c 22 June, 2023, 12:07:08

Hey @user-e66c8c πŸ‘‹ ! I understand that you have the Pupil Invisible glasses, is that right?

user-e66c8c 22 June, 2023, 12:07:45

exactly

user-480f4c 22 June, 2023, 12:09:32

Thanks for clarifying that - Let's move this conversation to the πŸ•Ά invisible channel. I've replied to your question: https://discord.com/channels/285728493612957698/633564003846717444/1121411569608298547

user-e66c8c 22 June, 2023, 12:10:47

sure, thanks

user-5c56d0 23 June, 2023, 07:02:28

Dear sir, the Pupil Core eye camera is currently at 120 Hz; can you tell me how to make it 200 Hz?

user-ffe6c5 23 June, 2023, 08:37:48

Hi, I want to calculate the horizontal pupil position/angle relative to the head/world camera origin in degrees. I'm streaming my data through LSL, which records norm_pos_x/y and gaze_point_3d_x/y/z. At the moment I'm calculating the horizontal angle (psi) in spherical coordinates as in this tutorial: https://github.com/pupil-labs/pupil-tutorials/blob/master/05_visualize_gaze_velocity.ipynb (psi = np.arctan2(gaze_point_3d_z, gaze_point_3d_x)). I was now wondering if I should use norm_pos_x/y instead, or if my calculation is right?

user-cdcab0 23 June, 2023, 11:05:04

I think what you have is correct. You wouldn't be able to calculate psi from x/y image coordinates because you have no depth component there.

Also, if you do end up working with those normalized coordinates for something else, keep in mind that they are normalized to the scene image resolution. Because the image width and height are different, the x and y normalized values have different scales
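For completeness, a worked version of that angle computation, mirroring the tutorial's formula with an added conversion to degrees (the example coordinates are made up):

import numpy as np

# example gaze point in scene-camera coordinates (mm), e.g. from LSL
gaze_point_3d_x, gaze_point_3d_z = 50.0, 400.0

# horizontal angle psi as in the tutorial; gaze straight along the
# camera's optical (z) axis corresponds to psi = 90 degrees
psi = np.arctan2(gaze_point_3d_z, gaze_point_3d_x)
print(np.degrees(psi))  # ~82.9 degrees for this example point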

user-ffe6c5 23 June, 2023, 11:26:51

great thank you!

user-5c56d0 23 June, 2023, 13:20:36

Dear sir, the Pupil Core eye camera is currently at 120 Hz; can you tell me how to make it 200 Hz?

user-d407c1 23 June, 2023, 13:33:02

Hi @user-5c56d0 ! You should be able to change this setting in the eye camera window under video source

Chat image

user-5c56d0 23 June, 2023, 13:36:03

Thank you very much [email removed] I got it.

user-5251cb 26 June, 2023, 05:34:38

Is it possible to purchase an HMD add-on camera that supports FHD resolution? I heard that it was available for sale a few years ago.

user-1a4b64 27 June, 2023, 14:41:59

Hi,

For our project, we want to know when someone is looking at a particular surface based on the person's gaze data, using the surface tracker feature of Pupil Labs Core. However, the timestamps do not seem to match what we expected: the timestamps of gaze.3d.0._on_surface lag behind the timestamp of the world video frame. Does anyone know if it is supposed to be like this, and if not, is there a quick way to fix this problem?

The data was logged using a hdf5 logger.

Chat image

user-1c31f4 27 June, 2023, 20:13:03

Hey folks, I'm running pupil capture from source on Linux. The software opens correctly and I can see all the camera feeds. But when I try to perform calibration, the calibration screen freezes, and I get the error:

(venv) [email removed] python3 main.py capture
[14:08:37] WARNING  eye0 - uvc: Could not set Value. 'Backlight Compensation'.  uvc_backend.py:424
           WARNING  eye1 - uvc: Could not set Value. 'Backlight Compensation'.  uvc_backend.py:424
[14:09:31] INFO     world - calibration_choreography.base_plugin: Starting Calibration  base_plugin.py:537
open(): No such file or directory
[14:09:34] INFO     world - calibration_choreography.base_plugin: Stopping Calibration  base_plugin.py:574
open(): No such file or directory
           ERROR    world - gaze_mapping.gazer_base: Calibration Failed!  gazer_base.py:203
           ERROR    world - gaze_mapping.gazer_base: Not sufficient reference data available.  gazer_base.py:297
           WARNING  world - plugin: Plugin Gazer3D failed to initialize

user-1c31f4 27 June, 2023, 22:44:39

@user-cdcab0 , do you or any of the other pupil labs members have any ideas to help me start fixing this?

user-1c31f4 27 June, 2023, 20:13:46

I don't think the problem is 'Gazer3D' failing to initialize. I'm pretty sure it's whatever is causing the "open(): No such file or directory" message. I looked at the source code in base_plugin.py and found no calls to an open() method, so I'm quite confused.

Any help would be so appreciated! In return, I can keep you guys updated on running the pupil software on a single board computer πŸ˜„

user-2772fd 27 June, 2023, 21:02:54

Hello, can we get the gaze in pixels of the world camera in real time via the network API? Also, what are the 0 and 1 limits of the normalized gaze values? Thanks

user-1c31f4 27 June, 2023, 21:18:38

The normalized gaze value maps your field of view to a 2D plane. 0 would represent looking all the way to the left, and 1 all the way to the right, so 0.5 would represent looking straight ahead. The same applies to the vertical axis (looking up or down).

user-1c31f4 27 June, 2023, 21:19:46

I'm not sure about getting the gaze in pixels of the world camera. I'm pretty sure the pink dot on the screen actually uses the norm_pos parameter, which I suppose could then be mapped to the size of the world camera image.
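Denormalizing is straightforward if you know the scene camera resolution (1280x720 here is an assumption; check the Video Source settings). Note that norm space has a bottom-left origin while image pixels have a top-left origin:

width, height = 1280, 720  # assumed scene camera resolution

def norm_to_pixels(norm_x, norm_y):
    # flip y because norm space is bottom-left, image space is top-left
    return norm_x * width, (1.0 - norm_y) * height

print(norm_to_pixels(0.5, 0.5))  # image center -> (640.0, 360.0)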

user-2772fd 27 June, 2023, 21:21:12

Hi Kin, thanks for the reply. The normalized values change between 0.4 and 0.6 even if the pink circle moves from one side of the image to the other. Do I need to scale the values?

user-1c31f4 27 June, 2023, 21:29:30

That's interesting. Maybe my understanding of the normalized values is incorrect. You are looking at the norm_pos value, correct?

user-2772fd 27 June, 2023, 21:48:40

Exactly,

user-2772fd 27 June, 2023, 21:48:49

Chat image

user-2772fd 27 June, 2023, 21:51:19

Here is the message,

user-1c31f4 27 June, 2023, 22:02:34

https://youtu.be/h___c_abjx8

user-1c31f4 27 June, 2023, 22:03:21

The plot in this video is norm_pos. I am re-normalizing to [-1, 1] on both axes, but I am getting values across the full range of 0 to 1 before normalizing

user-1c31f4 27 June, 2023, 22:04:12

I'm not sure why your values only between 0.4 and 0.6 😦

user-2772fd 27 June, 2023, 22:08:23

I am using norm_pos of the gaze data after 3D calibration. Would it be helpful to try 2D calibration?

user-2772fd 27 June, 2023, 22:09:13

Also, what are the units of the gaze_point_3d? centimeters?

user-1c31f4 27 June, 2023, 22:36:50

I used 3d calibration too. Not sure about the units

user-2772fd 27 June, 2023, 22:55:48

We are using the physical marker calibration (not the screen),

user-cdcab0 27 June, 2023, 23:05:14

It's probably not actually frozen. Do you see one of the calibration markers on the screen? If you hit escape, does it close the calibration?

When you start calibration, the program tries to collect data both from the world scene camera (specifically, the location of the calibration marker in the image) and from pupil detection. Those two pieces of the puzzle are both necessary; if either is missing, the calibration will not move forward (which may have given you the impression that it's frozen).

So your plugin: Plugin Gazer3D failed to initialize probably is the culprit here

user-1c31f4 27 June, 2023, 23:07:40

The first calibration marker appears on the screen, but it is greyed out (in the first frame of the 'fade in' animation). If I recall correctly, hitting escape does not do anything, but hitting "alt+tab" to navigate back to the main pupil capture window does close the calibration.

If plugin: Plugin Gazer3D failed to initialize is the culprit, do you have any tips on where I should start in order to fix this?

user-1c31f4 27 June, 2023, 23:49:58

@user-cdcab0 I've been looking through the source code to try to find where the error comes from, and I can't find where the plugin: Plugin Gazer3D failed to initialize logging statement originates. I also looked for FileNotFound exceptions that could be logging the 'open(): No such file or directory' messages, but I couldn't find any that would create that message. I also looked for general except Exception as e handlers, but none seemed relevant.

user-cdcab0 28 June, 2023, 02:56:50

I hadn't previously noticed the open(): No such file or directory messages in my log, but I do see them now that you pointed them out (even when calibration is working). I tracked it down to https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/audio/__init__.py#L101

It's just trying to play a little notification sound, but the sound file probably doesn't exist on your PC. It's unlikely to be the cause of what you're experiencing

user-1c31f4 27 June, 2023, 23:51:53

I can't even find the words 'failed to initialize' anywhere in the repo...

user-cdcab0 28 June, 2023, 01:59:42

That error message is generated here: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/plugin.py#L436

Let me try to break my build in a similar way so I can help you figure out the best way to troubleshoot

user-1c31f4 27 June, 2023, 23:53:37

I've also searched all of the issues and discussions for the same error message, and none of them are related to my issue of the calibration screen freezing.

user-1c31f4 27 June, 2023, 23:53:43

Thanks again for any help, by the way!

user-1c31f4 27 June, 2023, 23:59:29

This https://discord.com/channels/285728493612957698/285728493612957698/1100792975132471326 and this https://discord.com/channels/285728493612957698/285728493612957698/1098517309552869386 appear to be the same issue. I read through the responses to those. All three camera feeds are working and so is pupil detection and the 3d eye model.

user-cdcab0 28 June, 2023, 02:19:12

@user-1c31f4 - let's start with the basics. 1) Have you made any changes to the source (git status, git diff, etc to check)? 2) Have you tried clearing your capture_settings folder and running with all the defaults? Note that when you're running from source, settings are loaded from the capture_settings folder in the source tree

user-1c31f4 28 June, 2023, 05:19:45

I'm going to take a look and make sure I haven't made any changes when I'm back at my lab. But this evening I found something interesting: the issue only occurs when calibration is run in fullscreen mode. And when holding down alt-tab (cycling through all the open windows), the program un-freezes. The same problem occurs with the rate at which the Pupil glasses send packets via the network API: everything slows to a halt unless I hold alt-tab. Pretty bewildering. It must be something to do with the focused window, possibly a performance setting that throttles background windows. Or maybe when the window is selected with alt-tab a refresh is forced, so by holding alt-tab a refresh is forced at a rapid rate (making it appear to run smoothly).

user-cdcab0 28 June, 2023, 05:48:12

"the issue only occurs when calibration is run in full screen mode"

Another (linux) user recently had this issue - they just needed to install graphics drivers

user-1c31f4 28 June, 2023, 05:22:57

I'm going to dig into it a bit more tomorow, and now I have a good starting point. Thank you so much for your help, @user-cdcab0 ! It's reassuring that I can move on from the 'no such file' message and the 'plugin3d failed to initialize' message. Whatever is going on is definitely tied to the 'focused window' thing I mentioned in my previous message.

user-1c31f4 28 June, 2023, 05:25:32

I should mention: I am running the Pupil software on a single-board computer, the LattePanda 3 Delta. It runs fantastically smoothly (except for this calibration issue), much faster than my 2020 MacBook Air. This device is likely the best solution for users attempting to use a Raspberry Pi, as the LattePanda 3 Delta is only marginally larger. It's also very easy to get the Pupil software set up on it, because it's x86, not ARM, so running from source is not even necessary, and all dependencies worked exactly as expected. The real-time pupil detection and gaze calculations work perfectly. (Just putting this here for others' info ☺️)

user-1c31f4 28 June, 2023, 17:07:45

I also restarted with default settings, also no change

user-cdcab0 28 June, 2023, 20:45:07

Hm. Okay, it'd be nice to know if the release (non-source) version has the same behavior. Same for the other calibration methods. What about other fullscreen OpenGL apps - do they run properly?

I think the next step is probably investigating the calibration loop, ideally with an interactive debugger. I'd start by stepping through the recent_events and gl_display functions in screen_marker_plugin.py to make sure those are both executing as expected

user-1c31f4 28 June, 2023, 17:12:37

Made sure I had no changes with git, no change

user-c34503 28 June, 2023, 18:46:43

Hello! I've just installed the Pupil Capture software and plugged in the device for the first time. The application has loaded, but no video is coming from the device, just a blank screen. Basically, I'm stuck at step 2 of the "getting started" steps. Any advice would be appreciated!

user-1c31f4 28 June, 2023, 20:19:37

Hi Sam, what operating system are you using? On my Mac, I had to run pupil capture with administrator privileges for the camera feeds to show.

user-cdcab0 28 June, 2023, 20:47:21

Hi, @user-c34503 - welcome to the community! @user-1c31f4 is right - if you're running on Mac you will need to run with elevated privileges. Some info and instructions here: https://docs.pupil-labs.com/core/software/pupil-capture/#macos-12-monterey-and-newer

user-40d3d5 28 June, 2023, 22:53:35

Hello! Would it be possible to record the eye images with a constant frame rate through Pupil Capture?

For example, by using a request-response method instead of the current method. If this were possible, how high would the maximum frame rate be for 192x192 images? (Currently 200 fps for the Vive Pro add-on)

user-1c31f4 29 June, 2023, 00:00:24

As far as I am aware, there isn't a way to lock the frame rate of the camera feeds. You may be able to configure your code with the networking API to receive messages at a specific rate, although I am not sure if the images can be transmitted with the network API (maybe @user-cdcab0 can speak to that):

import time

import msgpack
import zmq

# ip and sub_port are assumed known, e.g. from a Pupil Remote 'SUB_PORT' query
ctx = zmq.Context()
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f'tcp://{ip}:{sub_port}')
subscriber.subscribe('gaze.')  # receive all gaze messages

while True:
    topic, payload = subscriber.recv_multipart()
    message = msgpack.loads(payload)
    print(f"{topic}: {message}")
    time.sleep(16.66e-3)  # delay ~16.7 ms, i.e. 60 fps

Although that feels a bit hacky. I also don't think you can configure the ctx.socket object to receive packets at a different rate.

user-1c31f4 29 June, 2023, 00:08:40

What is the reason for this, anyway?

user-cdcab0 29 June, 2023, 00:40:31

Hi, @user-40d3d5 - can you clarify what you mean by "the current method"? When you're using Pupil Capture's recording functionality, you should be seeing a constant frame rate (which you can configure under the Video Source options for each camera) in the generated video files. Your framerate options are limited to what's supported by the camera at the requested resolution. 200 fps is the max for our eye cameras at their lowest resolution.

With regards to the request-response method, what do you mean exactly? You can use the network API to send commands to pupil capture - such as telling it to record (see https://github.com/pupil-labs/pupil-helpers/blob/master/python/pupil_remote_control.py for an example). That's effectively the same as just pushing the record button in the Capture app yourself. Either way, you should have a framerate equal to whatever you have configured.

You can also use the network API to receive video frames (see https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py for an example), but if your intention is just to write those frames to a file then you'd just be adding an extra layer of overhead. Note that messages must be read by the subscriber as fast as they are generated - otherwise some messages will be dropped. You may be interested in some of the information here: https://docs.pupil-labs.com/developer/core/network-api/#delivery-guarantees-pub-sub
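For reference, a condensed sketch of the linked recv_world_video_frames example; it assumes the Frame Publisher plugin is active in Capture and set to BGR format:

import msgpack
import numpy as np
import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://127.0.0.1:50020')  # default Pupil Remote port
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f'tcp://127.0.0.1:{sub_port}')
sub.subscribe('frame.world')

while True:
    topic, payload, *extra = sub.recv_multipart()
    meta = msgpack.loads(payload)
    if meta.get('format') != 'bgr':
        continue  # enable BGR in the Frame Publisher plugin first
    img = np.frombuffer(extra[0], dtype=np.uint8)
    img = img.reshape(meta['height'], meta['width'], 3)
    print(topic.decode(), img.shape, meta['timestamp'])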

user-1c31f4 29 June, 2023, 00:00:59

See https://docs.pupil-labs.com/developer/core/network-api/ if you are not yet familiar with the network API πŸ™‚

user-cdcab0 29 June, 2023, 09:59:09

I don't yet have an intimate understanding of everything that happens on that event loop. It's possible that even if the uvc video stream callback has more consistent timings that other activity in the loop might negate the effect. I think that's unlikely, but not impossible, and it would be disappointing to put in that work only to find no benefit

user-89d824 29 June, 2023, 11:15:05

Hi,

Please may I know if it's possible to trim a recording? I accidentally let Pupil Core continue recording for an additional ~10 minutes, and I'd like to trim it before analysis. The total duration of the recording is now 30 minutes, which is too long and would probably crash Pupil Player

thanks!

user-d407c1 29 June, 2023, 11:21:46

Hi @user-89d824 πŸ‘‹ ! Check out our previous message about how to trim recordings https://discord.com/channels/285728493612957698/285728493612957698/1121020823910764554

30 min should not be an issue for Pupil Player

user-c2d375 29 June, 2023, 13:08:00

@user-d99595 Replying to your message from the Neon channel https://discord.com/channels/285728493612957698/1047111711230009405/1123913766594170941 I'd suggest using the Network API (https://docs.pupil-labs.com/developer/core/network-api/#network-api) to programmatically send an annotation each time the participant starts a new trial on Psychopy. We offer also a Psychopy integration for Pupil Core (https://psychopy.org/api/iohub/device/eyetracker_interface/PupilLabs_Core_Implementation_Notes.html)

user-d99595 29 June, 2023, 14:22:27

Thanks @user-c2d375 I'll have a look.

user-93ff01 30 June, 2023, 01:14:30

So far I've only run Pupil Core Capture from source (v3.6.6), naively assuming it would be more performant with Python 3.11 etc. It usually occupies around 65-75% CPU on an 8-core/16-HT Core i7 Dell workstation (each eye is ~35% and world is ~2%). Much to my surprise, when I downloaded the old (2021) precompiled binary v3.5, it occupies around 16% CPU (~7% for each eye capture and 2% for world capture). This is on Ubuntu 22.04. I tried to make sure the same plugins and camera FPS were used by both (I only have 2 default plugins enabled). Is this expected? EDIT: CPU measured using System Monitor, which averages across all cores (100% = all cores at 100%); Pupil Capture's own CPU values (around 600 per eye when run from source) are much higher as they are per core?

End of June archive