๐Ÿ‘ core


user-fb8431 03 December, 2024, 06:49:15

Hello everyone, I am currently developing a plug-in of my own. I need to obtain the center position of the user's eyes, the distance between the user and the screen, and the camera rotation and translation matrices, among other related data. How do I obtain these?

nmt 03 December, 2024, 12:17:15

@user-fb8431, I will respond in 💻 software-dev

user-74c615 03 December, 2024, 08:40:04

Hi All! I often get this error (pink line) when marking my computer screen. What can I do to combat such a problem? Do I have to change the markers for each experiment? Thank you!!

Chat image

user-d407c1 03 December, 2024, 12:29:37

I'd try resetting the surfaces: remove them and add them again when the markers are detected

user-74c615 03 December, 2024, 12:21:51

I don't know what I've done.. there is no fixation data anymore 🥹 🥹 😭 Does someone know what I should do? Please help me 🥹

Chat image

user-d407c1 03 December, 2024, 12:29:10

Hi @user-74c615! Do you have the fixation detector enabled in the plugins list?

user-74c615 03 December, 2024, 12:45:55

Thanks for your instruction, Miguel. 😊 I have checked. Both the fixation and blink detectors are on. However, there is no data in either the gaze or fixation csv file. I have tried restarting Pupil Capture, but it didn't work. It's really strange 👀 👩‍💻 😢 👻

user-d407c1 03 December, 2024, 12:46:52

You may want to tune the parameters for fixation detection within the detector.

user-d407c1 03 December, 2024, 12:47:30

But if the issue persists, feel free to open a ticket in the 🛟 troubleshooting channel so we can assist you further.

user-9e0d53 05 December, 2024, 09:03:18

Hello,

our lab has two of your eye trackers, Core and Neon. I would like to know if it is possible to upload and process (analyze) sessions recorded with the Core using Pupil Cloud, or only in Pupil Player. Thanks!

P.s.: I know that a similar question from @user-5adff6 appeared here about a year ago, but it was directed only at the possibility of saving videos, not at their analysis. And since 2023 something may have changed =)

user-d407c1 05 December, 2024, 13:58:42

Hi @user-9e0d53 👋 ! No, Pupil Cloud is not compatible with Pupil Core recordings.

user-9e0d53 05 December, 2024, 14:00:51

Hi, thanks for the answer.

user-ea3911 09 December, 2024, 12:17:06

Hi there

user-ea3911 09 December, 2024, 12:17:20

Is it possible to use the cloud with the core?

user-f43a29 09 December, 2024, 12:35:19

Hi @user-ea3911 , Pupil Cloud is not compatible with Pupil Core recordings. What are you looking to do?

user-ea3911 09 December, 2024, 12:37:00

I am looking for an easy way to analyse gaze and AOI over a 3D printed model of an organ (liver) and generate a heatmap over a digital 3D model of that same organ

user-d407c1 10 December, 2024, 08:00:19

Hi @user-ea3911 ! Could you clarify whether the AOI/organ changes position dynamically, or is it static? While there aren't any turnkey solutions for this, if it's static, you might consider experimenting with the Head Pose Tracker.

The Head Pose Tracker provides the relative head pose over a 3D model. By combining the head pose, gaze data, and a 3D model with AprilTags, you could align these tags similar to what's demonstrated here. From there, you can use tools like Blender or Unity to recreate a 3D heatmap.
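For reference, a minimal sketch of the alignment step, assuming the Head Pose Tracker poses export provides a Rodrigues rotation vector and a translation in mm (the variable names are illustrative; check your export's actual column names and transform direction):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def gaze_to_model_coords(gaze_point_3d_mm, rot_vec, trans_mm):
    """Map gaze_point_3d from scene-camera coordinates into the
    Head Pose Tracker's world (AprilTag) coordinate system."""
    R = Rotation.from_rotvec(rot_vec).as_matrix()
    # Assumes the pose encodes the camera-to-world transform; if your
    # export stores world-to-camera, invert it instead: R.T @ (p - t)
    return R @ np.asarray(gaze_point_3d_mm) + np.asarray(trans_mm)

# Hypothetical values for illustration
gaze_cam = [12.0, -30.0, 450.0]     # gaze_point_3d, mm
pose_rot = [0.01, 0.85, -0.02]      # Rodrigues rotation vector
pose_trans = [100.0, 20.0, 300.0]   # translation, mm
print(gaze_to_model_coords(gaze_cam, pose_rot, pose_trans))
```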

user-46e23b 10 December, 2024, 22:24:38

Hi, I'm planning on graphing some eye tracking data we have from a reading task our subjects performed. I was looking at graphing the gaze_point_3d x and y coordinates over time as a possible way of doing this. I was wondering what units these coordinates are in, and would you suggest graphing them as-is or converting the points to visual angle?

nmt 11 December, 2024, 02:38:07

Hi @user-46e23b! gaze_point_3d is defined as the nearest intersection of the left and right eyes' visual axes. Its unit is mm. May I ask what your end goal is with this visualisation / what exactly you want to visualise? Technically, one can generate plots from all of Pupil Core's datastreams, e.g. if you want to plot visual angle changes, that's entirely possible. That said, visual angle can also change during movements of the head. So the best choice really depends on the goal and the experimental conditions. If you could elaborate on these things, I can try to point you in the right direction.
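As a rough illustration of the conversion to visual angle, here is a minimal sketch assuming Pupil Core's scene camera convention (x right, y down, z forward, all in mm); treat the sign conventions as an assumption to verify against your own data:

```python
import numpy as np

def gaze_point_to_visual_angle(x, y, z):
    """Convert gaze_point_3d (mm, scene-camera coordinates) into
    azimuth/elevation in degrees relative to the optical (+z) axis."""
    azimuth = np.degrees(np.arctan2(x, z))     # positive = rightward
    elevation = np.degrees(np.arctan2(-y, z))  # y points down, so flip sign
    return azimuth, elevation

print(gaze_point_to_visual_angle(50.0, -20.0, 600.0))  # hypothetical sample
```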

user-a34db0 11 December, 2024, 00:52:54

Hi, I have been using a Pupil Core device for a year and a half. I recently updated my computer to Windows 11 and started getting the error "USB device not recognized. The last USB device connected to this computer malfunctioned." when I connect my device, and one of the IR cameras does not work. Is this a common error on Windows 11? I have re-installed the drivers multiple times. Is it possible that either the camera or the connector is broken?

nmt 11 December, 2024, 02:41:36

Hi @user-a34db0! Is it just one of the cameras that's not working? If so, can you try swapping the non-functioning camera to the other side of the frame? Does it work on the other side or continue to malfunction?

user-a34db0 11 December, 2024, 13:08:31

[email removed] Nargund! Is it just one of

user-e5b833 12 December, 2024, 06:17:43

Hi, I want to convert the output 'duration' to seconds. How can I do this?

user-d407c1 12 December, 2024, 07:35:07

Hi @user-e5b833 ! I assume that you are referring to the fixation duration, which is given in milliseconds? Is that correct? If so, simply divide it by 1000.
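For example, with a Pupil Player fixation export (the file path here is illustrative), the conversion is a one-liner:

```python
import pandas as pd

fixations = pd.read_csv("exports/000/fixations.csv")       # hypothetical path
fixations["duration_s"] = fixations["duration"] / 1000.0   # ms -> s
```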

user-fb8431 12 December, 2024, 08:38:34

Hello everyone, I have encountered a problem. I set up the code environment on ARM, and it reported that the pyglui library could not be installed. I followed the installation process at https://github.com/pupil-labs/pyglui/tree/arm64_support?tab=readme-ov-file but it still cannot be installed.

user-76ebbf 16 December, 2024, 07:15:50

Hi, I want to output gaze data as a heatmap. When using Core, what is the method for creating a heatmap?

user-480f4c 16 December, 2024, 07:31:19

Hi @user-76ebbf! Pupil Core software allows for the generation of heatmaps for single recordings, using the Surface Tracker plugin. You can use this plugin, along with Apriltag markers, to define surfaces and create heatmaps within these surfaces.

If you're comfortable with some coding, we also have a tutorial showing how to generate heatmaps from surface-mapped data.
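In the spirit of that tutorial, here is a minimal sketch of building a heatmap from a Surface Tracker gaze export (the file name is illustrative; x_norm, y_norm, and on_surf are the standard surface export columns):

```python
import numpy as np
import pandas as pd
from scipy.ndimage import gaussian_filter
import matplotlib.pyplot as plt

# One such file is exported per surface; adjust the name to your surface
gaze = pd.read_csv("exports/000/surfaces/gaze_positions_on_surface_Screen.csv")
gaze = gaze[gaze["on_surf"] == True]  # keep only gaze that hit the surface

# Bin normalized surface coordinates (0..1) into a 2D histogram
hist, _, _ = np.histogram2d(
    gaze["y_norm"], gaze["x_norm"], bins=(36, 64), range=[[0, 1], [0, 1]]
)
heatmap = gaussian_filter(hist, sigma=2)  # smooth the raw counts

plt.imshow(heatmap, origin="lower", cmap="jet")  # surface origin is bottom-left
plt.axis("off")
plt.show()
```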

user-b0328a 16 December, 2024, 14:18:29

Hello,

I am using your micro eye camera from the Pupil Core system with a custom headset, combined with a Logitech camera for the world camera. After performing multiple screen marker calibrations without any movement (no head or positional changes), I observe significant variations in the eye_center_3d values.

To clarify, eye_center0_3d_x normally represents the center of eyeball 0 in the world camera coordinate system. From my understanding, these values should remain stable across different calibrations if I do not move or change position.

Here is the data I recorded:

          eye_center0_3d_x   eye_center0_3d_y   eye_center0_3d_z
TEST 1            -154.317             -7.007            -47.143
TEST 2            -148.606             -2.159            -50.040
TEST 3             -58.097            -23.400            -48.722
TEST 4            -151.141            -15.187            -43.138
TEST 5            -125.316            -25.532            -40.781
TEST 6            -129.889            -16.376            -41.263
TEST 7            -158.744            -16.924            -50.250
TEST 8            -161.328            -11.080            -44.540
TEST 9            -145.555            -17.474            -47.026
TEST 10           -163.708            -33.539            -46.117

Is this behavior expected? Are there any parameters or calibration methods that can help stabilize these measurements?

Thank you for your help!

user-d407c1 17 December, 2024, 07:41:17

Hi @user-b0328a 👋 ! You are correct that the eye_center0_3d_x, eye_center0_3d_y, and eye_center0_3d_z values represent the 3D position of the eye center in the world camera coordinate system. Ideally, when there is no movement (head or positional), these values should remain relatively stable across different calibrations. However, variations can occur due to a few factors.

  • Initial State: The 3D model of the eyeball is built using pye3d through a bundle adjustment method. For optimal results, we recommend rotating the eyes during calibration to help build a proper model. Check out this video: https://youtu.be/_1ZRgfLJ3hc A proper model of the eyeball would be indicated by a dark blue circle, and it would resemble a physiologically plausible eyeball shape.

  • Update of the model: The model updates regularly to compensate for slippage and model adjustment errors. In certain scenarios, freezing the model can help stabilize results. You can learn more about freezing the model and when it makes sense to do so here:
    Freeze the pye3d Model.

Unfortunately, without seeing the actual eye videos it is hard to know what happened.
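As a quick sanity check, one can quantify the spread of the values posted above; a per-axis standard deviation of more than a few mm (here dominated by the TEST 3 outlier on x) suggests the model was rebuilt between calibrations:

```python
import numpy as np

# eye_center0_3d values (mm) from the ten calibrations above
centers = np.array([
    [-154.317,  -7.007, -47.143], [-148.606,  -2.159, -50.040],
    [ -58.097, -23.400, -48.722], [-151.141, -15.187, -43.138],
    [-125.316, -25.532, -40.781], [-129.889, -16.376, -41.263],
    [-158.744, -16.924, -50.250], [-161.328, -11.080, -44.540],
    [-145.555, -17.474, -47.026], [-163.708, -33.539, -46.117],
])
print("mean (mm):", centers.mean(axis=0).round(2))
print("std  (mm):", centers.std(axis=0).round(2))
```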

user-3418bc 16 December, 2024, 15:02:36

Hello, I own the Core product. When I bought it, the world camera was not yet included.
- Can I purchase it now separately?
- What is its specific use case? E.g., would I need it to analyze a kid's reading performance on a screen?
Thx & kind regards

user-d407c1 17 December, 2024, 08:42:57

Hi @user-3418bc 👋 ! That's unusual: unless specifically requested or for VR add-ons, Pupil Core typically comes with the world camera by default. You can reach out to our team at [email removed] to check whether the camera could be attached. However, please note that this would most likely require rewiring the system, if possible.

The world camera serves two primary purposes:
1. Recording the wearer's point of view and tracking the gaze point within that view.
2. Calibrating the system by tracking the calibration markers visible in the scene camera.

Let us know if you have any other questions or need further clarification!

user-b02f36 17 December, 2024, 06:24:14

Hello, I want to design a unique eye-tracking task based on Pupil Core for a psychological experiment. Is there an example or a demo I can find in a GitHub repository? Many thanks!

user-d407c1 17 December, 2024, 08:45:06

Hi @user-b02f36 ! You can, for example, use PsychoPy and our plugin to orchestrate the stimulus presentation with eye tracking.
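A minimal sketch of what that can look like via PsychoPy's ioHub integration; the exact configuration keys here are assumptions to verify against your PsychoPy version's documentation:

```python
from psychopy.iohub import launchHubServer

# ioHub device path for Pupil Core; verify settings keys against the
# PsychoPy docs for your version
iohub_config = {
    "eyetracker.hw.pupil_labs.pupil_core.EyeTracker": {
        "name": "tracker",
        "runtime_settings": {
            "pupil_remote": {"ip_address": "127.0.0.1", "port": 50020},
        },
    }
}

io = launchHubServer(**iohub_config)
tracker = io.getDevice("tracker")
tracker.setRecordingState(True)   # start a Pupil Capture recording
# ... present stimuli / run the trial loop here ...
tracker.setRecordingState(False)
io.quit()
```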

user-b0328a 17 December, 2024, 09:08:29

Hello @user-d407c1, thank you very much for your detailed explanation.

I would like to clarify a few last points to better understand the positioning of the eyeball in the world coordinate system:

Are the eye_center0_3d_x, eye_center0_3d_y, and eye_center0_3d_z values expressed in millimeters? I ask this because I want to place the eyeball with the highest precision possible.

To confirm: Does the (0, 0, 0) coordinate represent the world camera position? Are the eye_center values indicating the position of the eye relative to this (0, 0, 0) origin? Thank you again for your help

user-d407c1 17 December, 2024, 09:25:15

Yes, it is mm, in 3D scene camera space coordinates, and yes, the origin is the scene camera, if using a binocular system.
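So, for example, the straight-line distance between an eyeball center and the scene camera falls out directly (the values here are taken from the table posted above):

```python
import numpy as np

eye_center = np.array([-154.317, -7.007, -47.143])  # mm, scene-camera coords
print(f"eye-to-scene-camera distance: {np.linalg.norm(eye_center):.1f} mm")
```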

user-b0328a 17 December, 2024, 09:26:47

Ok, thank you. I am using a monocular system; what is the difference?

user-d407c1 17 December, 2024, 09:36:10

Apologies for the confusion, I need a coffee 😅 The eye center coordinates' origin won't change, but the 0 refers to camera 0 for binocular systems (OD), and 1 to camera 1. For monocular systems, it would just be that eye.

user-b0328a 17 December, 2024, 09:38:29

Thank you so much for taking the time to clarify this! Wishing you a great day

user-d407c1 17 December, 2024, 09:48:05

You too!

user-76ebbf 18 December, 2024, 03:20:00

Hi @user-480f4c, thank you for your reply. I was able to successfully create a heatmap for images! Next, I would like to create a heatmap for videos. Do you have any reference materials?

user-3ac65e 19 December, 2024, 17:50:40

Is there any way in the codebase to turn off the IR LEDs?

nmt 20 December, 2024, 10:58:17

Hi, @user-0f7b55. It's not possible to disable them via the codebase. But you could physically occlude them if required. May I ask why you want to turn them off?

user-a0b654 20 December, 2024, 11:40:31

Hi 🙌 I am a graduate student in psychology and am currently working on an experiment. I would like to measure the change in pupil size while watching some videos using Pupil Core. Is it possible to start recording as soon as the video starts and stop recording as soon as the video ends?

nmt 20 December, 2024, 11:46:58

Hey @user-a0b654! Thanks for the question. It's not recommended to start and stop recordings in this way. This is because not all of the processes are guaranteed to start at the same time. Instead, I'd recommend that you send events over the real-time API that correspond to the start and stop of the video. You can find an example script that shows how here: https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py Does that sound like something you could use?
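A condensed sketch of that pattern, following the linked pupil-helpers script; it assumes Pupil Capture runs locally with Pupil Remote on its default port and the Annotation plugin enabled (the full script also shows how to start the plugin via a notification):

```python
import time
import zmq
import msgpack

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")  # Pupil Remote default port

remote.send_string("PUB_PORT")           # discover the PUB port
pub = ctx.socket(zmq.PUB)
pub.connect(f"tcp://127.0.0.1:{remote.recv_string()}")
time.sleep(1.0)                          # let the PUB socket finish connecting

remote.send_string("t")                  # current Pupil time for timestamps
pupil_time = float(remote.recv_string())

def send_event(label, timestamp):
    payload = {"topic": f"annotation.{label}", "label": label,
               "timestamp": timestamp, "duration": 0.0}
    pub.send_string(payload["topic"], flags=zmq.SNDMORE)
    pub.send(msgpack.dumps(payload, use_bin_type=True))

send_event("video_start", pupil_time)    # and "video_end" when playback ends
```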

user-66ff4c 24 December, 2024, 07:09:21

Is there a Vive Pro 2 eye tracking kit?

user-d407c1 24 December, 2024, 07:47:15

Hi @user-66ff4c 👋!

There's no off-the-shelf add-on available for the Vive Pro 2, neither for the legacy 🥽 core-xr nor the 🤿 neon-xr integration. However, you can create your own frame for NeonXR to use with the Vive Pro 2.

For more details on this topic, you can refer to this previous message:
https://discord.com/channels/285728493612957698/1218909974923706459/1219186187529617438

End of December archive