πŸ‘ core


user-9b6d86 02 October, 2022, 06:56:29

Hello everyone, I am interested in using the HTC Vive binocular add-on. Do we get eye-tracking data when using the binocular add-on? Can the binocular add-on also be used for the Oculus Quest?

user-183822 03 October, 2022, 11:01:11

hello, I am working with pupil diameter and I want to do baseline correction. Does someone have any idea how I can do it?

papr 04 October, 2022, 07:28:13

If it is methodological advice that you are looking for, I can recommend having a look at this https://www.researchgate.net/publication/335327223_Replicating_five_pupillometry_studies_of_Eckhard_Hess
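
For the practical side, baseline correction usually means normalising each trial's pupil trace against a short pre-stimulus window. Below is a minimal, hypothetical numpy sketch (not Pupil Labs code); the function names and the choice of subtractive vs. divisive correction are illustrative assumptions:

```python
import numpy as np

def baseline_correct(diameter, timestamps, baseline_start, baseline_end,
                     mode="subtractive"):
    """Correct a pupil-diameter trace against a pre-stimulus baseline window.

    diameter, timestamps: 1-D arrays of equal length.
    baseline_start/end: timestamps delimiting the baseline window.
    """
    mask = (timestamps >= baseline_start) & (timestamps < baseline_end)
    # Median is more robust than the mean against blinks and outliers.
    baseline = np.nanmedian(diameter[mask])
    if mode == "subtractive":
        return diameter - baseline           # same unit as the input
    elif mode == "divisive":
        return (diameter - baseline) / baseline * 100.0  # percent change
    raise ValueError(f"unknown mode: {mode}")
```

A subtractive baseline is the common default in the pupillometry literature; the divisive variant expresses the response as percent change from baseline.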

user-8ece89 03 October, 2022, 13:12:16

Hi, is it possible to aggregate different heat maps from different recordings of the same surface?

papr 04 October, 2022, 07:24:23

Yes, but not within Pupil Player. You will need to export the gaze on surface data for each subject and aggregate it manually.
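
A possible shape for that manual aggregation, sketched in Python: pool the surface-mapped gaze from each subject's export and bin it into one shared histogram. The column names (`x_norm`, `y_norm`, `on_surf`, `confidence`) are assumed from the standard `gaze_positions_on_surface_<name>.csv` export; verify them against your own files:

```python
import numpy as np
import pandas as pd

def aggregate_heatmap(csv_paths, bins=(64, 64), min_confidence=0.8):
    """Pool surface-mapped gaze from several Pupil Player exports
    into a single 2-D heatmap (rows = y, columns = x)."""
    xs, ys = [], []
    for path in csv_paths:
        df = pd.read_csv(path)
        # Keep only samples that actually landed on the surface,
        # with reasonable detection confidence.
        df = df[(df["on_surf"] == True) & (df["confidence"] >= min_confidence)]
        xs.append(df["x_norm"].to_numpy())
        ys.append(df["y_norm"].to_numpy())
    x = np.concatenate(xs)
    y = np.concatenate(ys)
    heatmap, _, _ = np.histogram2d(x, y, bins=bins, range=[[0, 1], [0, 1]])
    return heatmap.T
```

The resulting array can be smoothed (e.g. with a Gaussian filter) and rendered with any plotting library.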

user-1dc058 03 October, 2022, 13:50:29

Hi, are there any problems with Pupil Cloud? A Marker Mapper enrichment (1 section, duration is 10 sec) has been being analysed for 20 minutes so far...

user-62c153 03 October, 2022, 20:56:29

My Pupil Pro Binocular comes with two pupil cams. Should they both be streaming simultaneously? When I select the first eye, no problems. When I select the second camera for the second video feed, my first video feed jumps to the second window. I can't find a sequence or combination to get both pupil cams streaming simultaneously. Any help?

papr 04 October, 2022, 05:32:45

Hi, do they both have the same name in the selection menu?

papr 04 October, 2022, 07:22:29

If they indeed have the same name, please 1) install this file https://gist.github.com/papr/b08deac1a1023ce5187bfe7d4d40c574 into your ~/pupil_capture_settings/plugins folder, 2) start Capture, 3) go to the Video Source menu, 4) enable manual camera selection, 5) select a different camera for each process.

The downloaded file will include a unique id that will help you identify the different cameras.

user-869b8d 04 October, 2022, 20:31:34

if you have any information, it would be good

user-718ea9 05 October, 2022, 16:16:23

Hello, I am looking to utilize the Pupil Core software for eye tracking; however, I need to find a different small-format IR camera. What requirements are needed for the video feed to work seamlessly with the software?

user-cc3b62 06 October, 2022, 11:54:38

Hello! Can someone please tell me what the minimum system requirements are for the installation of the Pupil Labs software? Desktop, RAM, storage, minimum screen resolution... Thanks in advance

user-d407c1 06 October, 2022, 12:19:54

Hi @user-cc3b62 Apple's silicon chips (like the M1 Air) work great with Pupil Core. If you prefer a PC, the key specs are CPU and RAM. Recommended specs are a recent-generation Intel i7 CPU with 16GB of RAM (but it will also work on a recent i5). The supported operating systems are: macOS (minimum v10.12.0), Linux (minimum Ubuntu 16.04 LTS), and Windows 10.

user-648ceb 06 October, 2022, 12:01:57

Does anyone else have problems with extremely long processing times for the cloud enrichment face mapping processing? I have a 15 min recording that has been at 75% since yesterday and still not done.

papr 06 October, 2022, 12:02:44

Hey, could you let me know the enrichment and workspace id?

user-746d07 07 October, 2022, 11:11:29

I am trying to visualize the position I am looking at in 3 dimensions using gaze_point_3d in gaze_positions.csv. Is the coordinate system of gaze_point_3d as shown in the figure? (Reference: https://docs.pupil-labs.com/core/terminology/#coordinate-system)

Chat image

nmt 07 October, 2022, 12:52:16

Hi @user-746d07 👋. That's correct

user-746d07 07 October, 2022, 17:51:20

Chat image

user-f939d8 07 October, 2022, 19:24:11

Good afternoon, I'm using Pupil Capture and running into an issue where the camera for one of the pupils keeps disconnecting and reconnecting, with a lot of dropped frames between loading. The window itself seems to be freezing quite often when I click on it as well. The other eye window works smoothly. I noticed a user posting about a similar issue - I assumed it's similar and was a hardware issue with the specific eye camera. However, the freezing seems to jump in between eye cameras, where one works for a period of time while the other suffers disconnections and freezing. One of the cameras is also heating up

This is the console log information

Estimated / selected altsetting bandwith : 151 / 256. !!!!Packets per transfer = 32 frameInterval = 82712
Estimated / selected altsetting bandwith : 151 / 256. !!!!Packets per transfer = 32 frameInterval = 82712
eye1 - [WARNING] video_capture.uvc_backend: Camera disconnected. Reconnecting...
eye1 - [INFO] video_capture.uvc_backend: Found device. Pupil Cam2 ID1.
eye0 - [WARNING] video_capture.uvc_backend: Camera disconnected. Reconnecting...
eye0 - [INFO] video_capture.uvc_backend: Found device. Pupil Cam2 ID1.
Estimated / selected altsetting bandwith : 151 / 256. !!!!Packets per transfer = 32 frameInterval = 82712

papr 08 October, 2022, 06:03:24

~~Hi, what kind of Pupil Core headset do you use?~~ Can you please try restarting with default settings in the general settings menu?

user-6d40b9 11 October, 2022, 06:00:34

Good morning, I am planning to measure the pupillary light reflex. Is it possible to synchronise the world camera with the eye camera in order to get the exact timestamp when the light reflex starts, and an idea about the light intensity?

papr 11 October, 2022, 06:42:02

Hi, check out https://pyplr.github.io/cvd_pupillometry/04_overview.html

user-ced35b 11 October, 2022, 23:08:52

Hello, I am doing a binocular rivalry experiment where each eye receives a different image while I record their pupil diameter. I have only run two subjects; however, I keep seeing that their dominant eye has a larger pupil than their non-dominant eye (3mm vs 2mm). Is it possible to see this difference in pupil size across the two eyes? Moreover, am I seeing a real measure due to visual processing, or do you think this is a technical/hardware problem with the pupil measurement? I have attached the data from one subject who is left-eye dominant. Thank you in advance!

Chat image

papr 12 October, 2022, 10:10:39

Yes, it is very much possible that the difference is a result of small eye model fitting inaccuracies. I recommend repeating the same trial multiple times and resetting+refitting the eye models between each trial. If you see the same difference across all trials, then it is likely that you are measuring a physiological effect.

user-aa03ac 12 October, 2022, 09:42:10

Hi everyone! Could you please tell me if it is possible to turn the world camera recording off remotely? I haven't found this using search, so I hope it's not some sort of a cliché.

papr 12 October, 2022, 10:08:00

Hi, yes, that should be possible by sending this notification to Capture:

{
    "topic": "notify.start_plugin",
    "subject": "start_plugin",
    "name": "UVC_Source",
    "args": {
        "frame_size": (1270, 800),
        "frame_rate": 30,
        "name": "No scene camera",
    },
}
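
For reference, one way such a notification can be sent to Pupil Capture's Network API (the Pupil Remote REQ socket, default port 50020) is via pyzmq and msgpack. This is a hedged sketch following the documented multipart pattern; the `send_notification` helper name is my own:

```python
import msgpack
import zmq

def send_notification(pupil_remote, notification):
    """Send a notification dict to Pupil Capture over a connected REQ socket."""
    topic = "notify." + notification["subject"]
    payload = msgpack.packb(notification, use_bin_type=True)
    pupil_remote.send_string(topic, flags=zmq.SNDMORE)  # frame 1: topic
    pupil_remote.send(payload)                          # frame 2: msgpack body
    return pupil_remote.recv_string()                   # Capture confirms receipt

# Usage (with Pupil Capture running):
#   ctx = zmq.Context()
#   pupil_remote = ctx.socket(zmq.REQ)
#   pupil_remote.connect("tcp://127.0.0.1:50020")  # default Pupil Remote port
#   send_notification(pupil_remote, {
#       "topic": "notify.start_plugin",
#       "subject": "start_plugin",
#       "name": "UVC_Source",
#       "args": {"frame_size": (1270, 800), "frame_rate": 30,
#                "name": "No scene camera"},
#   })
```

The port number can be checked in Capture's Network API menu if it differs from the default.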

user-6586ca 12 October, 2022, 12:12:03

Hello everyone !

As part of our study, we assess the fixations within several AOI. The subjects, seating in front of the screen, should search some objects displayed on it.

Can you advise us on the best way to calculate the margin of visual error according to the distance between the subject and the screen?

One more thing: we need to expand the field of the AOIs to calculate the matches between AOIs and the subject's fixations (owing to some questions about peripheral vision). Can you give us any suggestions about how far we can go in expanding this margin?

Thanks for your help!

user-6e1219 12 October, 2022, 13:57:22

Hello, in one of my experiments I have connected Pupil Labs with PsychoPy. Previously I was getting proper data, but today when I tried to run the experiment I kept getting errors like

AttributeError: 'ioHubDevices' object has no attribute 'tracker'

AttributeError: 'NoneType' object has no attribute 'getIOHubDeviceClass'

My eye tracker is properly connected to my PC. What could be the reason for this error? And how can I fix it?

papr 12 October, 2022, 13:58:25

Hey! Can you please check if you get the same issue with the latest version of PsychoPy, too?

papr 13 October, 2022, 08:54:56

Thank you for sharing the file. Could you share the full traceback?

user-6e1219 12 October, 2022, 14:00:26

Yes, I have reinstalled the version right now. It is throwing the same error.

papr 12 October, 2022, 15:04:28

Can you share the psychopy experiment file with me?

user-62c153 12 October, 2022, 14:39:52

Will Pupil Core be well suited to an application where the subject wearing the device turns their head to one side to monitor an automated process (say, on a screen) and then turns back the other way to do some hands-on tasks? This cycle would be repeated many times.

papr 12 October, 2022, 15:05:33

Yes, it would, although Pupil Invisible might be an even better fit.

user-3aea81 13 October, 2022, 08:32:13

hello guys

user-3aea81 13 October, 2022, 08:32:39

i have one question

user-3aea81 13 October, 2022, 08:34:01

how can i get a software catalogue for Pupil Core

papr 13 October, 2022, 08:35:22

You can download Pupil Core software via our documentation https://docs.pupil-labs.com/core/

user-3aea81 13 October, 2022, 08:36:53

is it a Pupil Core software catalogue?

papr 13 October, 2022, 08:37:44

The download will get you Pupil Capture, Pupil Service, and Pupil Player. Are you just looking for an overview of the available software?

user-3aea81 13 October, 2022, 08:40:58

hm... just i need pdf file

user-3aea81 13 October, 2022, 08:48:08

like technical sheet?

papr 13 October, 2022, 08:57:15

You can find general tech specs here: https://pupil-labs.com/products/core/tech-specs/

Unfortunately, we don't have a dedicated catalog for our software. This is a high-level overview of what is available for Pupil Core:

Pupil Capture - https://docs.pupil-labs.com/core/software/pupil-capture/

Connect Pupil Core to a desktop or laptop computer. View, record and stream real-time gaze and pupil data. Interface with other devices with our network API.

Pupil Player - https://docs.pupil-labs.com/core/software/pupil-player/

Drag and drop single recordings into Pupil Player. Build visualisations, enrich data with analysis plugins, and export raw data and results.

user-3aea81 13 October, 2022, 09:03:16

okay thanks papr ^^

user-3aea81 13 October, 2022, 09:03:39

have a nice day!!~

user-6e1219 13 October, 2022, 09:45:00

Chat image

papr 13 October, 2022, 09:55:24

This file is just for the in-app calibration. It is not meant for Pupil Labs. Have you made sure that Pupil Capture is running and that PsychoPy is set up correctly to connect to it?

user-6e1219 13 October, 2022, 10:09:06

To set up Pupil Labs with PsychoPy, what necessary steps do I need to follow?

(1) Calibration (2) April tag surface declaration (3) Settings in PsychoPy (choose eye tracker and details) (4) Input as ioHub

Pupil Capture was turned on throughout the experiment.

Did I miss anything?

P.S.: I recorded pilot data with the same settings a couple of hours before the error. It was working fine. But after a couple of hours it started throwing the error.

papr 13 October, 2022, 10:22:34

When you run the experiment, the psychopy window should minimize and show the Pupil Capture calibration. i.e. (1) should not be necessary. Generally, these steps look correct. You can check out the Network API menu in Capture to verify that the address/port number did not change.

user-f590a4 13 October, 2022, 10:30:16

I have a fix for you! As expected, a very quick 'this thing isn't in this folder' bug was causing your issue! This will be fixed automatically in the next release of PsychoPy, but so that you can get up and running now you'll just need to follow a couple of steps:

(1) Please save an unzipped copy of the folder I've attached here to your desktop
(2) Navigate to: C:\Program Files\PsychoPy\Lib\site-packages\psychopy\iohub\devices\eyetracker\hw\pupil_labs (please note this assumes that you've installed PsychoPy to your Program Files. If you've saved it somewhere else, please navigate via that path instead)
(3) Replace the entire pupil_labs folder that you have there currently with the pupil_labs folder that you've now just saved to your desktop. Please make sure that the folder is named exactly the same as the previous one.

The module should now work as expected.

papr 13 October, 2022, 10:31:49

Nice! Thanks for sharing!

user-6e1219 13 October, 2022, 10:57:49

Thank you so much @user-f590a4

user-f590a4 13 October, 2022, 10:30:43

pupil_labs.zip

user-f590a4 13 October, 2022, 10:31:42

i hope this will work for you as well 🙂

user-f590a4 13 October, 2022, 10:32:05

no problem 🙂

user-660f48 14 October, 2022, 11:32:13

I just have a very dumb question, but is there a pdf version of the user guide? I could certainly use a pdf copy of it

nmt 14 October, 2022, 18:04:45

Hi @user-660f48 👋. We don't have a complete pdf version of the user guide - everything is contained in the docs.pupil-labs.com pages

user-660f48 14 October, 2022, 17:02:15

So is that a no?

user-746d07 18 October, 2022, 06:42:15

Hello. Can I use Pupil core even if I wear glasses? If not, can I use pupil core if I wear contact lenses?

papr 18 October, 2022, 06:53:17

Hey! Using it with glasses is tricky. With contact lenses, it is no problem.

user-3c006d 18 October, 2022, 07:45:08

Hey, I ran some of the plugins and had some detections done. After finishing this, the data was exported automatically... but ehm. How can I open these files? What do they tell me? I am a bit confused. I also tried to lay a heatmap over the video but it didn't work.

nmt 18 October, 2022, 07:50:25

Hi @user-3c006d. Data are exported as .csv files which you can open in, e.g., spreadsheet software. Check out the online docs for an overview of analysis plugins and data exports: Analysis Plugins - https://docs.pupil-labs.com/core/software/pupil-player/#plugins Raw data exporter - https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter

nmt 18 October, 2022, 07:52:52

To generate heatmaps, you'll need to use the Surface Tracker Plugin: https://docs.pupil-labs.com/core/software/pupil-player/#surface-tracker

user-3c006d 18 October, 2022, 07:53:39

I have that plugin and I tried to start it, but it told me that it wasn't possible to add any new surfaces

user-3c006d 18 October, 2022, 07:54:24

I will try again with the guide you just sent me

user-3c006d 18 October, 2022, 07:54:39

I might come back to you. If not, I was successful 😄

user-3c006d 18 October, 2022, 08:23:49

Okay, I already failed at adding a new surface. It keeps telling me that there are no markers in the image, so I cannot add a new surface

user-d407c1 18 October, 2022, 08:43:37

Hi @user-3c006d, are you using the right family of markers https://docs.pupil-labs.com/core/software/pupil-capture/#markers ? If so, please keep in mind that you will need at least 3 markers to properly detect a surface. Also, if you are printing them, please consider that you should include some white margins so that they can be properly detected.

user-3c006d 18 October, 2022, 08:45:09

okay, I might see my problem. I didn't print any markers and put them around the screen. So of course I don't have any markers in my video/image. Ergo, I cannot generate a heatmap.

nmt 18 October, 2022, 08:55:01

Markers are of course a prerequisite: https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking. We always recommend reading the docs and piloting prior to a testing session, but appreciate that ideas for data analysis often arise post-hoc 🙂. Feel free to ask questions on here in prep for any future testing!

user-3c006d 18 October, 2022, 08:45:34

Would have been helpful to know that before. Now it is too late, as I cannot run that experiment again. Too sad.

user-3c006d 18 October, 2022, 08:46:19

I just thought I could select a frame in the video which I set as a "marker".

user-3c006d 18 October, 2022, 08:56:38

I remember that the Pupil Core came without a physical handbook. That might be something! I know you have everything on your homepage, but maybe you could send a little checklist or similar with the glasses, so people initially know what to do before testing.

user-3c006d 18 October, 2022, 08:57:04

Now I know better. Which unfortunately doesn't help me very much 😦

papr 18 October, 2022, 09:00:01

@user-3c006d Not sure if this third-party plugin will work with the latest Player version, and how well it would work for your use case, but it might be worth a try https://github.com/cpicanco/pupil-plugin-and-example-data

user-3c006d 18 October, 2022, 08:59:52

Or maybe differently: I just saw that there is a Pupil Cloud. That sounded interesting to me, but is it only for Pupil Invisible? It sounded easier for handling and analysing data. Maybe you could present a kind of little questionnaire before people actually buy/decide which Pupil glasses are best for their purpose, and based on that questionnaire you could recommend Core or Invisible.

user-3c006d 18 October, 2022, 09:00:52

thank you!

user-3c006d 18 October, 2022, 09:25:13

Hm, it seems that this plugin doesn't work anymore. I followed the steps and copied the file to the plugin folder, but the "screen tracker offline" plugin doesn't appear in Pupil Player

nmt 18 October, 2022, 09:31:09

Looks like the plugin hasn't been updated for a while now, so that's not totally unexpected

nmt 18 October, 2022, 09:40:12

Yes, Pupil Cloud is only compatible with Pupil Invisible. Invisible certainly has benefits for a lot of applications, and it is very easy to use. If you're interested in learning more about Invisible, you can reach out to [email removed]

user-3c006d 18 October, 2022, 09:41:37

well, as I said, I cannot run the experiment again, so now I have to work with the data I have from Core.

user-3c006d 18 October, 2022, 09:43:09

are there any other third party plugins I could try? Any recommendations?

user-6e1219 18 October, 2022, 12:39:35

Hello, I am trying to analyze an HDF5 file recorded through PsychoPy and Pupil Core, but I am getting no values. What could be the possible reason for that?

(1) I have clicked on the 'hdf5' check mark in the PsychoPy settings

Chat image

papr 18 October, 2022, 12:40:28

Let's continue discussing this here https://discord.com/channels/285728493612957698/973685431495426148

user-19daaa 18 October, 2022, 19:31:07

Hi! @papr I hope you are doing great. While exporting data through Pupil Player or the API, I noticed that the timestamps for the pupil data and the timestamps for the gaze data are different. Do you know any specific reason behind this? Is it possible, or is there any way, to get both pupil and gaze data with the same timestamps?

papr 18 October, 2022, 19:36:41

Binocular gaze samples inherit the mean timestamp of their two Base datums. Check out the base_data field for the corresponding Base timestamps
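
The relationship papr describes can be checked directly against the exported data; a tiny sketch (the helper name is my own, and I assume each base pupil datum carries a `timestamp` key, as in Pupil's data format):

```python
def binocular_gaze_timestamp(base_data):
    """Reconstruct a binocular gaze datum's timestamp as the mean of its
    base pupil datums' timestamps (one per eye)."""
    return sum(d["timestamp"] for d in base_data) / len(base_data)
```

This is why pupil and gaze timestamps differ by fractions of a second: each binocular gaze sample sits halfway between its two contributing pupil samples.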

user-19daaa 18 October, 2022, 19:32:50

The differences are not in seconds but in milliseconds.

user-b2d705 20 October, 2022, 06:10:12

Hi, I wanted to ask how I can find saccades from the data exported (using Pupil Player). I am able to find the fixations, eye blinks, and gaze positions CSVs. I tried looking online but was unable to find any plugin or code for this. A humble request to please guide me on how to find saccades.

nmt 20 October, 2022, 07:43:08

Hi @user-b2d705 👋. We don't classify saccades in our software. Check out this message for further details: https://discord.com/channels/285728493612957698/285728493612957698/831795883007016981

user-dfd400 20 October, 2022, 07:34:51

Hello, everyone. In "gaze_positions.csv", I'm trying to project "gaze_point_3d_x", "gaze_point_3d_y", and "gaze_point_3d_z" to the 2d image according to the camera intrinsics. However, I found the point (yellow point) did not match the result (green circle) from Pupil Player. But the point drawn from "norm_pos_x" and "norm_pos_y" does match the result from Pupil Player. Does that mean the 3d gaze point is different from the 2d gaze point? Or is the 2d gaze point not obtained by projecting the 3d gaze point? Thanks.

Chat image

nmt 20 October, 2022, 07:47:01

Hi @user-dfd400. gaze_point_3d is the intersection of both eyes' direction vectors (gaze_normal0/1). The 3d point is projected onto the camera image plane using the camera intrinsics, thus yielding 2d gaze (norm_pos). View the code for binocular case here: https://github.com/pupil-labs/pupil/blob/eb8c2324f3fd558858ce33f3816972d93e02fcc6/pupil_src/shared_modules/gaze_mapping/gazer_3d/gazer_headset.py#L301
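
The projection nmt describes can be sketched as a plain pinhole projection. This simplified version ignores lens distortion (Pupil's own pipeline additionally applies the distortion coefficients, which is one reason a naive projection may not line up exactly), and assumes the usual 3x3 camera matrix layout:

```python
import numpy as np

def project_to_norm_pos(gaze_point_3d, camera_matrix, frame_size):
    """Project a 3-D gaze point (scene-camera coordinates) to Pupil-style
    normalised image coordinates, ignoring lens distortion."""
    x, y, z = gaze_point_3d
    u = camera_matrix[0, 0] * x / z + camera_matrix[0, 2]  # pixel x
    v = camera_matrix[1, 1] * y / z + camera_matrix[1, 2]  # pixel y
    width, height = frame_size
    # Pupil's norm_pos uses a bottom-left origin, so flip the y axis.
    return u / width, 1.0 - v / height
```

With the real intrinsics (including distortion), `cv2.projectPoints` reproduces the exported `norm_pos` values much more closely.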

user-b2d705 20 October, 2022, 10:59:52

Is there anything else that can be used?

nmt 20 October, 2022, 12:12:03

Pupil Core exposes a lot of raw data that you could use to implement your own classifier. Check out this message for reference: https://discord.com/channels/285728493612957698/285728493612957698/936235912696852490
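
As a starting point for such a classifier, here is a minimal velocity-threshold (I-VT, after Salvucci & Goldberg) sketch over exported gaze positions. It is illustrative only: the threshold below is in norm-pos units per second, whereas a proper implementation would convert gaze to degrees of visual angle first:

```python
import numpy as np

def ivt_saccades(norm_x, norm_y, timestamps, velocity_threshold):
    """Label inter-sample intervals as saccadic when the point-to-point
    gaze velocity exceeds a threshold (I-VT scheme).

    Returns (velocity, is_saccade), each of length len(timestamps) - 1.
    """
    dx = np.diff(norm_x)
    dy = np.diff(norm_y)
    dt = np.diff(timestamps)
    velocity = np.hypot(dx, dy) / dt        # displacement per second
    is_saccade = velocity > velocity_threshold
    return velocity, is_saccade
```

Consecutive `True` samples can then be merged into saccade events, and low-confidence samples (e.g. blinks) should be masked out beforehand.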

user-80123a 21 October, 2022, 08:29:10

Hello all, I would like to know if the following is possible with Pupil Capture + Surface Tracker + Pupil Player. In my first experiment, I asked participants to view objects projected with a video projector and recorded the participants' gaze direction. After the experiment, I collected the data and compared the positions of the projected objects with the gaze directions. In my second experiment, I will ask the participants to view projected 3D objects. The 3D projected objects will be at the same position (x, y), but with different depths (near/far). Could Pupil Capture + Surface Tracker + Pupil Player handle that? The problem is that Pupil Player only records the norm position (x, y), without z. Do you have any suggestions? Thanks in advance 🙂

user-d407c1 21 October, 2022, 13:37:30

How would you project those 3D objects? Like anaglyphs/shutter glasses? Is the physical distance changing or only the perceived one? What would the distances be?

user-80123a 21 October, 2022, 13:39:35

The distance will be the same, but not the perception

user-80123a 21 October, 2022, 13:42:27

The distance will be around 2 meters

user-80123a 21 October, 2022, 14:12:50

Thank you! About the accuracy: in the case of normal usage (no 3D), just some projected 2D objects, will the accuracy be enough, given that I do not make an extra computation to find z, even if the distance is still around 2 m?

user-d407c1 24 October, 2022, 09:28:01

Hi @user-80123a! To further help you, it would probably be best if you could share your hypothesis or end goal here. For example, do you plan to compare convergence between those conditions? How does gaze direction change? Do you want to measure stereopsis?

user-908b50 21 October, 2022, 18:23:38

Hi all, I've been meaning to calculate saccades in my dataset. I have been looking at the messages in Discord and found Teresa Canasbajo's open-source code and Salvucci & Goldberg's paper on the different types of algorithms to detect fixations and saccades. So, my question is: do we absolutely need blinks for the code to work? We only collected data from the right eye. Is it possible to calculate blinks if you have only right-eye data? Also, we used version 1.11.4 to collect data and version 2.5.0 to process this data. Are we affected by some sort of data bug (I just saw the release notes) that will affect our 3d circle radius values in pupil positions?

papr 24 October, 2022, 08:47:46

Hi, can you link me to the particular release that mentions the bug?

user-d407c1 24 October, 2022, 11:33:24

@user-80123a I have this for you to get you going. This is how you can compute the Z component on a screen based on the image disparities from the right (OD) and left (OS) eye; you can use a similar principle by assuming that the gaze coordinates from the left and right eye on the surface represent the left and right image. I hope this helps.

Chat image

user-80123a 24 October, 2022, 11:59:48

@user-d407c1 Thank you :), I will use this formula after finding the coordinates from left and right eye on the surface.

user-d407c1 24 October, 2022, 11:36:20

With the second assumption I meant that you will need to change "z" in the formula to be more accurate, as the distance from the target to the subject may vary (especially if the screen or wall it is projected on is big). Objects in the corners will be further away than if they were presented in the center (assuming the participant is centred).
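
Since the actual formula is only available in the image above, here is an illustrative similar-triangles sketch of the same idea (my own derivation, not the exact formula from the image): two gaze rays from eyes separated by the interocular distance, intersected behind or in front of the screen plane:

```python
def vergence_depth(x_left, x_right, ipd, screen_distance):
    """Estimate fixation depth from the left/right gaze x-positions on a
    screen (all values in the same metric units, e.g. mm).

    ipd: interocular distance; screen_distance: eye-to-screen distance.
    Derived by intersecting the two gaze rays (similar triangles).
    """
    disparity = x_right - x_left  # zero when fixating on the screen plane
    return ipd * screen_distance / (ipd - disparity)
```

With zero disparity the estimated depth equals the screen distance; crossed disparity (fixation nearer than the screen) yields a smaller Z.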

user-20283e 25 October, 2022, 13:07:10

Hi all, I installed everything to have the pupil data going to LSL; apparently everything was going well. However, when pulling chunks from Matlab I always have a lot of NaN values and therefore missing data. Any suggestion on what to change to fix this? The attached figure is just a plot of all raw values from the LSL pupil_capture stream against its timestamps. I really appreciate your help!

Chat image

papr 25 October, 2022, 14:30:52

Could you share the xdf file with me so that I can have a look?

user-80123a 25 October, 2022, 14:23:30

Hello again, I would like to ask how to get the 3D data (X, Y, Z) with Pupil Player when using the dual-monocular plugin. I already selected Dual-monocular 3D in Pupil Player and selected the Post-Hoc Gaze Calibration. I did not see a difference when I activated or deactivated the plugin. Thanks in advance.

papr 25 October, 2022, 14:25:51

Did you redo the calibration and recompute the gaze mapping after switching to dual-monocular?

user-b2d705 26 October, 2022, 05:28:23

Hi, I wanted to ask: in the gaze_positions.csv generated after export, in which unit are the gaze_point_3d_x coordinates? I want to calculate the distance between them, so will the distance be in cm? Also, in fixations.csv, what is the unit of the duration column (milliseconds)?

user-bbd687 26 October, 2022, 05:53:07

Hello, I have a problem. I don't know how to set the software so that the red dot is displayed again, i.e. so the red dot shows up on the screen in the world cam video. @papr

papr 26 October, 2022, 06:12:16

Go to the general settings and click the Restart with defaults button

papr 26 October, 2022, 06:13:41

In Capture, to get gaze, you need to run the calibration first. You can find the getting started guide in our online documentation

user-bbd687 26 October, 2022, 05:54:16

I used an eye cam and a world cam.

user-bbd687 26 October, 2022, 05:55:48

Would you please provide a guide for using the pupil-capture software for the first time?

user-bbd687 26 October, 2022, 05:56:00

@papr

user-bbd687 26 October, 2022, 05:56:26

@@user-c8a63d@user-53a8c4@user-a28f3d

papr 26 October, 2022, 06:11:38

Please refrain from tagging a lot of people. Please only tag someone e.g. if you are referring to somebody that you are in an active conversation with.

user-bbd687 26 October, 2022, 06:07:50

@user-b2d705 Hi, did you make the instrument yourself or buy it?

user-bbd687 26 October, 2022, 06:12:51

Sorry, I seldom use this software; I don't know how to use it very well

user-bbd687 26 October, 2022, 06:16:50

I only need to use one world cam and one eye cam. Are there any recommended settings for the Capture software? I just need to get the position of the red dot. @papr

papr 26 October, 2022, 06:17:42

Default settings are sufficient for this

user-bbd687 26 October, 2022, 06:21:20

I would like to confirm that the default configuration of Capture is OK once the software is installed.

Here are the steps I will take in the future:

  1. I choose to open the world cam and eye 1 / eye 0

  2. Press "C" in the software for calibration

  3. After the calibration is completed, a red dot will appear on the screen

papr 26 October, 2022, 06:23:06

Those are the steps within the software. But you might also need to adjust the camera positions to ensure pupil detection (the input for the calibration) works as expected

user-bbd687 26 October, 2022, 06:21:30

@papr

user-bbd687 26 October, 2022, 06:24:05

@papr ok, thank you!!!

user-bbd687 26 October, 2022, 06:35:41

The software doesn't recognize the pupil, doesn't show the red circle. @papr

Chat image

papr 26 October, 2022, 06:41:19

https://docs.pupil-labs.com/core/#_3-check-pupil-detection please see the videos here.

papr 26 October, 2022, 06:42:50

To improve the result, you might also want to change the eye window's region of interest. Go to the eye window settings, set the mode to ROI, and drag the rectangle such that it excludes any dark areas that are not your pupil.

user-bbd687 26 October, 2022, 06:42:32

Yes, I'm trying to identify the pupil

Chat image

user-bbd687 26 October, 2022, 06:45:39

This is the setting of my eye 1, is it correct? @papr

Chat image Chat image Chat image

user-bbd687 26 October, 2022, 06:47:18

Chat image

user-183822 26 October, 2022, 07:06:27

hello.. how many seconds are in the top 100 rows of the pupil CSV file? Is it seconds or milliseconds?

Chat image

papr 26 October, 2022, 07:08:12

These values are in seconds 🙂

user-bbd687 26 October, 2022, 07:57:51

@papr 👍👍👍 Thank you very much!!!

user-bd1280 26 October, 2022, 13:36:45

How can I set "Vis Polyline" (shown as a green circle with a red dot in the middle) to show only one eye's movement (left or right eye)?

Chat image

papr 26 October, 2022, 13:37:52

That is currently not possible 😕

user-d90133 26 October, 2022, 17:02:35

@marc is there a way to view calibration accuracy and precision after having recorded trials for a subject? I forgot to record a subject's angular precision and I'm trying to see if I can still find that value

papr 27 October, 2022, 09:00:26

Hey, if you recorded the calibration, you can recalculate them with Player's post-hoc calibration plugin https://docs.pupil-labs.com/core/software/pupil-player/#gaze-data-and-post-hoc-calibration

user-b9005d 26 October, 2022, 18:06:07

Does the 3d eye model assume a certain radius constant? If so, is there a way for us to change that value prior to or after recording a session?

papr 27 October, 2022, 09:04:40

It does. https://github.com/pupil-labs/pye3d-detector/blob/master/pye3d/constants.py#L3 and https://github.com/pupil-labs/pye3d-detector/blob/master/pye3d/cpp/pupil_detection_3d.pyx#L15

The software is currently not designed to have that value changed on the fly. You would need to change those locations and reinstall the package.

user-ed360e 27 October, 2022, 16:50:11

Just to be sure, the Pupil Core software doesn't work with Win 7 or 8, right?

papr 27 October, 2022, 17:07:27

Correct. Windows 10 is required.

user-b9005d 27 October, 2022, 18:41:20

In the pupil positions output csv, do the circle_3d_normal values relate to positions in the world camera image?
https://docs.pupil-labs.com/core/terminology/#coordinate-system I noticed from your 3d Eye Model terminology that looking in certain directions relates to a change in x and y. Are these changes specific to the eye cameras, or do they have any correlation to the world camera image? E.g., is looking at the center of the world camera image equivalent to a (0,0) coordinate in circle_3d_normal_x and y?

nmt 28 October, 2022, 07:43:53

Hi @user-b9005d 👋. circle_3d_normal is provided in 'eye camera' coordinates. Essentially, all data contained in pupil_positions.csv are in eye camera coordinates, whilst those in gaze_positions.csv are in scene camera coordinates. Check out this page for an overview: https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter

End of October archive