๐Ÿ‘ core


user-5562d7 03 October, 2023, 14:31:54

Hi! I would like some information. Is the HTC VIVE Binocular Add-on compatible with the XR Elite headset?

user-480f4c 03 October, 2023, 14:36:49

Hi @user-5562d7 👋🏽! Our add-on is only compatible with the HTC Vive, Vive Pro, and Vive Cosmos. However, you could opt for a de-cased version of Pupil Core (cameras & cable tree), but you would need to mount it in your VR headset yourself. If you are interested, please reach out to [email removed]

Alternatively, VR/AR prototyping is possible with Neon (https://pupil-labs.com/products/neon/) + our 'Bare Metal' kit (https://pupil-labs.com/cart/?ne_bm=1), but this is not a turnkey solution. Mounts would need to be prototyped to install it according to the geometries and constraints of the headset. We have decided to open-source the nest and Neon module .step files to assist users in this process (see here: https://github.com/pupil-labs/neon-geometry).

user-5562d7 03 October, 2023, 14:51:52

Thanks! If I purchase one or multiple add-ons from you, I need to calculate the customs fees, since I live in Canada (Quebec). To simulate the amount of taxes, I need to provide the type of material. Among the choices are: "cameras", "video consoles", "video games", "printers", "software", "computers", "batteries", "tablets and e-readers", "phones", "televisions/projectors/monitors". In which category do VR headset add-ons fall? Thank you.

user-480f4c 03 October, 2023, 15:00:04

Could you please contact sales@pupil-labs.com in this regard?

user-5562d7 03 October, 2023, 15:00:41

OK; Thanks!

user-16d3e2 04 October, 2023, 19:55:02

Dear PupilLabs Team,

I hope this message finds you well. Our names are Noah and Layne, and we are working on a research project on enhancing road safety through eye-tracking technology.

We are interested in knowing if PupilLabs is interested in exploring the possibility of incorporating their eye-tracking solutions into our research. Could we connect with a representative to discuss potential collaboration opportunities and pricing details?

Thank you for your time, and we look forward to your response.

Layne and Noah

user-480f4c 05 October, 2023, 06:28:46

Hi @user-16d3e2 and Layne 👋🏽! I understand you also contacted us via email, right? If so, we will reply there as soon as possible!

user-583940 05 October, 2023, 09:05:32

Dear PupilLabs Team, my name is George and I have a very stupid yet simple question about the Pupil Core open source code: how do I run it? I have read some instructions about the required virtual environment, yet I am unable to run it from source. Are there other, more complete instructions that I am unable to find? Thank you

user-d407c1 05 October, 2023, 09:20:48

Hi George! Is there any specific reason why you would like to run it from the source code rather than the pre-compiled version? Otherwise, could you please detail where you are stuck and what operating system you have?

user-8a20ba 05 October, 2023, 15:55:54

Hi, I wonder if the Pupil Core uses Infrared Oculography or Video-Oculography (VOG)?

user-cdcab0 05 October, 2023, 22:15:25

Pupil Core uses infrared light to illuminate and capture images of the eyes

user-bda2e6 05 October, 2023, 19:10:36

Hi, I have a Pupil Invisible and I'm trying to export my recordings. I don't have the gaze.csv in my exported files. Is this something I have to generate manually?

user-cdcab0 05 October, 2023, 22:11:14

Greetings, @user-bda2e6 ! Do you have the Raw Data Exporter plugin enabled? (https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter)

user-014bf8 06 October, 2023, 00:04:44

Hi, I'm trying to stream the video and data from a Raspberry Pi to another computer. The whole system was developed and worked perfectly before, but after sitting unused for months, it now fails to stream video. I can still start Pupil Capture from the other computer, but no videos are received. Please see the image attached.

Chat image

user-014bf8 06 October, 2023, 00:05:19

Any tips would be greatly appreciated!

user-572e3e 06 October, 2023, 02:13:24

Hi developers. Thank you for developing this amazing piece of hardware. We are integrating this eye tracker into our driving simulator. In plain language, the eye gaze data is used to manipulate the user interface on the windshield of the self-driving car. We have defined multiple areas of interest (i.e., surfaces for the gaze mapper). The data retrieved has a lag of a couple of seconds, and as a result there is a lag in the user interface. We have a high-performance PC: 64 GB RAM, an Intel i9 12-core CPU, and an RTX 3080 graphics card (10 GB). Is this a performance issue resulting from insufficient hardware, or is it because we have defined multiple surfaces (approx. 8), leading to a lag in the data being sent to the subscribers? Let me know if you have any questions (ping me). I would really appreciate your response.

Edit: we have 14 QR code trackers in our environment.
Edit: I am running the Pupil eye tracker from source code, and we have made a small modification to the data type of the data being sent to the subscribers (only for a specific surface). We are aware that this could decrease performance, but we think it is not the major contributing factor.
Edit: I have overlapping surfaces too.

user-572e3e 06 October, 2023, 09:19:43

Sorry for pinging you like this, @user-cdcab0. Are you able to determine the cause? We have actually been stuck for days in development.

user-b03a4c 06 October, 2023, 05:03:12

Hi, my name is Ken. When I transfer recordings via USB, how can I get the "Gaze Overlay" movie (mp4 file) as shown in the Neon Companion application?

user-480f4c 06 October, 2023, 06:42:19

Hi @user-b03a4c 👋🏽! Just to clarify: are you using Pupil Core or Neon?

user-572e3e 06 October, 2023, 09:34:24

The driving simulator software we are using, btw, is very computationally expensive.

user-cdcab0 06 October, 2023, 09:35:46

Have you examined the CPU load and RAM usage while running both Pupil Core and the driving sim? If you're maxing out the system, you may consider moving Pupil Core to a separate machine and streaming the data over the network
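
If it comes to that, here is a minimal sketch of receiving surface-mapped gaze on a second machine via Pupil Core's ZMQ/msgpack network API; the IP address is illustrative:

import msgpack
import zmq

# Connect to Pupil Remote on the machine running Capture (IP illustrative)
ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://192.168.1.50:50020')

# Ask Pupil Remote for the port of the data publisher
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

# Subscribe to surface-mapped gaze ('gaze.' would give raw gaze instead)
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f'tcp://192.168.1.50:{sub_port}')
subscriber.subscribe('surfaces.')

while True:
    topic, payload = subscriber.recv_multipart()
    datum = msgpack.unpackb(payload)
    print(topic.decode(), datum['timestamp'])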

user-572e3e 06 October, 2023, 09:40:13

Thank you for your response. I will get back to you when I go back to my research lab and confirm if that is the case.

user-b1befe 06 October, 2023, 18:49:54

Hello

user-b1befe 06 October, 2023, 18:51:19

I have a question about the pupil labs eye camera. We want to connect it to an MCU, and we were wondering what cable it uses for data transfer?

nmt 07 October, 2023, 03:33:12

Hi @user-b1befe! Could you describe your set up in more detail? E.g. do you want to connect a single eye camera to your MCU?

user-4bc389 07 October, 2023, 04:08:20

Hi. I would like to use a USB extension cable to extend the length of the Pupil Core data cable. Is this okay? If possible, are there any recommended USB extension cables? Thanks

nmt 09 October, 2023, 03:03:33

Hey @user-4bc389! Yes, this is possible using active USB3.0 hubs as repeaters. With this approach we have achieved cable lengths of around 8 meters. I don't have a particular brand to recommend, but I'd probably go for something of reasonable quality.

user-5346ee 08 October, 2023, 19:30:41

Hello Pupil team, recently we ran into several cases where Pupil Capture keeps detecting tiny "pupils" on participants' eyelashes and eyelids (see pics). The problem is that fake pupils can be detected even when the real pupil is visible. We initially thought that it could be women's mascara confusing the algorithm, but we've observed this in cases where the participant did not apply make-up. Note: the experiment environment is a low-light environment (as we want to measure pupil dilation) and the computer screen is the brightest source of light. I tried to change the ROI, but it does not always work. Can you give us some advice on how to overcome this? Thanks.

Chat image Chat image

user-cdcab0 08 October, 2023, 23:32:29

Looks like this participant is wearing mascara, and the only work-around for that is not to wear it (or eye makeup in general)

user-b1befe 08 October, 2023, 20:40:05

We are considering a Raspberry Pi MCU.

user-cdcab0 08 October, 2023, 23:43:28

For the sake of clarity, the Raspberry Pi is not an MCU. It's a single-board computer (SBC). It consists of a CPU, RAM, and storage, like a very small PC. As such, these types of devices run typical mainstream operating systems (like Linux, Android, Windows, etc.). They also typically have similar I/O capabilities to desktop PCs, like USB, HDMI, Ethernet, etc., plus sometimes general-purpose I/O (GPIO) and bus interfaces like SPI, I2C, etc.

Microcontrollers (MCU) are a separate class of devices. They typically have significantly less memory and do not run full-blown operating systems - instead only running a solitary program. For connectivity, these typically are limited to GPIO and bus interfaces like those mentioned previously - so usually no USB, no HDMI, no ethernet, etc.

The Raspberry Pi is a single-board computer (SBC), and the Raspberry Pi Pico is a microcontroller (MCU)

user-b1befe 08 October, 2023, 23:49:03

Hi Dom, thank you for the clarification. If i were to have an MCU, how would I connect the eye cameras to it?

user-b1befe 09 October, 2023, 00:23:49

More specifically, what kind of wire connection does the eye camera use to be connected to an MCU or SBC?

nmt 09 October, 2023, 02:52:25

Core eye cameras + MCU

nmt 09 October, 2023, 02:58:33

Hi @user-5346ee! To expand on my colleague's answer, mascara can influence the tear film, exacerbating reflections that appear in the pupil images. This can negatively impact pupil detection.

That said, there does seem to be decent contrast between the iris and pupil in the image you've posted.

Would you be able to share an example recording with us such that we can provide more concrete feedback? You can do so via [email removed] Please share the entire recording directory.

user-5346ee 10 October, 2023, 12:42:56

Will do later

user-b85e9c 09 October, 2023, 05:13:00

I am still a little confused about this. I do not have a gaze_positions.csv, only a gaze.pldata, in which I see "gaze_point_3d" and "eye_center_3d". Are these the same as gaze_point_3d_x etc. described in https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter?

And is gaze_point_3d_z - the z position of the 3d gaze point - along the optical axis?

user-cdcab0 09 October, 2023, 05:41:29

Ah, it sounds like you have a recording, but haven't exported data from it (https://docs.pupil-labs.com/core/#_8-export-data). In short, open your recording in Pupil Player, enable the Raw Data Exporter plugin, and then hit the export button. This will make a new folder inside your recording named "Exports", and inside that a numbered folder for each export you do. You'll find the exported data much easier to work with than the .pldata files, which are really just an intermediary format.

The camera intrinsics file is just an msgpack-ed dictionary:

import msgpack

# Path to the recording's world camera intrinsics file
file_path = 'path/to/world.intrinsics'

# The file is a single msgpack-encoded dictionary;
# strict_map_key=False permits non-string dictionary keys
with open(file_path, 'rb') as file_handle:
    world_intrinsics = msgpack.unpack(file_handle, strict_map_key=False)

print(world_intrinsics)
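
Once exported, the CSV answers the field question directly. A minimal sketch reading it with pandas; the export path is illustrative, and the column names follow the Raw Data Exporter documentation:

import pandas as pd

# Path depends on your recording and export number (illustrative)
gaze = pd.read_csv('path/to/recording/exports/000/gaze_positions.csv')

# gaze_point_3d_* is given in scene ("world") camera coordinates, where
# z points forward along the camera's optical axis
print(gaze[['gaze_timestamp', 'gaze_point_3d_x',
            'gaze_point_3d_y', 'gaze_point_3d_z']].head())
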
user-b85e9c 09 October, 2023, 05:14:54

Also, how do I open the world.intrinsics file? I assume this is where the camera intrinsics are located

user-2fc592 10 October, 2023, 03:26:36

Hello Pupil team, after analyzing some Pupil Invisible recordings and exporting them into iMotions, it has come to our attention that the gaze maps are off-center for many of the recordings. There is nothing in iMotions that can be done to fix this, but I was wondering if there is a way to correct the off-center data in the original recordings in Pupil Cloud? Is there a specific feature that would let me pull the gaze map off the videos or alter the positioning of the gaze map on the video? Thanks!

user-870276 10 October, 2023, 18:23:59

Why is this popping up? My macOS version is Sonoma 14.0

Chat image

user-cdcab0 10 October, 2023, 19:12:12

Hi, @user-870276 - for technical reasons, Pupil Capture on macOS 12 and newer has to be launched with elevated privileges. More information and instructions are here: https://docs.pupil-labs.com/core/software/pupil-capture/#macos-12-monterey-and-newer

user-870276 19 October, 2023, 15:05:12

thank you so much @user-cdcab0

user-d407c1 11 October, 2023, 05:25:09

Hi @user-6072e0 ! One way is to use the surface output image (the detected one) to perform the OCR, effectively maintaining the same coordinates. Ain't that an option for you?

user-6072e0 21 October, 2023, 10:23:54

Oh sorry, I didn't know there was a surface output image hahaha 😅😅. I have defined the surface and recorded it like the image below, but there is no gaze_positions_on_surface_<surface_name>.csv file in the recording folder. How do I get it? Thank you for answering

Chat image

user-673021 11 October, 2023, 11:19:46

setting with movement involved

user-b44c7c 11 October, 2023, 16:23:12

We are analyzing eye-tracking data that was recorded using your AR/VR add-on for a VR simulation users are playing through an HTC Vive Cosmos. We want to be able to generate heat maps using the surface tracker plugin in Pupil Player but we are having difficulty enabling that feature. Specifically, it says we cannot add surfaces. Are we doing something wrong? Can we get more insight into how to generate heat maps using Pupil Player from a pre-recorded dataset?

user-c2d375 11 October, 2023, 16:42:38

Hi @user-b44c7c 👋 The Surface Tracker plugin (https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking) in Pupil Player needs apriltag markers to be present in the scene in order to define a surface and generate a heatmap. First things first: did you have these apriltag markers in your VR scene? And if you did, how many of them did you use?

user-eb9296 12 October, 2023, 16:53:08

@user-c2d375 I'm with Julianna responding to your query above. We did not have Apriltag markers in the VR scene. How do we set these tags up? Is it something that can be done only before the recording?

user-c2d375 13 October, 2023, 07:57:00

Hi @user-eb9296 👋 Yes, if you intend to utilize the Surface Tracker plugin and define AOIs within your VR scene, it's crucial to position those markers in the scene prior to starting the recording. Unfortunately, it is not possible to add those post-hoc. For detailed instructions on accessing the apriltag markers collection and preparing your environment to use this plugin, please refer to our documentation (https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking).
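
To sanity-check marker visibility in a captured scene frame offline, here is a minimal sketch using the pupil-apriltags package; the image path is illustrative, and tag36h11 is the marker family recommended in the docs:

import cv2
from pupil_apriltags import Detector

# Load a scene frame as grayscale (path illustrative)
frame = cv2.imread('scene_frame.png', cv2.IMREAD_GRAYSCALE)
assert frame is not None, 'image not found'

# Detect markers of the tag36h11 family
detector = Detector(families='tag36h11')
for tag in detector.detect(frame):
    print(tag.tag_id, tag.corners)  # corner coordinates in image pixels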

user-29f76a 13 October, 2023, 09:40:52

Hi, I have a technical issue with the device. The eye tracker does not seem to connect to the app; it doesn't show the display for the glasses, and no data were collected.

user-cdcab0 13 October, 2023, 10:09:35

Hi, @user-29f76a - if you're on a Mac, you'll need to run Pupil Capture with elevated privileges. See https://docs.pupil-labs.com/core/software/pupil-capture/#macos-12-monterey-and-newer for more information. Otherwise, please tell us a little more about your setup (what operating system, what version of our software, have you used it successfully in the past, etc)

user-29f76a 13 October, 2023, 12:13:20

I use the phone that comes with the eye-tracking glasses (Neon Module and Frame). But the thing is, the glasses cannot connect to the phone @user-cdcab0

user-29f76a 13 October, 2023, 12:15:52

like this

user-b5a8d2 13 October, 2023, 16:28:37

Hi, while collecting data with the phone provided, a recording error shows up, and it seems the files have almost reached the storage limit of the phone. Could I remove the files manually on the phone? Would files already uploaded to Pupil Cloud still stay after I manually remove those files on the phone? Thank you for your help in advance!

nmt 14 October, 2023, 02:38:58

Hi @user-b5a8d2 👋. You can of course delete files from within the Companion App. If they're already uploaded to Pupil Cloud, deleting them from the phone will not delete them from Cloud. It's worth double-checking they're safely backed up in Pupil Cloud first, though!

user-29f76a 13 October, 2023, 17:04:10

Chat image

user-28c7fc 13 October, 2023, 18:25:20

Is there any documentation on how to add events in Pupil Cloud? I can't seem to see how you move the start and end markers. Also, how do you add several event markers to the timeline? Thank you for any advice.

user-daaa64 14 October, 2023, 12:07:56

Hi Team! I have multiple gaze mappers (calibrations) in one video. When choosing the most accurate gaze mapper for my experiment among them, do you recommend going for the one with the best accuracy result in post-hoc validation, or the gaze mapper (calibration) that is temporally closest to my experiment section in the video (so that less change could happen between them, e.g., glasses slippage)? Can you help me please? Thank you very much in advance!!

user-edef2b 15 October, 2023, 03:24:52

Does Pupil Core require that the source of IR illumination is from the LED on the glasses? In other words, would the eye tracking still work if an external IR light was used instead? I figure that traditional eye tracking algorithms that detect the angle between the pupil and the glint from the IR light wouldn't work, but I don't know enough about the Pupil Core algorithm to know if the same problem would occur

user-d407c1 16 October, 2023, 11:17:03

Hi @user-edef2b ! Pupil Core uses IR illuminators that run at a peak wavelength (λ) of ~860 nm (centroid 850 nm ± 42 nm Δλ). These IR illuminators are designed to illuminate the eye region and help capture black-and-white eye images that are used for dark pupil detection. The purpose of these illuminators is to provide enough IR light to the eye so that the pupil detection algorithm can work effectively. You can also use external IR sources, as long as they are not filtered out by the cameras. If you want to learn more about the algorithm employed, you can find the paper here: https://www.perceptualui.org/publications/dierkes18_etra.pdf

user-edef2b 15 October, 2023, 03:47:21

Additionally, what wavelength of IR does the illuminator produce?

user-15e3bc 15 October, 2023, 07:47:36

pupil + hololens2

user-789ddb 15 October, 2023, 14:45:15

Hi Lency,

Happy to help. We could organise a meeting this week and I'll run through my steps?

user-15e3bc 16 October, 2023, 12:51:57

Hi Dregz. That's really cool! It depends on your schedule. I'm free all week :D.

user-4514c3 16 October, 2023, 11:11:54

Hello everyone! We are having problems when calibrating... It cannot find the "stop" marker at a distance of two meters. Has anyone had the same error? Thank you

user-d407c1 16 October, 2023, 11:50:02

Hi @user-4514c3 ! For the markers to be detected robustly, they need to be of an appropriate size (meaning visible to the scene camera at that distance) and well lit (no shadows cast over the marker).

That said, the markers included with Pupil Core's package are meant to be used at 1 to 2.5 m. And if the calibration marker is detected, I do not think the illumination is a problem.

Might it be that the marker is not well printed? You can try and print one here https://docs.pupil-labs.com/core/software/pupil-capture/#calibration-marker

Anyway, feel free to make a recording with the calibration, and share it with us at [email removed] so we can give you more feedback.

PS. You can also stop the calibration by pressing "C" on your keyboard.

user-15e3bc 16 October, 2023, 14:18:14

@user-d407c1 How should I disassemble the Pupil Core? It looks like the cables are fixed inside the skeleton.

user-d407c1 16 October, 2023, 14:19:41

Hi @user-15e3bc ! Why would you like to disassemble Pupil Core? We can offer you a de-cased version if you would like to prototype

user-15e3bc 16 October, 2023, 14:22:22

Sorry, I didn't think about integrating with other AR devices at the time of purchase. Is it easy to provide the connection for the de-cased version then? Or can the cable be supplied separately?

user-15e3bc 16 October, 2023, 14:23:36

I mean the purchase link of the de-cased version

user-d407c1 16 October, 2023, 14:25:19

Please contact sales@pupil-labs.com to get a quote for a VR de-cased version.

user-15e3bc 16 October, 2023, 14:27:01

Okay, thank you 😄

user-870276 19 October, 2023, 15:34:48

Hello everyone,

I have a question regarding testing the pupil tracking system on a subject. After performing the Screen Marker calibration, I observed the following values: angular accuracy of 2.8 and precision of 0.61. However, even after this calibration, the tracker was unable to properly recognize the pupil. I'm wondering if this issue might be related to pupil dilation, as one of the subject's eyes is more dilated than the other.

nmt 20 October, 2023, 02:53:58

Hi @user-870276! Would you be able to share an example screen capture such that we can provide feedback?

user-8a20ba 19 October, 2023, 17:06:24

Hi, I have a question about the eye-tracking method. Is the method Infrared Oculography-based or video-based? Does the Pupil Core use an infrared light source?

user-8a20ba 19 October, 2023, 17:11:39

Is it based on the principle that, if a fixed light source is directed at the eye, the amount of light reflected back to a fixed detector will vary with the eye's position?

nmt 20 October, 2023, 03:02:38

Hi @user-8a20ba 👋. Pupil Core is video-based. We use IR emitters to illuminate the eye region, which is recorded by IR-spectrum cameras, and we employ a 'dark pupil' detection algorithm to segment and track the pupils. Read about it here: https://arxiv.org/abs/1405.0006

user-870276 20 October, 2023, 02:56:30

Chat image

nmt 20 October, 2023, 03:04:39

Yes, I suspect the size of the pupil there is causing an issue. Have you tried increasing the expected pupil size in the 2D detector settings? You can get to that in the eye window settings menu.

user-870276 20 October, 2023, 03:52:10

Hey Neil! Thank you so much! I will try the setting and let you know. Also, are there any constraints when using the Pupil Core for people with low vision or a defective eye? And which device do you suggest I use in this scenario, as I have both Pupil Core and Invisible?

nmt 22 October, 2023, 23:49:09

It actually depends on how you intend to use the systems and what data you're looking to obtain. It might be helpful if you could briefly elaborate on your research aims and outcome metrics of interest. Then I can try to point you in the right direction!

nmt 22 October, 2023, 23:50:58

🤔. I don't see a gaze circle in that screenshot. Just to clarify, your recording does contain gaze data, right? If not, that would explain the empty surface tracking export.

user-6072e0 23 October, 2023, 00:57:18

I believe there was a gaze circle when I tried yesterday, but maybe it's a coincidence that it's not in the screenshot 😅. By gaze data in my recording, you mean gaze_timestamps.npy and gaze.pldata, right? Apart from that, I can't find the gaze_positions_on_surface file; it's not that the file exists but is empty. Just to make it clear: do I just need to press the R button to record, or is there another button in the surface tracker to record and get the file export?

user-cdcab0 23 October, 2023, 05:31:15

Do the cameras each work individually? If they do, and if your computer has more than one USB controller, you might have success putting them on different busses.

user-6072e0 23 October, 2023, 18:09:41

Ok, I will try that 👍

user-8779ef 23 October, 2023, 14:16:49

Hey folks. I believe I remember there is a way to force a monocular gaze map for individuals with strabismus, or who just have a bad track with one eye. Can someone please share that script?

nmt 24 October, 2023, 17:20:35

Hey @user-8779ef! Here is the dual monocular gazer: https://gist.github.com/papr/5e1f0fc9ef464691588b3f3e0e95f350

user-edef2b 24 October, 2023, 01:54:34

What is the intensity of the IR illuminators on Pupil Core (in either W/sr or W/cm^2 on the eye)?

user-ca4e2e 24 October, 2023, 14:39:08

Hello, I am working with a team on an academic project designing a DIY eye-tracking system. We have come across the 120 Hz individual camera for the Pupil Core and are curious if our team can get hold of a data sheet for this camera, to see if it would be beneficial for our application as opposed to the de-cased webcams used in Pupil Labs' DIY instructions.

nmt 24 October, 2023, 17:25:40

Hi @user-ca4e2e 👋. The 120 Hz eye cameras are no longer available. However, we do sell a 200 Hz version, which is an upgrade to the 120 Hz predecessor. Are there specific specifications not listed on the website that you would like to know? You can use a single eye camera for pupillometry, or in combination with a scene camera for monocular gaze tracking. However, for the most robust gaze tracking, we recommend using two eye cameras.

user-ca4e2e 24 October, 2023, 14:42:36

Additionally, can you use a single 120 Hz camera or is it designed to be used in pairs?

user-870276 24 October, 2023, 16:13:14

We have already done experiments on eye and hand tracking with normally sighted people doing their daily activities. Now we want to do the same with people who have low vision or defective eyes, to observe their gaze and fixations while doing daily activities that don't involve drastic movements. In this process we are coming across subjects with bigger pupils or defective eyes, and the Pupil Core isn't able to obtain the gaze and fixation data in these situations. So what would be a better solution for this?

nmt 24 October, 2023, 17:28:10

Thanks for elaborating. Pupil size shouldn't have an impact on Pupil Invisible's gaze output. However, what do you mean by defective eyes?

nmt 24 October, 2023, 16:55:44

Hi @user-edef2b! Check out this message for reference: https://discord.com/channels/285728493612957698/285728493612957698/1163435437445087322

user-edef2b 25 October, 2023, 16:25:27

Hi Neil! That message gave great information about the spectrum of the IR light, but I don't think there was anything about the intensity/brightness of the illuminators. Do you have any info on that or know where I could find it?

user-870276 24 October, 2023, 17:35:53

I meant irregular eyes or eyes with cataracts. And what do you suggest using for this application, Pupil Core or Pupil Invisible?

nmt 24 October, 2023, 17:51:14

To be honest, it's difficult to make a concrete recommendation here as there will be a lot of person-specific characteristics. Would it be feasible for you to try both systems on your participants? The time-cost associated with Invisible will be minimal since it's very easy to set up and use. Perhaps you could try it first and then Core if the output is not sufficiently accurate?

user-59ce16 25 October, 2023, 14:00:40

Hey, regarding Core and Capture, I'm currently refactoring / remaking the LSL plugin to work with surfaces as well. Upon debugging, I noticed that the gaze event list isn't ordered by timestamp. Though I realize that by being timestamped the problem is solvable on the receiver's end, I'd like to send the LSL samples in their correct order. Is there any reason they aren't sorted?

Or does this happen when there is generally low confidence?

user-cdcab0 25 October, 2023, 15:31:48

Hi, @user-59ce16 - I actually have an implementation of exactly this which will be integrated officially after some testing and feedback. Please have a look! https://github.com/domstoppable/App-PupilLabs
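
On the ordering itself: gaze datums carry their own capture timestamps, so order can be restored before relaying. A minimal sketch; `events` is the dict passed to a plugin's recent_events(), and the helper name is illustrative:

# Sort the gaze datums of one recent_events() batch by capture time
def sorted_gaze(events):
    return sorted(events.get('gaze', []), key=lambda datum: datum['timestamp'])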

user-59ce16 25 October, 2023, 15:00:40

Also, is there a way to access a plugin from another plugin? (Not referring to the notifications.)

user-74fd00 25 October, 2023, 15:20:30

Hello, can anyone recommend antiseptic wipes/spray for cleaning the Core headset that is available in the UK? Thanks!

user-d407c1 25 October, 2023, 15:22:57

Hi @user-74fd00 ! Have you seen our recommendations for disinfecting Pupil Core? https://docs.pupil-labs.com/core/hardware/#disinfecting-pupil-core

user-74fd00 25 October, 2023, 15:24:34

Yes, thanks. However, I have been unable to find a product with a similar composition.

user-870276 25 October, 2023, 16:39:39

Hi, here is a link to a recording that I made with Pupil Core. I wanted to know what the reason could be for the poor confidence values and improper gaze detection: https://drive.google.com/file/d/1CwJsn0FUMkMThFFA_K3dFHA3P7B-E-6m/view?usp=drive_link

nmt 25 October, 2023, 20:46:42

Hi @user-870276. Thanks for sharing the screen capture - that helps. Poor pupil detection in this case is because the pupils are occluded. The full pupil needs to be visible for high-quality data. There's no way around that with the pipeline that Pupil Core employs. Perhaps you could make your experimental environment brighter such that the pupils are not so dilated?

Chat image

user-d407c1 26 October, 2023, 19:10:31

@user-79be36 Hi! Your colleague has let us know that we did not respond to one of your questions here on Discord, and that you needed assistance.

Would you mind letting us know what your question is, either here or at info@pupil-labs.com?

user-6f5808 27 October, 2023, 07:52:41

Hello everyone! I just installed pupil capture on linux but it seems the device could not connect. Any suggestions?

nmt 27 October, 2023, 20:49:28

Hi @user-39737f 👋. Please follow these instructions in the first instance: https://docs.pupil-labs.com/core/software/pupil-capture/#linux

user-b1befe 27 October, 2023, 18:37:53

Hello, is there a wiring diagram for the Pupil Core right and left eye sensors? I want to wire the sensors to my own Raspberry Pi SBC. Additionally, what is the power draw for each sensor (how much amperage does each sensor use)?

nmt 27 October, 2023, 20:56:56

Hi @user-b1befe! We don't have a wiring diagram available to share, but you can connect your cameras to the Pi using a USB cable. The standard USB 2.0 power specifications are applicable. May I ask what you plan to do with your Pi? Are you intending to just collect eye camera streams or do you have other plans?

user-59ce16 28 October, 2023, 10:04:18

Regarding the timestamps and the events, do they generally come unordered? Or can it be attributed to low confidence? I typically see timestamps in recent_events that are later than timestamps in the next list of recent_events.

user-b1befe 29 October, 2023, 22:01:55

Hi Neil. Thank you. What is the power draw for each sensor?

user-b1befe 30 October, 2023, 22:02:41

Hey Neil, just wanted to follow up on this. Please let me know.

user-825dca 30 October, 2023, 09:17:23

Hello everyone! I'm using Pupil Core with LSL for an experiment. I collect the data via LSL and via Pupil Capture, using the LSL Relay plugin. I have the gaze positions recorded via LSL in my xdf file, but also the .csv generated by Pupil Player after processing the data acquired via Pupil Capture. I noticed that the timestamp associated with the LSL data sometimes goes backward (fig 1 shows the timestamp diffs with negative values), while it does not in the processed Pupil Player file (fig 2, also timestamp diffs). I therefore wonder 1/ why I have these negative values in the LSL file and 2/ how Pupil Player adjusts the timestamps to obtain only positive diffs? Many thanks for your answers!

Chat image
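
To quantify the negative diffs on the LSL side, a minimal sketch using the pyxdf library; the filename is illustrative:

import numpy as np
import pyxdf

# Load the XDF recording (filename illustrative)
streams, _ = pyxdf.load_xdf('recording.xdf')

for stream in streams:
    name = stream['info']['name'][0]
    diffs = np.diff(stream['time_stamps'])
    print(name, 'negative diffs:', int((diffs < 0).sum()))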

user-cdcab0 31 October, 2023, 06:04:00

Are you using a library for processing the xdf?

user-b1befe 30 October, 2023, 22:26:54

Also, what kind of USB is needed to connect the sensors to the Pi (2.0, 3.0, Type-A, Type-B, etc.)?

nmt 31 October, 2023, 09:00:05

USB 2.0. The connection at the camera is a JST type SH (1.0mm). If you haven't already purchased a Core system and want to buy the cameras individually, we ship a USB-A to JST cable with each camera.

I would also like to reiterate my question about what you're looking to do with the Pi. Are you intending to just collect eye camera streams or do you have other plans?

user-cdcab0 31 October, 2023, 06:11:35

I don't know the individual power draw of the eye cameras, but the combined draw of the eye cameras, their IR illuminators, and the scene camera together has to be < 500mA.

As far as interfacing goes, I believe that if you separate the eye cameras from the rest of the Core headset, you'll be looking at JST-style connectors, which won't directly plug into a Raspberry Pi. You'd have to solder the wires onto a USB-A plug.

user-a81678 31 October, 2023, 11:10:44

Hi

user-a81678 31 October, 2023, 11:11:18

This is Manuel from Universidad Politécnica de Valencia (Spain)

user-a81678 31 October, 2023, 11:11:34

I have pupil w120 e200b

user-a81678 31 October, 2023, 11:12:03

and I cannot finish the calibration step

user-a81678 31 October, 2023, 11:13:26

As soon as you can, could you help me, please?

user-a81678 31 October, 2023, 11:13:54

Chat image Chat image

user-a81678 31 October, 2023, 11:17:38

In the 2nd image, you can see the calibration stops at the 4th movement.

user-d407c1 31 October, 2023, 11:23:47

Hi @user-a81678 ! Could you please navigate to the folder pupil_capture_settings, copy the capture.log after the crash and share it with us at [email removed]

user-a81678 31 October, 2023, 11:25:05

ok. I will go

user-a81678 31 October, 2023, 11:34:46

I can't find this folder or file in the Pupil v3.5.1 folder

user-d407c1 31 October, 2023, 11:46:06

Each Pupil Core software creates its own user directory. It is directly placed in your user's home directory and follows this naming convention: pupil_<name>_settings, e.g. pupil_capture_settings.

Alternatively, you can try using Windows search to locate this folder.
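
A minimal sketch for locating the log from Python, following the home-directory layout described above:

from pathlib import Path

# Each Pupil Core app uses pupil_<name>_settings in the home directory
log_path = Path.home() / 'pupil_capture_settings' / 'capture.log'
print(log_path, 'exists:', log_path.exists())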

user-d407c1 31 October, 2023, 11:47:57

One more thing, just to be on the same page, does the software crash during the calibration or does it simply not finish/detect the marker?

user-a81678 31 October, 2023, 11:35:20

@user-d407c1

user-a81678 31 October, 2023, 12:01:23

not finish/detect the marker

user-d407c1 31 October, 2023, 12:04:22

Ok! Then we might not need the log file. Could it simply be that the scene camera can't see the marker? Perhaps you need to rotate it slightly down?

user-a81678 31 October, 2023, 12:27:44

@Miguel (Pupil Labs)

user-a81678 31 October, 2023, 12:29:52

Miguel, it works now, like you told me!!

user-a81678 31 October, 2023, 12:29:59

thank you!!

user-255375 31 October, 2023, 16:08:56

Hi, this is Karan from the Universitat de Barcelona. We are using Pupil Core for our neuromarketing experiment.

user-255375 31 October, 2023, 16:09:13

I had certain doubts regarding the data analysis.

user-255375 31 October, 2023, 16:09:46

We tried a lot but cannot understand how to proceed with the data analysis.

user-255375 31 October, 2023, 16:10:24

Also, can we make heat maps with the free Pupil Labs software?

user-cdcab0 01 November, 2023, 09:12:44

Heat maps with Pupil Core can be created using the Surface Tracker plugin (see: https://docs.pupil-labs.com/core/software/pupil-player/#surface-tracker). If you have other specific questions, let us know! 🙂
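
For reference, a gaze heatmap can also be built by hand from the Surface Tracker export. A minimal sketch, assuming the documented x_norm/y_norm/on_surf columns; the export path and surface name are illustrative:

import numpy as np
import pandas as pd

# Surface export path and surface name are illustrative
df = pd.read_csv('exports/000/surfaces/gaze_positions_on_surface_Screen.csv')
on_surface = df[df['on_surf']]  # keep only gaze that landed on the surface

# Bin the normalized surface coordinates into a 32x32 heatmap grid
heatmap, _, _ = np.histogram2d(on_surface['y_norm'], on_surface['x_norm'],
                               bins=32, range=[[0, 1], [0, 1]])
print(heatmap.shape, heatmap.sum())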

user-255375 31 October, 2023, 16:10:41

Please let me know!

user-255375 31 October, 2023, 16:10:45

Thanks 😊

user-5138c1 31 October, 2023, 17:53:40

Hello! I wanted to ask where I can find the serial number for my Pupil Core device. Can I see it on the hardware or through the Pupil Capture software?

user-cdcab0 01 November, 2023, 09:04:53

Hi, @user-5138c1 - Pupil Core units do not have serial numbers

End of October archive