👁 core


wrp 03 February, 2020, 01:53:26

@user-26fef5 that's a good response! I'd also like to add: ensure that you have a sufficient white border around each of your markers.

user-d6e717 03 February, 2020, 04:13:30

What can I do to get the program running on Windows 10?

user-e10103 03 February, 2020, 06:02:54

How can I show the hot zone and the eye-tracking?

user-c5fb8b 03 February, 2020, 08:46:25

@user-d6e717 Can you describe a bit more in detail what's not working on your machine? Is there any error message that you receive? Can you start Pupil Capture, or does it not even start? Please also send us a log file from after a failed attempt to run the software. You can find the logfile in: your home folder > pupil_capture_settings > capture.log

user-b23084 03 February, 2020, 17:17:14

anyone else getting shocked from their eye tracking cameras? Is this a faulty system, or is it just the dry winter air creating static? How do I stop it?

papr 03 February, 2020, 17:24:24

@user-b23084 This should not be happening. Please contact info@pupil-labs.com for support.

user-4bf830 03 February, 2020, 17:25:49

Hey @papr I’m using my pupil mobile headsets in my research lab and, after running successfully multiple times, my eye 0, then my world camera as well, are showing as “ghost mode.” What’s going on?

user-4bf830 03 February, 2020, 17:26:44

Testing with another headset reveals that the issue does not replicate. It appears to be a hardware problem.

papr 03 February, 2020, 17:27:18

Yes, this is likely a hardware issue, e.g. a loose cable connection. Please contact info@pupil-labs.com in this regard, too.

user-4bf830 03 February, 2020, 17:27:40

K, thanks

user-d6e717 03 February, 2020, 20:22:41

@user-c5fb8b every time I try to run the program in general or even as an administrator, the program does not run at all. Where should I send you the log file?

papr 03 February, 2020, 20:28:42

@user-d6e717 You can upload it here if you like or share it with data@pupil-labs.com

user-d6e717 03 February, 2020, 20:58:07

@papr I was somehow able to get it working!

user-d6e717 03 February, 2020, 20:58:11

Thank you all for your help

user-7b943c 03 February, 2020, 22:15:46

Anyone else having trouble with the 60 degree FOV world camera? I keep trying to focus it, yet it is blurry and never reaches the same sharpness as the 100 degree FOV lens. Also, does recalibration involve refocusing the lens manually or going through the calibration process on screen? Thank you for your help!

user-61027a 04 February, 2020, 20:19:14

Hi, I am setting up my pupil core equipment. When I run my program it keeps saying that I am in ghost mode and will not show an image for eye0 or eye1. I tried disconnecting the devices and uninstalling them then plugging the device back in and running Pupil Capture as an admin but to no avail. Ideas?

user-e7102b 05 February, 2020, 02:31:30

Hi @papr , I've noticed that my annotations are behaving strangely for some of my recording files. When I export one of these recordings into pupil player, my annotations appear to be loaded in, but then when I play the recording the annotation messages don't pop up in sync with the events on screen, instead they all appear at once right at the end of the recording. The annotation system timestamps that appear on screen seem to be correct, but something about the timing is off. I did some digging and it looks like the "system" and "synced" start times are incorrect (see info file). Do you know if there's a way to correctly sync the annotations and salvage these data? Thanks!

user-e7102b 05 February, 2020, 02:32:06

info.old_style.csv

user-e7102b 05 February, 2020, 02:37:16

All the annotations appear at the very end of the recording

Chat image

papr 05 February, 2020, 08:52:23

@user-7b943c The world camera does not have auto focus. You will have to adjust the lens manually. After changing and focusing lenses, please run the Camera Intrinsics Estimation procedure. Previous calibrations (for gaze mapping) lose their validity when changing lenses, too.

papr 05 February, 2020, 09:02:53

@user-61027a Does that mean that the world camera previews the video feed correctly? In the eye windows, could you check the "Activate source" drop down in the "UVC Source" menu? Which cameras are listed there? Does the drop down contain entries with "undefined"?

papr 05 February, 2020, 09:08:08

@user-e7102b I see, you sent annotations in "system time", while Pupil Capture was not synced to system time. Luckily, this is fixable roughly like this:

system = Start Time (System)
synced = Start Time (Synced)
offset = system - synced
annotation_corrected = annotation_wrong - offset
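
For reference, a minimal Python sketch of that correction; the start times and annotation timestamps below are made-up illustrative values, not from a real recording:

# Values as read from the recording's info file
# ("Start Time (System)" / "Start Time (Synced)"); numbers are illustrative.
start_time_system = 1580893943.0  # Unix epoch, seconds
start_time_synced = 91860.0       # Pupil time, seconds

offset = start_time_system - start_time_synced

# Annotation timestamps that were (incorrectly) sent in system time:
wrong_annotation_timestamps = [1580893950.2, 1580893951.7]

# Shift them back into Pupil time so they line up with the recording:
corrected = [ts - offset for ts in wrong_annotation_timestamps]
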
user-c72e0c 05 February, 2020, 23:34:26

Hi, I'm trying to control the Pupil Core via MATLAB 2017a on macOS High Sierra, and I've gone through trying to get the MATLAB helper scripts up and running but am running into some issues getting zmq working. I think I need a compiler like Xcode, but I need an older version compatible with High Sierra -- is there any support for getting MATLAB communicating with the Pupil Capture system? Or does anyone have any experience doing this?

user-e7102b 06 February, 2020, 04:12:15

@papr OK, thanks for the info. I have no idea what causes this occasional sync failure, as I always send annotations in exactly the same way (from MATLAB, via the pupil_middleman server).

user-e7102b 06 February, 2020, 04:24:36

@user-c72e0c It sounds like you have a similar setup to me. A couple of years ago another user and I created a "middleman" server that allows commands to be sent from MATLAB to Pupil Capture. It's very basic and will only work in Pupil Capture versions <1.9, but it gets the job done, provided you only want to communicate in one direction. I run it on a Mac with either Sierra or High Sierra. I don't think you'd need Xcode. Anyway, if you're unable to get the newer zmq-based protocol working, then you might want to give this a go. https://github.com/mtaung/pupil_middleman

papr 06 February, 2020, 07:05:54

@user-e7102b I think the issue is that Capture's clock is not set correctly to Unix time. How do you do that usually?

user-54376c 06 February, 2020, 10:09:17

Can you guys tell me which exact connector type is used for the eye- and world camera? Are those 4 pin JST SH?

papr 06 February, 2020, 10:47:13

@user-755e9e this one is for you 👆

user-755e9e 06 February, 2020, 10:49:04

@user-54376c correct, JST-SH1 4-pole connectors.

user-83773f 06 February, 2020, 11:17:20

Hi guys, I'm currently trying to analyse the VOR (vestibulo-ocular reflex; vestibular-induced nystagmus of the eye) using the PL system. While doing recordings I noticed that my eye model kept 'shrinking' (the pupil fit remained accurate, though) from a normal eye fit to a fit of just the iris, resulting in an amplitude scaling of the supposed eye movements being made. So what I would like to do is an offline check of my eye model, to visually verify when it is accurately tracking the eye versus when it is not (so I can discard false data). However, in Pupil Player I can only find a window for visualizing the eye cam + pupil fit, and no eye model is depicted there... Does anyone know if this is possible? Many thanks! Jesse

user-5ef6c0 06 February, 2020, 16:25:56

hi all, I have a quick question regarding the fixation detector plugin in Pupil Player. What is a good value for the maximum dispersion? In your docs you do not provide one, but in the screenshot there it is set to 1.00. Also, it shows the min and max duration to be 300 and 1000, respectively. Are these good values as well?

user-5ef6c0 06 February, 2020, 16:40:51

Also, is there a criterion for selecting [email removed] vs [email removed]? I understand the implications of having a lower fps (a coarser eye-position-to-video-frame pairing)... but is this a big deal in most experiments? If not, I'd probably prefer 1080p. I have zero experience in analyzing eye tracking data.

user-a555da 06 February, 2020, 17:32:06

Hi all, I'm a newbie with Pupil Labs. The device that I use is a Pupil Core, but I have a problem: my software can't show the pupil. How can I solve that?

user-5529d6 06 February, 2020, 18:38:49

Serious issue. I am currently using Pupil Capture 1.20.51 on Ubuntu 18.04. The eye camera stops working randomly mid-recording without showing anything specific; the eye camera window automatically closes. I can't find any resource issue that could make it crash (RAM, too much CPU usage, etc.). Sometimes it happens a few minutes into the recording, sometimes not at all, sometimes more than 15 minutes in.

Based on the syslog copy/pasted below, I think something odd is happening with timestamps.

Feb 6 11:54:16 unicorn pupil_capture.desktop[8267]: eye0 - [ERROR] launchables.eye: Process Eye0 crashed with trace:
Feb 6 11:54:16 unicorn pupil_capture.desktop[8267]: Traceback (most recent call last):
Feb 6 11:54:16 unicorn pupil_capture.desktop[8267]: File "launchables/eye.py", line 617, in eye
Feb 6 11:54:16 unicorn pupil_capture.desktop[8267]: File "shared_modules/av_writer.py", line 168, in write_video_frame
Feb 6 11:54:16 unicorn pupil_capture.desktop[8267]: ValueError: Non-monotonic timestamps! Last timestamp: 91864.441208. Given timestamp: 91864.440973
Feb 6 11:54:17 unicorn pupil_capture.desktop[8267]: cannot import name 'MPEG_Writer'

user-7b943c 06 February, 2020, 19:34:55

I'm new to programming and I have no idea what I am doing. I tried to update my pupil software but I messed up along the way. I uninstalled it and tried to download it again, but my Mac won't let me open the application. Someone downloaded it for me the first time so I know I am missing something. I get the following image. Any suggestions?

Chat image

user-83773f 06 February, 2020, 21:33:42

@user-7b943c You've got several solutions. Option 1: open Spotlight search (command + spacebar), type "Security & Privacy", and hit enter. Go to the General tab and click the button saying something like "Open 'Pupil Capture' anyway...". Option 2: try right-clicking on the Pupil Capture app and click 'Open'. You'll get the same pop-up, only now it has the additional option 'Open'; click that and it will run.

user-7b943c 06 February, 2020, 21:40:06

Thank you! @user-83773f

user-c6717a 06 February, 2020, 23:20:05

What is an acceptable amount of pupil data dismissed after calibration? I keep getting about 50% of the pupil data dismissed during calibration and that seems like a lot. The eye cameras look like they are picking up the pupil very well (centered, clear black pupil and red trace of the pupil). Any thoughts? Might it be that it is too bright? I'm next to a sunny window. Thanks!

user-c6717a 07 February, 2020, 00:48:08

We are also interested in using the Pupil Core outside. Any suggestions on how to make sure the IR cameras don't 'white out'? We have tried using a baseball cap, but it doesn't really help. Any other suggestions, or knowledge of how others have solved this problem? Maybe some sort of filter that can cover the IR cameras to block some of the ambient IR? It works fine in the shade, but we get bad 'white out' in the sun. Thanks!

wrp 07 February, 2020, 05:34:10

@user-5529d6 if you haven't already, please email sales@pupil-labs.com so that we can arrange a time to diagnose/repair the hardware if needed.

wrp 07 February, 2020, 05:36:26

@user-a555da You downloaded Pupil Core software - https://github.com/pupil-labs/pupil/releases/latest -- correct? What OS are you using? Windows?

user-d98d40 07 February, 2020, 12:29:42

Hello, please share your experience with calibration and collecting data outside. Is it possible to capture eye movements from a car to banners in the street? I have experience in shops, but not outside. I look forward to any replies.

papr 07 February, 2020, 12:47:07

@user-d98d40 It is possible to use Pupil Core outside, but you need to pay additional attention not to overexpose the video feeds. Additionally, sunlight can create very strong IR reflections, which can have a negative impact on pupil detection. I would recommend having a look at Pupil Invisible instead, which was designed to perform well in varying outdoor conditions: https://pupil-labs.com/products/invisible/

papr 07 February, 2020, 12:49:25

@user-c6717a

I keep getting about 50% of the pupil data dismissed during calibration and that seems like a lot.

I agree that this sounds like a lot. I do not have numbers on what constitutes "an acceptable amount" right now, though. Generally, it is recommended to perform an accuracy validation after the calibration and decide based on that whether the calibration was accurate enough.

Any suggestions on how to make sure the IR cameras don't 'white' out?

Use the "automatic" exposure mode in the eye windows to avoid overexposure ("white out").

user-d98d40 07 February, 2020, 13:23:49

@user-d98d40 It is possible to use Pupil Core outside, but you need to pay additional attention not to overexpose the video feeds. Additionally, sunlight can create very strong IR reflections, which can have a negative impact on pupil detection. I would recommend having a look at Pupil Invisible instead, which was designed to perform well in varying outdoor conditions: https://pupil-labs.com/products/invisible/ @papr thanks, but at the moment I only have the Core, so I need more info about calibration and data collection. Can you help me with this?

papr 07 February, 2020, 13:32:16

@user-d98d40 I understand, I just wanted to make clear that there is a more suitable product for the task. :-)

In your case, I would highly recommend checking out the recently added Best Practices section in our documentation: https://docs.pupil-labs.com/core/best-practices/ It applies both outside and inside. And as described above, you will need to make sure that the eye videos are not overexposed. @user-c6717a seems to be on the right track already. Maybe they can share more of their experiences.

user-c87bad 07 February, 2020, 17:37:36

Hi! I just used my headset to record some data (around 18 mins), but then I found that there is no gaze recorded after 1min50s. Could that be a hardware issue or a software issue?

user-c87bad 07 February, 2020, 18:04:34

And I tried it again, after 15min5s it happened again

user-838cfe 07 February, 2020, 21:06:51

Hi, is it possible to determine which frame is which in Pupil Player? I am trying to match fixation durations (from the exported fixations.csv file) with the appropriate frames, but I'm not sure how to get the right correspondence between the frame # in fixations.csv and the actual frames in the video. Maybe I should just use any video player (e.g. VLC) on the world.mp4 video? Thank you for your pointers!

user-5ea855 07 February, 2020, 21:40:27

Is there any way to increase the capture frame rate of the eye cameras up to 1 kHz?

user-371eba 09 February, 2020, 05:16:37

Hi all,

I apologize in advance if my question has been answered previously in the chat (I did a quick search and didn’t see anything…)

I am designing a study in which I will be using the mobile eye tracker with geriatric participants. Within my study, the participants will also need to be wearing their prescription glasses to complete the tasks. However, recording with the eye tracker through the lens of their prescription glasses doesn’t give nice, clear recordings. As such, I think that my best bet is finding a way to thread the eye tracker cameras under their prescription glasses. I have been able to do this, but the cameras are a bit too close to the participants’ eyes. To get around this issue, I am trying to devise a way to slightly lift the participants’ regular glasses so that the eye tracker can be a bit further from their eyes when threaded under the participants’ glasses.

I am wondering if anyone else has used the mobile eye tracker with participants who need to wear their prescription glasses for the purposes of their experiment? Further, does anyone have any possible solutions I haven’t thought of? Or anything they’ve come across to slightly lift the participants’ regular glasses away from their face (I’m thinking a silicone nose piece or something along those lines)?

Thanks in advance!

user-abc667 10 February, 2020, 00:06:40

@user-371eba We've got the same problem, so if you find a workaround, please do post it. Thanks!

user-abc667 10 February, 2020, 00:08:45

@papr We're trying to get the distortion coefficients for the world cam. We can get the camera matrix, but OpenCV requires the distortion coefficients as well in order to undistort an image. Can you (or anyone else) suggest where/how to do this? Thanks!

wrp 10 February, 2020, 02:54:12

hi @user-c87bad - please could you send capture.log file to data@pupil-labs.com so that our team can provide you with concrete feedback? Additionally if you could include a summary of the setup and steps that led to this behavior that would be helpful.

wrp 10 February, 2020, 02:58:27

Hi @user-97d695 It looks like you are trying to correlate world video frames with fixations. In fixations.csv exported by Pupil Player you will find two columns, start_frame_index and end_frame_index -- these show you how world camera frames are correlated with fixations. Note that fixations can span many world camera frames, as fixations are composed of a "cluster" of gaze positions. The "cluster" that is classified as a fixation depends on the parameters of the fixation detector plugin (min/max duration & max dispersion). Hope this helps!
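
To make that lookup concrete, here is a small pandas sketch; the file path and frame index are illustrative, and the column names follow the export described above:

import pandas as pd

fixations = pd.read_csv("exports/000/fixations.csv")  # path illustrative

frame_index = 30  # any world frame of interest

# A fixation covers this frame if the frame lies within its start/end range.
active = fixations[
    (fixations["start_frame_index"] <= frame_index)
    & (fixations["end_frame_index"] >= frame_index)
]
print(active[["norm_pos_x", "norm_pos_y", "duration"]])
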

wrp 10 February, 2020, 03:03:22

@user-5ea855 thanks for following up here -- we have also received your question via email. We will follow up here as well for consistency:

While it would be amazing if we could support higher frame rate eye cameras, we are not able to do so currently due to both hardware and software limitations (the limits of a single USB controller, and the current speed of the pupil detection algorithms would likely not be able to handle 1 kHz). I wish we could give you a "yes" on this, but unfortunately we are not able to help you out with this specific request.

user-abc667 10 February, 2020, 04:37:24

@papr @wrp Re: best practices. Thanks, a nice addition to the docs. On slippage: we've been using inexpensive eyeglass retainers of the sort that slip onto the ends of the temple pieces (e.g. https://www.amazon.com/Leyaron-Universal-Retainer-Sunglass-Eyeglasses/dp/B074H3NCGB). They're inexpensive enough to use one per subject and can be adjusted to be secure without being uncomfortable.

user-abc667 10 February, 2020, 04:40:49

@wrp (Directed this to papr but thought I might try you as well.) We're trying to get the distortion coefficients for the world cam. We can get the camera matrix, but OpenCV requires the distortion coefficients as well in order to undistort an image. Can you (or anyone else) suggest where it might be found in the code or files, and how to get it? Thanks!

user-26fef5 10 February, 2020, 08:20:16

@user-abc667 Hi Randy. You can find both the camera matrix and the distortion coefficients in the msgpack-encoded camera calibration file in the pupil directory (given that you have run a calibration using the camera intrinsics estimation plugin in Pupil Capture) - copy the content and paste it into an online converter (e.g. https://toolslick.com/conversion/data/messagepack-to-json)

papr 10 February, 2020, 08:27:18

@user-838cfe To add to @wrp's statement: Player shows the current frame index next to the "seek head" at bottom playback timeline.

papr 10 February, 2020, 08:33:48

@user-abc667 To add to @user-26fef5's statement:
- We have prerecorded intrinsics here: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/camera_models.py#L26-L129
- The Python version of @user-26fef5's approach would look like this:

import msgpack  # pip install msgpack==0.5.6

path_to_intrinsics = ...  # path to the camera's intrinsics file
with open(path_to_intrinsics, "rb") as file_handle:
    result = msgpack.unpack(file_handle, raw=False)  # dict incl. camera matrix and distortion coefficients
user-c87bad 10 February, 2020, 10:15:20

hi @user-c87bad - please could you send capture.log file to data@pupil-labs.com so that our team can provide you with concrete feedback? Additionally if you could include a summary of the setup and steps that led to this behavior that would be helpful. @wrp okay, I'll do that, thank you👍

user-abc667 10 February, 2020, 19:42:31

@user-26fef5 @papr Thanks for the info. Turns out we had unpacked the intrinsics file successfully but somehow still managed simply to overlook the distortion coeffs.

papr 10 February, 2020, 19:43:30

@user-abc667 Just out of interest, what tool/programming language do you use to unpack the msgpack file?

user-abc667 10 February, 2020, 19:45:16

@papr My student did it in python, apparently just as suggested, and can't quite account for overlooking the info we were after. One of the mysteries of life, but at least it's ok now. tnx

papr 10 February, 2020, 19:47:33

@user-abc667 Thanks for the info. Overlooking stuff like this happens all the time. This is what this chat is for: Pointing people to stuff they have overlooked or just not seen yet. 🙂

user-abc667 10 February, 2020, 19:48:55

@papr The responsiveness and understanding of you guys is one of the best things about your system. Rare to get such good support. Bravo.👍

user-b23084 10 February, 2020, 20:14:37

Quick question. I need to install Pupil Capture on a computer that is not connected to the internet (apparently they still exist!). Would I need to do any more than simply copy over the pupil_capture_windows_etc. directory?

papr 10 February, 2020, 20:54:15

@user-b23084 Yeah, that should work

user-abc667 11 February, 2020, 00:07:51

@papr @wrp We are interested in the frame-by-frame transforms that get computed when using fiducials. We are using Apriltags and get good results on surface tracking, but would also like to get access to the transforms themselves. Is there someplace in the code where we can siphon off a copy of the transform computed for each frame that contains a known surface, so we can establish the rotation and translation of the surface in that frame? Many thanks.

user-aa7c07 11 February, 2020, 02:39:26

Hi there, I'm a clinician and PhD student in the Dept of Ophthalmology new to eye tracking and have just acquired a Pupil Core. Can anyone help explain the difference between norm_pos_x, gaze_point_3D_x, and gaze_normal0_x ?

Are there any resources that explain what each graph and column means when you export data? (I do own Holmqvist's Eye Tracking textbook, but this problem is specific to Pupil Core.)

user-aa7c07 11 February, 2020, 02:39:58

[University of Auckland]

papr 11 February, 2020, 05:32:01

@user-abc667 The exports contain the matrices for both types of transformation: image to surface and surface to image.

user-aa7c07 11 February, 2020, 06:41:27

@wrp that's fantastic - I'll read this in detail to get started with my data. Is it at all possible to discern radial smooth pursuit? That is, to know the X, Y coordinates of the circle on the screen to compare with the X, Y coordinates of the person's gaze.

user-aa7c07 11 February, 2020, 06:44:04

It would be great if you could measure tangential error of smooth pursuit around a circle on the screen, for example, but would this require a monitor-based pupil tracker to know the x,y coordinates of the circle on the screen for the experiment?

wrp 11 February, 2020, 07:08:23

@user-aa7c07 while this is not ophthalmology, here is a project that uses smooth pursuits (small smartphone screen + Pupil Core): https://perceptual.mpi-inf.mpg.de/files/2015/09/Esteves_UIST15.pdf

wrp 11 February, 2020, 07:08:59

You can also get gaze relative to screen by using markers and surface tracking: https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking

user-aa7c07 11 February, 2020, 07:10:55

That's great, thank you so much. Plenty to get me started

user-a48e47 11 February, 2020, 08:28:50

Hello, I am a graduate student. I have a question about adjusting the gaze coordinates (x, y) in a recorded video. Before writing this question, I found a way to adjust the gaze data, but it is only available for "Gaze From Offline Calibration", not "Gaze From Recording". If anyone knows about this, could you let me know?

user-b8b425 11 February, 2020, 13:50:13

@wrp Hi, do you have any idea how much bandwidth one eye tracker would use? I mean, is the video compressed before it is sent to the PC through USB?

user-b8b425 11 February, 2020, 14:09:46

thx

user-4a6d1f 11 February, 2020, 15:13:55

Hello, I have some questions about the data recorded by Pupil Player. 1) About the file fixations_on_surface_1: I have a problem with the norm_pos_x and norm_pos_y data. I have some values not between 0 and 1, whereas those values should be normalised. Do you know why? 2) I would like to calculate saccade duration, saccade velocity and saccade amplitude from fixations.csv, gaze_positions.csv and pupil_positions.csv. Could you help me do that? I am looking for formulas but can't find any.

user-4a6d1f 11 February, 2020, 15:14:21

Thank you very much!

user-86d8ec 11 February, 2020, 18:45:45

Hi. We are unable to connect to the remote host via WiFi. Both the laptop and the Android phone were on the same network. Is there any troubleshooting guide for this?

user-86d8ec 11 February, 2020, 18:46:35

Hi. We are unable to connect to the remote host via WiFi. Both the laptop and the Android phone were on the same network. Is there any troubleshooting guide for this? I am referring to Pupil Core.

user-abc667 11 February, 2020, 19:02:55

@papr "The exports contain the matrices for both types of transformation: image to surface and surface to image." Egad of course. Score one more for overlooking what was in front of me. Tnx.

papr 11 February, 2020, 19:12:40

@user-86d8ec please share the capture.log file with us after starting the Pupil Mobile manager.

papr 11 February, 2020, 19:14:59

@user-4a6d1f normalized locations outside of the 0-1 range are outside of the field of view of the respective camera. You can drop these data points if you are only interested in gaze points that can be located in the world video. Please be aware that you should filter your data by confidence first.

papr 11 February, 2020, 19:16:07

@user-4a6d1f unfortunately, the Pupil Core software does not currently implement saccade detection.

user-4a6d1f 11 February, 2020, 20:20:51

@papr thank you very much! I understand now, thanks to the True and False column! OK, I will try to do without it. Thank you for your answer!

user-7b943c 11 February, 2020, 20:36:07

Just a random question, but when I use my Pupil Labs Core system the device seems to heat up after an hour. I am worried about this causing damage to the system, so I usually let it rest for 15 minutes after an hour has passed. Am I being paranoid? Can the system safely be worn for 5-6 hours continuously as I play around with it each day?

papr 11 February, 2020, 23:16:16

@user-7b943c it is totally normal that the cameras warm up, especially if you run them on high frame rates. Nothing to worry about :)

user-7b943c 11 February, 2020, 23:18:48

Thank you!! @papr

user-a48e47 12 February, 2020, 02:14:51

Hello, I have two questions.

1) My first question arose after updating to the latest version of the Player. I set the fixation Maximum Dispersion to 1.50, Minimum Duration to 200 ms, and Maximum Duration to 200 ms, the same as before. But the values in the exported fixations file are all different. I'm so embarrassed... I want to know why this happened. (All settings are the same as before.)

2) I want to calculate the entropy rate using a transition probability matrix based on the fixation data. If anyone knows about this, please let me know!!

papr 12 February, 2020, 04:34:29

@user-a48e47 hey, we have recently updated the fixation detector to be more consistent across different settings and included a lot of fixes that prevent false positive detections. Therefore, it is expected that the output changed in the new version.

Please do not use the same value for minimum and maximum duration. It is extremely unlikely that fixations have an exact duration of 200 ms. I would highly recommend extending the allowed duration range.

Unfortunately, I am not able to answer your second question.

user-a48e47 12 February, 2020, 04:55:58

@papr thank you for the reply. By the way, if you don't mind, may I ask one more question? Lots of papers fix the fixation threshold at 100 ms (Crundall 1998; Chapman 2002) or 200 ms (Löcken, A., Yan, F., 2019; Salvucci, D. D., and Goldberg, J. H., 2000), and they don't give a range for fixations. So I wonder how I should set the maximum duration.

wrp 13 February, 2020, 04:14:18

@user-a48e47 you can set the maximum duration, min duration, and max dispersion in the GUI of the Fixation Detector plugin within either Pupil Capture or Pupil Player. The value for min/max duration is dependent on your research/task at hand - for some tasks researchers are interested in very long duration of fixations (e.g. in sports performance research for example). So, the range must be determined by the researcher or by using established values from literature within your research domain.

user-a48e47 13 February, 2020, 06:05:32

@wrp Thank you so much. It depends on my research domain lol. I'll keep that in mind, thanks.

user-bd800a 13 February, 2020, 10:13:03

Hi, I have an old Pupil Core and I'm trying Pupil Mobile. The devices are detected, but I get the following message: "ndsi.network: Devices with outdated NDSI version found. Please update these devices". What should I do? Pupil Capture 1.21, latest app (downloaded yesterday)

papr 13 February, 2020, 10:14:48

@user-bd800a If they are detected correctly, you can ignore this warning. We will check what causes this message to appear even though you are using the most recent software.

user-bd800a 13 February, 2020, 10:15:42

Well, I guessed they were detected, but I don't get anything. I also enabled the time-sync plugin, but I get "Make sure time-sync is enabled"

user-bd800a 13 February, 2020, 10:16:07

as the group default-time_sync-v1 is not found

papr 13 February, 2020, 10:21:49

@user-bd800a Which version of pupil mobile are you using?

user-bd800a 13 February, 2020, 10:23:39

1.2.3

user-bd800a 13 February, 2020, 10:23:46

On a Samsung A50

user-2fd67a 13 February, 2020, 10:24:14

Hi, I'm doing some recordings in 1920x1080 (fisheye); the recordings are great and the visualization is absolutely fantastic. However, I'm using the raw fixation data, and when I plot the norm x and y manually on each frame I see a noticeable distortion relative to the original X and Y shown on the video. How can I correct this? Thanks in advance.

papr 13 February, 2020, 10:24:49

@user-bd800a Could you please share the capture.log file with data@pupil-labs.com after activating the Pupil Mobile backend in Capture? This way we can check if there is an issue with the network setup.

papr 13 February, 2020, 10:26:45

@user-2fd67a Could you post an example screenshot? Fixations group gaze positions, and use their mean location as the "fixation location". Therefore, there should not be a big difference between gaze and fixation.

user-2fd67a 13 February, 2020, 10:27:00

sure

user-bd800a 13 February, 2020, 10:27:12

Sent via pm

user-2fd67a 13 February, 2020, 10:38:35

Here is an example, the problem is only when I'm not looking at the center of the screen.... I have a couple of points that correlates with the raw data...

Chat image

papr 13 February, 2020, 10:40:06

@user-2fd67a What is meant by raw data in this case? Where do you take it from?

user-2fd67a 13 February, 2020, 10:40:51

Chat image

user-2fd67a 13 February, 2020, 10:40:56

I took from the fixations

papr 13 February, 2020, 10:42:08

Could you please describe your workflow more specifically? E.g. how do you extract the frames? How do you match fixation entries to the frames? Etc.

user-2fd67a 13 February, 2020, 10:42:51

I have the video, Frame (say) 30, Fixation X and Y at Frame 30

user-2fd67a 13 February, 2020, 10:43:00

simple scatter plot

papr 13 February, 2020, 10:43:14

This is the red dot that you are talking about?

user-2fd67a 13 February, 2020, 10:43:19

Yes

papr 13 February, 2020, 10:43:28

Where does the yellow circle come from? From Player world video export?

user-2fd67a 13 February, 2020, 10:43:33

yes

user-2fd67a 13 February, 2020, 10:43:41

exactly

papr 13 February, 2020, 10:44:14

Can you see the same mismatch in Player? Player should visualize the gaze as a green dot by default instead of your smaller red version.

user-2fd67a 13 February, 2020, 10:44:36

no, the green and Yellow are perfect, the problem is to reconstruct them...

user-2fd67a 13 February, 2020, 10:44:52

using the raw data

user-2fd67a 13 February, 2020, 10:46:09

when I'm away from the center, the thing is decently distorted...

user-2fd67a 13 February, 2020, 10:47:37

using fixations.csv of course

papr 13 February, 2020, 10:47:53

I have the video, Frame (say) 30, Fixation X and Y at Frame 30

Matching by frame index is less trivial than one might think.

user-2fd67a 13 February, 2020, 10:48:17

can you explain more?

papr 13 February, 2020, 10:48:18

I think that it is likely that this is just a mismatch in frames.

user-2fd67a 13 February, 2020, 10:48:53

I'm not sure that is the problem...

papr 13 February, 2020, 10:49:01

meaning, if you use e.g. OpenCV to access frames one by one, OpenCV's 30th frame might not be the same frame as the one Player gets

papr 13 February, 2020, 10:49:28

Do you use Python to access the video frames?

user-2fd67a 13 February, 2020, 10:49:32

yes yes

papr 13 February, 2020, 10:50:06

I would recommend using https://github.com/pupil-labs/pyav to access the frames.

papr 13 February, 2020, 10:52:33
import av  # pupil-labs pyav, see link above

container = av.open(path_to_exported_world_video)
for frame in container.decode(video=0):
    bgr = frame.to_nd_array(format="bgr24")  # decoded frame as a BGR numpy array
papr 13 February, 2020, 10:53:37

If this still does not work as expected, there might be an issue with the fixation export. In this case we can investigate more. In this case, please also share your code such that we can compare our implementations.

user-2fd67a 13 February, 2020, 10:53:51

okay 🙂 Thanks!

user-2fd67a 13 February, 2020, 11:00:12

look... Pupil Player is saying that in this video I have 1930 frames. I've counted using cv2: 1930

user-2fd67a 13 February, 2020, 11:01:07

therefore, frame by frame is working quite well...

user-c5fb8b 13 February, 2020, 13:51:18

@user-2fd67a OpenCV has an issue with random access for frames in VideoCapture, where it won't perfectly give you frame number X. It should however work if you start from the beginning of the video and read one frame after another until you have read X frames. Is that what you are doing?
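
For reference, a minimal sketch of that sequential-reading pattern (path and frame numbers are illustrative):

import cv2

cap = cv2.VideoCapture("world.mp4")  # exported world video, path illustrative
frames_of_interest = {30, 120}       # e.g. frames that have fixation entries
frame_index = 0

while True:
    ok, frame = cap.read()  # strictly one frame after another, no seeking
    if not ok:
        break
    if frame_index in frames_of_interest:
        cv2.imwrite("frame_%d.png" % frame_index, frame)
    frame_index += 1

cap.release()
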

user-70d6b7 14 February, 2020, 01:40:23

Hi all, looking for a fix to a Pupil Capture issue. Pupil Capture is no longer starting up fully (it was working Feb 7th, no longer working Feb 10). There doesn't seem to be a clear error message in the command window, but the video feed windows simply do not start up. These windows are present in the Windows taskbar, but their windows do not actually pop up in the desktop environment (as shown in the attached screenshot).

Chat image

user-70d6b7 14 February, 2020, 01:41:13

Here's the full startup log when this happens.

pupil_capture_issue_startup.txt

user-70d6b7 14 February, 2020, 01:46:06

After encountering this issue, I uninstalled and reinstalled the Pupil labs drivers, following the guide on the pupil labs site, but the issue persists.

wrp 14 February, 2020, 05:43:32

@user-70d6b7 can you please try to delete pupil_capture_settings folder from your home dir and re-run Pupil Capture. This will clear the settings. It looks like maybe the windows are outside of the visible screen boundary. Resetting back to default settings should resolve this issue.

user-8e220c 14 February, 2020, 12:53:24

Hi there

user-8e220c 14 February, 2020, 12:53:48

We're about to use the Pupil Core glasses in the context of a clinical trial and we're asked to provide proof of CE mark for the glasses

user-8e220c 14 February, 2020, 12:53:54

Where can I get that?

user-8e220c 14 February, 2020, 12:53:57

Thanks!

papr 14 February, 2020, 12:54:19

@user-8e220c Please contact info@pupil-labs.com in this regard.

user-8e220c 14 February, 2020, 12:59:31

Thanks @papr

user-e672e9 14 February, 2020, 13:54:17

Hi everyone,

user-e672e9 14 February, 2020, 14:01:46

We are trying to read gaze position on a surface (tracked by the surface tracker plugin) from the IPC Backbone. When subscribing to the "gaze." topic we don't receive the gaze position on the tracked surface. Is it possible to receive this data via IPC Backbone? If yes, should it be in the "gaze" topic or a different one?

user-c5fb8b 14 February, 2020, 14:18:22

Hi @user-e672e9, quoting the docs at https://docs.pupil-labs.com/core/software/pupil-capture/#further-functionality:

Streaming Surfaces with Pupil Capture - Detected surfaces as well as gaze positions relative to the surface are broadcast under the surface topic.
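
A minimal subscription sketch for that topic, assuming Pupil Remote is running on its default port 50020 on the same machine; the key holding the mapped gaze may be named gaze_on_surfaces (or gaze_on_srf in older versions):

import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote for the SUB port of the IPC backbone.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to surface datums, which include gaze mapped onto the surface.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect("tcp://127.0.0.1:%s" % sub_port)
subscriber.setsockopt_string(zmq.SUBSCRIBE, "surfaces.")

while True:
    topic, payload = subscriber.recv_multipart()
    surface_datum = msgpack.unpackb(payload, raw=False)
    print(topic, surface_datum.get("gaze_on_surfaces"))
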

user-e672e9 14 February, 2020, 14:20:39

Oh, I missed that. Thank you very much!

user-c5fb8b 14 February, 2020, 14:21:00

No problem, glad I could help!

user-70d6b7 14 February, 2020, 22:08:38

@wrp Thanks so much, that worked! For reference, this happened after switching from a 2-monitor setup to a 1-monitor setup. Even when switching back to the 2-monitor setup, the pupil capture windows did not pop up on either monitor. Maybe pupil capture should reset the screen settings to default on startup to avoid this issue? Also, in case anyone else encounters this -- after deleting the pupil capture settings folder, I also needed to disconnect and reconnect the pupil core headset. Just deleting the folder and not reconnecting the headset resulted in pupil capture crashing upon startup.

user-838cfe 14 February, 2020, 23:08:20

Hello and thank you @papr and @wrp for your answers regarding finding frames for fixations from the world video. I have a follow-up question regarding fixations on surfaces. What are the differences between norm_pos_x/norm_pos_y and x_scaled/y_scaled for fixations on surfaces? I believe for surfaces top left corresponds to (0,0) and bottom right corresponds to (1, 1); I would like to superimpose fixation locations onto an image of the surface after recording so just want to make sure I choose the right coordinates to match up with the image (which is the whole surface). Also, could you please tell me more about dispersion and how it determines fixation (I think this will help me better understand how a fixation can span multiple world frames). Thank you!

user-838cfe 14 February, 2020, 23:12:39

I am also curious if there are any research publications that have used core for recording eye movements of experts (for example radiologists) as they are viewing medical images? I looked at the Pupil Citation List here: https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?ts=576a3b27#gid=0 but couldn't find anything like that yet but maybe I am missing something/some other place to look - thank you! I am hoping to learn about protocols/methods from such previous work!

user-a98526 15 February, 2020, 08:01:45

Hi, when I browsed the User Guide and Plugins, I found that many functions require a high-speed world camera, such as:

Chat image

user-a98526 15 February, 2020, 08:06:46

But I bought the USB-C mount and a RealSense D400 rather than the high-speed camera; can the tools mentioned before still be used normally?

user-14d189 15 February, 2020, 09:34:59

Hi, I'm trying to compare Pupil timestamps with an ISO date, but it is nowhere near correct. How can I convert 2020-01-24 16:36:27.391000 to your decimal time, please?

user-14d189 15 February, 2020, 09:36:08

start_time = datetime.strptime("2020-01-24 16:36:27.391000", '%Y-%m-%d %H:%M:%S.%f')
ts = start_time.timestamp()
print("start: " + str(ts))

user-14d189 15 February, 2020, 09:36:43

I think the reference times are different, 1900 vs 1970?

user-14d189 15 February, 2020, 23:49:15

I found a workaround, don't worry too much. Thanks

user-2fd67a 16 February, 2020, 00:06:03

@user-2fd67a OpenCV has an issue with random access for frames in VideoCapture, where it won't perfectly give you frame number X. It should however work if you start from the beginning of the video and read one frame after another until you have read X frames. Is that what you are doing? @user-c5fb8b I'm doing exactly this... I really don't understand why it is distorting this much.

user-14d189 16 February, 2020, 00:12:33

@user-2fd67a I had the same problem and got a good working script from Pupil. It depends on how you access the frame. # Do not try to call video.set(cv2.CAP_PROP_POS_FRAMES, frame_idx)!

user-14d189 16 February, 2020, 00:13:31

That is the troublemaker. My understanding is that the frame index is a good but not perfectly accurate estimate.

user-2fd67a 16 February, 2020, 00:13:45

I'm reading frame by frame... I think it is causing some sort of distortion due to the fisheye lens

user-2fd67a 16 February, 2020, 00:14:14

the norm_pos_x is working fine

user-14d189 16 February, 2020, 00:14:42

if you want I can send you the script I got from Pupil

user-14d189 16 February, 2020, 00:15:15

so you're trying to reconstruct the ellipse onto it?

user-2fd67a 16 February, 2020, 00:15:23

no

user-2fd67a 16 February, 2020, 00:15:49

I'm just trying to reconstruct the fixations using a simple scatter plot on each frame

user-14d189 16 February, 2020, 00:18:37

Ah, now I get it. I myself look into the eye videos, but you print the gaze onto the world video and see the fixations there, is that right?

user-2fd67a 16 February, 2020, 00:22:04

yes yes

user-2fd67a 16 February, 2020, 00:28:03

Look, same video, same frame...different data points for norm_pos_x and norm_pos_y..

Chat image

user-2fd67a 16 February, 2020, 00:28:40

I know they should be a little bit different, but not a std of 20%

user-2fd67a 16 February, 2020, 00:29:12

knowing that my maximum Y is about 0.66 and minimum is 0.44

user-2fd67a 16 February, 2020, 00:41:34

It's almost random which positions are the correct ones to take... I took the X from fixations and the Y from pupil_positions and it worked perfectly for this particular frame.

Chat image

user-14d189 16 February, 2020, 05:38:22

A couple of things. The way gaze is calculated (pairing eye 0 with eye 1: 1-0, 0-1, 1-0 consecutively, according to the pupil data), gaze has 240 data points per second, while the world video has 30 to 60 frames per second. At 30 Hz you will have 8 gaze data points for each frame. Taking data from fixations is a good approach because the eye keeps its local position for roughly 0.15 to 0.4-0.6 s.

user-14d189 16 February, 2020, 05:41:05

pupil_positions relates to the actual position of the pupil center in the eye videos. In gaze_positions.csv, norm_pos_x and norm_pos_y relate to the world video.

user-a48e47 16 February, 2020, 10:50:15

Hello, I have a question. As far as I know, the Pupil Core's sampling rate is 200 Hz and the fixation classifier is based on spatial dispersion (the default value is 1.5). However, dispersion-based algorithms are not suited to analysing data collected at higher sampling frequencies (> 200 Hz) (from the book "Eye-tracking: A comprehensive guide to methods, paradigms and measures"). So I am wondering about this, because my fixation range depends on it.

user-2fd67a 16 February, 2020, 23:18:41

Okay, I REALLY need help here... the fixation matrix is giving something really strange.

user-2fd67a 16 February, 2020, 23:28:10

@papr can you please help me?

user-2fd67a 17 February, 2020, 01:54:48

@user-a48e47 the dispersion will still be the same, no?

user-a48e47 17 February, 2020, 01:59:00

@user-2fd67a I don't understand; could you please explain in more detail?

wrp 17 February, 2020, 02:09:31

@user-2fd67a - was this recorded on Pupil Mobile (e.g. on the android device) or via streaming from Pupil Mobile --> Pupil Capture and recorded on the computer running Capture?

@papr please follow up on this today when you are online.

user-2fd67a 17 February, 2020, 02:09:56

I've recorded on Pupil Mobile (android)

user-2fd67a 17 February, 2020, 02:11:13

is there any accelerometer on the device? It seems to me that it is related to the Y movement of my head, as when I'm looking perfectly straight it works quite well...

user-14d189 17 February, 2020, 03:59:01

@user-2fd67a I've just seen the video. The norm positions are in the image coordinate system: 0,0 is top left. It looks like your y value is in a Cartesian coordinate system: 0,0 bottom left. Do you reckon that could be it?

user-2fd67a 17 February, 2020, 03:59:40

Hi @user-14d189, so then why the X axis is perfectlu fine?

user-14d189 17 February, 2020, 04:00:02

in particular, when you look up and down you see that it is counteracting

user-14d189 17 February, 2020, 04:00:14

x is fine because both start at the left.

user-2fd67a 17 February, 2020, 04:01:33

hmmm... let me try here, thank!

user-2fd67a 17 February, 2020, 04:01:35

thanks!

user-2fd67a 17 February, 2020, 04:03:16

i'm considering 0,0 top left...

user-2fd67a 17 February, 2020, 04:05:10

BUUUT

user-2fd67a 17 February, 2020, 04:05:28

the fixation matrix is not

user-2fd67a 17 February, 2020, 04:05:35

you are a genius

user-14d189 17 February, 2020, 04:06:11

Na, just run into same problems, 😩 🤣

user-2fd67a 17 February, 2020, 04:06:42

Thanks Peter!

user-2fd67a 17 February, 2020, 04:13:49

yeap, it worked

user-879265 17 February, 2020, 07:07:30

NEED QUOTATION FOR EYE TRACKING GLASSES

papr 17 February, 2020, 07:10:28

Hey @user-879265 have you contacted info@pupil-labs.com in this regard already?

user-879265 17 February, 2020, 07:10:40

NO

user-879265 17 February, 2020, 07:10:55

I WILL CONTACT

user-879265 17 February, 2020, 07:10:58

THANKS

papr 17 February, 2020, 07:13:50

@user-879265 OK, great. For the future, would you mind turning off your caps-lock? I know that this was surely not intended but reading the text in uppercase gives me the impression that you were yelling at us. 😕

papr 17 February, 2020, 07:26:05

@user-879265 alternatively to contacting info@pupil-labs.com via email, you can select your preferred items from the shop and request a quote during the checkout process. Our sales team will follow up either way. 👍

papr 17 February, 2020, 08:44:33

@user-2fd67a @user-14d189 Great to see that the community was able to help out! 🙂 Please see this link for references to the different coordinate systems that we use in Pupil: https://docs.pupil-labs.com/core/terminology/#coordinate-system

papr 17 February, 2020, 14:40:08

@user-838cfe Normalised coordinates are always set up such that the origin is in the bottom left. Beware that the top orientation of the surface is visualized in Capture/Player by the red triangle. See the screenshot for reference.

Chat image
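
For anyone reconstructing plots from the exports (as in the scatter-plot discussion above), a tiny sketch of the conversion from these normalized coordinates to image pixel coordinates, whose origin is top left:

def norm_to_pixels(norm_x, norm_y, frame_width, frame_height):
    # Pupil's normalized coordinates have their origin at the bottom left;
    # image pixel coordinates (OpenCV, matplotlib imshow) at the top left.
    x_px = norm_x * frame_width
    y_px = (1.0 - norm_y) * frame_height  # flip the y-axis
    return x_px, y_px
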

user-5e6e36 17 February, 2020, 18:33:04

Hello, I am trying to run Core on a Windows 7 Professional PC. I got it working before (on Windows 8.1) but I cannot do it now... I have correctly installed the libusbK drivers... I do not know what more I could try... Could it be a PowerShell problem?

user-5e6e36 17 February, 2020, 18:33:54

I have installed PowerShell 2.0

user-5e6e36 17 February, 2020, 18:34:53

The message is Video_capture.uvc_backend: Init failed

user-5e6e36 17 February, 2020, 18:51:36

any help?

user-5e6e36 17 February, 2020, 19:26:32

I have the same behaviour with the three cameras ...

papr 17 February, 2020, 20:20:13

@user-5e6e36 officially, we only support Windows 10. I am surprised that you were able to get it running on Windows 8. :-)

user-c67683 18 February, 2020, 03:59:27

Hi, I am a bit of a newbie with Pupil and eye trackers. I just read that Pupil generates its own timestamps for each eye-tracker record. I want to know if it is possible to get timestamps as current Unix timestamps; I want to synchronize the eye tracker with an oTree application that gives me the current Unix timestamp.

papr 18 February, 2020, 07:01:57

@user-c67683 this is my message to @user-dfa378 from yesterday. They had the same question as you.

Check out the info.player.json file. It contains the recording start time in system time (Unix epoch) and in Pupil time. You can use these two timestamps to calculate the offset between Unix epoch and Pupil time and apply it to your other data. Maybe they can share their experiences with this approach with you.
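
A small sketch of that offset calculation, assuming a newer recording whose info.player.json contains the keys start_time_system_s and start_time_synced_s (adjust the key names if your recording differs):

import json

with open("recording/info.player.json") as f:  # path illustrative
    info = json.load(f)

# Offset between Unix epoch and Pupil time at recording start.
offset = info["start_time_system_s"] - info["start_time_synced_s"]

def pupil_to_unix(pupil_ts):
    # Convert a Pupil timestamp to a Unix timestamp.
    return pupil_ts + offset
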

user-5e6e36 18 February, 2020, 07:17:53

So, there is nothing I could try? I've been searching the internet and there are a lot of people with the same problem on W10

user-5e6e36 18 February, 2020, 07:18:09

Ghost mode

user-5e6e36 18 February, 2020, 07:20:24

On W8.1 I am using v1.11, so I am using the same version on W7 Pro; it seems that the drivers are correctly installed. (It is going to be hard to install W10 on this unit.)

user-b8b425 18 February, 2020, 13:54:04

Hi @wrp, could you please tell me how to save undistorted video instead of raw video?

papr 18 February, 2020, 13:59:26

@user-b8b425 Currently, this is only possible post-hoc in Pupil Player using the iMotions exporter. The exporter plugin requires 3d gaze data though.

user-eaf50e 18 February, 2020, 14:04:01

Hello, in what kind of coordinate system are the pose_T and pose_R for the apriltags? For the corners I get e.g. [[653.05981445 392.19338989] [655.08605957 520.94110107] [779.39581299 522.76715088] [778.57659912 393.91848755]]
but for the translation pose_T and rotation matrix pose_R I get coordinates in [0,1], e.g. pose_T = [[0.02474529] [0.02952776] [0.42436221]]

user-0eef61 18 February, 2020, 14:06:17

Hi, I am having some issues opening Pupil Player v1.21-5. Yesterday it was working fine but now it won't open anymore. What might be the reason? I have tried restarting and shutting down my PC but it still doesn't open. The command prompt opens but it doesn't run any commands.

user-0eef61 18 February, 2020, 14:09:10

v1.11-4 opens

papr 18 February, 2020, 14:09:56

@user-894365 I think you might be able to give more insight on @user-eaf50e 's issue, correct?

papr 18 February, 2020, 14:11:15

@user-0eef61 Please delete the user_settings* files in Home directory -> pupil_player_settings and try again. If it still does not open, please share the player.log file in the same directory.

user-0eef61 18 February, 2020, 14:30:44

Hi @papr now it's working. Many thanks 🙂

user-894365 18 February, 2020, 15:04:51

@user-eaf50e pose_T and pose_R are the translation vector and rotation matrix of the tag pose with respect to the camera. The coordinate system of the camera: X-axis pointing to the right, Y-axis pointing down, Z-axis pointing forward.

The magnitude of pose_T depends on tag_size. E.g. assume the physical size of the Apriltags is 5 cm: if you set tag_size=5, then pose_T is in cm. If you set tag_size=50, then pose_T is in mm.
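
For context, a detection-with-pose sketch using the pupil-apriltags Python bindings (where the attribute names are pose_t/pose_R); the image path, intrinsics, and tag size below are illustrative:

import cv2
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")
gray = cv2.imread("world_frame.png", cv2.IMREAD_GRAYSCALE)  # path illustrative

detections = detector.detect(
    gray,
    estimate_tag_pose=True,
    camera_params=(829.0, 829.0, 640.0, 360.0),  # fx, fy, cx, cy (illustrative)
    tag_size=5.0,  # 5 cm tags measured in cm -> pose_t comes out in cm
)

for det in detections:
    print(det.tag_id, det.pose_t, det.pose_R)
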

user-03a2fe 18 February, 2020, 15:52:24

Hi! I am struggling with the installation of libuvc in Ubuntu 16. I cloned the version found on the pupil-labs repo and when calling "make" I get the following error:

[100%] Linking C shared library libuvc.so
/usr/bin/ld: /usr/lib/gcc/x86_64-linux-gnu/5/../../../x86_64-linux-gnu/libusb-1.0.a(libusb_1_0_la-core.o): relocation R_X86_64_32S against `.rodata' can not be used when making a shared object; recompile with -fPIC
/usr/lib/gcc/x86_64-linux-gnu/5/../../../x86_64-linux-gnu/libusb-1.0.a: error adding symbols: Bad value
collect2: error: ld returned 1 exit status
CMakeFiles/uvc.dir/build.make:188: recipe for target 'libuvc.so.0.0.9' failed
make[2]: *** [libuvc.so.0.0.9] Error 1
CMakeFiles/Makefile2:72: recipe for target 'CMakeFiles/uvc.dir/all' failed
make[1]: *** [CMakeFiles/uvc.dir/all] Error 2
Makefile:129: recipe for target 'all' failed
make: *** [all] Error 2

user-03a2fe 18 February, 2020, 15:52:38

Any help would be appreciated 🙂

user-03a2fe 18 February, 2020, 16:17:55

Never mind 🙂 I removed the file libusb-1.0.a and it worked (hopefully I didn't mess anything up)

user-526844 18 February, 2020, 20:21:42

Hi all, I am a researcher trying to run a usability study on a website. I am curious about the best route for converting the pupil coordinates into screen coordinates; does anyone have experience with this sort of thing?

user-526844 18 February, 2020, 20:21:44

Thanks!

user-2fd67a 19 February, 2020, 03:07:38

@user-526844 have you used a chin rest?

wrp 19 February, 2020, 05:42:36

@user-526844 use surface tracking [1], define your screen as a surface. Note that this surface/AOI will only take into account the position of the screen, and will tell you where on the screen the participant is looking, but does not take into account dynamic content - e.g. scroll/video/animations. You will need to denormalize the gaze-on-surface positions (see the sketch after this message). You can see an example of how this works in [2] (note this is for static content).

  1. https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking
  2. https://github.com/pupil-labs/pupil-tutorials/blob/master/02_load_exported_surfaces_and_visualize_aggregate_heatmap.ipynb

If you have control of the website that is being viewed, then you could track scroll position via a custom script and correlate these with timestamps post-hoc to overlay gaze on dynamic content - but that would take a bit of development effort.
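
A sketch of that denormalization step for a screen surface, roughly following the tutorial in [2]; the surface name, export path, screen resolution, and confidence threshold are illustrative:

import pandas as pd

SCREEN_W, SCREEN_H = 1920, 1080  # illustrative screen resolution

gaze = pd.read_csv("exports/000/surfaces/gaze_positions_on_surface_Screen.csv")

# Keep confident samples that actually fall on the surface.
gaze = gaze[(gaze["confidence"] > 0.8) & gaze["on_surf"]]

# Surface coordinates have their origin at the bottom left; screen pixel
# coordinates have theirs at the top left, hence the y flip.
gaze["x_px"] = gaze["x_norm"] * SCREEN_W
gaze["y_px"] = (1.0 - gaze["y_norm"]) * SCREEN_H
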

user-3ac2c5 19 February, 2020, 08:17:39

Hi Do you ship the AR headset to Turkey?

user-3ac2c5 19 February, 2020, 08:18:17

VR I mean

user-eaf50e 19 February, 2020, 11:58:41

@user-eaf50e pose_T and pose_R are the translation vector and rotation matrix of the tag pose with respect to the camera. The coordinate system of the camera: X-axis: pointing to the right Y-axis: pointing down Z-axis: pointing forward

The magnitude of pose_T depends on tag_size. E.g. assume the physical size of the Apriltags is 5 cm: if you set tag_size=5, then pose_T is in cm. If you set tag_size=50, then pose_T is in mm. @user-894365 So the translation vector should give the translation of the tag in the world coordinate system? Because I get negative values as well, but the world coordinate system is said to be bounded to x: [0, <image width>], y: [0, <image height>]

papr 19 February, 2020, 12:41:56

@user-eaf50e We have just noticed an error in our coordinate system documentation. The bounds of the 3D camera space are not x: [0, <image width>], y: [0, <image height>]. Actually, the 3d camera space does not have boundaries at all. The origin of the 3d camera coordinate system is in the center of the camera, not any of the corners. Therefore, it is expected to get negative values.

Just for further clarification: This 3d camera space is not equivalent to the image plane.

pose_T and pose_R are defined within the 3d camera space. The camera space itself is defined by the camera_params that are passed to the apriltag detector.

user-eaf50e 19 February, 2020, 13:21:45

Okay thanks! So the gaze_point_3d_{x,y,z} is returned in the world camera coordinate system as well, right?

papr 19 February, 2020, 13:23:35

Correct! Please be aware that the units of these spaces might differ since the unit of the apriltag camera space is dependent on the tag_size as @user-894365 noted.

papr 19 February, 2020, 13:24:47

@user-894365 please correct me if this last message was incorrect.

user-894365 19 February, 2020, 13:30:57

@user-eaf50e

the unit of the apriltag camera space is dependent on the tag_size

-> correct. gaze_point_3d is in mm

user-03a2fe 19 February, 2020, 13:51:13

Hi! I just managed to get pupil programs running in Ubuntu 16.04 (after a loooot of effort :P).

I would suggest to make the following modifications to the documentation found here https://github.com/pupil-labs/pupil/blob/master/docs/dependencies-ubuntu17.md:

  • sudo add-apt-repository ppa:jonathonf/ffmpeg-4 (the repo for ffmpeg-3 doesn't work anymore)
  • Some dependencies (like nslr and nslr-hmm) require gcc 7 instead of 5.4. I solved it as follows:

sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-7 70
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-7 70
sudo update-alternatives --config gcc
sudo update-alternatives --config g++
CC=/usr/bin/gcc-7 pip install git+https://github.com/pupil-labs/nslr
CC=/usr/bin/gcc-7 pip install git+https://github.com/pupil-labs/nslr-hmm

(Without CC=/usr/bin/gcc-7 it didn't work, despite changing the gcc version with update-alternatives.)

  • I needed to install pybind11 before installing nslr: pip install pybind11

papr 19 February, 2020, 13:53:00

@user-03a2fe oh oh, this document seems to be out of date. nslr is not required anymore in order to run the master branch. 😕

user-3ac2c5 19 February, 2020, 14:22:27

I want to buy Pupil Labs products. Any help....

user-3ac2c5 19 February, 2020, 14:22:46

Email or something

papr 19 February, 2020, 14:24:53

@user-3ac2c5 You can buy them via our website: https://pupil-labs.com/products/ Choose the product that you want to buy, add it to the cart, and afterwards you can start the checkout process. I can also send you a link to a prefilled cart if you let me know what you want to buy. 🙂

user-3ac2c5 19 February, 2020, 14:27:13

I want to buy two Pupil Labs VR headsets with two eye trackers (HTC Vive)

user-3ac2c5 19 February, 2020, 14:27:17

Thank you

user-3ac2c5 19 February, 2020, 14:27:33

Please include the shipment price

papr 19 February, 2020, 14:31:22

@user-3ac2c5 https://pupil-labs.com/cart/?htcvive_e200b=2 You can request a quote during the checkout process.

papr 19 February, 2020, 14:31:59

@user-3ac2c5 Please be aware that Pupil Labs does not provide the VR headset itself but only the eye tracking add-on.

user-3ac2c5 19 February, 2020, 14:32:24

I figured it out. Thanks

user-3ac2c5 19 February, 2020, 14:33:13

Last question: how many days until I can have the eye tracker?

papr 19 February, 2020, 14:34:39

@user-3ac2c5 2-3 days after we have received the payment

user-3ac2c5 19 February, 2020, 14:34:50

Tnx

user-1cc326 19 February, 2020, 16:49:30

The pupil cam for the left eye wasn't working, so I followed the Windows 10 troubleshooting advice and uninstalled the pupil cams from the device manager. However, when I plugged the glasses back in and restarted Pupil Capture, the eye cam for the left eye still isn't working. I don't think it reinstalled. How do I fix that?

papr 19 February, 2020, 17:06:38

@user-1cc326 Please send an email to info@pupil-labs.com in this regard.

user-edd919 19 February, 2020, 17:14:42

Howdy! I have a question regarding the Pupil Labs headset for writing my manuscript. How can I find the frequency of data collection? Is it 200 Hz, or should I check something in the software? I appreciate any help.

papr 19 February, 2020, 17:19:34

@user-edd919 You can check the Eye Video Exporter output. It contains CSV files with the original eye video timestamps. Calculate the differences between consecutive timestamps and take the reciprocal of the average difference. This gives you the average frame rate.
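
For illustration, a minimal sketch of that calculation in Python (the file name and column position are assumptions; adjust them to your export):

```python
# Sketch: average frame rate from an exported eye-video timestamps CSV.
import numpy as np
import pandas as pd

ts = pd.read_csv("eye0_timestamps.csv").iloc[:, 0].to_numpy()  # assumed layout
mean_dt = np.diff(ts).mean()        # mean time between consecutive frames (s)
print(f"average frame rate: {1.0 / mean_dt:.1f} Hz")
```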

user-608a0d 19 February, 2020, 17:22:07

Hi! We are trying to record pupil data synched with OptiTrack data using this script: https://github.com/mdfeist/OptiTrack-and-Pupil-Labs-Python-Recorder We have pupil data recording successfully, but no OptiTrack data. (Though we can confirm that motion coordinates are broadcasting successfully.) Does someone have experience using this script or any idea where the problem might be?

user-a98526 20 February, 2020, 11:30:24

Hi @papr, I want to learn about Pupil Groups, but the link on the website is broken. Can you give me some details?

Chat image

user-a98526 20 February, 2020, 11:30:40

Chat image

user-c5fb8b 20 February, 2020, 12:01:48

Hi @user-a98526, here's the link to the full pupil-helpers repository: https://github.com/pupil-labs/pupil-helpers

user-a98526 20 February, 2020, 12:08:28

Thanks @user-c5fb8b, but I didn't find anything about Pupil Groups there.

papr 20 February, 2020, 12:09:12

@user-a98526 Actually, the link pointed at "Pupil Sync". Pupil Sync was the predecessor to our network api and has been out of date for more than 3 years now.

papr 20 February, 2020, 12:09:44

Let me gather some important links regarding Pupil Group

papr 20 February, 2020, 12:19:21

@user-a98526
  • Pupil Groups is based on the ZRE protocol: https://rfc.zeromq.org/spec:36/ZRE/
  • Pupil uses the ZRE Python implementation "pyre": https://github.com/zeromq/pyre (any ZRE implementation can be used)
  • The Pupil Groups plugin joins a user-defined group, by default: "pupil-groups"
  • Messages whose topics start with remote_notify are relayed to other group members. Alternatively, any notification with the key remote_notify is also relayed (see the sketch below).
  • Notifications that are relayed to group members by default: recording.should_start, recording.should_stop
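
For illustration, a minimal sketch of triggering such a relayed notification through Pupil Remote (the default port and the relaying semantics follow the rules listed above; treat it as a sketch, not a definitive implementation):

```python
# Sketch: send a notification that Pupil Groups relays to all group members.
# Assumes Pupil Remote on its default port and the Pupil Groups plugin
# enabled in every Capture instance.
import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

# recording.should_start is relayed by default; the remote_notify key
# marks arbitrary notifications for relaying (per the rules above).
notification = {"subject": "recording.should_start", "remote_notify": "all"}
pupil_remote.send_string("notify." + notification["subject"], flags=zmq.SNDMORE)
pupil_remote.send(msgpack.dumps(notification, use_bin_type=True))
print(pupil_remote.recv_string())  # confirmation from Pupil Remote
```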

papr 20 February, 2020, 12:19:43

@user-c5fb8b Could you update the docs with the information above please?

papr 20 February, 2020, 12:26:27

Github issue as reference: https://github.com/pupil-labs/pupil-docs/issues/354

user-a98526 20 February, 2020, 12:33:15

Thanks @papr. Can I use Pupil Groups to control my other sensors (e.g. a depth sensor)?

Is there an example where my other sensors join the Pupil Group, or what should I do?

papr 20 February, 2020, 12:36:06

@user-a98526 What type of control are you looking for? The purpose of Pupil Groups is to provide some quality-of-life convenience for people who run multiple Pupil Capture instances at once. It does not give you fine-grained control over a single instance.

user-a98526 20 February, 2020, 12:41:23

I have a robot system, and I want to use Pupil Groups to obtain a variety of sensor data (force, angle, etc.) from the robot and connect it with gaze data.

papr 20 February, 2020, 12:49:02

@user-a98526 I fear that Pupil Groups does not fit that purpose.

user-a98526 20 February, 2020, 12:51:14

@papr Thanks, I'll try another way to achieve it.

papr 20 February, 2020, 12:55:11

@user-a98526 I mean you can use Pupil Groups for network discovery and to avoid manual IP setups. But it will not automatically receive sensor data.

user-c5fb8b 20 February, 2020, 15:31:52

@here 📣 Announcement 📣 - We just released Pupil Core software v1.22. Download the apps here: https://github.com/pupil-labs/pupil/releases/tag/v1.22

This release reintroduces "Gaze History" visualization and simplifies the selection of video sources. Additionally, we removed the built-in RealSense video backend. More details in the release notes.

We look forward to your feedback!

user-2ad874 20 February, 2020, 22:32:56

Hi, I'm trying to connect multiple eye trackers together for an experiment. I'm not sure how to do this. Do I use Pupil Groups? Not sure how to get started. Any help will be much appreciated.

user-c5fb8b 21 February, 2020, 13:08:09

Hi @user-2ad874 Pupil Groups allows you to start and stop a recording simultaneously on multiple instances of Pupil Capture running on the same network. If this is what you want to do, then pupil groups will be the right choice for you. You need to enable the Pupil Groups plugin on all instances of Pupil Capture and make sure they are all in the same group. You should see the names of the other instances listed in the plugin then. When starting the recording, all instances will record simultaneously. Does that answer your questions?

user-0cc5ed 21 February, 2020, 15:42:22

Hi, I need an eye tracking library for webcams. Is Pupil capable of meeting my needs?

user-c5fb8b 21 February, 2020, 16:01:04

Hi @user-0cc5ed Pupil is designed to work with head-mounted eye trackers only. It is possible to build your own head-mounted eye tracker, e.g. with webcams (and a lot of other materials), but I assume you want to place a webcam in a fixed location, e.g. on a monitor? This procedure is called remote eye tracking (as opposed to head-mounted) and we do not currently offer any software for this use case.

user-0cc5ed 21 February, 2020, 16:09:35

Thank you for your answer! Yes, I need remote eye tracking, which provides the gaze point on the display. Are you aware of any other library that can do this?

user-c5fb8b 21 February, 2020, 16:17:42

@user-0cc5ed Pupil offers tools for tracking surfaces, e.g. a display with fiducial markers and correlating the gaze to the surface plane, so this is also a possible use case with our tools. Generally I'm not aware of any library that allows you to use your custom webcam. Most solutions come with custom high-resolution cameras.

user-c5fb8b 21 February, 2020, 16:21:06

Disclaimer: there totally might be libraries out there that offer this use case.

user-0cc5ed 21 February, 2020, 16:26:21

@user-c5fb8b Thank you for your information! Yes, I found WebGazer (https://webgazer.cs.brown.edu/), but I can't use it in the content script of a web extension. Therefore I am looking for a native app solution to which I can connect via native messaging - but all projects I have tried are very hard to set up and don't offer support...

user-c5fb8b 21 February, 2020, 16:27:24

@user-0cc5ed Is the display you are talking about a mobile phone or a computer monitor?

user-0cc5ed 21 February, 2020, 16:30:55

@user-c5fb8b It's a computer monitor, but the web extension could also be used on a mobile phone, if mobile browsers allow installing the extension there too. Is there, code-wise, a big difference between these two situations?

user-c5fb8b 21 February, 2020, 16:37:33

@user-0cc5ed Pupil can be used to track basically any surface, if you attach markers to it. I think it would be best for you to contact sales@pupil-labs.com with a description of what exactly you want to do. Someone there will be able to give you a better overview of which setup from us would work best for your use cases!

user-0cc5ed 21 February, 2020, 16:39:09

@user-c5fb8b Ok, I will try this. Thanks.

user-e61b04 21 February, 2020, 18:45:26

Hi, I have the Pupil Labs gadget and I want to pair this with an E-Prime task. Is this possible?

user-5e6e36 21 February, 2020, 19:34:17

I have a Core with binocular 200 Hz cameras, but when I am using Capture I cannot reach this frame rate. What is the problem? (The CPU load is about 30%.)

user-d9f5f9 21 February, 2020, 22:39:16

Hi, I am new to Pupil Labs and am trying to get a Core. I would like to get the world camera together with the eye cameras, but I might need to use the depth camera in the future. I am wondering: if I order the one with the world camera and get a RealSense camera later, will it work with Pupil Core?

user-d9f5f9 21 February, 2020, 22:40:45

In other words, can I plug the RealSense camera into my computer with a separate cable and use it as the world camera for Pupil, or do I need to get the USB-C version? Thanks

wrp 22 February, 2020, 01:18:35

@user-5e6e36 please change the sensor settings in the eye windows of Pupil Capture to 200Hz if not already

user-5e6e36 22 February, 2020, 11:33:29

I am going to look for this .... (how could I miss that?)

user-5e6e36 22 February, 2020, 11:55:32

Ok, I have found the settings, but I am having serious bandwidth problems... I have an i7-4790 @ 3.6 GHz with 16 GB; is this not enough for running two eye cameras and the world camera (at 200 Hz and 30 Hz respectively)? Do you have any minimum-requirements advice for running with these settings?

user-76416d 22 February, 2020, 12:56:58

Hi! I am new to Pupil Capture. I am trying to manually calibrate by showing markers on the screen. However, the calibration polygon is not the same as the area I calibrated. For example, I am calibrating a monitor. Ideally there should be a rectangular calibration polygon which encompasses the whole screen, but in my case it is a triangular shape and doesn't cover the whole screen.

papr 23 February, 2020, 09:52:57

@user-5e6e36 This setup sounds sufficiently fast. What frame rates are you seeing? Also, on which operating system are you running Pupil Capture?

papr 23 February, 2020, 09:54:09

@user-76416d This means that for some of the shown marker positions there was not sufficient pupil data reaching the confidence threshold (0.8 by default).

user-a4b794 23 February, 2020, 13:38:01

Hi! I would like to ask if the Pupil code could be adapted to work with remote eye tracking instead of head-mounted eye tracking. I would so much like to purchase a "Pupils Remote". 🙂 All the other vendors I have tried have worse accuracy than the Tobii 4C. I would like to resell Tobii devices in medical areas, but they're monopolistic and don't allow that. So I'm looking for an alternative, and Pupil seems to be the most promising project. Thanks for it, btw.

user-12d809 23 February, 2020, 21:12:06

Hi, I'm looking into the budget for a Pupil Core setup for a startup package. Beyond the pupil core itself, a mobile device, a support contract with pupil labs, and a computer that can handle the analyses, are there other things I will need or should consider?

user-12d809 23 February, 2020, 21:13:08

Also, regarding the mobile device, is there a device that people would recommend? I see the hardware list in the docs (https://docs.pupil-labs.com/core/software/pupil-mobile/#supported-hardware), but that looks out of date. Would a new Pixel or One Plus 6 or 7 work?

user-838cfe 24 February, 2020, 03:26:01

Hi, I am running into an interesting problem...my csv output of fixations_on_surface indicates that none of the fixations are on the surface ('on_surf' is 'FALSE' for all fixations), yet I can see that there are fixations on the surface from the world camera and by visualizing the heatmap on the surface during recording. Is there something that could explain this bizarre observation? If needed, I can provide video snippets or fixations_on_surface csv file; please advise what would be most helpful. Thank you!

wrp 24 February, 2020, 03:35:57

@user-a4b794 You can use Pupil Core for screen based research if you want to use surfaces ( https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking ). That being said, this would still be a wearable eye tracking system and not a remote system. Pupil Core software source code is designed to be used for near-eye IR cameras (e.g. a wearable system), so there would not really be a way to easily adapt/modify Pupil Core software to support remote eye tracking applications.

wrp 24 February, 2020, 03:37:41

@user-12d809 If you'd like to discuss sales related questions, please get in touch via [email removed] Re Pupil Mobile - OnePlus devices should work if they are running Android 9 -- we are still waiting on a fix from the Google Android team in Android 10.

wrp 24 February, 2020, 03:39:32

Hi @user-838cfe the heatmap in Pupil Player/Pupil Capture visualizes gaze on surface. You may have gaze on surface, but no fixations on the surface, as fixations are made up of "clusters" of gaze positions.

user-5e6e36 24 February, 2020, 08:44:08

@papr (re: "What frame rates are you seeing? Also, on which operating system are you running Pupil Capture?") When I am running 120 Hz (2 eye cameras) and 30 Hz (world camera), everything is OK, but when I try to run 180 Hz the CPU load increases to over 100% (140%) and the world camera gets almost frozen (3 Hz); I cannot even try with 200 Hz. I am running Windows 8.1.

papr 24 February, 2020, 08:52:54

@user-5e6e36 Your hardware should be sufficiently fast, but as I mentioned previously, we do not support Windows versions prior to Windows 10. This also means that the software might run significantly slower due to unexpected differences between the operating systems. I highly recommend either upgrading to Windows 10 or using a dual-boot setup with Ubuntu 18.04.

user-5e6e36 24 February, 2020, 10:12:56

OK, I am waiting for a new computer with W10 so I hope that everything goes faster then, thank you

user-abc667 24 February, 2020, 16:05:18

@papr @wrp Is there documentation on the format of the world camera video before it is run through Pupil Player? We want to analyze the world cam images, and those closest to the original are best. Thanks.

papr 24 February, 2020, 16:06:49

@user-abc667 Pupil Capture actually saves the raw mjpeg data from the cameras. This is the closest you can get to the original as the camera does not support uncompressed video.

user-abc667 24 February, 2020, 16:14:33

@papr OK, good to know. (I assume you meant "mpeg".) I understand it's variable frame rate; is there a time code with each frame? (Might be a naive question, I don't do lot of video hacking.) Tnx.

papr 24 February, 2020, 16:16:59

@user-abc667 No, I mean mjpeg (https://de.wikipedia.org/wiki/Motion_JPEG). This "video format" is just a stack of independent images without a sense for time. This is why we save the timestamps for each frame in the <video name>_timestamps.npy file. You can load the ts files with numpy.load(). Alternatively, you can find the description of the npy format here: https://numpy.org/devdocs/reference/generated/numpy.lib.format.html
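
For example, a minimal sketch of reading those timestamps (the file name follows the <video name>_timestamps.npy pattern for a video named "world"):

```python
# Sketch: load the per-frame timestamps saved next to a Pupil Capture video.
import numpy as np

ts = np.load("world_timestamps.npy")  # one timestamp in seconds per frame
print(f"{ts.size} frames spanning {ts[-1] - ts[0]:.2f} s")
```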

user-abc667 24 February, 2020, 16:18:02

@papr Great, will follow those pointers. Many thanks.

user-66251d 24 February, 2020, 18:36:39

@papr Hi all, I am recording to a phone (running Android 9 and set to save to SD card) and I seem to be having an issue: if the recording is anything close to an hour long, I don't appear to be able to play back the audio capture. The mp4 file is present and appears to be the correct size (approx. 24 MB for an hour) but will not open in any audio playback software, nor in Pupil Player (I am using the Pupil Core product and the microphone on the phone to record audio). I don't have this issue if I make short recordings.

user-b029af 25 February, 2020, 08:51:15

Hi, terribly sorry for a stupid question: which camera could be used for a DIY eye camera? According to pupil-labs.com it should be taken from the Microsoft HD-6000, but its PCB will not fit the Pupil Labs eye-camera size (10x45x7 mm).

user-d9a2e5 25 February, 2020, 14:14:10

I'm trying to get pupil size measurements that are not affected by light. This is why I used a black screen with a white cross as a fixation target. Can you please tell me whether I should make the cross black and white or use different colours?

papr 25 February, 2020, 14:43:10

@user-d9a2e5 Hi, I would recommend to have a look at other pupillometry studies and checkout best practices there. This paper would be a good place to start. https://www.researchgate.net/publication/335327223_Replicating_five_pupillometry_studies_of_Eckhard_Hess

As you can see, they try to avoid strong contrasts and large areas of different colors.

user-5054b6 25 February, 2020, 15:10:42

For IRB purposes, did the Core headset's infrared emissions ever get certified under IEC 62471 or something similar? I found a post saying it was being assessed, but I didn't see anything to the effect that it has been certified. Alternatively, is there anything I can point to that shows it's safe?

papr 25 February, 2020, 15:16:48

@user-5054b6 Please contact us via info@pupil-labs.com and my colleagues will follow up accordingly.

user-5054b6 25 February, 2020, 15:18:34

done - thanks!

user-d9a2e5 25 February, 2020, 20:57:39

@papr thanks!! , i will check it out 🙂

user-c828f5 25 February, 2020, 23:45:55

Hello, I was wondering if PL can share some information about the sensors used in Pupil Core (1 and 2). I'm looking for the following info: a) sensor size, b) focal length of the lens (fixed focal length in 2.0 and the range of focal lengths for 1.0), c) if some kind soul has calibrated the eye cameras, the intrinsic matrix for the same.

papr 26 February, 2020, 06:36:58

@user-c828f5 please contact info@pupil-labs.com in this regard. We will follow up with the corresponding information.

user-5529d6 26 February, 2020, 14:26:02

Hey there, I came here a week or two ago about a non-monotonic timestamp issue. Apparently it was fixed, or worked around in 1.22, however when I give it a try, the eye camera still crashes... In syslog I get a sequence of "correcting clock frequency" followed by

Feb 26 08:55:53 XX pupil_capture.desktop[11967]: *** Correcti
Feb 26 08:55:53 XX pupil_capture.desktop[11967]: eye0 - [ERROR] launchables.eye: Process Eye0 crashed with trace:
Feb 26 08:55:53 XX pupil_capture.desktop[11967]: Traceback (most recent call last):
Feb 26 08:55:53 XX pupil_capture.desktop[11967]: File "launchables/eye.py", line 542, in eye
Feb 26 08:55:53 XX pupil_capture.desktop[11967]: File "shared_modules/video_capture/uvc_backend.py", line 429, in recent_events
Feb 26 08:55:53 XX pupil_capture.desktop[11967]: AttributeError: 'UVC_Source' object has no attribute '_latest_ts'

papr 26 February, 2020, 14:27:08

@user-5529d6 Hey, yes, we will release a fix for this tomorrow.

papr 26 February, 2020, 14:28:11

I think you contacted info@pupil-labs.com in this regard, too, is that correct?

user-5529d6 26 February, 2020, 14:28:22

yes, I apologize for the redundancy I just wanted to bring it up here in case another user is encountering similar issues.

papr 26 February, 2020, 14:28:48

Don't worry. Your report was very helpful.

user-c828f5 26 February, 2020, 17:04:15

@papr Thanks!

user-2ad874 26 February, 2020, 19:04:00

@user-c5fb8b Thank you for getting back to me, I appreciate it. Could you explain what you mean by running Pupil Capture on the same network? How do I get that started? I really appreciate your help.

user-7b943c 26 February, 2020, 21:32:36

Does anyone know the range of values for the following data points? Also, how are the axes set up? I'm having a hard time visualizing it. Is there a way to display these values in Pupil Capture?

Chat image

user-c629df 26 February, 2020, 21:48:41

If I'm doing a psychology experiment with Pupil Labs, how can I correlate the trial numbers in my experiment with the timestamp in Pupil Lab's recordings?

user-c629df 26 February, 2020, 21:48:45

@papr

user-14d189 27 February, 2020, 03:08:26

@user-7b943c Check that link out. Your sample should be x == 258... to the left, y == 5 up, z == 463 into the space in front of the camera.

user-14d189 27 February, 2020, 03:08:29

https://docs.pupil-labs.com/core/terminology/#coordinate-system

user-14d189 27 February, 2020, 03:27:51

In Capture, the binocular vector gaze mapper visualizes the gaze; it is available after calibration. Hope that helps.

user-a98526 27 February, 2020, 09:42:28

Hi @papr, I want to know if there is any related paper or introduction about the gaze estimation algorithm of the Pupil Core.

marc 27 February, 2020, 13:40:00

@user-a98526 The pupil detection algorithm is described here: https://arxiv.org/pdf/1405.0006 (this paper is pretty old and most info is out of date, but the pupil detection algorithm has not fundamentally changed).

The 3D eye model we are using is based on this work: https://www.researchgate.net/profile/Lech_Swirski/publication/264658852_A_fully-automatic_temporal_approach_to_single_camera_glint-free_3D_eye_model_fitting/links/53ea3dbf0cf28f342f418dfe/A-fully-automatic-temporal-approach-to-single-camera-glint-free-3D-eye-model-fitting.pdf

The way we utilize this 3D eye model for gaze estimation is currently not documented in a dedicated paper.

papr 27 February, 2020, 14:10:07

@user-c629df Yes. The recommended workflow is to send annotations (https://docs.pupil-labs.com/core/software/pupil-capture/#annotations) remotely from the experiment software to Pupil Capture. https://docs.pupil-labs.com/developer/core/network-api/#remote-annotations

In your case the annotations would contain information about the current trial.
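
For illustration, a minimal sketch of sending such a trial annotation (assuming Pupil Remote on its default port and the Annotation plugin enabled in Capture; the label and trial number are made-up examples):

```python
# Sketch following the remote-annotation docs linked above.
import time
import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

# Annotations are published to the IPC backbone via the PUB port.
pupil_remote.send_string("PUB_PORT")
pub_port = pupil_remote.recv_string()
pub = ctx.socket(zmq.PUB)
pub.connect(f"tcp://127.0.0.1:{pub_port}")
time.sleep(1.0)  # give the PUB socket time to connect (slow-joiner problem)

# Ask Capture for its current clock so the annotation timestamp matches.
pupil_remote.send_string("t")
capture_time = float(pupil_remote.recv_string())

annotation = {
    "topic": "annotation.trial",  # topic must start with "annotation."
    "label": "trial 12 start",    # hypothetical label carrying the trial number
    "timestamp": capture_time,
    "duration": 0.0,
}
pub.send_string(annotation["topic"], flags=zmq.SNDMORE)
pub.send(msgpack.dumps(annotation, use_bin_type=True))
```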

user-274c94 27 February, 2020, 16:45:42

I have implemented a ROS node which interfaces with Pupil Capture using the network interface over zmq. It just receives all surface gaze messages and republishes them to the ROS environment. If I measure the number of gaze data per second in ROS, I get around 240 messages per second. The eye cameras are running at 120 Hz. The messages are binocular gaze data points, i.e. from both eyes (gaze.3d.01._on_surface). I would expect 120 messages per second; why do I get 240? Do I get the gaze mapping for each eye separately? I would have expected to get one gaze mapping using both eyes.

user-c629df 27 February, 2020, 19:05:36

@papr thank you so much!:)

user-7b943c 27 February, 2020, 19:23:35

@user-14d189 Thank you so much! Do you know if it is possible to display these coordinates live in the Pupil Capture software?

user-c72e0c 27 February, 2020, 21:57:56

I'm using MATLAB to remote control (record, calibrate, and send triggers to) the Pupil Capture software using the pupil middleman Python code and accompanying MATLAB code: https://github.com/mtaung/pupil_middleman

Using this code to send annotations to the Pupil Core does not work for current Pupil Capture versions. @papr I believe there was a change to annotations in Pupil Capture v1.9; can you give any guidance on updating the Python code to allow it to send annotations?

user-7b943c 27 February, 2020, 22:16:35

Also, my right eye camera displays a blurrier image than the left eye camera. Any suggestions on how to get it in focus, or should I send my system in to have that done, since my configuration is the 200 Hz no-focus eye camera?

user-2fd67a 27 February, 2020, 23:55:01

@user-7b943c have you focused your camera properly?

user-7b943c 27 February, 2020, 23:58:13

@user-2fd67a I am able to focus the world camera. I am scared to mess with the no-focus 200 Hz right eye camera though, because of the following documentation:

Chat image

user-2fd67a 27 February, 2020, 23:59:04

have you tried to clean the lenses with a soft fabric?

user-2fd67a 27 February, 2020, 23:59:33

maybe someone touched the camera and left a fingerprint as a gift for you

user-7b943c 28 February, 2020, 00:14:10

@user-2fd67a I just tried that and used an air duster in the area of the camera but nothing changed sadly.

user-14d189 28 February, 2020, 00:49:14

@user-7b943c I don't think you can display them in Capture. But imagine a coordinate system with (0, 0) at the horizontal and vertical center of the image; z would be the perpendicular through the center.
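
That said, the values can be inspected live outside the UI by subscribing to gaze data over the network API; a minimal sketch, assuming Pupil Remote's default port:

```python
# Sketch: print gaze_point_3d live from Pupil Capture's network API.
import zmq
import msgpack

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.subscribe("gaze.")  # all gaze topics

while True:
    topic, payload = sub.recv_multipart()
    gaze = msgpack.loads(payload, raw=False)
    if "gaze_point_3d" in gaze:  # only present for 3d gaze mapping
        x, y, z = gaze["gaze_point_3d"]
        print(f"x={x:7.1f}  y={y:7.1f}  z={z:7.1f} (mm)")
```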

user-14d189 28 February, 2020, 00:50:41

Regarding the different sharpness of the right and left images: try changing the distance of the blurrier camera to the eye. See if that changes something; otherwise get advice from Pupil Labs.

user-14d189 28 February, 2020, 00:51:23

Check if the resolution setting is the same.

wrp 28 February, 2020, 02:04:12

@user-7b943c @user-2fd67a you should not attempt to focus the 200Hz eye cameras. This will result in breaking the lens mount, as documented. @user-7b943c I think I just responded to you via email - from the images you sent, it seems that you might be capturing the eye through eyeglass lenses, which could explain the "blurriness" of one of the images.

user-14d189 28 February, 2020, 02:19:29

@wrp is it possible to export the results from the head pose tracker?

wrp 28 February, 2020, 02:57:41

@user-894365 when you are online can you please respond to @user-14d189's question ☝️ ?

papr 28 February, 2020, 06:26:58

@user-14d189 yes, that should work just fine if you hit the export button in Player while the head pose tracker is active and has finished its processing.

user-14d189 28 February, 2020, 06:47:03

@papr @wrp That was easy! Thank you. What units are the translation x, y, z in?

papr 28 February, 2020, 08:02:22

@user-14d189 The unit is one tag size, i.e. the exported translation is expressed in multiples of the apriltag's physical side length; multiply by that side length to convert to metric units.

user-d20f83 28 February, 2020, 08:40:56

Hi guys, I'm trying to connect Pupil Mobile with the latest Pupil Capture 1.22, but the NDSI manager (Backend Manager > Manager > Pupil Mobile) is gone. Do you know where it moved? Thanks

papr 28 February, 2020, 09:06:36

@user-d20f83 it is always running. Your devices should appear in the Video Source menu.

user-d20f83 28 February, 2020, 10:40:03

@papr Great! We got it.

user-b2de72 29 February, 2020, 20:48:09

I'm trying to connect Pupil Remote to MATLAB so I can export the gaze-on-surface data. I'm using Matlab_Python from the pupil-helpers-matlab git. I'm using the same machine for both Pupil and MATLAB, so I use the 127.0.0.1 address. I get this error:

$ python pythonPupil.py
Timesync successful.
0
Traceback (most recent call last):
  File "pythonPupil.py", line 197, in <module>
    my_socket.send(uMsg)
ConnectionRefusedError: [Errno 61] Connection refused

user-b2de72 29 February, 2020, 20:48:40

Any idea how to connect Pupil to MATLAB so I can either export gaze on surface to MATLAB, or send events from MATLAB to Pupil? Edit: I'm able to use pupil_remote_control.py to start and stop recording, and remote_annotations.py to send custom annotations, so that part works.

End of February archive