core


user-94ac2a 01 February, 2019, 01:20:34

Does the Unity market example come with any interaction? Or does it just display the point you are looking at?

wrp 01 February, 2019, 05:47:59

@user-94ac2a IIRC it is just about displaying the gaze, no interaction.

wrp 01 February, 2019, 05:48:45

@user-a08c80 you are looking to export a video heat map and not a static heatmap (e.g. like what is displayed when you play back the video in Pupil Player), correct? This is not currently supported, but could be considered for future development.

wrp 01 February, 2019, 05:50:10

@user-a39804 streaming HD video over the internet in real time (low latency) is a non-trivial problem (and at some point it is just about the limits of the physical infrastructure of the internet). Latency is to be expected. Over local WiFi (with the appropriate hardware) latency is less of an issue.

user-a39804 01 February, 2019, 05:53:10

I agree though. We need to account for the latency. Thanks

user-84ae77 01 February, 2019, 17:56:58

Hi there: I am currently writing a saccade detector in Matlab, but I noticed there was a saccade detector plugin in the Pupil GitHub source code. I loaded the plugin into Player, but how do I export the information it gathers? I exported the normal way but I didn't see any new data or data files.

user-2be752 01 February, 2019, 19:21:15

Hi, what is the exact difference between pupil coordinates and gaze coordinates?

user-94ac2a 01 February, 2019, 22:33:30

Does the tilt angle of the DIY eye cameras have to be the same for both the left and right eye cameras? Can one of the cameras be tilted a little differently, as long as both cover the entire eye?

user-92cca2 01 February, 2019, 22:51:54

does anyone know if Pupil Labs does demos or is viewable before purchase in the U.S.?

wrp 01 February, 2019, 23:50:50

@user-84ae77 there is a fixation detector, but no saccade detector/classifier in pupil src code

wrp 01 February, 2019, 23:52:08

@user-2be752 pupil coordinates are positions of the pupil within the eye camera image. Gaze coordinates are what the participant is looking at in their field of view (pupil data mapped into the world)

wrp 01 February, 2019, 23:53:20

@user-94ac2a positions and angles of eye cameras are adjustable by design; during calibration the relationship of all cameras in the system is computed

wrp 01 February, 2019, 23:55:39

we can do a demo via video chat/screen sharing or offer a return period within 30 days after purchase. Get in touch via email if you'd like to discuss further @user-92cca2

user-a39804 02 February, 2019, 02:18:24

Hi guys... I can't get the remote aspect of the eye tracker to work. It's simply not connecting or let me say I don't know where to start

user-a39804 02 February, 2019, 02:28:01

I keep getting "could not bind to socket... Address not found"

wrp 02 February, 2019, 02:50:36

How are you trying to connect?

wrp 02 February, 2019, 02:51:15

If remote please ensure the IP and port match what is shown in Pupil Capture's pupil remote plug-in GUI
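
As a reference, a minimal connectivity check over Pupil Remote might look like the sketch below. The IP and port are placeholders (50020 is the usual default) and must match what Pupil Capture's Pupil Remote plugin GUI shows; pyzmq is a third-party dependency.

```python
def remote_endpoint(ip: str, port: int) -> str:
    """Build the tcp endpoint string for Pupil Remote."""
    return f"tcp://{ip}:{port}"

if __name__ == "__main__":
    import zmq  # third-party: pip install pyzmq

    ctx = zmq.Context()
    req = ctx.socket(zmq.REQ)
    # Placeholder address -- must match the Pupil Remote plugin GUI.
    req.connect(remote_endpoint("127.0.0.1", 50020))
    req.send_string("t")  # 't' asks Pupil Remote for the current Pupil time
    print(req.recv_string())
```

If this request hangs or errors, the IP/port pair is wrong or a firewall is blocking the connection.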

user-a39804 02 February, 2019, 02:51:59

So I am using the hololens addon

user-a39804 02 February, 2019, 02:52:06

Will that change anything?

user-a39804 02 February, 2019, 02:52:30

I'm trying to stream World camera output to a different PC and unity

user-94ac2a 02 February, 2019, 02:53:42

How can you change the device names of two cameras? I have two cameras with the same device name.

user-a39804 02 February, 2019, 03:05:18

@wrp your hint worked. Thanks man

user-a39804 02 February, 2019, 03:05:33

I have another question. How can I stream the world camera remotely?

papr 02 February, 2019, 09:58:48

@user-a39804 using the frame publisher plugin and this example script https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py
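
The linked helper follows roughly this pattern: ask Pupil Remote for the SUB port, subscribe to the `frame.world` topic, and unpack each multipart message. This is a sketch under the assumption that the Frame Publisher plugin is enabled and Pupil Remote listens on the default port 50020; pyzmq and msgpack are third-party dependencies.

```python
def frame_topic(camera: str) -> str:
    """Topic to subscribe to, e.g. 'frame.world' or 'frame.eye.0'."""
    return f"frame.{camera}"

def receive_world_frames(ip="127.0.0.1", port=50020):
    """Yield (metadata, raw_frame_bytes) published by the Frame Publisher."""
    import zmq, msgpack  # third-party; deferred so the module imports cleanly

    ctx = zmq.Context()
    req = ctx.socket(zmq.REQ)
    req.connect(f"tcp://{ip}:{port}")
    req.send_string("SUB_PORT")    # ask Pupil Remote for the SUB port
    sub_port = req.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{ip}:{sub_port}")
    sub.subscribe(frame_topic("world"))

    while True:
        topic, payload, frame = sub.recv_multipart()
        # payload is msgpack-encoded metadata (width, height, format, ...)
        yield msgpack.unpackb(payload, raw=False), frame
```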

papr 02 February, 2019, 10:00:39

@user-94ac2a The names are assigned at the UVC/firmware level. You might need to change the code and differentiate the cameras by another low-level attribute.

user-e91538 02 February, 2019, 10:44:24

What work has been done with depth of focus detection?

user-e91538 02 February, 2019, 14:48:21

Hi everyone! I am working on a college project using the pupil glasses and a Raspberry Pi - I know that it will be difficult to run as the Raspberry Pi will be slow, but I do not need to display anything; I just need to know the coordinates of the user's gaze in relation to the world camera and feed them to another device - does anyone know if this is possible?

user-96755f 02 February, 2019, 17:37:26

Hello, I'm currently using 2d mapping. How can I convert pupil diameter from pixels to mm? Or should I switch to 3d mapping?

user-a39804 02 February, 2019, 17:40:16

@papr thanks. I will try it now

user-94ac2a 02 February, 2019, 22:45:08

I can open the eye0 or eye1 camera separately. However, when opening them together, they get stuck with each other. What is the reason?

user-94ac2a 02 February, 2019, 22:52:53

When I have one camera plugged in, trying to open eye0 will make eye0 stream video. But when I open eye1, eye1 will replace the eye0 video.

wrp 03 February, 2019, 00:55:55

@user-96755f if you need diameter in mm you need to use 3d mode

mpk 03 February, 2019, 01:02:14

@user-94ac2a please restart pupil capture with default settings. It should fix your issue.

user-94ac2a 03 February, 2019, 05:00:40

@mpk thanks

user-e91538 03 February, 2019, 06:12:02

What work has been done with depth of focus detection?

papr 03 February, 2019, 10:50:47

@user-e91538 I would recommend to check our citation list https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?ts=576a3b27#gid=0

user-84ae77 04 February, 2019, 09:38:35

In pupil_positions.csv there is the column diameter_3d - diameter of the pupil scaled to mm based on the anthropomorphic average eyeball diameter and corrected for perspective. With a binocular setup, which eye is this pertaining to?

papr 04 February, 2019, 09:53:26

@user-84ae77 Hi, there is a second column eye_id. It contains values of 0 or 1 which indicate right and left eyes.
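
For instance, a small stdlib-only sketch that groups the exported diameter_3d values by eye_id (the file path and column names follow the export format described above):

```python
import csv
from collections import defaultdict

def diameters_by_eye(path="pupil_positions.csv"):
    """Group diameter_3d values by eye_id (0 = right eye, 1 = left eye)."""
    out = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            # Rows without a diameter_3d value (e.g. 2d-only data) are skipped.
            if row.get("diameter_3d"):
                out[row["eye_id"]].append(float(row["diameter_3d"]))
    return dict(out)
```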

user-84ae77 04 February, 2019, 09:54:18

I just realized that. Thank you for the quick response!

user-96755f 04 February, 2019, 09:56:37

@wrp thanks!

user-41c874 04 February, 2019, 13:54:42

Hey. Wanted to confirm whether this positioning of the headset is fine. I am attaching front and top views along with eye camera pictures. If you think the focus of the eye-camera images isn't perfect, how do you suggest improving it? Thanks!

Chat image

user-41c874 04 February, 2019, 13:54:43

Chat image

user-41c874 04 February, 2019, 13:54:50

Chat image

user-41c874 04 February, 2019, 13:56:29

And could you potentially send images of how to ideally place the headset?

papr 04 February, 2019, 13:58:49

@user-41c874 Generally, the positioning looks ok. I would adjust the eye cameras such that they point slightly lower and the eye is centered in the eye camera image. Additionally, I would try the orange arm extender for the eye1 camera.

user-41c874 04 February, 2019, 15:17:13

Okay. I will try to optimize this. Thanks!

user-89ff35 04 February, 2019, 22:04:26

Hey guys, pretty new around here. I'm a total stranger to eye-tracking technology, but earlier today I had an idea for a fun project that could potentially make use of it, so I've been looking around at open-source options in case I decide to actually go through with the project.

papr 04 February, 2019, 22:35:36

@user-89ff35 hi, welcome to the channel. Feel free to share your idea if you want direct feedback. Alternatively, in case you are not ready to share details yet, you can do some research on your own. Eye tracking as a research field is already very old and there is a lot of research in regard to potential applications. I would recommend you to search for papers investigating your field of application. It will show you the limits of the technology and if your idea is feasible.

user-89ff35 04 February, 2019, 22:52:17

Hey, thanks for your response. I'm currently still investigating the state of what's out there, but I don't mind sharing a bit about my idea.

I'm a very big fan of over-optimizing my workflow as a programmer. I choose software that relies on keybinds rather than mouse clicks, and I use a programmable keyboard that lets me type without moving my fingers very far from the home row. I put a lot of effort into being lazy.

With that in mind, I think (and I'm sure I'm not the first) that eye trackers have a lot of potential as an input device just like a mouse or keyboard. I don't think they should pretend to be a mouse, but rather be their own thing. I've been considering writing a simple process that can read in the raw data from an eye tracker, do some gesture detection, then output it. I would then work on small modular tools that take this data and act on it. As an example, I could replace my status bar on a Linux machine with an applet that only shows itself when I look at the corner of my screen, hiding otherwise.

user-8b1528 05 February, 2019, 13:47:28

@user-89ff35 If I'm not mistaken, Pupil Labs already has software that takes the eye tracking data and converts it to mouse pointer movement... so looking at a screen corner will put the mouse arrow at that corner. If your menu is activated when the mouse comes close (like Ubuntu's auto-hide menu works), it should just work as is without having to add more code...

user-8b1528 05 February, 2019, 13:57:14

You just need to add the logic for clicking... Maybe you could use your programmable keyboard and bind a key to the mouse left/right button, or have an old mouse with the rolling ball removed, or a new mouse with black tape over the sensor, so the mouse will only register button clicks and not movement...

user-8b1528 05 February, 2019, 13:59:05

Like @wrp mentioned in a previous post, there is a link to a script for that: https://github.com/pupil-labs/pupil-helpers/blob/master/python/mouse_control.py
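
The core of such a script is mapping Pupil's normalized gaze position to screen pixels. A minimal sketch (screen size values are placeholders; the y flip reflects that Pupil's norm_pos puts (0, 0) at the bottom-left while GUI toolkits put it at the top-left):

```python
def gaze_to_screen(norm_x, norm_y, screen_w, screen_h):
    """Map a normalized gaze position to screen pixel coordinates.

    Values are clamped to [0, 1] first, because gaze can fall outside
    the world camera's field of view and exceed the nominal bounds.
    """
    x = min(max(norm_x, 0.0), 1.0) * screen_w
    y = (1.0 - min(max(norm_y, 0.0), 1.0)) * screen_h  # flip y axis
    return int(x), int(y)
```

The resulting pixel pair could then be handed to any mouse-control library.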

user-8b1528 05 February, 2019, 14:03:08

The only annoying point is that since the mouse follows where your eyes look, when you type text the mouse pointer will be right above that text... unless you can make the pointer a tiny dot or invisible.

Just giving you my 2 cents 😉

user-89ff35 05 February, 2019, 19:05:48

Thanks for your feedback! My concept is just a little different, as I envision eye trackers as a device separate from the mouse. To explain my intuition behind this, I'll share a couple assumptions I've made.

First, I make the assumption that eye movements are fast and rough, as opposed to the slower, more precise motions of a mouse pointer. In common user interfaces, buttons can be as small as 10 pixels across, and actions taken with the mouse tend to be much slower than actions taken with the keyboard.

Second, I assume that as eye-tracking devices become cheaper (e.g. commercial webcam methods), they become less precise. An accessible approach to eye movement as an input should be able to work even if the device cannot accurately point to individual pixels like a mouse can.

user-89ff35 05 February, 2019, 19:05:51

Third, humans use their eyes for more than just input. Since eyes already have tasks lined up for them in a standard workflow, any new tasks for them should be able to be completed quickly, in 1-2 motions at most. As an example, consider the reading of a EULA, a task that takes place often on a computer. For most users, the sequence would be to hover the mouse cursor over the 'accept' button while they read the text, then click. These actions can be completed in parallel. With an eye-tracker acting as a mouse, these tasks go from being parallel to being sequential, resulting in a slowdown for the user. In addition, one would need to provide some criteria to classify eye motions as either deliberate input or normal motion. This is no small task, requiring either burdening the user with the requirement to signal inputs using the keyboard, or using some logical function, perhaps using ML, to classify a sequence of movements. That approach brings with it the problem of inaccuracies, which would become frustrating very quickly.

That's my reasoning for believing that an eye-tracking input device should not necessarily try to make use of the same interfaces in place for mouse devices, but should rather have their own set of interfaces designed for their unique limitations and advantages. I'm sorry if I'm coming across as a little verbose or misguided here, but I do quite enjoy theorycrafting around these kinds of problems.

wrp 06 February, 2019, 02:54:43

Hi @user-89ff35 you might want to check out Tristian Hume's work on eye tracking + mouse control (and/or connect with him). Here is a link to his blog post: http://thume.ca/2017/11/10/eye-tracking-mouse-control-ideas/ (which also links to code)

user-e189f0 06 February, 2019, 10:11:18

Hey, I am now working with the HTC Vive add-on from Pupil Labs. When I try to get the blink data by using the "Blink Detection" plugin with the confidence threshold set to 0.45, the received data is not paired. In the screenshot, the first list is the timestamp, the second list is the confidence and the third list is the blink type. As you can see in the picture, I got several continuous onsets with no offset between them. I have tried using different confidence thresholds, but the same problem still exists. Does anyone have an idea about this problem? Any help will be appreciated 😃

Chat image

papr 06 February, 2019, 10:38:22

@user-e189f0 https://github.com/pupil-labs/pupil/issues/1285

user-e189f0 06 February, 2019, 13:36:28

@papr thanks for your help. According to the link you gave, I may need to use saccade offset events. However, when I check the source code of "shared modules", I find that it hasn't been implemented. From this post, it seems you have removed this feature from the Pupil Labs app.

wrp 06 February, 2019, 13:45:04

@user-e189f0 there was never a saccade classifier in the source code or bundled app. The issue is still open.

user-6ec304 06 February, 2019, 20:14:01

hey everyone, quick question - Pupil's user docs list a few different calibration procedures that differ depending upon distance. More specifically: the manual marker and natural features calibration procedures are suited for "midrange distances" and "far distances", respectively. Anyone have an idea what those might be in terms of a unit of measurement, or why those calibration procedures were chosen for midrange and far distances?

user-4a4def 06 February, 2019, 20:21:21

hi everyone, I am trying to connect Pupil Mobile to my laptop so I can hopefully just use an android instead of a computer. I think the main problem is that I can't get my phone and laptop on the same network so they can sync up (even though they are on the same wifi). Do I need a special plug in? I am under the impression if they are on the same network, they should connect. Anyone with any general input would be great!

wrp 06 February, 2019, 23:53:13

@user-4a4def are there firewall settings on your computer and/or restrictions on your WiFi network?

wrp 07 February, 2019, 10:36:43

@user-6ec304 this part of the docs should be updated. I think the key takeaway here is that a manual marker would more easily enable a participant to calibrate in the field at a variety of depths (and cover a large area of the participant's FOV). The animated screen based method can also be used at a variety of depths, but would require larger screens (e.g. projections) at greater distances in order for the world camera to actually "see" the markers and cover a large area of the participant's FOV.

user-a04957 07 February, 2019, 16:59:44

Hello, I have a question: I want to do pupil size evaluation on an uncalibrated measurement. Should I use "diameter" or "diameter_3d"? Is "diameter_3d" even valid without calibration?

Furthermore: do I benefit from blink detection if I discard all values with confidence < 0.6?

Thank you! Best Tim

papr 07 February, 2019, 17:01:07

@user-a04957 3d diameter is valid without calibration, yes. Make sure that the 3d eye model (green outline in eye windows) fits well.

papr 07 February, 2019, 17:01:34

Could you elaborate on your blink detection question?

user-a04957 07 February, 2019, 17:04:18

@papr Thanks! "3d diameter is valid without calibration, yes. Make sure that the 3d eye model (green outline in eye windows) fits well." -> I only have export .csvs, is there a chance to discard not well fitted data by any threshold or value from the exports?

"Could you elaborate on your blink detection question?" -> I mean the blink detection discards sharp drops in the confidence levels and marks beginning and ends of this drops. however if i discard all data with confidence <0.6, than effectively all blinks are discarded as well?

papr 07 February, 2019, 17:07:25

@user-a04957 I would basically discard all low confidence values for the diameter evaluation and calculate the blinks separately.
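
That two-step recipe (drop low-confidence datums, then mask blink windows separately) could be sketched like this; the dict keys mirror the exported csv columns, and the 0.6 threshold is the value discussed above:

```python
def filter_by_confidence(samples, threshold=0.6):
    """Keep only pupil datums at or above the confidence threshold.

    `samples` is an iterable of dicts with at least 'confidence' and
    'timestamp' keys, as in exported pupil_positions.csv rows.
    """
    return [s for s in samples if float(s["confidence"]) >= threshold]

def drop_blink_windows(samples, blinks):
    """Remove datums whose timestamp falls in any (start, end) blink window."""
    return [
        s for s in samples
        if not any(b0 <= float(s["timestamp"]) <= b1 for b0, b1 in blinks)
    ]
```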

user-a04957 07 February, 2019, 17:29:40

@papr Thanks again! So I am discarding all confidence < 0.6, discarding the blink periods, and taking the 3d diameter.

user-a04957 07 February, 2019, 17:37:23

If anybody wants to know: these are the 3d diameters of a recording where all confidence < 0.6 is discarded. I see: pupil sizes have an offset. However, changes seem to be constant. => Is there a "best practice" (e.g. averaging the values) to combine the eyes into one ("more accurate") value? Or how to deal with these inconsistencies?

Thanks!

Chat image

user-4a4def 07 February, 2019, 17:52:57

@wrp there are no firewall settings on the laptop but we are operating through a university secured wifi where first time users must first download a one-time installation file to access the wifi, and then log in with credentials. However, both of our devices are already on that wifi, but I am wondering if that might be causing an issue

wrp 08 February, 2019, 10:46:42

@user-4a4def university wifi could certainly be a cause of access between Pupil Mobile --> Pupil Capture

user-41c874 08 February, 2019, 10:56:27

Hey. I can play both pupil camera recordings offline in VLC player, but when I play them in Pupil Player, it seems one of the eye cameras never got recorded. When I try to re-detect pupils offline in Pupil Player, I can see both pupils moving during that process, which means both eyes were recorded. Detection was always in 3d++ detector mode (for both eyes). Any idea why that would happen?

papr 08 February, 2019, 11:05:04

@user-41c874 how does the issue show? What do you mean by "it seems one of the eye cameras never got recorded"?

user-41c874 08 February, 2019, 11:21:44

I use the eye video overlay plugin in Pupil Player. It just displays the first frame of the video and I don't see any movement for one of the eyes. The other eye video works perfectly fine.

papr 08 February, 2019, 11:26:34

@user-41c874 I see. Can you make a Screenshot of all files within the recording? And which version of Pupil Player do you use?

user-41c874 08 February, 2019, 12:18:14

I'm using version 1.10.20 of Pupil Capture and Player

Chat image

user-41c874 08 February, 2019, 12:18:34

Is this what you're asking for?

Chat image

papr 08 February, 2019, 12:19:48

That's it. Could you please share the *_timestamps.npy files with [email removed]

user-41c874 08 February, 2019, 12:22:29

Yes. I am sending. Thanks !

papr 08 February, 2019, 12:48:19

@user-41c874 As expected, your eye timestamps are for some reason offset. It looks like the timestamps were recorded at a different time. This is likely a synchronization issue.
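
A quick way to check a recording for this kind of offset yourself is to compare the time ranges of the world and eye timestamp files. A sketch, assuming numpy is installed and the files are the usual world_timestamps.npy / eye0_timestamps.npy inside the recording folder:

```python
import numpy as np

def timestamp_overlap(world_ts_path, eye_ts_path):
    """Report world/eye timestamp ranges to spot synchronization offsets."""
    world = np.load(world_ts_path)
    eye = np.load(eye_ts_path)
    overlap = min(world[-1], eye[-1]) - max(world[0], eye[0])
    return {
        "world_range": (float(world[0]), float(world[-1])),
        "eye_range": (float(eye[0]), float(eye[-1])),
        "overlap_seconds": float(overlap),  # negative => ranges do not overlap
    }
```

A small or negative overlap relative to the recording length indicates the eye timestamps were recorded at a different time than the world timestamps.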

user-64b0d2 08 February, 2019, 12:52:18

Hello everyone, I need to extend the USB cable that connects the headset to the computer. I tried a normal USB to USB extension cable, but then pupil capture could not connect to the eye cameras or world camera and started in ghost mode. Is there a way to extend the USB cable? Do you need a special cable for this?

papr 08 February, 2019, 12:53:43

@user-41c874 I could try to re-adjust the timestamps but this adjustment is not guaranteed to be the original timestamps

papr 08 February, 2019, 12:54:08

@user-64b0d2 try an "active" extension cable

user-41c874 08 February, 2019, 13:01:21

Do you know why this synchronization issue could happen?

user-41c874 08 February, 2019, 13:03:11

I am currently just trying to optimize and set up the eye tracker, so this isn't an important experiment. This was just one of the trial-and-error runs. Any idea how this synchronization problem came about? And what could I do to prevent this issue?

papr 08 February, 2019, 13:04:17

@user-41c874 I am not sure. :/

papr 08 February, 2019, 13:05:06

Do all recordings have this issue?

user-41c874 08 February, 2019, 13:05:12

Actually, I just remembered that when I recorded these files, Pupil Capture automatically stored them in the previous day's folder.

user-41c874 08 February, 2019, 13:05:38

All the recordings made on that specific day only.

user-41c874 08 February, 2019, 13:07:07

Maybe this could be somehow related?

papr 08 February, 2019, 13:09:24

This seems weird indeed.

user-4a4def 08 February, 2019, 14:13:08

@wrp there's a general "attwifi" that we could get onto that might work. The connection between the two should be automatic, correct? Granted we are on the same (more general) wifi?

wrp 08 February, 2019, 14:29:23

@user-4a4def this might work, but a public network like this would likely have high latency

wrp 08 February, 2019, 14:30:03

Best choice (if possible) is to use a dedicated router to set up a local wifi network (internet connection not needed)

user-1582b2 08 February, 2019, 18:47:27

Hi guys, when subscribing to the gaze datum I can access norm_pos x & y, which is the normalized position in the world image frame - [0,0] is bottom left and [1,1] is top right - my question is: what would the angles be from center, across the 0 to 1 range? And what is the granularity of these coordinates?

user-1582b2 08 February, 2019, 18:47:35

@wrp

user-1582b2 08 February, 2019, 18:47:38

Thanks

papr 08 February, 2019, 18:53:39

@user-1582b2 what do you mean by angles? The granularity is float64 precision. Be aware that the gaze values can exceed these bounds if the subject looks at a point outside of the world camera's field of view.

user-1582b2 08 February, 2019, 19:24:45

@papr If the normalized bound on the right side is 1 - how far right is this compared to the center ie 0.5? if that makes sense

papr 08 February, 2019, 19:26:24

That depends on the lens that you use.

papr 08 February, 2019, 19:27:38

If you use a wide-angle lens, the distance in angles is larger than if you use a narrow lens.
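
As a rough estimate (ignoring lens distortion, which is significant for wide-angle lenses), a simple pinhole model relates a normalized x offset from the image center to a visual angle given the camera's horizontal field of view:

```python
import math

def norm_offset_to_angle(norm_x, fov_deg):
    """Rough visual angle (degrees) of a normalized x position from center.

    Assumes an undistorted pinhole camera with horizontal field of view
    `fov_deg`; real wide-angle lenses distort, so treat this as an estimate.
    """
    # Focal length expressed in units of half the image width:
    f = 0.5 / math.tan(math.radians(fov_deg) / 2.0)
    dx = norm_x - 0.5  # offset from image center in the same units
    return math.degrees(math.atan2(dx, f))
```

For example, with a 90-degree lens the right image edge (norm_x = 1.0) sits 45 degrees from center; with a narrower lens the same normalized offset corresponds to a smaller angle.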

user-1582b2 08 February, 2019, 19:30:16

@papr this depends on the lens of the world camera? What if I use the Pupil Labs high-speed camera as advertised? I could find an estimate of the angles myself, but it would be great to have an accurate measurement for my project

user-1582b2 08 February, 2019, 19:43:27

Ah i see it ships with both 60 and 100 degree lenses

user-1582b2 08 February, 2019, 19:43:30

thank u

user-4c85cf 08 February, 2019, 23:21:32

Hi! I’ve been looking at the blink detection code and saw that for offline blink detection, the filter length is calculated by 2 * round(len(all_pp) * self.history_length / total_time / 2.0). This seems to give a time window of around 200ms, which seems reasonable considering that a blink is at least 100ms. Am I understanding this correctly? If so, why was this particular equation used? Thanks!

wrp 08 February, 2019, 23:26:48

@marc please respond to @user-4c85cf's question when you are online.

user-07d4db 10 February, 2019, 15:29:47

Hello 😃 I've got a problem: when I open the Pupil Capture application on my computer, only the window for the world camera opens and not the eye camera. The eye tracker is connected to the computer. Does anybody have an idea what I can do in order to see the eye camera again?

user-8be7cd 10 February, 2019, 16:41:30

Does turning the eye camera back on from the UI not work?

user-07d4db 10 February, 2019, 19:39:21

I solved the problem thank you!

user-07d4db 10 February, 2019, 19:41:15

Just one further question: when I now do the calibration, it says afterwards "not enough reference points available for calibration". What shall I do then?

papr 10 February, 2019, 19:42:20

Do you see the pupil detection visualization in the eye windows? (red circle around pupil, green circle around eye ball (3d mode only))

user-07d4db 10 February, 2019, 19:46:00

do you mean the "pupil detector"? Yes I see it! But I have adjust just the green circle between the two red ones. It is not possible to fix it to my pupil/eye ball

papr 10 February, 2019, 19:47:06

try to make circular motions with your eye. this will help fit the 3d eye model

user-07d4db 10 February, 2019, 19:49:43

thank you for your advice! I tried that, but still it says that it cannot find any reference points!

user-07d4db 10 February, 2019, 19:50:31

I never had this message before. Do I maybe have to deactivate the "s" button on the left-hand side?

papr 10 February, 2019, 19:51:07

Are you using the screen-based calibration?

papr 10 February, 2019, 19:51:33

Make sure that the markers are visible in the scene camera

papr 10 February, 2019, 19:52:37

If this does not help please do the following 1. Start a recording 2. Attempt a calibration 3. Stop the recording after the calibration failed 4. Share the recording with data@pupil-labs.com

user-07d4db 10 February, 2019, 19:53:58

thank you very much! I will try that out

marc 11 February, 2019, 08:53:55

@user-4c85cf Yes, your understanding is correct. The length of the time window is determined by history_length, which is a user-settable parameter. The default value is 0.2 seconds == 200 ms, as you describe. The equation translates the duration of the window in seconds into a length in number of pupil datums, which differs if the eye video was recorded at a different FPS.
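
Isolated from the plugin, the conversion quoted above can be sketched as a standalone function (the even rounding mirrors the `2 * round(... / 2.0)` form in the source):

```python
def filter_length(n_pupil_datums, total_time, history_length=0.2):
    """Number of pupil datums spanning `history_length` seconds,
    rounded to the nearest even integer, mirroring the offline blink
    detector's 2 * round(len(all_pp) * history_length / total_time / 2.0).
    """
    return int(2 * round(n_pupil_datums * history_length / total_time / 2.0))
```

So a 10 s eye video at 120 FPS (1200 datums) yields a 24-sample window, while the same duration at 30 FPS yields only 6 samples for the same 200 ms.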

user-51ccc9 11 February, 2019, 13:30:39

Hi all,

user-51ccc9 11 February, 2019, 13:32:11

Not necessarily related to pupil, but does anyone know if it's possible to convert Tobii Pro Glasses 2 data to a format which is suitable for analysis in Ogama (or any other open-source analysis software)

user-51ccc9 11 February, 2019, 13:32:51

I have some recordings but I seem to be stuck with the Tobii analysis software, which is very annoying

user-51ccc9 11 February, 2019, 13:33:25

Hope this question doesn't violate the rules of this server

papr 11 February, 2019, 13:34:02

Did you check whether they have the possibility to export data to csv? Don't worry, such questions are allowed. I am just afraid that there won't be many users able to help you with it. 🙂

user-51ccc9 11 February, 2019, 13:36:17

Looks like it may be possible within the Tobii software... Don't have a license for that, sadly

user-51ccc9 11 February, 2019, 13:37:07

Previously found a Tobii data conversion tool, but doesn't seem to be compatible with the specific format I have

user-51ccc9 11 February, 2019, 13:38:02

Very annoying! But thanks, I guess conversion to .csv is possible using a PC at my uni with a license for the software

user-51ccc9 11 February, 2019, 13:43:13

I only did a short read about the Pupil project, but I sure wish I had been using it... Things not being open-source is so limiting

user-51ccc9 11 February, 2019, 18:09:17

Just wondering, is there any plugin already available for ICA workload (http://www.eyetracking.com/Software/Cognitive-Workload), Index of Pupillary Activity (https://dl.acm.org/citation.cfm?id=3173856) or related measures?

wrp 12 February, 2019, 03:34:25

@user-51ccc9 there is no official plugin from Pupil Labs for cognitive load analysis. Maybe someone else in the community has something?

user-41c874 12 February, 2019, 15:32:02

Hello. So, we display 4 surface markers on the screen, but once in a while one of the markers gets obstructed by the subject's hand movements and then the surface changes. Is there a way to keep the surface stable even if only 3 of them are visible?

user-41c874 12 February, 2019, 15:33:06

Also, what's the best configuration for world camera to function in low lighting conditions?

user-41c874 12 February, 2019, 15:40:16

Could you please look into why the desynchronization between timestamps occurred so that we can make sure this doesn't occur again? (We discussed this last week on Friday.)

papr 12 February, 2019, 15:48:35

@user-41c874 the surface should stay fairly stable even if a marker is obstructed.

papr 12 February, 2019, 15:49:30

There is no way to tell where the synchronization issue comes from. I cannot reproduce it locally. Unless you are able to reproduce it, there is nothing I can do, unfortunately.

user-41c874 12 February, 2019, 15:56:38

Unfortunately the surface changes slightly when one marker is obstructed. Do you think there are settings which modulate this?

papr 12 February, 2019, 16:06:53

No, this is to be expected since there are fewer markers to support the surface. 😕

papr 12 February, 2019, 16:07:14

You could try to add more markers.

user-5821fa 12 February, 2019, 17:18:28

Hi everyone. I'm having difficulties with the new software update. When I opened it, the eye 0 image was flipped upside down but eye 1 was right side up. I know that I can flip the eye 0 image in the settings so that it is right side up. However, I was wondering whether this flipping could be the source of the poor tracking quality that I am facing even after calibration, even though the eye 0 image is now right side up.

user-4878ad 12 February, 2019, 17:19:42

When I open Pupil Player, there is normally a grey window that pops up so that I can drag the recording into it, but it doesn't pop up anymore, so I am not sure how to play my recording...

Chat image

user-41c874 12 February, 2019, 17:27:55

Ah! That's a good idea. I'll add more markers and see if that helps.

user-41c874 12 February, 2019, 17:28:03

Thanks . 😃

wrp 12 February, 2019, 23:02:57

@user-4878ad please delete the pupil_player_settings folder (this is found in your user folder). I think what might have happened is that the window for Player is offscreen. Deleting the settings folder will reset to default settings and ensure that the window is no longer out of screen bounds. Let me know if this works.

wrp 12 February, 2019, 23:03:53

@user-5821fa flipping the image will not affect pupil detection robustness.

user-4ef892 13 February, 2019, 04:42:21

Hi guys, I am new to gaze data analysis and I just completed my first couple of eye tracking sessions. Can anyone guide me on how to calculate the vertical and horizontal angles (of gaze) using the exported csv file? Sorry for the noob question.

user-14d189 13 February, 2019, 07:00:53

@user-51ccc9 ICA is interesting but has many underlying variables ... pupil reflex due to accommodation/vergence , light, emotional state, cognitive workload.

user-14d189 13 February, 2019, 07:00:55

Gee JW de, Knapen T, Donner TH. Decision-related pupil dilation reflects upcoming choice and individual bias.

user-14d189 13 February, 2019, 07:01:08

Hess E., Polt J. Pupil size in relation to mental activity in simple problem solving. Science

user-14d189 13 February, 2019, 07:01:26

KAHNEMAN D. Pupil Diameter and Load on Memory

user-14d189 13 February, 2019, 07:01:48

Rational regulation of learning dynamics by pupil–linked arousal systems.

user-14d189 13 February, 2019, 07:18:34

@user-4ef892 which detector method do you use, 2D or 3D? In 3D, the sphere center XYZ represents the eye rotation center; from there you can use the monocular gaze normal XYZ to calculate your angles/vectors... in 2D it gives you x and y in relation to the world camera image. Search the Pupil docs -> data format
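
As a sketch of the 3D case: horizontal and vertical angles can be derived from a gaze normal vector with basic trigonometry. The axis convention below (camera looks along +z, x right, y down, as in image coordinates) is an assumption; adapt the signs to your setup.

```python
import math

def gaze_normal_angles(gx, gy, gz):
    """Horizontal and vertical gaze angles (degrees) from a 3d gaze normal.

    Assumed convention: camera looks along +z, x points right, y points
    down (image coordinates). Positive horizontal = rightward gaze.
    """
    horizontal = math.degrees(math.atan2(gx, gz))
    vertical = math.degrees(math.atan2(-gy, math.hypot(gx, gz)))
    return horizontal, vertical
```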

user-4ef892 13 February, 2019, 07:24:26

I'm trying to implement 2D (I think). I am trying to plot vertical vs horizontal (in degrees) to implement some I-VT and I-DT algorithms. From what you said, I gather that the difference between consecutive values (x & y) will give the degrees? (gradient function in Matlab)
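
The I-VT idea sketched above, once the samples are in degrees, boils down to thresholding the per-interval angular velocity. A toy version, with the 30 deg/s threshold as a commonly cited default that should be tuned per setup:

```python
def ivt_classify(angles, timestamps, velocity_threshold=30.0):
    """Toy I-VT step: label each inter-sample interval saccade/fixation.

    `angles` are gaze angles in degrees, `timestamps` in seconds.
    Returns one label per consecutive pair of samples.
    """
    labels = []
    for i in range(1, len(angles)):
        dt = timestamps[i] - timestamps[i - 1]
        velocity = abs(angles[i] - angles[i - 1]) / dt  # deg/s
        labels.append("saccade" if velocity > velocity_threshold else "fixation")
    return labels
```

A full I-VT implementation would additionally merge adjacent same-label intervals into fixation/saccade events.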

user-5881ed 13 February, 2019, 11:17:49

Hello, I'm trying to build pupil_detectors using 'python setup.py build', but I got this error: You've instantiated std::aligned_storage<Len, Align> with an extended alignment (in other words, Align > alignof(max_align_t)). Before VS 2017 15.8 Can you help me?

user-5881ed 13 February, 2019, 11:18:53

the message is C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Tools\MSVC\14.15.26726\include\type_traits(1271): error C2338: You've instantiated std::aligned_storage<Len, Align> with an extended alignment (in other words, Align > alignof(max_align_t)). Before VS 2017 15.8, the member type would non-conformingly have an alignment of only alignof(max_align_t). VS 2017 15.8 was fixed to handle this correctly, but the fix inherently changes layout and breaks binary compatibility (only for uses of aligned_storage with extended alignments). Please define either (1) _ENABLE_EXTENDED_ALIGNED_STORAGE to acknowledge that you understand this message and that you actually want a type with an extended alignment, or (2) _DISABLE_EXTENDED_ALIGNED_STORAGE to silence this message and get the old non-conformant behavior.

papr 13 February, 2019, 13:12:32

@user-5881ed there is a closed GitHub issue about this error. Search for the macro mentioned in the message on GitHub.

user-5881ed 13 February, 2019, 13:15:32

Yes, I found it. And I wonder if downgrading the Visual Studio version from 15.8 to 15.76 will solve the problem.

user-b926dc 13 February, 2019, 13:23:53

Hi, how can I use Surface_Tracker in pupil service?

papr 13 February, 2019, 13:28:34

Hi @user-b926dc This is not possible since there is no scene video in Pupil Service.

user-b926dc 13 February, 2019, 13:32:13

Which message should I send to receive the traced surfaces?

papr 13 February, 2019, 13:34:02

Do you mean which topic to subscribe to? Are you using one of our python scripts?

user-b926dc 13 February, 2019, 13:34:48

'surface', yes

user-b926dc 13 February, 2019, 13:35:36

filter_gaze_on_surface

papr 13 February, 2019, 13:39:50

ok, great, make sure that the surface names in the script and in Capture match

user-b926dc 13 February, 2019, 13:41:25

Do I have to send this message before subscribing to the topic?

user-b926dc 13 February, 2019, 13:41:31

{'subject':'start_plugin','name':'surface_tracker', 'args':{'min_marker_perimeter':'30'}}

papr 13 February, 2019, 13:42:24

only if the surface tracker is not running yet.

papr 13 February, 2019, 13:43:04

You will have to setup a surface once. Read the user docs on how to do that. You will have to give it a name that matches the surface in the script.

user-b926dc 13 February, 2019, 13:43:29

I start pupil service no capture

user-b926dc 13 February, 2019, 13:43:47

it's the same?

user-b926dc 13 February, 2019, 13:44:29

I created the surfaces offline with player, how can I import them in capture?

papr 13 February, 2019, 13:45:22

Capture is not equal to Service! Service does not process the scene video, Capture does. You need to setup the surfaces in Capture. You cannot import them from Player.

user-b926dc 13 February, 2019, 13:48:48

Capture can not be started in the background without a graphical interface, because the points are processed by my desktop application?

user-dfeeb9 13 February, 2019, 14:55:49

@papr just a couple of quick questions:
- Do you have any internal tools/scripts for comparing different raw capture data of the same recording, e.g. online vs offline detection? If not, I am currently working on one and can happily share it once it's done.
- Do you have any recommendations for a Pupil Capture recording that crashed partway through? I've not yet updated to the latest version of Pupil as I'm still in mid-production collecting data for an experiment. I have a recording that died yesterday where the moov atom read error is popping up, so I'll attempt a repair of the mp4 container later when I get to my Linux PC. However, I still cannot think of a way to salvage the notification pldata file.

papr 13 February, 2019, 14:56:07

@user-b926dc correct

papr 13 February, 2019, 14:59:34

@user-dfeeb9 1. We do not have such a thing and would appreciate that contribution! 2. The notify_timestamps.npy file can be reproduced from the pldata file. The pldata file itself should not be broken since it consists of multiple independent msgpack objects. Worst case, the last object is not readable.

user-dfeeb9 13 February, 2019, 15:01:04

That's fantastic news! Do you know if there is a convenient way to generate that npy file?

Also, I don't know whether you have fixed this, as I am many versions behind, or whether it's still valid for the new notifications/annotations system, but in my case I found that enabling the pupil_diameter element in the GUI would kill the ability of pupil annotations to receive messages through ZMQ.

I'll definitely open up a repo and send it along once I have completed it.

Thanks papr!

papr 13 February, 2019, 15:11:11

pupil_diameter in the GUI? I do not follow. We fixed the annotation issue though.

user-dfeeb9 13 February, 2019, 15:16:52

@papr it was when you enabled the options "Display pupil diameter for eye 0/1" in the system graphs setting of the GUI. I don't know if you ever got around to it so I thought I'd mention it. Sorry about that, I am definitely behind in my versions and will update for my next experiment

user-dfeeb9 13 February, 2019, 15:18:31

This is for pupl capture, sorry

user-dfeeb9 13 February, 2019, 15:18:34

I should've clarified

user-dfeeb9 13 February, 2019, 15:18:41

It's probably long fixed but I accidentally discovered it today lol

papr 13 February, 2019, 15:18:50

@user-dfeeb9 Ah, the system graph. Interesting. This is the first time I have heard about this issue. Can you create a Github issue for it?

Regarding recovery: Just modify file_methods.py/load_pldata_file() to not load the timestamp file and to decode the timestamp from the msgpack_payload/serialized dict

user-dfeeb9 13 February, 2019, 15:19:07

This is for an old version, I don't have an environment with the newest pupil version to test it

user-dfeeb9 13 February, 2019, 15:19:23

Okay thanks for guidance on the recovery, will take a look

papr 13 February, 2019, 15:19:31

Which version? I can go back and try to reproduce on old and new versions

user-dfeeb9 13 February, 2019, 15:20:09

1.8-26-g5fe9dbe

user-dfeeb9 13 February, 2019, 15:20:19

it's probably very outdated by this point

user-dfeeb9 13 February, 2019, 15:20:39

I can replicate the issue using my middleman server from a while back, communicating with lua

user-dfeeb9 13 February, 2019, 15:20:51

but in theory anything sending zmq messages should fail

user-dfeeb9 13 February, 2019, 15:21:31

but if you can replicate it i'll throw an issue up on github now

user-5a3c68 13 February, 2019, 15:21:47

Hello! Currently I have lots of sessions and I want to get gaze positions from all of them. At first I thought about parsing all the pldata files with my own script, but then I realised that I could use Pupil's API. I found Raw_Data_Exporter in shared_modules but I don't know how to use it properly. Can you help me? What should I use to write a Python script that exports raw data from all my sessions?

papr 13 February, 2019, 15:22:39

@user-dfeeb9 the GitHub issue is just for me to keep it in mind since I do not have the time to test it right now 😃 I will simply close it if it is not reproducible

user-dfeeb9 13 February, 2019, 15:22:51

roger, I'll throw it up after my labs then

papr 13 February, 2019, 15:24:14

@user-5a3c68 Check this out: https://gist.github.com/papr/743784a4510a95d6f462970bd1c23972 It is a raw data extractor for pupil data. With some adaptations you should be able to export any data you would like

user-442083 13 February, 2019, 15:59:52

Hi, I'm checking out the eye tracker, and seem to have a curious problem. It appears that 'Pupil Cam' and 'Integrated Camera' are switched. If I select 'Integrated Camera' in the UVC manager, I get the video stream from the eye cam - and vice versa. Any ideas on how to fix that?

user-442083 13 February, 2019, 16:02:40

screenshot of the 'Pupil capture - eye 0' (obv it's the field cam)

Chat image

papr 13 February, 2019, 16:19:09

@user-442083 mmh, interesting. Can you manually change the cameras in world and eye windows?

user-442083 13 February, 2019, 16:20:47

I can select both cameras in both windows, but each time with the wrong label. Also, in the service application, it sometimes starts with the eye cam by default. But also here, selecting 'eye 0' will yield the field cam, and vice versa.

papr 13 February, 2019, 16:24:40

But if you select them inversely, then it works? I just want to know if you are able to select the actual eye camera in the eye window.

user-442083 13 February, 2019, 16:26:08

Maybe I didn't understand you correctly. Yes, I can select the eye camera in the eye window (with the label 'Integrated camera'), and there is pupil detection.

user-442083 13 February, 2019, 16:27:41

But I cannot calibrate after that: Just nothing happens. And my working hypothesis is that the system listens to the wrong camera for each video feed.

papr 13 February, 2019, 16:27:47

Ok, great. I just wanted to make sure that you can continue your research despite the bug.

user-442083 13 February, 2019, 16:28:16

Ah not really, basically I am at a loss how to make the camera work in the first place.

user-442083 13 February, 2019, 16:28:35

If simply the labels were switched, I could live with that.

papr 13 February, 2019, 16:29:03

No, the preview of the video that you can see is also the video that is used for the specific process.

papr 13 February, 2019, 16:29:31

So your calibration issues might come from somewhere else.

papr 13 February, 2019, 16:29:55

What error do you get if the scene and eye videos are setup correctly? What pupil confidence do you get on average?

user-442083 13 February, 2019, 16:36:51

I cannot really set it up. When executing the calibration step, an error message occurs at the end: "Not enough ref point or pupil data available for calibration"

user-442083 13 February, 2019, 16:37:59

Also, the eye cam feed does not show up in the 'eye 0' window when starting with the default settings.

Chat image

papr 13 February, 2019, 16:40:38

How many eye cameras do you have?

papr 13 February, 2019, 16:41:52

Make sure that the calibration marker is visible in the scene video the whole time. From your screenshot it looks like your scene camera is pointing too high.

user-442083 13 February, 2019, 16:47:47

1 eye cam. Good point about the scene cam, I'll try that again.

papr 13 February, 2019, 16:48:28

If you only have one eye cam, then you can close one of the eye windows

user-442083 13 February, 2019, 16:49:43

I did, I closed both of them. But shouldn't the feed still appear in that window?

papr 13 February, 2019, 16:50:28

You need one eye window for the pupil detection. If you close the window, you stop the detection process

papr 13 February, 2019, 16:50:38

And therefore do not get any pupil data

user-442083 13 February, 2019, 16:54:33

That did the trick! thank you a lot for your patience.

user-4878ad 13 February, 2019, 17:21:36

@wrp it works now!!! thanks!!! 😃

user-ac3779 13 February, 2019, 19:34:23

Hey guys, I am stuck on how to connect the EPSON BT300 eye tracking add-on to the Pupil apps. There is no document in the pupil docs showing how to do that.

user-14d189 14 February, 2019, 00:07:40

@user-4ef892 2D gives you the x and y coordinates with respect to the world camera image. Since you do not have the distance, angles are hard to calculate. If you are only after the angles between data points, have a look at the 3D method and the gaze normal vectors; then you can take the difference of consecutive data points. Does that make sense? Can you fill me in on I-VT/I-DT?
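
For instance, the angle between two consecutive gaze normal vectors can be computed like this (a minimal sketch, not part of Pupil itself):

```python
import numpy as np

def angle_between_deg(v1, v2):
    """Angle in degrees between two gaze normal vectors."""
    v1 = np.asarray(v1, dtype=float) / np.linalg.norm(v1)
    v2 = np.asarray(v2, dtype=float) / np.linalg.norm(v2)
    # clip guards against tiny numerical overshoot outside [-1, 1]
    return np.degrees(np.arccos(np.clip(np.dot(v1, v2), -1.0, 1.0)))
```

Applied to consecutive samples of the exported gaze_normal columns, this gives the per-sample angular displacement.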

user-82e954 14 February, 2019, 02:34:17

Hi, I'm trying to use Mouse_Control. The code executes and prints x_dim and y_dim, but the mouse cursor is still not controlled by eye movement.

It is said that I need to enable surface markers, but I'm not sure how to do that. I am just using the Pupil Capture GUI to create the surface.

user-82e954 14 February, 2019, 02:42:06

When the script is executed, the cursor just went to the top of the screen, but then can't be controlled. And just to clarify, I'm talking about mouse_control.py in pupil_helpers

wrp 14 February, 2019, 02:51:58

Hi @user-82e954 does the name of your surface match the name of the surface in the mouse_control.py script?

wrp 14 February, 2019, 02:52:04

@user-82e954 what OS are you using?

user-82e954 14 February, 2019, 02:54:13

Yes, I named it "screen". I'm using Windows 10.

user-82e954 14 February, 2019, 06:43:10

If I just load it in cmd, am I executing it the right way?

user-5881ed 14 February, 2019, 09:44:54

papr, thank you for the comment. And I have one question: now I'm on the 'build boost' step and I got this message after building it. Is it okay? The Boost C++ Libraries were successfully built!

The following directory should be added to compiler include paths:

C:\work\boost

The following directory should be added to linker library paths:

C:\work\boost\stage\lib

user-5881ed 14 February, 2019, 09:46:05

This message seems to give a path (C:\work\boost) to the compiler, but I have no idea what to do with it...

papr 14 February, 2019, 09:56:14

@user-5881ed Are you citing the docs or some output of the install process? I am not too familiar with the windows dependencies. Btw did you try running from bundle as alternative? Running from source is often not necessary.

user-5881ed 14 February, 2019, 09:59:16

Yes, I just typed 'b2 --with-python link=shared address-model=64' as in the docs.

user-5881ed 14 February, 2019, 10:00:04

But if I type just 'b2' in cmd, it seems to create a folder named 'stage'

user-5881ed 14 February, 2019, 10:00:31

and makes many .lib files

user-5881ed 14 February, 2019, 10:01:53

And 'b2 --with-python link=shared address-model=64' didn't make any file...

user-5881ed 14 February, 2019, 10:06:06

How can I do the 'running from bundle'? Sorry, I'm not skillful at these things.

user-a04957 14 February, 2019, 12:39:07

@papr When I discard all data with confidence <0.6 (as recommended) and afterwards filter with the blink detection (onset and offset confidence to 0.5), the blink detector effectively filters nothing. Correct?

Thanks

user-5881ed 14 February, 2019, 12:39:23

I solved this by replacing the jamfile in C:\work\boost\libs\python\build with the jamfile in the build folder at https://github.com/boostorg/python Thank you!

papr 14 February, 2019, 12:40:19

@user-a04957 Correct! The blink detector expects unfiltered data

user-a04957 14 February, 2019, 12:42:22

perfect. makes it easier 😃

user-a04957 14 February, 2019, 13:04:52

@papr is the confidence level only the "accuracy" of the current pupil size, or does it also consider previous samples? (Meaning: should I apply some more signal preprocessing before interpretation of the data?)

user-41c874 14 February, 2019, 14:38:23

Hello! When surface gaze positions are sent over the network online, do they also include gaze positions outside of the surface? Also, with what frequency are the surface gaze positions sent over the network? Is this dependent on the frequency/fps of the world camera?

user-41c874 14 February, 2019, 14:41:02

Also, can we replace the world camera with intel realsense D435 and/or another usb camera with better low light performance?

user-c5bbc4 14 February, 2019, 22:15:14

Hi, has anyone here tried Pupil in an outdoor environment? May I ask how the performance is? Personally, I feel that the performance is not as good as indoors. But since Pupil uses a glint-free algorithm, this performance decrease is not supposed to appear. Or maybe it is because I was walking around, which might induce headset slippage? Any ideas about the potential causes? Thanks a lot.

user-3c9008 15 February, 2019, 02:29:36

Has anyone noticed that the eye-tracker add-on for the HTC Vive gets hot? I ran it for 10 minutes and it was hot. I worry that my participants may feel real discomfort due to the heat generated by the eye tracker. If I touch the eye tracker I can feel the heat.

wrp 15 February, 2019, 02:36:03

Hi @user-41c874 Responses to your questions: 1. Surfaces and network API - if you subscribe to a surface, you will be able to get gaze positions relative to the surface. Each datum will have a timestamp. You can see an example of what this looks like here: https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_gaze_on_surface.py 2. RealSense D400 Series - We have a mount for the D400 series cameras, but this requires different cabling so the frame would need to be swapped out (eye cameras could be reused). Please contact info@pupil-labs.com to follow up on this.
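
As a sketch of what consuming such a surface datum can look like on the receiving end (field names taken from the pupil-helpers script linked above; they may differ between versions):

```python
def gaze_on_surface(surface_datum, surface_name="screen"):
    """Extract normalized gaze positions that actually fall on the named
    surface from one surface datum, like filter_gaze_on_surface.py does."""
    if surface_datum.get("name") != surface_name:
        return []
    # keep only gaze points flagged as being on the surface
    return [g["norm_pos"] for g in surface_datum.get("gaze_on_srf", [])
            if g.get("on_srf")]
```

Each datum arrives over the ZMQ SUB socket as a msgpack-encoded dict; this helper only handles the already-decoded dict, so it can be tested without a running Capture instance.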

wrp 15 February, 2019, 02:39:39

@user-c5bbc4 Even though Pupil does not use corneal glints, very bright sunny environments will still be a challenge for Pupil headsets because the IR light from the sun overpowers the IR emitters, making it difficult to get a good image of the eye, leading to less robust pupil detection. Slippage is a separate issue, but can also contribute to reduced gaze estimation accuracy over time. The 3d pipeline tries to compensate for small slippages with model updates, but large movements of the headset can lead to reduced accuracy/drift over time. Additional calibration(s) can be included to help remedy this issue.

wrp 15 February, 2019, 02:44:26

@user-3c9008 the cameras will get warm, especially when in the HMD running at 120hz due to lack of air circulation in this environment. If you feel this is a blocking issue to your research we can follow up with discussion via email to discuss solutions.

user-3c9008 15 February, 2019, 03:13:27

@wrp thanks. I will let my professor know about this. And he can follow up with you.

user-3c9008 15 February, 2019, 03:15:19

Do you have a preferred email address to contact with ? @wrp

wrp 15 February, 2019, 03:28:44

@user-3c9008 please email info@pupil-labs.com

user-c5bbc4 15 February, 2019, 06:32:18

@wrp Thanks a lot!!

wrp 15 February, 2019, 08:29:19

welcome @user-c5bbc4

user-41c874 15 February, 2019, 09:02:21

Thanks a lot ! I'll look into it. Also, by any chance could you recommend any camera compatible with the Pupil Labs eyetracker which has good low light performance?

papr 15 February, 2019, 09:15:52

@user-a04957 The confidence is always a quality measure for the current eye frame. There is no other recommended signal processing than filtering by confidence.
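
A minimal sketch of that confidence filtering (the 0.6 threshold is the commonly recommended value mentioned earlier in this channel; adjust as needed):

```python
import numpy as np

def filter_by_confidence(timestamps, values, confidences, threshold=0.6):
    """Drop samples whose pupil confidence is below the threshold."""
    mask = np.asarray(confidences) >= threshold
    return np.asarray(timestamps)[mask], np.asarray(values)[mask]
```

This works on any per-sample column of the exported csv, e.g. pupil diameter or gaze position.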

user-ed70a0 15 February, 2019, 12:43:07

Hello! Hello. I want to export the heatmap data. What process should I take to export it?

wrp 15 February, 2019, 13:02:09

Hi @user-ed70a0 did you look at surface tracking docs already? https://docs.pupil-labs.com/#analysis-plugins

user-ed70a0 15 February, 2019, 13:12:45

Hello @wrp. Yes, I did. It was exported as a png file until yesterday. Today I did a recording and exported it, but the heatmap was not saved 😭

wrp 15 February, 2019, 13:15:02

I'm not sure I follow @user-ed70a0 did you define surfaces and have the offline surface tracking plug-in enabled?

papr 15 February, 2019, 13:18:33

@user-ed70a0 I think I have heard about this bug before. The csv data is correctly exported but not the heatmap images, correct?

papr 15 February, 2019, 13:19:13

If so, please use this jupyter notebook as a work around to generate heatmaps until the issue is fixed. https://github.com/pupil-labs/pupil-tutorials/blob/master/02_load_exported_surfaces_and_visualize_aggregate_heatmap.ipynb

user-ed70a0 15 February, 2019, 13:24:41

Thank you for your kind answer @wrp . That's right @papr! The csv data is correctly exported but not the heatmap images. I'll give it a try as you say. Thank you very much @papr

user-5a3c68 15 February, 2019, 14:13:25

Hello! I've tried to use the find_closest function from the player_methods.py module (https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/player_methods.py#L145). The docstring says that the second timestamp series should be sorted, but in practice my result is wrong if I sort it. I can't tell who is wrong, me or the docstring?

papr 15 February, 2019, 14:15:10

@user-5a3c68 The docstring is wrong. target needs to be sorted. See https://docs.scipy.org/doc/numpy-1.15.0/reference/generated/numpy.searchsorted.html
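
For reference, a self-contained version of that nearest-timestamp search, mirroring the searchsorted approach of player_methods.find_closest (target must be sorted, as papr notes):

```python
import numpy as np

def find_closest(target, source):
    """For each value in `source`, the index of the closest value
    in the sorted array `target`."""
    target = np.asarray(target)
    source = np.asarray(source)
    idx = np.searchsorted(target, source)
    idx = np.clip(idx, 1, len(target) - 1)
    left = target[idx - 1]
    right = target[idx]
    # step back one index where the left neighbor is closer
    idx -= source - left < right - source
    return idx
```

A typical use is matching each gaze timestamp to the closest world frame timestamp.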

user-5a3c68 15 February, 2019, 14:23:12

Thanks @papr. I have one more question. I found out that world_timestamps.npy is always sorted but gaze_timestamps.npy is not. Why is that?

papr 15 February, 2019, 14:24:56

Because world timestamps are generated by a monotonic clock (world camera) while gaze timestamps are the result of merging data from two different clocks (eye cameras)

user-5a3c68 15 February, 2019, 14:26:32

OK, thank you @papr

user-dfeeb9 15 February, 2019, 17:09:33

Hi, I'm getting more and more regular crashes of my world camera feed during capture of 20 minute pupil sessions. I can't seem to isolate whether the cause is the camera or my pc/software/os. The error information is also a not-too-informative one-liner: 'world - [WARNING] video_capture.realsense_backend: Realsense failed to provide frames. Attempting to reinit.'

Symptomatically, these crashes occur with the world camera often failing to appear on the device manager and not being detected upon a reseating of either the USB A or USB C jacks/ports. However, if I keep pupil plugged into my PC and reboot, it magically works again. I am beginning to suspect it is a consequence of very annoying user policies on the PC with which I operate, but are there issues you have experienced in the past with the world cam crashing erratically like this?

user-dfeeb9 15 February, 2019, 17:12:07

specifically, I am on an older version of pupil 1.8-26

user-5a67b9 17 February, 2019, 10:40:34

Hello! Where do I find the calibration file for a specific recording? Specifically, I am looking for the angular accuracy and angular precision values. There is a text file called “capture” in the pupil_capture_settings folder but, unfortunately, it contains only information about the last calibration that was performed. Is it possible to retrieve this information also for older recordings?

papr 17 February, 2019, 10:43:29

@user-5a67b9 Hi, this is currently not possible. Our upcoming version will allow you to recalculate these values offline though.

user-6302ac 18 February, 2019, 14:06:20

Is there a way to easily get or calculate gaze in units of angle (degrees or radians) instead of normalized units?

papr 18 February, 2019, 14:29:11

@user-6302ac No, but it is possible. The angle values depend on your camera intrinsics. We use this technique to calculate the angular error in the accuracy visualizer plugin: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/accuracy_visualizer.py#L223-L242
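
A rough sketch of the unprojection step, assuming an ideal pinhole camera without lens distortion (the linked accuracy_visualizer code additionally undistorts points, so treat this only as an approximation; K, width and height are your scene camera's intrinsic matrix and resolution):

```python
import numpy as np

def norm_pos_to_angles_deg(norm_x, norm_y, K, width, height):
    """Convert Pupil's normalized gaze position to azimuth/elevation in
    degrees, assuming an undistorted pinhole camera with intrinsics K."""
    px = norm_x * width
    py = (1.0 - norm_y) * height  # Pupil's normalized y axis points up
    # unproject pixel to a 3D ray (x, y, 1) in camera coordinates
    x = (px - K[0, 2]) / K[0, 0]
    y = (py - K[1, 2]) / K[1, 1]
    return np.degrees(np.arctan2(x, 1.0)), np.degrees(np.arctan2(y, 1.0))
```

The intrinsic matrix values here are placeholders; use the intrinsics of your own calibrated camera.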

user-6302ac 18 February, 2019, 15:44:02

@papr Thanks, I will take a look at it

user-14d189 19 February, 2019, 07:15:10

@user-6302ac Did you have a look at the tabo schema? Assume the center of the schema is the eye rotation center, with the X and Y normals indicating the offset from the straight-ahead position, i.e. the direction of gaze, and the distance to the center being the amplitude of the gaze angle. Is that about right?

user-bd800a 19 February, 2019, 10:22:56

Hi, I see that batch export has been removed between versions 1.8 and 1.10. Is this due to a bug? Because I do get a "Division by zero" in batch_exporter.py. Is there any alternative?

user-a50aee 19 February, 2019, 17:42:17

Hi, can i use webcam instead of eye tracker?

wrp 20 February, 2019, 01:56:27

@user-a50aee the pupil detection algorithms are designed to be used for head mounted/wearable eye trackers (not remote systems).

user-f81efb 20 February, 2019, 11:24:22

Hello, I am trying to use the surface tracking feature. Is there some data on the location of the marker that can be extracted?

user-f81efb 20 February, 2019, 11:28:55

Also, how can I improve pupil detection apart from the things mentioned in the pupil docs? Specifically, how can I use the algorithm display mode for better pupil detection? Thanks

user-741ae5 20 February, 2019, 14:57:59

Hello, do I need a Nexus to use the pupil mobile ? Or can I just use any android device ?

papr 20 February, 2019, 15:00:41

The minimum requirement is a usb-c connector. Unfortunately, we know from experience that there are a lot of usb-c phones that do not recognize our cameras correctly. We recommend to use a phone of this list: https://github.com/pupil-labs/pupil-mobile-app/#supported-hardware

papr 20 February, 2019, 15:01:21

Also, be aware that there is a known issue with Android 9. I highly recommend to avoid upgrading to Android 9.

user-741ae5 20 February, 2019, 15:02:41

I have a Zenfone 3 zoom with android 7, I was able to connect to pupil but I'm finding problems when calibrating

user-741ae5 20 February, 2019, 15:03:15

It says "Not enough ref points..."

user-741ae5 20 February, 2019, 15:03:30

No matter how many points I look at

papr 20 February, 2019, 15:03:43

Are you calibrating in Pupil Capture or in Player?

user-741ae5 20 February, 2019, 15:03:56

In Capture

papr 20 February, 2019, 15:04:19

Either way, this is not a problem with your phone.

user-741ae5 20 February, 2019, 15:04:43

ok, good to know

papr 20 February, 2019, 15:04:57

Could you make a recording in Capture while you are calibrating and share it with [email removed]

user-741ae5 20 February, 2019, 15:06:10

Ok, I'll send it to you within 40 min

papr 20 February, 2019, 15:06:40

No hurry 🙂 The recording does not need to be long as well. 👍

user-741ae5 20 February, 2019, 15:06:58

Ok thanks

user-741ae5 20 February, 2019, 15:44:34

I have sent the email with the video. Thanks for the attention

papr 20 February, 2019, 15:52:14

Hi @user-741ae5 I have not received the email yet. Could you verify that you sent it to data@pupil-labs.com ?

user-64b0d2 20 February, 2019, 16:52:17

Hello everyone, I am currently trying to send annotations to Pupil Capture while recording. I would like to send annotations via Matlab. Unfortunately, I'm new to Matlab and haven't succeeded yet. Has anyone done this before and would be willing to share some code with me?

user-741ae5 20 February, 2019, 17:03:26

Hello @papr, I uploaded the video to the cloud and sent the link in the email, because the video file was probably too big to attach to the email, and my guess is you didn't receive my email

papr 20 February, 2019, 17:54:51

@user-741ae5 correct, I still have not received the email.

user-741ae5 20 February, 2019, 18:35:30

@papr I tried to send to you from another server, check again please

papr 20 February, 2019, 18:36:43

@user-741ae5 got it!

papr 20 February, 2019, 18:40:14

@user-741ae5 It looks like there are enough ref points, since the calibration procedure progresses as expected. But it does not look like you are running any pupil detection/eye processes. Enable eye0 and eye1 in the general settings, and select the respective eye cameras for each window, in a similar fashion as you did for the world window.

user-741ae5 20 February, 2019, 18:42:49

@papr I will try and return with the result

user-741ae5 20 February, 2019, 20:46:31

@papr I got it, this was the problem. Thanks so much for the attention!

user-4174b3 20 February, 2019, 20:53:57

Hello, I asked this a long time ago. Is Pupil set up to get an array of the radius from the center at all angles (assuming a non-circular pupil), as opposed to only the pupil diameter? And if not, where would be the first place to start adding that feature?

papr 20 February, 2019, 21:03:52

@user-4174b3 Hi, this is not available. You would have to add it to the 2d detector. But please be aware that the pupil detection assumes pupil circularity. Therefore you might see lower confidence values than usual for subjects with non-circular pupils.

user-4174b3 20 February, 2019, 22:49:26

@papr Thank you. Is there any interest in adding this feature to Pupil? Maybe in a separate branch? It's what I'll be actively working on.

user-4174b3 20 February, 2019, 23:10:25

When I first inquired two years ago, we simply implemented this feature in Python as a post-process, without using Pupil. Now we have more resources for the project and are looking to implement this in Pupil (in real time if possible).

wrp 21 February, 2019, 01:45:21

@user-4174b3 have you published any papers with the method you have implemented as a post-process?

user-4174b3 21 February, 2019, 02:20:36

No, it's used for our pre-data for the grant we are writing now. The algorithm itself is too simple to publish on its own without data (In my opinion). We are still working on IRB approval to publish patient data. It is in my Github account.

wrp 21 February, 2019, 07:59:52

@user-4174b3 thanks, I found the github (same user name as your handle here - which is useful 😸 ), however I'm not sure what you mean when you say "get the array of the radius from the center at all angles" - the ellipse data is saved/published, is this not sufficient for your application?

papr 21 February, 2019, 08:02:41

@wrp I think @user-4174b3 works with subjects that do not have a circular pupil. The array of the radius from the center at all angles basically describes the outline of the pupil shape.
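
One way such an outline description could be computed as a post-processing step, given the contour points of a detected pupil and its center. This is a hypothetical sketch, not part of the Pupil 2D detector; contour extraction itself (e.g. from the detector's intermediate output) is assumed to happen elsewhere.

```python
import numpy as np

def radii_at_angles(contour_xy, center, n_angles=360):
    """Radius of a (possibly non-circular) pupil outline measured from
    `center`, resampled at n_angles evenly spaced directions."""
    rel = np.asarray(contour_xy, dtype=float) - np.asarray(center, dtype=float)
    angles = np.arctan2(rel[:, 1], rel[:, 0])
    radii = np.hypot(rel[:, 0], rel[:, 1])
    # sort by angle so we can interpolate around the full circle
    order = np.argsort(angles)
    target = np.linspace(-np.pi, np.pi, n_angles, endpoint=False)
    return np.interp(target, angles[order], radii[order], period=2 * np.pi)
```

For a circular pupil this reduces to a constant array equal to the radius; deviations from constancy quantify the non-circularity.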

wrp 21 February, 2019, 08:03:02

noted, thanks

user-f81efb 21 February, 2019, 10:25:33

i am trying to calibrate in pupil capture

user-06c1e0 21 February, 2019, 10:40:50

@papr Hi, I saw your reply on a github post (https://github.com/pupil-labs/pupil/issues/724) about wearing the eye tracking frames with eye glasses and you suggested wearing the glasses over the frames. I was wondering how well this works/feels, do the eye glasses frame obstruct the world view camera at all?

user-24270f 21 February, 2019, 10:49:07

So @wrp et al, no oculus rift cv1 eye tracking

user-019256 21 February, 2019, 16:43:12

Hi! I am currently trying to change the calibration method from binocular to dual monocular, since I need gaze data for both eyes separately. I did that by changing 'Binocular_Gaze_Mapper' to 'Dual_Monocular_Gaze_Mapper' in the calibrate_2d_binocular function in finish_calibration.py. I also changed the arguments given in 'args': {'params': params, 'params_eye0': params_eye0, 'params_eye1': params_eye1} to 'args': {'params0': params_eye0, 'params1': params_eye1}, since this is what the Dual_Monocular_Gaze_Mapper class in gaze_mappers.py wants as input.

This works fine when I use it for offline calibration in pupil player, but when I use it in capture, it is very unstable and doesn't work when confidence is not perfect. Additionally, when calibration failed one time, it will fail every time after this until capture is restarted, even if confidence is good. In capture I get the error "Not enough ref point or pupil data available for calibration." and in the log " Binocular match rejected due to time dispersion criterion".

Does anyone already have experience with dual monocular calibration in capture and can point me in the right direction? Are there any additional and necessary changes I have to do in the source code?

Thank you very much in advance!

capture.log

user-4174b3 21 February, 2019, 17:04:58

@papr yes, exactly

user-4174b3 21 February, 2019, 17:11:21

@papr @wrp Here is a snapshot of our pre-data.

Chat image

user-4174b3 21 February, 2019, 17:12:23

This is what we would like to get in real-time.

user-741ae5 21 February, 2019, 18:06:49

Hi @papr, I have another question. When I try to create a heatmap of the recording, the surfaces window stays blank, no matter what I do.

user-6997ad 22 February, 2019, 05:17:00

The eye camera previews in capture seem to show one eye image is much darker than the other. Is this a known issue?

Chat image

user-6d0d65 22 February, 2019, 08:25:25

Can I use Pupil Labs VIVE add-on with vive wireless adapter?

user-41c874 22 February, 2019, 14:42:17

Hey! When exporting surface gaze points, the frequency matches the world camera's FPS (60 Hz). Is it possible to get surface gaze positions at the frequency of the pupil camera (200 Hz)? Or is it possible to map pupil positions onto the previously recorded surface coordinates (somewhat like an interpolation)?

user-41c874 22 February, 2019, 14:43:32

We can't switch to a higher FPS for the world camera because of our low light conditions and we cannot use a lower spatial resolution.

papr 22 February, 2019, 14:53:59

@user-6997ad check the exposure time in the eye1 window. Make sure it matches with eye0

user-741ae5 22 February, 2019, 18:11:25

Hello! How can I make a video heatmap? I have tried to add surfaces but it didn't work.

user-741ae5 22 February, 2019, 18:24:16

This was shown as the surface, but the real surface is the bottle on the left

Chat image

user-94ac2a 22 February, 2019, 22:33:46

What is the best IR LED wavelength for the eye camera?

user-6997ad 23 February, 2019, 05:51:32

@papr to get the brightness of the two cameras to be roughly the same, I need to set one of the absolute exposure times to ~90, and the other to ~350. Does this indicate a potential issue with one of the sensors?

papr 23 February, 2019, 07:53:15

@user-6997ad That does not sound right. Which Capture version are you using? Did you try enabling the eye auto exposure mode?

user-f2e54b 23 February, 2019, 08:34:00

Hi ! I tried to open my recording file by using "pupil player" but I couldn't. I only saw a warning message. Please let me know what I have to do to see my file.

Chat image

user-f27d88 23 February, 2019, 08:38:44

Hello, @user-f2e54b , how did you run Pupil Player, from source or from our release? So the Pupil Player window didn't reopen after you dropped your recording file onto it, right?

user-f2e54b 24 February, 2019, 02:25:52

Hello, @user-f27d88 , sorry for the late reply. I think I ran Pupil Player from your release, and yes, I couldn't reopen my recording after I dropped the file onto it.

user-f27d88 24 February, 2019, 06:17:30

@user-f2e54b Does it happen every time?

user-f2e54b 24 February, 2019, 06:29:21

@user-f27d88 yes. I tried to open every file but I couldn't open any of them.

user-f27d88 24 February, 2019, 06:31:37

Would you mind uploading your recording files to the cloud (Google Drive or similar), so I can download them for testing?

user-f2e54b 24 February, 2019, 06:38:33

sure! I will do it right now. Thank you.

user-f2e54b 24 February, 2019, 06:56:30

this is the file link. Thank you

user-f27d88 24 February, 2019, 07:08:27

I will come back to you once I figure out the problem.

user-bd800a 25 February, 2019, 08:21:05

Hello, is there any expected date for the batch exporter to be reintroduced with multiple-subject support? I'd like to process a large number of files

papr 25 February, 2019, 09:05:58

@user-bd800a Hi, currently not. Do you rely on offline detection/calibration or do you have prerecorded pupil/gaze data? What type of data are you looking to export?

user-bd800a 25 February, 2019, 09:18:37

I have recorded pupils size of both pupils, I'd like to export this data with the timestamps

papr 25 February, 2019, 09:29:43

@user-bd800a Check out this script. https://gist.github.com/papr/743784a4510a95d6f462970bd1c23972

With some adaptation, you should be able to extract other pupil fields as well.
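To illustrate the kind of adaptation meant here (a sketch, not the linked gist itself): assuming the recording's pupil data has already been deserialized into a list of dicts, writing timestamps and diameters to CSV could look like the following. The field names (`timestamp`, `diameter`, `confidence`) follow the documented pupil datum format.

```python
import csv


def export_pupil_diameters(pupil_positions, csv_path):
    """Write timestamp, diameter, and confidence of each pupil datum to CSV.

    `pupil_positions` is assumed to be a list of dicts as deserialized from
    a Pupil recording; adapt the field list to export other pupil fields.
    """
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "diameter", "confidence"])
        for datum in pupil_positions:
            writer.writerow([datum["timestamp"],
                             datum["diameter"],
                             datum["confidence"]])
```

For a binocular recording, the data could first be split by the datum's eye `id` field (0/1) to get one file per eye.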

papr 25 February, 2019, 09:35:31

@user-f2e54b Hi, we were able to open your recording as expected. Please be aware that a recording is identified as a folder containing files; a single file does not count as a recording for Player. Therefore, you have to drag the 000 or 001 folder on top of the gray window, instead of the single files.

user-bd800a 25 February, 2019, 09:44:20

@papr thanks! this seems to be exactly what I need

user-07d4db 25 February, 2019, 15:25:52

Dear Pupil Labs Community, does anybody have advice on which value to choose for the maximum duration in the offline fixation detector when calculating the average number and duration of fixations within specific AOIs? I was thinking of leaving it at the possible maximum of 4000 ms, in order to avoid artificially cutting off fixations at a lower threshold value. How did you cope with this question?

user-e0772f 25 February, 2019, 17:24:44

Hello! I'm having bottleneck problems with the communication between a laptop running Pupil Service and another laptop collecting world camera images via zmq. I have a 100 Mb/s Ethernet connection and I can only receive between 11 and 13 frames per second. Is there any possibility to reduce the frames per second that Pupil Service sends, or can I improve the frame rate with a different zmq setup?

user-97bc82 25 February, 2019, 19:21:37

Hello,

user-97bc82 25 February, 2019, 19:27:07

with the new update I am no longer able to view my real-world camera. With the older version I was able to select the RealSense3D via the backend manager and activate the world camera. Now with the new update the backend manager no longer has that option and has two new ones instead. I have tried both the RealSense400 and the RealSense R200 and my real-world camera still doesn't seem to work.

papr 25 February, 2019, 19:48:26

@user-97bc82 Hi, so you have been using a R200 realsense camera up to this point? Am I also right that you are using Pupil on Windows?

user-97bc82 25 February, 2019, 19:52:21

Thank you so much for your immediate response it is much appreciated ! I have been using the Pupil Labs Eye tracking device and I recently switched to Pupil on Windows which has been giving me this error. Otherwise, the eye tracker works perfectly well on my Mac with the Pupil Capture Version 1.7.42

papr 25 February, 2019, 19:55:16

@user-97bc82 Yes, this is a driver issue. Unfortunately, Intel has stopped the driver support for the R200 in favor of its newer D400 series. 😕 I therefore recommend to continue using your Mac.

user-97bc82 25 February, 2019, 19:58:18

So in other words, for clarification purposes, is there any way possible that I may be able to record with my Windows device? May there be an alternative solution? Or the final answer is just to solely continue to use the Mac?

papr 25 February, 2019, 20:01:24

Unless you are able to restore your Windows machine to an earlier Windows version that still supports the R200 drivers, you will probably not be able to use the R200 on Windows. 😕

papr 25 February, 2019, 20:01:53

Is there a specific reason why you switched from Mac to Windows?

papr 25 February, 2019, 20:04:19

@user-e0772f did you try changing the image format from bgr to jpeg? This should yield a better transmission rate, but you will have to decode the jpeg images on the receiver side.

user-21d960 25 February, 2019, 21:54:33

Does Pupil Capture not work on Windows?

user-21d960 25 February, 2019, 21:54:35

@papr

papr 25 February, 2019, 21:56:04

@user-21d960 The opposite: Pupil Capture does work on Windows! The issue described above only relates to the R200 Intel RealSense 3D camera, for which Intel has stopped supporting Windows drivers.

user-21d960 25 February, 2019, 21:56:24

Ah ok great just making sure

user-f27d88 26 February, 2019, 03:29:59

@user-f2e54b , Are you dragging a single file onto Player, instead of the folder? @papr and I tested your recording and it works fine.

user-1bcd3e 26 February, 2019, 07:35:24

Good morning everyone! I would like to know a good configuration for recording in a fairly bright outdoor environment. I have noticed that we often lose confidence because the eye cameras take in too much light...

wrp 26 February, 2019, 09:55:52

@user-1bcd3e - in very bright outdoor environments the best solution is to manually reduce the exposure time. However, even in very bright (direct sunlight) environments, this might not be enough. Your participant/respondent/subject may need to wear a hat in order to shade the face and reduce the amount of direct IR light. This is, unfortunately, an issue that affects all IR-based eye trackers using a traditional pupil detection --> gaze estimation pipeline

wrp 26 February, 2019, 09:56:05

Side note - this is one of the issues we aim to solve with Pupil Invisible

user-e0772f 26 February, 2019, 12:20:22

Thank you @papr ! Changing the image format to jpeg solved my problem. After that I spent some time decoding this format; using the OpenCV library and its imdecode method, I solved that issue too. Thanks again!
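For anyone hitting the same bottleneck, a minimal receiver along those lines might look like the sketch below. It assumes Pupil Remote on its default port 50020 and the Frame Publisher plugin set to jpeg; pyzmq, msgpack, numpy, and opencv-python are third-party dependencies, imported inside `main()` so the topic-filter helper stands on its own. Treat the exact multipart layout as an assumption to verify against your Capture version.

```python
def is_world_frame(topic: str) -> bool:
    """Topic filter for scene-camera frames from Pupil's Frame Publisher."""
    return topic.startswith("frame.world")


def main():
    # Third-party deps: pip install pyzmq msgpack numpy opencv-python
    import zmq
    import msgpack
    import numpy as np
    import cv2

    ctx = zmq.Context()
    req = ctx.socket(zmq.REQ)              # Pupil Remote, default port
    req.connect("tcp://127.0.0.1:50020")
    req.send_string("SUB_PORT")            # ask for the subscription port
    sub = ctx.socket(zmq.SUB)
    sub.connect("tcp://127.0.0.1:{}".format(req.recv_string()))
    sub.subscribe("frame.world")

    while True:
        topic, payload, *frames = sub.recv_multipart()
        if not is_world_frame(topic.decode()):
            continue
        meta = msgpack.unpackb(payload)    # format, resolution, etc.
        # With the Frame Publisher set to jpeg, the trailing part is a
        # compressed buffer; cv2.imdecode turns it back into a BGR image.
        img = cv2.imdecode(np.frombuffer(frames[-1], dtype=np.uint8),
                           cv2.IMREAD_COLOR)
        print(img.shape)


if __name__ == "__main__":
    main()
```

Decoding on the receiver side trades a little CPU for a large reduction in network traffic, which is what resolved the 11-13 FPS ceiling here.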

user-64b0d2 26 February, 2019, 12:53:54

Pupil Capture suddenly stopped working. Whenever I try to run it I get the following crash report. Does anyone know why this started happening?

Chat image

papr 26 February, 2019, 13:51:40

@user-64b0d2 please stop Player, delete the user_setting files in the pupil_player_settings, and restart Player.

wrp 26 February, 2019, 13:58:40

@user-64b0d2 out of curiosity, do you have a VM running?

user-64b0d2 26 February, 2019, 14:34:43

@papr Deleting the setting files did not resolve the issue. @wrp What is a VM?

wrp 26 February, 2019, 14:37:00

Virtual Machine.

wrp 26 February, 2019, 14:37:57

@user-64b0d2 when you say "suddenly stopped working" were there windows updates between working and not working?

wrp 26 February, 2019, 14:38:28

The error you are seeing is usually linked to your graphics card drivers not supporting OpenGL

wrp 26 February, 2019, 14:40:39

If there was a Windows update, then this could have caused some drivers to be modified and you may need to manually update graphics card drivers to the latest version via device manager

wrp 26 February, 2019, 14:42:02

In other cases I have seen this error be displayed when a user was accessing the computer running Pupil software via remote access (e.g. a software like TeamViewer)

user-64b0d2 26 February, 2019, 15:00:34

@wrp I'm not running a virtual machine. I successfully used Pupil Capture a few days ago, so maybe "suddenly" was not the right word to use. I don't think there were Windows updates during that time. However, after encountering the error I searched for new updates and found some. I installed them in the hope that this might resolve the error, but it didn't. I updated the graphics card drivers, but this didn't resolve the issue either. And I'm not running the computer via remote access. Thanks for all the suggestions though!

wrp 26 February, 2019, 15:03:57

@user-64b0d2 can you try to use a tool like http://realtech-vr.com/admin/glview

wrp 26 February, 2019, 15:04:23

To check what OpenGL support your system has

user-64b0d2 26 February, 2019, 15:31:15

@wrp According to that program I have OpenGL version 1.1

user-97591f 26 February, 2019, 17:08:22

Hello, I was wondering if anyone else has the problem of loading their eye recordings into pupil player. Some recordings (recorded on the same day on the same computer) will fail to load, and the video file itself is fine - but not the timestamps. This is outlined in the issue here: https://github.com/pupil-labs/pupil/issues/1449

user-21d960 26 February, 2019, 18:03:02

@papr I am trying to set up the pupil tracker but it seems there isn't a good position for it to track my eye; here's a picture of the best position I could get

user-21d960 26 February, 2019, 18:03:28

Chat image

user-21d960 26 February, 2019, 19:26:29

it seems to not be tracking very accurately

user-21d960 26 February, 2019, 19:42:29

i think my focus is off

papr 26 February, 2019, 19:50:57

@user-21d960 Are you using a 200hz headset or a 120Hz hmd add-on?

user-21d960 26 February, 2019, 19:53:43

I am not sure, how can I tell

user-21d960 26 February, 2019, 19:53:44

@papr

papr 26 February, 2019, 19:56:38

When did you buy the device? Is it the glasses-like frame or the add-on for VR devices?

papr 26 February, 2019, 19:57:08

@user-21d960 Alternatively, post a picture of your device.

user-21d960 26 February, 2019, 19:57:55

Chat image

papr 26 February, 2019, 20:01:00

Ok great. This is a monocular 200Hz Pupil headset. Did you know that you can slide the camera arm forwards? You can even remove the arm completely and place the orange arm extender in between. The extender should be part of the accessories that have been shipped with the headset.

user-21d960 26 February, 2019, 20:01:47

I have been trying to adjust it. I didn't see an extender; I think that would help a lot

user-21d960 26 February, 2019, 20:03:33

But how can I change the focus?

user-21d960 26 February, 2019, 20:03:42

because even if I manually move it farther, it's still quite blurry

user-21d960 26 February, 2019, 20:04:02

also is there a CAD model I can use to make a custom extender?

papr 26 February, 2019, 20:05:28

@user-21d960 The 200Hz cams do not have adjustable focus! A good position and good contrast between the pupil and the remaining image is more important.

papr 26 February, 2019, 20:06:20

Also, the pupil detection on your screenshot looks pretty decent. The only problem is that the eye camera seems to point too far to the top.

user-21d960 26 February, 2019, 20:08:04

ok

user-21d960 26 February, 2019, 20:08:16

for some reason the manual target calibration does not work sometimes

user-21d960 26 February, 2019, 20:08:27

even though in the eye tab the pupil tracking is fine

papr 26 February, 2019, 20:08:36

This is the extender's geometry file: https://github.com/pupil-labs/pupil-geometry/blob/master/Pupil%20Headset%20triangle%20mount%20extender.stl

papr 26 February, 2019, 20:10:29

@user-21d960 make sure that the circle markers are always visible in the world camera. The manual marker calibration also requires you to hold still (from the world camera's point of view) at each calibration location. Finally, make sure to calibrate across a large area of the world camera's field of view.

user-21d960 26 February, 2019, 20:12:05

how can I take up a large area of the world cam if the calibration images are on my laptop?

user-21d960 26 February, 2019, 20:13:10

Also, what calibration method do you recommend?

papr 26 February, 2019, 20:16:00

If you use your laptop, the easiest way is to use the built-in screen marker calibration.

papr 26 February, 2019, 20:17:30

Alternatively, use the single marker calibration. It displays one marker which you need to focus on while moving your head.

user-21d960 26 February, 2019, 20:20:43

Chat image

user-21d960 26 February, 2019, 20:20:45

how can I move the green detection ring

papr 26 February, 2019, 20:21:34

The green ring is the projection of the 3d eye sphere onto the image. It depends on the 2d pupil detection.

papr 26 February, 2019, 20:23:10

In the eye process's general settings, change the view to "Algorithm". Afterwards, please make another screenshot, ideally of the complete eye window. You can minimize the right menu by clicking on the general menu icon multiple times.

user-21d960 26 February, 2019, 20:24:18

Chat image

papr 26 February, 2019, 20:26:07

Ok, this looks quite good already. Have you tried running in the lower resolution mode? You can change the resolution in the eye window's uvc source menu.

user-21d960 26 February, 2019, 20:28:53

Here is low res

user-21d960 26 February, 2019, 20:28:58

Chat image

papr 26 February, 2019, 20:32:40

The dark blue pixels are potential pupil pixels; light blue are potential pupil edges. Ideally, you want to adjust the uvc settings such that only the actual pupil pixels are marked blue. This is not always 100% possible. Try changing the eye camera's exposure mode to auto (uvc source menu). If this does not help, try to slightly increase the Gamma parameter in the post-processing sub menu.

papr 26 February, 2019, 20:36:50

Also, the 3d model initially takes a few seconds to fit. To speed the process up, roll your eyes. 🙄

user-21d960 26 February, 2019, 20:38:49

It seems like my data is good enough, but my computer makes an error noise and the marker does not go green when calibrating unless I get very close

user-21d960 26 February, 2019, 20:39:00

Chat image

papr 26 February, 2019, 20:39:31

That's very good pupil detection indeed.

user-21d960 26 February, 2019, 20:41:32

Is there a way to fix this?

user-21d960 26 February, 2019, 20:41:36

Chat image

user-97bc82 26 February, 2019, 20:41:40

@papr Thank you for your responses, I will attempt to restore the Windows machine to an earlier version. (I switched from the Mini MacBook to a Surface Pro because the Mac had limited recording storage capacity; the MacBook only allowed for 5-minute recording increments)

papr 26 February, 2019, 20:46:06

@user-21d960 Are you trying to enable auto exposure for the world camera? The exposure modes for the world camera are not named intuitively. I do not know the mode names by heart and I am not able to look them up right now. Could you make a screenshot of the options? I know that two of them result in this error message.

papr 26 February, 2019, 20:46:22

@user-97bc82 Good luck!

user-21d960 26 February, 2019, 20:48:32

this video is 6 years old, but his focus is very good

user-21d960 26 February, 2019, 20:48:33

https://www.youtube.com/watch?v=PXo0k7WmGYs

papr 26 February, 2019, 20:50:51

@user-21d960 Yes, the old 120Hz cameras had adjustable focus. But as it turns out, good pupil detection does not require perfect focus when a low resolution is chosen.

user-21d960 26 February, 2019, 20:51:43

ok

user-21d960 26 February, 2019, 20:52:08

The calibration only works if I move very close to the screen on my laptop

papr 26 February, 2019, 20:52:31

Could you increase the marker size?

user-21d960 26 February, 2019, 21:06:13

yes

user-21d960 26 February, 2019, 21:06:24

but if I calibrate it with a large marker the tracking is horrible

papr 26 February, 2019, 21:10:27

Please make a short recording while calibrating and share it with data@pupil-labs.com

papr 26 February, 2019, 21:10:47

I can check it out and hopefully find the issue.

user-21d960 26 February, 2019, 21:13:13

Ok will do

user-21d960 26 February, 2019, 21:18:17

I can just unplug the pupil headset from my computer, no corruption issues, right?

papr 26 February, 2019, 21:18:43

Yes.

user-21d960 26 February, 2019, 21:19:35

@papr email sent

papr 26 February, 2019, 21:25:36

Ok, it looks like your world camera is out of focus. As opposed to the eye cameras, the world camera has adjustable focus.

papr 26 February, 2019, 21:26:17

But your pupil confidence is mostly bad.

user-21d960 26 February, 2019, 21:28:11

Ok

user-21d960 26 February, 2019, 21:28:14

how can I adjust focus

papr 26 February, 2019, 21:29:28

By rotating the lens carefully https://docs.pupil-labs.com/#additional-parts

user-21d960 26 February, 2019, 21:29:29

oh got it nvm

user-21d960 26 February, 2019, 21:29:43

confidence is still low when calibrating

papr 26 February, 2019, 21:30:03

I am able to get slightly better pupil detection with the 2d mode.

papr 26 February, 2019, 21:31:11

The markers are detected well although the lens is out of focus. It is definitely the poor pupil detection that is at fault.

user-21d960 26 February, 2019, 21:33:00

I will 3D print that extender tonight and see if it's any better

user-21d960 26 February, 2019, 21:33:09

but I don't suspect it will help the focus

papr 26 February, 2019, 21:37:35

2d offline detection is reasonably ok, but not great. But the recorded 3d pupil data is very bad. You need to make sure that your 3d model is fit well before the calibration.

user-21d960 26 February, 2019, 21:38:13

how do I switch from 3d to 2d

papr 26 February, 2019, 21:38:45

The 3d model works better with better 2d data. Did you try enabling auto exposure for the eye video as well as changing the eye's gamma values?

papr 26 February, 2019, 21:39:01

You can change to 2d mode in the general settings of the world window.

user-21d960 26 February, 2019, 21:40:59

Yes I played with those two settings

user-21d960 26 February, 2019, 21:41:08

2d seems a bit better but still not great

papr 26 February, 2019, 21:42:09

I do not think that there is a better solution than improving the 2d detection.

user-21d960 26 February, 2019, 21:42:25

How are others tracking so much better?

papr 26 February, 2019, 21:42:35

What do you mean?

user-21d960 26 February, 2019, 21:43:25

Well, there's no way other users have this poor tracking and are still able to use the headset

user-21d960 26 February, 2019, 21:43:34

we can't do anything we need if the tracking is this poor

papr 26 February, 2019, 21:45:37

I agree that the tracking is very poor in your case and that this is not usually so. But pupil tracking depends on a lot of factors; one is good lighting conditions. For example, in the video from Will that you posted above, you can see that there is very high contrast between the pupil and the rest of the eye image. Try to achieve that and your pupil detection will improve as well.

user-21d960 26 February, 2019, 21:49:21

So will lighter eye colors have trouble?

user-21d960 26 February, 2019, 21:49:26

such as blue or green?

papr 26 February, 2019, 22:01:33

The iris color does not play an important role since the eye camera records in the IR spectrum.

user-21d960 26 February, 2019, 22:07:12

if it's IR, then how do contrast, brightness, and other settings affect the tracking?

papr 26 February, 2019, 22:16:45

Please check out the paper on how the pupil detection works: https://pupil-labs.com/blog/2014-05/pupil-technical-report-on-arxiv-org/

contrast, brightness, etc. of the recorded eye image (no matter whether it is in the IR or the visible light spectrum) are important for the edge detection performed during the pupil detection algorithm. Eye trackers usually record in the IR spectrum since the pupil is more clearly visible there, independent of the iris color, than it would be in the visible light range.
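A toy illustration of why that contrast matters (this is not the actual algorithm from the technical report): the first stage of a dark-pupil detector can be thought of as picking out dark pixels, which only works if the pupil is clearly darker than everything around it.

```python
def dark_pixels(image, threshold):
    """Return (row, col) coordinates of pixels darker than `threshold`.

    `image` is a 2d list of 0-255 intensities. This is a toy stand-in for
    the 'potential pupil pixels' stage of a dark-pupil detection pipeline.
    """
    return [(r, c)
            for r, row in enumerate(image)
            for c, value in enumerate(row)
            if value < threshold]


# High contrast: only the pupil (intensity 10) falls below the threshold.
high_contrast = [[200, 200, 200],
                 [200,  10, 200],
                 [200, 200, 200]]

# Low contrast: surrounding pixels (48-60) overlap the pupil's intensity
# (50), so any threshold that includes the pupil also picks up noise.
low_contrast = [[48, 60, 52],
                [60, 50, 60],
                [55, 49, 60]]
```

With `threshold=100`, `dark_pixels(high_contrast, 100)` isolates exactly the pupil pixel, while on `low_contrast` any threshold that keeps the pupil also admits spurious pixels; improving illumination widens that intensity gap, which is why the uvc exposure/gamma adjustments above help.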

papr 26 February, 2019, 22:21:39

I am off for today. If you have any further questions, just post/leave them here. 👍

user-0a2ebc 27 February, 2019, 02:57:03

Hi

user-0a2ebc 27 February, 2019, 02:57:17

Sorry for asking silly question

user-0a2ebc 27 February, 2019, 02:57:41

What is the difference between 3d and 2d camera on your product ?

wrp 27 February, 2019, 03:00:32

Hi @user-0a2ebc 3d world camera is no longer offered, but was using the Intel RealSense R200 depth sensor. The world camera that is on the store is a "traditional" video camera.

user-0a2ebc 27 February, 2019, 03:04:17

Hello @wrp thank you so much for the reply and information

wrp 27 February, 2019, 03:04:28

Welcome @user-0a2ebc

user-64b0d2 27 February, 2019, 08:40:54

@wrp @papr I managed to find the problem. I connected a new monitor to my laptop the other day and started working with the laptop lid closed, on my monitor instead. When I tried to run Pupil Capture with the laptop screen open again, it worked fine as usual.

wrp 27 February, 2019, 08:42:17

Hi @user-64b0d2 thanks for the update. This is definitely a behavior I have not observed before

user-e91538 27 February, 2019, 10:59:49

Hey there pupil, I have a problem with Pupil Player: the Vis Scan Path plugin is not showing up. Does it have any prerequisites for a recording to be available later in Pupil Player? I cannot find it in the plugin manager

user-e91538 27 February, 2019, 11:00:19

And another problem is that Vis Polyline does not work properly; I cannot see any line...

user-e91538 27 February, 2019, 11:00:52

Do i have to install the scan path plugin manually?

papr 27 February, 2019, 11:04:18

@user-e91538 Hi, the Vis Scan Path plugin has been disabled/deprecated a few versions ago due to technical reasons. The Vis Polyline plugin can be enabled in the Plugin Manager menu

user-e91538 27 February, 2019, 11:08:51

@papr thanks for the reply! Sadly it is exactly what I need for my project atm... And it still appears in the docs. Regarding Vis Polyline: I enabled it already but it shows no line at all

user-e91538 27 February, 2019, 11:11:18

@papr ok, I closed and re-enabled it and now it works... but the line is only shown very briefly. This is why I am asking about scan path 😃 Is there any possibility to emulate the scan path plugin?

papr 27 February, 2019, 11:11:21

~~Do you see gaze data via the Vis Circle plugin?~~ ok

user-e91538 27 February, 2019, 11:11:29

yes

papr 27 February, 2019, 11:16:04

No, currently, there is no way to emulate it. You could try to open your recording in an older Player version.

user-e91538 27 February, 2019, 11:16:34

ah ok, up to which version was it working?

papr 27 February, 2019, 11:19:15

1.7 was the latest version

user-e91538 27 February, 2019, 11:20:15

ok thank you very much for your info!

user-e91538 27 February, 2019, 11:30:16

I think it is a really nice plugin; is it planned to re-enable it?

papr 27 February, 2019, 11:35:37

The issue is that gaze mapped in frame 0 is not valid in frame 1 anymore, due to potential movement of the world camera. The old plugin tried to compensate for such movement using optical flow, modifying the gaze data on the fly. Starting with v1.8, we only keep the serialized gaze in memory, which is read-only, so it is not easily possible to modify it on the fly. Disabling this plugin is one of the trade-offs we made in favor of a much more memory-efficient approach to data handling. That is the high-level description.
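As a rough sketch of the compensation step described above (purely illustrative, not the plugin's actual code): the plugin estimated a per-frame camera displacement with optical flow and shifted earlier gaze points by it. The flow estimation itself is omitted here; `frame_shift` is simply a given (dx, dy).

```python
def carry_gaze_forward(gaze_points, frame_shift):
    """Shift gaze points from an earlier frame by the estimated camera
    motion (dx, dy) so they remain roughly valid in the current frame.

    This is the compensation the deprecated Vis Scan Path plugin performed;
    in the real plugin, (dx, dy) was estimated per frame via optical flow.
    """
    dx, dy = frame_shift
    return [(x + dx, y + dy) for x, y in gaze_points]
```

A scan path for the current frame is then the current gaze plus all earlier points carried forward; doing this requires writable gaze data, which is exactly what the read-only serialized storage of v1.8+ no longer allows.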

user-e91538 27 February, 2019, 11:40:10

thanks for your explanation!

wrp 27 February, 2019, 13:14:49
user-41c874 27 February, 2019, 14:54:32

Hey, Is there a way to have a fixed 3d model of the eye instead of it being updated regularly?

user-41c874 27 February, 2019, 14:59:57

And also, when exporting surface gaze points, the frequency matches the world camera's FPS (60 Hz). Is it possible to get surface gaze positions at the frequency of the pupil camera (200 Hz)? Or is it possible to map pupil positions onto the previously recorded surface coordinates (somewhat like an interpolation)? We can't switch to a higher FPS for the world camera because of our low light conditions, and we cannot use a lower spatial resolution.

user-f81efb 27 February, 2019, 15:01:32

Hello, if we apply a manual gaze correction in Pupil Player, does the gaze-on-surface data get updated as well?

user-f81efb 27 February, 2019, 15:02:03

and what are x and y used for in the surface definition?

user-f81efb 27 February, 2019, 15:02:16

Thanks

user-f81efb 27 February, 2019, 15:02:59

also, the auto exposure option in Pupil Capture is not working. Can you tell me what could be the reason?

user-f81efb 27 February, 2019, 15:04:15

@wrp @papr

user-41c874 27 February, 2019, 16:09:02

Hello! Another question: does the "show undistorted image" option in the camera intrinsics plugin have an effect on how surface markers are detected (and hence on the surface gaze points which are exported)?

user-41c874 27 February, 2019, 16:10:30

Or does the capture process automatically correct the distortion without displaying it in the world camera view? Let me know. Thanks! 😃

user-e70d87 27 February, 2019, 19:06:58

What is the focal length of the world camera on the Pupil Tracker?

user-6f86f3 28 February, 2019, 05:33:16

Hi, I have a problem with "No gaze on any surface for this section" when I want to generate a heat map in Pupil Player after recording a video in Pupil Capture. I tried on both Mac and Windows 10.

wrp 28 February, 2019, 07:25:08

@user-6f86f3 have you defined surfaces?

wrp 28 February, 2019, 07:26:41

@user-41c874 There is no way to disable updates of the 3d eye model. However, you could reduce the model sensitivity which would make the model less sensitive to changes. Further customizations would require changes to source code.

user-6f86f3 28 February, 2019, 07:28:17

yes, I did add a surface with approximate x and y

wrp 28 February, 2019, 07:30:46

@user-41c874 when you export surfaces in Pupil Player, you will see world_timestamp as well as gaze_timestamp in the .csv file. Gaze positions are not limited by the world camera FPS. You will see that there can be many gaze positions per world camera frame if your eye cameras are running faster than your world cameras. Please check out https://github.com/pupil-labs/pupil-tutorials/blob/master/02_load_exported_surfaces_and_visualize_aggregate_heatmap.ipynb for a demonstration of how to map gaze positions onto surfaces post-hoc with a small code sample/tutorial.
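The post-hoc mapping can also be sketched with the standard library alone: assign each (faster) gaze timestamp to the world frame whose timestamp is nearest. This is a nearest-neighbour assumption for illustration, not the only valid matching rule, and the column names (`world_timestamp`, `gaze_timestamp`) follow the surface export described above.

```python
import bisect


def group_gaze_by_frame(world_ts, gaze_ts):
    """Map each gaze timestamp to the index of the nearest world frame.

    `world_ts` must be sorted ascending. Returns {frame_index: [gaze ts]}.
    Mirrors mapping high-rate gaze (e.g. 200 Hz) onto lower-rate world
    frames (e.g. 60 Hz) after the fact.
    """
    groups = {}
    for t in gaze_ts:
        i = bisect.bisect_left(world_ts, t)
        if i == len(world_ts):
            i -= 1                      # past the last frame: clamp
        elif i > 0 and t - world_ts[i - 1] <= world_ts[i] - t:
            i -= 1                      # previous frame is at least as close
        groups.setdefault(i, []).append(t)
    return groups
```

With 200 Hz gaze and 60 Hz world timestamps this yields roughly three to four gaze samples per frame, matching the many-gaze-rows-per-`world_timestamp` structure of the surface export.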

user-6f86f3 28 February, 2019, 07:34:11

Chat image

user-6f86f3 28 February, 2019, 07:34:13

Chat image

user-6f86f3 28 February, 2019, 07:34:13

Chat image

user-6f86f3 28 February, 2019, 07:34:45

Here are some screenshot of what I have done

wrp 28 February, 2019, 07:35:59

Hi @user-6f86f3 thanks for the update/clarification. However, I do not see any fiducial markers in your scene video, therefore no surfaces will be defined.

wrp 28 February, 2019, 07:36:19

Please see: https://docs.pupil-labs.com/#surface-tracking (if you haven't already)

wrp 28 February, 2019, 07:39:21

@user-41c874 show undistorted image is only a visualization of the camera intrinsics estimation parameters. You can turn this on/off and it should not have any effect on the data. However, if you perform a new camera intrinsics estimation, this will affect gaze data. You should not need to estimate camera intrinsics unless you change the camera lens or are using a custom (non Pupil Labs) world camera.

user-6f86f3 28 February, 2019, 07:40:27

I see! Thank you very much!

user-6f86f3 28 February, 2019, 07:47:37

@wrp I also have a question about generating heat maps. Is the heat map generated from a single frame? If it is, how can I automatically export a heat map for each frame in the video?

wrp 28 February, 2019, 07:50:34

Hi @user-6f86f3 heatmaps (and all exports) are based on the "trim section". In Pupil Player, you can adjust the trim section by dragging the left and right most ends of the playback timeline.

wrp 28 February, 2019, 07:51:21

Heatmaps per frame do not really make sense, because heatmaps are a method to visualize aggregate data. Perhaps I don't understand your question.

user-6f86f3 28 February, 2019, 07:52:57

@wrp Got it, thank you very much for your help!😁

user-6f86f3 28 February, 2019, 07:57:23

@wrp Sorry, I also have a problem when I do the calibration on my Mac; it gives me the error "Not enough ref point for calibration". But I did exactly the same thing on a Windows 10 computer and it doesn't have this problem.

wrp 28 February, 2019, 08:52:58

This error means that (a) the calibration marker is not well detected and/or (b) the pupil is not well detected

user-5dae58 28 February, 2019, 09:14:30

How can I analyze eye tracking data? 😰

user-ed70a0 28 February, 2019, 09:17:21

Hello! We are using a binocular eye tracker. Can we just shield one side and use it as a monocular device?

papr 28 February, 2019, 09:28:55

@user-ed70a0 You can simply close the eye window of the appropriate side. 😉

user-ed70a0 28 February, 2019, 09:33:23

@papr So is there a difference in the accuracy of doing it with both eyes and with one eye?

papr 28 February, 2019, 09:37:12

@user-ed70a0 Yes, specifically in the part of the field of view corresponding to the disabled eye process. I.e. if you disable the right eye (eye0), then gaze accuracy will be low on the right side. This is due to the difficult pupil detection in the left eye when the subject is looking to the right.

wrp 28 February, 2019, 11:30:45

@user-5dae58 you can start by using Pupil Player: https://docs.pupil-labs.com/#player-workflow

user-c22e3a 28 February, 2019, 15:49:35

Hi, I'm having issues with running the filter_message.m file in Matlab. Sometimes I retrieve the expected topics (pupil.0, pupil.1, gaze.0, gaze.1), but sometimes I get the following error:

Error using containers.Map/subsasgn
Specified key type does not match the type expected for this container.
Error in parsemsgpack>parsemap (line 191): [out(key), idx] = parse(bytes, idx);
Error in parsemsgpack>parse (line 49): [obj, idx] = parsemap(len, bytes, idx+1);
Error in parsemsgpack>parsemap (line 191): [out(key), idx] = parse(bytes, idx);
Error in parsemsgpack>parse (line 49): [obj, idx] = parsemap(len, bytes, idx+1);
Error in parsemsgpack (line 16): [obj, ~] = parse(uint8(bytes(:)), 1);
Error in recv_message (line 16): payload = parsemsgpack(payload); % parse payload
Error in filter_messages (line 68): [topic, note] = recv_message(sub_socket, 2048);

The only thing I'm interested in is the 2D position of the gaze within a tracked surface. How can I do that?

papr 28 February, 2019, 16:59:20

Yeah, for some reason, the Matlab msgpack implementation is not able to decode our binocular gaze data. Did you try turning off one eye?
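For reference, the filtering itself is simple once a surface message has been decoded. Below is a minimal Python sketch of extracting on-surface 2D gaze positions from one surface datum; the field names (`gaze_on_srf`, `norm_pos`, `on_srf`) follow Pupil's surface datum format at the time but should be verified against your version, and the example datum is hand-made for illustration.

```python
def on_surface_positions(surface_datum, min_confidence=0.6):
    """Extract normalized 2D gaze positions that actually fall on the surface.

    surface_datum: dict as published on a `surfaces.<name>` topic.
    Field names are assumptions to check against your Pupil version.
    """
    positions = []
    for gaze in surface_datum.get("gaze_on_srf", []):
        if gaze.get("on_srf") and gaze.get("confidence", 1.0) >= min_confidence:
            # (x, y) in [0, 1] surface coordinates
            positions.append(tuple(gaze["norm_pos"]))
    return positions

# Hand-made example mimicking one surface message:
datum = {
    "name": "screen",
    "gaze_on_srf": [
        {"norm_pos": [0.4, 0.6], "on_srf": True, "confidence": 0.9},
        {"norm_pos": [1.3, -0.2], "on_srf": False, "confidence": 0.9},
    ],
}
print(on_surface_positions(datum))  # [(0.4, 0.6)]
```

The same logic is what the filter_gaze_onSurface.py helper mentioned later in this thread applies to live messages.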

user-41c874 28 February, 2019, 17:17:45

Regarding the camera intrinsics estimation: is there a way to check whether the software is using the original/built-in intrinsics or not? It is possible that while exploring we changed them by mistake.

user-c22e3a 28 February, 2019, 17:35:44

@papr thanks I'll try that and see what would be the result

user-41c874 28 February, 2019, 17:56:01

Regarding the surface gaze positions: we send surface gaze positions remote and use them online while tracking. So I'll check which ones we are utilising, world_timestamps or gaze_timestamps. Thanks!!
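Since gaze timestamps and world-frame timestamps come from different streams, correlating them usually means finding the world frame nearest in time to each gaze datum. A minimal stdlib sketch (the timestamp values below are made up; Pupil timestamps are monotonic within a recording, which the binary search relies on):

```python
from bisect import bisect_left

def closest_world_index(world_ts, gaze_t):
    """Index of the world frame whose timestamp is nearest to gaze_t.

    world_ts must be sorted in ascending order.
    """
    i = bisect_left(world_ts, gaze_t)
    if i == 0:
        return 0
    if i == len(world_ts):
        return len(world_ts) - 1
    # pick the nearer of the two neighbouring frames
    return i if world_ts[i] - gaze_t < gaze_t - world_ts[i - 1] else i - 1

world_ts = [0.0, 0.033, 0.066, 0.100]  # illustrative ~30 fps world clock
print([closest_world_index(world_ts, t) for t in (0.001, 0.050, 0.099, 0.2)])
# [0, 2, 3, 3]
```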

user-41c874 28 February, 2019, 17:56:48

remotely*

wrp 28 February, 2019, 17:57:14

@user-41c874 restart with default settings to restore camera intrinsics.

user-41c874 28 February, 2019, 17:58:41

Aha ! That's simple! Should've thought of that ! Perfect ! Thanks. Also, thanks for the filter_gaze_onSurface.py file which was suggested to me before. That's a great debugging tool !

user-21d960 28 February, 2019, 18:24:07

I am still having lots of trouble getting good detection with the headset. Should I try Ubuntu? Would that help at all?

user-794008 28 February, 2019, 18:27:21

Hello, I am completely new here. I am uncertain about setting up Pupil properly. 1. I don't get good detection, no matter what calibration I try. Also, most of the time I don't see the fixation on the screen. I have followed the docs ... I am mainly interested in doing eye tracking in the "wild" and I don't see how this would be possible. Should I be looking at some specific tutorial?

user-21d960 28 February, 2019, 18:28:42

@user-794008 I am having the exact same issues. I've tried many settings and am still not getting good detection, even in a well-lit room, and I also want to go out in the wild eventually.

user-794008 28 February, 2019, 18:29:41

Yes, I have spent hours. We have 8 eye trackers and have planned multiple projects... but right now I am really worried!

user-21d960 28 February, 2019, 18:30:34

Same here

user-21d960 28 February, 2019, 18:31:06

Could a more experienced member please help us out, let us know perhaps how you get such good tracking

user-794008 28 February, 2019, 18:31:21

Yes, it would be much appreciated!

user-794008 28 February, 2019, 18:33:25

Also when I open pupil I never see a camera view of my pupils as in the videos

user-794008 28 February, 2019, 18:33:40

Is that to be expected?

user-21d960 28 February, 2019, 18:34:46

Chat image

user-21d960 28 February, 2019, 18:34:51

make sure you have detect eye 0 on

user-21d960 28 February, 2019, 18:34:54

its off by default

user-794008 28 February, 2019, 18:36:56

I had both as on....

user-21d960 28 February, 2019, 18:37:30

then you should have another window open with a view of the eye

papr 28 February, 2019, 18:38:31

Both eye windows are enabled by default in the newer versions.

user-794008 28 February, 2019, 18:39:06

yeah, but nothing is showing in that window

papr 28 February, 2019, 18:39:20

Do you mean it is gray?

user-794008 28 February, 2019, 18:39:37

Chat image

papr 28 February, 2019, 18:41:23

Looks like the eye cameras are overexposing. Are you outside? Please enable the auto exposure mode in the eye windows. You can find it in the UVC source menu.

user-794008 28 February, 2019, 18:43:15

I am inside.... you mean in the UVC source menu?

user-794008 28 February, 2019, 18:43:28

It did not help, it just made things a darker grey...

user-794008 28 February, 2019, 18:44:02

Chat image

papr 28 February, 2019, 18:44:16

Are you wearing the headset?

user-794008 28 February, 2019, 18:44:44

yes

user-794008 28 February, 2019, 18:44:49

you know... though you are right

user-794008 28 February, 2019, 18:45:05

it just does not point at my eyes!

papr 28 February, 2019, 18:45:54

Haha, yeah, the cameras are indeed adjustable 🙂

user-794008 28 February, 2019, 18:46:03

I am an idiot... 😄

user-794008 28 February, 2019, 18:47:46

So, how well should I have the eyes in these views?

papr 28 February, 2019, 18:48:12

Such that, when you move your eyes, the Pupil does not leave the image frame

user-794008 28 February, 2019, 18:48:45

I dont know if i have a weird face

papr 28 February, 2019, 18:48:57

Use the orange arm extenders, if you do not find a good position with the default hardware setting

user-794008 28 February, 2019, 18:48:58

but anatomically it looks impossible for both eyes

user-794008 28 February, 2019, 18:49:15

i dont have those

papr 28 February, 2019, 18:49:39

They should be part of the accessories that came with the eye tracker

papr 28 February, 2019, 18:50:02

A small shiny bag, in the center of the ring

user-794008 28 February, 2019, 18:50:15

mmm... i ll need to check ... it would be in the office

user-794008 28 February, 2019, 18:50:24

is it for all models?

papr 28 February, 2019, 18:50:35

Yes

user-21d960 28 February, 2019, 18:50:36

I didnt get one in my bag

papr 28 February, 2019, 18:51:32

@user-21d960 were you able to print your own? If not, please contact info@pupil-labs.com

user-21d960 28 February, 2019, 18:52:41

I tried but the model is pretty small so my print failed

user-794008 28 February, 2019, 18:52:56

I ll also check and come back to you about the extender....

user-794008 28 February, 2019, 18:57:17

Thanks for the help! It has definitely been an improvement!

user-794008 28 February, 2019, 19:02:54

For what it's worth... it is impossible to have both eyes within the view..... I am rooting for the extender....

user-21d960 28 February, 2019, 19:03:19

you have one eye per camera @user-794008

user-794008 28 February, 2019, 19:04:42

yes... but I have to hold them away of my face

user-794008 28 February, 2019, 19:04:50

to have each eye in its camera

user-21d960 28 February, 2019, 19:06:20

https://gyazo.com/27e6b7822a3d161f6053c3e463d5c76e

user-21d960 28 February, 2019, 19:06:29

@papr

user-21d960 28 February, 2019, 19:06:34

gif of my detection

user-21d960 28 February, 2019, 19:06:36

whats wrong

papr 28 February, 2019, 19:10:25

The algorithm does not find the pupil edges consistently enough. Have you tried having somebody else wear the headset?

user-21d960 28 February, 2019, 19:13:40

Yes

user-21d960 28 February, 2019, 19:13:55

Are you sure the camera not being focused does not matter? I feel like it's a big deal.

user-21d960 28 February, 2019, 19:18:50

confidence is zero

user-21d960 28 February, 2019, 19:18:58

also my

Chat image

user-21d960 28 February, 2019, 19:20:50

0 with 3d tracking

mpk 28 February, 2019, 19:27:23

@user-21d960 I think two things: 1) the Max Pupil Radius setting is too small; 2) the detection threshold is too high. Maybe reset to default settings?

user-21d960 28 February, 2019, 19:41:05

@mpk Ive just tried your suggestions and its about the same

user-6f86f3 28 February, 2019, 19:41:09

Hi, I still have a problem generating a heat map in Pupil Player even though I used markers; the error message "No gaze on any surface for this section" appears on Windows 10.

user-6f86f3 28 February, 2019, 19:41:22

Chat image

user-21d960 28 February, 2019, 19:41:51

Chat image

user-21d960 28 February, 2019, 19:42:17

@user-6f86f3 what are those papers on the corners of your screen?

user-6f86f3 28 February, 2019, 19:43:32

That's surface tracker marker

user-6f86f3 28 February, 2019, 19:43:55

fiducial markers

user-6f86f3 28 February, 2019, 19:44:15

It use to define the surface

user-21d960 28 February, 2019, 19:44:28

Is that required?

user-6f86f3 28 February, 2019, 19:44:34

yep

user-21d960 28 February, 2019, 19:44:39

wut

user-21d960 28 February, 2019, 19:44:44

it never said that in documentation

user-6f86f3 28 February, 2019, 19:45:14

it's in the docs under Surface Tracker

user-6f86f3 28 February, 2019, 19:45:32

it's required for heatmap generation

user-6f86f3 28 February, 2019, 19:45:37

not for others

user-21d960 28 February, 2019, 19:45:40

Oh

user-21d960 28 February, 2019, 19:45:45

I see

user-21d960 28 February, 2019, 19:45:48

so I dont need it

user-21d960 28 February, 2019, 19:45:59

did you ever have trouble tracking when you first got the headset?

user-6f86f3 28 February, 2019, 19:46:27

it's tracking good

user-21d960 28 February, 2019, 19:46:41

Can you show a screenshot of your eye window?

user-6f86f3 28 February, 2019, 19:47:40

@mpk @papr Hi, I have a problem generating a heat map in Pupil Player even though I used markers; the error message "No gaze on any surface for this section" appears on Windows 10.

user-6f86f3 28 February, 2019, 19:52:23

Chat image

user-6f86f3 28 February, 2019, 19:57:04

If I do surface tracking in Pupil Capture, the defined surface looks very weird.

user-6f86f3 28 February, 2019, 19:57:14

Chat image

user-6f86f3 28 February, 2019, 19:57:29

Feels like it only detects the top two markers.

user-6f86f3 28 February, 2019, 20:15:19

I think I found the problem: the markers have to be ordered to match the shape of the surface, right?

user-21d960 28 February, 2019, 20:19:59

@user-6f86f3 are u using the extender on the eye camera?

user-6f86f3 28 February, 2019, 20:22:25

yep

user-6f86f3 28 February, 2019, 20:22:51

otherwise it cannot locate my eye

user-e7102b 28 February, 2019, 20:54:35

@user-6f86f3 Your surface markers look quite small. I had to make mine larger in order for them to be detected.

user-e7102b 28 February, 2019, 20:59:38

@user-6f86f3 I also had to place my markers within the bounds of the screen (in each corner), otherwise they wouldn't be detected. I think this is because our screen is really bright, so any objects outside the screen get really dark.

papr 28 February, 2019, 21:00:24

Also, decrease the minimum marker perimeter parameter if your markers are small. But be aware that this might increase false positive detections.
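Once markers are detected and gaze does land on the surface, a heatmap is essentially just binned normalized positions. As a sanity check independent of Player, one can bin the on-surface positions from Player's surface export directly. A minimal stdlib sketch; the input format (pairs of x/y in [0, 1] surface coordinates, e.g. x_norm/y_norm columns in the export CSV) is an assumption to check against your export, and the points below are made up:

```python
from collections import Counter

def heatmap_grid(norm_positions, bins=4):
    """Bin normalized on-surface gaze positions into a bins x bins grid.

    norm_positions: iterable of (x, y) with x, y in [0, 1] surface
    coordinates. Points outside [0, 1] are treated as off-surface
    and skipped.
    """
    counts = Counter()
    for x, y in norm_positions:
        if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0:
            col = min(int(x * bins), bins - 1)  # clamp x == 1.0 into last column
            row = min(int(y * bins), bins - 1)
            counts[(row, col)] += 1
    return counts

pts = [(0.1, 0.1), (0.12, 0.14), (0.9, 0.9), (1.5, 0.5)]  # last point is off-surface
grid = heatmap_grid(pts)
print(grid[(0, 0)], grid[(3, 3)])  # 2 1
```

If every cell comes back zero, no gaze fell on the surface, which matches Player's "No gaze on any surface for this section" message.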

user-21d960 28 February, 2019, 21:00:52

Is it possible my computer isn't fast enough?

user-21d960 28 February, 2019, 21:00:56

It's pulling 100 fps.

user-08f2c2 28 February, 2019, 23:01:05

Hi everyone. 😃 I have a calibration question: If I calibrate using screen markers, can I do finger calibration as well, or will the former override the latter? Is one better than the other for gaze accuracy?

End of February archive