core


user-1221c7 01 July, 2018, 17:28:42

Hello, I tried pupil_remote_control.m and got the error message that req_socket is undefined. How can I fix it?

user-cd9cff 01 July, 2018, 22:09:49

@user-1221c7 The req_socket needs to be set up first before you can run the program

user-cd9cff 01 July, 2018, 22:10:00

change the socket variable to req_socket

user-cd9cff 01 July, 2018, 22:10:12

so that socket will not exist anymore

user-cd9cff 01 July, 2018, 22:10:16

run the program once

user-cd9cff 01 July, 2018, 22:10:38

you may or may not get another error that socket is undefined

user-cd9cff 01 July, 2018, 22:10:48

change req_socket back to socket

user-cd9cff 01 July, 2018, 22:10:51

you will be fine

user-cd9cff 01 July, 2018, 22:11:24

you will have to repeat this every time the req_socket variable is erased

user-1221c7 01 July, 2018, 22:26:21

thanks

user-f9fccd 02 July, 2018, 01:25:34

Hello!

user-f9fccd 02 July, 2018, 01:26:58

I was looking into how I could add eye tracking to a VR app I want to develop, and I found this channel via GitHub

user-f9fccd 02 July, 2018, 01:29:41

I am not a developer though, I am a researcher and I'd like to know if I can use Core for a mobile VR app (Android) through Unity. Any idea? Thanks

user-c351d6 02 July, 2018, 08:08:44

Thank you guys for fixing the issue with the h.264 compression. It seems to work now!

user-8779ef 02 July, 2018, 12:28:20

@user-f9fccd Pupil labs trackers (and most trackers) require special IR cameras and illuminators.

user-3f0708 02 July, 2018, 12:36:18

Has anyone already used the data obtained from eye tracking to generate graphs such as accuracy, time, or error rate?

user-80b544 02 July, 2018, 13:16:51

Hey guys, can this be used to replace a mouse?

user-a8c41c 02 July, 2018, 14:04:57

What version of Unity has everyone been using with the pupil labs trackers? My group and I have been using 2018.1.1f1. I ask this because we have gotten the scene to appear on the monitor, however we have not been able to see the scene through the actual Vive headset. Can anyone help with this issue?

user-c351d6 02 July, 2018, 15:05:29

@mpk Does the h.264 compression have a bad influence on the surface tracking? I get the feeling it's not working properly with compression.

user-cd9cff 02 July, 2018, 15:59:53

@papr Hello, were you able to get around to using gaze subscription in Matlab?

user-c351d6 02 July, 2018, 16:25:55

I'm just wondering if you are aware of the bug which causes Pupil Player to crash when surface tracking is activated? When surface tracking is activated and you start Pupil Player with a recording, the surface tracking plugin crashes right away. You can reactivate it by disabling and enabling it again. However, when there is no pupil data, the surface tracking will crash immediately again.

user-bcaa09 02 July, 2018, 17:12:52

Hi guys, I am trying to run Pupil Mobile on my Android phone. My host machine is Windows. Both are on the same local wifi network and USB debugging is enabled on Android. Whenever I run Pupil Mobile with the tracker connected to the phone and then run Pupil Capture, this error appears. Any idea where I can start looking?

Chat image

user-3f0708 02 July, 2018, 18:46:58

Is it possible to do the eye tracking while wearing prescription glasses?

user-bfecc7 02 July, 2018, 20:13:59

@user-525392 I run into this problem often enough. Give me a few minutes to give you a good answer.

user-bfecc7 02 July, 2018, 20:28:55

@user-525392 So the short answer on how to begin solving this is to delete the pupil_capture_settings folder. Once you delete this it should allow you to open Pupil Capture (PC). My guess is that the first time you opened PC on your Windows computer you had the Pupil Remote plugin on and your computer was connected to the internet. With this plugin running and "use primary network interface" toggled, Pupil Labs is going to assume that when running Pupil Mobile this is the network it should be looking for, not your local wifi, which probably isn't set on your computer as the primary port. Long term solutions: set up your local wifi as the computer's primary port, or, as a safety/sanity check, make sure your local wifi is the only network connected to your computer when opening PC. Untoggling the Pupil Remote plugin won't force you to use your primary network port if you still want your computer connected to multiple networks.

mpk 03 July, 2018, 07:38:57

@user-bcaa09 would you mind raising an issue on github.com/pupil-labs/pupil regarding this error? We have not seen this before.

mpk 03 July, 2018, 07:40:07

@user-c351d6 @papr can you check if we can recreate this issue (surface tracker crashes when there is no gaze/pupil data)?

mpk 03 July, 2018, 07:41:04

@user-c351d6 I have not seen degradation of surface tracking due to h264 compression, but it could certainly be possible. I don't know what the best way to solve this would be though.

mpk 03 July, 2018, 07:41:37

@user-c351d6 actually playing with the compression rate could be a way to address this issue!

user-c351d6 03 July, 2018, 10:24:35

@mpk @papr It seems like the surface tracker plug-in crashes when there is either no square_marker_cache or no surface_definition, and it definitely also has something to do with the availability of pupil data. I could create an issue or contribute to an existing one if you reopen it.

user-c351d6 03 July, 2018, 10:33:43

@mpk I did some tests; there is a decrease in surface tracking performance, especially when there are fast movements. Markers in compressed files are tracked worse during movement than in uncompressed videos. We are possibly moving to a wifi connection to get uncompressed files. However, this probably needs a fast wifi with a high data rate. Do you have any experience with wifi routers, and can you maybe recommend a reliable one?

papr 03 July, 2018, 10:39:32

@user-c351d6 offline detection is not an option for you?

mpk 03 July, 2018, 10:43:39

@papr I don't think the issue is online/offline rather what h264 does to the video.

mpk 03 July, 2018, 10:44:16

MJPEG on device is not an option because the 4 GB file size limit is quickly reached

user-c351d6 03 July, 2018, 10:45:32

@mpk Yes, around 10 minutes. And we also have the issue that we need to provide data around 30 minutes after the experiment has finished.

mpk 03 July, 2018, 10:46:07

It's a bit hard to believe that MJPEG makes 4 GB files in under 10 minutes. Let me double check that.

user-c351d6 03 July, 2018, 10:47:21

In this context I also noticed that offline detection is much faster on Mac than on Windows, even though the Windows computer I used was actually faster.

user-f68ceb 03 July, 2018, 10:50:35

Hi, does anybody know how to create a gaze plot similar to this one:

user-8779ef 03 July, 2018, 12:52:30

Taken by Jeff Pelz @ RIT using a DSLR with IR filter removed. Subsequent photos led us to believe that the Pupil Labs illuminators are actually more diffuse than this image would have you believe.

Chat image

papr 03 July, 2018, 12:55:36

This image looks great!

user-8779ef 03 July, 2018, 12:56:50

...but also creepy, right? Our intention was to test if we could reposition the lights more effectively, especially after adding the extenders purchased from Shapeways. Some students are still looking at the many images we took. I'll let you know if we develop any firm opinions on the matter.

papr 03 July, 2018, 12:57:32

It looks like a diving helmet 😃

user-8779ef 03 July, 2018, 12:58:43

When we held a solid color matte surface at the tangent to the camera, the illumination was fairly uniform. This suggests that your illuminators are quite diffuse, and changing the angle will benefit us little.

Chat image

user-8779ef 03 July, 2018, 13:02:37

@papr I've just shared the google drive folder with your team.

user-8779ef 03 July, 2018, 13:17:36

@papr I've been meaning to suggest... you should just create a saccade detector, not a fixation detector. Your fixation detector finds when the eye is still, which will only happen when the object is stable in head-centered coordinates.

user-8779ef 03 July, 2018, 13:18:09

...a fixation, however, is when gaze is still upon something in world-centered coordinates. When using a mobile tracker, I would define that simply as "not in saccade."

user-8779ef 03 July, 2018, 13:18:45

One example of the difference in the two approaches is that my broader definition would include VOR and pursuit as "fixation."

user-8779ef 03 July, 2018, 13:19:01

...or, perhaps it's more accurate to describe them as "inter-saccadic intervals."

user-8779ef 03 July, 2018, 13:19:23

Really depends on the use case. Your style of fixations is good for user interaction (e.g. gaze-mediated button selection)

user-8779ef 03 July, 2018, 13:19:44

...but if the goal is to identify when the person is foveating something, then the ISI is more appropriate.
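
[Editor's note] To illustrate the ISI idea discussed above, here is a minimal sketch (not Pupil's detector), assuming ts is an array of timestamps in seconds and angles_deg the corresponding gaze direction in degrees; the 100 deg/s threshold is a placeholder to be tuned per setup:

import numpy as np

def isi_intervals(ts, angles_deg, velocity_threshold=100.0):
    # per-sample angular velocity in deg/s (assumes strictly increasing ts)
    velocity = np.abs(np.diff(angles_deg)) / np.diff(ts)
    in_saccade = velocity > velocity_threshold
    intervals = []
    start = None
    for i, saccading in enumerate(in_saccade):
        if not saccading and start is None:
            start = i  # a non-saccade run begins
        elif saccading and start is not None:
            intervals.append((ts[start], ts[i]))  # run ends at this saccade
            start = None
    if start is not None:
        intervals.append((ts[start], ts[-1]))
    # "fixations" in the broad sense: VOR and pursuit are included
    return intervals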

user-8779ef 03 July, 2018, 13:20:20

I'm going to send you some code.

papr 03 July, 2018, 13:25:17

We will probably be adopting this though: https://www.nature.com/articles/s41598-017-17983-x.pdf

papr 03 July, 2018, 13:27:17

Thank you for your code though! I will have a look at it. 🙂

mpk 03 July, 2018, 13:37:49

@user-8779ef that's a very intriguing idea. We would love to try this out!

user-8779ef 03 July, 2018, 13:41:09

@papr Yeah, I've been loving Otto Lappi's stuff. My students read this one, but I'm behind. Keep in mind that he made that for head-fixed setups, so I imagine that it will omit VOR/pursuit. Is that a problem? Again, depends on the application... but you guys should be clear about how you define fixation in the instructions for your classifier. You're defining a fixation as when the eye is not rotating. My preferred definition, and the one that I think is most intuitive, is when gaze is stable on something that is fixed in the world frame. For example, staring at an object lying on the ground while walking up to it is still a fixation. Your algorithm would miss that.

user-8779ef 03 July, 2018, 13:42:08

...or incorrectly classify it as pursuit, if it's using algorithms designed for the head-fixed context.

papr 03 July, 2018, 13:42:23

I understand. The current algorithm is far from optimal. (independently of the head-fixed assumption)

user-8779ef 03 July, 2018, 13:43:03

Yeah, but not bad. We're using it - we are just careful to realize what it's doing. Other users won't be so aware unless you spell it out for them.

user-8779ef 03 July, 2018, 13:43:40

We're also doing some stuff for Google - trying to classify eye+head movements from the velocity signal using IMU + a Pupil Labs tracker.

user-8779ef 03 July, 2018, 13:43:55

IMU + stereo RGB for head pose estimation.

user-8779ef 03 July, 2018, 13:44:12

This will produce a database. We will share.

user-8779ef 03 July, 2018, 13:45:14

Rakshit and his creation!

Chat image

papr 03 July, 2018, 13:45:23

My colleague is also working on head-pose estimation based on marker detection. This does not require stereo cameras, but it does require markers in the camera's field of view.

user-8779ef 03 July, 2018, 13:45:40

Rakshit and his matlab interface for hand labelling training data!

Chat image

papr 03 July, 2018, 13:45:41

Hardware requirement vs additional visual stimuli trade-off

papr 03 July, 2018, 13:47:19

"Others users won't be so aware unless you spell it out for them." - The biggest reason why we have to improve our documentation.

user-8779ef 03 July, 2018, 13:47:37

Indeed.

user-8779ef 03 July, 2018, 13:48:14

Ok, this is fun, but responsibilities call. Chat later, of course.

papr 03 July, 2018, 13:48:27

👋

user-810714 03 July, 2018, 17:30:40

Hello, I have some doubts about processing the eye tracking data. 1. Is there a way to define areas of interest without using the markers when doing the recordings?

user-810714 03 July, 2018, 17:35:37

What is the best way to create heatmaps for videos?

user-988d86 03 July, 2018, 19:52:12

Does anyone have a good suggestion for the best values for pupil_size_min and pupil_size_max? It looks like pupil_size_min can be anywhere from 1-250, and max anywhere from 50-400. That leaves a lot of room for different values. If somebody has suggestions for values that have worked well, I'd love to hear them. We've been struggling with a lot of bad readings and are trying to get our confidence as high as we can.

user-8779ef 04 July, 2018, 02:24:43

@user-988d86 These are pixel values, so this depends entirely on camera positioning.

user-8779ef 04 July, 2018, 02:26:31

...because the size of the pupil in the image (measured in pixels) will vary with the distance of the camera from the eye. Try using the "algorithm view" to find an acceptable range.

user-01ce8a 04 July, 2018, 14:10:05

We're getting ready to get a Pupil Mobile bundle with the binocular high speed recording, but I'm confused about audio recording. The docs say that there is a sensor area on the Android app for sensors such as audio or IMU. But 1) is there a default sensor input that already works, and 2) how is the sound recorded? Is the idea that you attach a microphone to the glasses frame and plug it into the sound input of the Android device? Is anyone currently doing this who would have hardware recommendations?

user-01ce8a 04 July, 2018, 14:11:32

I would love to hear from anyone who is using the Pupil Mobile setup, particularly anyone who has or is interested in helping to develop a system for coding fixations into AOIs (perhaps along the lines of what SMI did with their ETG system), or something more automated.

user-8944cb 04 July, 2018, 17:13:06

Hello, I have an additional question regarding the frame rate which I will be happy to get some help with. We are working on syncing the 2D binocular eye tracker data and motion capture, and want to decide which frame rate to record the motion capture with / interpolate the data to, so that both have similar frame rates. We are recording at 200 Hz with the two eye cameras, and 120 Hz with the world view camera. In the data exported from Pupil Player there are 400 (plus minus 2-3) timestamps with norm gaze positions per second. Would it be correct to interpolate the data based on the timestamps, for example to 350 Hz (so that we don't upsample), and record the motion capture at 350 Hz? Or should we use the individual timestamps for both eye cameras separately based on the timestamps in the "base_data"? If the latter is better, how and at which timestamp do I look, as both eyes have different timestamps in each row of the "base_data"? Thank you for your help!

user-8779ef 05 July, 2018, 18:27:46

Hey guys, any undocumented ways to adjust LED power? We believe the IR illuminance may be too low when using extenders.

user-e2056a 05 July, 2018, 20:11:49

Hello, I was wondering if we can find the number of fixations on an AOI during a certain time period during the data collection? Thank you.

user-2686f2 05 July, 2018, 23:17:35

Does focus matter at all for the gaze tracking algorithms? I normally wear glasses, but I cannot wear them and the Pupil hardware at the same time. Would I need to wear contact lenses to properly calibrate and use the eye tracking, or could I forgo the contacts and look around a slightly blurry room and still get the same accuracy?

mpk 06 July, 2018, 05:40:31

@user-2686f2 depends on your vision. You can do eye tracking without glasses as well. I'm guessing accuracy might take a hit.

mpk 06 July, 2018, 05:41:33

@user-8779ef at 192x192 or 400x400 px? We find the lower resolution to yield better results and to require less illumination.

mpk 06 July, 2018, 05:42:34

@NahalNrz#1253 I would do as you suggest: use the gaze data and interpolate.
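
[Editor's note] A minimal sketch of that interpolation, assuming the exported gaze timestamps and one gaze coordinate are already loaded as numpy arrays ts and x (names are placeholders):

import numpy as np

def resample(ts, values, rate_hz=350.0):
    # build a uniform time base spanning the recording
    grid = np.arange(ts[0], ts[-1], 1.0 / rate_hz)
    # linearly interpolate the irregular gaze signal onto it
    return grid, np.interp(grid, ts, values)

# grid, x_350 = resample(ts, x)
# _, y_350 = resample(ts, y)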

user-b08428 06 July, 2018, 12:48:12

Hey guys, first time with the Pupil. I just downloaded Pupil Capture for macOS

user-b08428 06 July, 2018, 12:48:43

Connected the Pupil and launched the app, but there is no feed. Is there any "On" button or "Start" ?

user-b08428 06 July, 2018, 12:48:57

Or should the feed start directly? What/How can I troubleshoot?

user-b08428 06 July, 2018, 12:49:15

(very excited to get started with this device :))

wrp 06 July, 2018, 12:57:27

@user-b08428 feeds should start automatically. Can you ensure the USB is plugged in fully?

wrp 06 July, 2018, 12:58:27

Sometimes you need to push the USBC end in a bit (slightly more force than you may be accustomed to) to ensure that it's firmly connected

wrp 06 July, 2018, 13:00:38

BTW @user-b08428 what macOS version are you using?

user-2686f2 06 July, 2018, 13:30:26

Is the surface information actually recorded during a Pupil Capture session? I can only seem to get an export of the surface information if I load a session into Pupil Player and then use the Offline Surface Tracker. This seems to yield the same information, regardless of whether or not I used the Surface Tracker plugin during the Capture session.

papr 06 July, 2018, 14:44:16

@user-2686f2 that is correct. The online surface tracker is meant for live interaction. The offline version does the exact same thing as the online version.

user-36b7fc 06 July, 2018, 18:38:18

Is there any way to play Pupil Capture videos using media players such as wmp/vlc? I am using the videos for training an object detection task, but am not able to open the videos in the tagging software. So is there any way to convert the videos to a normal playable format?
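
[Editor's note] One hedged option, assuming OpenCV is installed and the recording's scene video is named world.mp4 (the path is an assumption): re-encode it frame by frame into a plain MPEG-4 file that common players and tagging tools accept. Note this does not preserve Pupil's per-frame timestamps:

import cv2

reader = cv2.VideoCapture("world.mp4")
fps = reader.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if fps is unknown
width = int(reader.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(reader.get(cv2.CAP_PROP_FRAME_HEIGHT))
writer = cv2.VideoWriter("world_converted.mp4",
                         cv2.VideoWriter_fourcc(*"mp4v"),
                         fps, (width, height))
while True:
    ok, frame = reader.read()
    if not ok:
        break  # end of video
    writer.write(frame)
reader.release()
writer.release()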

user-cc65ff 07 July, 2018, 13:00:58

I'm a beginner. Can you please guide me on how to start using this?

wrp 07 July, 2018, 13:14:08

@user-cc65ff please check out the docs if you haven't already: https://docs.pupil-labs.com

user-3f0708 07 July, 2018, 19:22:59

Good afternoon. I am using the mouse_control.py code to navigate an associated web interface via an extension in the Chrome browser. But when I run the mouse_control.py code, the mouse movement is not computed in real time: it lags, or the mouse hangs. Could someone help me solve this problem?

wrp 08 July, 2018, 00:26:30

There should not be noticeable latency if you're running the script and Pupil Capture on the same machine. My suspicion is that your machine may already be using most of the CPU available for Pupil Capture. What are the specs of your machine @user-3f0708?

user-e2056a 09 July, 2018, 01:53:00

Hello! What does the "confidence threshold" in fixation detector settings mean? Thank you

papr 09 July, 2018, 07:37:55

@user-e2056a Fixations are based on gaze. Low confidence gaze can be very inaccurate. Including these can lead to a lot of false negative detections. The threshold sets the minimum confidence that a gaze datum needs in order to be considered for the fixation detection.
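
[Editor's note] In other words, a sketch, assuming gaze_data is a list of gaze datums as exported or received from the network API, each carrying a 'confidence' key in [0, 1]:

# example value only, not a recommendation
confidence_threshold = 0.8

# only datums at or above the threshold are considered for fixation detection
usable = [g for g in gaze_data if g["confidence"] >= confidence_threshold]
print("kept %d of %d gaze datums" % (len(usable), len(gaze_data)))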

user-3f0708 09 July, 2018, 12:26:32

@wrp The specifications of my machine are a Dell notebook with 8 GB of RAM and a 2.5 GHz Intel Core i7 processor

user-90270c 09 July, 2018, 21:04:02

Hi, sitting with a bunch of students trying to figure out why a colleague's Pupil Pro with RealSense world camera doesn't even show RGB (it is not even detected). We see that in Oct. 2017, using the RealSense required us to make changes and compile. Is that still true?

user-bfa5df 10 July, 2018, 04:32:54

Hey hello everybody, I am working on an experiment and I need to record pupil size data, but not in pixels; I need mm. Has anyone done this before? I am guessing that I need to know the distance between the camera and the pupil, but it will be changing all the time depending on the gaze; also, the camera in some cases is not exactly in front of the pupils, it sits at an angle. How do you deal with these issues? Finally, if it is necessary to measure the distance between camera and pupils, how do you do it so that it is not uncomfortable for the participants? Too many questions, I know. Thank you in advance for your help.

papr 10 July, 2018, 07:41:00

@user-90270c Which OS do you use? And does your Computer have a USB3 port?

papr 10 July, 2018, 07:45:09

@user-bfa5df The solution is to use our 3d model. It provides the diameter in mm. Be aware that it does not account for corneal refraction. You can read up on how this influences the actual pupil size: https://dl.acm.org/citation.cfm?id=3204525
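
[Editor's note] A sketch of reading that value from a Pupil Player export, assuming a pupil_positions.csv with 'method' and 'diameter_3d' columns (column names per the export format of this era; verify against your own export) and pandas installed:

import pandas as pd

df = pd.read_csv("pupil_positions.csv")
# keep only datums produced by the 3d detector
df_3d = df[df["method"].astype(str).str.contains("3d")]
# diameter in mm from the 3d eye model; corneal refraction is not modelled
print(df_3d["diameter_3d"].describe())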

user-bfecc7 10 July, 2018, 20:22:19

Hello fellow Pupil People. I'm running into an issue with Pupil Capture recognizing my Pupil Mobile device. I recently had to tear down my setup, in which it was working fine before. I have both my computer and Pupil Mobile device connected to the same local wifi network, and neither the computer nor the phone is connected to any other network. When I open Pupil Capture and change the NDSI manager to Pupil Mobile I get "No host found". I'm assuming I can change this with the Pupil Remote plugin, but I had not messed with that much before when it was working. Any advice would be extremely helpful.

user-bfa5df 10 July, 2018, 22:56:58

@papr Thank you so much! I will explore that approach and will let you know how it went.

user-bfa5df 11 July, 2018, 04:15:11

@papr Thanks, it solved part of my problem. Any suggestion for compensating for corneal refraction?

user-90270c 11 July, 2018, 18:02:10

@papr We were testing with macOS, but not on a USB3 port. Can you share any documentation on getting the RealSense going? We want to help our UCSD colleagues, and are considering getting one of these for our own purposes. Thanks!

user-8779ef 11 July, 2018, 18:11:04

Just collected some data only to find that one eye does not have its corresponding time stamp file.

user-8779ef 11 July, 2018, 18:11:08

Anyone see this before?

papr 11 July, 2018, 18:21:33

@user-8779ef on v1.7?

papr 11 July, 2018, 18:23:13

@user-90270c the RealSense camera requires USB 3. Everything should work out of the box with the bundle for macOS

user-8944cb 11 July, 2018, 18:29:24

Hello, I have a question regarding the frame rate which I will be happy to get some help with. We are working on syncing the 2D binocular eye tracker data and motion capture, and want to decide which frame rate to record the motion capture with / interpolate the data to, so that both have similar frame rates. We are recording at 200 Hz with the two eye cameras, and 120 Hz with the world view camera. In the data exported from Pupil Player there are 400 (plus minus 2-3) timestamps with norm gaze positions per second. Would it be correct to interpolate the data based on the timestamps, for example to 350 Hz (so that we don't upsample), and record the motion capture at 350 Hz? Or should we use the individual timestamps for both eye cameras separately based on the timestamps in the "base_data"? If the latter is better, how and at which timestamp do I look, as both eyes have different timestamps in each row of the "base_data"? Thank you for your help!

user-90270c 11 July, 2018, 18:43:26

@papr Very good to know! Is the same true for Linux?

papr 11 July, 2018, 18:54:26

@user-90270c unfortunately not. You will have to install the dependencies and run from source as described in the docs

user-90270c 11 July, 2018, 19:17:58

@papr Good to know. Thanks!

user-68e00c 11 July, 2018, 19:20:53

Thank you @user-90270c and @papr! Trying to run Pupil Capture from source now.

user-c351d6 12 July, 2018, 08:51:44

Hi guys, I just encountered a bug which causes the eye cameras to deliver a really dark picture of the eyes which can't be used for eye tracking. It can only be fixed by restoring the default settings of everything in Pupil Capture. I did not change any parameters; it just happened after starting Pupil Capture. Have you seen this before? It's a pretty nasty one, because it seems to appear "randomly" and you have to reconfigure Pupil Capture from scratch - a huge problem while performing an experiment.

user-723401 12 July, 2018, 08:55:37

Hi! Is there a workaround for when users have transitional eyeglasses? I was told it doesn't work. Unfortunately, I am one of those who wear them constantly. This is for demo situations.

papr 12 July, 2018, 08:59:59

@user-c351d6 Alternatively, you can delete the user settings for the eye processes separately. Close Pupil Capture. Go to the pupil_capture_settings folder, delete user_settings_eye0 and user_settings_eye1, start capture.

user-c351d6 12 July, 2018, 09:19:15

Thanks! And another question: when using Pupil Mobile in combination with Pupil Capture, we are seeing a huge delay between the movement of the head and the actual picture of the world camera. Is this normal? I also just had the problem that the file I recorded caused Pupil Player to crash...

user-239f8a 12 July, 2018, 09:36:40

Morning all. I have posted a question on the research and publications forum if anyone wants to help out with their thoughts. Didn't want to repost same issue here again. Not so much an issue really. More of a sense checking on fixation output EDA... #confuseddotcom

papr 12 July, 2018, 10:31:48

@user-c351d6 Please send the recording to [email removed] with a note that it causes Player to crash, and which OS and version you are using.

Some delay is expected, yes, depending on the throughput of your wifi.

user-c351d6 12 July, 2018, 10:36:51

@papr Does this delay get fixed in the recording due to time sync? Because right now, head movements are first detected as eye movements because of the delay.

user-c351d6 12 July, 2018, 10:37:37

@papr And does this also mean we have to rely on offline calibration?

user-c351d6 12 July, 2018, 10:39:46

(Just the world camera is delayed, the eye cameras are not)

papr 12 July, 2018, 10:40:21

Each video frame has its own timestamp. Detected pupil data inherits the timestamps from the corresponding video frames. Gaze is displayed as soon as it is calculated, i.e. when the eye video frames arrive. This can lead to delayed visualization in Capture. Calibration etc. should work correctly, since data is correlated based on creation time, not arrival time. All data should be correctly visualized if you open the recording in Player.

user-c351d6 12 July, 2018, 10:42:22

@papr Ah thanks, that was my concern. Then we just have to figure out why the recording is corrupt. I'll send you the recording. But back to calibration: is the calibration in Pupil Capture synchronised?

papr 12 July, 2018, 10:43:45

Yes, it is. The calibration procedure accumulates pupil and reference locations. These include the correct timing. The timestamps are used for correlation before the actual mapping function is estimated.

user-459080 12 July, 2018, 11:32:37

hey guys

user-459080 12 July, 2018, 11:33:00

i have a really stupid issue

user-459080 12 July, 2018, 11:34:01

the real issue is, my Capture doesn't pick up any of the cameras, but the intermediate issue is, where do I look for the log file created by Capture

wrp 12 July, 2018, 11:34:41

Hi @user-459080 What OS and OS version are you using?

user-459080 12 July, 2018, 11:35:49

win 10 enterprise

user-459080 12 July, 2018, 11:36:29

Folder with pupil stuff is on D, in C/Users/myname there are some pupil folders with Capture settings

user-459080 12 July, 2018, 11:36:31

but no log file

wrp 12 July, 2018, 11:36:44

@user-459080 if video feeds from cameras are not showing up in Pupil Capture (and if you are using official Pupil Labs hardware), you may want to try the following: (1) Ensure that the USB-C connector is firmly plugged in - sometimes it needs a little bit more effort to connect it. (2) For Windows 10 users, you will need admin privileges to install drivers - please ensure that drivers are installed for Pupil Cam in the device manager.

user-459080 12 July, 2018, 11:36:50

so i am wondering, do i get anything more than the terminal info that gets printed as the thing is running

wrp 12 July, 2018, 11:37:16

You can open Pupil Capture as admin user and click General > Restart with default settings

wrp 12 July, 2018, 11:37:34

Hopefully this will resolve the issues you may be observing

wrp 12 July, 2018, 11:38:49

Regarding the log in C:\Users\yourname\pupil_capture_settings - this contains user settings and a log of Pupil Capture - it is what was shown in the cmd prompt in windows when you ran Pupil Capture

user-c351d6 12 July, 2018, 12:29:02

@papr Is it sufficient to activate the time sync plug-in just in Pupil Capture? No action required somewhere else? I've sent you the recording which crashes.

papr 12 July, 2018, 12:29:34

Yes, Pupil Mobile is running time sync in the background automatically.

user-459080 12 July, 2018, 12:31:33

ok, so the front camera is still having issues

user-459080 12 July, 2018, 12:32:54

oh no, i am wrong, they all still do

user-459080 12 July, 2018, 12:35:10

I hope it is just a cable issue cause everything works on mobile with usb-c. i will figure it out tomorrow

user-459080 12 July, 2018, 12:36:25

yep, cable

user-c351d6 12 July, 2018, 12:41:16

Chat image

user-c351d6 12 July, 2018, 12:41:38

I'm also getting massive spam of these errors

user-459080 12 July, 2018, 12:47:22

ok, mine is running. I still have to figure out how to do a calibration when recording mobile

mpk 12 July, 2018, 13:35:29

@user-c351d6 does this also happen when you use a different USB port? What you are seeing are usb transmission errors.

user-c351d6 12 July, 2018, 14:12:48

@mpk It happens when using a wifi connection to Pupil Mobile. So it's the connection between the eye tracker and the smartphone? It does not happen on the MacBook.

user-c351d6 12 July, 2018, 14:13:57

Can it be due to slow wifi? Just ordered an AC wifi router to have a faaaast connection

user-b571eb 12 July, 2018, 14:15:39

Dear Pupil Labs Team, I would like to ask if I can assume that the diameter_3d data represents pupil dilation? Thank you. ^^

user-464538 12 July, 2018, 15:32:14

Hi all, I am looking at the pupil positions sheet that gets exported from Pupil Player, and one eye's data is flipped both vertically and horizontally. I am wondering if there is a way to fix this so that the exported data is consistent between eyes? Any help would be appreciated.

user-ed6bcd 12 July, 2018, 17:51:36

Hi Team,

I'm having issues with the Pupil Capture app: 1. Drivers updated and installed. 2. Most recent Windows update acquired. 3. Most recent Pupil Labs apps downloaded. 4. Was able to record this morning no problem, then the computer got hot (it has since cooled) but it has not allowed me to record since.

HALP. 😬

MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
world - [INFO] launchables.world: Application Version: 1.7.42
world - [INFO] launchables.world: System Info: User: Ginny, Platform: Windows, Machine: DESKTOP-EP8KVHA, Release: 10, Version: 10.0.17134
Running PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
Running PupilDrvInst.exe --vid 1443 --pid 37425
OPT: VID number 1443
OPT: PID number 37425
Running PupilDrvInst.exe --vid 1443 --pid 37426
OPT: VID number 1443
OPT: PID number 37426
Running PupilDrvInst.exe --vid 1133 --pid 2115
OPT: VID number 1133
OPT: PID number 2115
Running PupilDrvInst.exe --vid 6127 --pid 18447
OPT: VID number 6127
OPT: PID number 18447
Running PupilDrvInst.exe --vid 3141 --pid 25771
OPT: VID number 3141
OPT: PID number 25771
world - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
world - [INFO] camera_models: No user calibration found for camera Ghost capture at resolution [1280, 720]
world - [INFO] camera_models: No pre-recorded calibration available
world - [WARNING] camera_models: Loading dummy calibration
world - [WARNING] launchables.world: Process started.
Running PupilDrvInst.exe --vid 1443 --pid 37424

user-96d850 12 July, 2018, 18:28:35

@papr Getting an assertion error in the fix detector, line 186: "assert dispersion <= max_dispersion, 'Fixation too big: {}'.format(fixation_datum)"

user-96d850 12 July, 2018, 18:28:56

I have the log output, but don't want to spam the room. It's quite long.

user-96d850 12 July, 2018, 18:43:31

....unfortunately, I've had issues replicating the error! So, I wouldn't lose any sleep until we can send a few more logs.

papr 12 July, 2018, 18:51:48

@user-8779ef Yeah, this should not be happening. The assertion was really just to find edge cases, but I was never able to replicate this either. The next release will not include the assertion.

user-2686f2 12 July, 2018, 18:53:26

If I do a calibration, then close the Capture window and unplug the device, then come back later and plug in the device, should I be doing another calibration, or can I expect the calibration to reasonably last across sessions? Similarly, if someone else uses the hardware, do you need to calibrate between different users?

papr 12 July, 2018, 18:53:39

@user-b571eb Yes, diameter_3d is the modelled pupil dilation of the 3d model. The correctness of the value depends on 1) how well the model is fitted, 2) 2d pupil detection and 3) the bias introduced by not modelling refraction.

papr 12 July, 2018, 18:54:46

@user-2686f2 You definitely need to calibrate between different users! 3d calibrations are reasonably stable between sessions for a single user, but it is recommended to recalibrate.

papr 12 July, 2018, 18:55:46

@user-ed6bcd Try restarting with default settings. Go to General Settings > Restart with default settings

user-8779ef 12 July, 2018, 18:55:51

@papr Great. I'll just comment it out and make sure to inspect the results for oddities.

user-8779ef 12 July, 2018, 18:56:05

Thanks!

user-2686f2 12 July, 2018, 18:57:25

@papr Thanks! On that note, I've scrolled through some previous chats and looked at the GitHub issues, and I saw someone say that different calibration methods are more useful for different scenarios, but there wasn't an elaboration. If this is the case, could you explain when you'd want to use one calibration method over another?

papr 12 July, 2018, 18:59:28

@user-464538 The right eye camera (id0) is physically flipped. Therefore the pupil data is "flipped" as well -- not really, since the pupil data still corresponds to locations within the original video. Pupil data itself is only relative to the corresponding eye camera's coordinate system. There should not be a case where you have to flip this data manually. Gaze mapping (pupil data mapped to the scene camera coordinate system) handles the flipped camera automatically.
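
[Editor's note] If you nevertheless want the raw eye0 data in the same orientation as eye1 purely for plotting or side-by-side comparison, a 180-degree image rotation maps normalized coordinates (x, y) to (1 - x, 1 - y). A sketch, assuming a pupil_positions.csv export with 'id', 'norm_pos_x' and 'norm_pos_y' columns:

import pandas as pd

df = pd.read_csv("pupil_positions.csv")
eye0 = df["id"] == 0  # the physically flipped camera
# rotate eye0's normalized coordinates by 180 degrees
df.loc[eye0, "norm_pos_x"] = 1.0 - df.loc[eye0, "norm_pos_x"]
df.loc[eye0, "norm_pos_y"] = 1.0 - df.loc[eye0, "norm_pos_y"]
df.to_csv("pupil_positions_aligned.csv", index=False)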

user-58d5ae 13 July, 2018, 09:45:13

Did anyone successfully control a UI with eye gaze in Unity? If so, how?

user-2ff80a 13 July, 2018, 10:04:34

Hi all, have a few questions about setting up mobile eye tracking for experiments, so I'm hoping this is the place to ask?

papr 13 July, 2018, 10:07:22

@user-2ff80a yes it is

user-2ff80a 13 July, 2018, 10:10:43

great, thanks. So... our lab is preparing an experimental setup with numerous sensors in a motion capture space. Initially, we were using Windows 10 (with the latest Pupil Capture release), the Moto Play Z2 (as suggested) and lab streaming layer. We then tried a Pixel tablet with a similar setup, with fewer issues.

user-2ff80a 13 July, 2018, 10:11:08

But so far, our test recordings have indicated that the Pupil Mobile setup we are using will not be stable enough for continued use throughout the official study, so I am hoping to get some advice as to whether there are ways to improve our setup, or whether we should try an alternative setup (abandon Pupil Mobile for now)?

user-2ff80a 13 July, 2018, 10:13:08

The biggest issues seem to be overheating (especially the Moto Play) and pixelation of the world view, that we think may result

user-2ff80a 13 July, 2018, 10:16:07

we also wonder whether anyone has had any issues with IR interference in an optical motion capture space? I'm planning to try to add a bit of shielding around the eye camera LEDs, since we have seen them in our mocap recordings.

user-2ff80a 13 July, 2018, 10:30:42

We are also wondering if there are minimum hardware requirements for the system running Win10, because we are considering a mobile mini-PC (like a Dynaedge, but hopefully cheaper) running Pupil Capture directly.

papr 13 July, 2018, 10:31:39

Could you post an example screenshot of what you mean by "pixelation of the world view"? To what extent is the heating of the phone problematic? Does it overheat so much that it turns off?

user-2ff80a 13 July, 2018, 10:32:03

sure thing. I'll search for a good example

user-2ff80a 13 July, 2018, 10:34:38

Yes, during one recording it got so hot that it kept stopping after 3-5 minutes, and the study participant had to take it out of the carry pouch to fan it so that it would operate again. We added a pocket to hold a cool pack for hotter recording days.

wrp 13 July, 2018, 10:35:27

@user-2ff80a based on your description, I'm assuming that you are streaming data from Pupil Mobile to Pupil Capture. Do I understand correctly?

user-2ff80a 13 July, 2018, 10:35:52

yes

wrp 13 July, 2018, 10:37:19

What kind of network do you have set up for wifi?

user-2ff80a 13 July, 2018, 10:46:02

The Win10 machine and the WiFi router are both connected to a Netgear switch. We're currently using a TP-Link 450Mbps wireless N router dedicated to the mocap space

user-2ff80a 13 July, 2018, 10:49:04

And here's one pixelated image

Chat image

user-2ff80a 13 July, 2018, 10:50:05

and another from shortly after

Chat image

user-3f0708 13 July, 2018, 11:57:49

Has anyone used the coordinate data from eye tracking to make precision comparisons using graphs?

user-c351d6 13 July, 2018, 13:25:56

@user-2ff80a We are also using a 450Mbps wireless N router but we don't have the same problems with pixelated images. However, we have problems with delays and possibly lost frames, which cause corrupt video files. Did you encounter problems like this? Btw, we are using a OnePlus 5T which does not overheat - probably because of the more powerful processor.

mpk 13 July, 2018, 13:26:49

@user-c351d6 @user-2ff80a if possible try recording on device with later transfer. Wifi can result in delays and dropped frames...

mpk 13 July, 2018, 13:27:05

time syncing is also possible when recording on device.

user-c351d6 13 July, 2018, 13:33:47

@mpk It is, but with two major problems. (1) Our recordings are 20 minutes, but uncompressed files reach the file size limit at around 10 minutes. Compressed files, however, make the surface tracking worse. I haven't played much with the parameters so far because of the second major problem. (2) Our results should be used in a debriefing session around 30 minutes after the experiment. Post-processing, however, takes around 40 minutes (including copying of files, pupil tracking, mapping, surface tracking, export). That's on a MacBook Pro from 2017. We have tried a faster Windows gaming laptop with much worse(!) results. Any idea how to get this experiment to run?

user-2ff80a 13 July, 2018, 13:36:23

@user-c351d6 we also get dropped frames and inconsistent timestamps. this is one other reason we are considering switching to a mobile pc.

user-2ff80a 13 July, 2018, 13:37:50

@mpk we also considered recording to the device, but from reading others' posts I got the impression that it is not so reliable either.

user-c351d6 13 July, 2018, 13:37:51

@user-2ff80a We were also talking about a laptop in a backpack...

user-c351d6 13 July, 2018, 13:39:18

Which clearly would be the worst case.

user-2ff80a 13 July, 2018, 13:41:12

@user-c351d6 true, but it seems like the safest approach. we will try to improve the setup with the pixel tablet for now, but may have to look into the mobile pc option

mpk 13 July, 2018, 14:10:14

@user-2ff80a please give recording on device a shot. We have made this more stable in recent releases!

mpk 13 July, 2018, 14:11:12

@user-c351d6 sorry, I remember, you had other constraints! In this case I would recommend hardening the wifi setup. Use a LAN cable to connect the host computer, use fast 5 GHz wifi, etc...

user-2ff80a 13 July, 2018, 14:20:10

@mpk Ok, cool... I'll test that out next. Thanks!

user-8779ef 13 July, 2018, 14:21:53

Guys, no love in HMD-eyes land?

user-8779ef 13 July, 2018, 14:22:37

Both the 2D and 3D demos are throwing errors upon calibration (after successfully connecting to the tracker). Error posted in the hmd-eyes discussion, where it belongs.

user-8779ef 13 July, 2018, 14:24:31

...I'll create the github issue. I was hoping for 3rd party confirmation that someone also shares the error.

papr 13 July, 2018, 14:38:08

We have a dedicated developer who started working on the hmd-eyes project. Improvements should come soon. Please bear with us. 😃

user-8779ef 13 July, 2018, 14:38:21

@papr That is GREAT news. Who is this person?

papr 13 July, 2018, 14:46:35

I think he did not join this channel yet.

user-8779ef 13 July, 2018, 14:46:46

He's hiding from me?

user-8779ef 13 July, 2018, 14:46:54

Smart.

user-8779ef 13 July, 2018, 14:47:11

THis guys shows promise!

user-7d3aea 13 July, 2018, 15:54:43

Hey! I was wondering if you could help with an issue our lab is currently facing with Pupil Player. In particular, after calibration etc., we are trying to export our videos. However, in 90% of cases the export freezes at some point (but not Pupil Player as a whole) and the video doesn't fully export. On repeat, the same video always makes it to the same point, but each different video makes it to a different point before freezing. We are using the latest version of Pupil Player, so we are not sure why this error has started to affect us.

papr 13 July, 2018, 15:57:02

@user-7d3aea Could you send an example recording to [email removed] that we can use to reproduce this issue?

user-8779ef 13 July, 2018, 21:13:05

@papr Getting good reports from my folks after commenting out that assertion in the fixation detector.

user-8779ef 13 July, 2018, 21:13:28

No crashes yet, and I believe they processed an hour or so of data today.

papr 13 July, 2018, 21:16:21

I still would like to know how the assertion was triggered. The algorithm should not produce fixations longer than the maximum but the assertion triggers if it does. Please check the fixation lengths vs your maximum and let me know if you find violations of the maximum and by how much the maximum duration was violated.

user-8779ef 13 July, 2018, 21:16:53

Thanks - I'll do that when it comes time to analyze.

user-8779ef 13 July, 2018, 21:18:02

...and I'll let you know if I see anything.

papr 13 July, 2018, 21:18:13

Great, thank you!

user-2ff80a 16 July, 2018, 11:42:18

Hi all, have some hardware/materials questions... Would it be possible to get more information about the material the frame parts are printed in (is it Nylon 12, or...)? And is it safe to assume the small covers situated over the eye cameras are the same material? Is there any other function to these covers beyond protecting the cameras?

user-7d3aea 16 July, 2018, 11:59:54

@papr working on creating a demo video as we have confidential data. The error that appears when the export stops is:

Starting video export with pid: 11144
Application provided invalid, non monotonically increasing dts to muxer in stream 0: 9141814 >= 9141814
Process Export (pid: 11144) crashed with trace:
Traceback (most recent call last):
  File "shared_modules\exporter.py", line 183, in export
  File "shared_modules\av_writer.py", line 177, in write_video_frame
  File "av\container\output.pyx", line 201, in av.container.output.OutputContainer.mux (src\av\container\output.c:3545)
  File "av\container\core.pyx", line 228, in av.container.core.ContainerProxy.err_check (src\av\container\core.c:4020)
  File "av\utils.pyx", line 76, in av.utils.err_check (src\av\utils.c:1666)
av.AVError: [Errno 22] Invalid argument: 'XXXXXXXXXXXXXXXXXXXXXX.mp4'

papr 16 July, 2018, 14:47:44

@user-7d3aea Ok, was this a Pupil Mobile recording?

user-7d3aea 16 July, 2018, 15:34:53

@papr yes, it was!

papr 16 July, 2018, 15:38:34

Did you see this issue? If not, could you try following these instructions? https://github.com/pupil-labs/pupil/issues/1203#issuecomment-396884543

It fixes the timestamps and hopefully lets you export the video

user-cd9cff 16 July, 2018, 18:04:24

@papr Hello, I am querying pupil data for 10 seconds and collecting the eye gaze positions in real time. I then collect all of these values into a Python list. However, the size of the list is around 4000 entries. This doesn't make sense, because the camera fps is 200, and so 200*10 should equal 2000 entries.

Does this issue have anything to do with the buffer containing data from previous samples? Should I clear the buffer at the start of every trial?

papr 16 July, 2018, 18:08:20

@user-cd9cff this is expected if you have a binocular headset.

papr 16 July, 2018, 18:08:47

Then it is 10 seconds * 2 cameras * 200fps

user-cd9cff 16 July, 2018, 18:12:00

@papr This is how I am querying the data:
rx = msg['gaze_normals_3d'][0][0]
ry = msg['gaze_normals_3d'][0][1]
lx = msg['gaze_normals_3d'][1][0]
ly = msg['gaze_normals_3d'][1][1]

Each line of data that I unpack from the msgpack has the four values rx, ry, lx, ly. There are essentially 4000 data points of rx, ry, lx, ly, not 2000 of rx, ry and 2000 of lx, ly.

Shouldn't that mean that each data point takes both cameras into consideration, and therefore should not produce one data point for each eye separately?

user-cd9cff 16 July, 2018, 18:12:27

Rx,Ry: Coordinates of the right eye

user-cd9cff 16 July, 2018, 18:12:38

Lx,Ly: Coordinates of the left eye

user-abc667 16 July, 2018, 18:26:04

Question about calibration -- We want to track gaze while people are looking down at a paper form on the desk. Currently we're calibrating using screen markers, but wondered whether the same method could be used with a tablet computer lying on the desk. This would calibrate against eye positions that more closely matched what people would be doing when looking down at the form. Has this been done? Any cautions? Any obvious problems with it? Many thanks for any advice.

user-cd9cff 16 July, 2018, 18:26:41

@papr In addition, when I have the trial run for only 3 seconds, I get almost 3000 data points, which doesn't make sense according to the two-camera calculation; I should be getting 3*2*200 = 1200 data points

papr 16 July, 2018, 20:30:09

@user-cd9cff Actually, I will have to investigate this. My colleague mentioned something similar. Give me a few days.

papr 16 July, 2018, 20:31:08

@user-cd9cff in the meantime, could you check if you encounter duplicated timestamps?
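
[Editor's note] A minimal sketch for counting duplicated gaze timestamps over the network API, assuming Pupil Capture runs locally with Pupil Remote on its default port 50020 and that pyzmq and msgpack are installed:

import zmq
import msgpack

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote default port
req.send_string("SUB_PORT")           # ask Pupil Remote for the SUB port
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:%s" % sub_port)
sub.subscribe("gaze")                 # receive only gaze topics

timestamps = []
for _ in range(2000):                 # collect a fixed number of gaze datums
    topic, payload = sub.recv_multipart()
    timestamps.append(msgpack.unpackb(payload, raw=False)["timestamp"])

duplicates = len(timestamps) - len(set(timestamps))
print("%d duplicated timestamps out of %d" % (duplicates, len(timestamps)))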

papr 16 July, 2018, 20:34:32

@user-abc667 one major issue with that is that your calibration area would be very small. Gaze outside the calibration area can be very inaccurate. I would suggest manual marker calibration and moving the marker around on the desk while the subject faces the desk

user-abc667 16 July, 2018, 21:32:54

@papr "one major issue with that is that your calibration area would be very small. Gaze outside the calibration area can be very inaccurate. I would suggest manual marker calibration and moving the marker around on the desk while the subject faces the desk" Happy to try this. Is there any accuracy difference in using manual markers vs screen markers? We want the best tracking accuracy we can get and in principle don't really care where the subject is looking if they are not looking at the paper form. [It's part of a psych test and we're concerned only with where they're looking on the page.] Thanks!

papr 16 July, 2018, 21:36:38

@user-abc667 there is no difference in the procedures from an algorithmic point of view. Be aware that the calibration area is relative to the field of view of the subject, not relative to the desk!

user-cd9cff 16 July, 2018, 23:38:12

@papr Yes, I did encounter duplicated timestamps; there would be groups of five to six duplicated timestamps

user-8779ef 17 July, 2018, 14:14:58

Hey guys, what's the peak wavelength of your LEDs?

mpk 17 July, 2018, 14:17:26

850nm

user-8779ef 17 July, 2018, 14:29:38

Thanks @mpk

user-8779ef 17 July, 2018, 14:34:29

We're having trouble sourcing a poly hot mirror large enough to span the Vive eye cup at a 45˚ angle, but we're looking!

user-8779ef 17 July, 2018, 14:35:24

Our need isn't great enough for a custom run, so we're begging for old stock 😛

user-78dc8f 17 July, 2018, 14:46:34

Greetings. We are trying to extract a synchronized set of images from our Pupil Labs data from two world cameras (one on a child, one on a parent). I've played around with ffmpeg for doing this, but there is some variation in the sampling frequency, so when I specify the same number of frames, I'm sampling slightly different time segments.

mpk 17 July, 2018, 14:58:48

@user-78dc8f you will need to sync by timestamps not frames. This can be done with a bit of python. We have a plan to integrate this into Pupil Player. let me check with @papr about this.

user-c351d6 17 July, 2018, 19:07:34

@mpk Today, I tried to run the experiment with fast 5 GHz wifi and a cable-connected PC. Two out of three times, one of the two eye cameras lost the connection. One was an error like last time, the USB error, and the second one was the following.

user-c351d6 17 July, 2018, 19:07:40

Chat image

mpk 17 July, 2018, 19:08:32

@user-c351d6 can you share the full log for this? It should be in the Pupil Capture dir. Once this error happens and before you restart the app, just copy the logfile and send it to us.

mpk 17 July, 2018, 19:09:08

We will try to fix this ASAP once we know what happened.

user-c351d6 17 July, 2018, 19:09:31

It's around 500 MB

mpk 17 July, 2018, 19:09:40

The logfile?

user-c351d6 17 July, 2018, 19:09:43

Yes

mpk 17 July, 2018, 19:10:36

can you see if you find the traceback of the exception you pasted in there?

mpk 17 July, 2018, 19:10:49

Or even share the file with data[at]pupil-labs.com

user-c351d6 17 July, 2018, 19:11:59

Yes I can; it's probably also interesting for you to know when it happened first.

mpk 17 July, 2018, 19:12:09

Agreed!

mpk 17 July, 2018, 19:12:21

also we need to make sure logfiles don't become 500 MB 😃

user-c351d6 17 July, 2018, 19:13:10

I also agree 😉

user-c351d6 17 July, 2018, 19:52:20

https://github.com/pupil-labs/pupil/issues/1229

user-c351d6 17 July, 2018, 20:14:32

@mpk Just a brief question about Pupil Player. Dragging a long recording into Pupil Player takes a long time. There is unfortunately no progress bar, but that's not a big deal as long as the user knows that it is not crashing but calculating something. The wait seems to be due to calculations whose results are apparently not saved right after opening the file. That's unfortunate, because Pupil Player sometimes crashes; in case you need to reopen the file, you don't want to wait all this time again. Why are you not saving this data after finishing the calculations? It seems like you only save it when the application is closed properly. The same goes for when you change settings and the application crashes.

user-c351d6 17 July, 2018, 20:16:18

Hm, not sure if you even save this data in Pupil Player sometimes. I just noticed it always takes ages to open long recordings.

user-abc667 17 July, 2018, 20:20:33

@papr Thanks for your earlier reply, and please bear with me. We're at the beginning of a major research project involving eye tracking and want to be sure we're using your technology properly from the outset, and in a way that gets the maximum gaze position accuracy.

Our subjects will be working on a pen and paper task on an 8.5 x 11 form while sitting at a table. We want to know with the best accuracy possible where on the form they are looking at each moment. (If they look elsewhere than on the paper, all we need to know is that they are not looking at the form.)

As I mentioned, we're currently using screen marker calibration from a laptop screen. I wondered whether gaze accuracy would be better if the calibration was done in an orientation that more closely matched their head and eye position when they are working on the task. That's what motivated the question about doing a calibration while they were looking down at the table, e.g., a screen marker calibration using a tablet lying on the table about where the paper form would be.

You mentioned that the tablet approach would produce a very small calibration area. Note that I'm thinking about a large tablet, something on the order of a Surface Book 2 with a 15" screen, i.e. a display size of 12.5" x 8.3". The idea would be to have something roughly the same size as the test form itself, sitting on the desk in front of the subject.

Two questions:
+ Would using a large tablet address the issue you mentioned?
+ Even more fundamentally, is there a chance that gaze accuracy would be improved by doing calibration in this fashion, i.e. while looking down at the desk (& tablet), as that mimics the subject's orientation and posture when doing the task? (And if the answer is that we just have to try this out to find out, that's ok.)

Thanks!

user-abc667 17 July, 2018, 20:24:10

@papr About surfaces -- we understand and have been using the fiducial markers on our form. In looking over one of the videos

https://www.youtube.com/watch?v=bmqDGE6a9kc

the narrator mentions that the markers have an orientation. What does orientation mean in this case, and why does it matter? If I copy the markers from your image and paste them onto a test form with the same orientation they have in your image, will the orientations be correct (for whatever "correct" means)?

Similarly, what is meant by the orientation of a surface?

Many thanks for your help.

user-8779ef 17 July, 2018, 20:36:59

@user-abc667 Some quick thoughts. 1) Yes, calibrate at the distance / in the plane in which you want to maximize data quality! Build a custom calibration grid that lies on the table. Use natural feature / offline calibration mode.

user-8779ef 17 July, 2018, 20:38:00

I would calibrate a larger area than the book itself. Remember, you are calibrating within screen space, not in world space (a head movement may cause the book to occupy a different part of the screen).

user-8779ef 17 July, 2018, 20:38:35

You may also want to consider whether it is kosher (given your design / hypothesis / philosophy) to restrict movement at all

user-8779ef 17 July, 2018, 20:39:21

SOMETIMES, it's OK to use chin rests. Personally I try and avoid restricting movement in any way, physically OR through verbal instruction

user-8779ef 17 July, 2018, 20:39:34

Some additional tips...

user-8779ef 17 July, 2018, 20:40:04
  • Control the room lighting! You want a very bright room, with very well distributed light.
user-8779ef 17 July, 2018, 20:40:40
  • If possible, turn off auto white-balance on the scene camera, and manually set the shutter speed as fast as it will go. This will lower motion blur in the scene camera during head movements.
user-8779ef 17 July, 2018, 20:41:27

Best of luck.

user-abc667 17 July, 2018, 20:51:35

@user-8779ef Many thanks.

Making sure I have it -- create a sheet of paper, say 12.75 x 16.5 (i.e. 50% larger than an 8.5 x 11), and put on it at least 9 icons of some sort so we can tell the subject to look at each one in turn, using the natural features approach, yes?

As you suggest, we do not want to restrict movement. That's the whole reason for wanting your glasses, as they are far more natural.

Thanks for the additional tips on lighting, etc.

user-8779ef 17 July, 2018, 20:54:29

@user-abc667 Be sure to leave a visual indication of when the person is looking at each calibration point. We use a system involving a button box with an LED on it. So, for each calibration point...

user-8779ef 17 July, 2018, 20:55:37

1) The experimenter indicates the target by touching it, and then removes her/his finger from the region. 2) The observer looks at the point, and briefly turns on the LED while looking (the LED is visible in the scene camera).

user-8779ef 17 July, 2018, 20:56:25

Be sure to tell them to point their nose at the center target during calibration, and then keep their head still!

user-8779ef 17 July, 2018, 20:57:21

The LED trick helps quite a bit during post-hoc processing, when you need to scrub through and define the frames on which the observer is looking at the natural feature/target.

user-8779ef 17 July, 2018, 20:57:48

...and, keep in mind that you will have to pilot test and maybe adjust the calibration sequence depending on your particular design.

user-8779ef 17 July, 2018, 20:57:58

Test run a few times and see how the track turns out!

user-8779ef 17 July, 2018, 20:58:56

the quality will be heavily dependent upon the algorithm settings during pupil detection. Be sure to use algorithm view. There is an art to all of this, and it will take time to learn.

user-8779ef 17 July, 2018, 20:59:27

(Sorry, use ROI and algorithm view of the pupil during 3D pupil detection). If that doesn't make sense now, it will eventually.

user-8779ef 17 July, 2018, 20:59:37

This system is not plug n'play, but it's very good.

user-8779ef 17 July, 2018, 21:00:59

You may have to sacrifice a few chickens and/or do a little dance.

user-8779ef 17 July, 2018, 21:01:10

(for good data)

user-8779ef 17 July, 2018, 21:01:47

...but, this is the nature of eye tracking 😃

mpk 18 July, 2018, 06:27:04

@user-c351d6 this is known and the load speed is improved with the next release. We also improved memory management by a lot so that very long recordings can be opened on smaller machines as well.

mpk 18 July, 2018, 06:27:53

@user-abc667 @user-8779ef you can also use the calibration marker for that. It will be detected in Pupil Player automatically.

user-c351d6 18 July, 2018, 06:33:56

@mpk Thanks for that. Is there a date for the next release?

mpk 18 July, 2018, 06:34:06

later this week is planned.

user-7f5ed2 18 July, 2018, 06:36:39

hey everyone, can anyone tell me any applications based on eye tracking?

wrp 18 July, 2018, 08:08:26

@user-7f5ed2 could you be more specific with respect to "application"?

wrp 18 July, 2018, 08:09:06

@user-7f5ed2 you can see papers/projects that use/cite Pupil here: https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?usp=sharing

wrp 18 July, 2018, 08:09:30

You can also see a number of featured projects via the Pupil Labs blog - https://pupil-labs.com/blog

wrp 18 July, 2018, 08:10:22

The pupil-community repo also contains links to projects, forks, plugins, and scripts that are using Pupil

wrp 18 July, 2018, 08:10:24

https://github.com/pupil-labs/pupil-community

user-78dc8f 18 July, 2018, 10:13:45

@mpk @papr : I sent a query yesterday about extracting sync'd video images from our child and parent world cameras. Any thoughts on this? Or should I start writing code to do this using the timestamps? Would obviously prefer not to if you guys have something developed 😉

papr 18 July, 2018, 10:16:26

@user-78dc8f Unfortunately we do not have such an implementation. Edit: I would agree with mpk to sync video frames based on the timestamps in the world_timestamps.npy file

user-78dc8f 18 July, 2018, 10:19:01

@papr Thanks for the quick reply. I'll get busy. Just to get your input...I was planning to use the video with fewer frames as the master and then extract the associated frames from the other one. Sound right?

papr 18 July, 2018, 10:22:00

What is your expected result? A picture-in-picture video? Or two separate video files that are synced if started at the same time?

user-78dc8f 18 July, 2018, 10:24:23

@papr I am extracting the video data image by image so I can process the images using a deep learning network (to do object recognition). I need to extract synchronized images from each video so that I know if mom is looking at object 1 at time X and baby is looking at object 1 at time X, that the two images are viewing the world at the same moment in time. Does that make sense?

user-78dc8f 18 July, 2018, 10:26:38

@papr So I'm thinking of using the timestamps for the slower video stream as the master and then finding the closest timestamps from the other video that match each time stamp from the master and selecting those paired images for extraction.

papr 18 July, 2018, 10:27:37

Yes, that makes sense

papr 18 July, 2018, 10:27:51

We do something similar for the eye video overlays
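
A minimal sketch of that nearest-timestamp matching in Python (the folder names here are hypothetical; world_timestamps.npy is the standard recording file papr mentions above):

import numpy as np

# Load the per-frame timestamps of both world videos.
master = np.load('child/world_timestamps.npy')   # the video with fewer frames
other = np.load('parent/world_timestamps.npy')

# For each master timestamp, find the closest timestamp in the other video.
# searchsorted gives the insertion point; step back by one wherever the
# left neighbor is the closer match.
idx = np.searchsorted(other, master)
idx = np.clip(idx, 1, len(other) - 1)
closer_left = (master - other[idx - 1]) < (other[idx] - master)
idx[closer_left] -= 1

# idx[i] is now the frame index in the other video that matches master frame i.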

user-78dc8f 18 July, 2018, 10:28:35

@papr sounds good. I'll get busy.

user-78dc8f 18 July, 2018, 10:28:44

@papr thanks for the quick input

user-7f5ed2 18 July, 2018, 10:38:14

@wrp By application I mean, for example, 'controlling a wheelchair using pupil motion' as a further application of eye detection, or driver drowsiness detection as another example. So I wanna know some more applications based on real-time face or eye detection.

wrp 18 July, 2018, 10:45:32

@user-7f5ed2 please check out the links posted above, there are examples of applications according to your definition

user-239f8a 18 July, 2018, 13:15:03

"Hello world of Pupil", I have lost one eye camera in the middle of experiments. That pesky bastard called time isn't on side, so what adjustments should I make to ensure useful data. I am contemplating reverting to monocular thinking.... Use case involves the visual management of instrument readout on flight simulator screen. Simulator provided score based on level of recognition, since I am looking at fixations primarily at this stage, how does losing one eye cam (still dont know why by the way) affect how I look at fixation outputs. Any pitfalls to look out for?. Thanks in advance.

papr 18 July, 2018, 13:29:21

Pupil Capture supports monocular mapping. But be aware that a single eye camera cannot map the subject's complete field of view. E.g. if you have a camera on the right eye, and the subject looks to the left, then the mapping will most likely be inaccurate. The reason for that is that the pupil is barely visible to the camera in these cases.

Btw, you can also buy a replacement camera on our store.

user-239f8a 18 July, 2018, 13:44:08

@papr, thanks for the quick response. You raise an interesting point about the field of view. I might be wrong but perhaps the movement of the head toward the "left" as per your description might compensate for any losses?... I might be clutching at straws here but given the association between the eyes (distance between pupils if you like), surely there must be some mathematical correction that can be done. I say this because the on screen calibration still works. Setup is 3x 27" screens with 100% of AOI on the middle screen. Calibration is done with the 5 point on screen calibration. (This seems to be sampling well - of course the head is held stationary and central to FOV) and video playback does actually show fixations on the item (instrument) of interest. However, I am hoping I could meet my deadlines by accounting for any losses mathematically until I can run the tests again in proper binocular mode. Again I might be clutching at straws. Thanks again.

user-abc667 18 July, 2018, 14:02:58

@papr Is there a chance you can check out my questions from 4:20PM yesterday? There are a few seemingly useful replies, but I'd appreciate your perspective. Thanks! Also, is there a chance for a phone call? I find these typed exchanges painfully slow at times. Happy to call you at your convenience if that's possible. I suspect 15 min would be plenty. (I'm at MIT, so EDT time zone.)

papr 18 July, 2018, 14:03:16

@user-239f8a Consider this quick sketch. This is a top-down view. As you can see the pupil will be barely visible to the camera. Without a visible pupil we cannot do any gaze mapping. There is no mathematical correction for that.

Looking forwards will work as long as the pupil is visible to the eye camera

Chat image

papr 18 July, 2018, 14:11:02

@user-abc667 I can second everything @user-8779ef said. I strongly advise following his ideas.

papr 18 July, 2018, 14:11:32

Unfortunately, I am not available for a call.

user-239f8a 18 July, 2018, 14:30:40

@papr. Thanks for correcting my simplistic thought process. It would seem binocular is the way, is there a repair service one could take advantage of? New kit out of budget at the moment unfortunately.

papr 18 July, 2018, 14:32:06

@user-239f8a Please write an email to [email removed] concerning that matter.

user-239f8a 18 July, 2018, 14:33:43

Sure thing. Your time and thoughts are very much appreciated.

user-96d0dc 18 July, 2018, 15:39:26

Hey Guys! I am sorry for broadcasting; we just got shiny Pupil Glasses and are trying to send annotations with the https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py script. But we got nothing in annotations.csv after the Pupil Player export, even with the annotation plugins on everywhere. Why could that be? P.S. There are no errors in code execution; the code starts and stops the recording, so there is a ZMQ connection. Thanks!!!

papr 18 July, 2018, 15:40:52

@user-96d0dc Could you send the recording to [email removed]? I will have a look at this in the coming days.

user-96d0dc 18 July, 2018, 15:45:20

@papr Thank you! Sent an email from [email removed] Waiting for your reply!

papr 18 July, 2018, 15:50:31

The recording does not include a pupil_data file and the info.csv file is incomplete as well. This indicates that the recording was aborted instead of being stopped gracefully.

user-96d0dc 18 July, 2018, 15:55:36

@papr I am sorry! Will record you a new one!

papr 18 July, 2018, 15:56:19

I am just saying that this is the reason why you might not see any exported annotations. 😉

user-96d0dc 18 July, 2018, 16:10:05

@papr no-no, i got it from all the records 😦

papr 18 July, 2018, 16:32:28

@user-96d0dc I will have a look at the second recording tomorrow

user-cd9cff 18 July, 2018, 16:38:25

@papr Hello, I have a pupil labs camera connected to my laptop and I want to sync the clocks on both systems so that the stimulus that I am trying to run from my laptop will run on the same clock as the pupil cameras. Can you please point me towards the appropriate documentation and github repositories in order to accomplish this?
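
The pupil-helpers repository covers this; the core of it is Pupil Remote's clock commands - sending 't' returns the current Pupil time, and 'T <timestamp>' sets it. A minimal sketch, assuming Pupil Remote is enabled on its default port 50020:

import time
import zmq

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect('tcp://127.0.0.1:50020')  # Pupil Remote's default address/port

# Option 1: set Pupil's clock to this machine's clock (one shared time base).
req.send_string('T {}'.format(time.monotonic()))
print(req.recv_string())  # Capture acknowledges the new time base

# Option 2: leave Pupil's clock alone and store the offset instead.
req.send_string('t')
offset = float(req.recv_string()) - time.monotonic()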

user-38ff51 18 July, 2018, 20:20:00

Hey, I'm trying to use pupil labs to track where a user is looking on a computer screen. Currently, the gaze data's normals give me where the user is looking with respect to the world camera, which isn't what I want. Is there an easy solution to my problem?

papr 18 July, 2018, 21:07:58

@user-38ff51 Look up surface tracking in the docs

user-d79ff5 18 July, 2018, 22:28:24

@papr The problem for the Remote Annotations was with packet encoding. OpenSesame uses Python 2, but Pupil uses Python 3. Removing use_bin_type=True from serializer.dumps helped. Thanks!

user-24fdfb 19 July, 2018, 02:34:13

How does the pupil tracker hardware communicate with a PC?

user-24fdfb 19 July, 2018, 02:37:03

What kind of communication protocols are used?

wrp 19 July, 2018, 02:41:33

HI @user-24fdfb Pupil headset connects to a laptop/desktop via USBC-USBA cable. You can also connect to a select number of android devices running Pupil Mobile via USBC-USBC.

user-24fdfb 19 July, 2018, 02:42:13

thanks @wrp

user-24fdfb 19 July, 2018, 02:42:54

I guess your Python/c++ code on github has everything necessary to make the connection useful on the PC

user-24fdfb 19 July, 2018, 02:43:46

If I wanted to relay information about pupil movement rates to a web service, that should be easy enough to tweak into the Python code, right?

wrp 19 July, 2018, 02:45:02

@user-24fdfb You can just download Pupil apps from https://github.com/pupil-labs/pupil/releases/latest - plug in the headset, and start Pupil Capture.

wrp 19 July, 2018, 02:46:55

If you want to relay information to a web service you don't even need to touch the source code. You can subscribe/publish to the network based API. Please check out docs here: https://docs.pupil-labs.com/#interprocess-and-network-communication

user-24fdfb 19 July, 2018, 02:53:35

I haven't used ZeroMQ before but at a glance, it looks like it uses a TCP/IP protocol that isn't over HTTP so the only web service I can directly link that to would be one specifically designed for ZeroMQ. Is that right?

user-24fdfb 19 July, 2018, 02:56:56

That looks pretty helpful anyway, though. Writing a separate ZeroMQ to HTTP API relaying server sounds better than adjusting your project's code.

wrp 19 July, 2018, 03:53:31

@user-24fdfb what explicitly are you trying to accomplish - can you provide a concrete example. Are you trying to subscribe client side only?

wrp 19 July, 2018, 03:55:04

@user-24fdfb you might want to take a look at this repo for reference - https://github.com/hookdump/asistiva - the project is unfinished AFAIK, but has the skeleton/foundation there for a web app that communicates with Pupil Capture

user-3f0708 19 July, 2018, 13:39:04

Besides the mouse_control.py script, is there any other code that Pupil Labs suggests using with the Pupil tracking device?

user-78dc8f 19 July, 2018, 13:39:06

@papr Just a quick question about the stuff I'm coding up (from yesterday). I found a way to use ffmpeg to extract all the images between two time points in a video. I tried this out using 1 min of data. Then I went into matlab (since I suck at python) and read in the timestamp data within the same range. Happily, I got the same number of timestamps as exported frames. Then I repeated this process with 10 min 26 sec of data (the window of data we want to process). That yielded 18676 exported images from the child using ffmpeg and I count 18677 timestamps in matlab (not bad); but for my parent video, I get 16224 exported images from ffmpeg and 13527 timestamps in matlab. That's a big difference. Any thoughts?

user-78dc8f 19 July, 2018, 13:39:33

@papr Here's my ffmpeg command: ffmpeg -ss 00:00:55.379 -i 06NIHVWM131B_child_worldviz.mp4 -ss 00:00:30.000 -t 00:10:26.000 06NIHVWM131B_ChildFrames/06NIHVWM131B_Child%d.jpg

user-78dc8f 19 July, 2018, 13:40:28

@papr Note that there are some bells and whistles in that command to speed extraction, but basic idea is to specify the start time and duration...

papr 19 July, 2018, 13:42:44

So each frame in the original video has a corresponding timestamp. You need to know at which frame indices ffmpeg starts and stops extracting images. These indices can be used to extract the corresponding timestamps.

papr 19 July, 2018, 13:43:46

The videos are saved with a fixed time between frames. This means that the duration given to ffmpeg is different from the duration calculated for the timestamps in matlab.

user-78dc8f 19 July, 2018, 13:44:59

@papr Ok. I'll dig into whether I can get ffmpeg to return a vector of frame numbers it is extracting... unless you happen to know how to do that magic? I think there's a way to dump a log file or something...

papr 19 July, 2018, 13:46:58

You only need the start and stop indices. But I don't know if ffmpeg gives that type of feedback
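
ffprobe, which ships alongside ffmpeg, can provide that feedback by dumping one presentation timestamp per frame. A sketch (the file name is taken from the command above; note these are the video's own timestamps, not Pupil timestamps):

import subprocess

# Dump one presentation timestamp per video frame; counting entries up to
# the seek point gives the start index, counting extracted images the stop.
cmd = ['ffprobe', '-v', 'error', '-select_streams', 'v:0',
       '-show_entries', 'frame=pkt_pts_time', '-of', 'csv=p=0',
       '06NIHVWM131B_child_worldviz.mp4']
out = subprocess.run(cmd, capture_output=True, text=True).stdout
frame_times = [float(line) for line in out.split() if line and line != 'N/A']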

papr 19 July, 2018, 13:58:50

Idea: Don't skip to the starting time. Extract all frames, but instead of only numbering the frames, include the video frame timestamps in the name. This assumes that this is possible. After extracting, you can delete all files whose time is before your original starting time, and you now know the frame index for the matlab timestamps
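
The same idea is straightforward with OpenCV instead of ffmpeg, since world_timestamps.npy has exactly one entry per world frame. A sketch (the output folder is hypothetical):

import os
import cv2
import numpy as np

timestamps = np.load('world_timestamps.npy')  # one entry per world frame
cap = cv2.VideoCapture('world.mp4')
os.makedirs('frames', exist_ok=True)

for ts in timestamps:
    ok, frame = cap.read()
    if not ok:
        break
    # Name every image after its Pupil timestamp, as suggested above; frames
    # outside the window of interest can simply be deleted afterwards.
    cv2.imwrite('frames/{:.6f}.jpg'.format(ts), frame)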

user-d79ff5 19 July, 2018, 14:13:54

Hey Guys, we are trying to set up surface tracking. So we printed markers, put them on a monitor, enabled the plugin and added a surface, but it does not detect any marker and I am unable to select anything with the mouse. What may I have missed?

user-d79ff5 19 July, 2018, 14:14:04

Chat image

papr 19 July, 2018, 14:15:49

I highly recommend using 1280x720 as the resolution. It results in less distorted images. Additionally, you need to reduce the min_marker_perimeter value. Reducing it too much might result in false positives though!

papr 19 July, 2018, 15:25:22

@user-3f0708 The mouse control script is just a very simple example. See our community repository for related work: https://github.com/pupil-labs/pupil-community

user-3856e9 19 July, 2018, 17:26:30

Hey all, is surface tracking the best way to track someone's gaze on a computer screen? Accurate enough to control a mouse?

papr 19 July, 2018, 17:35:14

@user-3856e9 You can test this with the example script in the Pupil Helpers repository

user-3856e9 19 July, 2018, 17:36:17

Ah ok I'll take a look, thanks

papr 19 July, 2018, 17:37:55

But to answer your other question: There is currently no other official way to track regions of interest other than through surface tracking.

user-cd9cff 19 July, 2018, 18:26:41

@papr Hello, As I have been looking at the pupil epoch clock, I have realized that the clock does not increment by each second. For example, if I run a five second trial, the clock will have incremented by twenty counts

user-cd9cff 19 July, 2018, 18:32:06

How does the epoch clock increment?

papr 19 July, 2018, 19:17:00

@user-cd9cff the time unit is always seconds

user-abc667 19 July, 2018, 20:26:23

@papr About surfaces -- we understand and have been using the fiducial markers. In one of the videos about using surfaces (https://www.youtube.com/watch?v=bmqDGE6a9kc) the narrator mentions that markers have an orientation. What does orientation mean in this case, and why does it matter? If I copy the markers from the image in the documentation and paste them onto a paper form with the same orientation, will the orientations be correct (for whatever "correct" means)? Similarly, what is meant by the orientation of a surface? If this is already explained in the documentation, feel free to point us there. Thanks.

papr 19 July, 2018, 20:30:56

@user-abc667 The markers have a top, bottom, left, and right side. This is meant by orientation.

papr 19 July, 2018, 20:33:33

The markers do not have to point in the same direction for the surface definition. You can download Pupil Player and the dataset from the video from our website and play around with it.

user-cd9cff 19 July, 2018, 20:35:24

@papr [255910.14473, 0.7996776304667995, 0.5497307879030378, nan, nan] [255910.168938, 0.849932840856795, 0.5052694567434703, nan, nan]

user-cd9cff 19 July, 2018, 20:35:38

The numbers in the first column are pupil timestamps

user-cd9cff 19 July, 2018, 20:35:46

but they were taken five seconds apart

user-cd9cff 19 July, 2018, 20:35:57

however, the timestamp does not reflect this

papr 19 July, 2018, 20:36:29

@user-cd9cff From were are you reading the timestamps?

user-cd9cff 19 July, 2018, 20:36:34

from python

user-cd9cff 19 July, 2018, 20:37:24

This is the gist of the Python program

papr 19 July, 2018, 20:37:34

Yeah, but in which context? As a plugin? As an external script? This looks suspiciously like a misinterpretation of bytes to me

user-cd9cff 19 July, 2018, 20:38:01

I am reading it from a Python script

user-cd9cff 19 July, 2018, 20:39:17

The timestamp is being queried on line 36

user-cd9cff 19 July, 2018, 20:39:26

and being stored in the ptime variable

papr 19 July, 2018, 20:40:37

Ah OK, so the first value in each list is the timestamp. But this makes sense

papr 19 July, 2018, 20:41:20

0.02 seconds passed between the generation of the two gaze data points that you received

user-cd9cff 19 July, 2018, 20:42:19

they were five actual seconds apart

user-cd9cff 19 July, 2018, 20:42:34

if __name__ == '__main__':
    f()
    time.sleep(5)
    f()

papr 19 July, 2018, 20:45:39

Ah, yes, but the subscription receives all data. Zmq caches incoming data in a FIFO queue. You need to continuously read from the socket and discard values that you do not want.

papr 19 July, 2018, 20:46:30

This is not a request reply pattern.

user-cd9cff 19 July, 2018, 20:47:45

So then how would I get timestamps that are five second apart?

papr 19 July, 2018, 20:48:56

You receive all data, check each incoming datum's timestamp, and discard data that is less than 5 seconds apart

user-cd9cff 19 July, 2018, 20:49:19

so while the loop is running, I read all data?

papr 19 July, 2018, 20:49:27

Yes

papr 19 July, 2018, 20:50:40

All data that is available. And while no data is available, the recv call blocks. This prevents the process from running at 100%

user-abc667 19 July, 2018, 20:50:42

@papr Will download the data from the video; thanks for the suggestion. Also: "The markers have a top, bottom, left, and right side.... The markers do not have to point in the same direction for the surface definition."
Then what is the operational consequence of them having an orientation? (If we can ignore marker orientation, that's fine, just trying to make sure we're using the technology properly.)

papr 19 July, 2018, 20:54:23

@user-abc667 There is nothing much that could go wrong in terms of orientation

user-cd9cff 19 July, 2018, 20:54:56

Doesn't this mean that the data is unreliable? I am trying to start collecting data at one point, continuously collect data for five seconds, and then stop

user-cd9cff 19 July, 2018, 20:55:06

The timestamp, however, shows a different time

papr 19 July, 2018, 20:55:19

You can always edit the orientation of the defined surface after the fact in Pupil Player.

papr 19 July, 2018, 20:55:55

@user-cd9cff I don't understand. Why wouldn't the timestamp change over time?

user-cd9cff 19 July, 2018, 20:56:44

This is what I got over a five second trial

[257181.793281, nan, nan, nan, nan] b'257181.793281' [257181.80135, nan, nan, nan, nan] b'257181.80135'

papr 19 July, 2018, 20:57:42

That's because you only read the first two data points of your trial instead of the first and the last.

user-cd9cff 19 July, 2018, 20:58:26

yea sorry

user-cd9cff 19 July, 2018, 20:58:34

I just realized that I sent you the wrong data

papr 19 July, 2018, 20:59:00

Zmq receives data in the background and caches it. Calling recv reads from that cache. The cache is implemented as a fifo queue

user-cd9cff 19 July, 2018, 20:59:39

what I actually got was this: 257291.668573 257294.968804

user-cd9cff 19 July, 2018, 21:00:17

I understand that it is a FIFO queue, but a second in the pupil epoch time takes longer than an actual second

user-cd9cff 19 July, 2018, 21:01:22

the timestamp shows a difference of 4.3

papr 19 July, 2018, 21:03:20

A possible reason is that there are still 0.7 seconds worth of data in that cache queue.

papr 19 July, 2018, 21:03:37

Could you please share the current version of your script?

papr 19 July, 2018, 21:08:35

Ah never mind

user-cd9cff 19 July, 2018, 21:08:43

This is the script that queries each data point https://gist.github.com/saipraneethmuktevi/bbdf2e298deef659149e55743b480a75

This is the sort of 'main' script that interfaces with Matlab: https://gist.github.com/saipraneethmuktevi/08e9464527800e2794394371702e2818

papr 19 July, 2018, 21:09:24

I found the issue. The problem is that the subscription takes a bit of time. You start your timer directly after subscribing.

papr 19 July, 2018, 21:11:17

The correct way would be to:

  1. Subscribe
  2. Sleep for 0.5-1.0 seconds
  3. Ensure that the queue is empty by reading all available data
  4. Start the timer
  5. Continue receiving data
  6. Stop after x seconds
  7. Compare timestamps
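
A minimal sketch of that sequence, assuming Pupil Remote on its default port 50020, as in the pupil-helpers examples:

import time
import zmq
import msgpack

ctx = zmq.Context()

# 1. Subscribe: ask Pupil Remote for the SUB port, then connect to it.
req = ctx.socket(zmq.REQ)
req.connect('tcp://127.0.0.1:50020')
req.send_string('SUB_PORT')
sub = ctx.socket(zmq.SUB)
sub.connect('tcp://127.0.0.1:{}'.format(req.recv_string()))
sub.setsockopt_string(zmq.SUBSCRIBE, 'gaze')

# 2./3. Let the subscription settle, then drain anything already queued.
time.sleep(1.0)
while sub.poll(timeout=0):
    sub.recv_multipart()

# 4.-6. Start the timer and keep receiving for five seconds.
start = time.monotonic()
data = []
while time.monotonic() - start < 5.0:
    topic, payload = sub.recv_multipart()
    data.append(msgpack.unpackb(payload, raw=False))

# 7. The first and last timestamps should now be ~5 seconds apart.
print(data[0]['timestamp'], data[-1]['timestamp'])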

user-cd9cff 19 July, 2018, 21:12:30

another situation I am getting is this: [257554.485408, nan, nan, nan, nan] [257728.509449, nan, nan, nan, nan]

These two timestamps were taken consecutively, yet indicate a 200 second delay

user-cd9cff 19 July, 2018, 21:12:45

is this because of a cache that is not empty?

papr 19 July, 2018, 21:13:32

@user-cd9cff Timesync is disabled? What is the timestamp of the next datum?

user-cd9cff 19 July, 2018, 21:13:54

[257728.623036, nan, nan, nan, nan]

user-cd9cff 19 July, 2018, 21:14:02

it goes in an orderly fashion from there

user-cd9cff 19 July, 2018, 21:16:11

should I empty the cache tho?

papr 19 July, 2018, 21:16:45

A jump of 200 seconds is weird. Is this reproducible?

user-cd9cff 19 July, 2018, 21:16:53

yes

papr 19 July, 2018, 21:17:11

Is it always the first timestamp that is off by that much?

user-cd9cff 19 July, 2018, 21:17:23

[257740.048154, nan, nan, nan, nan] [257833.294428, 0.49847990851838514, 0.8020016869958443, nan, nan]

user-cd9cff 19 July, 2018, 21:17:30

no, it's usually in the middle

user-cd9cff 19 July, 2018, 21:17:43

Chat image

papr 19 July, 2018, 21:19:25

Could you make a screenshot of Capture, please?

user-cd9cff 19 July, 2018, 21:20:16

Chat image

user-cd9cff 19 July, 2018, 21:20:21

like this?

papr 19 July, 2018, 21:22:33

Yes, thanks

papr 19 July, 2018, 21:28:36

So your actual goal is to receive gaze data in Matlab?

user-cd9cff 19 July, 2018, 21:29:54

Yes, and I am achieving this by compiling gaze data in Python trial by trial and then submitting it to Matlab

papr 19 July, 2018, 21:34:34

Mmh, the screenshot with the timestamps is really weird. At the bottom the timestamps jump back in time

papr 19 July, 2018, 21:36:03

I will have to think about this after having a bit more sleep (German timezone 😉 ). I am off for today. Let me know if you find the source of the problem.

user-abc667 19 July, 2018, 21:36:48

@papr Per earlier discussion, we're looking into using natural features calibration for our task; recall that we want to track eye position for someone looking down at a test form on the tabletop.

Plan is to create a calibration form, a piece of paper roughly 12.5" x 16" with 9 small icons (eg, dog, cat...) in 3 rows. Put that on the table where the test form will be and calibrate by asking the subject to look at each icon in turn, clicking on that point in the world window (per the documentation).

a) Is this the right idea? Anything we need to change?

b) I'm trying to understand how natural features calibration works. Is the pupil position collected during this process just wherever the eyes are when the point in the world window is clicked (hence possibly noisy), or does the system look for pupil position fixation after the world window is clicked, and use that position?

Thanks!

user-cd9cff 19 July, 2018, 21:39:06

@papr if you’re up this late answering questions in Germany, props to your dedication

papr 19 July, 2018, 21:42:22

@user-abc667 I would simply use a printed calibration marker and use manual marker calibration. This allows you to do automatic offline calibration. Natural feature calibration in Capture is noisy since we try to track the clicked position with optical flow and that is not really reliable.

Also: If you use pictures you will introduce label noise since there is not a single point where the subject looks at, e.g. the subject could be looking at the cat's tail or its head. Use visual stimuli where the gaze target is clear to the subject, e.g. our calibration markers have concentric circles with a single point in the middle.

user-abc667 19 July, 2018, 21:43:21

@papr Got it, many thanks. Will try this. And indeed, thanks for the late nite (for you) service!

user-103621 20 July, 2018, 14:08:59

Hey everyone, while experimenting with the highspeed camera and the exposure time value, the luminosity jumps from bright to a darker state while sliding the exposure time bar (126-128 and 159-160 as examples). Any idea why this is happening? Does changing the exposure time give an automatic reaction from the sensor?

user-103621 20 July, 2018, 14:09:15

best regards

user-833165 20 July, 2018, 14:50:18

Hello Everyone,

I accidentally ripped the cables out of one camera connector, and I need to get some replacement metal pins... does anyone know the model of the connector? On the housing attached to the PCB I could read J and JKJ. Any help will be appreciated!

Best Regards

Chat image

wrp 20 July, 2018, 15:08:13

@user-833165 please send an email to info@pupil-labs.com re replacement parts and/or repairs

user-c828f5 20 July, 2018, 15:23:55

@papr Hello! I am using the normalized 3D gaze vector, i.e. gaze_normal_x/y/z. What is the origin considered for calculating each eye's normalized gaze vector? The docs say that the visual axis goes through the eyeball center and the object that's looked at, in the world camera coordinate system. I'm asking because I need to compute the cyclopean gaze vector.

user-11dbde 20 July, 2018, 16:54:40

Anyone having problems using the screen markers calibration procedure?

user-11dbde 20 July, 2018, 16:55:09

I activate the calibration but no fixation is ever recognized and the process gets stuck on the first marker?

user-11dbde 20 July, 2018, 16:55:50

any ideas?

user-11dbde 20 July, 2018, 16:55:56

tks

mpk 21 July, 2018, 06:43:28

@user-11dbde can you share a video?

user-bab6ad 21 July, 2018, 09:15:57

@user-11dbde it worked with our thingy (not on the HoloLens). Maybe you try with ours to check if it is a software or hardware problem? I am on holiday, but Oliver should be able to give it to you

user-380f66 22 July, 2018, 14:03:50

Hello, we are getting good eye tracking with adults but really bad eye tracking with preschool aged children, even though everything seems to be going OK during the calibration. We are wondering if we could adjust any settings in the detection/processing stream that may be tuned more for an adult pupil, to better detect a child's pupil? Thank you! Sara

wrp 23 July, 2018, 02:51:07

Hi @user-380f66 thanks for the feedback. If possible, can you send a sample to data@pupil-labs.com so that we could provide concrete feedback (we realize this may not be possible due to privacy - but if possible, let us know). What is the accuracy reported after calibration?

user-c1220d 23 July, 2018, 14:31:02

Talking about the use of the Pupil Mobile app: do I have to use USB-C on the phone, or can I also use the normal micro USB type-B port? I need to buy the right cable from the glasses to the smartphone, so I wonder if the phone's port type must be the same as that of the glasses. Thank you

papr 23 July, 2018, 14:34:19

@user-c1220d This will most likely not work. See the Pupil Mobile repository for a list of supported devices.

user-8779ef 23 July, 2018, 16:02:35

Hey guys, we are in the middle of data collection before a big deadline and one of our headsets seems to be failing us.

user-8779ef 23 July, 2018, 16:03:42

One eye camera is dropping out. We have replaced the eye camera with the camera from another headset, and the problem persists. So, we think its the headset wiring, and not the camera. Given the time constraints, we're just about ready to order a new headset with super fast shipping.

user-8779ef 23 July, 2018, 16:04:01

However, I thought it best to check in before we do this, just in case you want us to try anything funny.

papr 23 July, 2018, 16:06:23

Hey @user-8779ef Please write an email to [email removed] I do not think there is much more to try.

user-8779ef 23 July, 2018, 16:06:35

Ok, thanks @papr .

user-8779ef 23 July, 2018, 16:06:52

We have a 120 hz system. Can we use that headset with the 200 Hz cameras?

user-8779ef 23 July, 2018, 16:07:06

....are they interchangeable?

papr 23 July, 2018, 16:08:38

They should be interchangeable. But it looks like the cable tree is at fault if you have already tried a different camera.

user-8779ef 23 July, 2018, 16:09:07

Yes, but that's good news. I believe we can use the 120 Hz system's cable tree with the 200 hz camera arms.

user-8779ef 23 July, 2018, 16:10:01

120 hz headwear, but take off 120 hz cameras and replace with 200 hz cameras.

user-8779ef 23 July, 2018, 16:10:11

It seems to be the 200 hz system's tree that is at fault.

papr 23 July, 2018, 16:10:34

You can test it by attaching the 200hz cam to an 120hz headset

user-8779ef 23 July, 2018, 16:10:39

Exactly.

papr 23 July, 2018, 16:10:49

Before you start ripping out cable trees...

user-8779ef 23 July, 2018, 16:10:54

No no no ripping out

user-8779ef 23 July, 2018, 16:11:02

Thanks!

user-3f0708 23 July, 2018, 16:47:23

what does this comment mean in mouse_control.py?

  m.move (0, 0) # hack to init PyMouse - still needed?

user-cd9cff 23 July, 2018, 17:49:35

@papr

Hello, were you able to figure out why the pupil timestamps would jump and not correlate with real time?

user-20de15 23 July, 2018, 18:37:06

Hey, what does the "on_srf" parameter mean? (In the "fixations_on_surface" files.) If I want to count how many times someone fixates on a certain surface, should I ignore the "on_srf = false" values?

user-c4da24 23 July, 2018, 19:06:17

Hello, I wanted to ask how we may calculate the depth of each time frame via Pupil Labs? We are currently piloting an experiment where we want to capture what individuals look at in their everyday lives. Thus, I have integrated all of the calibration methods (i.e., screen, manual markers, natural features). A question and concern that I have encountered is that I do not know how to calculate the actual depth of the gaze detections. I have been looking at all the documents and played around with all the plugins, but I have been very unsuccessful in actually retrieving the depth of the gaze. I will appreciate any insight. Best, Celene

papr 23 July, 2018, 19:08:38

We map gaze to surfaces regardless of whether the gaze is actually on the surface. This field tells you if that gaze was within the surface bounds. @Johnny#7075 Correct, ignore these
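
Counting fixations on a surface from a Player export then reduces to filtering on that field. A sketch (file and column names follow a typical surface export and should be checked against your own export's header row):

import csv

with open('fixations_on_surface_Surface_1.csv') as f:
    rows = list(csv.DictReader(f))

# Keep only fixations whose gaze actually fell within the surface bounds.
on_surface = [r for r in rows if r['on_srf'] == 'True']

# A fixation can span several world frames (one row each), so count ids.
print(len({r['id'] for r in on_surface}))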

papr 23 July, 2018, 21:49:45

@user-cd9cff No, I have not had time yet and will probably not have any this week since I am at EuroPython in Edinburgh 🙂

user-11dbde 23 July, 2018, 21:52:40

thank you so much for replying

user-11dbde 23 July, 2018, 21:52:49

tomorrow i have time to check this problem again

user-cd9cff 23 July, 2018, 22:11:41

@papr Thank you for your response.

wrp 24 July, 2018, 00:38:24

@user-3f0708 that line of code was used to init PyMouse and moved the mouse to the 0,0 screen coordinate at the beginning of the script. At the time the script was written this line was necessary to init PyMouse, but it may no longer be needed. Try commenting it out and see 😸

user-c351d6 24 July, 2018, 09:55:20

Hi guys, is there a possibility to define a surface manually? For example, surface "xyz" consists of markers 1, 2, 3, 4, and the position of the surface is defined afterwards? It's quite a lot of work to define a surface when you have a lot of markers and you just want a certain subset of these markers related to a surface. The surface tracking also crashes on every second surface definition.

wrp 24 July, 2018, 09:57:01

@user-c351d6 there is not currently a method to define surfaces "manually" (e.g. with an external file). Crashes should be resolved with the forthcoming v1.8 release

papr 24 July, 2018, 09:58:33

@user-c351d6 Am I assuming correctly, that your marker setup is fixed and that you cannot show the surfaces one by one to the camera?

user-c351d6 24 July, 2018, 09:59:30

Exactly, we have to walk in our environment to get all the surfaces

papr 24 July, 2018, 10:01:32

How about creating "dummy" surfaces like on a piece of paper, that use the same markers and proportions as the original surfaces? You could register them one by one without having to walk around

user-c351d6 24 July, 2018, 10:03:56

Sounds like a good workaround.

user-82e7ab 24 July, 2018, 12:52:40

hi, I'm using the pupil headset with 200Hz cameras and Pupil seems to have problems detecting its 3d eye model on one eye (the left). Also the right one is bad sometimes, but it occurs far more often on the left side. Any ideas why this happens? Maybe the positioning of the cameras isn't optimal?

Chat image

papr 24 July, 2018, 12:53:14

Looks like the 2d pupil detection does not work well

papr 24 July, 2018, 12:53:44

Could you change to the "Algorithm view" in the general settings of the eye windows?

papr 24 July, 2018, 12:53:51

And post a screenshot of that

user-82e7ab 24 July, 2018, 12:54:51

sure

user-82e7ab 24 July, 2018, 12:55:10

algorithm mode

Chat image

user-82e7ab 24 July, 2018, 12:56:24

(the right is flipped upside down)

papr 24 July, 2018, 12:57:38

Mmh, looks actually good. Try rolling your eyes. The 3d model needs pupil positions from different locations to build up the model

user-82e7ab 24 July, 2018, 13:01:54

yes, rolling the eyes helps, but most of the time the model "is lost again" in less than a minute

user-82e7ab 24 July, 2018, 13:03:09

which does not occur with the right eye .. at least much less frequently

user-63941a 24 July, 2018, 13:52:50

Hi, does the calibration support monocular test?

papr 24 July, 2018, 13:53:42

@user-63941a What do you mean by test?

user-63941a 24 July, 2018, 14:01:22

We want to compare searching ability between two eyes and one eye, so the experiment will need participants closing one eye during the test. I wondered if Pupil Capture supports that, and do we need to adjust the settings when it comes to a monocular test?

papr 24 July, 2018, 14:26:02

I understand. Closing one eye will lead to low pupil confidence for data from this eye. The calibration procedure will discard low-confidence data automatically.

user-63941a 24 July, 2018, 14:28:43

Okay I get it. Thanks for the information.

user-e2056a 24 July, 2018, 14:45:03

Hello! @papr, may I ask a question about obtaining the duration of gaze? We could obtain the "gaze_timestamp" from the "gaze-position" folder, but what is the definition of gaze by Pupil Labs? How can we find the duration of the gaze? Thank you!

papr 24 July, 2018, 14:49:03

Pupil data is relative to the eye camera space. Gaze is mapped pupil data and is relative to the world/scene camera coordinate system. For each eye video frame there is exactly one pupil datum. Therefore you only have one gaze datum for each pupil datum (monocular mapping) or one gaze datum for a pair of pupil datums (binocular mapping). Therefore gaze does not have a duration; it is the gaze position at the time at which the eye video frame(s) were taken.

user-e2056a 24 July, 2018, 15:40:12

Thank you,@papr, where can I find the reference of this definition of gaze?

papr 24 July, 2018, 15:41:05

See https://docs.pupil-labs.com/#development-overview under Pupil Datum Format

user-e2056a 24 July, 2018, 15:52:58

Thank you, @papr, I was looking for the duration between when the gaze position started falling on a certain surface and when it left that surface. However, when I look at the "surface event" file, sometimes the gaze enters one surface, followed by entering another surface without exiting the previous one (there's no overlap between those two surfaces). Is the data normal? If not, what would you suggest to improve it? Thank you.

user-e2056a 24 July, 2018, 15:53:58

Thanks for the link https://docs.pupil-labs.com/#development-overview , @papr, is there any literature support available?

papr 24 July, 2018, 15:54:18

What do you mean by "literature" support?

papr 24 July, 2018, 15:56:38

~~If the surfaces overlap you can get "enter" events without having "leave" events.~~ Ah, you say that they do not overlap.

papr 24 July, 2018, 15:57:52

If you know that they do not overlap, then you can simply assume that they exited the previous surface as soon as they enter a new surface.

papr 24 July, 2018, 15:58:39

We cannot assume or test if the surfaces overlap. Therefore we cannot "fix" this since it can be expected behavior.
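
Under that non-overlap assumption, dwell durations can be derived from the surface events export. A sketch (file and column names follow a typical Player export and should be verified against yours):

import csv

with open('surface_events.csv') as f:
    enters = [e for e in csv.DictReader(f) if e['event_type'] == 'enter']

# Non-overlapping surfaces: each 'enter' ends the previous visit.
for cur, nxt in zip(enters, enters[1:]):
    dwell = float(nxt['world_timestamp']) - float(cur['world_timestamp'])
    print(cur['surface_name'], round(dwell, 3), 'seconds')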

user-cd9cff 24 July, 2018, 16:02:27

@papr Where can I find blink detection support?

user-bfecc7 24 July, 2018, 16:04:06

Hello everyone. Didn't know if any of you have run into this ValueError before. I was exporting a video in Pupil Player and have plenty of available space so I'm not entirely sure what it is referencing. Using a Windows system and tried this on Pupil Player version 1.2.7 and Pupil Player version 1.7 and got the exact same error. Wanted to check here first before making a formal issue in the Github.

Chat image

user-e2056a 24 July, 2018, 23:19:29

Thank you @papr, by "literature support" I mean: does the method used by Pupil Labs to identify gaze exist in any literature?

user-cd9cff 24 July, 2018, 23:26:22

@papr 1) Do you have a way of monitoring fixation?

2) Do you have a way of detecting blinks in data recorded remotely without pupil capture?

wrp 25 July, 2018, 04:15:30

@user-cd9cff Pupil Capture and Pupil Player have fixation detector and blink detector plugins built in. You can classify ("detect") both blinks and fixations post-hoc with Pupil Player or in real-time with Pupil Capture.

wrp 25 July, 2018, 04:17:56

@user-e2056a can you be more specific regarding your question: "method used by Pupil Labs to identify gaze exist in any literature?" I am confused by the terms "gaze exist" can you please clarify.

papr 25 July, 2018, 06:24:51

@user-e2056a See this spreadsheet with published papers that cite us: https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/htmlview#gid=0

user-d9bb5a 25 July, 2018, 07:50:40

Good afternoon friends. Can someone among you handle Pupil Labs data processing? I need your help or advice.

user-82e7ab 25 July, 2018, 10:32:36

hi again, referring to my previous post (https://discordapp.com/channels/285728493612957698/285728493612957698/471298673410572288) Is it possible to "fix" the current 3d eye model - maybe at least against large changes (to still compensate for small slippage)?

papr 25 July, 2018, 11:15:54

@user-82e7ab The 3d model is currently being reworked to include refraction. This will include overall model stability improvements. Unfortunately, there is no official release date yet

user-82e7ab 25 July, 2018, 11:33:59

thanks Pablo for keeping us up-to-date

user-c351d6 25 July, 2018, 13:26:09

Are you guys planning to release 1.8 before the weekend?

user-0e8148 25 July, 2018, 14:41:23

Hi, Any idea why recordings of several minutes in capture have raw videos of each eye saved as videos of only 1 frame?

user-85976f 25 July, 2018, 14:51:44

Hey @papr, I'm on an older Mac (iMac 27-inch, Mid 2010) running High Sierra, and getting an instant crash of Pupil Player and Service upon launch. Is this a known issue, or do you have any suggested fixes?

user-85976f 25 July, 2018, 14:53:14

...running release 1.7

user-85976f 25 July, 2018, 14:55:04

1.6 also crashes.

user-85976f 25 July, 2018, 15:13:26

Submitted a new issue.

user-85976f 25 July, 2018, 15:14:21

If this is an issue with all older Macs, then perhaps you should add Mac version to the software requirements.

user-85976f 25 July, 2018, 15:18:28

We are going to format the machine and try again.

papr 25 July, 2018, 17:45:58

@user-0e8148 it only looks like it. It was a bug where videos were saved with a huge fps number. All frames are there. Please export with Player to get correct videos.

user-8779ef 25 July, 2018, 17:50:23

I have someone getting set up to run from source. Hopefully the error messages will be informative.

user-8779ef 25 July, 2018, 19:15:46

From a tech: "I tried Pupil Player on a clean 10.13 machine, the same make of the machine in question, with the latest version of Pupil Player and I'm getting the same error. I stand by the idea that it has something to do with the way in which they're compiling their binaries. The machine level error which is being thrown is known to be associated with compilation problems due to the way in which the code optimizer is written. Meaning, even if we were to nuke back to a clean 10.13 install there would be no change in behavior. "

user-8779ef 25 July, 2018, 19:16:13

So, it appears that pupil player is not compatible with older macs.

papr 25 July, 2018, 22:11:34

@user-8779ef running from source is not an option?

papr 25 July, 2018, 22:13:00

Because yes, for the bundle everything is compiled on one machine and packed into the bundle. If the processor is not compatible, the bundle will not work. Which CPU do you use exactly?

user-988d86 25 July, 2018, 22:36:05

I have a question re: blink duration. When we've exported data after the fact from videos using Pupil, we've been able to get information on blink duration. However, when we are getting data in real time by subscribing to blink notifications, we don't get any information about the duration of a blink once it has ended. I'm trying to figure out how we could calculate this ourselves using the data we are receiving, but we aren't getting onset/offset messages for every blink event. Sometimes, we only get an onset, sometimes only an offset. Anyone have any suggestions about this? Thanks!
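
For reference, when both messages do arrive, pairing them is straightforward; the sketch below matches each offset to the most recent unmatched onset and drops orphans. It assumes the real-time blink datums carry 'type' ('onset'/'offset') and 'timestamp' fields, and Pupil Remote on its default port 50020:

import zmq
import msgpack

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect('tcp://127.0.0.1:50020')
req.send_string('SUB_PORT')
sub = ctx.socket(zmq.SUB)
sub.connect('tcp://127.0.0.1:{}'.format(req.recv_string()))
sub.setsockopt_string(zmq.SUBSCRIBE, 'blinks')

onset_ts = None
while True:
    topic, payload = sub.recv_multipart()
    blink = msgpack.unpackb(payload, raw=False)
    if blink['type'] == 'onset':
        onset_ts = blink['timestamp']   # a repeated onset overwrites the last
    elif blink['type'] == 'offset' and onset_ts is not None:
        print('blink duration:', blink['timestamp'] - onset_ts)
        onset_ts = None                 # an orphan offset is simply ignored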

mpk 26 July, 2018, 04:39:46

@user-8779ef pupil bundles do not work with pre Intel i3/i5/i7 macs.

user-128c78 26 July, 2018, 07:43:04

Hi, I'm new to eye tracking. I studied a lot about it but I didn't find a good solution for blink detection. What's your idea? I will be so happy if I find the answer.

wrp 26 July, 2018, 09:17:42

Hi @user-128c78 you can read about our plugin's approach to blink detection here: https://docs.pupil-labs.com/#blink-detection

user-feb6c2 26 July, 2018, 09:36:50

hi! We have a problem: when we start Pupil Capture the world camera is white

wrp 26 July, 2018, 09:37:52

@user-feb6c2 please could you go to the world window and in the menu General > Restart with default settings

wrp 26 July, 2018, 09:38:07

@user-feb6c2 what OS are you using?

user-feb6c2 26 July, 2018, 09:39:51

@wrp We did this two times, but it still comes up like this. Windows.

Chat image

user-feb6c2 26 July, 2018, 09:54:04

are you here?

wrp 26 July, 2018, 09:54:40

Hi @user-feb6c2 yes, thanks for sending the image. Strangely it looks like the camera white balance or gain settings are not getting reset when you reset default settings.

wrp 26 July, 2018, 09:55:11

Can you exit Pupil Capture and manually delete the pupil_capture_settings folder from C:\Users\YourUserName\

wrp 26 July, 2018, 09:55:15

and relaunch Pupil Capture

user-feb6c2 26 July, 2018, 09:57:22

okay, I'll try

wrp 26 July, 2018, 09:57:46

Thanks @user-feb6c2

wrp 26 July, 2018, 11:01:02

@user-feb6c2 any updates?

user-8779ef 26 July, 2018, 11:01:50

@papr @mpk Thanks guys. Makes sense. Could it really be pre-intel?!? I don't think it's THAT old, but maybe I'm wrong.

user-8779ef 26 July, 2018, 11:01:56

This makes me feel old.

wrp 26 July, 2018, 11:02:13

@user-8779ef not pre intel but pre i Series

user-feb6c2 26 July, 2018, 11:02:15

@wrp THANKS now it is working 😃

wrp 26 July, 2018, 11:02:43

@user-feb6c2 great - thanks for the feedback. Strange that restarting with default settings did not clear the user settings folder on Windows 10. We will look into this.

mpk 26 July, 2018, 11:07:31

@user-8779ef Not Pre Intel but pre 'i' series.

user-feb6c2 26 July, 2018, 11:11:25

@wrp I also would like to ask you about the manual marker calibration. Sometimes it works, but sometimes the fixation is not where we look. I don't understand if we have to record during the calibration in order to calibrate it offline later?

wrp 26 July, 2018, 11:14:44

@user-feb6c2 I will be AFK soon, but want to give a quick reply before I have to go. There are two issues here from what I understand:

  1. Quality of calibration - Are you using the most recent v4.0 marker from the docs? What is the accuracy reported? What is your calibration technique? Have you tried single marker calibration with manual marker?
  2. Offline calibration - If you want to have the option to re-calibrate post-hoc in Pupil Player, then you should start recording prior to calibrating. This way you will record the calibration process and can then detect the calibration markers post-hoc in Pupil Player. (Note: please ensure that you also record eye videos - this option is enabled by default in the recorder plugin, but just making this note in case you turned it off.)

user-103621 26 July, 2018, 11:38:02

Hey i already asked this a while back and wanted to up it again. A reply would be much appreciated !

Hey everyone,
while experimenting with the highspeed camera and the exposure time value, the luminosity jumps from bright to a darker state while sliding the exposure time bar (126-128 and 159-160 as examples). Any idea why this is happening? Does changing the exposure time give an automatic reaction from the sensor?
best regards

user-380f66 26 July, 2018, 11:48:20

@wrp , Thanks for your response to my enquiry about eye tracking with preschool aged children. I cannot provide any data from our participants because we have privacy agreements but I am happy to answer any questions about the eye cam, etc. When you ask about accuracy, are you asking about the percentage of data dropped during the calibration? For our adult participants it tends to be somewhere between 8 and 13%. For our kids it has been 38% and up.

user-380f66 26 July, 2018, 12:00:44

@wrp Another thing I should mention, if it helps, is that it is really difficult for us to adjust the eye cam to get a really good view of the child's eye. I'm not sure why this is--maybe just because the child's head is smaller? But the eye always looks angled a little bit to the side in the eye cam video instead of being a direct view.

wrp 26 July, 2018, 12:28:40

@user-380f66 I'm asking about the gaze accuracy metric not percentage of data used during calibration.

wrp 26 July, 2018, 12:29:12

@user-103621 I see your question but don't have an immediate answer for you

user-2ff80a 26 July, 2018, 13:27:36

Hi all, our lab is still trying to test our experimental setup using Pupil Mobile with the Moto Play Z2, and having some pretty substantial issues with low framerates, especially for the worldcam. I've tried switching off the H.264 transcoding, but don't really want to go below 1080 x 720 resolution or 30fps, if possible. So far I have also tried local recording (as suggested), which can get as good as 17fps on a cool day, or as low as 1 fps on a hot day--which happened yesterday and caused us to have to omit eye-tracker data collection altogether. Framerates and heating issues are not so bad with the Pixel tablet, but the eye cameras behave erratically when trying to do local recording with it.

user-2ff80a 26 July, 2018, 13:31:04

So, our question remains.. is there anything more we can do to try to improve this recording setup (specifically frame rate, heat, timestamps and overall stability), or should we look at running Pupil capture on a mini-PC to assure long term reliability? Does anyone else have similar issues with heat? We'll need to maintain this setup for at least 100 participants and have a huge robotics research consortium waiting on this data, so my PI is getting pretty adamant about resolving this. I saw that there will be a new release soon, fixing some of the other errors we have been seeing, so will continue to try to improve what we have for the next week or so. After that, I should go ahead with the mobile PC purchase.

user-2ff80a 26 July, 2018, 13:43:52

So, the next question would be: what are the minimum requirements for running pupil capture? Would an i3 processor be enough, or better to go with an i5? Considering something similar to a Lenovo Thinkcentre TINY - M73 or M710q.

wrp 26 July, 2018, 14:02:32

@user-380f66 have you tried the eye camera arm extenders? The eye being off center in the camera image is OK, as long as all eye movements are still within the frame of the eye camera window.

mpk 26 July, 2018, 14:11:55

@user-2ff80a thanks for the report! I understand that you are doing local recording only (no streaming?). We have not seen this kind of behaviour here.

papr 26 July, 2018, 14:13:18

@ease-csl-research#3732 I would recommend the i5

user-85976f 26 July, 2018, 14:21:40

So, back at getting player to run on this old Mac. When running from source, I get the error: "Error calling git: "Command '['git', 'describe', '--tags']' returned non-zero exit status 128."

user-464538 26 July, 2018, 14:21:46

Hi all, when I export the gaze positions excel file out of pupil player, what units are the gaze positions measured in? it would seem to me that the norm_pos_x and norm_pos_y should range from (-1,1) but this is not the case.

user-85976f 26 July, 2018, 14:22:03

Eventually, this results in "raise ValueError('Version Error')"

user-85976f 26 July, 2018, 14:23:06

Perhaps I should try cloning the git repo in a different way....

wrp 26 July, 2018, 14:24:11

@user-85976f did you download the git repo as a zip file from release?

user-85976f 26 July, 2018, 14:24:46

Yeah, did that. About to log into git and clone.

wrp 26 July, 2018, 14:25:19

@user-85976f You should actually clone it with git clone. Additionally, as noted earlier, some libs might not build on non-i-series processors.

user-85976f 26 July, 2018, 14:26:04

I think they all built correctly ... we'll see.

user-85976f 26 July, 2018, 14:26:14

Oh, might not build on runtime. Ok, I'll report back.

wrp 26 July, 2018, 14:26:31

@user-464538 norm_pos_x and norm_pos_y can be greater or less than the 0-1 range, as gaze mapping could potentially map outside the frame of the world camera
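
To place those normalized values on a world frame, scale by the frame size; Pupil's normalized coordinate system has its origin at the bottom left, so y must be flipped for image coordinates. A minimal sketch:

def norm_to_pixels(norm_x, norm_y, frame_width, frame_height):
    # Pupil norm space: origin bottom left, (1, 1) top right; image
    # coordinates have their origin at the top left, hence the y flip.
    return norm_x * frame_width, (1.0 - norm_y) * frame_height

print(norm_to_pixels(0.5, 0.5, 1280, 720))   # center -> (640.0, 360.0)
print(norm_to_pixels(1.2, 0.5, 1280, 720))   # off-frame -> (1536.0, 360.0)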

mpk 26 July, 2018, 14:27:05

@user-2ff80a One more note. I saw that the current version of Pupil Mobile does not stop streaming once you have accessed the streams from Pupil Capture. For now just manually stop streaming the sensors in Capture. We will work on a fix. If you are only recording locally this is not an issue!

mpk 26 July, 2018, 15:54:48

@user-2ff80a I just made a 30min 1080p h264 and 200fps binocular local recording using Pupil Mobile without streaming in about 32deg ambient temperature. I saw a smooth 25fps avg. (30fps when using 720p). Please make sure to not stream at the same time. If your problem persists let's do a video debug session!

mpk 26 July, 2018, 15:56:15

@user-2ff80a I did see that Pupil Player does skip frames when the CPU is not fast enough during playback. The fps graph in the top left shows the rendered frames, so this can be misleading. Maybe this is why it looks slow? I find playing it back on a faster machine helps. Also, video exports will not be skipping frames.

user-b04ab9 26 July, 2018, 17:45:59

Hello all, just installed the mobile app and wanted some clarification on the intended functionality. After transferring the locally recorded files/project folder from the Z2 to my desktop, the "Player" software displayed "no fixations available." There doesn't look to be any visualization data at all. No gaze lines or circles are present. The issue is unique to the "Mobile" app, as my primary setup using a tablet with the "Capture" software to record sessions presents as expected. Is Pupil Mobile solely a companion app to "Capture" for the purpose of streaming and previewing video, OR can it function as a standalone means of recording video and data for post-session analysis in "Player"? Perhaps there is a setting I was supposed to enable but didn't? Couldn't find much in the documentation, and understanding it is in alpha, I turn to you fine folks for any thoughts. I am just getting started with the Pupil hardware and software so any help is greatly appreciated. Regards.

papr 26 July, 2018, 21:07:17

@user-b04ab9 you are exactly right, the mobile app only records and streams but does not do any calculations. See the docs on offline pupil detection and calibration

user-8a8051 27 July, 2018, 01:40:56

Hi all, I was attempting to teach one of my colleagues how to use the eye tracker when one of the eye cameras (eye 1) and the world camera stopped working. The world camera has since come back on, but eye 1 is still out, with it listed as "unknown" in the Local UVC sources. Is this a hardware issue?

user-8a8051 27 July, 2018, 01:43:21

ooh, ok eye 1 did just come back online but is now off again

user-8a8051 27 July, 2018, 02:05:13

well, because the cameras do sometimes work, I'm thinking it might be a loose wire or something similar. Any help would be greatly appreciated

wrp 27 July, 2018, 02:50:55

@user-8a8051 what OS are you using? Are cables firmly connected?

user-8a8051 27 July, 2018, 03:41:58

aah sorry, MacBook Air, macOS Sierra 10.12.6

user-8a8051 27 July, 2018, 03:42:42

it was working fine for about 30 minutes before it stopped quite suddenly, and as far as I can tell all the cables still appear to be connected

wrp 27 July, 2018, 03:43:31

@user-8a8051 send us an email [email removed] and we can schedule a time for remote debug and/or return for repair if needed. Please also send your order_id number in the email if possible

user-8a8051 27 July, 2018, 03:45:43

thanks, ill send through shortly

wrp 27 July, 2018, 03:46:32

Ok, thanks @user-8a8051

wrp 27 July, 2018, 03:55:10

@here 📣 Announcement 📣 - We have just posted the v1.8 release of Pupil Software. It is available here: https://github.com/pupil-labs/pupil/releases/tag/v1.8

wrp 27 July, 2018, 03:55:47

We recommend that you update to v1.8. Please let us know if you have any feedback 😄

user-c351d6 27 July, 2018, 08:04:44

I already did some tests with v1.8 and it looks great so far. Good job!

papr 27 July, 2018, 08:21:19

@user-c351d6 great to hear!

user-f81efb 27 July, 2018, 11:08:03

Hello everyone, I wanted to ask if there is a way to get all the data from the video captured by the world camera?

user-2ff80a 27 July, 2018, 11:08:12

@mpk we originally tried streaming, but the timestamp regularity and video quality were not reliable. Streaming over WiFi also increases the heat of the Moto Play, by quite a lot. When I reported the issues here (a week or so ago) the suggestion was to try local recording.

papr 27 July, 2018, 11:10:08

@user-f81efb What do you mean by "all the data"? Videos are saved during recordings.

user-2ff80a 27 July, 2018, 11:10:34

@papr Thanks! hopefully it doesn't come to that... I see some of the later messages from @mpk are addressing the issues we've had, so I'll test out the new release today.

user-f81efb 27 July, 2018, 11:11:14

@papr I meant, can we get the depth parameters as a CSV file maybe?

papr 27 July, 2018, 11:11:56

Which depth parameters in particular? Do you mean the depth value of the 3d gaze?

papr 27 July, 2018, 11:12:15

Or are you using the Realsense camera?

user-f81efb 27 July, 2018, 11:12:48

I know we can get the depth parameters with the gaze, but is it possible to get values for all coordinates, irrespective of gaze?

papr 27 July, 2018, 11:13:45

@user-f81efb which scene camera configuration are you using? The Realsense 3d or the high speed camera?

user-f81efb 27 July, 2018, 11:13:54

the realsense 3D

user-2ff80a 27 July, 2018, 11:14:12

@mpk thanks for the additional testing. I'd tried to stop streaming while doing local recording, but it repeatedly started back up after a few moments. I'm going to get the latest release and see if we get improvements, based on your instructions. Thanks again!

papr 27 July, 2018, 11:16:14

@user-f81efb Ok, the Realsense 3d camera returns 16bit depth/gray values. There is no video format that can save these values, therefore we convert the 16bit depth/gray values to 8bit rgb values and save those images instead. There is a flag in the Realsense Source menu to enable/disable this option.
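
As an illustration of the kind of 16bit-to-8bit mapping described above (a minimal sketch using numpy and OpenCV, not Pupil's exact conversion):

import cv2
import numpy as np

def depth_to_rgb(depth_16bit: np.ndarray) -> np.ndarray:
    # Compress the 16bit range into 0-255; this step is inherently lossy.
    depth_8bit = cv2.normalize(depth_16bit, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    # Apply a colormap so depth differences stay visible in the rgb video.
    return cv2.applyColorMap(depth_8bit, cv2.COLORMAP_JET)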

user-f81efb 27 July, 2018, 11:18:01

Okay! I shall check it. Thank you

papr 27 July, 2018, 11:18:26

@user-f81efb you can preview the conversion as well. The option is part of the same menu

user-2ff80a 27 July, 2018, 11:30:40

@mpk I will also check out the Pupil Player issues you mentioned. The FPS reported in the upper left didn't always make sense based on what we were seeing, so I usually went by that shown in the file source 'Frame rate' field. The computer we run Capture on has an i7-6700, so it should not be an issue, I guess? Will also try video export, as you suggested, and see how that affects the frame skipping.

user-f81efb 27 July, 2018, 11:39:12

@papr how can I access the 16bit depth/gray values?

user-f81efb 27 July, 2018, 11:39:52

I am using Pupil Capture on macOS.

papr 27 July, 2018, 11:59:17

@user-f81efb You will need to modify the backend to do so.

papr 27 July, 2018, 11:59:36

If I remember correctly. Let me check that after lunch.

user-b04ab9 27 July, 2018, 12:15:37

@papr Got it. Thank you for pointing me in the right direction.

papr 27 July, 2018, 12:42:22

@user-f81efb I was wrong! Let me write up a small example plugin that showcases the depth frame access.

papr 27 July, 2018, 12:50:01

@user-f81efb This is the example: https://gist.github.com/papr/0f13943e2aebd768ab6b1508d466caae

  1. Save it in plugins folder within your capture settings folder
  2. Start Capture
  3. Activate your realsense camera from the Realsense Backend selector
  4. Enable Depth Frame Accessor within the Plugin Manager menu
  5. Profit.

This example requires python >=3.6 since I used f-strings in line 12.
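
For readers who cannot reach the gist, a hypothetical skeleton of such a plugin; the event key and frame attribute below are assumptions, so defer to the linked example for the real names:

import logging
from plugin import Plugin

logger = logging.getLogger(__name__)

class Depth_Frame_Accessor(Plugin):
    # Sketch only: logs basic statistics of incoming depth frames.
    def recent_events(self, events):
        frame = events.get("depth_frame")  # assumed event key; check the gist
        if frame is None:
            return
        depth = frame.depth  # assumed attribute holding the 16bit array
        logger.info(f"depth min/max: {depth.min()}/{depth.max()}")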

user-d79ff5 27 July, 2018, 15:17:43

Hey guys! I have a big recording (30 mins); pupil_data is about 1.1 GB, and Player hangs on Windows and crashes on Linux.

user-d79ff5 27 July, 2018, 15:17:47

Here is the log

user-d79ff5 27 July, 2018, 15:17:51

  ~ pupil_player
MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
player - [ERROR] player_methods: No valid dir supplied (/opt/pupil_player/pupil_player)
player - [INFO] launchables.player: Session setting are from a  different version of this app. I will not use those.
player - [INFO] launchables.player: Starting new session with '/media/timopheym/Seagate Backup Plus Drive/Record data/Recovered data 07-26 17_28_12/result/disk/GoogleDisk/shebekeno/Record data/clear/23_july/eye_tracking/004'
player - [INFO] player_methods: Updating meta info
player - [INFO] player_methods: Checking for world-less recording
player - [INFO] player_methods: Updating recording from v1.4 to v1.8
player - [ERROR] launchables.player: Process player_drop crashed with trace:
Traceback (most recent call last):
  File "launchables/player.py", line 646, in player_drop
  File "shared_modules/player_methods.py", line 234, in update_recording_to_recent
  File "shared_modules/player_methods.py", line 599, in update_recording_v14_v18
  File "shared_modules/file_methods.py", line 146, in append
KeyError: 'topic'

user-d79ff5 27 July, 2018, 15:18:16

What can I do about that? I can send a Google Drive folder with one of the records.

user-d79ff5 27 July, 2018, 15:18:30

Thanks!

papr 27 July, 2018, 15:22:52

I will have a look at it, thanks

user-d79ff5 27 July, 2018, 15:28:46

Please! We have 40 recordings and need to open them in OGAMA to make a report, and that looks impossible now =((

user-d16d74 27 July, 2018, 15:30:48

Before installing the latest version of pupil software, should I uninstall the old versions?

papr 27 July, 2018, 15:31:01

@user-d16d74 This is not necessary

user-d16d74 27 July, 2018, 15:31:56

Thanks!

papr 27 July, 2018, 15:32:09

@user-d79ff5 I will write you a PM in order to not spam everyone with the details. I will share significant results here.

user-d79ff5 27 July, 2018, 15:32:31

👍

papr 27 July, 2018, 17:06:55

Ok, it turns out that his recordings are older and had their surface key named differently than was the case in v1.7. We will update the bundle soon.
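
For anyone who wants to check whether their own old recordings are affected before the new bundle lands, a hypothetical diagnostic sketch (the pupil_data layout and key names here are assumptions; the actual fix landed in master):

import msgpack

# pupil_data is a msgpack-encoded dict of datum lists; the v1.4->v1.8
# updater expects every datum to carry a 'topic' key.
with open("/path/to/recording/pupil_data", "rb") as f:
    pupil_data = msgpack.unpack(f, raw=False)

for key, datums in pupil_data.items():
    missing = sum(1 for d in datums if isinstance(d, dict) and "topic" not in d)
    print(f"{key}: {len(datums)} datums, {missing} without 'topic'")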

user-d79ff5 27 July, 2018, 19:24:20

@papr When will it be possible to run it from source? We don't have much time. I managed to export one recording with 20 GB of swap, but with a lot of pain on the previous version...

user-cc65ff 27 July, 2018, 20:44:53

Hello, I'm new to this. Can you please guide me on how to start using it? I would be grateful.

papr 27 July, 2018, 22:05:34

@user-d79ff5 the fix was pushed to master this afternoon. You should be able to run from source already

papr 27 July, 2018, 22:06:26

https://docs.pupil-labs.com/#getting-started

papr 27 July, 2018, 22:06:47

@user-cc65ff see the link above

user-24e31b 29 July, 2018, 04:50:51

Hi @papr, is there a demo video which I can import into Pupil Player to play around and explore with? I don't have any glasses yet but am awaiting approval from our finance department... =)

user-2c0e1f 29 July, 2018, 04:57:11

@papr Hello! The new version hasn't helped much. We tried to load a video in versions 1.7.1 and 1.8 and it crashes during loading (error in the transformation of the world file). In 1.4 everything is OK, except for the mp4 video, which cannot be played after export.

wrp 29 July, 2018, 05:13:06

Hi @user-2c0e1f, we will look into this on Monday and will likely update the v1.8 release with new bundles containing hotfixes for recently reported issues.

user-2c0e1f 29 July, 2018, 05:13:59

@wrp Ok, thanks

wrp 29 July, 2018, 05:14:41

@user-24e31b Here is a sample binocular dataset that you can open in Pupil Player - https://drive.google.com/file/d/0Byap58sXjMVfUFZMdUZzdTdjaFE/view?usp=sharing

wrp 29 July, 2018, 05:14:48

@user-2c0e1f thanks for the report.

user-82e7ab 30 July, 2018, 08:26:17

Hi, I just tried to run Pupil on a Windows 8.1 machine - without success. Are there any additional requirements compared to Windows 10? (I haven't had any problems on a Windows 10 machine, but now I have to use this system.) Pupil Capture prints the following to its console:

C:\Users\show\Downloads\pupil_v1.8-16-g0ab50f4_windows_x64\pupil_capture_windows_x64_v1.8-16-g0ab50f4>pupil_capture.exe
MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
world - [INFO] launchables.world: Application Version: 1.8.16
world - [INFO] launchables.world: System Info: User: show, Platform: Windows, Machine: MASTER, Release: 8.1, Version: 6.3.9600
Running  PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
...
world - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
world - [INFO] camera_models: No user calibration found for camera Ghost capture at resolution [1280, 720]
world - [INFO] camera_models: No pre-recorded calibration available
world - [WARNING] camera_models: Loading dummy calibration
world - [WARNING] launchables.world: Process started.
Running  PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
...
eye0 - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
eye0 - [INFO] camera_models: No user calibration found for camera Ghost capture at resolution [320, 240]
eye0 - [INFO] camera_models: No pre-recorded calibration available
eye0 - [WARNING] camera_models: Loading dummy calibration
Running  PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
...

A few tries ago there was also an error about a missing PowerShell version 5. It's not showing up anymore, but maybe this helps?!

user-82e7ab 30 July, 2018, 09:06:41

the powershell error showed up again:

powershell.exe -version 5 -Command "Remove-Item C:\Users\show\Downloads\pupil_v1.7-42-7ce62c8_windows_x64\pupil_capture_windows_x64_v1.7-42-7ce62c8\win_drv -recurse -Force;Start-Process PupilDrvInst.exe -Wait -WorkingDirectory \"C:\Users\show\Downloads\pupil_v1.7-42-7ce62c8_windows_x64\pupil_capture_windows_x64_v1.7-42-7ce62c8\"  -ArgumentList '--vid 3141 --pid 25771 --desc \"Pupil Cam2 ID0\" --vendor \"Pupil Labs\" --inst' -Verb runas;"
world - [WARNING] video_capture.uvc_backend: Updating drivers, please wait...
Die Windows PowerShell-Version 5 kann nicht gestartet werden, da sie nicht installiert ist. [German: "Windows PowerShell version 5 cannot be started because it is not installed."]
world - [WARNING] video_capture.uvc_backend: Done updating drivers!
world - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
world - [INFO] camera_models: No user calibration found for camera Ghost capture at resolution [1280, 720]
world - [INFO] camera_models: No pre-recorded calibration available
world - [WARNING] camera_models: Loading dummy calibration
world - [WARNING] launchables.world: Process started.
Running  PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
...

user-128c78 30 July, 2018, 09:29:25

Hi, thank you so much. Do you know the Starburst algorithm? What would be your approach to finding the start point if I use this algorithm?

user-82e7ab 30 July, 2018, 10:41:36

You seem to require PowerShell version 5. The Windows 8.1 instance I have to use here only had version 4 installed. The resolution was to install Windows Management Framework 5.1 (https://www.microsoft.com/en-us/download/details.aspx?id=54616).

wrp 30 July, 2018, 11:27:10

Thanks @user-82e7ab for following up on this. Please note that we only support Windows 10 - but nice to see your workaround.

user-82e7ab 30 July, 2018, 11:48:07

you do not support Windows 10?

user-0f7b55 30 July, 2018, 11:48:26

@user-82e7ab We support ONLY Windows 10

user-82e7ab 30 July, 2018, 11:48:40

😃

user-0f7b55 30 July, 2018, 11:48:49

@user-82e7ab He meant to say we don't support Windows 8

user-82e7ab 30 July, 2018, 11:48:59

I thought so 😉

user-82e7ab 30 July, 2018, 11:58:27

The topic of zmq-transmitted gaze data (the first part of a multipart message received after subscribing to gaze) changed from gaze to gaze.3d.{0,1,01}. It now matches the subject contained in the second part of the message, so I see that this makes more sense, but it took me some time to find this change. Maybe you could add a short note on this to the v1.8 release notes? Unless there's some other kind of changelog I missed.

papr 30 July, 2018, 12:17:37

@user-82e7ab This is an implicit outcome of what has been described under Developers notes > API changes in the release notes
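
A minimal subscriber sketch for context (assuming Pupil Remote on its default port 50020): a zmq prefix subscription to "gaze" matches both the old flat topic and the new gaze.3d.0. / gaze.3d.1. / gaze.3d.01. topics, so the same code works across the change.

import msgpack
import zmq

ctx = zmq.Context()
# Ask Pupil Remote for the port on which the IPC backbone publishes.
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
# Prefix match: "gaze" catches the old topic as well as gaze.3d.* topics.
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze")

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.loads(payload, raw=False)
    print(topic.decode(), datum["norm_pos"])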

user-82e7ab 30 July, 2018, 12:25:54

Ah, I see: the changes to zmq_tools.Msg_Streamer.send(). Anyway, I like the change, as it makes the data more consistent. Also, the tracking is much more stable (from first short test runs). Nice work, thanks!

user-82e7ab 30 July, 2018, 12:26:43

Btw, is it possible to stream gaze data from Pupil Player?

papr 30 July, 2018, 12:27:33

No, streaming from player is not possible

user-82e7ab 30 July, 2018, 12:34:11

OK, maybe you could consider this for a future release? It may be a personal preference, but I like the workflow of capturing some exemplary movements and then being able to replay them in a loop while developing. Yes, it's only a convenience feature, but a nice one IMHO: no need to put an HMD on and off every time I test a small code change, and (especially in days like these) the cameras don't need to run and won't heat up as much.

papr 30 July, 2018, 12:53:33

If it is just for testing, you can use the Video File Source backend in the eye windows. Select the backend and drag&drop the eye videos onto the eye windows.

papr 30 July, 2018, 12:54:24

Unfortunately, there is no way to sync the video playback, but the frames are published using the recorded timestamps.

user-82e7ab 30 July, 2018, 13:09:27

ah that would be enough for my case .. thx ; )

user-2c0e1f 30 July, 2018, 13:18:22

@papr Hello! What is the average frame duration when recording in 200 Hz mode?

papr 30 July, 2018, 13:20:09

200 Hz results in an average of about 0.005 seconds (1/200) per frame.

user-988d86 30 July, 2018, 21:59:08

I'm looking at the blink detection plugin's offline detector and am trying to understand everything that is happening within it. Can anyone explain what the filter/filter response is and how it is used to determine the blink confidence? I'm combing through all of the code, but I'm having a hard time understanding this part.

papr 30 July, 2018, 22:08:23

@user-988d86 the whole idea is based on binocular confidence drops during blinks. We find them by applying a step filter to the confidence signal. The result shows us when the confidence dropped and spiked. The time between these events is the blink.

papr 30 July, 2018, 22:09:28

The online blink detector applies the filter on a moving window, while the offline detector applies it to the whole recording.
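
An illustrative sketch of that step-filter idea (the kernel length and sign convention here are arbitrary choices, not the plugin's actual values):

import numpy as np
from scipy.signal import fftconvolve

def blink_filter_response(confidence: np.ndarray, filter_len: int = 20) -> np.ndarray:
    # +1 over the first half, -1 over the second half: a step detector.
    kernel = np.ones(filter_len)
    kernel[filter_len // 2:] = -1
    # Strongly negative responses mark confidence drops (blink onsets),
    # strongly positive responses mark the recovery (blink offsets).
    return fftconvolve(confidence, kernel, mode="same") / filter_len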

user-cd9cff 30 July, 2018, 23:30:24

@papr I heard that there was a problem with model variation when it came to collecting gaze data. Does that problem still exist? Will the model of the eye change from time to time when querying gaze data?

user-cd9cff 30 July, 2018, 23:31:10

If so, then what is the best way to query the position of an eye? I am using norm_pos, but it is not precise enough

user-d9bb5a 31 July, 2018, 07:19:17

Hello everyone! My name is Yulia! Friends, I am looking for people who also work with this data; I need a little help.

user-128c78 31 July, 2018, 07:56:09

Hi everybody, I have a question again 😦 In eye tracking, the pupil center is jumping a lot and isn't a fixed point. What should I do?

papr 31 July, 2018, 08:00:57

@user-cd9cff Yes, the 3d model can change if new observations do not fit the old model anymore. This is not a problem but a feature to improve accuracy over time. The 3d model only has 3 parameters: the x/y/z coordinates of the eye sphere center. This is what we call the position of the eye. norm_pos is not the eye position but the normalized 2d gaze location within the scene camera video frame. Gaze estimation has an expected error of 0.5-1.0 degrees. As a reference: 1 degree corresponds to the width of your thumb if you stretch out your arm and hold your thumb up.
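
Reusing the subscriber sketch from earlier in this log, but subscribed to the "pupil" prefix instead of "gaze", the eye position can be read from each datum (the field names are assumed from the 3d detector's output):

topic, payload = sub.recv_multipart()
datum = msgpack.loads(payload, raw=False)
# The fitted eye-model sphere; its center is the eye position described above.
center = datum.get("sphere", {}).get("center")  # assumed [x, y, z] in eye camera space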

papr 31 July, 2018, 08:02:26

@user-128c78 Could you provide an example recording and send it to [email removed]? Please record the calibration procedure as well.

user-d79ff5 31 July, 2018, 08:48:53

@papr Do you plan to release a new version with the fix?

papr 31 July, 2018, 09:02:01

@user-d79ff5 Which fix?

papr 31 July, 2018, 09:04:16

Ah, I remember. As I said, the fix is released on GitHub if you want to run from source. I will have to check with my colleagues about the bundled app release, though. Thanks for reminding me.

papr 31 July, 2018, 09:19:19

@mpk will try and release a new bundle that includes the fix today

user-82e7ab 31 July, 2018, 11:31:31

@papr I tried to use recorded videos as a source instead of local USB and it worked, thanks again. Still, maybe you could add an option to loop a video file source in a future release?

mpk 31 July, 2018, 11:33:32

@user-82e7ab there is a loop option in the video file backend UI. Check the sidebar 😃

user-82e7ab 31 July, 2018, 11:40:32

😱

user-82e7ab 31 July, 2018, 11:41:09

sorry 😔

mpk 31 July, 2018, 13:57:34

@user-d79ff5 @papr I just uploaded bundles for Linux and Mac with the fix.

mpk 31 July, 2018, 13:57:38

Windows will follow in the next 24hrs.

user-cd9cff 31 July, 2018, 15:46:37

@papr So, if I want to track the eye position so that it does not stray from a dot in the center of the screen, would it be better to use gaze_point, gaze_normals, or something else?

papr 31 July, 2018, 15:57:02

@user-cd9cff norm_pos is the 2d position derived from gaze_point

papr 31 July, 2018, 15:58:07

If you simply want to visualize the gaze within the recorded scene video, norm_pos is the way to go.
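
A small sketch of that visualization step: Pupil's normalized coordinates have their origin at the bottom left, while image pixel coordinates start at the top left, hence the vertical flip.

def denormalize(norm_pos, frame_width, frame_height):
    # Map a (0-1, 0-1) norm_pos onto pixel coordinates of the scene frame.
    x, y = norm_pos
    return int(x * frame_width), int((1.0 - y) * frame_height)

# e.g. denormalize((0.5, 0.5), 1280, 720) -> (640, 360), the frame center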

wrp 31 July, 2018, 17:00:44

v1.8-22 bundles have been uploaded and are available via the v1.8 release page - https://github.com/pupil-labs/pupil/releases/latest

End of July archive