πŸ‘ core


user-128c78 01 August, 2018, 08:39:46

@papr You can read this file, 2.7 section https://pdfs.semanticscholar.org/4167/7844556582adc68a5a14dbb1cea0b28d9016.pdf

What should I do for reducing the noise?

papr 01 August, 2018, 08:46:48

@user-128c78 Did you try applying the "second order Butterworth high-pass filter" as suggested in this section?
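
For reference, applying a second-order Butterworth high-pass filter with SciPy might look like this (a sketch; the cutoff, sampling rate, and synthetic signal are placeholders, not values from the paper):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def highpass(signal, cutoff_hz, fs_hz, order=2):
    # Design a second-order Butterworth high-pass filter and apply it
    # forward-backward (zero phase lag) to the pupil signal.
    b, a = butter(order, cutoff_hz / (fs_hz / 2.0), btype="highpass")
    return filtfilt(b, a, signal)

# Example: remove slow drift from a synthetic 120 Hz pupil trace.
np.random.seed(0)
fs = 120.0
t = np.arange(0.0, 5.0, 1.0 / fs)
drift = 0.5 * t                          # slow baseline drift
noise = 0.01 * np.random.randn(t.size)   # measurement noise
filtered = highpass(drift + noise, cutoff_hz=1.0, fs_hz=fs)
```

The zero-phase `filtfilt` keeps the filtered trace aligned with the original timestamps, which matters when relating pupil size to events.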

user-8779ef 01 August, 2018, 11:59:18

Hey guys - I'm going to add a feature request. I believe you are implementing the manual offset at the wrong time in the pipeline, which unnecessarily requires that we remap gaze vectors between adjustments. It would be much better if it were implemented at the time of drawing the gaze vector with a callback to redraw following an adjustment. The offset would also have to be evaluated at the time of data export.

user-8779ef 01 August, 2018, 11:59:58

The issue is that, for longer videos, an exploratory adjustment must be followed by 5+ minutes of reprocessing before you can see the results. Very inefficient.

papr 01 August, 2018, 12:10:42

Hey @user-8779ef Thank you for the input. I agree that this is a problem. Unfortunately, it is not that simple, since "higher order" plugins (e.g. fixation detection, but as mentioned all drawing and export plugins) would have to evaluate the callback all the time, even if the gaze data did not change. Not very efficient either.

The actual problem is that manual gaze correction (and Vis Scan Path) require manipulation of previously calculated data. This breaks our processing paradigm that does not support data manipulation. -- That's also the reason why these plugins are currently disabled.

user-8779ef 01 August, 2018, 12:13:17

Yes, I thought that might be a problem.

papr 01 August, 2018, 12:13:55

We still need to think of a good, general solution to this problem. A mapping callback (chain) is definitely a possible solution. The question is how to avoid redundant mapping calls.

user-8779ef 01 August, 2018, 12:14:19

I have considered suggesting that you guys abandon the approach that all plugins auto-recalculate via callback upon a change to the data. Instead, you might have each plugin have a very clear indication that its data is out of date.

user-8779ef 01 August, 2018, 12:14:31

For example, a red glow on the icon in the tray.

user-8779ef 01 August, 2018, 12:14:44

...also, provide a single button to update all out-of-date plugins.

user-8779ef 01 August, 2018, 12:14:58

(rather than force the auto update upon each change to the data)

user-8779ef 01 August, 2018, 12:15:45

Again, this would help avoid the issue of unwanted and time consuming callbacks, e.g. fixation detection.

user-8779ef 01 August, 2018, 12:16:35

...also, perhaps provide a warning on data export that some plugins are out of date.
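
The suggested pattern, sketched very roughly (purely illustrative "dirty flag" code, not Pupil's actual plugin API):

```python
class Plugin:
    # Illustrative sketch of the suggested out-of-date flag pattern.
    def __init__(self):
        self.out_of_date = False  # would drive e.g. a red glow on the tray icon

    def on_data_changed(self):
        # Instead of recalculating immediately, only mark the plugin stale.
        self.out_of_date = True

    def recalculate(self):
        # Expensive work (e.g. fixation detection) happens only on demand.
        self.out_of_date = False

def update_all(plugins):
    # The single "update all out-of-date plugins" button.
    for plugin in plugins:
        if plugin.out_of_date:
            plugin.recalculate()
```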

papr 01 August, 2018, 12:18:11

I will talk to my colleague who is responsible for UX and ask him what he thinks about this approach.

user-8779ef 01 August, 2018, 12:18:22

It would be a pretty big change 😃

papr 01 August, 2018, 12:18:55

Yeah, that's why I want there to be a clear UX concept first before considering further steps 😉

user-8779ef 01 August, 2018, 12:19:15

Most definitely. I haven't yet floated it to the folks over here, but I'll see what they think. Maybe they'll shoot it down for some reason I can't think of.

papr 01 August, 2018, 12:21:22

My biggest concern is that this approach would not be intuitive. The software knows that data is out of date but does not update automatically? Not something I would want. I think a clear processing chain as in Capture would be nice. We will see...

user-3f0708 01 August, 2018, 14:31:50

Has anyone used the .csv tracking data for graphing to validate data from multiple users?

user-ea779f 01 August, 2018, 14:50:26

We just got our first Pupil Labs HTC add-on and I have some questions. When I start Pupil Capture I can clearly see both camera inputs and can activate them both. Both of my eyes are displayed, but one of them has the wrong orientation: it is displayed upside down. Maybe that's the reason why I can't calibrate in the HMD eyes demo. I also noticed that the eye tracking add-on is getting pretty hot. Can anyone help me?

user-ea779f 01 August, 2018, 15:09:29

Okay i managed to flip the image

user-ea779f 01 August, 2018, 15:14:43

After starting Pupil Service I detect eye 0 and eye 1. Then I start Test Build 3D VR and it searches for a running Pupil Service. After that it restarts the detection of eye0 and eye1 in Pupil Service, but in the end no eye is detected.

user-ea779f 01 August, 2018, 15:15:33

Then I need to restart with default settings.

papr 01 August, 2018, 19:35:02

@user-ea779f the flipped image results from the physically flipped camera. This does not affect pupil detection.

user-988d86 01 August, 2018, 19:38:21

Does anyone know why Pupil Capture would create a new 'recordings' directory out of nowhere? I just was debugging an application I built, and all it was supposed to do was connect to the backbone, listen for messages, and disconnect. I ran it (using video source in the eye windows) and it worked just fine. I didn't change anything, ran it again, and my recordings file was empty. The log says: world - [INFO] recorder: Created standard Rec dir at "/Users/katie/recordings"

Thankfully all of my videos are backed up, but this is rather frustrating.

papr 01 August, 2018, 19:41:06

@user-988d86 this is the default recording directory. What directory were you expecting? How did you set it? Manually or via notification?

user-988d86 01 August, 2018, 19:42:06

That is the recording directory I've been using, where my files were saved. It seems to have just recreated it, therefore wiping all of my previously recorded files from it.

papr 01 August, 2018, 19:44:55

Now I understand the issue. I have never encountered it before. Are you able to reproduce it?

user-988d86 01 August, 2018, 19:45:42

No because I don't have any recordings on my local machine now, and don't have any working Pupil cameras to record more video.

papr 01 August, 2018, 19:50:05

Did you rename the original recording folder by accident? Just making sure, because the code would need to actually delete old files, and I cannot remember this type of code in the recorder... 🤔 I will have a second look at it tomorrow though, to make sure that I am not wrong.

user-988d86 01 August, 2018, 19:58:26

Nope. I didn't make any changes to anything, including my code. I simply deleted my csv output file (which I've done many, many times) and re-ran my program.

user-988d86 01 August, 2018, 19:59:20

My colleague was experiencing similar problems on his machine, but I don't know details from what his problems were, aside from that he seemingly wasn't making any changes to Pupil related code, or to the Pupil application, or to the Pupil recordings folder.

wrp 01 August, 2018, 20:03:43

Hi @user-988d86 we will look into this issue tomorrow to see if we can replicate.

user-988d86 01 August, 2018, 20:05:17

@wrp thanks much. For context, I'm running v1.7. Based on the release notes for v1.8, it's not feasible for me to upgrade.

papr 01 August, 2018, 20:07:07

@user-988d86 based on the path above, am I correct to assume that you use macOS?

user-988d86 01 August, 2018, 20:07:41

correct

papr 01 August, 2018, 20:08:24

Just out of curiosity, what do you mean by not feasible to upgrade?

user-988d86 01 August, 2018, 20:11:59

The changes in the developer notes about API changes and fixation format changes suggest that we would have to put some extra work into our applications in response. We just don't have the time to devote to that right now.

papr 01 August, 2018, 20:12:38

OK, thank you for your feedback.

papr 01 August, 2018, 20:13:11

As wrp said, we will look into your issue tomorrow.

user-988d86 01 August, 2018, 20:13:20

Thanks!

user-5ccd98 01 August, 2018, 20:41:41

Hi, I just got my Pupil headset. When I connected it to a MacBook Pro (High Sierra 10.13.6), Pupil Capture showed "Capture initialization failed". I am wondering if anything else needs to be done besides plugging it into the laptop (like pressing a button?), and how to check whether the device is connected to the laptop.

user-988d86 01 August, 2018, 20:43:14

@user-5ccd98 any time we have that problem, we just unplug/replug into the USB port and it seems to fix it. I'm sure you've already tried that, though.

user-5ccd98 01 August, 2018, 20:44:51

Is there a way to check if the headset is connected?

user-5ccd98 01 August, 2018, 20:45:32

Chat image

papr 01 August, 2018, 20:47:08

Click on the uvc manager icon on the right and check if the drop down menu lists any Pupil Cam cameras.

papr 01 August, 2018, 20:47:37

Make sure that the cable is correctly connected to the hub of the headset.

user-5ccd98 01 August, 2018, 20:50:14

Chat image

user-5ccd98 01 August, 2018, 20:50:27

I guess this means not connected?

wrp 01 August, 2018, 20:51:09

@user-5ccd98 please could you try to firmly press the USBC cable into the clip of the Pupil headset - it may require a bit more force to ensure it is connected

user-5ccd98 01 August, 2018, 20:51:40

Oh!!!!!!!

user-5ccd98 01 August, 2018, 20:52:08

Got it!! Thank you so much!

papr 01 August, 2018, 20:52:41

Sorry for not being clear. Clip is much clearer than using the word hub.

wrp 01 August, 2018, 20:53:27

You're welcome @user-5ccd98 pleased that this could be resolved 😸

user-5ccd98 01 August, 2018, 20:58:59

One more quick question. I also want to record data using the Pupil Mobile app, but I only have an Android phone with micro-USB (not Type-C). I am wondering if the Pupil Mobile Android app is going to work with this phone using a Type-C to micro-USB cable.

papr 01 August, 2018, 21:00:20

This will unfortunately not work.

user-5ccd98 01 August, 2018, 21:01:06

So, I need to find an Android phone with Type-C, right?

papr 01 August, 2018, 21:01:31

See the official Pupil Mobile repository for a list of phones that are known to be working.

user-5ccd98 01 August, 2018, 21:01:32

Any other requirement?

user-5ccd98 01 August, 2018, 21:01:41

Cool.

user-5ccd98 01 August, 2018, 21:01:45

Thanks.

papr 01 August, 2018, 21:02:34

https://github.com/pupil-labs/pupil-mobile-app

user-5ccd98 01 August, 2018, 21:05:19

That's super helpful. Thanks papr.

user-5ccd98 02 August, 2018, 03:14:46

Hi, sorry to bother you again. I came across another issue. When I dragged my recording folder into Pupil Player, the software quit, reopened, and quit again. But if I drag the sample data, it works quite well. And I could not open the .mp4 file in my recording folder. Am I missing a setting?

papr 02 August, 2018, 04:35:39

@user-5ccd98 do you use Windows? There is a bug related to importing Pupil Mobile recordings on Windows whose fix is currently in review.

papr 02 August, 2018, 11:52:36

@user-988d86 ok, I just checked and tested again: our code does not overwrite the recording folder. Did you by any chance run once from source and once from bundle? The default recording location is different for each of them!

  • /Users/user/recordings is the default bundle recording location
  • <source code location>/recordings is the default source recording location

user-8779ef 02 August, 2018, 11:55:31

Hey @papr, you spoke to one of my students yesterday (Rudra8), about coordinate transformations in pupil.

user-8779ef 02 August, 2018, 11:55:46

Let me tell you why he's asking all these questions, because he still has a bit of confusion.

papr 02 August, 2018, 11:56:35

Yes, I wrote him today. I reimplemented the angle calculation in Python: https://gist.github.com/papr/b0a59dc39d4b6d0ab773dc46eeff9773

The results are as expected. I think the matlab angle calculation implementation is wrong

user-8779ef 02 August, 2018, 11:56:43

You're the best.

user-8779ef 02 August, 2018, 11:57:34

I'll check in with him. Let me give you some quick context: he's running a simple test - having someone look at a target at eye-height, on the opposite side of the room. In this case, one would predict that the eye normals, after transformation into world space, would be approximately parallel (with the greatest magnitude along the Z axis).

user-8779ef 02 August, 2018, 11:57:51

However, he's not finding that, and I think the poor guy is losing sleep over it.

user-8779ef 02 August, 2018, 11:58:13

World space = head space.

papr 02 August, 2018, 12:00:27

Yeah, I totally understood the issue, and this assumption is correct. In practice it is very difficult to reach parallel gaze normals, though. But the angles are <5 degrees in these cases.

user-8779ef 02 August, 2018, 12:01:02

Unfortunately, his vectors are apparently much further apart than that. He knows the math - he's very sharp.

user-8779ef 02 August, 2018, 12:01:22

I'll have him try the same calculation in Python, or using different matlab functions.

papr 02 August, 2018, 12:01:51

Yeah, I suggested to use pdist with the cosine metric instead of the custom angle calculation
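
As a rough sketch of that suggestion (the gaze vectors below are placeholders; real values would come from the gaze_normal_3d export columns), the cosine-metric check could look like:

```python
import numpy as np
from scipy.spatial.distance import pdist

# Placeholder data: two nearly parallel unit gaze normals in world/head space.
gaze_normals = np.array([
    [0.02, 0.01, 0.999],
    [0.03, -0.01, 0.999],
])
gaze_normals /= np.linalg.norm(gaze_normals, axis=1, keepdims=True)

# pdist's cosine metric returns 1 - cos(angle); convert back to degrees.
cosine_dist = pdist(gaze_normals, metric="cosine")
angles_deg = np.degrees(np.arccos(1.0 - cosine_dist))
```

For roughly parallel normals the resulting angle should come out well under 5 degrees, matching the expectation discussed above.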

user-8779ef 02 August, 2018, 12:02:04

Thanks for that. Ok, lets see what he does.

user-8779ef 02 August, 2018, 12:02:10

or, how that approach works out.

user-8779ef 02 August, 2018, 12:04:04

A while back you told me you were going to try debugging in PyCharm on OSX again (or something like that). Any luck?

user-8779ef 02 August, 2018, 12:04:20

I need to venture back into the world of plugin development (refining my inter-saccadic interval detector)

papr 02 August, 2018, 12:05:50

I am using Visual Studio Code. I talked to one of the VSCode Python extension core developers at EuroPython last week. Multi-processing debugging is disabled right now but will come after they release the stable version of their new debugger.

user-8779ef 02 August, 2018, 12:06:23

Multi-processing debugging is disabled in VSCode, or in PyCharm?

papr 02 August, 2018, 12:06:35

in VSCode

papr 02 August, 2018, 12:06:41

I don't use PyCharm, but my colleagues do without problems.

user-8779ef 02 August, 2018, 12:07:09

I don't get that. My breakpoints within a plugin do not halt the program. It blows right through them.

user-8779ef 02 August, 2018, 12:07:20

shrug I must be missing something.

papr 02 August, 2018, 12:08:05

You do run the app from within PyCharm, correct?

user-8779ef 02 August, 2018, 12:08:23

Yes. I also have "attach debugger to process" enabled.

user-8779ef 02 August, 2018, 12:08:47

Chat image

papr 02 August, 2018, 12:08:51

My colleague tells me that there are two modes to run a script in PyCharm: normal and debug. You need to run the script in debug mode in order to halt at the breakpoints.

user-8779ef 02 August, 2018, 12:09:06

Yes. It used to work just fine. Somewhere along the line that functionality stopped.

user-8779ef 02 August, 2018, 12:09:34

...I should grab another package with subprocesses to see if it's Pupil-specific.

papr 02 August, 2018, 12:10:35

Yeah, good idea

user-d90de9 02 August, 2018, 12:30:47

Hi there, I'm having trouble running the HTC Vive test build. Each time the connection between the test build and Pupil Capture (or Pupil Service) is established, the eye cameras crash. After that I am not able to restart the cameras in the general settings. Nevertheless, the demo scene performs the calibration and fails in the end. The following error is shown in the Pupil Capture console:

eye0 - [ERROR] launchables.eye: Process Eye0 crashed with trace:

Traceback (most recent call last):
  File "launchables\eye.py", line 555, in eye
  File "shared_modules\zmq_tools.py", line 144, in send
  File "msgpack\__init__.py", line 47, in packb
  File "msgpack\_packer.pyx", line 284, in msgpack._packer.Packer.pack
  File "msgpack\_packer.pyx", line 290, in msgpack._packer.Packer.pack
  File "msgpack\_packer.pyx", line 287, in msgpack._packer.Packer.pack
  File "msgpack\_packer.pyx", line 234, in msgpack._packer.Packer._pack
  File "msgpack\_packer.pyx", line 263, in msgpack._packer.Packer._pack
  File "msgpack\_packer.pyx", line 281, in msgpack._packer.Packer._pack
TypeError: can't serialize <MemoryView of 'array' at 0x225faf41be0>

Same for the second eye.

I am running the latest builds I found, pupil_capture_windows_x64_v1.8-22-gdcb17d1 and Test.Build.3D.VR v0.5.1, on Windows 10. The same problems occurred with earlier builds I tested (v1.8-16 and v0.5). I have also reinstalled the drivers without success.

Any help would be appreciated.

papr 02 August, 2018, 12:32:19

This is the same error as above. I fixed it already. The fix will be released this afternoon.

papr 02 August, 2018, 12:34:03

By above, I mean as reported in 💻 software-dev

user-d90de9 02 August, 2018, 12:34:53

Ok great, thank you

user-5ccd98 02 August, 2018, 13:29:52

Hi papr, I am not using Windows. I use a MacBook Pro (High Sierra 10.13.6). And I just used Capture to record, not Pupil Mobile.

user-ef565b 02 August, 2018, 14:57:27

Hello. We are trying to run our Pupil Labs eyetracker through a Python script in OpenSesame. We were going to use the remote annotation example code that was on GitHub as a starting point, but the links we found in the Pupil Docs and online all try to take us to a page that doesn't seem to exist anymore on GitHub. Is there a new location for the annotation code? Thank you!

papr 02 August, 2018, 15:00:01

@user-ef565b the script is still in the same repository. We moved the scripts into the python folder. Also, we will have to update the links in the docs.

user-ef565b 02 August, 2018, 15:01:18

Perfect! Thanks!

user-988d86 02 August, 2018, 15:04:17

@papr Nope, I haven't run from source in many months. I was simply running Pupil Capture 1.7 as I normally do.

papr 02 August, 2018, 15:05:42

@user-988d86 You mentioned that you have a script that interacts with Capture over the IPC? Do you start and stop recordings with it?

user-988d86 02 August, 2018, 15:06:43

@papr It has that ability, but I was not utilizing it at the time. All I was doing was subscribing to notifications and processing the payloads as they come in.

user-988d86 02 August, 2018, 15:09:05

@papr I also know that my colleague, who experienced a similar problem, was not running from source and was not recording eye video.

user-ef565b 02 August, 2018, 15:39:25

Hello again. We're running into a new problem. We found the annotations.py script on GitHub and are attempting to use it, but the pyglui package it uses isn't working correctly for some reason. We forked and cloned pyglui from GitHub, installed glew, installed the pyglui package itself using pip, and ran the setup.py script to build it, but the annotations script is still throwing errors. It looks like the ui part of the package is missing and the other modules are also not building correctly. Have we missed something? (We are running everything on a Windows 10 machine with Python version 3.7). Thank you!

mpk 02 August, 2018, 15:54:04

@user-ef565b if you run pupil capture from bundle you will only need msgpack and pyzmq to run the remote annotations script: https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py
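
A minimal sketch of what such a remote-annotation script does (the payload fields and two-frame message layout follow the helper script linked above; treat the details as assumptions and refer to that script for the authoritative version):

```python
import msgpack

def make_annotation(label, timestamp, duration=0.0, **extra):
    # Minimal annotation payload as expected by Capture's annotation
    # plugin; extra keyword fields are stored alongside the annotation.
    payload = {"topic": "annotation", "label": label,
               "timestamp": timestamp, "duration": duration}
    payload.update(extra)
    return payload

def send_annotation(pub_socket, annotation):
    # Annotations travel as a two-frame zmq message: the topic string,
    # then the msgpack-serialized body (pub_socket is a connected zmq.PUB).
    pub_socket.send_multipart([
        annotation["topic"].encode("utf-8"),
        msgpack.packb(annotation, use_bin_type=True),
    ])
```

Against a running Capture you would first query Pupil Remote (by default on tcp://127.0.0.1:50020) for its PUB_PORT, connect a zmq.PUB socket there, and take timestamps from Pupil's clock so the annotations line up with the recording.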

mpk 02 August, 2018, 15:54:12

or am I misunderstanding something?

user-ef565b 02 August, 2018, 15:59:37

This is actually a different script than the one we were trying to use. We must have missed it on GitHub. We've already installed pyzmq and can get msgpack. Just as a clarification, what do you mean by running Capture from bundle? (Sorry, we were using another eyetracker that worked with PyGaze through OpenSesame and are new to this method).

mpk 02 August, 2018, 18:41:34

@user-d90de9 the release for linux and mac has been uploaded now.

user-988d86 02 August, 2018, 19:25:47

@papr update: I just found all of the contents of my recordings folder within the directory that my script was in, in a directory called 'pupil.' Again, I was running Pupil Capture and my script completely as normal. I did not have any functions within my script to move files, nothing. Overall a mystery! ha

wrp 02 August, 2018, 19:26:27

@user-988d86 can you make a gist with your script and share the link with @papr?

papr 02 August, 2018, 19:30:13

@user-988d86 @wrp yes, I would like to see that :)

user-cc04d9 02 August, 2018, 20:33:12

Hey everyone!! Just got the headset in the mail. Would really appreciate if one of you could walk me through the Pupil application setup process

papr 02 August, 2018, 20:58:36

@user-cc04d9 https://docs.pupil-labs.com/#getting-started

user-cc04d9 02 August, 2018, 20:59:05

Thank you!!

user-ef565b 02 August, 2018, 21:05:08

Quick question. We've gotten the notification code working, but we're unclear on exactly how Pupil is storing/incorporating the notifications. Which file do they go into? Thank you again!

user-ef565b 02 August, 2018, 21:06:12

*annotation code. Our apologies.

papr 02 August, 2018, 21:10:24

@user-ef565b They are stored in the notifications.pldata file (on version 1.8). They should show up in Player if you open the recording and load the Player Annotation plugin
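
For the curious, such a .pldata file can be read back with msgpack. The sketch below assumes a stream of (topic, packed-payload) pairs, a layout modeled on Pupil's file_methods module (an assumption here); prefer Pupil's own loader where possible:

```python
import os
import tempfile
import msgpack

def write_pldata(path, data):
    # Append each datum as a (topic, msgpack-serialized payload) pair --
    # assumed to mirror how Capture streams data to .pldata files.
    packer = msgpack.Packer(use_bin_type=True)
    with open(path, "wb") as f:
        for datum in data:
            payload = msgpack.packb(datum, use_bin_type=True)
            f.write(packer.pack((datum["topic"], payload)))

def read_pldata(path):
    # Stream the pairs back and deserialize each payload.
    with open(path, "rb") as f:
        unpacker = msgpack.Unpacker(f, raw=False)
        return [msgpack.unpackb(payload, raw=False)
                for _topic, payload in unpacker]

# Round-trip demo with a throwaway file:
demo = [{"topic": "annotation", "label": "trial_start", "timestamp": 12.5}]
path = os.path.join(tempfile.mkdtemp(), "notification.pldata")
write_pldata(path, demo)
recovered = read_pldata(path)
```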

user-810714 02 August, 2018, 22:00:14

Hello, is there a way of generating a heat map?

user-810714 02 August, 2018, 22:31:13

How do I open a video in Imotions?

wrp 03 August, 2018, 06:06:24

@user-810714 Heat maps - You can generate heat maps if you use markers within your scene. Please see this section of the docs: https://docs.pupil-labs.com/#surface-tracking

wrp 03 August, 2018, 06:08:23

@user-810714 You can use the iMotions Exporter plugin in Pupil Player to export data to a format that can be imported by iMotions software.

user-14d189 03 August, 2018, 11:29:53

Hi there ,

user-14d189 03 August, 2018, 11:37:17

Just a question about some 2D recordings: can you get 3D gaze data in later processing? I set pupil producers to offline pupil detection, and if I set gaze producers to offline calibration then this error comes up:

user-14d189 03 August, 2018, 11:37:19

circle_detector - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend
circle_detector - [ERROR] libav.mjpeg: unable to decode APP fields: Invalid data found when processing input
circle_detector - [INFO] camera_models: Previously recorded calibration found and loaded!
circle_detector - [INFO] launchables.marker_detectors: Starting calibration marker detection...
player - [INFO] gaze_producers: Calibrating section 1 (Unnamed section) in 2d mode...
Not enough ref point or pupil data available for calibration.
Calibration failed: Not enough ref point or pupil data available for calibration.
player - [INFO] gaze_producers: Cached offline calibration data to C:\Users\p.wagner\Documents\phd\pupillab recordings\015\offline_data\offline_calibration_gaze

user-14d189 03 August, 2018, 11:37:53

Thanks for looking into it.

user-810714 03 August, 2018, 14:33:36

@wrp Thank you very much. Already set with the imotions exporter. Great plugin.

user-ef565b 03 August, 2018, 21:40:14

Any ideas why the annotations plugin would work perfectly when running in a script through Jupyter Notebook, but not through the exact same code run through Python in OpenSesame? The inline Python script will start and stop the recording with no problems, but none of the annotation commands work through it.

user-ed8563 05 August, 2018, 00:31:30

Hi there, I'm new to the marvelous Pupil software! A question: I need to measure the pupil diameter, and that's it! No world camera, gaze, or other stuff; nor do I need to store video/frames, just the pupil diameter values. Could you suggest the best way to get rid of all unnecessary features in order to have a simpler and more performant environment? Thanks really much!!!!

user-ed8563 05 August, 2018, 00:44:14

Just a few words about my project: I'm interfacing a DIY camera mounted in a Samsung GearVR; the aim is not to use gaze to interact with the VR, but just to measure the pupil diameter during a neuroscience/psychophysics test. I plan to use an S7 phone with GearVR to show the stimuli while the eye camera is connected via USB to a MacBook Pro; then I have to find a way to collect data from the VR and from Pupil Capture/Service and import it all into Matlab for analysis... It would be the top to be able to refactor the Eye module to be executed directly on the S7 (I'm currently planning to work with Unity3D, which communicates with Pupil Capture over the ZMQ protocol)... Any piece of advice or suggestion would be really precious!!!!!

user-8779ef 05 August, 2018, 00:46:57

@papr Papr, can you point me towards the method used to visualize fixations in the exported video (via video export launcher)?

user-4b5226 05 August, 2018, 05:21:42

Has anyone installed Machine Learning components (AI) towards their algorithms based on Pupil Lab’s data ?

user-4b5226 05 August, 2018, 05:23:54

For better words, towards a specific goal for their purposes ?

user-8779ef 05 August, 2018, 19:11:30

@papr Ah, so the exporter just calls each module's recent_events()? Thanks.

user-8779ef 05 August, 2018, 19:17:15

No, that doesn't seem to be it. My visualization works in Pupil Player, but not on export. So I'm still missing a piece of the puzzle.

user-8779ef 05 August, 2018, 19:20:57

I have updated my offline saccade and inter-saccadic-interval detector to be much more efficient, and to run as a background function.

user-8779ef 05 August, 2018, 19:21:42

...but I'm trying to compare the ISI detector to your fixation detector by exporting the same video with fixation visualization / ISI visualization.

user-bbb364 05 August, 2018, 19:29:18

hi!

user-bbb364 05 August, 2018, 19:29:37

does this software show where I'm looking on the screen? I have both Python and C++ experience

user-bbb364 05 August, 2018, 19:29:46

if not, is there any way this can be achieved

user-bbb364 05 August, 2018, 19:30:12

it doesn't need to be realtime, I can just overlay the video on top of the eye tracking data

papr 05 August, 2018, 20:51:15

@user-8779ef mmh, you are right that this should work. I will have a look at it tomorrow

papr 05 August, 2018, 20:51:54

@user-bbb364 yes, this works if you use our surface tracking and define the screen as a surface

papr 05 August, 2018, 20:52:07

Realtime as well as offline
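
Once the surface tracker reports gaze in surface-normalized coordinates, mapping it to screen pixels is a one-liner. A sketch (the screen resolution is a placeholder; the bottom-left origin follows Pupil's norm_pos convention):

```python
def surface_norm_to_pixels(norm_pos, screen_w, screen_h):
    # Map a surface-normalized gaze position (origin bottom-left, as in
    # Pupil's norm_pos convention) to pixel coordinates (origin top-left).
    x, y = norm_pos
    return (x * screen_w, (1.0 - y) * screen_h)

# Gaze at the centre of a 1920x1080 screen surface:
centre = surface_norm_to_pixels((0.5, 0.5), 1920, 1080)
```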

user-8779ef 06 August, 2018, 00:43:53

Thanks, @papr.

user-c351d6 06 August, 2018, 12:48:24

Hi guys, should the surface definitions which I define in pupil player work in pupil capture?

papr 06 August, 2018, 12:48:45

No, but the other way around should work

user-c351d6 06 August, 2018, 12:49:08

Ah ok. Damn.

papr 06 August, 2018, 12:50:25

To clarify: if you

  1. define a surface in Capture,
  2. make a recording,
  3. open that recording in Player, and
  4. load the Offline Surface Tracker,

then it should recognize the surfaces defined in step 1.

user-c351d6 06 August, 2018, 12:58:14

Is there a way to get it working the other way around?

papr 06 August, 2018, 13:06:56

@user-c351d6 No. The reason why the way above works is that Capture stores the surface definitions in a file within the recording which Player can read.

Where would Player place such a file for Capture?

user-c351d6 06 August, 2018, 13:26:07

There is a surface_definitions file in the settings folder of Pupil Capture. I thought that's possibly the file that stores the surface definitions.

papr 06 August, 2018, 13:28:14

Yeah, but then Player would be writing into the capture settings folder. Also, different recordings might use the same markers for different surfaces.

I think the general solution here is to have a clear surface definition format that can be copied explicitly into the capture folder.

papr 06 August, 2018, 13:28:46

Also, keep in mind that the usual workflow is to use Capture and afterwards Player, not the other way around.

user-c351d6 06 August, 2018, 13:32:52

I thought Pupil Capture might use the surface_definitions file from the settings folder as standard definitions and copy it to the recording folder. I also hoped that Pupil Capture would then store the "surface detection information" in the recording folder as well.

papr 06 August, 2018, 13:40:02

That is actually what happens, yes. Maybe you are able to copy the surface definitions from Player back into the Capture folder. But I don't know if this works, and it is definitely not officially supported.

user-c351d6 06 August, 2018, 13:41:06

Yes, for some reason it does not work.

user-c351d6 06 August, 2018, 13:41:48

Sorry for the amount of bug reports today. But we've encountered a lot this morning.

papr 06 August, 2018, 13:42:33

Don't worry, that is great! This is one of the big reasons to have a community project: reporting bugs is a good way to contribute!

user-c351d6 06 August, 2018, 13:44:44

I actually plan to contribute more in the next months by building a smart glasses plugin for the Vuzix

papr 06 August, 2018, 13:48:14

@user-c351d6 That is great to hear

papr 06 August, 2018, 13:48:55

Feel free to add your repository to https://github.com/pupil-labs/pupil-community

papr 06 August, 2018, 13:49:22

Btw the password for this issue's recording does not work: https://github.com/pupil-labs/pupil/issues/1250

papr 06 August, 2018, 14:33:55

Thanks, that worked

user-c351d6 06 August, 2018, 14:36:02

@papr It seems like it's really hard to run this experiment...

papr 06 August, 2018, 14:36:50

Yeah, your setup is quite impressive.

user-c351d6 06 August, 2018, 14:42:20

However, the issue that dropping frames can break a whole recording might be a showstopper for simpler field experiments that use Pupil Mobile as the recording device as well.

papr 06 August, 2018, 14:43:49

Dropping a few frames within a 20 minutes recording should be fine. What is not ok is that Player crashes because of it.

papr 06 August, 2018, 14:44:35

I also highly encourage you to include the calibration procedure in the recording in order to be able to do offline calibration.

papr 06 August, 2018, 14:47:24

@user-c351d6 did you see any exceptions being logged by Capture during the recording?

user-c351d6 06 August, 2018, 14:48:37

Yes, there were some. I was not smart enough to save the log. It always saves just the last one, correct?

papr 06 August, 2018, 14:49:10

That's correct. Maybe this behaviour needs to change in the future.

user-c351d6 06 August, 2018, 14:50:19

Having one for each recording would be better. However, then you should make sure that these files do not grow too large. I'll try to reproduce this when I find time for it.

papr 06 August, 2018, 14:50:33

Technically, if there is a broken frame, the exception is logged and the frame is dropped before saving. I am unsure why there are broken frames in the recording nonetheless 🤔

papr 06 August, 2018, 14:51:12

@user-c351d6 This is actually a great idea. Any warning or error messages during a recording should be logged

user-5ccd98 06 August, 2018, 19:14:41

Hi papr, me again. I still cannot load the recording folder to Pupil Player

user-5ccd98 06 August, 2018, 19:15:19

And I cannot open the world.mp4 and eye0.mp4 with quicktime

user-5ccd98 06 August, 2018, 19:15:30

Any suggestions to solve this?

papr 06 August, 2018, 20:41:37

@user-5ccd98 can you quickly remind me: Did you record using Pupil Mobile? Does the video open in VLC?

papr 06 August, 2018, 20:42:05

And did you send the recording to [email removed]?

user-5ccd98 06 August, 2018, 20:45:34

No, I just used the Pupil Capture

user-5ccd98 06 August, 2018, 20:51:06

I tried the VLC

user-5ccd98 06 August, 2018, 20:51:21

the video can be played

user-5ccd98 06 August, 2018, 20:52:50

But Pupil Player still crashed after I dragged the folder into it.

user-5ccd98 06 August, 2018, 20:55:39

I can send the recording folder to the email address. Would you please take a look?

papr 06 August, 2018, 21:06:51

Yes, please do so

user-5ccd98 06 August, 2018, 21:09:39

Sent you.

user-5ccd98 06 August, 2018, 21:10:15

My pupil player worked perfectly with the sample data downloaded from the website...

user-5ccd98 06 August, 2018, 21:10:53

Do not know what is wrong here. I have tried using different resolutions and fps

papr 06 August, 2018, 21:48:28

@user-5ccd98 The recording opens fine for me when I run from source. I will quickly install the newest bundle and check again.

user-5ccd98 06 August, 2018, 21:49:35

Is it because of my operating system? I am running it on High Sierra 10.13.6

papr 06 August, 2018, 21:49:58

I am using macOS 10.13.5 right now

papr 06 August, 2018, 21:52:41

@user-5ccd98 I have just installed Pupil Player v1.8-26 and it works as expected

user-5ccd98 06 August, 2018, 21:53:42

I used v1.8-22

user-5ccd98 06 August, 2018, 21:53:57

I will try 26 immediately.

user-5ccd98 06 August, 2018, 22:31:29

It worked!!! Thank you so much...

user-5ccd98 06 August, 2018, 22:31:42

@papr

user-2da779 07 August, 2018, 06:36:29

Hi, I have been using the new Pupil Capture v1.8-26. I tried playing some videos on the new version of Pupil Player; however, only the eye view loads and not the world view. I get the following error:

user-2da779 07 August, 2018, 06:37:10

error message

MainProcess.docx

user-2da779 07 August, 2018, 06:37:49

And when I try playing it on the older version (v.1.7) of pupil player, it crashes.

wrp 07 August, 2018, 06:38:18

@user-2da779 can you please try deleting pupil_player_settings folder

wrp 07 August, 2018, 06:38:27

and then relaunching Pupil Player

user-2da779 07 August, 2018, 06:41:51

ok, will try that now

user-2da779 07 August, 2018, 06:43:40

now it just crashes

wrp 07 August, 2018, 06:44:15

@user-2da779 what OS are you using?

wrp 07 August, 2018, 06:44:52

nevermind - i see that this is Windows

wrp 07 August, 2018, 06:46:24

@user-2da779 can you send the log again for the most recent attempt after restoring default settings?

wrp 07 August, 2018, 06:46:42

It looks like there was a problem reading the audio file based on your first log

user-2da779 07 August, 2018, 06:49:05

ok I will send that

wrp 07 August, 2018, 06:59:23

@user-2da779 does this happen with just this one recording or with multiple? Did you record this with Pupil Capture or Pupil Mobile? It looks like the audio file could be corrupted, based on the log message. Transcoding the audio file could resolve this issue. Before further speculation, I will wait to see your log and response.

user-2da779 07 August, 2018, 07:30:53

This has happened with several recordings but only for Pupil Mobile. The recordings from the laptop play perfectly fine. I have recorded other sessions without the audio capture plugin and they seem to run perfectly fine.

user-2da779 07 August, 2018, 07:31:35

These are the latest logs from pupil capture (laptop) and pupil player (for mobile recording). they seem to be running fine

pupilcapture.docx

user-2da779 07 August, 2018, 07:31:39

pupilplayer.docx

wrp 07 August, 2018, 07:32:26

@user-2da779 Thanks for the update, however you have sent the pupil_capture log and not the pupil_player log

wrp 07 August, 2018, 07:33:04

What version of Pupil Mobile are you using?

user-2da779 07 August, 2018, 07:33:13

pupilplayer.docx

user-2da779 07 August, 2018, 07:33:40

app version 0.17.1

wrp 07 August, 2018, 07:33:59

@user-2da779 please update Pupil Mobile

wrp 07 August, 2018, 07:34:13

the latest version is v0.31.0; also, please allow auto-updates of Pupil Mobile on your Android device

wrp 07 August, 2018, 07:34:39

Now, in order to salvage your existing audio files, you will probably need to transcode these using a tool like ffmpeg
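As a sketch of the transcoding step suggested above, a small Python helper can build the ffmpeg invocation. The file names are placeholders, and whether re-encoding actually recovers any particular corrupt file is not guaranteed:

```python
# Hedged sketch of the ffmpeg transcode wrp suggests; file names are
# placeholders, and ffmpeg must be installed to actually run the command.

def transcode_cmd(src, dst):
    # "-c:a aac" re-encodes the audio stream; for a simple remux of an
    # otherwise-healthy file, "-c", "copy" avoids re-encoding entirely.
    return ["ffmpeg", "-i", src, "-c:a", "aac", dst]

print(" ".join(transcode_cmd("audio.mp4", "audio_fixed.mp4")))
# ffmpeg -i audio.mp4 -c:a aac audio_fixed.mp4
```

The printed command would then be run in a shell (or passed to `subprocess.run`).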

user-2da779 07 August, 2018, 07:35:23

ah! and would that bring the world camera back, as well?

wrp 07 August, 2018, 07:35:51

@user-2da779 I see no error in your log file

user-2da779 07 August, 2018, 07:37:39

This is from the session that I just recorded. It seems to run fine now that I removed the audio capture plugin.

wrp 07 August, 2018, 07:39:16

Ok, thanks

user-2da779 07 August, 2018, 08:49:23

@wrp I have run a few sessions after removing the audio capture plugin and it's been running fine on Pupil Player v1.8-26. I haven't updated Pupil Mobile yet. Just to confirm before I update: would that solve the audio capture issue?

wrp 07 August, 2018, 08:53:53

@user-2da779 please could you try making a test recording using the latest Pupil Mobile - I can not reproduce this issue locally.

user-2da779 07 August, 2018, 09:01:19

ok. I will do that and send it. Thanks! πŸ˜ƒ

user-29e10a 07 August, 2018, 09:15:03

@wrp with v1.7 almost every video file got a "moov atom not found" error, as with these files from @user-2da779 ... with v1.8 we haven't seen this again (yet) ... it is possible to repair these files via ffmpeg

user-29e10a 07 August, 2018, 09:15:18

it has something to do with missing or messed up headers

wrp 07 August, 2018, 09:15:45

@user-29e10a Thanks for following up. Yes, this is due to the file container not being closed properly (missing headers like you noted)

wrp 07 August, 2018, 09:16:17

@user-29e10a you also were using Pupil Mobile to make the recordings, correct?

user-c351d6 07 August, 2018, 11:47:17

@papr I was just thinking about our experiment and the problems. Surface tracking with compressed files seems to be quite difficult because the surfaces can only be tracked when the head is not moving too fast. Otherwise, due to the compression, the markers can't be tracked. However, it is actually only necessary to track a surface during a fixation. A fixation usually only happens when the head does not move. Also, the analysis (heatmaps) should only consider data from fixations, because the human brain can only process detailed visual information during fixations. How is this currently implemented?

To get, for example, the surface distribution only during fixations, do we have to work with the output files from surface tracking and fixation detection?

papr 07 August, 2018, 11:50:14

The offline surface tracker exports fixations on surfaces but the heatmaps use all available gaze, not only fixations
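If you want the "only during fixations" variant yourself, a minimal post-hoc filter over the exported data could look like this. This is a pure-Python sketch: the field names (`start_timestamp`, `duration`, `timestamp`) mirror my reading of Pupil's exports, so verify them against your own CSV headers, and `duration` is assumed to be in milliseconds:

```python
# Sketch: keep only gaze samples that fall inside detected fixations.
# Field names and the ms-unit of 'duration' are assumptions; check them
# against your actual export headers.

def gaze_during_fixations(gaze_samples, fixations):
    """gaze_samples: dicts with a 'timestamp' key (seconds).
    fixations: dicts with 'start_timestamp' (s) and 'duration' (ms)."""
    intervals = [(f["start_timestamp"],
                  f["start_timestamp"] + f["duration"] / 1000.0)
                 for f in fixations]
    return [g for g in gaze_samples
            if any(start <= g["timestamp"] <= end for start, end in intervals)]

# Tiny usage example with made-up numbers:
fixations = [{"start_timestamp": 10.0, "duration": 200.0}]  # covers 10.0-10.2 s
gaze = [{"timestamp": 10.1}, {"timestamp": 10.5}]
print(len(gaze_during_fixations(gaze, fixations)))  # 1
```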

user-c351d6 07 August, 2018, 11:53:30

Only when the offline fixation plugin is activated?

papr 07 August, 2018, 11:54:29

Correct

user-8779ef 07 August, 2018, 16:10:07

@user-c351d6 I challenge your statement that a fixation usually only happens when the head doesn't move. During natural behavior, fixations often arise during VOR.

user-8779ef 07 August, 2018, 16:11:01

or, if the person can hold what they are interacting with, a combined eye and head movement can track the movement of the held object

user-8779ef 07 August, 2018, 16:11:53

Don't just assume that won't happen. If i were to review a paper that clung to that assumption without proper citations, or admitting that it is a limitation of their system, I would challenge the assertion during the review process.

user-8779ef 07 August, 2018, 16:12:25

Better you hear it from a snotty person on discord than a reviewer πŸ˜›

user-29e10a 07 August, 2018, 16:36:45

@wrp no I was not. Plain Pupil Capture (although controlled via network; I started and stopped the recording via NetMQ)

wrp 07 August, 2018, 16:47:32

Thanks @user-29e10a we will try to reproduce this

user-8779ef 07 August, 2018, 18:46:01

Hey guys, my student reports that one of our cameras has been giving us less reliable tracks. It seems the LED is not perfectly seated in its housing. I am tempted to try and disassemble the camera housing and push it back in. Do you think this is a smart move, or am I likely to break something?

Chat image

user-8779ef 07 August, 2018, 18:46:24

(i just unscrewed the screw a bit - that's not an issue)

user-c351d6 08 August, 2018, 11:00:16

@user-8779ef Thanks for the information. That was just an assumption based on guessing. I'm actually not that deep into the literature so far but I also wouldn't write this in a paper without any reference. My part in this project is so far just a technical one.

user-c351d6 08 August, 2018, 11:00:56

Do you guys have any clue what happened here?

Chat image

user-c351d6 08 August, 2018, 11:02:07

The left surface is still tracked with two markers but it is starting to "fly around" in the FOV.

wrp 08 August, 2018, 11:03:46

@user-8779ef regarding the camera led - please send (or ask your student to send) an email to info@pupil-labs.com - and we can coordinate a remote repair or return/repair if needed.

wrp 08 August, 2018, 11:05:54

@user-c351d6 a short recording would be useful; if possible, send it to data@pupil-labs.com so we can provide feedback

user-c351d6 08 August, 2018, 11:08:31

I can't provide you a short one, but I can send the full one and the times when it happens. Are you actually saving the marker cache once it has been calculated completely? I just had it filled completely and then Pupil Player crashed and it was gone...

wrp 08 August, 2018, 11:12:00

@user-c351d6 I'm AFK now, @papr can you provide a response to this when you are available?

user-8779ef 08 August, 2018, 14:30:16

@wrp Thanks. I'll coordinate soon.

user-8779ef 08 August, 2018, 14:30:47

Guys, having a pyglui error in 1.7.42 This worked previously.

user-8779ef 08 August, 2018, 14:30:50

2018-08-08 10:28:51,592 - player - [DEBUG] plugin: Loading plugin: Vis_Circle with settings {'radius': 16, 'color': [0.0, 0.7, 0.25, 0.2], 'thickness': 2, 'fill': 0}
2018-08-08 10:28:51,593 - player - [ERROR] launchables.player: Process Player crashed with trace:
Traceback (most recent call last):
  File "launchables/player.py", line 420, in player
  File "pyglui/ui.pyx", line 256, in pyglui.ui.UI.configuration.set (pyglui/ui.cpp:12585)
  File "pyglui/ui.pyx", line 248, in pyglui.ui.UI.set_submenu_config (pyglui/ui.cpp:12362)
  File "pyglui/menus.pxi", line 704, in pyglui.ui.Scrolling_Menu.configuration.set (pyglui/ui.cpp:63029)
  File "pyglui/menus.pxi", line 87, in pyglui.ui.Base_Menu.set_submenu_config (pyglui/ui.cpp:50999)
  File "pyglui/menus.pxi", line 482, in pyglui.ui.Growing_Menu.configuration.set (pyglui/ui.cpp:59207)
  File "pyglui/menus.pxi", line 87, in pyglui.ui.Base_Menu.set_submenu_config (pyglui/ui.cpp:50996)
IndexError: pop from empty list

user-8779ef 08 August, 2018, 14:31:07

Delete user settings, maybe?

papr 08 August, 2018, 14:32:20

Did you double-click on the close button in the vis circle menu?

papr 08 August, 2018, 14:32:35

Or does this happen on load?

user-8779ef 08 August, 2018, 14:32:47

Yeah, that worked.

papr 08 August, 2018, 14:32:48

If the latter is true, deleting the user settings should help

user-cd9cff 08 August, 2018, 16:10:15

@papr Hello, I am having trouble accessing the base_data key in the gaze datum

user-cd9cff 08 August, 2018, 16:10:51

I can use this command (base = msg['base_data'][0]) to print out the whole datum

user-cd9cff 08 August, 2018, 16:11:02

but I cannot seem to just get norm_pos from it

user-cd9cff 08 August, 2018, 16:21:28

Nvm I got it

user-20de15 08 August, 2018, 18:45:40

Hello, what is the standard value of Maximum dispersion at Fixation detector? and why?
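For context on what the dispersion threshold means: a dispersion-based detector labels a window of gaze samples a fixation when its spread stays below the threshold. The sketch below illustrates the classic I-DT idea (Salvucci & Goldberg), not Pupil's exact implementation (which works with angular distances between 3d gaze directions), and the 1.5 degree default is only a commonly used value from the literature, not an official recommendation:

```python
# Illustration of a dispersion check in the I-DT style; NOT Pupil's exact
# algorithm, and the 1.5 deg threshold is only a commonly cited value.

def dispersion(points):
    """points: list of (x, y) gaze positions, e.g. in degrees of visual angle."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def is_fixation(points, max_dispersion_deg=1.5):
    # A window whose spread stays under the threshold counts as a fixation
    # candidate (a real detector also enforces a minimum duration).
    return dispersion(points) <= max_dispersion_deg

print(is_fixation([(0.0, 0.0), (0.3, 0.2), (0.1, 0.4)]))  # True
```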

papr 08 August, 2018, 22:05:25

@user-c351d6 I will check and answer to your github issue tomorrow.

user-965faf 09 August, 2018, 09:22:02

Has anyone here implemented Pupil in psychological attention tasks to analyse PD?

user-239f8a 09 August, 2018, 09:32:02

I am studying impact of startling experiences on pilot performance in a simulated flight task.

user-965faf 09 August, 2018, 09:33:56

Sounds interesting. How long is your task? I'm studying PD fluctuations during sustained attention and vigilance tasks

user-239f8a 09 August, 2018, 09:43:57

Not far off what I'm looking at, perhaps even more similar actually. In essence, depending on competence level and confidence, tasks typically run between 3 and 12 minutes. Test participants have 15 runs at the same (3-stage) task, and task scores are collated for key performance indicators, which we believe are aligned with the concept of visual acuity. Scores are a reflection of how accurate the participant's readout of the instruments is. Currently, I'm focusing on any correlations with fixations in the first instance.

wrp 09 August, 2018, 09:51:14

@user-965faf and @user-239f8a if you haven't already - I would suggest taking a look at papers that cite Pupil (some of them may be of use to your research area) - https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?usp=sharing

user-239f8a 09 August, 2018, 09:52:38

@wrp sure. Thanks for the link.. I am familiar with this database. It has definitely been very helpful. Cheers.

wrp 09 August, 2018, 09:53:28

BTW - if anyone here has papers that are not in this list, please contact me and we can add it 😸

user-965faf 09 August, 2018, 10:03:41

@user-239f8a That's really interesting, and it does sound similar. I need a profile of baseline PD (tonic) and task-responsive PD (phasic) fluctuations across a 25 min Sustained Attention to Response Task (SART) and a 30 min Attention Network Task (ANT), so particularly interested in how these designs and particularly data export - previously conducted in E-Prime and Tobii - are comparable to the Pupil Capture and output. Not familiar with how similar the timestamps, PD validity codes, and PD diameter output will be in attention tasks to that of the Tobii/E-Prime output. What experimental design software did you use? @wrp Thanks wrp, will check that database out now, big help!

wrp 09 August, 2018, 10:04:43

You're welcome @user-965faf 😸

user-239f8a 09 August, 2018, 10:47:47

@user-965faf I haven't really looked at my case study from the perspective of PD in this case; might look at that post-doc. Indeed, such comparative insight as you describe could be invaluable. Having said that, I haven't used much software for my experimental design per se. I have primarily approached my study from first principles of factors, levels and responses. A reasonable background in statistical analysis has also been handy with hypotheses etc. On the software front, I am using a fuzzy cognitive map approach to analyse causality. I found a tool aptly called FCM Expert, developed by some other researchers at Hasselt University, for this purpose. The rest of my analysis is based on fixation exports and writing custom Python scripts for analysis, added to good ol' experimenting and documenting results and responses in transcripts. Along the way, however, I did find an interesting read called "Automatic Stress Classification with Pupil Diameter Analysis" by Marco Perrotti et al. Perhaps this might also be a good rabbit hole to go down. Currently on the 9-5, so apologies for slow responses. Best Regards.

user-8779ef 09 August, 2018, 11:43:05

Hey folks - Yesterday, I began the slow process of checking track quality and exporting fixations for about 30 recorded sessions, 20 mins each. My player has crashed about 10 times in the process due to this bug alone: https://github.com/pupil-labs/pupil/issues/1256

user-8779ef 09 August, 2018, 11:43:51

I've also had to delete player settings about 5 times to account for this issue: https://github.com/pupil-labs/pupil/issues/1258

user-8779ef 09 August, 2018, 11:44:23

(which prevents player from starting up). This is 1.7.42 on macOS High Sierra.

user-8779ef 09 August, 2018, 11:44:41

Overall, of course, very happy with the results despite these setbacks.

papr 09 August, 2018, 12:01:13

@user-8779ef i.r.t. #1258: Could you provide a list of plugins that you use/are loaded? Also, please upload one of the "broken" user_settings_player files.

papr 09 August, 2018, 12:02:25

i.r.t #1256: Do you need audio export? If not, you could rename the audio file to audio_.mp4 as a temporary work-around.

user-8779ef 09 August, 2018, 12:05:55

@papr Thanks. I'll upload the log and user_settings_player files upon next crash.

user-8779ef 09 August, 2018, 12:28:38

Well, that didn't take long πŸ˜› Uploaded them to the pyglui issue.

papr 09 August, 2018, 12:29:39

Thank you

user-3f0708 09 August, 2018, 12:34:48

Good morning. I'm having trouble executing the mouse_control.py code available on GitHub: the mouse moves slowly when I track my gaze with Pupil. Can someone tell me why this is happening?

papr 09 August, 2018, 13:26:27

@user-3f0708 Please post your questions in one channel only. Could you clarify what you mean by slow movement? Do you mean that the cursor lags behind or that the movement of the cursor is actually slow?

user-799634 09 August, 2018, 13:27:16

hello, I'm trying to print gaze distance in real time. I think gaze_point_3d_z shows that. When I finish calibration and look around, the printed data is lower than the true distance. Do I need to multiply by an initial value obtained after calibration? @papr

papr 09 August, 2018, 13:29:02

@user-42840c Depth estimation can be very noisy and is not accurate for distances further than 1.5 meters. It works best in the range of 0.5-1.0 meters.

papr 09 August, 2018, 13:29:47

The estimation is based on vergence.

user-799634 09 August, 2018, 13:48:31

Aha, but in that range of 0.5-1.5 meters, the printed data does not seem to work well. When the distance from me to the monitor is 1000 mm, the printed gaze distance is much lower than 1000 mm; also, the data fluctuates after every calibration.

user-3f0708 09 August, 2018, 13:59:22

@papr I mean the cursor movement is actually reading.

wrp 09 August, 2018, 14:35:25

@user-3f0708 could you clarify? Not sure I understand.

user-af87c8 09 August, 2018, 14:35:37

quick question: what is the time delay from camera to computer? I remember 40 ms, but it could have been anywhere from 38-42 ms

user-cd9cff 09 August, 2018, 16:10:26

@papr Hello, I have been querying gaze_normals_3d from python until now, and the data has been coming back with the eye id and the two gaze coordinates for each eye

user-cd9cff 09 August, 2018, 16:11:22

but I am now getting an error message that gaze_normals_3d does not exist, and when I query gaze_normal_3d (without the plural "normals") I get three coordinates

user-cd9cff 09 August, 2018, 16:11:36

Has there been a recent change in the source code?

papr 09 August, 2018, 16:47:08

@user-cd9cff check the topic. There is a difference between monocular and binocular data.
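For reference: binocular gaze datums carry gaze_normals_3d (one normal per eye, keyed by eye id), while monocular ones carry gaze_normal_3d (a single vector). A small sketch that handles both cases; the key names follow my reading of Pupil's gaze format, so verify them against your own datums:

```python
# Sketch: handle both binocular and monocular gaze datums.
# Key names ('gaze_normals_3d' keyed by eye id vs. 'gaze_normal_3d')
# are my reading of the format; check against your own data.

def extract_gaze_normals(datum):
    if "gaze_normals_3d" in datum:          # binocular: one normal per eye
        return dict(datum["gaze_normals_3d"])
    if "gaze_normal_3d" in datum:           # monocular: a single normal
        eye_id = datum.get("base_data", [{}])[0].get("id", 0)
        return {eye_id: datum["gaze_normal_3d"]}
    return {}

binocular = {"gaze_normals_3d": {"0": [0, 0, 1], "1": [0.1, 0, 1]}}
monocular = {"gaze_normal_3d": [0, 0, 1], "base_data": [{"id": 1}]}
print(extract_gaze_normals(monocular))  # {1: [0, 0, 1]}
```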

user-cd9cff 09 August, 2018, 16:47:26

Yea, it works now thank you

user-3638ed 09 August, 2018, 16:49:32

Hi all! Is it basically possible to subscribe to two different topics (gaze & blink) from one unity scene instance? hasn't worked for me using "PupilTools.SubscribeTo()". Thanks!

papr 09 August, 2018, 17:09:19

@user-3638ed yes, this is possible.

user-3638ed 09 August, 2018, 17:16:03

@papr ok, I looked at the source code of PupilTools.cs, more precisely at the function "SubscribeTo( topic )", and couldn't find a way to bind it to a script instance or something similar.

At the moment I have implemented gaze tracking, for which I subscribe to "gaze" at the startup of one of my scripts (I use the data for selecting items through pursuits, which works fine).

Additionally I'd like to subscribe to "blink" using another script, but when both of them are executed, only the last subscription works.

Do you have any hint on what I have to do to make it work?

Thanks a lot!

user-2c0e1f 10 August, 2018, 02:57:13

@papr Hello! How do I decode a timestamp in the gaze_positions file?

wrp 10 August, 2018, 03:54:48

@user-2c0e1f what exactly would you like to do? Convert the timestamp to wall time?

user-2c0e1f 10 August, 2018, 03:55:08

@wrp yes

wrp 10 August, 2018, 04:00:11

Timestamps to wall clock time - Please check out the info.csv file that is contained within each recording. You will find two keys Start Time (System) and Start Time (Synced). You can read more about the difference between Start Time System and Start Time Synced at https://docs.pupil-labs.com/#data-format (Start Time System is Unix Epoch at start time of the recording and Start Time Synced is Pupil Epoch at start time of the recording).

You can convert timestamps to wall time with some quick conversions. Here is an example on how to do this with Python using the start times in the info.csv file and an example first timestamp in the exported gaze_positions file:

import datetime
start_time_system = 1533197768.2805 # unix epoch timestamp
start_time_synced = 674439.5502 # pupil epoch timestamp
offset = start_time_system - start_time_synced
example_timestamp = 674439.4695
wall_time = datetime.datetime.fromtimestamp(example_timestamp + offset).strftime("%Y-%m-%d %H:%M:%S.%f")
print(wall_time)
# example output: '2018-08-02 15:16:08.199800'
wrp 10 August, 2018, 04:00:32

@user-2c0e1f I hope this helps clarify.

user-2c0e1f 10 August, 2018, 04:01:13

@wrp thanks a lot!

wrp 10 August, 2018, 04:01:20

@user-2c0e1f welcome

user-11fa54 10 August, 2018, 09:32:54

how to fit a circle around the eye

user-11fa54 10 August, 2018, 09:33:02

when detecting eye pupil

wrp 10 August, 2018, 09:35:17

@user-11fa54 The pupil should be automatically detected by Pupil Capture when the eye image is within the frame (and under normal conditions). Could you elaborate/provide a concrete example?

user-11fa54 10 August, 2018, 09:36:29

i am implementing a paper Hybrid eye center localization using cascaded regression and hand-crafted model fitting

user-11fa54 10 August, 2018, 09:37:29

please help me with the implementation in opencv

wrp 10 August, 2018, 09:43:40

@user-11fa54 it doesn't seem like your question is related directly to Pupil hardware, software, usage, etc. Perhaps you would like to migrate this question to πŸ”¬ research-publications to ask for pointers/references. Please correct me if I misunderstood.

user-11fa54 10 August, 2018, 09:44:11

thanks

user-29e10a 10 August, 2018, 11:33:34

Hi, I'm getting this kind of error in my logs lately... I'm using the bundled 1.8.26 on Windows. I did not change anything in my own control code, just passing strings back and forth:

[ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "launchables\world.py", line 454, in world
  File "shared_modules\recorder.py", line 306, in recent_events
  File "shared_modules\file_methods.py", line 156, in extend
  File "shared_modules\file_methods.py", line 144, in append
  File "msgpack\__init__.py", line 47, in packb
  File "msgpack\_packer.pyx", line 284, in msgpack._packer.Packer.pack
  File "msgpack\_packer.pyx", line 290, in msgpack._packer.Packer.pack
  File "msgpack\_packer.pyx", line 287, in msgpack._packer.Packer.pack
  File "msgpack\_packer.pyx", line 234, in msgpack._packer.Packer._pack
  File "msgpack\_packer.pyx", line 263, in msgpack._packer.Packer._pack
  File "msgpack\_packer.pyx", line 281, in msgpack._packer.Packer._pack
TypeError: can't serialize <MemoryView of 'array' at 0x1fe9e5ef398>

user-29e10a 10 August, 2018, 11:33:57

capture crashes painfully and pulls everything with it into hell

papr 10 August, 2018, 11:35:31

Hey @user-29e10a which plugins are you running?

user-29e10a 10 August, 2018, 11:36:11

frame publisher, remote, annotation capture, and one visualizer ... no custom one

papr 10 August, 2018, 11:36:30

the accuracy visualizer?

user-29e10a 10 August, 2018, 11:36:50

yep

papr 10 August, 2018, 11:37:26

So my guess is that the issue lies with frame publisher but I thought I had fixed the issue πŸ€”

user-29e10a 10 August, 2018, 11:38:13

it happens when stopping the (eye) recording

user-29e10a 10 August, 2018, 11:41:37

and, I start the frame_publisher plugin programmatically when stopping the recording (through unity)

papr 10 August, 2018, 11:43:22

I was able to reproduce the issue on macos as well

user-29e10a 10 August, 2018, 11:46:13

ok, thanks in advance πŸ˜ƒ there was a second exception leading to a crash, too. I did not copy the log unfortunately, but it said that Capture did not find the path of the recording (maybe I deleted it while recording or very shortly after stopping) ...

papr 10 August, 2018, 12:04:18

I was able to further track down the issue. Not sure why this issue did not come up before. It will be fixed with the next release.

user-29e10a 10 August, 2018, 12:07:21

perfect β™₯

user-af87c8 10 August, 2018, 12:12:33

@papr, @mpk I could not find the delay of the cameras to the computer on the webpage. I remember Pablo mentioning ~40 ms, but how much is it exactly? In the abstract I found 45 ms; is that the accurate number to use?

user-29e10a 10 August, 2018, 14:04:02

@mpk any plans to include h265 support for compressing the videos before saving?

user-b0c902 11 August, 2018, 06:09:54

Hi, We have been using the updated version of pupil mobile and it stops capturing at 5GB. Is there a way to increase that?

user-da671c 11 August, 2018, 08:48:41

we are also getting this error message

ErrorLog.docx

papr 11 August, 2018, 09:09:22

@user-b0c902 This issue is very likely related to @user-29e10a's issue from yesterday. Could you please add the crash log to https://github.com/pupil-labs/pupil/issues/1263

user-da671c 11 August, 2018, 10:22:15

sure

user-24fdfb 11 August, 2018, 20:50:53

I have a pupil device connected to a Windows laptop USB port. I also downloaded the Python projects and viewed a couple demo videos.

user-24fdfb 11 August, 2018, 20:51:20

How do I get the software working on the laptop to capture my eyes?

wrp 11 August, 2018, 23:34:21

@user-24fdfb did you download the application bundle - https://github.com/pupil-labs/pupil/releases/latest ? Extract the archive with 7zip. And then run pupil_capture.exe

wrp 11 August, 2018, 23:34:59

You may need to right click and run as administrator the first time to give permission for driver installation

user-24fdfb 12 August, 2018, 23:44:12

Thanks, @wrp . I ran the pupil_capture.exe

user-24fdfb 13 August, 2018, 00:15:23

I ran pupil_capture.exe not as administrator and then as administrator and both times, I see lots of failures.

user-24fdfb 13 August, 2018, 00:15:58

MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
world - [INFO] launchables.world: Application Version: 1.8.26
world - [INFO] launchables.world: System Info: User: JGreig, Platform: Windows, Machine: Josh-PC, Release: 8.1, Version: 6.3.9600
Running PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
Running PupilDrvInst.exe --vid 1443 --pid 37425
OPT: VID number 1443
OPT: PID number 37425
Running PupilDrvInst.exe --vid 1443 --pid 37426
OPT: VID number 1443
OPT: PID number 37426
Running PupilDrvInst.exe --vid 1133 --pid 2115
OPT: VID number 1133
OPT: PID number 2115
Running PupilDrvInst.exe --vid 6127 --pid 18447
OPT: VID number 6127
OPT: PID number 18447
Running PupilDrvInst.exe --vid 3141 --pid 25771
OPT: VID number 3141
OPT: PID number 25771
world - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
world - [INFO] camera_models: No user calibration found for camera Ghost capture at resolution [1280, 720]
world - [INFO] camera_models: No pre-recorded calibration available
world - [WARNING] camera_models: Loading dummy calibration
world - [WARNING] launchables.world: Process started.
world - [ERROR] calibration_routines.screen_marker_calibration: Calibration requiers world capture video input.
world - [ERROR] calibration_routines.screen_marker_calibration: Accuracy Test requiers world capture video input.
world - [INFO] recorder: Started Recording.
world - [INFO] camera_models: Calibration for camera world at resolution [1280, 720] saved to C:\Users\JGreig\recordings\2018_08_12\000/world.intrinsics
world - [INFO] recorder: No surface_definitions data found. You may want this if you do marker tracking.
world - [INFO] recorder: Saved Recording.
world - [INFO] recorder: Started Recording.
world - [ERROR] calibration_routines.screen_marker_calibration: Calibration requiers world capture video input.

wrp 13 August, 2018, 00:16:47

@user-24fdfb just to check, are you using Win 10?

user-24fdfb 13 August, 2018, 00:16:49

I also ran PupilDrvInst.exe and restarted the computer.

user-24fdfb 13 August, 2018, 00:16:54

Windows 8.1

user-24fdfb 13 August, 2018, 00:17:17

is 10 required?

wrp 13 August, 2018, 00:17:22

Yes

wrp 13 August, 2018, 00:17:38

We only support Pupil software on Win 10

user-24fdfb 13 August, 2018, 00:17:42

ok

user-24fdfb 13 August, 2018, 00:17:56

and there's no older version of the pupil software for 8.1?

wrp 13 August, 2018, 00:18:04

While software may work on versions lower than 10 - we can not offer support

wrp 13 August, 2018, 00:18:43

We only target Win 10. There are no older versions of Pupil for older versions of Windows

user-24fdfb 13 August, 2018, 00:18:52

ok

user-24fdfb 13 August, 2018, 00:19:23

would the steps I took so far work if I had Windows 10?

user-24fdfb 13 August, 2018, 00:26:34

Anyway, thanks. I'll probably continue this effort in a week or 2 when I have a newer laptop.

user-24fdfb 13 August, 2018, 00:27:03

I have Ubuntu 14 on VMPlayer but this may need too many resources.

wrp 13 August, 2018, 00:27:07

@user-24fdfb you should just be able to run pupil capture on win 10

user-24fdfb 13 August, 2018, 00:27:13

ok

wrp 13 August, 2018, 00:27:17

Vm is not recommended

user-24fdfb 13 August, 2018, 00:27:22

ok

user-24fdfb 13 August, 2018, 00:28:04

good day

wrp 13 August, 2018, 00:48:56

Likewise :)

user-5ccd98 13 August, 2018, 03:05:00

Hi, I am trying to track the eye while the user is looking at a workbench. I want to get what and where the user is looking. I checked the tutorial and found out I might need to use surface tracking. I am wondering what the relationship is between surface tracking and calibration. If I put markers on the surface, do I still need to calibrate the device? I guess we still need manual calibration (print the marker and move it around)? After calibration, what happens if the user moves their head or body? Do I need to re-calibrate the device?

papr 13 August, 2018, 10:35:58

@user-5ccd98 Be aware that calibration/gaze mapping and surface tracking are two separate processing steps. Each transforms data from one coordinate system to an other:

  1. Calibration / gaze mapping: Transforms pupil data (relative to eye video frames) to gaze data (relative to scene video frames).
  2. Surface tracking: Transforms gaze data to surface gaze (relative to the surface's coordinate system)

Therefore you will need to calibrate independently of surface tracking. The second transformation is done automatically as soon as you add surfaces.
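The two steps above can be pictured as chained coordinate transforms. This is only a conceptual sketch with stand-in mappings: the real calibration is a model fitted from your calibration data, and the real surface transform is a homography estimated from the detected markers.

```python
# Conceptual sketch of the two mapping steps; both functions are stand-ins,
# not Pupil's actual fitted calibration or marker-based homography.

def gaze_from_pupil(pupil_norm_pos):
    # Stand-in for calibration / gaze mapping: eye-frame -> scene-frame coords.
    x, y = pupil_norm_pos
    return (0.9 * x + 0.05, 0.9 * y + 0.05)

def surface_from_gaze(gaze_norm_pos, surf_origin=(0.25, 0.25), surf_size=(0.5, 0.5)):
    # Stand-in for surface tracking: scene-frame -> surface coords. Here just
    # an axis-aligned rectangle instead of a full homography.
    (ox, oy), (w, h) = surf_origin, surf_size
    gx, gy = gaze_norm_pos
    return ((gx - ox) / w, (gy - oy) / h)

gaze = gaze_from_pupil((0.5, 0.5))      # step 1: calibration / gaze mapping
on_surface = surface_from_gaze(gaze)    # step 2: surface tracking
print(on_surface)  # lands in the middle of the surface, ~(0.5, 0.5)
```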

papr 13 August, 2018, 11:18:37

Moving head/body is not a problem since the headset moves with it and gaze is still correct (ignoring the issue of slippage).

user-5ccd98 13 August, 2018, 14:45:45

Hi papr, thanks for your reply. Do you have any recommendation of which calibration method I should use? I found many options in the tutorial.

papr 13 August, 2018, 14:46:15

I think manual marker calibration fits your purpose well.

user-5ccd98 13 August, 2018, 14:46:42

Cool, thanks papr.

user-5ccd98 13 August, 2018, 15:08:48

Hi papr, I am trying to run Capture in my windows laptop. But the program crashed when I opened it.

user-5ccd98 13 August, 2018, 15:08:56

Any way to fix this?

papr 13 August, 2018, 15:10:17

@user-5ccd98 which version of windows are you using?

user-5ccd98 13 August, 2018, 15:10:26

windows 10

papr 13 August, 2018, 15:10:58

It crashes immediately upon opening?

user-5ccd98 13 August, 2018, 15:11:04

Yes.

papr 13 August, 2018, 15:11:11

Which version of capture do you use?

user-5ccd98 13 August, 2018, 15:11:43

v1.8-26

user-5ccd98 13 August, 2018, 15:11:59

If I did not connect the device, it worked

papr 13 August, 2018, 15:12:41

So opening capture without the headset, and then connecting it does not work either?

user-5ccd98 13 August, 2018, 15:13:00

Let me try it.

user-5ccd98 13 August, 2018, 15:16:43

It is in ghost mode.

user-5ccd98 13 August, 2018, 15:17:31

Can I re-connect the device by clicking somewhere in the software?

papr 13 August, 2018, 15:24:14

Yes, you should be able to select the cameras in the uvc manager menu.

user-5ccd98 13 August, 2018, 15:26:30

The activate source shows four unknown devices ....

user-5ccd98 13 August, 2018, 15:26:55

The selected camera is blocked or in use

papr 13 August, 2018, 15:27:26

This means that the driver installation was not successful. Please try running Capture with administrator rights

user-5ccd98 13 August, 2018, 15:27:38

Oh, I see

user-5ccd98 13 August, 2018, 15:29:26

Thanks, papr

papr 13 August, 2018, 15:29:57

Did it work?

user-5ccd98 13 August, 2018, 16:00:41

Yes, papr. But I came across another issue. When I used the manual calibration, the process always stopped immediately and showed that "not enough ref data or pupil data for calibration".

user-5ccd98 13 August, 2018, 16:02:21

I ensured that the pupil was within the range...

user-5ccd98 13 August, 2018, 16:08:00

And even dismissing 0% of pupil data due to confidence

papr 13 August, 2018, 17:07:50

Please make an example recording of the procedure and send it to data@pupil-labs.com

  1. Start capture
  2. Start recording
  3. Start manual calibration
  4. Run procedure
  5. Finish calibration
  6. Stop recording
  7. Stop Capture
  8. Upload the recording and share it with the email address above

user-5ccd98 13 August, 2018, 17:17:37

Sure.

user-5ccd98 13 August, 2018, 17:18:43

If the calibration stopped, I need to restart it over and over again?

user-5ccd98 13 August, 2018, 17:43:43

Sent you a dropbox link

papr 13 August, 2018, 17:44:52

Thank you! I will have a look tomorrow 👌

user-82f104 13 August, 2018, 17:51:00

hi, I'm trying to use the 120hz monocular pupil labs eye tracker with a separate motion tracker, for high quality tracking with unconstrained head motion in a dark room. i'm trying to follow the calibration procedure from Cesqui et al. 2013 (https://www.ncbi.nlm.nih.gov/pubmed/23902754) but for this i need the sensor size and the focal length to convert the normalized camera units into mm. do you have this spec? thanks!

user-8779ef 14 August, 2018, 12:54:17

jkrum: Does your mocap operate in the visible range, or does it operate in the IR range?

user-8779ef 14 August, 2018, 12:54:24

@user-82f104

user-82f104 14 August, 2018, 14:21:17

in the IR range, it's an Optotrak system

user-5ccd98 14 August, 2018, 15:22:45

@papr I am wondering if you get a chance to watch the video?

user-8779ef 14 August, 2018, 15:35:04

@user-82f104 Have you considered that both the Optotrak markers and the eye tracker will be emitting IR light? Have you checked for potential interference?

user-8779ef 14 August, 2018, 15:35:54

Hopefully the temporal signature of the Optotrak markers will differentiate them from the eye tracker and prevent false tracking of the eye tracker's LEDs, but you might check that early on.

user-8779ef 14 August, 2018, 15:53:30

Similarly, hopefully the pulsing from the IR-emitting Optotrak markers will not interfere with the eye tracker.

papr 14 August, 2018, 17:08:59

@user-5ccd98 I found the issue. You are using the stop marker instead of the calibration marker. This is why the calibration immediately stops without having found any reference data. The calibration marker has a black dot in the center of the concentric circles that the subject is supposed to fixate during the procedure.

user-5ccd98 14 August, 2018, 17:32:33

Oh!! Got it. Thanks so much. 😂

user-5ccd98 14 August, 2018, 17:32:40

Sorry for the stupid mistake...

user-f81efb 15 August, 2018, 09:55:46

hello

user-f81efb 15 August, 2018, 09:56:06

i am trying to use pupil labs software on my mac using python

user-f81efb 15 August, 2018, 09:56:24

i get the error and can not see the video stream . world - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.

user-f81efb 15 August, 2018, 09:56:32

can you help me?

papr 15 August, 2018, 10:04:58

@user-f81efb Please make sure that the cable is fully plugged into the clip.

user-f81efb 15 August, 2018, 10:05:48

yes it is plugged in completely

user-f81efb 15 August, 2018, 10:06:03

i will unplug and re connect and check

wrp 15 August, 2018, 10:07:14

@user-f81efb sometimes a bit more force is required to ensure that the USBC cable is firmly attached to the cable clip of the Pupil headset (as @papr notes)

user-f81efb 15 August, 2018, 10:08:42

when i run using the application Pupil Capture , it works completely fine

user-f81efb 15 August, 2018, 10:08:57

when i run using python, it doesnot work

papr 15 August, 2018, 10:12:26

Are the eye cameras working in both cases?

user-f81efb 15 August, 2018, 10:12:30

yes

user-f81efb 15 August, 2018, 10:12:37

they are working in both cases

papr 15 August, 2018, 10:12:49

Only the world camera does not work when running from source?

papr 15 August, 2018, 10:13:02

Do you have a 3d world camera?

user-f81efb 15 August, 2018, 10:13:07

yes, it says process started

user-f81efb 15 August, 2018, 10:13:14

but doesn't display

user-f81efb 15 August, 2018, 10:13:20

yes i have 3d world camera

papr 15 August, 2018, 10:13:47

OK, then you have to select the realsense backend in the world window

user-f81efb 15 August, 2018, 10:15:00

okay i'll try that

papr 15 August, 2018, 10:15:12

The selector can be found in the Uvc Manager menu

user-f81efb 15 August, 2018, 10:16:18

world - [WARNING] launchables.world: Process started. world - [INFO] camera_models: No user calibration found for camera Intel RealSense R200 at resolution (1280, 720) world - [INFO] camera_models: No pre-recorded calibration available world - [WARNING] camera_models: Loading dummy calibration

user-f81efb 15 August, 2018, 10:16:38

i still don't see any video

user-f81efb 15 August, 2018, 10:16:59

and the eye trackers didn't open yet

user-f81efb 15 August, 2018, 10:17:17

should i use the world window to open the eye tracker window?

user-f81efb 15 August, 2018, 10:17:20

or wait

user-f81efb 15 August, 2018, 10:19:08

the eye trackers window has opened

user-f81efb 15 August, 2018, 10:19:14

but still no video

papr 15 August, 2018, 10:23:02

A possible reason I could think of is that you installed the official librealsense instead of the required custom Pupil Labs version

papr 15 August, 2018, 10:23:33

Else I will try to reproduce the issue this afternoon.

user-f81efb 15 August, 2018, 10:25:06

https://github.com/pupil-labs/librealsense/blob/master/doc/installation_osx.md i used this to install

user-f81efb 15 August, 2018, 10:27:05

thanks πŸ˜ƒ i will check it once again

papr 15 August, 2018, 10:30:37

What Mac do you have?

user-f81efb 15 August, 2018, 10:31:22

i have macbook pro 2012

user-f81efb 15 August, 2018, 10:31:44

with os X El captain

papr 15 August, 2018, 10:33:49

Which bundle version do you use? And do you use the current master?

user-f81efb 15 August, 2018, 10:35:10

i am using the latest release 1.8

user-f81efb 15 August, 2018, 10:37:07

i cloned from git https://github.com/pupil-labs/pupil.git three days back

papr 15 August, 2018, 10:38:20

OK, great, thank you

user-f81efb 15 August, 2018, 10:39:31

Thank you

papr 15 August, 2018, 12:20:50

Hey @user-f81efb , I just tried to reproduce the issue. My R200 is correctly identified, independently of running from bundle or source.

user-f81efb 15 August, 2018, 12:48:01

what do you think could be the problem ?

user-f81efb 15 August, 2018, 12:52:23

it gets stuck after this

user-f81efb 15 August, 2018, 12:52:24

world - [INFO] camera_models: No user calibration found for camera Intel RealSense R200 at resolution (1920, 1080) world - [INFO] camera_models: No pre-recorded calibration available world - [WARNING] camera_models: Loading dummy calibration

papr 15 August, 2018, 13:05:22

Stuck in the sense that the software freezes?

user-f81efb 15 August, 2018, 13:46:55

yes

user-f81efb 15 August, 2018, 13:47:29

if i click on the option to enable eye tracker, eye trackers load after this

user-f81efb 15 August, 2018, 13:47:44

i can also see the gaze points moving on the screen

user-f81efb 15 August, 2018, 13:47:47

but no video

papr 15 August, 2018, 13:47:58

What do you mean by "eye tracker"? Do you mean the eye processes?

user-f81efb 15 August, 2018, 13:48:10

i mean the eye processes

user-f81efb 15 August, 2018, 13:48:15

for each eye

papr 15 August, 2018, 13:48:21

ok, then the application does not freeze.

user-f81efb 15 August, 2018, 13:48:37

ok

papr 15 August, 2018, 13:49:10

Can you post a screenshot of the Realsense Source menu?

user-f81efb 15 August, 2018, 13:52:16

yes

user-f81efb 15 August, 2018, 13:52:24

in a moment

user-f81efb 15 August, 2018, 14:02:28

Chat image

user-f81efb 15 August, 2018, 14:03:37

after choosing the option

Chat image

papr 15 August, 2018, 14:03:40

Please let me clarify. I meant the menu of Realsense Source not realsense Manager. That's the icon below the currently selected one.

papr 15 August, 2018, 14:03:56

The small camera icon

user-f81efb 15 August, 2018, 14:03:57

sorry

user-f81efb 15 August, 2018, 14:04:05

Chat image

papr 15 August, 2018, 14:09:18

@user-f81efb From the installation page that you linked above, please try cmake ../ -DBUILD_EXAMPLES=true and make && make install

user-f81efb 15 August, 2018, 14:09:54

okay you mean for librealsense

papr 15 August, 2018, 14:10:50

Yes, librealsense. The first step is to identify if the c++ examples work. Also let us migrate this discussion to a private conversation since this is a personal setup issue and not relevant for the community.

I will summarize any relevant outcome later.

user-f81efb 15 August, 2018, 14:12:59

okay

user-103621 15 August, 2018, 14:44:44

@user-f81efb Have you uninstalled the RealSense driver from the device manager and re-inserted the tracker so the driver reinstalls? Try this 2-3 times. Worked for us.

@papr is there a known issue with the 'auto mode' of the Auto Exposure Mode for the high-speed camera? I always get: world - [WARNING] uvc: Could not set Value. 'Auto Exposure Mode'.

papr 15 August, 2018, 14:52:04

@user-103621 Do you use Windows? Because there is not such a device manager on Mac. Which bundle do you use?

user-103621 15 August, 2018, 14:54:37

Yes i use Windows. Bundle is 1.8-16

papr 15 August, 2018, 14:55:00

Please update to 1.8-26. This should fix the issue.

user-103621 15 August, 2018, 15:03:54

Sorry, I didn't read that he was using Mac. As for the exposure, I get this error when I try to use Auto Exposure with the world camera, and the new bundle doesn't change my problem

papr 15 August, 2018, 15:08:02

@user-103621 I was just told by my colleague that the "auto mode" does not work as expected and that actual auto exposure can be achieved by using the "aperture priority mode"

user-103621 15 August, 2018, 15:22:10

@papr Thank you !

user-40bf4b 16 August, 2018, 12:03:07

Hi

user-40bf4b 16 August, 2018, 12:03:17

can someone help me with the timestamps format?

user-40bf4b 16 August, 2018, 12:03:46

The pupil headset I am using stores the timestamps as a double, and I can not extract the datetime from it

papr 16 August, 2018, 12:07:54

The timestamps are created from a monotonic clock that has an arbitrary epoch.

papr 16 August, 2018, 12:08:52

Be aware that this epoch is not the unix epoch unless you synced the pupil time epoch to it.

user-40bf4b 16 August, 2018, 12:09:50

how to sync it with unix epoch?

papr 16 August, 2018, 12:12:18

You need to do that either before the recording, using the Time Sync plugin, or by subtracting the time offset between the two epochs. The difference can be calculated using the Start Time (System) (unix timestamp) and Start Time (Synced) (pupil time epoch) keys in the info.csv file of a recording.
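
For example, the offset between the two epochs can be computed once from info.csv and then added to every pupil timestamp. A minimal sketch, assuming info.csv is a plain two-column key,value file as Capture writes it:

```python
import csv

def pupil_to_unix_offset(info_csv_path):
    """Read 'Start Time (System)' and 'Start Time (Synced)' from a
    recording's info.csv and return the epoch offset in seconds."""
    with open(info_csv_path) as f:
        info = {row[0]: row[1] for row in csv.reader(f)}
    return float(info["Start Time (System)"]) - float(info["Start Time (Synced)"])

def pupil_to_unix(pupil_ts, offset):
    # Shift a pupil timestamp into the unix epoch
    return pupil_ts + offset
```

Applying the offset to exported timestamps yields unix time, which can then be turned into a wall-clock datetime with datetime.fromtimestamp.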

user-40bf4b 16 August, 2018, 12:14:26

Thank you!

user-d16987 18 August, 2018, 11:30:37

I'm trying to help someone get the pupil tracker working on Linux. It works fine on a Windows machine, but on Linux only the front camera seems to be available (I'm testing with cheese). Both pupil cameras show up in the camera list, but if I select them in cheese I just get a few blocky pixels and no movement. This is the same on two machines: one recent-ish Linux Mint and one Ubuntu 16.04.

user-d16987 18 August, 2018, 11:31:49

In the pupil software no cameras show up. the 'select available' list shows 4 'unknowns', and trying to select any says they are not available/in use. Suggestions welcome. I am perusing the docs.

user-d16987 18 August, 2018, 11:33:00

Is there an ubuntu package I can test on this ubuntu box?

wrp 18 August, 2018, 11:33:56

What headset are you using? Are you running from source or from the app bundle? Are you using Linux in vm?

user-d16987 18 August, 2018, 11:38:16

hmm. good questions. The Linux Mint machine is running the app bundle. I've not yet installed the software on my Ubuntu box - I was just seeing if I could get to the cameras at all. No VMs involved. Syslog says: idVendor=05a3, idProduct=9232. Is there a better way to find out what version of the hardware it is?

wrp 18 August, 2018, 11:45:59

Let me rephrase re headset. Are you using a DIY headset or a Pupil Labs headset?

wrp 18 August, 2018, 11:47:10

No extensive testing has been done on mint. I would suggest running from src on mint as the bundle was compiled targeting Debian

user-d16987 18 August, 2018, 11:52:43

Pupil labs headset.

user-d16987 18 August, 2018, 11:53:27

OK. Am downloading the bundle to try on ubuntu. Will do a build from source on mint. So the massive .zip is the only packaging - no real packages?

user-d16987 18 August, 2018, 11:54:58

Ah, the zip contains debs. Nice.

wrp 18 August, 2018, 11:55:02

The zip archive at https://github.com/pupil-labs/pupil/releases/latest contains 3 applications - Pupil Capture, Pupil Player, and Pupil Service

wrp 18 August, 2018, 11:55:20

(you beat me to it πŸ˜„ )

user-d16987 18 August, 2018, 12:02:10

OK. Installing Pupil Capture on my Ubuntu box does indeed get me some pupil camera images, so that's good. I'll do a source build on the Mint box and see if that helps. May be back later if I have troubles 😃

user-d16987 18 August, 2018, 12:02:28

Thanks

wrp 18 August, 2018, 12:02:47

Ok thanks for following up @user-d16987

user-d16987 18 August, 2018, 12:05:51

I'm a Debian Developer so if you are interested in getting the software into Debian/Ubuntu sometime I can sponsor that.

wrp 18 August, 2018, 12:07:13

Cool :+1: we will certainly get in touch

papr 18 August, 2018, 17:37:25

@user-d16987 Your offer is very appreciated!

user-c828f5 20 August, 2018, 18:03:04

Hey guys, could someone shed some light on how pupil computes confidence metric for each eye? Is it mentioned in any literature that could be cited?

user-c828f5 20 August, 2018, 18:13:12

I looked up previous chats and someone mentioned that the confidence metric is the ratio of the support pixels / number of pixels covered by the ellipse fit? I'm not sure if I understood it correctly.

user-40bf4b 21 August, 2018, 12:19:49

Hi. I am using a 200Hz pupil camera headset. What is the focal length of the pupil camera/is there a way to measure it from the pupil capture? I can not find this information on the pupil docs (it only says they are fixed focus)

papr 21 August, 2018, 12:27:05

@user-40bf4b The focal length is part of the camera intrinsics. Unfortunately, we only have pre-recorded camera intrinsics for the world camera. What you can do is to select the eye camera within the world process, run the camera intrinsics estimation plugin and extract the focal length from the saved Pupil Cam2 IDX.intrinsics file. The biggest issue is that you need to make the circle pattern visible to the IR-filtered camera. Neither the version printed on paper nor displayed on screen is visible in the eye camera.
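
For anyone (like the Optotrak setup above) who needs the focal length in millimetres rather than pixels: the fx/fy entries of the estimated camera matrix are in pixels, and converting them requires the physical sensor width, which Capture does not expose and which would have to come from the sensor's datasheet. A sketch with placeholder numbers:

```python
def focal_lengths_from_camera_matrix(K):
    # K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]] -- focal lengths in pixels
    return K[0][0], K[1][1]

def focal_px_to_mm(f_px, sensor_width_mm, image_width_px):
    """Convert a focal length from pixels to millimetres.
    sensor_width_mm must come from the camera/sensor datasheet."""
    return f_px * sensor_width_mm / image_width_px
```

E.g. a 1000 px focal length on a hypothetical 4.8 mm wide sensor at 1920 px horizontal resolution corresponds to 2.5 mm.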

user-c351d6 21 August, 2018, 13:06:11

Hey guys, is there a way to not always detect the fixations from scratch when opening a video in the player? This takes ages for a 20-minute video! Why are the fixations from Pupil Capture not saved once they are detected?

papr 21 August, 2018, 13:09:27

Hey Erik, I am working on that right now actually. 🙂

user-103621 21 August, 2018, 13:12:45

Hello @papr I have 2 questions: 1. Is it possible to read the exposure time while using the aperture auto mode? 2. Why isn't the brightness of the picture proportional to its absolute exposure time? When I increase the exposure time it "jumps" back to a darker picture despite having a higher exposure. It happens every 32 values.

user-c351d6 21 August, 2018, 13:13:03

Ah, that sounds nice! 😃

papr 21 August, 2018, 13:15:20

@user-103621 1. You would have to change the source code for that. This information is not accessible for third-party plugins due to the process boundary.

user-c351d6 21 August, 2018, 13:16:46

@papr Will this be available within the next two weeks? We are getting close to our test...

papr 21 August, 2018, 13:22:13

@user-c351d6 If you run from source, yes. From bundle, maybe.

papr 21 August, 2018, 13:23:41

@user-103621 2. Within the allowed exposure time range the brightness should be linear as expected. It is possible that values bigger than the max values result in darker images than expected.

user-103621 21 August, 2018, 13:26:37

And what changes if you go over the exposure time range? Does it increase the gain, or?

papr 21 August, 2018, 13:27:41

@user-c828f5 This is the code that calculates the confidence: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/pupil_detectors/detect_2d.hpp#L570-L573
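
In essence, the linked code scores how many edge pixels support the fitted ellipse relative to the ellipse's size. A rough standalone analogue in Python, not the exact C++ formula (which normalizes the supporting-pixel count by the ellipse circumference rather than by the number of candidate points):

```python
import math

def ellipse_support_ratio(edge_points, center, axes, angle_deg, tol=1.0):
    """Fraction of candidate edge points lying within `tol` pixels of
    the fitted ellipse outline (center, full axis lengths, rotation)."""
    cx, cy = center
    a, b = axes[0] / 2.0, axes[1] / 2.0
    t = math.radians(angle_deg)
    supporting = 0
    for x, y in edge_points:
        # rotate the point into the ellipse-aligned frame
        dx, dy = x - cx, y - cy
        u = dx * math.cos(t) + dy * math.sin(t)
        v = -dx * math.sin(t) + dy * math.cos(t)
        # the implicit ellipse equation evaluates to 1.0 on the outline
        val = (u / a) ** 2 + (v / b) ** 2
        # approximate pixel distance from the outline
        dist = abs(math.sqrt(val) - 1.0) * min(a, b)
        supporting += dist <= tol
    return supporting / len(edge_points) if edge_points else 0.0
```

A well-supported fit (most edge points on the outline) scores near 1.0; stray points pull the score down.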

papr 21 August, 2018, 13:29:53

@user-103621 Exposure time has a defined range within the firmware that is based on the selected frame rate. The behavior outside of that range is not defined.

user-c351d6 21 August, 2018, 13:30:38

@papr How big is the effort to run it from source?

papr 21 August, 2018, 13:31:49

Depends on the OS and your experience with the terminal. Linux/Mac: easy. Windows: difficult/painful. You will also need administrator rights.

user-c351d6 21 August, 2018, 13:38:55

Hmm, I got a Mac and some experience but little time. 😉 I would prefer the deployed version 😉

papr 21 August, 2018, 13:40:21

So the plan is to release a bug fix release, including the caching feature, soon. I just cannot promise you that it will be released within the next two weeks.

papr 21 August, 2018, 13:41:33

In your case you will have to weigh time spent on source installation against time spent on recalculating fixations. 😉

user-c351d6 21 August, 2018, 13:42:59

Well, as I plan to write a Vuzix plugin soon, I will be forced to install from source anyway, I guess.

papr 21 August, 2018, 13:49:37

@user-c351d6 Actually, you do not have to run from source to run/develop third-party plugins. Simply put your custom plugins in pupil_player_settings/plugins and Player will load them automatically.
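
A minimal user plugin could look like the sketch below. The Plugin base class and the recent_events hook are part of Pupil's plugin API; the "gaze_positions" events key and the Gaze_Counter class itself are illustrative assumptions, so check them against the plugins bundled with your version. The import fallback exists only so the sketch can run outside Pupil:

```python
try:
    from plugin import Plugin  # provided by Pupil Capture/Player at runtime
except ImportError:
    class Plugin:  # stand-in so the sketch can run outside Pupil
        def __init__(self, g_pool):
            self.g_pool = g_pool

class Gaze_Counter(Plugin):
    """Hypothetical plugin: counts gaze data passing through recent_events."""

    def __init__(self, g_pool):
        super().__init__(g_pool)
        self.gaze_count = 0

    def recent_events(self, events):
        # `events` holds the current frame's data keyed by type; verify
        # the exact key name against the bundled plugins for your version
        self.gaze_count += len(events.get("gaze_positions", []))
```

Dropped into pupil_player_settings/plugins, a class like this shows up in the Plugin Manager under its class name.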

user-c351d6 21 August, 2018, 13:50:53

Ah, I haven't dived into this. I will check the documentation when time comes.

user-103621 21 August, 2018, 14:01:39

For reading the exposure time while using aperture mode: do I see it correctly that this would mean adding support for reading the exposure time to pyuvc via libuvc?

user-103621 21 August, 2018, 14:02:11

And Thanks for the infos !

papr 21 August, 2018, 14:05:25

@user-103621 I am not very sure about that. Are we talking about the world camera, the 120Hz eye cameras, or the 200Hz eye cameras?

user-78dc8f 21 August, 2018, 15:06:57

papr#8338, wrp#1848, mpk#4851: Hi folks. We are running v 0.32 of pupil mobile. When we connect to the mobile using pupil capture on our laptop, it will allow us to connect to the head camera, but it won't show the image (we just get a white screen). We have another phone running v 0.23 of pupil mobile, and this connects just fine. Is there any way to fix the new version quickly or revert to the old version? Note that these issues persist in multiple versions of the pupil capture, so that doesn

papr 21 August, 2018, 15:09:55

@user-78dc8f For clarification: These are two different phones? Are all video streams (world + eye) white when streaming with the phone running v0.32? What does the preview on the phone show?

user-78dc8f 21 August, 2018, 15:11:22

@papr: yes...different phones.

user-78dc8f 21 August, 2018, 15:12:02

@papr....one second on the other queries.

papr 21 August, 2018, 15:12:55

Important: Does the sensor overview show the cameras as idle or as streaming?

user-103621 21 August, 2018, 15:14:20

im talking about the world camera (highspeed model)

user-78dc8f 21 August, 2018, 15:16:38

@papr preview on the phone looked fine. Regarding video streams on laptop, the world cam was white (eye cam was grey...as if nothing had been selected). We'll double-check now on 'idle' vs. streaming...

papr 21 August, 2018, 15:17:06

@user-78dc8f I am able to reproduce the issue on v0.32. Looks like the streaming does not start as it should.

user-78dc8f 21 August, 2018, 15:24:05

@papr Can confirm, all statuses on the phone are 'idle'. Screen is white on world cam, and remains grey on the eye camera.

user-78dc8f 21 August, 2018, 15:27:59

@papr Is there any possibility of being able to revert the version on the phone in question? As we require it for sessions tomorrow.

papr 21 August, 2018, 15:28:39

Yes, I will try that.

user-78dc8f 21 August, 2018, 15:51:15

@papr I need to log off for a bit. Is there a way I can get an update on pupil mobile later this evening? (I'm not too discord saavy...)

papr 21 August, 2018, 15:52:18

Sure, please write an email to ppr@pupil-labs.com such that I can keep you up-to-date on the issue.

user-78dc8f 21 August, 2018, 15:52:43

@papr Awesome. Thanks.

user-e24be9 21 August, 2018, 20:33:10

Hello,

I'm using Pupil for marketing research and I need to get the following data: • Number of fixations • Total fixation time • Time until the first fixation.

Can you tell me how to get that data please?

papr 21 August, 2018, 21:25:54

Hey @user-e24be9 Use the offline fixation detector to detect and export fixations to csv. You can calculate all the required data based on the exported data.
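
Assuming the exported fixations.csv contains start_timestamp and duration columns with duration in milliseconds (verify against your export's header), the three metrics can be computed with a short script:

```python
import csv

def fixation_metrics(fixations_csv_path, recording_start=0.0):
    """Return (number of fixations, total fixation time in seconds,
    time to first fixation in seconds) from an exported fixations.csv.
    `recording_start` is the pupil timestamp of the recording start."""
    with open(fixations_csv_path) as f:
        rows = list(csv.DictReader(f))
    n = len(rows)
    total_s = sum(float(r["duration"]) for r in rows) / 1000.0  # ms -> s
    first_s = min(float(r["start_timestamp"]) for r in rows) - recording_start if rows else None
    return n, total_s, first_s
```

Per-AOI versions of these metrics would filter the rows by surface/gaze position first.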

user-58d5ae 22 August, 2018, 09:37:30

Hey everyone, did anyone manage to evaluate stress levels using pupil size data ?

papr 22 August, 2018, 09:40:05

Hey @user-58d5ae I do not know from the top of my head but you can have a look at our list of papers that cite Pupil: https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?ts=576a3b27#gid=0

user-ad8e2d 22 August, 2018, 11:24:58

Hey, I have a question about the gaze data that is being published by Pupil Capture. One computer only receives the topics 'gaze.3d.0.' and 'gaze.3d.1.' while another computer has 'gaze.3d.01.' as well as 'gaze.3d.0.' and 'gaze.3d.1.'. Just wondering why this could happen and how to get the other computer to have the same output. They both have the latest versions; we are both running Windows. Thanks, Jack

user-a04957 22 August, 2018, 12:14:06

Hello everybody,

I have a question concering the Manual Marker Calibration Method.

Is it good practice to vary in distance with the markers? Or shall one calibrate by always staying in the same 2D-Plane?

I would like to cover a circular area with the user in the center of the circle... so a fixed distance is not the best option for me.

Thank you very much!

user-29e10a 22 August, 2018, 12:24:30

@papr Hi, again regarding the video recording issue: sometimes the mp4 files are not readable (even with VLC) - I understand that is due to corrupted headers (although I do not understand why it is happening so often). What I missed until today is that the eye*.timestamps file for corrupted videos is missing! This is very bad, because a) in our pipeline we rely on the possibility to look at the eye videos at the correct timestamp, and b) we redetect the pupils with different pupil detectors - if the timestamp file is missing, the frame-to-time mapping is lost. Strangely, it did happen that one eye video was readable (with timestamp file) and the other not, with missing timestamps. First, I hope you can bugfix this for the next release, and secondly: is there a possibility to repair old recordings? We use Windows, 1.8-26. The issue is apparent with both codecs (big and small file); the world video seems to be always fine.

papr 22 August, 2018, 12:26:57

@user-29e10a Please remind me, are these videos recorded using Pupil Capture or Pupil Mobile?

user-29e10a 22 August, 2018, 12:35:50

@papr Capture

papr 22 August, 2018, 12:40:07

Is there an open github issue for this already? Sorry that I do not know that. There were a lot of bug reports lately :/

papr 22 August, 2018, 12:41:00

Also: Timestamps are missing and video files are broken if the recording process crashes. The question is whether a Python exception is at fault or if it is a crash caused by one of the underlying C libs...

user-29e10a 22 August, 2018, 12:41:03

@papr not yet, I will raise an issue there right now

user-29e10a 22 August, 2018, 12:41:33

@papr if I recall correctly there is nothing in the logs that points in a direction

papr 22 August, 2018, 12:41:57

Thank you! Try to provide the eye log files. If they do not contain python tracebacks then underlying c libs are at fault.

user-29e10a 22 August, 2018, 12:42:15

ok, I will provide them with the issue

user-29e10a 22 August, 2018, 12:44:38

@papr is there a possibility to recalculate the timestamps? I can repair the video files... and I have the pupil-timestamps (so I can count the frames) ... is there a way to tell when the recording started exactly?

papr 22 August, 2018, 12:45:37

@user-ad8e2d The last part of the topic (.0./.1./.01.) indicates which pupil data the gaze point is based on, and therefore whether the gaze datum is monocular or binocular. Low-confidence pupil data will be mapped monocularly. If you do not see any binocular gaze on one pc, check the gaze confidence. Also: the HMD calibration only maps monocularly.
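
Based on that naming scheme, a subscriber can classify an incoming gaze topic before unpacking its payload. A small helper (topic strings as described above):

```python
def gaze_topic_kind(topic):
    """Classify a Pupil gaze topic by its trailing pupil-data index:
    'gaze.3d.01.' is binocular; 'gaze.3d.0.' / 'gaze.3d.1.' are monocular."""
    suffix = topic.rstrip(".").rsplit(".", 1)[-1]
    return "binocular" if suffix == "01" else "monocular"
```

Filtering on this before deserializing avoids paying the msgpack-decoding cost for data you do not need.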

papr 22 August, 2018, 12:47:03

@user-29e10a I do not see a way to restore the timestamps. One could interpolate based on the other timestamp files but this might result in desynced eye videos...

user-ad8e2d 22 August, 2018, 12:54:15

@papr Thank you very much for explaining that.

user-ad8e2d 22 August, 2018, 13:51:17

@papr Do you know how I could only subscribe to the world frame from the frame publisher. Currently when I subscribe to 'frame' I get 'frame' - the world frame, 'frame.eye.0' and 'frame.eye.1'. Is there a way I can get a 'frame.world' similar to the frame publisher example pupil labs provided on the docs. When I use the frame publisher 'frame' is the 'frame.world' as there is no 'frame.world' published. I am using pupil capture, on windows. Thanks, Jack

papr 22 August, 2018, 13:52:45

@user-ad8e2d Is this your issue: https://github.com/pupil-labs/pupil-helpers/issues/28 ?

This is actually a bug in Capture and I have just finished a PR that fixes the issue: https://github.com/pupil-labs/pupil/pull/1276

user-ad8e2d 22 August, 2018, 13:53:25

@papr yea it is

user-ad8e2d 22 August, 2018, 13:53:52

I have another couple of things I have noticed that don't match the current doc

user-ad8e2d 22 August, 2018, 13:55:12

They are minor things like spelling in the json package received.

papr 22 August, 2018, 13:56:19

What do you mean exactly?

papr 22 August, 2018, 13:56:34

You should be receiving msgpack and not json, btw πŸ˜‰

user-ad8e2d 22 August, 2018, 13:57:14

Yea that's what I meant, sorry, I unpack it to json. 😃

user-ad8e2d 22 August, 2018, 13:57:23

I'll find an example now

papr 22 August, 2018, 13:59:15

You can use this fix until we release the fixed bundle: https://gist.github.com/papr/59f9b2eba22fa8cc4306d67730f089a3

Put the file in pupil_capture_settings\plugins\ and activate Frame_Publisher_Fixed from the Plugin Manager. Afterwards you should be able to receive world frames only by subscribing to frame.world

user-ad8e2d 22 August, 2018, 14:03:10

When I get gaze data I used this doc: https://github.com/pupil-labs/pupil-docs/blob/master/user-docs/data-format.md#pupil-player but I receive "gaze_normals_3d" and "eye_centers_3d", whereas the doc says "gaze_normal_3d" and "eye_center_3d"

user-ad8e2d 22 August, 2018, 14:03:17

@papr

user-ad8e2d 22 August, 2018, 14:03:32

Thanks for the fix 😃

papr 22 August, 2018, 14:05:49

@user-ad8e2d these are not spelling mistakes! These are binocular gaze sample fields.

papr 22 August, 2018, 14:07:46

Check the topic, it should end in .01.. See the description above

user-ad8e2d 22 August, 2018, 14:09:06

oh ok thanks, my bad! That makes more sense.

user-8779ef 22 August, 2018, 15:13:37

@papr @wrp @mpk Hey guys, just an update. We've finally finished the design of our polycarbonate hot mirror insert into the vive, and ordered 5 pairs from a local optics manufacturer. We are working on the camera mount (have preliminary designs in solid works). There will still be work to do on the software compensation for distortions from the fresnel.

user-8779ef 22 August, 2018, 15:14:10

I'm sure we're still at least a few months out, but we progress!

papr 22 August, 2018, 16:31:42

Cool! Thanks for the update @user-8779ef

papr 22 August, 2018, 16:32:37

@user-a04957 Which type of detection/mapping are you using? 2d or 3d?

user-3f0708 22 August, 2018, 16:41:07

Good afternoon. Is it possible to get the calibration process data through a .csv file?

papr 22 August, 2018, 16:42:39

@user-3f0708 What do you refer to by calibration process data? Do you mean the reference marker and pupil data that is collected during calibration?

user-3f0708 22 August, 2018, 16:43:29

Yes @papr

papr 22 August, 2018, 16:45:55

Well, there is no plugin that exports this type of data yet... But the data is definitely stored in the notifications.pldata file. You can write a Python script to export that data.

user-3f0708 22 August, 2018, 16:47:45

@papr Where is this notifications.pldata file saved on Pupil's platform?

papr 22 August, 2018, 16:49:11

This file is part of every recording. (starting with v1.8, previously the data was part of the pupil_data file)
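
Once the notifications have been deserialized into plain dicts (e.g. with file_methods.load_pldata_file from the Pupil source), exporting the calibration-related ones to CSV is a few lines. A sketch; the "subject" key and the calibration.* subject prefix are how notifications are labeled, and export_calibration_notifications is a hypothetical helper name:

```python
import csv

def export_calibration_notifications(notifications, out_csv):
    """Write calibration-related notification dicts to a CSV file.
    `notifications` is an iterable of plain dicts as deserialized from
    notifications.pldata; returns the number of rows written.
    Nested values (lists/dicts) are written via their str() form."""
    rows = [n for n in notifications if n.get("subject", "").startswith("calibration")]
    fields = sorted({k for r in rows for k in r})
    with open(out_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerows({k: r.get(k, "") for k in fields} for r in rows)
    return len(rows)
```

The same pattern works for any other notification subject you want to extract.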

user-3f0708 22 August, 2018, 16:50:55

Thank you for the informations @papr

user-24fdfb 22 August, 2018, 23:56:00

how do you configure the pupil software?

user-24fdfb 22 August, 2018, 23:57:04

I can record video with it but the mp4 files show mostly white and very light gray

user-24fdfb 22 August, 2018, 23:57:12

hard to see any detail of the eys

user-24fdfb 22 August, 2018, 23:57:14

eyes*

wrp 22 August, 2018, 23:58:52

Hi @user-24fdfb was the eye visible in Pupil Capture when you were recording?

wrp 22 August, 2018, 23:59:27

@user-24fdfb did you try loading the recording in Pupil Player and using the Visualize Eye Video Overlay plugin?

user-24fdfb 23 August, 2018, 00:05:10

I will give that a shot, thanks

user-24fdfb 23 August, 2018, 00:05:36

this discord group is very helpful. really saves lots of time to talk with real people here

wrp 23 August, 2018, 00:06:28

Thanks for the feedback @user-24fdfb 😸

user-24fdfb 23 August, 2018, 00:19:43

woohoo I can see my eye

user-24fdfb 23 August, 2018, 00:19:53

I needed to adjust some settings on the camera

user-24fdfb 23 August, 2018, 00:20:23

It was settings in the pupil capture software

user-24270f 23 August, 2018, 01:25:10

is there a way to calibrate without a world camera and without it being in VR? like, look at a screen marker at a known distance, or anything at all??

essentially i have people wearing augmented reality glasses. i want to know what icons they are looking at...but i dont have access to a world camera due to the system.

user-3f0708 23 August, 2018, 13:13:54

Does the open source platform of the pupil labs accept any Logitech camera model?

papr 23 August, 2018, 13:18:06

This model is officially supported: https://www.logitech.com/en-us/product/c930e-webcam

papr 23 August, 2018, 15:04:58

@user-29e10a I just realised that you can use the start_eye_capture notification to set uvc settings for the eye cameras, even if they are running already: https://github.com/pupil-labs/pupil/blob/master/pupil_src/launchables/eye.py#L517-L518
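For context, notifications like this can be sent from any script via Pupil Remote (REQ/REP on port 50020 by default): the topic is `notify.` plus the notification's `subject`, and the payload is a msgpack-encoded dict. A minimal sketch, assuming pyzmq and msgpack are installed; the `eye_id` and `overwrite_cap_settings` payload fields are illustrative assumptions, not a documented schema:

```python
import time

def make_notification(subject, **kwargs):
    """Build a Pupil notification payload.

    The 'subject' key is required by Pupil's IPC backbone; any extra
    keyword arguments become payload fields.
    """
    payload = {"subject": subject, "timestamp": time.time()}
    payload.update(kwargs)
    return payload

def send_notification(payload, address="tcp://127.0.0.1:50020"):
    """Send a notification through Pupil Remote (REQ/REP socket)."""
    import zmq        # third-party: pyzmq
    import msgpack    # third-party: msgpack
    ctx = zmq.Context.instance()
    remote = ctx.socket(zmq.REQ)
    remote.connect(address)
    remote.send_string("notify." + payload["subject"], flags=zmq.SNDMORE)
    remote.send(msgpack.packb(payload, use_bin_type=True))
    return remote.recv_string()  # Pupil Remote acknowledges each notification

if __name__ == "__main__":
    # 'start_eye_capture' is the subject named above; the extra
    # fields here are hypothetical, for illustration only.
    note = make_notification(
        "start_eye_capture",
        eye_id=0,
        overwrite_cap_settings=("Gamma", 100),
    )
    print(send_notification(note))
```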

user-8779ef 23 August, 2018, 17:28:37

@papr Hey pablo, my student @user-c828f5 has been struggling with something for a while (I believe you've been helping him). Would you be surprised to see a notebook in which we show that the angular distance between the left and right eye approximated using 2D POR data (and a pixel-to-degree conversion) is very different compared to an angular distance calculation from the same track, but using 3D gaze normals?

user-29e10a 23 August, 2018, 17:57:26

@papr thanks a lot, i will try this!

papr 23 August, 2018, 19:55:51

@user-8779ef what do you mean by por data?

papr 23 August, 2018, 20:05:23

@user-8779ef also, I would like to see that notebook 🙂

user-53afc0 23 August, 2018, 22:17:21

Hi all, I came across two questions when using Pupil: (1) the video size is too large. I plan to record about one hour of data. Is there any other way to decrease the video size besides lowering the video resolution? (e.g., a customized file format?) (2) I checked the exported csv files. I am still confused about the timestamps in both files even after I read the descriptions. What I got is something from 2200 to 2300 [the whole recording is about 2 min]. What do these numbers mean? I am wondering if there is any way to get the global time, I mean something like 2018-08-23, 22:45... Thanks for your help!!

user-e711e0 24 August, 2018, 02:10:27

I want to get gaze normx and normy dynamically, so that I can control a gimbal camera as I move my eyeball. What Python code am I looking for?

user-c351d6 24 August, 2018, 13:17:00

Hi guys, I just have a question about the surface tracking. We are tracking a couple of surfaces in a 20 Minutes video. After filling the marker cache (which takes around 12 minutes), pupil player freezes for a long time (and sometimes endless). Is this a known behaviour?

user-c351d6 24 August, 2018, 13:18:08

And it doesn't save the marker cache in this cases.

user-c351d6 24 August, 2018, 13:19:03

I just measured it, it freezes for about 13 minutes

user-8779ef 24 August, 2018, 13:26:37

@papr POR: gaze locations within normalized screen space. I can't remember the name of the variable.

user-8779ef 24 August, 2018, 13:27:19

probably norm_pos_x / norm_pos_y.

user-8779ef 24 August, 2018, 13:29:21

Thanks for the reply. So, he's going to put the notebook together. Either 1) he'll find that he's made some kind of mistake in his calculations, or 2) he'll demonstrate what he's been telling me for a while: that the angular distance between the left and right eye is very different if you use norm_pos_x/norm_pos_y vs. gaze_normal0/gaze_normal1

user-24fdfb 25 August, 2018, 22:46:59

I want to relay blink and eye movement events to a website. I have experience programming with Python. Is modifying the pupil capture code the best way to do this or is there an API I should study?

user-c351d6 27 August, 2018, 12:35:19

Hi guys, I just have a question about the surface tracking. We are tracking a couple of surfaces in a 20 Minutes video. After filling the marker cache (which takes around 12 minutes), pupil player freezes for a long time (and sometimes endless). Is this a known behaviour? And it doesn't save the marker cache in this cases. I just measured it, it freezes for about 13 minutes.

mpk 27 August, 2018, 13:32:33

@user-c351d6 this is the gaze being mapped onto each surface. We need to refactor to make this work without blocking.

mpk 27 August, 2018, 13:36:40

I have made an issue for this: https://github.com/pupil-labs/pupil/issues/1280

user-a08c80 28 August, 2018, 00:37:43

Hi all. We are a medical research group interested in gaze patterns of surgeons using the da vinci surgical robot. We are uncertain of how to lay a gaze pattern over an outside camera being used as a scene cam, in this case the da vinci robot's surgical camera. Is this possible with the current software available?

user-f5ff51 28 August, 2018, 09:32:23

Hey guys, I was wondering if it's possible to buy and use one pupil cam (200 Hz eye camera) just to check whether it meets my needs and how it performs before buying the actual headset? I can save up 650 EUR but might not afford the whole headset just yet.

user-f5ff51 28 August, 2018, 09:32:50

I can build a cheap and dirty holder for it until I save up for the headset+second camera upgrade

papr 28 August, 2018, 09:33:32

@user-24fdfb I would recommend to use our zmq network API. See this example python script: https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_messages.py
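In the spirit of that filter_messages.py example: request the SUB port from Pupil Remote, subscribe to the topic prefixes you care about (`blinks` requires the Blink Detection plugin to be enabled in Capture), and decode each msgpack payload. A sketch assuming pyzmq and msgpack are installed; forwarding the events to a website (e.g. over a websocket) is left out:

```python
def matches_any(topic, prefixes):
    """Helper: does a message topic start with one of the given prefixes?"""
    return any(topic.startswith(p) for p in prefixes)

def subscribe(prefixes, host="127.0.0.1", remote_port=50020):
    """Yield (topic, payload) pairs from Pupil Capture's event stream."""
    import zmq        # third-party: pyzmq
    import msgpack    # third-party: msgpack
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://{}:{}".format(host, remote_port))
    req.send_string("SUB_PORT")       # ask Pupil Remote for the PUB/SUB port
    sub_port = req.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.connect("tcp://{}:{}".format(host, sub_port))
    for p in prefixes:
        sub.setsockopt_string(zmq.SUBSCRIBE, p)
    while True:
        topic, payload = sub.recv_multipart()
        yield topic.decode(), msgpack.unpackb(payload, raw=False)

if __name__ == "__main__":
    for topic, event in subscribe(("blinks", "gaze")):
        # Forward 'event' to your website here, e.g. via a websocket.
        print(topic, event.get("timestamp"))
```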

papr 28 August, 2018, 09:38:44

@user-f5ff51 Hey, yes, you can buy a single eye camera from the store but it does not include the necessary cabling to connect it to a computer.

user-f5ff51 28 August, 2018, 09:43:34

what cabling?

papr 28 August, 2018, 09:45:54

@user-a08c80 Am I correct in assuming that the surgeon's head movement is independent of the robot's scene camera? Our software assumes a fixed physical relationship between eye cameras and scene camera.

papr 28 August, 2018, 09:48:01

@user-e711e0 Check out https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_messages.py

user-f5ff51 28 August, 2018, 10:04:01

@papr what cabling?

papr 28 August, 2018, 10:20:18

@user-f5ff51 the headset includes internal cabling that connects the eye cameras to the USB clip. The eye camera listed on the website does not include the cabling nor the USB clip

user-f5ff51 28 August, 2018, 10:30:28

@papr If it's just the standard 4 USB pins I can get/make my own and use that for now. Is that the only issue?

papr 28 August, 2018, 11:43:58

@user-f5ff51 The eye cameras have a female JST connector. We can sell you a JST-to-USB cable as well. Please write an email to sales@pupil-labs.com if you are interested in that.

user-f5ff51 28 August, 2018, 12:20:08

@papr That's okay, I'll solder one up myself. Just let me know if that single camera will connect with the Pupil software and SDK by itself so I can evaluate it before I buy the whole system

papr 28 August, 2018, 12:21:04

The software should work perfectly fine with a single eye camera.

papr 28 August, 2018, 12:21:59

@user-f5ff51 What exactly do you need to know/test, btw?

user-a08c80 28 August, 2018, 12:23:07

@papr That's correct. We have developed an approach where the user's head is essentially stationary once comfortable and calibrated in the console viewfinder (we feel it is somewhat analogous to the Oculus Rift). Thank you!

user-f5ff51 28 August, 2018, 12:26:18

@papr accuracy, precision, latency, that sort of stuff

papr 28 August, 2018, 12:26:45

@user-a08c80 You are right! I would handle it like a VR scenario, with the difference that you can show the calibration markers to the scene camera. The only technical problem is getting the scene camera working in Capture.

papr 28 August, 2018, 12:27:26

@user-a08c80 Another question is whether the subject is comfortable in the console viewfinder even when wearing the headset.

papr 28 August, 2018, 12:28:12

@user-f5ff51 Be aware that you cannot test gaze mapping (+accuracy, precision) without a world/scene camera. You should be able to use any UVC camera for this purpose though

user-f5ff51 28 August, 2018, 12:31:34

okay, thanks!

user-a08c80 28 August, 2018, 14:01:49

@papr We have taken steps to make sure the setup is comfortable (enough... still improving 😃). Ok, excellent. Thank you.

user-2686f2 29 August, 2018, 01:11:07

Exporting a recording in Pupil Player yields a .csv file with a column each for an x and y position for my gaze. I've got the 120 Hz pupil cameras, so there are tens of sample positions for every World Camera frame. When I play back the exported data in a program I've written to visualize the gaze path, it's similar to the green dot visualization I see when I play back in Pupil Player, but the output in my program is sampling each point individually and this results in a very jerky and erratic playback. It seems like Pupil Player is using some smoothing/averaging of the data to create the green dot. Can you tell me what kind of smoothing is happening, or point to where I could find it in the source code?

papr 29 August, 2018, 07:36:48

@user-2686f2 This is the visualization for the dot https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/vis_circle.py#L39-L54

papr 29 August, 2018, 07:37:49

As you can see there is no smoothing. We filter for a minimal confidence though!
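To replicate that behaviour on exported data: drop low-confidence samples as Player's visualization does, and if a steadier path is wanted, add your own smoothing, e.g. a trailing moving average. A sketch; the 0.6 threshold and window size are assumptions to tune for your data:

```python
def filter_and_smooth(samples, min_confidence=0.6, window=5):
    """samples: (norm_pos_x, norm_pos_y, confidence) tuples, e.g. rows
    read from an exported gaze_positions.csv. Returns smoothed points."""
    # Step 1: drop low-confidence samples, as the Player visualization does.
    kept = [(x, y) for x, y, c in samples if c >= min_confidence]
    # Step 2: trailing moving average over the last `window` kept samples
    # (this step is our own addition; Player itself does no smoothing).
    smoothed = []
    for i in range(len(kept)):
        lo = max(0, i - window + 1)
        xs = [p[0] for p in kept[lo:i + 1]]
        ys = [p[1] for p in kept[lo:i + 1]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed
```

A larger window gives a calmer dot at the cost of lagging behind fast eye movements, which is why a fixed choice cannot suit every recording.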

user-a3b3bd 29 August, 2018, 08:10:02

Hello everyone. Is it possible to tell the pupil cam to stream in Uncompressed mode rather than MJPEG? And is it possible to stream grayscale frames instead of RGB? (If that will require to use USB3 connection, that is not an issue for me). Thank you.

papr 29 August, 2018, 08:11:45

I am not 100% sure about that but @mpk should be able to answer this.

user-a3b3bd 29 August, 2018, 08:12:25

@mpk Is it possible to tell the pupil cam to stream in Uncompressed mode rather than MJPEG? And is it possible to stream grayscale frames instead of RGB? (If that will require to use USB3 connection, that is not an issue for me).

user-2686f2 29 August, 2018, 08:12:53

@papr Thanks!

user-2686f2 29 August, 2018, 08:15:28

I also had a question about calibrating via the manual marker (the target shape). The video provided shows someone holding the target several feet away from the person wearing the Pupil glasses. I noticed a slider for "marker size" in the Pupil Capture software. Can you make smaller target printouts and calibrate with them closer to your face? If so, does this increase the accuracy/effect of the calibration in any way?

papr 29 August, 2018, 08:26:41

@user-2686f2 yes, you can print them smaller. Make sure that the markers are detected during the calibration. You should hear a tick sound when a marker is recognized.

In regards to the calibration: always calibrate at the distance that your subject will mainly look at. Also make sure that you cover most of the subject's field of view.

user-2686f2 29 August, 2018, 08:29:09

@papr I've been using the new experimental calibration where you keep your eyes on the target and move your head around. I've been getting better accuracy using that method. I've always been calibrating at the distance my subject mainly looks at, I was just wondering if a smaller target could improve the accuracy (calibration usually occurs at about 2 feet).

papr 29 August, 2018, 08:40:45

Thank you for your feedback! Cool to see that the single marker calibration is being used. I do not think that a smaller marker would increase accuracy.

user-8779ef 29 August, 2018, 18:36:14

Can someone please remind me of the email address to contact if I have a hardware issue and need a repair/replacement?

user-8779ef 29 August, 2018, 18:41:30

found it : [email removed] Thanks!

user-b70259 30 August, 2018, 10:01:01

Hello. I want to get the angle of eye rotation. I'm a beginner in Python.

user-b70259 30 August, 2018, 10:04:35

I'm working on Ubuntu 18.04 and followed the instructions for Linux. I tried to run main.py but I can't see any window. How can I check that my setup is correct?

user-b70259 30 August, 2018, 10:14:40

MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
world - [INFO] launchables.world: Application Version: 1.8.33
world - [INFO] launchables.world: System Info: User: xxx, Platform: Linux, Machine: xxx, Release: 4.15.0-33-generic, Version: #36-Ubuntu SMP Wed Aug 15 16:00:05 UTC 2018
world - [INFO] pupil_detectors.build: Building extension modules...
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for C/ObjC but not for C++
cc1plus: warning: command line option '-Wstrict-prototypes' is valid for C/ObjC but not for C++

user-b70259 30 August, 2018, 10:14:54

The same warning is displayed continuously.

papr 30 August, 2018, 11:31:06

@user-b70259 Don't worry, this warning is normal. Just wait a bit while the pupil detectors are being built. Once they are built, the window will appear.

papr 30 August, 2018, 11:31:56

Also: You do not need to run from source in order to access eye rotation. You can access it via the network interface as well. https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_messages.py

user-40bf4b 30 August, 2018, 11:36:43

Hey, my Pupil software time does not match the actual clock time. So when it records the time of an event, we have an offset, which we want to determine or correct for. Can anyone help?

papr 30 August, 2018, 11:38:20

By actual clock, do you mean your system time?

papr 30 August, 2018, 11:38:40

Do you want to do it after the fact or before a recording even starts?

user-40bf4b 30 August, 2018, 11:41:54

yes I meant the system time, thanks. I need to record the pupil data with respect to system time; it's easier that way when doing the analysis with some external software, for instance Matlab.

user-40bf4b 30 August, 2018, 11:42:44

Is there a way to do this? If not, what would be the next best thing?

papr 30 August, 2018, 11:49:23

Yes, this is possible. Enable the Time Sync plugin and run this script: https://github.com/pupil-labs/pupil-helpers/blob/master/network_time_sync/pupil_time_sync_master.py

user-40bf4b 30 August, 2018, 11:57:02

Where should the code be run from? The same directory as capture.exe? Does it need to be run before each recording?

papr 30 August, 2018, 11:58:26

It should run on the same computer that runs the external software that you want to sync to.

papr 30 August, 2018, 11:58:39

This needs to run in parallel to Capture, yes.

user-40bf4b 30 August, 2018, 12:20:46

I tried running the code, it says can not find 'Clock_Sync_Master' module. I could not find such a module on the project, am I missing something?

papr 30 August, 2018, 12:25:41

Ah, yes you will need this file from the same repository: https://github.com/pupil-labs/pupil-helpers/blob/master/network_time_sync/network_time_sync.py

Put it next to the first file.

papr 30 August, 2018, 12:26:22

You might need to install other requirements

user-40bf4b 30 August, 2018, 12:46:57

This is the error I get. The files are on the same level

Chat image

user-40bf4b 30 August, 2018, 12:47:51

I checked, uvc is installed. I am running Python 3.6, as required

papr 30 August, 2018, 12:50:22

Installing the requirements on Windows is very tricky. Maybe a small plugin that sets the Pupil clock on load would be better

user-40bf4b 30 August, 2018, 12:56:51

Can you maybe help me with this? I am not an expert on programming...

papr 30 August, 2018, 13:02:51

Sure, just to confirm: You need seconds in Unix timestamps?

papr 30 August, 2018, 13:09:06

aka whatever time.time() returns

user-40bf4b 30 August, 2018, 13:11:03

yes, we want a setup where the pupil recordings are in terms of UNIX timestamps of system time (if that makes sense)

papr 30 August, 2018, 13:11:15

Yes, I understand.

user-40bf4b 30 August, 2018, 13:11:35

thanks a lot, papr

papr 30 August, 2018, 13:37:10

@user-40bf4b https://gist.github.com/papr/87c4ab1f3b533510c4585fee6c8dd430

papr 30 August, 2018, 13:37:52

The plugin does not have any ui. You can make sure that it is running by checking the plugin manager.
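The gist's contents are not reproduced here. An alternative that avoids a plugin altogether, assuming it is acceptable to sync before the recording starts, is Pupil Remote's `T <timestamp>` command, which resets Pupil's clock to the supplied value; sending it the current Unix time makes subsequent recording timestamps Unix timestamps. A sketch assuming pyzmq and the default Pupil Remote port:

```python
import time

def build_set_time_command(t=None):
    """Format Pupil Remote's 'T <timestamp>' command, which resets
    the Pupil clock to t (here: Unix epoch seconds)."""
    if t is None:
        t = time.time()  # current Unix time
    return "T {}".format(t)

def set_pupil_time_to_unix(address="tcp://127.0.0.1:50020"):
    """Send the clock-reset command to Pupil Remote and return its reply."""
    import zmq  # third-party: pyzmq
    ctx = zmq.Context.instance()
    remote = ctx.socket(zmq.REQ)
    remote.connect(address)
    remote.send_string(build_set_time_command())
    return remote.recv_string()  # Pupil Remote acknowledges the command

if __name__ == "__main__":
    # Run once before starting a recording; resetting the clock
    # mid-recording would make its timestamps inconsistent.
    print(set_pupil_time_to_unix())
```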

user-23e10d 30 August, 2018, 13:46:59

Hey, it seems that the Frame Publisher plugin in pupil capture is broken in the latest release. I think this is fixed by https://github.com/pupil-labs/pupil/commit/b20f4c19dfceabe13798d9b92631fec49ce76dda, but a build with these changes hasn't been released yet. Is there any way to know when the next release will be? We are running on windows 7 so it's kind of a pain to build from source.

papr 30 August, 2018, 13:47:46

Unfortunately, there is no fixed timeline for the release yet.

papr 30 August, 2018, 13:48:51

You can place this fixed plugin in your pupil_capture_settings/plugins folder: https://gist.github.com/papr/59f9b2eba22fa8cc4306d67730f089a3

papr 30 August, 2018, 13:49:11

This way you do not need to run from source @user-23e10d

user-23e10d 30 August, 2018, 13:51:15

Will do. Thank you!

user-3a93aa 30 August, 2018, 20:03:47

Hi, I'm trying to use the offline fixation detector. I believe I have saved 3d pupil information (when I export the data, pupil_positions.csv has 3D info in it), but the fixation detector persists in using the "gaze" method. Any idea how I can troubleshoot what's going on?

user-3a93aa 30 August, 2018, 20:21:44

update: it looks like fixation_detector looks for a field in the pupil data called 'gaze_normal_3d' (line 161); my data has a field called 'gaze_normals_3d'

user-3a93aa 30 August, 2018, 20:33:39

in addition -- is there a way to batch offline fixation detection export?

user-a6b05f 31 August, 2018, 07:52:07

Hi everyone! Is it possible to make a Pupil Labs eye tracker communicate with the HoloLens through ZMQ? NetMQ doesn't seem to work on the HoloLens

mpk 31 August, 2018, 09:07:13

@user-a6b05f we did not get it to work but maybe it is possible now. We have a special plugin in Pupil to talk to the HoloLens.

user-a3b3bd 31 August, 2018, 09:21:42

Hi mpk

papr 31 August, 2018, 09:29:13

@user-3a93aa hey, this is a very subtle bug. Please create a github issue for it. Currently, the batch export is disabled.

user-b70259 31 August, 2018, 09:47:04

@papr After the build finished, the window appeared. But something is wrong and I got the error "video_capture.uvc_backend: Init failed."

user-b70259 31 August, 2018, 09:47:15

Then, I tried filter_messages.py with Pupil Service and it worked, so maybe I can manage it. Thank you for answering such an easy question.

user-a3b3bd 31 August, 2018, 09:52:43

Guys, I've paid real money here, can I please get an answer to my question?

mpk 31 August, 2018, 10:29:10

@user-a3b3bd I just saw your question. You can stream uncompressed but not grayscale. Have a look at the available uvc streams.

papr 31 August, 2018, 10:35:49

Hey @user-a3b3bd This is a community based channel. As you can see there are a lot of questions and it happens that some of them are overlooked by mistake. A simple reminder would have been enough. I would appreciate it if you adjusted your attitude accordingly.

papr 31 August, 2018, 10:36:59

@user-b70259 What hardware are you using? The headset or one of the addons?

user-b70259 31 August, 2018, 10:53:46

@papr I use headset. It maybe bought 3 years ago so camera module is old one. I don't know that could be related.

papr 31 August, 2018, 12:10:45

@user-b70259 do you have a binocular or a monocular headset?

user-3625f2 31 August, 2018, 12:21:24

@mpk Thank you. Is uncompressed better for reduced latency?

papr 31 August, 2018, 12:23:41

@user-3625f2 I don't think that there is enough USB bandwidth to run uncompressed with high frame rates.

user-3625f2 31 August, 2018, 12:25:22

If it is 24 bits per pixel @ 200x200 pixels and 200 Hz, then each second of pure pixel data needs only 24 MByte of bandwidth. Should be enough? That said, if latency is unaffected I don't care.
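The arithmetic here checks out, and grayscale would indeed cut the figure to a third:

```python
def raw_bandwidth(width, height, bytes_per_pixel, fps):
    """Bytes per second required by an uncompressed video stream."""
    return width * height * bytes_per_pixel * fps

# One 200x200 px eye camera at 200 Hz:
rgb_eye = raw_bandwidth(200, 200, 3, 200)   # 24,000,000 B/s = 24 MB/s
gray_eye = raw_bandwidth(200, 200, 1, 200)  # 8,000,000 B/s = 8 MB/s
```

With two eye cameras plus a world camera sharing one USB 2.0 connection (roughly 35-40 MB/s of practical throughput), two uncompressed RGB eye streams alone already approach the limit, consistent with papr's caution about bandwidth.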

user-3625f2 31 August, 2018, 12:26:27

But I feel like any compression will take time and use buffers

papr 31 August, 2018, 12:30:58

I have never tried to measure the latency with uncompressed images myself. Keep in mind that you need the same amount of bandwidth for the second eye camera as well as additional bandwidth for the world camera. I am assuming that you are using a headset.

user-3625f2 31 August, 2018, 12:31:35

Do I really have to connect them to the same USB port?

user-3625f2 31 August, 2018, 12:31:50

By the way how do you measure latency yourself?

user-3625f2 31 August, 2018, 12:35:22

And is there any specific reason not to have grayscale capture to cut the bandwidth? I guess MJPEG compression would make it pointless, but for uncompressed streams it would cut it by 3x.

papr 31 August, 2018, 12:35:30

libuvc provides the start-of-exposure timestamp for each frame. We define latency as the difference between this timestamp and the time that we receive the frame in our consuming application.

user-3625f2 31 August, 2018, 12:36:48

Oh, so you can get a good latency measurement purely in code? That's nice, I have this high-speed camera and LED setup. I guess I didn't really need it here...

papr 31 August, 2018, 12:37:28

What do you mean by "purely in code"?

papr 31 August, 2018, 12:37:54

Also, as mpk said, the cameras simply do not provide the grayscale streaming option.

papr 31 August, 2018, 12:39:07

And technically yes, you could disconnect the cameras from the clip and connect them one-by-one to different usb ports.

user-3625f2 31 August, 2018, 12:40:03

I mean by comparing timestamps in software.

user-3625f2 31 August, 2018, 12:41:12

I understand what mpk says, I'm just asking a general question: why tell the cameras to provide frames in RGB rather than grayscale in your firmware?

user-3625f2 31 August, 2018, 12:47:18

Oh, well. Was worth to ask.

user-3625f2 31 August, 2018, 12:47:48

By the way can I run pupil capture without starting any GUI, just from the command line, if I just want to dump eye position data?

papr 31 August, 2018, 12:49:47

This is not possible yet. You can write your own CLI though. Simply use libuvc+pyuvc+pupil_detectors. This would not include any gaze mapping though.
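A sketch of such a headless loop. The `uvc.Capture` usage follows pyuvc's examples; `Detector2D` is from the standalone pupil-detectors package (a later extraction of the same detector code, named here as an assumption, since at the time the detectors lived inside the pupil repository); the frame mode must match one your camera actually offers:

```python
def format_result(timestamp, result):
    """Render one pupil detection as a CSV line: time, confidence, ellipse center."""
    cx, cy = result["ellipse"]["center"]
    return "{:.4f},{:.2f},{:.1f},{:.1f}".format(
        timestamp, result["confidence"], cx, cy
    )

def main(n_frames=1000):
    import uvc                              # third-party: pyuvc
    from pupil_detectors import Detector2D  # third-party: pupil-detectors

    dev = uvc.device_list()[0]              # first UVC camera found
    cap = uvc.Capture(dev["uid"])
    cap.frame_mode = (192, 192, 120)        # (width, height, fps); camera-dependent
    detector = Detector2D()

    print("timestamp,confidence,center_x,center_y")
    for _ in range(n_frames):
        frame = cap.get_frame_robust()
        result = detector.detect(frame.gray)  # detect on the grayscale image
        print(format_result(frame.timestamp, result))

if __name__ == "__main__":
    main()
```

As papr notes, this dumps pupil data only; gaze data would additionally require a calibration against a scene camera.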

user-3625f2 31 August, 2018, 12:50:34

thank you. "This would not include any gaze mapping though." Why not? Is the code not open sourced for that yet?

papr 31 August, 2018, 12:55:14

Let me clear up the terms:
- pupil data: data relative to the eye camera
- gaze data: data relative to the scene camera
- gaze mapping: the process of mapping pupil data to the scene camera

Everything is open source, but the calibration procedure needs further logic to detect reference data. You can use libuvc+pyuvc to receive frames and use the pupil_detectors to receive pupil data. If you want gaze data, you will have to run a calibration first. There is no isolated code to run this in a script, though.

user-3625f2 31 August, 2018, 12:55:42

understood, thank you for your time

user-3a93aa 31 August, 2018, 17:19:05

Created issue: https://github.com/pupil-labs/pupil/issues/1286

End of August archive