core


mpk 01 February, 2018, 07:23:03

@user-8779ef what you are seeing is not periodic changes in confidence but the 4-5 confidence samples for that world video frame (the eye is most likely recorded at 90-200Hz) repeating in the confidence graph when you hit pause in Player. This is a bug. The confidence graph should not be updated during pause. @papr can you implement a fix for this?

mpk 01 February, 2018, 07:23:54

Also note that you are only seeing one video frame out of the 4-5 that match that world frame temporally.

papr 01 February, 2018, 12:29:21

Same goes for the recorded fps graph.

user-62cec9 01 February, 2018, 13:19:58

How feasible/simple would it be to detect head rotation/position based on the world cam and markers? I am interested in eye-in-head gaze analysis, as in, gaze shifts that consist of both head and eye movements.

papr 01 February, 2018, 13:20:42

@user-62cec9 Have a look at this work-in-progress pull request: https://github.com/pupil-labs/pupil/pull/872

user-62cec9 01 February, 2018, 13:22:13

ahh, thanks!

papr 01 February, 2018, 14:10:42

@user-8779ef @mpk I pushed the fix into master. We will replace these graphs with timelines in the long run. They make much more sense in Player. But the timelines are not precise enough without the zoom feature. Therefore we will discontinue the fps/confidence graphs in Player as soon as we introduce timeline zoom.

user-a4d924 01 February, 2018, 15:13:34

Hi everyone. I'm trying to extract fixations-on-surfaces from a large recording (7 GB). In Pupil Player, the orange bar showing the Marker Cache stops half-way through. When I export the data, the "fixations_on_surface" CSV file also ends at a timestamp at about half of the video length. This happens on a Mac (Player version 1.2.7, data format version 0.9.15). I have tried doing this on an Ubuntu machine, but there the Player freezes and quits. Is the video too large to handle? If so, is there a way to downscale the video's quality after the recording?

papr 01 February, 2018, 15:15:02

@user-a4d924 What happens if you seek in Pupil Player to the second half of the video?

user-a4d924 01 February, 2018, 15:16:38

@papr the recording plays on (although it's laggy). I can see the gaze, but no fixations and no surfaces (both fixations and surfaces are visible in the first half of the video)

papr 01 February, 2018, 15:18:40

@user-a4d924 You should be able to reduce the video file size by transcoding it with ffmpeg:

ffmpeg -i <original file> <new file>

user-a4d924 01 February, 2018, 15:19:29

@papr I was worried that changing the resolution will mess up the pixel coordinates used by Pupil's recording

papr 01 February, 2018, 15:20:02

The command above does not change resolution. Just the encoding of the frames.

user-a4d924 01 February, 2018, 15:20:19

@papr alright, thanks a lot! I will give it a go

papr 01 February, 2018, 15:20:44

Please do not overwrite the original file though. Just to be on the safe side.
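
A minimal Python sketch of the same workflow, assuming ffmpeg is on PATH (the file names here are placeholders); it writes to a new file so the original stays untouched:

```python
import subprocess
from pathlib import Path

original = Path("world.mp4")  # hypothetical recording file
transcoded = original.with_name("world_transcoded.mp4")

# Re-encode with ffmpeg's defaults; resolution stays unchanged,
# only the frame encoding (and thus file size) differs.
subprocess.run(["ffmpeg", "-i", str(original), str(transcoded)], check=True)
```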

user-dfeeb9 01 February, 2018, 16:14:58

Hi, I've run into a peculiar issue and I'm not sure if I'm being stupid or something broke. I was changing the Local USB Video Source settings for the input from a RealSense R200 camera, and now it just defaults to loading a dummy calibration on a still of the last input, irrespective of whatever settings I choose. I've tried deleting my pupil settings folder and running from a fresh pupil recorder directory to no avail. Has somebody dealt with this before?

user-dfeeb9 01 February, 2018, 16:30:23

WRT the previous post, I have managed to resolve the default input by just reinstalling the drivers, but changing resolutions in the video source settings causes the error again

user-dfeeb9 01 February, 2018, 16:33:53

And now it works fine... You may want to just disregard everything I said, it seems something was playing up that I cannot now replicate to report back. lol

papr 01 February, 2018, 16:50:35

@user-e7102b Please see here the complete example: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/remote_annotations.py

user-a4d924 01 February, 2018, 17:19:43

@papr ffmpeg solved my problem - thanks once again.

papr 01 February, 2018, 17:20:03

@user-a4d924 Glad to hear it 🙂

user-e7102b 02 February, 2018, 01:11:23

@papr Thank you for creating the remote annotations example - this works great! Do you have any examples for passing commands from MATLAB to python? I use MATLAB and the psychtoolbox to present stimuli to participants, so I really need to be able to control the eyetracker from MATLAB. I was planning to use the "system.m" command, but I don't think this is suitable.

user-dfeeb9 02 February, 2018, 01:24:51

@user-e7102b hi, I've been quietly watching this conversation because it's of relevance to me also. I am currently interfacing triggers from LUA to python using simple sockets, presumably you can do the same with matlab? Pardon me if I misunderstood your framework/requirements

user-e7102b 02 February, 2018, 01:28:24

Hi @user-dfeeb9 thanks for your message. My overall goal here is to try and use the Pupil as a substitute for the Eyelink 1000 trackers that we currently use in our lab. We use MATLAB to control the trackers in one of two ways. The first way is just to send all the usual commands (e.g. start/stop recording, calibrate, event codes) via MATLAB to the eye-tracker and everything is logged in a datafile on the eyelink machine. The second (and often more ideal) way is to grab the data from the eye-tracker online and record it in MATLAB.

user-e7102b 02 February, 2018, 01:30:19

I'm not very familiar with python, or LUA, so I'm running into problems when trying to execute either of these approaches

user-dfeeb9 02 February, 2018, 01:30:52

No problem, I'm very new to eye-trackers so I sympathise with the feeling of lacking direction

user-dfeeb9 02 February, 2018, 01:31:14

LUA is just another programming language that I happen to be using due to the stimuli we've selected

user-dfeeb9 02 February, 2018, 01:31:37

The principles are essentially the same irrespective of programming languages, which is that you can use a socket to send whatever bytes/triggers you want

user-dfeeb9 02 February, 2018, 01:33:19

As I understand it, zmq (zeroMQ) is the networking library that pupil uses. Because there's no easy way to get zeroMQ running with my current stimuli in LUA, I have a lua -> python socket where I send bytes (something simple like 1, 2, 3, 4) from lua to python. Said python server script will read the byte and interpret these codes based on whatever label I've given them, so 1 might be experiment start. Based on this interpretation, that server will then trigger an annotation in the pupil IPC during a recording. I'm very new to this so I may be doing things horribly wrong/inefficiently, so someone please do correct me if I am
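
A minimal sketch of such a relay in Python, assuming Pupil Remote on its default port (50020); the UDP port and the trigger-byte labels are hypothetical, and the annotation fields follow the pupil-helpers remote_annotations.py example:

```python
import socket
import time

import msgpack
import zmq

# Ask Pupil Remote (default port 50020) for the PUB port of the IPC backbone.
ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")
remote.send_string("PUB_PORT")
pub_port = remote.recv_string()

pub = ctx.socket(zmq.PUB)
pub.connect("tcp://127.0.0.1:%s" % pub_port)
time.sleep(1.0)  # give the PUB socket a moment to connect before publishing

LABELS = {1: "experiment_start", 2: "experiment_end"}  # hypothetical trigger codes

# Plain UDP socket that the stimulus program (Lua, MATLAB, ...) sends single bytes to.
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.bind(("127.0.0.1", 5005))  # hypothetical port

while True:
    data, _ = udp.recvfrom(1024)
    notification = {
        "subject": "annotation",
        "label": LABELS.get(data[0], "unknown"),
        "timestamp": time.time(),  # ideally use Pupil time; see remote_annotations.py
        "duration": 0.0,
        "source": "udp_relay",  # Player's annotation plugin expects this key
        "record": True,
    }
    topic = "notify." + notification["subject"]
    pub.send_multipart([topic.encode(), msgpack.dumps(notification, use_bin_type=True)])
```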

user-dfeeb9 02 February, 2018, 01:33:36

but based on these principles you could do the same thing from matlab using a similar setup

user-dfeeb9 02 February, 2018, 01:34:18

as for reading data though, I am not sure at all. apologies for not being any help there

user-e7102b 02 February, 2018, 01:34:38

Sure. So you're essentially using a python server as a middle man between LUA and Pupil Capture Software?

user-dfeeb9 02 February, 2018, 01:34:43

yep

user-e7102b 02 February, 2018, 01:34:54

Ok that makes sense

user-dfeeb9 02 February, 2018, 01:35:00

this may be non-ideal depending on the latency and precision you desire

user-dfeeb9 02 February, 2018, 01:35:11

using a UDP socket I have anywhere from 0~30ms latency

user-dfeeb9 02 February, 2018, 01:35:39

you may wish to look into setting up a zeroMQ socket connection between MATLAB and pupil to cut out the middleman, but I have no idea about that

user-e7102b 02 February, 2018, 01:35:41

We actually do something similar with Eyetribe Eyetrackers in our lab

user-e7102b 02 February, 2018, 01:36:09

Right, that would make sense

user-e7102b 02 February, 2018, 01:38:05

So, there's actually a toolbox that I've been playing around with for grabbing the data from Pupil Capture into MATLAB via a python server: https://github.com/matiarj/pupil-helpers/tree/matlabAddV2/pupil_remote/Matlab_Python

user-e7102b 02 February, 2018, 01:39:06

The problem I'm having is that the data it grabs from the tracker are not meaningful

user-e7102b 02 February, 2018, 01:40:58

If I can crack this, then the problem should be solved. Grabbing the data directly from the eye-tracker into MATLAB is ideal because a) it allows all the data to be stored in one place and b) it means you can run gaze-contingent eye-tracking experiments.

user-dfeeb9 02 February, 2018, 01:42:00

That sounds very cool. I actually know very little matlab so I am no help there, hopefully the pupil guys here will help you resolve that. Sorry for butting in without much help there lol

user-e7102b 02 February, 2018, 01:42:48

Of course, I'm sure they will! Thank you for butting in, I need all the help I can get 😃

user-e7102b 02 February, 2018, 01:47:22

Many vision scientists use MATLAB/Psychtoolbox for stimulus control, so I think that resolving these issues and making Pupil as accessible as possible could benefit a lot of people in my field

wrp 02 February, 2018, 02:48:22

@user-e7102b I certainly think MATLAB support is important. We will put this on our todo list (with the caveat that the majority of our team reads/writes Python, C++, C, JS, etc). @user-ed537d If you have time, would you be willing to update/work with others in the community to add more features to your example?

user-e7102b 02 February, 2018, 03:07:26

@wrp Thank you - that would be really great. I'll be more than happy to work with you guys on testing out solutions and making working code examples available in a public repository. I feel that @user-ed537d 's code isn't far off working for me... it's just tough to figure out what the problem is with a limited knowledge of python.

user-659e05 02 February, 2018, 08:17:58

Hello, what should be used as an IR-only pass filter for the tracking cameras, 850nm or 940nm?

mpk 02 February, 2018, 08:25:00

850 is closer to what we use.

papr 02 February, 2018, 08:26:56

@user-e7102b You should have a look at the Hololens Relay plugin. It defines a separate network interface that uses UDP. You should be able to use it with MATLAB. It is not as versatile as Pupil Remote. Let us know if you need further commands that are not implemented there.

papr 02 February, 2018, 08:29:58

The example by @user-ed537d is a python script that opens a UDP port and relays the data as well, after subscribing to Pupil data.

user-659e05 02 February, 2018, 08:30:28

closer? what is yours?

user-659e05 02 February, 2018, 09:13:26

850nm is not visible to the eyes right?

wrp 02 February, 2018, 09:29:32

correct, not within visible spectrum

user-8779ef 02 February, 2018, 16:25:59

@user-e7102b Looks like you could write a matlab wrapper that calls methods in a python module: https://www.mathworks.com/help/matlab/matlab_external/call-user-defined-custom-module.html?requestedDomain=true
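
For illustration, a sketch of the Python side that such a wrapper could call (module and function names are hypothetical; the single-letter strings are the commands documented for Pupil Remote). From MATLAB this would then be invoked as py.pupil_ctl.start_recording():

```python
# pupil_ctl.py -- hypothetical module to be called from MATLAB via the py. interface
import zmq

_ctx = zmq.Context()
_remote = _ctx.socket(zmq.REQ)
_remote.connect("tcp://127.0.0.1:50020")  # Pupil Remote's default port

def _send(cmd):
    """Send a Pupil Remote string command and return Capture's reply."""
    _remote.send_string(cmd)
    return _remote.recv_string()

def start_recording():
    return _send("R")

def stop_recording():
    return _send("r")

def start_calibration():
    return _send("C")
```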

user-8779ef 02 February, 2018, 16:47:12

@papr The confidence threshold is used during calibration, correct?

papr 02 February, 2018, 16:47:32

correct

user-8779ef 02 February, 2018, 16:47:38

@papr The nature of my issue posting is this: the same threshold should not be used for calibration and for other settings.

user-8779ef 02 February, 2018, 16:47:58

Imagine I want to export my calibrated data, even the data below threshold.

user-8779ef 02 February, 2018, 16:48:38

I would have to calibrate with a high threshold, then lower the threshold prior to export (then raise it again, since every other plugin seems to rely on it?)

user-8779ef 02 February, 2018, 16:48:41

It just seems a bit messy.

papr 02 February, 2018, 16:48:56

The keyword here is consistency. There is no reason to export data that was not used in other calculations.

user-8779ef 02 February, 2018, 16:49:24

I don't know that I agree with that.

user-8779ef 02 February, 2018, 16:50:18

Although using a very high threshold for calibration might produce a great track with limited data.

user-8779ef 02 February, 2018, 16:51:08

...but, I would want a much lower threshold for export so that I can plot and make decisions related to the threshold post-hoc.

papr 02 February, 2018, 16:52:05

So you want to have multiple confidence threshold sliders?

user-8779ef 02 February, 2018, 16:52:21

I think just one for calibration, and one "other."

user-8779ef 02 February, 2018, 16:52:40

It's hard for me to say much about the "other" threshold, because when/where the general setting is used is not transparent.

papr 02 February, 2018, 16:53:11

The exact reason why introducing multiple thresholds would be messy

user-8779ef 02 February, 2018, 16:53:18

Heheh.

user-8779ef 02 February, 2018, 16:53:37

I'm going to stick to my guns here. I think adding a slider to the calibration plane would be a good idea.

user-8779ef 02 February, 2018, 16:53:49

calibration pane/menu

papr 02 February, 2018, 16:54:21

You would need such a slider for each calibration sequence as well... This menu is already so full of elements.

user-8779ef 02 February, 2018, 16:54:43

Let's put it this way: recently, I had a situation where I had to have a high threshold for a good calibration. When I exported the data, though, it was full of holes.

user-8779ef 02 February, 2018, 16:55:22

It's not clear to me why the export should be thresholded at all, really. Other plugins, I understand, but withholding raw data seems antithetical to a research device.

user-8779ef 02 February, 2018, 16:55:55

I'll likely edit my export module myself. I realize it's a simple commenting out of one or two lines of code.

user-8779ef 02 February, 2018, 16:56:47

...at the very least, I would make it very clear to the user that not all the data is actually exported.

papr 02 February, 2018, 16:57:13

Did the pupil or the gaze positions have holes? Or both?

user-8779ef 02 February, 2018, 16:57:20

Only looked at gaze

user-8779ef 02 February, 2018, 16:57:38

Took me a while to think to run a min(gaze['confidence']) and see what was happening.

papr 02 February, 2018, 16:58:55

Well, I guess this is a documentation issue then.

mpk 02 February, 2018, 16:59:22

@user-8779ef pupil data stream is never filtered.

mpk 02 February, 2018, 16:59:54

the gaze mapper thresholds based on the set level. This means the gaze datastream is fully exported.

mpk 02 February, 2018, 17:00:46

but the gaze datastream has these holes because of the sample selection from pupil data based on the threshold.

user-8779ef 02 February, 2018, 17:00:57

I getcha.

mpk 02 February, 2018, 17:01:04

we export all data that is generated.

user-8779ef 02 February, 2018, 17:01:49

I don't like it 😃.

mpk 02 February, 2018, 17:02:11

what do you not like? You would like to have a pupil data mapped to gaze?

user-8779ef 02 February, 2018, 17:02:36

Yes. I think one should be able to calibrate using one threshold (because that makes sense - it improves the quality of the calibration)

user-8779ef 02 February, 2018, 17:02:59

and then export all data using no threshold, or a different (lower) threshold for post-hoc analysis.

user-8779ef 02 February, 2018, 17:03:11

You are effectively throwing out data.

user-8779ef 02 February, 2018, 17:03:17

...by not gaze mapping it.

user-8779ef 02 February, 2018, 17:03:29

Gaze map it now. Let me decide what I want to keep later.

mpk 02 February, 2018, 17:05:13

@user-8779ef ok. I get this point. We could just map all pupil data to gaze but we will need to then filter for it in other plugins later.

user-8779ef 02 February, 2018, 17:05:24

That makes sense.

user-8779ef 02 February, 2018, 17:05:37

I guess the difference is that I consider pupil player a waystation to analysis in Python.

user-8779ef 02 February, 2018, 17:05:59

Whereas you consider it good practice to filter the data prior to analysis in Pupil.

user-8779ef 02 February, 2018, 17:06:10

I think flexibility here is key.

user-8779ef 02 February, 2018, 17:06:41

...or, at the least, transparency, because one would not expect "export raw data" to omit data.

user-8779ef 02 February, 2018, 17:07:03

(i realize the issue with semantics there - to you it's not REALLY omitting data)

mpk 02 February, 2018, 17:07:03

https://github.com/pupil-labs/pupil/issues/new

user-8779ef 02 February, 2018, 17:08:30

@papr @mpk Thanks both for listening. How should I phrase the issue request?

user-8779ef 02 February, 2018, 17:08:40

Is this really an issue with documentation, as you would suggest?

mpk 02 February, 2018, 17:08:43

sorry.

user-8779ef 02 February, 2018, 17:08:45

Should I request what I want - the two sliders?

mpk 02 February, 2018, 17:08:46

wrong link

mpk 02 February, 2018, 17:08:56

https://github.com/pupil-labs/pupil/issues/1052

user-8779ef 02 February, 2018, 17:08:57

ehr, two thresholds

user-8779ef 02 February, 2018, 17:09:34

Ok, that works. Thanks very much @mpk !

user-e7102b 02 February, 2018, 17:11:13

@user-8779ef Thanks for the suggestion re writing a matlab wrapper that calls methods in a python module. I'll see if I can get this working and report back. This seems like a better way to go than using the matlab "system.m" function.

user-8779ef 02 February, 2018, 17:13:00

My pleasure, and good luck!

user-e7102b 02 February, 2018, 17:17:33

@papr I've actually set up @user-ed537d 's python/matlab UDP relay and have managed to pull live gaze/confidence data from the eye-tracker into MATLAB. Unfortunately the data that I'm pulling in do not seem to update in response to my eye movements/blinks etc. I just see an endless stream of the same numbers, e.g.:

Eye tracking data: [-48.26, -43.69], 0.80
Eye tracking data: [-48.26, -43.69], 0.80
Eye tracking data: [-48.26, -43.69], 0.80
Eye tracking data: [-48.26, -43.69], 0.80
Eye tracking data: [-48.26, -43.69], 0.80
Eye tracking data: [-48.26, -43.69], 0.80

papr 02 February, 2018, 17:21:28

I think it would be worth the effort to look into the hololens relay interface. It will be officially supported by us. You can probably adapt the matlab code part of Matiar's code example.

user-e7102b 02 February, 2018, 17:22:13

OK, sure, I'll take a look at this too.

user-e7102b 03 February, 2018, 00:12:29

@papr , can you direct me towards the hololens relay interface script(s) please?

user-e7102b 03 February, 2018, 00:19:31

Thanks!

user-e7102b 03 February, 2018, 01:35:35

@papr So I changed the port in hololens_relay.py to match the Pupil Remote port (50020) and then ran hololens_relay.py in the terminal using $ python3 hololens_relay.py . I don't see any error messages, but when I look in Activity Monitor there doesn't appear to be a python server running. I'm also unable to read any data into MATLAB. Am I missing something here?

user-e7102b 03 February, 2018, 01:37:09

By the way, I was able to tweak @user-ed537d 's python server code to successfully read the eye-tracker data into MATLAB. However, the rate at which samples are being read in is really very low and inconsistent, so I don't think this is going to work.

papr 03 February, 2018, 01:43:34

@user-e7102b this is a plugin that runs within capture. You will have to write a custom Matlab script that implements the relay protocol as it is described in the plugin doc string

papr 03 February, 2018, 01:44:22

You need to enable it in the plugin manager

user-e7102b 03 February, 2018, 01:53:20

Which option is it in the plugin manager?

papr 03 February, 2018, 01:54:10

It should be called Hololens Relay. Which version are you using?

user-e7102b 03 February, 2018, 01:54:56

version 1.1-2

papr 03 February, 2018, 01:55:31

I think it is a 1.2 feature. Please upgrade your software to use it.

user-e7102b 03 February, 2018, 01:55:48

Ok got it.

user-e7102b 03 February, 2018, 02:05:52

pupil capture v1.3 isn't loading on my machine (Macbook Pro, Sierra 10.12.6). Both pupil player and service seem to be working.

wrp 03 February, 2018, 03:03:56

@user-e7102b did you try deleting pupil_capture_settings dir?

user-7bc627 03 February, 2018, 13:10:20

hello @papr , I am interested in Pupil Mobile, I have just started going through some instructions about this. Yet, I am first wondering whether anyone could suggest which Android device we should use to connect to Pupil Mobile? Does the Google Pixel 2 XL work with Pupil Mobile? Also, I want to know: is it difficult to do the calibration outdoors? Thanks!

user-e7102b 03 February, 2018, 17:31:30

@wrp I tried deleting the pupil_capture_settings directory but I'm still unable to get pupil capture to start.

user-e7102b 03 February, 2018, 17:38:28

I've downgraded to v1.2-7 - this version appears to be working OK

mpk 03 February, 2018, 17:51:39

there was a bug in 1.3, @user-e7102b, it is fixed now. Please re-download and re-install Pupil Capture and it should work.

user-e7102b 03 February, 2018, 18:21:48

@mpk Thanks - it's working now.

user-e7102b 03 February, 2018, 18:49:52

@papr So, with regards to the hololens_relay plugin, would I still need to set up a python server to interface between MATLAB and pupil capture, or is the idea that I can just tap directly into the UDP socket from MATLAB?

papr 03 February, 2018, 23:46:49

@user-e7102b No, the plugin is the server. No need for further man-in-the-middle relay scripts. You are able to talk to the UDP socket directly using MATLAB.

user-e7102b 04 February, 2018, 01:51:04

@papr I'm able to read a stream of 23 numbers into MATLAB using the following code, so it looks like I'm doing something right. Does the following pipeline look correct to you?

% construct UDP connection
hUDP = udp('127.0.0.1', 50021);
fopen(hUDP);
% initialize the relay
fwrite(hUDP, '0I')
% start gaze broadcast
fwrite(hUDP, '0S')
% read from the socket
fread(hUDP);

user-826625 04 February, 2018, 06:17:38

@papr Hi! where can I find pupil apps 1.3 for windows?

papr 04 February, 2018, 10:54:06

@user-826625 We will publish the windows bundles early next week.

papr 04 February, 2018, 10:58:38

@user-e7102b Codes starting with 0 are response codes. These are the codes that you should be receiving from the Hololens Relay. The protocol implements mostly a request-reply pattern. This means that you should call fread after fwrite to see if your request was successful.
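
In Python terms, the request-reply pattern described above looks roughly like this (a sketch only; the request bytes 'I' and 'S' are assumptions based on the MATLAB snippet above, and the authoritative list of codes is in the hololens_relay.py docstring):

```python
import socket

# UDP socket pointed at the Hololens Relay plugin (port 50021 as in the snippet above).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.connect(("127.0.0.1", 50021))
sock.settimeout(2.0)

def request(cmd):
    """Send one request and read the reply; replies start with b'0' on success."""
    sock.send(cmd)
    return sock.recv(2048)

print(request(b"I"))  # initialize the relay
print(request(b"S"))  # start the gaze broadcast

# Assuming the relay then streams gaze datagrams back to this socket:
for _ in range(10):
    print(sock.recv(2048))
```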

user-e7102b 04 February, 2018, 18:49:15

@papr OK that makes a lot more sense. I'm able to construct the UDP, initialize the relay and start gaze broadcast. One thing I'm not sure of with this approach is whether I'll be able to record data in MATLAB at the true sample rate (e.g. 120 Hz). For example, if I run the code below, the samples do not appear to be updating in real time.

% gaze capture loop
for i = 1:1000
    fread(u)
end

papr 04 February, 2018, 19:18:10

@user-e7102b I would always recommend recording in Pupil Capture. The interface is only for live interaction and does not guarantee delivery (due to the socket being UDP).

user-e7102b 04 February, 2018, 19:25:42

@papr Sure. My approach with our other eye-trackers in the lab is to stream into MATLAB for live interaction and also record to tracker's data log, in case we miss samples in the live stream

user-e7102b 04 February, 2018, 19:28:49

If I could send commands directly from MATLAB to start/stop recording AND send event triggers via the hololens_relay plugin, then I could also use this approach with Pupil. However, I don't see these options in the hololens_relay.py documentation?

papr 04 February, 2018, 19:31:37

You are correct. These options are missing. I think both are reasonable functions to implement. Please create a Github issue for this.

user-e7102b 04 February, 2018, 22:49:00

OK, I've submitted an issue. Thank you for your help with this.

user-8889de 05 February, 2018, 06:18:49

Anyone here experienced with pupil on the rpi3?

mpk 05 February, 2018, 07:05:31

@papr before starting to abuse the hololens relay and building a UDP-based (read: unreliable) matlab interface, can we at least discuss writing a matlab demo that uses zmq and msgpack, reading and writing a strongly typed subset of the IPC?

user-8cf4ca 05 February, 2018, 13:36:47

hey all, I have just pulled the latest pupil version and all dependencies and everything runs fine, except the screen marker calibration does not result in a calibrated gaze point. During the process there are multiple "to far." messages in the console. The screen markers respond normally (turn their color to green when looked at), but after the calibration completes the gaze is all over the place, no correlation with the real gaze point whatsoever. The console output:

Starting Calibration
world - [INFO] calibration_routines.screen_marker_calibration: Starting Calibration
Stopping Calibration
world - [INFO] calibration_routines.screen_marker_calibration: Stopping Calibration
to far.
... 90 more
to far.
to far.
world - [INFO] calibration_routines.finish_calibration: Collected 77 monocular calibration data.
world - [INFO] calibration_routines.finish_calibration: Collected 12 binocular calibration data.
Reflection detected
Reflection detected
Ceres Solver Report: Iterations: 89, Initial cost: 2.729794e+01, Final cost: 7.783502e-03, Termination: CONVERGENCE

wrp 05 February, 2018, 13:38:09

@user-8cf4ca what is the confidence of Pupil detection?

user-8cf4ca 05 February, 2018, 13:40:37

mostly 1.0. I have just noticed that the reported fps is very low though (2 fps in the world window and 8 fps in the eye windows), although it looks like the usual 120fps in the eye windows

papr 05 February, 2018, 13:41:42

@user-8cf4ca On which platform do you run?

papr 05 February, 2018, 13:42:12

And am I correct that you run from source?

user-8cf4ca 05 February, 2018, 13:42:56

linux, yes, running from sources, just pulled 30 minutes ago, tried, got this, then switched to the tag v1.3 - same results

wrp 05 February, 2018, 13:43:42

@user-8cf4ca this is a very low frame rate. Could you let us know the specs of the machine you are using?

papr 05 February, 2018, 13:45:39

@user-8cf4ca Do I understand you correctly, that the eye video is smooth as if it were running at 120 Hz, but the reported fps is around 8?

user-8cf4ca 05 February, 2018, 13:45:45

it worked wonderfully a month or two ago on the same machine. Intel(R) Core(TM) i7-3820QM CPU @ 2.70GHz, 24gb RAM

user-8cf4ca 05 February, 2018, 13:46:00

@papr, exactly. Same with the world too.

wrp 05 February, 2018, 13:46:09

that's a powerful machine - should be more than enough to run Pupil

user-8cf4ca 05 February, 2018, 13:46:52

yeah, it worked flawlessly before. I might try to just restart and reinstall the dependencies again

user-8cf4ca 05 February, 2018, 14:02:28

ok, everything solved by redoing (sudo make install) in turbojpeg. Apparently it wasn't linked properly or something.

wrp 05 February, 2018, 14:12:20
wrp 05 February, 2018, 14:12:53

@user-8cf4ca thanks for the update - pleased that this was resolved. Makes sense that you would be seeing super low frame rates without turbojpeg

user-02ae76 05 February, 2018, 17:25:22

Hey, working with the newest build of pupil lab (200 fps eye cam) and I can't seem to find any way to adjust the focus of the eye cam. Does anyone know if this just wasn't built in?

papr 05 February, 2018, 17:37:44

@user-02ae76 You are correct, the focus of the new cameras is not adjustable. You should get good pupil detection anyway, even if the image does not look as sharp as with the old cameras.

wrp 05 February, 2018, 19:23:25

@user-02ae76 I'd also like to add to @papr 's comment. You can also try using the eye camera arm extender - https://docs.pupil-labs.com/master/#additional-parts - if you are not able to get a good view of your eye region with the standard eye camera arm.

user-b458c2 06 February, 2018, 00:40:18

Dear Pupil Team, Do you have a video showing the performance of the DIY eye-tracker? I'm a college freshman and I'm going to buy the DIY eye-tracking kit, but I am wondering how accurate it is. (I want to try to use it and see what kind of projects I might come up with.)

user-41f1bf 06 February, 2018, 00:59:23

@user-b458c2 There are some videos on youtube.

user-41f1bf 06 February, 2018, 00:59:28

https://m.youtube.com/watch?v=lPtwAkjNT2Q

user-41f1bf 06 February, 2018, 01:03:10

Also, there is a sample dataset to play around with pupil player. I am not sure if it is up to date for offline detections though. https://pupil-labs.com/pupil

user-e7102b 06 February, 2018, 01:32:22

Hi, I'm running into a problem when I attempt to view the raw pupil data. When I load a recording into pupil player and hit "export", the raw data does not appear in the "exports folder" (I just get an empty "annotations.csv" file and a "world_viz.mp4" file). Pupil player also crashes a few seconds after the export. I have the "Raw Data Exporter" plugin activated before I hit the export button.

user-e7102b 06 February, 2018, 01:32:52

I'm running pupil player v1.3-9 on Macbook Pro (Sierra)

user-e7102b 06 February, 2018, 03:01:22

I'm also unable to open the pupil_data file that is saved in the recording file. It doesn't appear to have a file extension. In the documentation it states that pupil_data can be read and inspected with a couple lines of python code...what are those lines of code? I've tried reading it in as a pickle file but no luck.

wrp 07 February, 2018, 07:14:11

@user-e7102b please see https://docs.pupil-labs.com/master/#raw-data-with-python re viewing pupil_data
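
For reference, the approach described there amounts to roughly the following sketch, assuming the msgpack Python package (newer versions use raw=False where older ones used encoding='utf-8'):

```python
import msgpack

# pupil_data is a msgpack-serialized dictionary; it has no file extension.
with open("pupil_data", "rb") as fh:
    pupil_data = msgpack.unpack(fh, raw=False)

print(pupil_data.keys())  # e.g. pupil_positions, gaze_positions, notifications
```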

wrp 07 February, 2018, 07:21:23

@user-e7102b please could you share the player.log file so that we can gain insight into the crash? This is located in pupil_player_settings/player.log

papr 07 February, 2018, 08:53:37

@user-e7102b @wrp A user notified me that this crash is due to a mistake in the pupil helpers script that sends the annotations. It does not set the source key. The Player annotation plugin requires this key for legacy reasons IIRC. We should remove this requirement. The short-term solution is to fix the helpers script.

mpk 07 February, 2018, 09:04:16

@papr I added the source field:

mpk 07 February, 2018, 09:06:47

@papr I also made an issue: https://github.com/pupil-labs/pupil/issues/1058

user-197bca 07 February, 2018, 11:07:32

@papr I have a question regarding distance calculation between user and a point on a surface. I posted it in github with details: https://github.com/pupil-labs/pupil/issues/1059 Could you take a look at it?

user-29e10a 07 February, 2018, 11:07:57

for everyone who is installing the pupil source on macos high sierra: the developer docs should be updated, opencv does not install ffmpeg automatically... you have to run a brew install ffmpeg ...

wrp 07 February, 2018, 15:53:24

@user-29e10a noted, we can update the docs today

wrp 07 February, 2018, 15:58:02

@user-29e10a I just checked brew opencv3 and it appears that ffmpeg, numpy, and other dependencies are required by opencv3 and therefore should be installed when you do brew install opencv3

wrp 07 February, 2018, 16:00:09

the output from brew info opencv3 shows:

Required: eigen ✔, ffmpeg ✘, jpeg ✔, libpng ✔, libtiff ✘, openexr ✔, python ✔, python3 ✘, numpy ✘, tbb ✘

user-826625 07 February, 2018, 19:59:06

Hi! How can I export csv raw data with Pupil Player? Thanks!

user-826625 07 February, 2018, 19:59:44

I was using the Pupil Mobile app as well. Can we calibrate with the mobile app? Moreover, when I play the files captured by Pupil Mobile, the gaze circle does not show up as it does for files recorded by Pupil Capture. Is this due to the app itself or am I not using it correctly?

wrp 07 February, 2018, 20:15:12

Hi @user-826625
- Exporting csv files with Pupil Player: Load the Raw Data Exporter plugin in Pupil Player. Press e or ⬇ to export data.
- Pupil Mobile: You should start recording and then show the calibration marker to the participant. You will not be calibrating in Pupil Mobile, only recording video data. You can calibrate post-hoc in Pupil Player, as long as you have recorded the calibration session in Pupil Mobile, with the offline pupil detector and offline gaze mapper. (The reason you didn't see a gaze position/gaze circle in Pupil Player is that you did not calibrate and therefore did not have gaze data.)

user-826625 07 February, 2018, 21:23:43

Thanks @wrp ! I was trying to follow this video, https://pupil-labs.com/blog/2017-08/pupil-capture-player-and-service-release-v0-9-13/, but I had difficulty finding the "open plugin", "visualizer", "Analyzer", and "data source" options. I have activated all plugins from the plugin manager

user-48d784 08 February, 2018, 00:20:40

Hello! I was wondering whether I could use Pupil with an Arrington Research EyeTracker? and if so, where/how should I start? Thanks!

user-e7102b 08 February, 2018, 03:39:05

@papr @wrp Thanks for editing the plugin - now the raw data export is working just great.

wrp 08 February, 2018, 06:39:29

@user-826625 Please note that with v1.0 we changed the entire GUI. Therefore Pupil software no longer looks like the linked blog post. If you check the docs in this section: https://docs.pupil-labs.com/#analysis-plugins you can see how to launch plugins with screenshots of v1.0 versions of Pupil software.

wrp 08 February, 2018, 06:43:22

@user-48d784 Pupil is designed to use UVC compatible cameras and is designed for wearable/head-mounted eye tracking (opposed to remote eye tracking). I do not know much about AR's systems (It seems that many are remote - e.g. not head mounted - unless you are referring to a VR/AR integration).

user-516564 08 February, 2018, 08:47:43

Hey guys,

I have a DIY set with the full HD world cam and I need to make it work with pupil-mobile. Since I have a regular USB cable for each camera, I am inclined to think a USB-to-USB-C hub would do the trick. Do any of you have any experience with that? I am concerned with bandwidth problems, but I imagine that the commercial version of the headset - which has two cameras and a single USB-C output - must deal with the same issues. So, to sum it up, do you know if there is anything inherently different about the way the commercial headset communicates with the mobile device compared to plugging two cameras into a hub?

wrp 08 February, 2018, 09:30:21

@user-516564 please note that Pupil Mobile is designed and implemented to work with Pupil Labs hardware. Pupil DIY hardware could work in theory, but it has not been tested and we can not provide support for DIY.

mpk 08 February, 2018, 09:38:24

@anlutfi#9800 there are a lot of small details that make the 'normal' Pupil headset work on a single bus and also with Pupil Mobile. We don't have this kind of control over the DIY hardware. Thus the DIY use case is only on desktop and with two USB cables.

user-c77dda 08 February, 2018, 09:39:33

hey pupil labs, for the realsense version of the eyetracker, is there any way to get the depth video recording without compression? Or even better, with lossless compression? I tried setting the bit_rate parameter in the code to high numbers, but the resulting depth video is still overcompressed, even though the reported bitrate gets higher

mpk 08 February, 2018, 09:41:47

@user-c77dda I would recommend taking the depth stream and saving it uncompressed as frame-wise numpy arrays. You will need an obscene amount of disk space.

user-c77dda 08 February, 2018, 09:43:24

ok, that's what I was afraid of. If only there was some video codec supporting 16-bit grayscale

mpk 08 February, 2018, 09:44:40

you could also try saving them lossless using .npz files.

mpk 08 February, 2018, 09:44:53

but I think that will be a factor of 2, not 10 or so.

mpk 08 February, 2018, 09:45:05

still a lot of data.
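
A minimal sketch of the npz route, with random arrays standing in for 16-bit depth frames from the RealSense stream:

```python
import numpy as np

# Stand-in for captured depth frames: 16-bit grayscale arrays.
frames = [np.random.randint(0, 2**16, size=(480, 640), dtype=np.uint16) for _ in range(10)]

# Lossless, compressed storage; one file per recording chunk.
np.savez_compressed("depth_chunk_000.npz", *frames)

# Reading back: entries are named arr_0, arr_1, ... in insertion order.
with np.load("depth_chunk_000.npz") as data:
    first = data["arr_0"]
    print(first.dtype, first.shape)
```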

user-516564 08 February, 2018, 09:45:26

@wrp and @mpk, thanks for the reply. I understand you can't provide support, but buying a hub is a much cheaper bet for me. So, is there a possibility in theory, or is there anything that makes you think beforehand that it will definitely not work?

mpk 08 February, 2018, 09:47:45

@user-516564 I cannot think of anything that would outright prevent it from working. But I would give it a <50% chance that it works out of the box. Also, import in Pupil Player will need modifications to the source code.

user-516564 08 February, 2018, 09:50:03

@mpk thanks, worst case I'll have an extra hub. What sort of modifications do you mean? The output files would be different if I used the diy headset?

mpk 08 February, 2018, 09:50:21

camera names are different.

mpk 08 February, 2018, 09:50:30

this needs to be adjusted.

user-516564 08 February, 2018, 09:51:26

Ok, thanks a lot!

user-eb9bfa 08 February, 2018, 14:17:06

Hi pupil. I'm trying to automate marker detection and 'gaze on surface' exporting for an experiment where I have one recording per file (so loading them all into pupil player will be very time consuming). My code seems to locate the markers fine when I visualise the marker detection, but the ids are way off the normal range (I'm getting ids of 2048 and 4608, for example), and the confidences are way off (very low). This means that no gaze positions are being detected on the surface. I'm using an anaconda distribution on windows on a comp without admin rights, so I haven't set it up as a developer PC. The same code works fine on a linux laptop set up to run pupil-labs from source. Is this problem something you have come across before?

user-0d187e 08 February, 2018, 14:42:32

I was able to see the cameras under "Cameras" in the device manager (windows 10) before, but I cannot find them in the device manager anymore since yesterday. The pupil software still recognises them though. Any idea what could be wrong?

user-e7102b 08 February, 2018, 17:21:25

Hi - quick question - can you foresee any issues with replacing the standard USB A to C cable that is used to connect the pupil headset to the USB port with a longer cable (e.g. 15 ft)? Thanks

user-c09b2c 08 February, 2018, 17:21:25

Hi everyone, can someone please explain, or provide any resource on, what exactly the difference is between 3D mode and 2D mode in the eye tracker? I have been trying to figure it out but have not got any concrete answers

papr 08 February, 2018, 17:34:52

@user-e7102b Which headset do you use?

papr 08 February, 2018, 17:35:53

If at all, I would use an active USB extender cable that has active signal reinforcement

user-e7102b 08 February, 2018, 18:08:26

@papr We're using the mobile headset. From what I've read, passive cables should be ok up to around 16ft, so I was thinking something like this: https://www.amazon.com/CableCreation-Braided-OnePlus-Macbook-Resistance/dp/B01D3095RW/ref=sr_1_1?s=electronics&ie=UTF8&qid=1518112814&sr=1-1&keywords=usb+a+to+c+cable+10+ft

user-e7102b 08 February, 2018, 18:09:10

If that doesn't work, we could always go down the active cable route.

user-48d784 08 February, 2018, 19:28:30

@wrp thanks! You're right, my Arrington Research camera is remote, not head mounted. Sorry I had not understood that! Cheers

user-c828f5 08 February, 2018, 20:49:37

@papr Hey, I'm sorry if this seems like a question that has already been answered, but I'm looking for a brief description of these two parameters in Pupil Player: model sensitivity and confidence threshold. How do I get an intuition regarding which parameter to control for a better track?

wrp 08 February, 2018, 21:20:58

@user-0d187e cameras should be listed in the libusbK category within the Device Manager under Windows 10. If drivers are not correctly installed, or if drivers are overwritten/removed by a Windows update, Pupil Cameras will be listed in the Imaging Devices or in some cases Cameras categories.

wrp 08 February, 2018, 21:22:56

@user-c09b2c for an overview on 3d vs 2d see: https://pupil-labs.com/blog/2016-03/pupil-v0-7-release-notes/ - This is a blog post from almost 2 years ago, so there have been lots of improvements and changes since, but this will at least provide an intro

user-2798d6 09 February, 2018, 15:16:09

Hello - I downloaded the new Pupil apps for mac, and when I open old recordings in Player, the calibration, fixation detection, etc. fails because there is "no gaze data available to find fixations". Is there a fix for this?

user-2798d6 09 February, 2018, 15:20:23

Nevermind - I deleted the old settings, and I think it's working now!

user-d40c36 09 February, 2018, 19:53:53

Hello - We are purchasing the hardware add-on for the HTC Vive. I thought it would be best if we could do the development on the Windows machine, since it's where the Vive is attached. I am having numerous issues trying to follow the Windows dependencies sections. Everything went as planned, but when I try to run main.py, pyav seems to be missing a DLL. Does anyone have any comments on how to fix that? I also cannot compile pyav. The other alternative is trying the Vive with Linux, but that seems like a whole other can of worms.

btw this is the error in the traceback:

File "C:\Extras\Python\Python36\lib\site-packages\av__init__.py", line 9, in <module> from av._core import time_base, pyav_version as version ImportError: DLL load failed: The specified module could not be found.

wrp 09 February, 2018, 21:53:23

Hi @user-d40c36 do you need to modify Pupil source code?

wrp 09 February, 2018, 21:53:46

I ask because, you can also run the Pupil bundle for Windows and add plugins at runtime (perhaps you already know this).

user-d40c36 09 February, 2018, 22:08:31

@wrp We do intend to modify code, but mostly the C++ pupil code to see if we can improve upon it; this is mainly just research. The big issue is we can definitely do it on linux, but I'm not sure how well supported the HTC Vive is on linux. That is the main issue/problem that we are facing. I'm not sure the instructions are correct for windows, or maybe something has changed in the source code that is affecting the whl packages etc.

user-e7102b 10 February, 2018, 04:34:44

Hi - One of my experimental setups uses a Linux machine (Ubuntu 14.04) with a dual monitor setup that uses x-screens, so I'm unable to display the calibration to the participant using the built-in "screen markers" calibration option. To get around this, I wrote a script in Psychtoolbox that displays the manual markers at various locations on the screen. I'm displaying markers at 9 different locations, as suggested in the user guide. However, I've noticed that the gaze data are not being sampled for some of the markers (i.e. blue semicircle does not appear). This seems to happen randomly (i.e. no particular locations or orders). I've played around with different marker sizes, eccentricities, world camera resolutions etc. but haven't managed to stop this happening. Do you have any suggestions how to resolve this? I've attached a video here so that you can see what I'm dealing with: https://www.dropbox.com/s/ifcmdojeo3c04d0/IMG_4031.MOV?dl=0 ..... Thanks!

user-e7102b 10 February, 2018, 04:36:11

btw no participant is being tracked in this example

user-7bc627 10 February, 2018, 05:48:33

hi everyone, well... I just got Pupil Mobile now, I have downloaded Pupil Capture, but it shows "not supported"? May I know what I should do?

Chat image

papr 10 February, 2018, 09:17:57

@user-d40c36 I would recommend a two computer setup in your case. One Linux machine to which the add on is connected via USB and that runs your modified version of Pupil Capture. The second machine runs windows and your vive application. The hmd eyes integration uses a zmq network connection. It does not matter much if the Pupil Capture runs on the same computer or a computer in the local network.

wrp 10 February, 2018, 09:20:11

@user-d40c36 - as @papr notes, Linux is certainly a more stable (easier to set up dependencies) environment for dev than Windows

wrp 10 February, 2018, 09:20:55

@user-d40c36 we do also need to update wheels for windows

papr 10 February, 2018, 09:21:25

@user-7bc627 The warning just says that Disabling idle sleep is not supported on Windows. This is expected behavior since it is actually only implemented on macos. But this feature is not required for the application to run. Does the application start and show up?

papr 10 February, 2018, 09:29:49

@user-e7102b I would recommend drawing the markers as similarly as possible to the manual markers in the docs. The outer black ring looks a bit too thin to me. Usually one would subscribe to Pupil Capture and wait until the manual marker calibration broadcasts that it detected enough samples of a marker before showing the next marker. You seem to advance after a fixed time period.

wrp 10 February, 2018, 09:31:32

(link to calibration marker pdf - https://docs.pupil-labs.com/images/pupil-capture/calibration-markers/v0.4_markers/v0.4_marker.v12.master.pdf you can also use the jpg that is in the docs page - note use the calibration marker and not the stop marker for claibration)

papr 10 February, 2018, 09:31:40

@user-e7102b The black and white ratios are important when drawing the markers, not so much their size. I think they could be slightly smaller in your case though.

user-7bc627 10 February, 2018, 10:29:06

@papr now it is like this... what should I do?

Chat image

papr 10 February, 2018, 12:38:03

Looks like your camera is either not connected or the drivers have not been installed. Did you run Pupil Capture with administrator rights? @user-7bc627

user-7bc627 10 February, 2018, 12:39:18

hmm I think so ? I just downloaded the one for Windows from here https://github.com/pupil-labs/pupil/releases/tag/v1.3

papr 10 February, 2018, 12:39:36

Don't worry about the zero division error. You should be able to open the eye windows. I will add a fix to the next release that catches this exception

wrp 10 February, 2018, 12:43:19

@user-7bc627 please right click on pupil_capture.exe and select run as administrator

user-7bc627 10 February, 2018, 12:45:20

ok I just did that. but it seems the same? @wrp

Chat image

wrp 10 February, 2018, 12:47:06

To debug driver installation, could you please do the following:

Unplug Pupil Headset from your computer and keep unplugged until the last step

Open Device Manager

Click View > Show Hidden Devices

Expand the libUSBK devices category and expand the Imaging Devices category within Device Manager

Uninstall/delete drivers for all Pupil Cam 1 ID0, Pupil Cam 1 ID1, and Pupil Cam 1 ID2 devices within both libUSBK and Imaging Devices Category

Restart Computer

Start Pupil Capture

General Menu > Restart with default settings

Plug in Pupil Headset after Pupil Capture relaunches - Please wait, drivers should install automatically

papr 10 February, 2018, 12:47:22

@user-7bc627 you mentioned that you are using Pupil Mobile. Did you connect your Pupil Headset to your Phone or to the computer running Pupil Capture?

wrp 10 February, 2018, 12:47:54

(good question @papr )

user-7bc627 10 February, 2018, 12:48:32

it's connected to the laptop now

papr 10 February, 2018, 12:49:14

Then please restart Pupil Capture with administrator rights again

user-7bc627 10 February, 2018, 12:51:03

@wrp I checked what you said, but I don't have libUSBK (I have clicked show hidden devices)

wrp 10 February, 2018, 12:53:25

@user-7bc627 do you see drivers in other categories as noted above?

user-7bc627 10 February, 2018, 12:53:38

Chat image

wrp 10 February, 2018, 12:54:14

Look in imaging devices

user-7bc627 10 February, 2018, 12:54:49

it's "integrated webcam"

wrp 10 February, 2018, 12:57:19

And the pupil headset is connected? You should be seeing the devices in the device manager. What Pupil headset configuration are you using?

wrp 10 February, 2018, 12:57:43

You can also DM me with a order id if you have it

wrp 10 February, 2018, 13:35:19

For those reading the above notes with @user-7bc627 I wanted to note that this behavior may actually be due to the USBC cable either not being fully connected or a defective USBC-USBA cable. I will update with concrete information after further debugging.

wrp 10 February, 2018, 13:43:04

👆 Issue resolved. The USBC cable was not fully connected to the Pupil headset. Please note that the connector needs to be pushed in fully and requires a bit of firmness to push it in.

user-e7102b 10 February, 2018, 19:08:48

@papr @wrp Re the calibration markers, in the example video I'm actually directly presenting the v0.4 Marker that I downloaded from pupil-docs. It's unclear why there would be sporadic failures.

user-e7102b 10 February, 2018, 19:11:08

I understand @papr 's suggestion to subscribe to Pupil Capture and wait for the confirmation that the required number of samples have been broadcast, but during testing I did try presenting the markers for a really extended duration (e.g. 10 seconds each) and still the gaze sampling won't work for some locations, so I'm not sure if this will resolve the issue.

papr 10 February, 2018, 19:17:02

@user-e7102b Try to move the headset slightly during the calibration procedure. Especially when the markers are not recognized. Please let us know, if you find any consistency/regularities in the marker positions that are not recognized.

user-e7102b 10 February, 2018, 19:18:53

@papr Good point - due to the nature of the dual screen setup I can't wear the headset while I'm viewing the calibration procedure, so it's been resting stationary on a stand the whole time. I'll give this a shot. Thanks

user-e938ee 10 February, 2018, 23:45:18

Hey guys... What kind of encoding does pupil use? I subscribed using nzmqt, but apart from a few tags like gaze, etc., I get just rubbish

wrp 11 February, 2018, 02:17:45

@user-e938ee please see https://docs.pupil-labs.com/#message-format

mpk 11 February, 2018, 08:54:14

@user-e938ee we use msgpack

user-e938ee 11 February, 2018, 10:07:57

Thanks

user-7bc627 11 February, 2018, 11:54:50

@papr I am just wondering why my eye in eye camera 0 is reversed? thanks!

Chat image

papr 11 February, 2018, 12:08:04

@user-7bc627 This is due to the camera being physically rotated on the headset. But do not worry, this is no issue for the pupil detection algorithm. You can flip the image in the general settings (this only flips the visualization though).

user-7bc627 11 February, 2018, 12:13:36

oh I got it thanks! @papr meanwhile, now Pupil Player seems to have crashed... it shows something like "GLFW window failed to create." May I know what I should do?

papr 11 February, 2018, 12:19:28

Yes, this is a known issue. This is most likely due to an invalid window size value stored in the user session settings. Please delete the pupil_player_settings folder and start Pupil Player.

user-7bc627 11 February, 2018, 12:25:41

thanks!

user-8779ef 11 February, 2018, 15:51:57

Anyone else notice the latest bundle runs VERY slow?

mpk 11 February, 2018, 15:54:37

@user-8779ef what aspect runs slow?

user-8779ef 11 February, 2018, 15:54:43

player

mpk 11 February, 2018, 15:54:59

v1.3-9 ?

user-8779ef 11 February, 2018, 15:55:04

Running it on a brand new machine with tons of horsepower, and it's not nearly as responsive as it should be

mpk 11 February, 2018, 15:55:15

version 1.2 was faster?

user-8779ef 11 February, 2018, 15:55:22

Uh... the latest, as of a few days ago (sorry, not on campus, where the machine is)

user-8779ef 11 February, 2018, 15:55:30

Most def. Very noticeable.

user-8779ef 11 February, 2018, 15:55:55

Now, when running from source on my machine, I didn't seem to have the same unresponsiveness.

mpk 11 February, 2018, 15:56:38

@user-8779ef ok. We will have a look into this on monday.

user-8779ef 11 February, 2018, 15:56:45

Ok, thanks. Let me know if I can help.

user-8779ef 11 February, 2018, 15:56:50

FYI, running on a mac in both cases.

mpk 11 February, 2018, 15:57:05

If you can let us know the exact version of the bundle we can pinpoint the issue.

user-8779ef 11 February, 2018, 15:57:11

Ok, I'll do that on monday.

user-8779ef 11 February, 2018, 15:57:23

Also, I had a crash related to the new offline calibration range.

mpk 11 February, 2018, 15:57:35

thats from master right?

user-8779ef 11 February, 2018, 15:57:36

I'll replicate so I can post the issue

mpk 11 February, 2018, 15:57:40

great!

user-8779ef 11 February, 2018, 15:58:11

Yeah... the range was 1 frame > actual frames. I had to add a "-1" somewhere. I know, I know, very helpful \s

user-8779ef 11 February, 2018, 15:58:26

in gaze mappers. I'll get the details soon.

user-8779ef 11 February, 2018, 15:58:38

That's papr's domain, right?

mpk 11 February, 2018, 15:58:45

great! And thank you for keeping the feedback coming!

user-8779ef 11 February, 2018, 15:58:57

My pleasure.

mpk 11 February, 2018, 15:58:58

@papr wrote this so he is the best contact.

user-8779ef 11 February, 2018, 15:59:06

Ok. I'll send him a message.

user-8779ef 11 February, 2018, 16:30:46

hey, @mpk , another question for you - I was recently trying to install homebrew dependencies on a mac, and got a new message about deprecated formulas. Is this something new?

user-8779ef 11 February, 2018, 16:31:02

Ring any bells? Google is, surprisingly, not very helpful.

user-8779ef 11 February, 2018, 16:34:01

Seems related to this: https://github.com/Homebrew/homebrew-science/issues/6365

user-dfeeb9 11 February, 2018, 17:50:15

Something I thought I'd drop by and ask - has anyone ever had problems with pupil detection in participants using heavy makeup? lol

user-dfeeb9 11 February, 2018, 17:50:34

I noticed on those wearing thick layers of eye-liner that the system would be confused very easily

user-dfeeb9 11 February, 2018, 17:50:43

I'm sure this is common but thought it was interesting enough to mention

user-e7102b 11 February, 2018, 17:58:27

@user-dfeeb9 I think this is a common problem with eyetrackers in general (I've certainly experienced this with other trackers that we use in our lab, such as Eyelink and Eyetribe). We explicitly instruct our participants to come in without any eye makeup, and if they do come in with makeup we ask them to remove it.

user-dfeeb9 11 February, 2018, 17:58:54

I have decided to do the same since realising that was the issue, very interesting though. Sorry for stating the obvious, I'm very new to eye-tracking. Thanks @user-e7102b

user-e7102b 11 February, 2018, 18:00:57

Good idea 😃 You might also encounter problems with people with thick/heavy eyelashes.

papr 11 February, 2018, 19:02:17

@user-dfeeb9 @user-e7102b This issue can be slightly worked around by setting the eye ROI to exclude the eyelashes. It does not always work very well, but it might improve the situation a bit.

papr 11 February, 2018, 19:06:46

@user-8779ef @mpk We noticed a long-time performance issue during paused playback in Player last week. I fixed it in https://github.com/pupil-labs/pupil/commit/2df34d28f06a45902aba05f56c472bea655edc75 This should have made it into the 1.3-9 bundle -- not 100% sure though.

papr 11 February, 2018, 19:08:24

I had the trim mark index issue as well for offline data that was generated by a previous version, and was not able to reproduce the issue after deleting the cached data. Maybe we should increase the offline data format version to force the system to delete the invalid data...

user-8a8051 11 February, 2018, 22:55:57

Hi all, I'm finding a strange error when attempting to load recordings in pupil player. It reads "updating recording format" - "this may take a while", and the player app closes shortly after.

user-8a8051 11 February, 2018, 22:56:38

I'm using a 2015 Mac with OS Sierra 10.12.16,

user-8a8051 11 February, 2018, 22:57:13

pupil bundle version 1.2-7

user-8a8051 11 February, 2018, 22:57:27

any tips would be greatly appreciated,

user-8a8051 11 February, 2018, 22:57:29

Thanks

user-6e5aaf 12 February, 2018, 07:22:10

Hello

user-6e5aaf 12 February, 2018, 07:23:31

I'm Yoonmin Kim in Korea. I would like to order a Pupil Labs eye-tracker.

user-6e5aaf 12 February, 2018, 07:25:32

Is the Pupil Labs analysis software free?

wrp 12 February, 2018, 10:03:25

Hi @user-6e5aaf yes, software is free and open source. See https://github.com/pupil-labs/pupil/releases/latest

wrp 12 February, 2018, 10:03:35

You may also want to check out the docs - https://docs.pupil-labs.com

user-7bc627 12 February, 2018, 12:34:44

@wrp @papr thanks again for all your explanation and help! Now I finally understand what the "Markers" mean... so another question: if the markers need to be presented physically, does that mean we cannot use Pupil Mobile to do eye tracking for an outdoor advertising board? Or anything that is outdoors (e.g. how people view buildings / architecture etc.)?

papr 12 February, 2018, 12:38:28

@user-7bc627 You can use Pupil Mobile and use the offline surface detector to detect the markers in your recording. You will have to add the surface markers to your AOI though. This means that it will not work if your advertising board does not have the markers.

papr 12 February, 2018, 12:39:22

Make sure to monitor your eye cameras for overexposure during your outdoor recordings. This might happen if it is sunny.

user-7bc627 12 February, 2018, 12:46:39

Hmm, ok, I didn't really get the idea... is there any demo / tutorial for doing offline surface tracking? If there is no marker in my recordings, how can it be detected offline?

papr 12 February, 2018, 12:49:06

Sorry if I did not express myself clearly. You will need surface markers in the world video. But you can use the offline surface tracker to define surfaces after the recording based on these markers. Similar to the example dataset from our website.

user-7bc627 12 February, 2018, 12:56:30

What I was asking is that... so now, imagine I am collecting data outside and I want to see how people process the beverages in a convenience store (e.g. which beverage attracts customers' attention the most). In this case, obviously I cannot stick markers on all those beverages... what should we do then? Thanks!

papr 12 February, 2018, 13:01:17

Pupil does not provide any object-detection based surfaces. You will need to either adapt your own object-recognition algorithm and define surfaces based on that, or manually annotate the video frame by frame. There are services that provide automated solutions, e.g https://imotions.com/

user-7bc627 12 February, 2018, 13:07:18

ok great! Then I think my doubts are all solved :) Last question... why does it happen so often that eye 0 doesn't work? Eye 1 is ok, but sometimes if I adjust the eye camera a bit, Eye 0 will show a grey background only (and even crashes)

papr 12 February, 2018, 13:08:35

Do I understand you correctly, that you start eye0, it works, you adjust the eye0 (right eye) camera and the window crashes?

user-7bc627 12 February, 2018, 13:08:55

Yes!

papr 12 February, 2018, 13:12:43

This is with the headset connected to the computer running Pupil Capture, and not running Pupil Mobile and copying the recording to Player, correct?

user-7bc627 12 February, 2018, 13:14:06

yes, correct. the headset connected to the computer running Pupil Capture

Chat image

user-7bc627 12 February, 2018, 13:15:51

(it was ok when I first turned on Pupil Capture... and after I adjusted both eye cameras, the left was still fine, but the right camera just didn't work)

papr 12 February, 2018, 13:16:06

This might be a loose connection to the eye0 camera. I will continue writing you in a personal message since this issue is specific to your setup.

user-7bc627 12 February, 2018, 13:16:20

ok sure:)

user-d40c36 12 February, 2018, 15:48:43

@papr @wrp thanks for the info. I'll proceed with that route. Just curious, will there be any attempt to update the wheels and get the development environment working with the current GitHub source? The website lists steps to set up Windows, but those seem to be incorrect or out of date.

user-8779ef 12 February, 2018, 19:29:42

@papr Performance issues go away when I hide the timelines by click-dragging to lower the height of the bottom workspace

user-8779ef 12 February, 2018, 19:30:14

@papr So, it seems like all those timelines are stressing out the system.

user-8779ef 12 February, 2018, 19:30:35

20 minute video

user-e7102b 12 February, 2018, 21:20:04

@papr With reference to the manual calibration issues I highlighted on Friday, you suggested that I subscribe to Pupil Capture and wait until the manual marker calibration broadcasts that it detected enough samples of a marker before showing the next marker, rather than advancing at a fixed time period. Can you provide or point me towards an example of the python code I would need to use to do this? You mentioned in an earlier comment that ZMQ relies on send/receive command pairs, but I'm not sure what I would need to be sending here in order to receive the broadcasted signals from the calibration routine. Thanks

user-8779ef 12 February, 2018, 21:46:10

@user-e7102b Have you looked here? https://docs.pupil-labs.com/#networking

user-e7102b 12 February, 2018, 22:00:48

Yeah, I've looked through this before and I'm still not sure what I'm supposed to be doing. Is the "Reading from the backbone" example under "PUPIL REMOTE" the example that is most relevant to what I'm trying to do?

papr 12 February, 2018, 22:03:49

@user-8779ef please test the current master for timeline performance. It should be much smoother than the last release. Otherwise we will have to take further action to improve their performance.

papr 12 February, 2018, 22:14:04

@tombullock these are the notifications you need to subscribe to: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/calibration_routines/manual_marker_calibration.py#L85

See this script on how to subscribe to notifications https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py

papr 12 February, 2018, 22:18:34

Be aware that subscribing to notifications requires you to prepend notify. to the notifications' subjects to correctly subscribe to them.

papr 12 February, 2018, 22:19:16

Best to subscribe to notify.calibration
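
In code that looks roughly like this (a minimal sketch along the lines of filter_messages.py; it assumes Pupil Remote is running on its default port 50020):

```python
import zmq
import msgpack

ctx = zmq.Context()

# ask Pupil Remote for the port of the IPC subscription socket
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://127.0.0.1:50020')
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

# subscribe to all calibration-related notifications
subscriber = ctx.socket(zmq.SUB)
subscriber.connect('tcp://127.0.0.1:{}'.format(sub_port))
subscriber.setsockopt_string(zmq.SUBSCRIBE, 'notify.calibration')

while True:
    topic, payload = subscriber.recv_multipart()
    notification = msgpack.loads(payload, raw=False)
    print(topic, notification)
    # react here, e.g. show the next marker once the
    # marker-sample notification from the file linked above arrives
```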

user-e7102b 13 February, 2018, 00:51:43

@papr Thanks, this makes sense.

user-e7102b 13 February, 2018, 00:55:54

As an aside, I noticed that pupil diameter does not appear to be displaying in Pupil Capture. I've switched on "Display pupil diameter for eye" in the "system graphs" tab, but neither graph is displaying any output.

user-e7102b 13 February, 2018, 01:00:01

Pupil diameter is being logged in the recording files, so perhaps this is just a visualization issue?

wrp 13 February, 2018, 01:10:51

@user-d40c36 I can update wheels today

user-7bc627 13 February, 2018, 01:48:57

@papr you mentioned that another alternative would be to manually annotate the video frame by frame. Just to confirm: can we do this in Pupil Player?

papr 13 February, 2018, 08:34:51

@user-7bc627 you cannot do that in Player. You will have to do this somehow externally.
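
If you go the manual route, a small OpenCV loop over the exported world video is often enough. A rough sketch (the file name and the one-key-per-frame labeling scheme are just placeholders):

```python
import csv
import cv2

cap = cv2.VideoCapture('world.mp4')  # exported world video
frame_index = 0

with open('annotations.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['frame', 'label'])
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow('world', frame)
        key = cv2.waitKey(0)  # blocks until a key is pressed
        if key == ord('q'):
            break
        # treat any other key as a one-character label for this frame
        writer.writerow([frame_index, chr(key & 0xFF)])
        frame_index += 1

cap.release()
cv2.destroyAllWindows()
```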

user-6e5aaf 13 February, 2018, 11:09:12

@wrp How are Pupil Labs' products different from other eye trackers, such as Tobii and SMI?

user-b23813 13 February, 2018, 12:10:10

Hi, I am using version 1.3.9. It worked great for a couple of days but suddenly eye camera 1 does not work. I know that this could be an issue with the drivers but it worked perfectly 1 day ago so I am not sure what to do. I uninstalled the camera drivers (both visible and hidden) and re-installed but nothing. Here is the message I see. Any ideas?

Chat image

papr 13 February, 2018, 12:47:37

@user-b23813 The log does not indicate any failure. What does happen? Does the eye window open? Does it show a gray screen? Is the ID1 camera listed in the eye's uvc manager plugin?

user-b23813 13 February, 2018, 13:11:38

@papr Hi, yes it shows a gray screen, whereas ID0 works fine. Not sure I know how to check if the ID1 camera is listed in the eye's uvc manager plugin.

Chat image

papr 13 February, 2018, 13:12:49

@user-b23813 Click on the Activate source selector in the eye window. Is the ID1 camera listed?

user-b23813 13 February, 2018, 13:15:33

@papr No, it's not. I only see ID0 and ID2.

Chat image

papr 13 February, 2018, 13:18:30

@user-b23813 Please make sure that the camera connector is connected properly https://cdn.discordapp.com/attachments/412224263140016128/412598203725381632/JPEG_20180212_141751.jpg

user-b23813 13 February, 2018, 13:28:25

@papr it looks fine

papr 13 February, 2018, 13:29:04

@user-b23813 This seems to be a hardware issue then. Please write an email to info@pupil-labs.com containing your order id and a short description of the problem.

user-b23813 13 February, 2018, 13:29:43

@papr Will do, thank you very much

wrp 13 February, 2018, 17:02:05

@user-6e5aaf Perhaps this is a question better answered by other researchers in the community who have experience with these other products to offer comparison.

user-dfeeb9 13 February, 2018, 18:03:50

Hi, so I've been trying to study and understand the pupil-helpers scripts for the time sync system, but I'm not sure I really get how it works, in terms of what is actually going on with the jitter compensation and what the method is to connect a script to time sync

papr 13 February, 2018, 18:23:02

@user-dfeeb9 Time sync is based on this idea: https://en.wikipedia.org/wiki/Network_Time_Protocol#Clock_synchronization_algorithm
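
The principle in a few lines of Python (just an illustration of the idea, not our actual implementation; request_master_time stands in for whatever call returns the master's clock reading):

```python
import time

def measure_offset(request_master_time, samples=60):
    """Estimate the local clock's offset to the master clock."""
    measurements = []
    for _ in range(samples):
        t0 = time.monotonic()              # local time before the request
        t_master = request_master_time()   # master's clock reading
        t1 = time.monotonic()              # local time after the reply
        round_trip = t1 - t0
        # assume the reply was generated halfway through the round trip
        offset = t_master - (t0 + round_trip / 2)
        measurements.append((round_trip, offset))
    # jitter compensation: only trust the exchanges with the
    # shortest round trips, since they are the least delayed
    measurements.sort()
    best = measurements[:max(1, samples // 2)]
    return sum(offset for _, offset in best) / len(best)
```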

papr 13 February, 2018, 18:26:24

@user-dfeeb9 You will need to include network_time_sync.py in your application. And then you probably need to implement the time sync master based on the example: pupil_time_sync_master.py.

user-dfeeb9 13 February, 2018, 18:26:34

I see

papr 13 February, 2018, 18:27:53

Also, in case you did not see it yet: The exact time sync protocol spec that we use: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/time_sync_spec.md

user-dfeeb9 13 February, 2018, 18:28:09

ah I've read this

user-dfeeb9 13 February, 2018, 18:28:31

my main confusion was exactly how to set it up to synchronise directly with pupil-recorder, but I think this makes sense

papr 13 February, 2018, 18:29:26

The implementation of the clock follower is a bit more difficult than the master's. Pupil Capture implements both. Therefore I recommend implementing the master in your application and letting Capture do the time following.

user-dfeeb9 13 February, 2018, 18:29:42

i see

user-dfeeb9 13 February, 2018, 18:30:03

this is very helpful, thank you

user-dfeeb9 13 February, 2018, 18:31:11

I will return if I struggle further, but thanks kindly

user-d40c36 13 February, 2018, 19:05:04

@wrp Thanks... I'll check out the updated wheels to see if that solves the windows issue. In the meanwhile I'll continue to go forward with the linux dev/windows setup

user-6952ca 13 February, 2018, 20:55:07

Hello! I am trying to build a research experiment that utilizes the Pupil glasses as an eye tracker. I want to record gaze fixation location relative to a computer screen center, not just relative to the World Camera. Is there a simple method for this?

papr 13 February, 2018, 21:06:41

@user-6952ca Yes, have a look at the surface tracker plugin

user-e7102b 13 February, 2018, 23:26:36

@user-6e5aaf In response to your question about how Pupil compares with other types of tracker (e.g. Tobii, SMI)... I was looking to purchase a head-mounted eye-tracker last year and decided to go with Pupil over Tobii after meeting both companies and trying out their trackers at ECVP in Berlin. I can tell you why I went for Pupil over Tobii. The headset is lighter, it doesn't have a lens and the modular nature really appeals to me (you can swap out cameras, upgrade, 3D print custom frames etc.). The software is all open source and python based, so you have a lot of flexibility there. I haven't tested Pupil extensively yet, but the accuracy, features, and sampling rate are all more than sufficient for our purposes. The only issue I've experienced is getting their capture software to integrate with MATLAB/Psychtoolbox, but I've just about got this figured out now with the help of the support team (thanks guys!) and they have also acknowledged that adding more MATLAB/PTB support will be beneficial, so I can see that improving.

user-e7102b 13 February, 2018, 23:28:52

Also, I don't know how easy it is to get hold of SMI eye-trackers anymore, since they got acquired by Apple...

user-e7102b 13 February, 2018, 23:30:48

And of course, one huge factor in my decision was the fact that the Pupil head mounted system can be had for <$2k, whereas the Tobii is ~$20K 😃

user-e7102b 13 February, 2018, 23:38:05

I also have a bunch of experience with Eyelink, but their systems are not head mounted so I don't know if that's relevant. But feel free to contact me if you have any questions

user-b23813 14 February, 2018, 10:23:16

Hi, I see this error message when I open the player. However, the player works. It seems like a directory issue. Do you think that leaving it as it is would be fine or could it somehow affect my analysis later?

Chat image

user-6e5aaf 14 February, 2018, 11:04:14

@user-e7102b Thank you for your answer.πŸ˜€

user-8779ef 14 February, 2018, 13:35:48

@user-6e5aaf Also, Tobii seems to completely disregard the research market, and focuses on games. Their demos at scientific conferences are designed to obfuscate the quality of the track - for example, their latest demo shows the gaze position as a >5 degree cloud of smoke. How am I supposed to evaluate tracking accuracy, precision, or latency with that?!?! Similarly, they only recently made eye images available to their customers. It's quite worrying that the company would prefer to hide data and indications of track quality from potential customers. SMI never did this - and their tracks were great, but as @user-e7102b said, their trackers are no longer available. Pupil is the most transparent of all of them (open source!), but their products and software are still in alpha/beta, so don't expect something quite as polished as a product that costs >$10k.

user-8779ef 14 February, 2018, 13:37:11

Not yet, anyhow. Still, the software is in good enough shape that I'm conducting funded research using Pupil mobile trackers.

user-8779ef 14 February, 2018, 13:58:28

...I'll even go as far as to say that it's shaping up to be the best solution for mobile tracking, regardless of price.

user-41f1bf 14 February, 2018, 14:08:52

@user-8779ef Agreed. I would say further that the Pupil open source software has everything it needs to grow into a fine, pervasive and versatile ecosystem

user-41f1bf 14 February, 2018, 14:09:30

For mobile and stationary

user-41f1bf 14 February, 2018, 14:18:47

So, it is a damn good investment for developers

user-41f1bf 14 February, 2018, 14:23:42

Pupil is not a threat to iMotions-like businesses today. But I was asking myself if it could possibly be in the future. What do you think?

user-4d2126 14 February, 2018, 14:41:25

Hey there, got one piece of Pupil hardware with two eye cameras to test out at work. Got it working for the most part, but I am kinda struggling with the eye focus setup. We are making games for mobile devices and no matter how I tried to adjust the cameras, I was not able to set them up in a way that properly centers on both eyes when they are looking down (sitting position with cell phone in hands, looking down at the screen). Especially the right eye, where the camera is inverted (is that by design?), seems really hard to capture in this position. Would really appreciate any suggestions on how to deal with this, thanks!

user-6952ca 14 February, 2018, 14:42:46

@papr My colleagues and I want to be able to have access to gaze position in reference to the screen center in real time. An example of this is utilizing the Pupil glasses to function as a mouse cursor. I have seen some videos of this being done and am wondering if anyone is currently working on this.

mpk 14 February, 2018, 14:45:48

@user-6952ca this works with current Pupil Capture and this script out-of-the-box: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/mouse_control.py

mpk 14 February, 2018, 14:46:08

This example even controls the mouse. You can remove that part.
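
The core of it is just subscribing to the surface topic and rescaling the normalized surface coordinates to pixels, roughly like this (a sketch; the surface name and screen size are placeholders, and the gaze_on_srf/norm_pos field names are the ones the helper uses at the time of writing):

```python
import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://127.0.0.1:50020')  # Pupil Remote default port
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect('tcp://127.0.0.1:{}'.format(sub_port))
sub.setsockopt_string(zmq.SUBSCRIBE, 'surfaces.screen')  # your surface name

screen_w, screen_h = 1920, 1080  # your monitor resolution

while True:
    topic, payload = sub.recv_multipart()
    surface_datum = msgpack.loads(payload, raw=False)
    for gaze in surface_datum.get('gaze_on_srf', []):
        norm_x, norm_y = gaze['norm_pos']
        # surface coordinates are normalized with the origin bottom-left
        px = int(norm_x * screen_w)
        py = int((1.0 - norm_y) * screen_h)
        print('gaze at pixel', px, py)
```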

wrp 14 February, 2018, 15:23:40

@user-4d2126 right eye camera is inverted by design because the sensor is physically rotated. You can flip the image display in the eye window's general menu if that helps you. Also if you are having difficulty adjusting the eye cameras, you may want to consider using the eye camera arm extender.

wrp 14 February, 2018, 15:24:26

You can see more about the eye camera arm extender here: https://docs.pupil-labs.com/#additional-parts

wrp 14 February, 2018, 15:24:34

we have been shipping this with all new 200hz systems

wrp 14 February, 2018, 15:25:05

but if one wants to buy these parts for an older system, you can do so via our shapeways store (linked in the docs in that section)

wrp 14 February, 2018, 15:26:07

If you'd like some concrete feedback on eye camera orientation, you can DM me a link to an eye video or send an eye video to info@pupil-labs.com for feedback

user-4d2126 14 February, 2018, 15:27:18

@wrp Thanks for the reply! The flip does not really help with this. Tried using the extenders, but while it did help with the angle, the picture ended up being too zoomed in for proper capture (might be wrong use on my part, did not have as much time to fiddle around with it yet). I will definitely stay in touch with you, thanks for the offer. Gotta figure out if we can use this or not ASAP.

wrp 14 February, 2018, 15:28:39

@user-4d2126 I am quite confident that we can get you up and running after a bit of adjustment/notes/tips. Please send us an email and I or my team at Pupil Labs can provide you with some feedback

user-e7102b 14 February, 2018, 16:48:28

@yoonmin#2698 You're welcome

user-e7102b 14 February, 2018, 16:57:43

Quick question for the pupil-labs developers: do you have any plans to add a desktop mounted eye-tracking solution to your product line?

wrp 15 February, 2018, 01:03:07

@user-e7102b we will continue to focus on head mounted/wearable systems. Currently no plans in the pipeline for remote systems

user-8909c4 15 February, 2018, 03:48:29

What does it take to get pupil tracking working on a raspberry pi zero w?

mpk 16 February, 2018, 14:08:25

@here We are happy to announce the release of Pupil v1.4! Get the latest release here: https://github.com/pupil-labs/pupil/releases/tag/v1.4

user-4d2126 16 February, 2018, 14:28:10

Hey guys, is it possible that eye color plays a role in the quality of detection? I had great readings from a person with dark eyes, but had great issues with people with light colored eyes. Any settings I could change to work around this?

mpk 16 February, 2018, 14:28:50

@user-4d2126 usually that is not an issue. Can you send a sample eye recording?

user-4d2126 16 February, 2018, 14:29:36

Will do.

user-4d2126 16 February, 2018, 14:35:23

any preferred file sharing service? files are too big for discord.

wrp 16 February, 2018, 14:36:15

@user-4d2126 drive is preferred

wrp 16 February, 2018, 14:36:27

share with info@pupil-labs.com

user-4d2126 16 February, 2018, 14:38:00

@wrp Shared

wrp 16 February, 2018, 14:38:20

Thanks @user-4d2126 - someone from our team will review and get back to you with feedback

user-4d2126 16 February, 2018, 14:41:47

@wrp Perfect. In the meantime, my boss is kinda pressuring me to do this ASAP while I do not really have much experience with this technology in general. Could you share any tips on the best possible calibration process for capture on mobile devices? I have read through the guide but did not really get a good idea of the best way to calibrate when the field of view is at an angle lower than 90 degrees.

mpk 16 February, 2018, 14:45:10

@user-4d2126 I checked both videos. Detection is fine. The only thing I had to change was reducing the min pupil diameter to around 11.

mpk 16 February, 2018, 14:45:21

Chat image

mpk 16 February, 2018, 14:46:34

@user-4d2126 if you use the other lens make sure to re-calibrate the camera intrinsics. For your use case you might also want to try 2d mode.

user-4d2126 16 February, 2018, 14:47:39

@mpk Will give it a try, thanks for your help. Will get back to you if I cannot make any breakthrough.

mpk 16 February, 2018, 14:47:49

ok sounds good!

user-29e10a 16 February, 2018, 16:00:22

Hey, just started developing on Windows so far... how is it possible to produce "binaries" of the pupil source code like the ones you're providing on the GitHub release page? We have some Windows clients "on the road" where I don't want to run the source directly... 😃

mpk 16 February, 2018, 16:05:32

@user-29e10a any chance your changes can be put into plugins?

user-29e10a 16 February, 2018, 16:07:33

good idea, I'm not sure, but I'll think about it – at least for the functional parts. I made some ui customizations which are not interesting for others

mpk 16 February, 2018, 16:10:02

@user-29e10a you can overwrite existing plugins with custom plugins. You can also make changes to the ui via a plugin.

mpk 16 February, 2018, 16:10:31

if you can use the plugin path, field deployment is super smooth. Otherwise, get ready for a rough ride 😃
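
For reference, a user plugin is a single Python file dropped into the plugins folder of your pupil_capture_settings directory. A minimal skeleton would look roughly like this (class and setting names are placeholders; the exact events key can differ between versions):

```python
# my_plugin.py -- place in the pupil_capture_settings/plugins folder
from plugin import Plugin


class My_Custom_Plugin(Plugin):
    """Minimal example plugin. Capture discovers it on startup."""

    def __init__(self, g_pool, my_setting=1.0):
        super().__init__(g_pool)
        self.my_setting = my_setting

    def recent_events(self, events):
        # called every world frame
        for gaze in events.get('gaze_positions', []):
            pass  # process gaze datums here

    def get_init_dict(self):
        # persists settings across restarts
        return {'my_setting': self.my_setting}
```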

user-29e10a 16 February, 2018, 16:11:40

ok, then I'll definitely look into it – on the other hand, what does not kill us makes us stronger 😬

user-137510 16 February, 2018, 17:42:31

hi, I have a beginner's question. I'd like to collect <x,y> positions of eye fixations and I can't figure out how to collect those data points. Preferably, I'd like to do that through Python, though using ready-made software would also work. Can you guys help me sort this out... the documentation is tremendously long and I get lost every time I dare search for something of that sort... thanks in advance.

user-5d12b0 16 February, 2018, 18:42:09

Hi, I'm trying to run pupil_capture from src from within PyCharm. Does anyone (@performlabrit ?) know how to modify a run configuration to expand the PATH to include ..\pupil_external ?

user-e7102b 16 February, 2018, 19:18:21

Hi, I've created a very basic "toolbox" containing functions and example scripts that enable Pupil_Capture to be controlled using MATLAB/Psychtoolbox. I'm on a tight deadline to get Pupil up and running, so my solutions may not be the most elegant...but they do the job for now. I figured I'd share here in case anyone else finds this useful. Any feedback from other MATLAB users would be welcome. https://github.com/attlab/pupil_labs_matlab

user-dfeeb9 16 February, 2018, 22:25:11

hey @user-e7102b and others here, I've taken a look and a lot of your python stuff looks very similar to a middleman system I've been working on. It seems we both essentially worked off the pupil-helper scripts. I've made mine public. You can find it here: https://github.com/mtaung/pupil_middleman Functionally they're basically the same and you have the addition of matlab scripts. For the time being, I think our stuff will work until Pupil brings over native support to matlab. Also, I don't know if this is actually all that useful. Some input from the pupil guys would be nice too. I also don't like how I defined certain strings to communicate with pupil functionality. Is there documentation for all available commands?

user-e7102b 17 February, 2018, 01:24:09

Hey @user-dfeeb9 Thanks for sharing this. I'll take a look at your scripts over the weekend.

wrp 17 February, 2018, 01:57:30

@user-e7102b @user-dfeeb9 this is great. Please consider adding a link and short description to your work in https://github.com/pupil-labs/pupil-community so that other members in the community can find and use your tools/scripts

user-634a3c 17 February, 2018, 05:26:49

Hello everyone... I'm working on a project related to eye motion tracking, but my pupil tracking has some noise issues. Anyone able to help me out, please?

user-138d0a 17 February, 2018, 12:40:40

any ideas if floppy discs make a good IR pass filter?

user-516564 17 February, 2018, 13:10:22

@user-138d0a, I would bet it would be waaaaaaay too dark

user-516564 17 February, 2018, 13:12:26

As a rule of thumb, since the ir filters filter just IR light (mostly), you can test your intended filter by placing it in front of the camera and checking the captured footage

user-138d0a 17 February, 2018, 13:13:50

you mean by placing an ir filter i should see black since everything is filtered? But I think any kind of filter is not perfect and will pass at least 1% right?

user-516564 17 February, 2018, 13:14:01

If you can't see anything, as I would bet with floppies, then it's no good as a filter. The other way around isn't true, though

user-516564 17 February, 2018, 13:15:05

No, I mean if you put a filter in front of the original camera, before removing the ir filter, you should be able to see some image

user-138d0a 17 February, 2018, 13:15:55

Sounds like the same thing, just with the IR filter and IR pass filter in a different order

user-138d0a 17 February, 2018, 13:16:37

unless both are 100% efficient something will be visible

user-516564 17 February, 2018, 13:17:03

Yeah, my point is that more than something should be visible

user-516564 17 February, 2018, 13:17:58

Not just something, but a lot of things. If it's too dark with both filters, chances are it will be too dark with just the second one (the floppy in your case)

user-516564 17 February, 2018, 13:19:21

But, of course, you could check it in the middle of the disassembly just by holding the floppy in front of the lens, without yet committing to the "surgery"

user-138d0a 17 February, 2018, 13:20:56

the visibility could be a result of imperfection in either of the filters; that won't tell me if the floppy disc actually acts as a real IR pass filter or just a very strong dimmer.

user-516564 17 February, 2018, 13:23:23

This test won't be able to tell you for sure if it's a good choice, but could tell you it's a bad one

user-516564 17 February, 2018, 13:24:00

Of course, the only way of knowing it works is by testing or knowing the properties of the material beforehand

user-138d0a 17 February, 2018, 13:24:27

unless someone else has done that already (likely)

user-516564 17 February, 2018, 13:25:04

I sincerely doubt a floppy would let enough light pass through. But i would like to know this too. Floppies are way easier to get than film negatives

user-e7102b 17 February, 2018, 22:21:04

@papr Is there a built-in way to prevent the world camera video from being recorded along with all the other pupil metrics? I plan to do many hours of recordings with my participants and these files are huge, so space will become an issue. I don't think I'll ever look at the raw world video footage, just pupil position relative to my defined surface.

user-e7102b 17 February, 2018, 22:22:06

I understand if this is not possible, and I can always figure out a workaround if not.

user-e91538 18 February, 2018, 00:49:39

A question regarding the setup of pupil-labs/libuvc on Windows 10, VS17, for x64. I've downloaded and built both pupil-labs/libuvc and the upstream libuvc. I'm attempting to use it with a device that definitely works with UVC and has had the libusbK (v3.0.7.0) driver installed on it via Zadig. The device does enumerate properly (albeit with some minor heap corruption), however when I try to start the stream, "uvc_stream_open_ctrl" gives me back the error Not Supported. What could be going wrong there?

user-41f1bf 18 February, 2018, 05:11:46

@user-e7102b how long will each of your recording sessions last? Do you have any precise number in mind? There was a memory issue that prevented people from opening their long recordings in Pupil Player.

user-41f1bf 18 February, 2018, 05:14:37

Anyway, a workaround is to split recordings... I would recommend 20-30 min

user-41f1bf 18 February, 2018, 05:18:10

You can save a lot of space using the compression option (either less space-more speed or more space-less speed)

user-adb91e 18 February, 2018, 13:41:48

Can anyone tell me if the binocular addons for Vive/Rift provide enough precision for performing eye tracking for foveated rendering? The addon isn't cheap so would like to know if it is worth investing in. If anyone has done it before any links and/or videos would be appreciated.

papr 18 February, 2018, 14:22:55

@user-e7102b You will need to modify the source code to not record the world video. But you will be able to use Pupil Player to analyse the recording, starting with v1.4

user-e7102b 18 February, 2018, 16:44:47

@user-41f1bf Hi, the project we plan to use the tracker for first will mainly involve multiple 5-30 minute recordings, so hopefully that means we won't run into issues. I was hesitant to try the "smaller file, more cpu" compression option as I was planning on running pupil capture on the same machine that I'm using to display stimuli (and I was worried this increased CPU usage might have a negative impact on timing), but I'll give this a try. Thank you for the suggestion and information.

user-e7102b 18 February, 2018, 16:45:40

Btw has this memory issue that prevented people from opening long recordings in pupil player been fixed now?

papr 18 February, 2018, 20:48:37

@user-e7102b This has been mostly a Python 3.5 multi-processing issue. All bundles are frozen using Python 3.6. 30 minutes of pupil_data should be no problem at all. Nonetheless, we are planning to introduce changes that deal with very long recordings. But these take time to implement and test thoroughly.

papr 18 February, 2018, 20:50:22

The real time annotation feedback can be implemented easily using log messages. I will make a quick PR to include this change

user-41f1bf 18 February, 2018, 21:09:40

@user-adb91e I would recommend you to ask HMD questions in the HMD room. πŸ™‚

papr 18 February, 2018, 21:10:06

For reference, that is vr-ar πŸ™‚

user-41f1bf 18 February, 2018, 21:14:27

@user-adb91e I can say for sure that Pupil has enough precision for a lot of use cases. And I can also say that monocular accuracy is good enough for stationary stuff.

user-adb91e 18 February, 2018, 21:15:02

stationary stuff as in?

user-41f1bf 18 February, 2018, 21:15:24

As in my doctoral thesis

user-41f1bf 18 February, 2018, 21:15:37

:)

papr 18 February, 2018, 21:16:38

I do not know much about this topic, but the question for me is whether the delay from running the pupil detection is worth the time saved on GPU rendering.

user-41f1bf 18 February, 2018, 21:17:41

If you are planning to use binocular setups, you should have better chances than I did for higher quality detection

user-41f1bf 18 February, 2018, 21:18:06

During stationary inquiries

user-41f1bf 18 February, 2018, 21:19:29

I am talking about accurate detection. Pupil precision is excellent in any case as far as I know.

user-adb91e 18 February, 2018, 21:20:38

detection of what?

papr 18 February, 2018, 21:21:02

Pupil detection.

papr 18 February, 2018, 21:21:58

In the monocular case it is difficult to detect the pupil if the subject looks in the direction opposite the eye camera.

user-41f1bf 18 February, 2018, 21:22:07

Are you aware of how Pupil measures detection quality?

user-41f1bf 18 February, 2018, 21:22:59

There are two basic measures: precision and accuracy.

papr 18 February, 2018, 21:24:41

These two terms refer to the quality of gaze mapping though. Not to the pupil detection itself. (As far as I understood them)

user-41f1bf 18 February, 2018, 21:26:08

Yes! I should have said data quality.

user-41f1bf 18 February, 2018, 21:27:34

Gaze mapping means the end result. The data points that you effectively end up with.

user-41f1bf 18 February, 2018, 21:36:15

@user-adb91e I think your question is too broad, as I am understanding it. As @papr has suggested, are you concerned with machine performance (rendering itself)? Or, as I suggested, with accurate data during assumed good rendering?

user-adb91e 18 February, 2018, 21:37:03

the latter

papr 18 February, 2018, 21:38:39

Noisy mapping can be counteracted by increasing the high-resolution rendered area.

papr 18 February, 2018, 21:39:43

Up to the point where one would not speak of foveated rendering anymore.

user-adb91e 18 February, 2018, 21:41:54

I can discuss the feasibility of foveated rendering in great detail but first I'd like to know if pupil software and hardware can provide accurate enough data for foveation.

papr 18 February, 2018, 21:45:41

I think accuracy is not the issue here. The question is if the processing is fast enough for foveated rendering. I do not know by heart the total delay it takes the pipeline to produce a gaze sample, starting from capturing the eye video frame. @mpk knows it.

papr 18 February, 2018, 21:53:14

The more I think about it the more I am sure that this is doable. You will need to detect saccades though and disable foveated rendering until you detect a fixation which you can use to re-enable foveated rendering at the position of the fixation.
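
A simple velocity threshold over consecutive gaze datums would be a starting point for the saccade detection. A sketch (the threshold and field-of-view numbers are purely illustrative):

```python
SACCADE_VELOCITY_THRESHOLD = 100.0  # deg/s, illustrative value

def is_saccade(prev_gaze, curr_gaze, fov_deg=(90.0, 90.0)):
    """Rough velocity-threshold check on two consecutive gaze datums.

    Gaze 'norm_pos' is assumed normalized to [0, 1]; angular velocity
    is approximated via the display's field of view.
    """
    dt = curr_gaze['timestamp'] - prev_gaze['timestamp']
    if dt <= 0:
        return False
    dx = (curr_gaze['norm_pos'][0] - prev_gaze['norm_pos'][0]) * fov_deg[0]
    dy = (curr_gaze['norm_pos'][1] - prev_gaze['norm_pos'][1]) * fov_deg[1]
    velocity = (dx * dx + dy * dy) ** 0.5 / dt
    return velocity > SACCADE_VELOCITY_THRESHOLD
```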

user-adb91e 18 February, 2018, 21:54:22

there are different types of saccades, which one are you referring to?

papr 18 February, 2018, 21:55:29

I do not know the exact terms. I do not mean micro saccades but those that change the position of the gaze location dramatically

user-adb91e 18 February, 2018, 21:56:20

why disable during those?

papr 18 February, 2018, 21:59:37

Because you will not know where the subject will fixate next, and there is a delay between the actual start of the fixation and the Pupil pipeline giving you the gaze point for the start of the fixation. If you do not disable foveated rendering during the saccade, it will result in a poorly rendered scene at the location of the fixation for a short period of time (the delay between the actual start of the fixation and the pipeline giving you the gaze point).

papr 18 February, 2018, 22:00:44

This delay is only a few milliseconds but may be enough for the subject to notice the poorly rendered scenes.

papr 18 February, 2018, 22:02:34

On the other hand, you render at a much lower frequency than the system is able to produce gaze points (2x 200Hz). Therefore the delay might be low enough for continuous foveated rendering.

user-adb91e 18 February, 2018, 22:03:59

https://www.youtube.com/watch?v=hKSlewfW_5Q

user-adb91e 18 February, 2018, 22:05:40

this is from 2012 so not sure about performance issues. 200Hz cameras might be unnecessary

papr 18 February, 2018, 22:16:37

The mapping accuracy always depends on the quality of your calibration but lies around 0.8-1.3 degrees depending on the chosen detection/mapping mode

user-adb91e 18 February, 2018, 22:19:50

The purpose of foveated rendering is to render more complex scenes than the hardware could handle with normal rendering. If it is going to be turned off at any point, there will be an fps drop at that point, which is a no-no in VR, so that's not an option.

user-adb91e 18 February, 2018, 22:20:20

it is not there to merely save system resources

papr 18 February, 2018, 22:20:30

Ok, I understand.

user-adb91e 18 February, 2018, 22:24:15

Some proprietary systems guess the new gaze position and provide it to the user before the eye actually arrives there; does Pupil not have such a feature?

papr 18 February, 2018, 22:27:21

No, gaze points are evidence-based. This means that everything starts with a single video frame of the eye camera on which the pupil detection algorithm is run. It results in a 2d ellipse that is relative to the eye image. In the 2d mapping mode this 2d ellipse is used to generate a single gaze point. In the 3d mapping mode a series of 2d ellipses is used to fit our 3d eye model and to generate 3d data that is projected into the space of the world camera/hmd display

papr 18 February, 2018, 22:30:20

Any guesswork on future eye positions would be done based on previous gaze locations and can therefore be calculated outside of the pipeline.
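
For illustration, that could be as simple as a linear extrapolation over the two most recent gaze datums (just a sketch, not something Pupil ships):

```python
def extrapolate_gaze(g0, g1, t_future):
    """Naively extrapolate gaze linearly; g1 is the newer datum."""
    dt = g1['timestamp'] - g0['timestamp']
    if dt <= 0:
        return g1['norm_pos']
    vx = (g1['norm_pos'][0] - g0['norm_pos'][0]) / dt
    vy = (g1['norm_pos'][1] - g0['norm_pos'][1]) / dt
    lead = t_future - g1['timestamp']
    return (g1['norm_pos'][0] + vx * lead,
            g1['norm_pos'][1] + vy * lead)
```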

papr 18 February, 2018, 22:39:30

@user-e7102b https://github.com/pupil-labs/pupil/pull/1080

user-e7102b 19 February, 2018, 02:22:01

@papr Great - thank you!

wrp 19 February, 2018, 02:59:43

pyglui v1.18 wheel (for Windows devs) has been uploaded - https://github.com/pupil-labs/pyglui/releases/latest

user-41f1bf 19 February, 2018, 03:26:34

@user-adb91e do you mind telling me what proprietary systems will guess gaze positions? Just to make sure I will not buy them.

user-41f1bf 19 February, 2018, 03:28:39

It was a joke 😬

user-29e10a 19 February, 2018, 08:29:52

@mpk Apparently the plugin system does not allow the manipulation or investigation of the eye images, which is crucial for me, so for this part I have to take the rough ride ... are you working on including the eye images into the plugin pipeline?

papr 19 February, 2018, 08:33:13

@user-29e10a do you need real time access to the eye images? How much of a delay would be OK for you? You can use the frame publisher plugin to broadcast eye images over the IPC. If the IPC adds too much delay, you will have to modify the eye source code for direct access.
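
Receiving the broadcast frames would look roughly like this (a sketch; it assumes the Frame Publisher plugin is active with jpeg format, Pupil Remote on its default port, and that the raw image bytes arrive as a third message frame):

```python
import zmq
import msgpack
import numpy as np
import cv2

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://127.0.0.1:50020')
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect('tcp://127.0.0.1:{}'.format(sub_port))
sub.setsockopt_string(zmq.SUBSCRIBE, 'frame.eye')

while True:
    topic, payload, raw_img = sub.recv_multipart()
    meta = msgpack.loads(payload, raw=False)
    if meta.get('format') == 'jpeg':
        img = cv2.imdecode(np.frombuffer(raw_img, dtype=np.uint8),
                           cv2.IMREAD_COLOR)
        # hand `img` off to your own (separate-process) processing here
```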

user-29e10a 19 February, 2018, 08:49:54

@papr the delay is not very important (1 sec would be ok, for example) – but the processing of each frame could take longer – if I manage to do the processing off the main thread, this would be no problem, right?

user-29e10a 19 February, 2018, 08:51:12

i will look into this – pupillabs never ceases to amaze me πŸ˜ƒ

user-29e10a 19 February, 2018, 08:51:39

by the way, any progress on putting the 200 Hz cams into the vr headsets? 😃

papr 19 February, 2018, 09:16:14

@user-29e10a the eye and world processes are separated from each other for performance reasons. I would suggest doing the eye frame processing in a separate process as well.

user-4d2126 19 February, 2018, 15:23:29

Hey guys, finished our first test over the weekend with what I would call moderate success. One thing I have realized is that to accurately track eyes on a mobile device, one has to fix the device in one place, since if the player holds it in their hands it's impossible not to move it at least slightly. Is there anything I could do to work around this via software/calibration? If not, do you have any tips for or even a solution for some kind of holder?

So far the best solution I could think of was using a tripod with a grip for the mobile phone.

papr 19 February, 2018, 15:26:46

@user-4d2126 "fix the device to one place since" do you refer to the phone or to the headset?

papr 19 February, 2018, 15:27:02

And what happens if this device is moved?

user-4d2126 19 February, 2018, 15:34:03

@papr The tracking shifts accordingly, meaning that what the original calibration considered the center is now on the right side of the screen (if the device was moved to the left). This might be an issue with the way I have calibrated it, but so far I haven't found any other way to do it (to be fair, I have been using the tracker for only a week).

papr 19 February, 2018, 15:39:41

This is a known error source called slippage. Were you using 2d detection/mapping or 3d?

user-4d2126 19 February, 2018, 15:41:28

@papr I have used 2D mode since I have been having issues with getting an accurate calibration in the 3D mode. 2D was a recommendation from mpk.

papr 19 February, 2018, 15:43:29

Yes, 2d is more accurate on average but more prone to slippage.

user-4d2126 19 February, 2018, 15:44:11

since we are using this on mobile devices ranging from 4 to 6 inch screens, we need it as accurate as possible.

papr 19 February, 2018, 15:44:54

In your case I would recommend to recalibrate periodically

user-4d2126 19 February, 2018, 15:46:10

I have been fiddling with calibration since the beginning, and the most accurate results were produced by recording the automatic screen calibration on a computer and playing it as a video on the mobile with manual calibration on, but this of course leads to the necessity of keeping the device in the same spot.

user-4d2126 19 February, 2018, 15:47:18

Not sure if head movement affects the accuracy as much but since the tracking is done in a sitting position (with device being fixed in one place) there should (hopefully?) not be much room for moving the head and getting different view angles.

user-4d2126 19 February, 2018, 15:50:28

I have a very limited understanding of this as I am new to this technology so maybe I am just going completely wrong around this.

papr 19 February, 2018, 15:55:01

Ok, I think the issue is a different one: your area of calibration is too small. Yes, smaller calibrated areas result in more accurate results within that area, but in very bad results outside of it.

papr 19 February, 2018, 15:58:45

The calibrated area is fixed within the world camera's field of view. If the phone moves out of the area, you will get incorrect results when the subject looks at the phone.

papr 19 February, 2018, 16:00:41

Therefore my suggestion is to increase the calibration area.

user-4d2126 19 February, 2018, 16:40:54

@papr So for example, if I used a tablet to play the calibration video and then used a phone in the area where the tablet was, I would get accurate results in a wider area? I have been trying a similar thing with a computer display, but the results were not accurate at all.

papr 19 February, 2018, 16:42:05

What do you mean by not accurate at all? What does the accuracy visualizer plugin tell you? It calculates the accuracy in degrees after each calibration

user-4d2126 19 February, 2018, 16:56:20

I was getting an offset (at least an inch) no matter what I did until I tried the aforementioned thing with the video played on phone.

papr 19 February, 2018, 17:36:59

Is an headrest an option for you?

user-4d2126 19 February, 2018, 18:40:53

@papr Depends. Do you think that a headrest would bring better data than having a phone fixed to one position?

papr 19 February, 2018, 21:46:21

The headrest would fix the last thing in your setup that is not fixed yet: the head. The phone is fixed. The headset does not move much if the head does not move. And the phone does not leave the calibration area if the head is fixed. You should increase your calibration area a bit though. Playing the video back on the phone will result in a calibration area that is smaller than the display.

user-4d2126 19 February, 2018, 21:49:57

Fair enough, will definitely look into options for headrest. Was hoping to capture this in a completely natural environment (ie. device in lap and head free) but I guess the technology is not at the point where this would be feasible yet.

papr 19 February, 2018, 21:51:54

@KeenMFG#0059 I understand and Pupil is actually designed for this natural environment. But accuracy comes at a cost.

papr 19 February, 2018, 21:53:51

Or to be more specific: Even if the angular error is small, the absolute error increases with distance.
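
Back-of-the-envelope, the linear error on the target plane is the distance times the tangent of the angular error:

```python
import math

def absolute_error_cm(distance_m, angular_error_deg=1.0):
    # linear error on the target plane for a given angular error
    return 100 * distance_m * math.tan(math.radians(angular_error_deg))

print(absolute_error_cm(0.4))  # phone at arm's length: ~0.7 cm
print(absolute_error_cm(2.0))  # screen at 2 m: ~3.5 cm
```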

user-4d2126 19 February, 2018, 21:54:44

Yeah, I came here with the idea that for the results we are looking for, we will need to make some sacrifices as far as the player's comfort goes. I believe that will be enough to persuade my boss to go for this. :)

papr 19 February, 2018, 21:56:21

Cool, let us know if you need any more information or tips. I think @user-41f1bf also used a headrest in his work. Maybe he can elaborate on his experience with it.

user-4d2126 19 February, 2018, 22:00:07

Great, thanks for all the help. I can definitely use any I can get, but from the reaction to the data I got so far, the leads seem satisfied. Also really appreciate how active you guys are here. I believe this could lead to a long-term partnership and hopefully to an actual integration of your technology with the mobile gaming business in the future.

user-4d2126 19 February, 2018, 22:02:28

Either way, what I wanted to say: awesome work, keep it up. :)

papr 19 February, 2018, 22:05:08

Happy to hear that πŸ™‚

user-dfeeb9 19 February, 2018, 22:14:46

hi @papr and @wrp, you wanted me to add our repo to the pupil-community MD right? is that just through a PR?

user-dfeeb9 19 February, 2018, 22:14:59

myself and tom have merged our projects over here: https://github.com/mtaung/pupil_middleman

papr 19 February, 2018, 22:26:09

@user-dfeeb9 Yes, correct, just add a link to your repository to https://github.com/pupil-labs/pupil-community/blob/master/README.md and make a PR with the changes. πŸ™‚

papr 19 February, 2018, 22:26:40

It would probably fit best in the Scripts category

user-dfeeb9 19 February, 2018, 22:30:14

thanks

user-dfeeb9 19 February, 2018, 22:30:36

on a related note @papr, if you ever get a chance I would greatly appreciate a gloss over what I wrote there - it's mostly stuff you'll be familiar with, i.e. pupil-helpers code

user-dfeeb9 19 February, 2018, 22:30:44

but my main worry is that i may have fudged up the implementation somewhere

user-dfeeb9 19 February, 2018, 22:30:51

and i still need to write in proper time sync with network time and pupil-recorder

papr 19 February, 2018, 22:31:58

@user-dfeeb9 of course πŸ™‚

user-dfeeb9 19 February, 2018, 22:32:07

thank you kindly!

user-41f1bf 20 February, 2018, 01:32:25

@user-4d2126 yep, it will improve stability, but I would recommend a chin rest if you really need to ensure that something is always visible to the world camera. For me it was necessary to detect the computer screen, so I needed the screen always in sight.

user-41f1bf 20 February, 2018, 01:38:38

You should be aware that calibration is highly influenced by the operator (you). So you should develop a routine to efficiently handle moving people around, people wearing glasses, spotting when the pupil is not being detected, when calibration markers are not being detected, the best calibration method for your use case, and so on

papr 20 February, 2018, 06:18:49

☝ this is very much true.

user-2798d6 20 February, 2018, 17:44:55

Hello! I love that Player now shows the time bar, but is there somewhere to concurrently see the frame number so I know what frames to export? Thank you!

papr 20 February, 2018, 17:46:45

I am working on that. Expect that option in the next release. Currently this is only possible with the old version.

user-2798d6 20 February, 2018, 17:47:25

Ok - so when I do video export or raw data export can I only do the full video right now?

papr 20 February, 2018, 17:48:02

no, you can still use the trim marks. They are just time based instead of frame index based

papr 20 February, 2018, 17:48:34

I am working on an option that displays the trim range once based on time and once based on frame index
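
In the meantime you can look up the frame index for a given time yourself from the recording's world timestamps (a sketch; assumes numpy and the world_timestamps.npy file from the recording folder):

```python
import numpy as np

timestamps = np.load('world_timestamps.npy')  # one entry per world frame

def frame_index_at(seconds_into_recording):
    target = timestamps[0] + seconds_into_recording
    # index of the world frame whose timestamp is closest to the target
    return int(np.argmin(np.abs(timestamps - target)))

print(frame_index_at(42.0))
```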

user-2798d6 20 February, 2018, 17:49:15

oh I see! Thanks! One last question - is there now a way to get saccade length/duration or is that still in progress?

papr 20 February, 2018, 17:49:58

This is still work in progress

user-2798d6 21 February, 2018, 01:34:16

I'm working in Player with offline calibration and want to adjust the mapping range, but don't see any way to figure out frames since the bar now shows time. Is there a way to do this or a way to make the mapping ranges done by timestamp instead of frame?

wrp 21 February, 2018, 02:56:49

@user-2798d6 we will be releasing a fix for this soon.

papr 21 February, 2018, 09:47:39

@user-2798d6 The v1.4 Player release allows to set calibration and mapping ranges from the current state of the trim marks. No need to manually type frame numbers anymore.

mpk 21 February, 2018, 13:36:33

Important [email removed] :

We have found a severe bug in Pupil Mobile when used with our new 200hz cameras. We have observed that eye videos recorded with the new 200hz cameras are not in sync with each other nor with the world video. This results in faulty calibration and gaze mapping. We are working on a fix as we speak! Please do not use Pupil Mobile with the new 200hz eye cameras/Pupil headset until this is resolved. Headsets with 120hz eye cameras are NOT affected by this bug. We will update you all again here when the bug has been resolved.

user-c83a9a 21 February, 2018, 15:20:42

hello all, I tried to run Pupil Player v1.4-2 on macOS 10.13.3. After clicking "open" on the macOS Gatekeeper prompt, the application immediately quits. I can't find any solution to the problem. Ideas?

mpk 21 February, 2018, 15:25:55

@user-c83a9a does the same happen with capture?

papr 21 February, 2018, 15:42:40

I was not able to reproduce this issue on my Mac:
1) I downloaded the zip file
2) Opened the Pupil Player dmg file
3) Copied the Pupil Player app into my Applications folder
4) Ejected the disk image
5) Opened the Player application from the applications folder
6) Gatekeeper asked for permission
7) I granted permission
8) The "drop a recording" window appears as expected.

user-41f1bf 21 February, 2018, 17:18:11

Hi

user-41f1bf 21 February, 2018, 17:18:21

For pupil plugins in windows

user-41f1bf 21 February, 2018, 17:19:04

What is the home folder for plugins on Windows?

papr 21 February, 2018, 17:19:50

AFAIK \Users\<user>\pupil_<app>_settings\plugins\

user-41f1bf 21 February, 2018, 17:20:15

Thanks @papr !

user-41f1bf 21 February, 2018, 17:21:49

And for mac?

user-41f1bf 21 February, 2018, 17:22:36

I will make a pull request with this information, it is missing in the docs

papr 21 February, 2018, 17:22:43

~/pupil_<app>_settings/plugins

user-41f1bf 21 February, 2018, 17:22:53

Great!

papr 21 February, 2018, 17:23:06

see my edit above

user-41f1bf 21 February, 2018, 17:23:28

Yes, I got it

user-02ae76 21 February, 2018, 19:18:09

Hey everyone! I am running into an issue with my new 200hz headset. On the new version of Pupil Capture (this didn't happen a few weeks back), when I connect my headset, the eye camera simply won't read in. I get log messages of "[info] Found device. Pupil cam2 ID0" and "eye0 warning (video_capture.uvc_backend: capture failed to provide frames. Attempting to reinit.)"

Then it goes through warnings about backlight, pre-recorded calibration, etc. I've tried re-installing, re-plugging and different cords. This camera worked a week ago and nobody has touched it since, so it shouldn't be a hardware issue. I appreciate any insight or tips!

mpk 21 February, 2018, 19:23:24

@arispawgld#8014 what os are you using?

user-02ae76 21 February, 2018, 19:50:40

macOS Sierra

user-02ae76 21 February, 2018, 20:16:56

I've resolved the issue!

user-02ae76 21 February, 2018, 20:19:46

I do have a question about the new natural features edit mode: when I have tried to use it to increase my tracking accuracy, I notice no difference in the tracking once I mark points. Do I need to click recalibrate for them to take effect?

papr 21 February, 2018, 20:45:02

That, and you need to set the calibration section to natural features and make sure that the natural features lie within the calibration range

user-537e9a 22 February, 2018, 04:36:28

Hello. I think you should revise a piece of the pupil docs. On the page "https://docs.pupil-labs.com/#python-libs", the "pip install msgpack_python" command is now deprecated according to "https://pypi.python.org/pypi/msgpack-python" on the official Python package index. The installation should use "pip install msgpack" instead.

wrp 22 February, 2018, 04:40:12

@user-537e9a thanks for pointing this out

wrp 22 February, 2018, 04:40:34

you can make a PR to https://github.com/pupil-labs/pupil-docs with a change to the install instructions.

user-537e9a 22 February, 2018, 04:44:34

Okay, I will make a PR. Thanks for your quick response.

wrp 22 February, 2018, 04:46:18

You're welcome πŸ˜„

user-8be7cd 22 February, 2018, 08:03:17

Is there an easy way to generate a video out of the surface area?

papr 22 February, 2018, 09:13:59

@user-8be7cd Unfortunately not

user-072005 22 February, 2018, 13:16:15

Hello, I realize there is a major bug in Pupil Mobile right now which could be the cause of my problems. But, is it normal for mobile to not produce a world.mp4 file? We are trying to use a phone owned by the school (Samsung Galaxy 8) and I need to check that the phone isn't the problem here. It's a new 200Hz headset.

papr 22 February, 2018, 13:30:58

@user-072005 Pupil Mobile creates videos according to the camera names: Pupil Cam 2 ID0.mjpeg, Pupil Cam 2 ID1.mjpeg, and Pupil Cam 1 ID2.mp4 (if h264 encoding has been enabled for the world cam, the default). Therefore there is no world.mp4 after copying the recording from the phone. Nonetheless, Pupil Player will try to detect the world video file and rename it to world.mp4 when it opens the recording for the first time

papr 22 February, 2018, 13:32:02

@user-072005 I suggest you verify that the world camera works correctly by either opening the preview on the phone (click on Pupil Cam1 ID2) or by streaming the video via wifi to Pupil Capture.

user-072005 22 February, 2018, 13:34:33

@papr Yes, both cameras are working and I did stream it to capture. I suppose my overall goal is to figure out why when I play the folder in Pupil Player, there's nothing showing that it detected where the eye was looking. I could see this when I streamed it, just not when transferring the video files. Is this caused by the bug mentioned previously?

papr 22 February, 2018, 13:37:06

@user-072005 No, the bug mentioned earlier is a different problem. Could you please send me a list of all filenames in an example recording after you have opened it in Player? And which Player version do you use?

user-072005 22 February, 2018, 13:42:24

@papr audio_0001.mp4, audio_0001.time, imu_002.imu, imu_0002.time, info.csv, key_0004.data, key_0004.time, Pupil Cam1 ID2.mjpeg, Pupil Cam1 [email removed] Pupil Cam2 ID0.mjpeg, Pupil Cam2 ID0.time, I downloaded the Windows x64 v1.3.9 player.

papr 22 February, 2018, 13:43:00

Pupil Cam1 ID2.mjpeg is your world video.

papr 22 February, 2018, 13:43:37

h264 transcoding was disabled and therefore Pupil Mobile saved the raw mjpeg data stream.

papr 22 February, 2018, 13:44:32

Is Pupil Cam1 [email removed] a typo, or is it Pupil Cam1 ID2.time in reality?

user-072005 22 February, 2018, 13:45:05

It was a typo, the latter is correct

papr 22 February, 2018, 13:46:39

Opening this recording in Player should upgrade the recording and you should end up with eye0.mjpeg, eye1.mjpeg, and world.mjpeg and their respective *_timestamp.npy files. Something is wrong if this does not happen.

papr 22 February, 2018, 13:47:26

Are you able to share your recording privately with me such that I can investigate the issue?

user-072005 22 February, 2018, 13:48:05

Sure, but I've never used discord. So how do I do that?

papr 22 February, 2018, 13:48:41

I would recommend to upload the recording to Google Drive and to share the link to the recording with me in a private message.

user-072005 22 February, 2018, 13:49:00

ok, I will do that

papr 22 February, 2018, 13:49:10

Alternatively, you can share it directly with [email removed]

user-2742eb 22 February, 2018, 14:11:23

hi everyone! does anyone of you have experience with running pupil on a nvidia jetson platform? thanks in advance πŸ˜ƒ

user-e7f18e 22 February, 2018, 14:50:11

can anybody tell me how to check that the red dot is landing on the pupil properly, irrespective of head movement, without using the Pupil Service exe?

papr 22 February, 2018, 14:50:50

Just as a follow-up to @user-072005's issue: The recording has been upgraded correctly by Pupil Player. The actual issue was that the gaze was not shown. This is due to the fact that Pupil Mobile does not do any gaze mapping. In order to get gaze for a Pupil Mobile recording, one has to record the calibration sequence as well and use Offline Pupil Detection/Calibration as described in https://docs.pupil-labs.com/#data-source-plugins

papr 22 February, 2018, 14:52:05

@user-e7f18e What do you mean by that the red dot is coming into pupil? Could you describe your setup?

user-e7f18e 22 February, 2018, 14:53:55

I can give you remote if you want

papr 22 February, 2018, 14:56:26

That is probably not necessary. Could you explain your question in a little bit more detail?

user-e7f18e 22 February, 2018, 14:57:34

when I run any Unity scene, in the background I can check the eyes before calibration through Pupil Capture

user-e7f18e 22 February, 2018, 14:58:37

I want to do this without using the Pupil Capture/Service application; through code, how can I check that my eyes will be properly calibrated?

papr 22 February, 2018, 15:01:08

You either need Pupil Capture or Pupil Service. These applications capture the eye cameras and do the pupil detection, calibration and gaze mapping. There is no way around these applications if you want to use the Pupil add-on.

user-11dbde 22 February, 2018, 15:13:36

Hello I am new to pupil labs

user-11dbde 22 February, 2018, 15:13:46

I just got one for hololens

wrp 22 February, 2018, 15:22:05

Hi @user-11dbde welcome to the Pupil community πŸ‘‹

user-11dbde 22 February, 2018, 15:33:56

thank you!

user-11dbde 22 February, 2018, 15:34:10

i am having trouble running the unity example for hololens

user-11dbde 22 February, 2018, 15:34:18

it compiles

user-11dbde 22 February, 2018, 15:34:22

but i cannot deploy it

user-11dbde 22 February, 2018, 15:34:38

i receive a code 1 error when trying to deploy

papr 22 February, 2018, 15:35:12

Please see the vr-ar channel for questions related to unity πŸ™‚

user-11dbde 22 February, 2018, 15:51:34

thank you

user-11dbde 22 February, 2018, 16:09:28

no one answering in hmd-eyes πŸ˜ƒ

papr 22 February, 2018, 16:16:16

@user-e04f56, the hololens example author will surely answer when he is back in the office tomorrow.

user-11dbde 22 February, 2018, 16:16:30

Thank you!

user-02ae76 22 February, 2018, 16:54:47

@papr Thanks for the help with the natural features editor! I'm loving this feature

user-c828f5 22 February, 2018, 17:28:43

@papr A quick question. Can you point me towards resources regarding the various parameters in Pupil player?

papr 22 February, 2018, 17:29:47

parameters for what exactly?

papr 22 February, 2018, 17:30:43

You can find an overview over the plugins here: https://docs.pupil-labs.com/#pupil-player

user-c828f5 22 February, 2018, 17:30:43

Pupil diameter threshold is self-explanatory, but what about intensity levels? And confidence threshold?

user-c828f5 22 February, 2018, 17:32:03

and most importantly, model sensitivity.

papr 22 February, 2018, 17:32:34

Ah, are you talking about the 3d detector parameters?

papr 22 February, 2018, 17:32:41

In the eye window?

papr 22 February, 2018, 17:38:10

See this paper for the 2d detection parameter explanation https://arxiv.org/pdf/1405.0006.pdf

papr 22 February, 2018, 17:40:46

The model sensitivity describes how sensitive the 3d model is to change. But we are continuously working on improving the detection/mapping pipeline. The next iteration will have more in-depth documentation.

user-c828f5 22 February, 2018, 18:01:33

Thanks @papr yes, I was. I essentially wanted to gain an intuition about which parameters to change on what type of eye image.

user-8e4642 22 February, 2018, 22:15:57

Hello everyone. Any particular suggestion about oTree, Willow or Psychopy to run experiments using Pupil?

user-41f1bf 23 February, 2018, 00:03:22

What about OpenSesame?

user-2798d6 23 February, 2018, 02:04:51

I just wanted to tell you all that the trim markers for calibration and mapping are AWESOME! Super easy - thank you!

user-e02f58 23 February, 2018, 03:01:34

Same here, I just received the headset and tried the program. It was great.

user-e02f58 23 February, 2018, 03:05:56

I am planning to communicate with an external hardware device using pySerial, and also to synchronize the hardware with Pupil's timestamps. Is there any existing plugin that does this?

wrp 23 February, 2018, 03:11:46

@user-e02f58 thanks for the feedback πŸ˜„ you might want to check out this project which is a plugin using pyserial - https://github.com/akramhussein/pupil_plugin_marker_tracking_and_fixation_detection

wrp 23 February, 2018, 03:12:17

you may also want to get in touch with the author of this fork: https://github.com/jesseweisberg/pupil

wrp 23 February, 2018, 03:13:14

the second link - jesseweisberg's fork of pupil does not explicitly use pyserial but uses ROS to control a prosthetic hand. The first project demonstrates how to make a plugin for Pupil that can control external hardware using PySerial.
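
For reference, a minimal sketch of synchronizing an external device to Pupil's clock via Pupil Remote's "t" (current time) request, assuming the default tcp://127.0.0.1:50020 endpoint; the serial port name and baud rate are placeholders:

```python
import time
import zmq
import serial  # pyserial

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")  # Pupil Remote's default address

remote.send_string("t")                  # request Capture's current Pupil time
pupil_time = float(remote.recv_string())
offset = pupil_time - time.monotonic()   # maps the local clock to Pupil time

# Read one event from the external device and stamp it in Pupil time.
# Port name and baud rate are placeholders for your hardware.
ser = serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=1)
line = ser.readline()
event_pupil_time = time.monotonic() + offset
print(event_pupil_time, line)
```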

user-e02f58 23 February, 2018, 03:42:30

thanks!!!

wrp 23 February, 2018, 03:48:05

Welcome πŸ˜„ - You may also want to check out other projects in the pupil-community repo: https://github.com/pupil-labs/pupil-community

user-cbb918 23 February, 2018, 07:33:55

Hello everyone. We are trying to get Pupil Capture working on a Mac with OS 10.9 (Mavericks), but it crashes at startup. We have tried the latest version of the Pupil software and the previous one. It works fine on another computer running El Capitan. I attach the crash report. Any hint about what might be happening would be great.

crash_report.pdf

user-e7f18e 23 February, 2018, 11:05:08

What are the standard settings for the Pupil Capture exe, and how can I tell whether tracking is working properly in Capture/Service?

user-072005 23 February, 2018, 14:35:55

When using the offline pupil detector on Windows 10 with a monocular eye tracker, it crashes before completion almost every time. What could be causing this?

mpk 23 February, 2018, 15:00:15

@user-072005 are you using Pupil Mobile and 200hz camera?

user-072005 23 February, 2018, 15:02:43

yes

user-072005 23 February, 2018, 15:12:32

it says invalid data found when processing input

mpk 23 February, 2018, 15:18:55

@user-072005 please note that there is a current bug in Pupil Mobile regarding this. We are very sorry. I wrote about this 2 days ago.

mpk 23 February, 2018, 15:19:16

For now I recommend using Pupil Capture. This should be resolved in the next couple days with a new software release.

user-072005 23 February, 2018, 15:20:04

ah ok, I'll keep an eye out for the new release. Thanks!

user-a49a87 23 February, 2018, 15:45:30

Hi, Is there a way to make Capture remember my settings? Every time I launch, I have to set the 'absolute exposure time'. Thank you

mpk 23 February, 2018, 16:17:39

@user-a49a87 this setting is overwritten by us. We can change this. Why do you need a custom value here?

user-44e666 23 February, 2018, 16:25:57

Hi, we're looking to maybe purchase some of the eye-tracking glasses. Anyone got any feedback?

user-3f48b6 23 February, 2018, 16:32:56

Hi, today I am experiencing some artifacts in my eye image. I'm attaching a screenshot. Is this a compromised camera? These white lines keep showing up with no apparent cause.

Chat image

user-e7f18e 23 February, 2018, 16:35:30

For the cameras, eye 1 is for the left eye and eye 0 is for the right. Please tell me if I am wrong.

user-a49a87 23 February, 2018, 16:36:25

@mpk with the default value, the image is too dark and the algorithm doesn't work well

mpk 23 February, 2018, 16:57:06

@user-3f48b6 this sometimes happens but does not affect tracking. If it

mpk 23 February, 2018, 16:57:57

Happens more frequently please write us an email so we can get you a replacement.

mpk 23 February, 2018, 16:58:22

The artefacts can be removed by resetting frame rate or resolution.

mpk 23 February, 2018, 16:58:50

@user-a49a87 what camera are you using?

user-a49a87 23 February, 2018, 16:59:33

Pupil Labs in HMD (HTC Vive)

user-a49a87 23 February, 2018, 16:59:53

If you want a more specific answer, please tell me where to find it πŸ˜‰

mpk 23 February, 2018, 17:00:17

Ah ok. let's fix this in the source code. What value is good?

user-a49a87 23 February, 2018, 17:00:38

around 120

mpk 23 February, 2018, 17:00:59

Ok. Would you mind making an issue on GitHub?

mpk 23 February, 2018, 17:01:14

We can track and discuss implementation there.

user-a49a87 23 February, 2018, 17:01:50

here is an example of what it looks like at 127

user-a49a87 23 February, 2018, 17:02:23

(if I put 128, it becomes a lot darker; it's kind of cyclic)

user-a49a87 23 February, 2018, 17:02:41

No problem. How can I do that?

mpk 23 February, 2018, 17:03:26

Get a GitHub account. Go to GitHub.com/pupil-labs/pupil/issues and raise an issue

user-a49a87 23 February, 2018, 17:12:59

done

user-a49a87 23 February, 2018, 17:15:51

sorry, bad copy-paste. Here is the example with 127 https://we.tl/UwId1P6qtI

mpk 23 February, 2018, 17:30:12

Awesome. We'll have a look at this on Monday!

user-a49a87 23 February, 2018, 17:30:39

thx

user-5d12b0 23 February, 2018, 17:31:28

@user-a49a87 In the same window where you modify exposure time, you should have other options further down to modify gain/gamma/contrast.

user-5d12b0 23 February, 2018, 17:31:46

(and others, like Hue, Saturation, etc)

user-a49a87 23 February, 2018, 17:33:44

ok, thx, I didn't see the magic arrow

user-a49a87 23 February, 2018, 17:34:18

here just above, there is an example of the capture of my eye, is it too bright?

user-5d12b0 23 February, 2018, 17:34:31

The link doesn't work for me

user-5d12b0 23 February, 2018, 17:35:20

And brightness doesn't matter, all that matters is the contrast between the pupil and the surroundings so it can detect the edges of your pupil.

user-5d12b0 23 February, 2018, 17:35:48

Also try modifying the ROI to make sure it's only looking for your pupil in the region of the image where your eye is likely to be.

user-a49a87 23 February, 2018, 17:39:30

here is a new link https://we.tl/dZf3wGGifX

user-a49a87 23 February, 2018, 17:49:56

I find it quite complicated to configure the parameters to get proper tracking of the pupil

user-5d12b0 23 February, 2018, 17:58:10

That link is also not working for me. It might be a problem with the corporate firewall.

user-a49a87 23 February, 2018, 18:01:45

When I "play" with absolute exposure time, gamma, constrast, ... Frequently, the shadow cast by the eyelashes are darker than some part of the pupil.

user-5d12b0 23 February, 2018, 18:06:01

Do you have a lot of light leaking into the HMD? The tracker uses an array of IR emitters around the camera. These shouldn't be casting significant shadows in the area of your pupil. If you are talking about shadows far above your pupil, then try reducing the ROI.

user-a49a87 23 February, 2018, 18:06:01

when I use the algorithm display mode, there are green rectangles appearing. What are they?

user-a49a87 23 February, 2018, 18:06:44

Could you send me a picture of what the ROI should look like?

user-a49a87 23 February, 2018, 18:07:38

I set the ROI such that the pupil is always inside it if I look as far up as possible, as far down, and so on

user-5d12b0 23 February, 2018, 18:08:01

It depends on each person's head shape. For me, it's a box about 1/2 the height and 1/2 the width of the full window that sits at the bottom middle of the image. I determine it the same as you.

user-a49a87 23 February, 2018, 18:10:12

ok, so the intersection between the eyelids and the iris is in the ROI. Those are the shadows I'm talking about

user-5d12b0 23 February, 2018, 18:13:32

Try rotating the knobs on the Vive strap to get the cameras further away from your eyes, and adjust the focus accordingly. Unfortunately this reduces FOV.

user-a49a87 23 February, 2018, 18:14:20

I'll try

user-a49a87 23 February, 2018, 18:14:39

Thank you for all the advice. I have to go. See you maybe tomorrow

user-5d12b0 23 February, 2018, 18:14:54

good luck

user-13ea21 23 February, 2018, 23:07:00

Hello, does anyone know how to display the pupil video instead of the gaze video in Pupil Player?

user-13ea21 23 February, 2018, 23:10:18

Also, trying to figure out the best settings for blink detection. Any suggestions for the confidence thresholds?

user-13ea21 23 February, 2018, 23:11:57

How do I make sure it is recording the eye video, which is more important to me than the world video?

user-13ea21 23 February, 2018, 23:16:42

Is there a way to save settings so all recordings are done on same settings?

user-41f1bf 24 February, 2018, 14:41:14

@user-13ea21 You need two things. First, you must have enabled the record eye option in Pupil Capture. Second, if the eye video was recorded, then you can see it using the vis_eye_video_overlay plugin (something like that, I am not completely sure about its name).

user-f68ceb 25 February, 2018, 11:52:45

Hi, has anyone had issues with calibration? I am running a MacBook Pro with High Sierra 10.13.2 (17C205) and a brand new headset. The only thing I see when starting screen calibration is a white, blank screen. ... OK – the issue seems to be with the 'full screen' setting. It works when it's turned off – but the floating window is quite small?!

user-13ea21 25 February, 2018, 16:15:31

What are the minimum Windows computer requirements to run the Pupil apps?

user-47bcf3 25 February, 2018, 21:58:02

Hi guys, how do I install drivers for Win10? When I run "pupil_capture_windows_x64_v1.4.1" it hangs on updating drivers and doesn't load.

user-47bcf3 25 February, 2018, 22:00:43


Chat image

user-47bcf3 25 February, 2018, 23:33:11

Damn, what a pain in the rear.. got it though.

wrp 26 February, 2018, 02:46:51

Hi @user-13ea21 Minimum requirements - the key specs are CPU and RAM. We suggest at least an Intel i5 CPU (i7 preferred) with a minimum of 8GB of RAM (16GB is better if possible). OS - we support macOS (minimum v10.9.5), Linux (Ubuntu 16.04 LTS), and Windows (Windows 10 only).

wrp 26 February, 2018, 02:47:47

@user-47bcf3 were you running with admin permissions? Would you be willing to share some feedback and/or what you did so that we can improve driver installation?

user-41f1bf 26 February, 2018, 03:31:55

I am running Pupil monocular with an i3 with 4GB DDR3.

user-41f1bf 26 February, 2018, 03:34:36

It just works.

user-47bcf3 26 February, 2018, 03:56:34

@wrp On the Pupil website, I could not find anything explaining how to install drivers. After searching this Discord channel, the third set of instructions I followed was this link: https://github.com/pupil-labs/pyuvc/blob/master/WINDOWS_USER.md

But these instructions clearly tell you to plug in your camera after installing Zadig. You cannot install libusbK or Zadig without having your camera plugged in (if you can, I couldn't figure it out). Which led to more frustration. If you look at these instructions, it isn't until step 5 that it says you need to choose the "composite parent" and not the interface. Well, you can't do this unless you chose to install the composite back in instruction 1 when you installed libusbK.

It would be a lot easier if Pupil put a modified version of these instructions on their website (I spent 2 hours today poring over the website and the git repository looking for driver instructions - they may be there, but I couldn't find them). Then another hour just following these instructions (installed it wrong first, then uninstalled and started over again). Tell the user to plug the camera in at the beginning, and to watch for when they have to choose their camera - choose the parent (in the software, the text box is small - you have to scroll to the right to see the part of the description that shows the difference between the two).

wrp 26 February, 2018, 03:58:16

@user-41f1bf great to know that lower specs also work well. I have an i5 MacBook Air (macOS) with 4GB RAM, and Pupil Capture and Player also work well on it, but users with very large recordings may require more RAM - hence the suggestion for higher RAM and CPU specs.

wrp 26 February, 2018, 03:59:58

@user-91325a thanks for the feedback, and I apologize for this loss of time. Did you go through troubleshooting in the docs: https://docs.pupil-labs.com/#troubleshooting

wrp 26 February, 2018, 04:00:45

One should not need to use Zadig at all anymore - but in some cases it seems that drivers are not installing properly on some Windows users' systems. So more information about your setup would be appreciated so that we can smooth out the installer process.

user-47bcf3 26 February, 2018, 04:07:57

ahhh

user-47bcf3 26 February, 2018, 04:09:36

you know - if you've never been on that page before - finding those install directions is like a needle in a haystack. That part of the menu doesn't expand until you are in the Developer Docs... I went through all the stuff at the top of that document - you know, "Introduction", "Getting Started", "User Docs"... It told you how to set up the software, but doesn't say much about how to install drivers.

wrp 26 February, 2018, 04:10:48

Ok - thanks for the feedback @user-47bcf3 we will make the driver installer more present in the getting started section for Windows users. You can also feel free to make a PR to the docs at https://github.com/pupil-labs/pupil-docs

user-47bcf3 26 February, 2018, 04:11:06

At one point, I did find the instructions on how to 'uninstall', but I didn't have any libusbK devices to uninstall, so I couldn't figure out what to do. After I installed Zadig wrong, that was really helpful though, because I was able to uninstall everything and start over.

user-47bcf3 26 February, 2018, 04:11:39

sorry, this is my first experience with this software/hardware. I don't know what a PR is

wrp 26 February, 2018, 04:12:59

No worries - your feedback is very helpful. Thanks @user-47bcf3 - A PR is a pull request (it is a way to ask maintainers of a software to accept your changes to code, documentation, or other files that are in a versioned system like git).

user-47bcf3 26 February, 2018, 04:13:08

This was how I found the instructions :

Chat image

wrp 26 February, 2018, 04:13:35

@user-47bcf3 Did you find the link to docs.pupil-labs.com on the front of the box that Pupil was shipped in?

user-47bcf3 26 February, 2018, 04:13:41

Yeah, I don't know how to do that. Maybe someone will read this and do a PR for me.

wrp 26 February, 2018, 04:14:03

@user-47bcf3 We will take your feedback and update docs accordingly.

wrp 26 February, 2018, 04:14:33

BTW @user-41f1bf I see your PR and will review it today.

user-47bcf3 26 February, 2018, 04:14:33

I don't have the box. I am a 3D printing guy, I was asked to make a bracket for some custom glasses. I was just given the glasses and camera and asked to test the bracket with the software to make sure you can see the pupil.

user-47bcf3 26 February, 2018, 04:15:09

lol, that might be part of the problem

wrp 26 February, 2018, 04:15:25

@user-47bcf3 got it. BTW - we have geometry documented for the mounts here - https://github.com/pupil-labs/pupil-geometry

user-47bcf3 26 February, 2018, 04:15:27

If I had the original documentation, that probably would have prevented this too.

user-41f1bf 26 February, 2018, 04:15:48

@wrp Yes, I should have mentioned that my setup does not support long recordings (40min+). Anyway, I would not recommend such long recordings due to current stability constraints.

user-47bcf3 26 February, 2018, 04:16:31

hahaha, that was the other thing - the guy that gave it to me didn't tell me anything about the camera until last Friday - I've been working on this for 2 weeks. I go to your website to get the software and what do I find...? The STLs to print the ball joint... lol, that would have helped a lot.

user-537e9a 26 February, 2018, 04:20:57

Do you know how to reassemble the world camera of the Pupil headset? I have the Pupil eye tracking headset consisting of the 3D world cam and 120hz eye cam, plus the high resolution cam from the HoloLens binocular add-on. I want to change the world camera from the 3D world camera to the high resolution camera. Please let me know if you have a solution for that.

wrp 26 February, 2018, 04:22:52

@user-537e9a unfortunately the cabling for the 3d world camera is not compatible with the high speed world camera. So hot-swapping these out would not be an option and could damage the system.

user-537e9a 26 February, 2018, 04:34:13

So is it only possible to hot-swap the eye camera in this case?

wrp 26 February, 2018, 04:34:53

Yes @user-537e9a this is correct

user-537e9a 26 February, 2018, 04:36:15

Okay... Thanks, @wrp.

wrp 26 February, 2018, 04:36:29

You're welcome @user-537e9a

user-e938ee 26 February, 2018, 08:55:13

Hello, anybody got experience with msgpack with Qt or C++?

papr 26 February, 2018, 08:55:58

@user-e938ee I think @user-e04f56 does. I would suggest hopping over to vr-ar and asking him πŸ™‚

user-e938ee 26 February, 2018, 08:56:09

Thanks!

papr 26 February, 2018, 17:31:34

@user-8779ef @user-41f1bf This is for you: https://github.com/pupil-labs/pupil/pull/1100

user-13ea21 26 February, 2018, 18:12:31

I am getting an error with Pupil Player: "disabling idle sleep is not supported on this OS version. No valid dir supplied". It works on my other computer - not sure what I need to do.

user-13ea21 26 February, 2018, 18:19:54

Chat image

mpk 26 February, 2018, 18:47:39

@rakupe#7607 what are you dragging onto the player window? It should be a recording dir (usually called 000,001,002...)

user-8779ef 26 February, 2018, 19:00:25

@papr Thanks! I'll have a look as soon as I can.

user-8779ef 26 February, 2018, 19:01:53

@papr Some questions (I'll start typing, but I feel I may be interrupted by a meeting I'm supposed to have now)

user-8779ef 26 February, 2018, 19:02:20

"Previously, all binocular mappers dismissed pupil data with confidence lower than 0.6 to prevent bad binocular mapping. Now, low confidence pupil data is mapped monocularly." Is there a flag to indicate if cyclopean gaze results from binocular, L, or R eye data?

user-8779ef 26 February, 2018, 19:04:00

Overall, this is a very good move on your part. Thanks for listening!

user-8779ef 26 February, 2018, 19:04:08

and coding and contributing.

user-b23813 26 February, 2018, 21:59:04

@user-41f1bf I had the same issue with the Player when I recorded for more than 40 minutes, since my files were usually more than 30 GB. I chose to split the files, and playing 20-25 minutes of recording works well so far.

user-13ea21 26 February, 2018, 22:36:21

That's what happens when I try to start it. It worked a couple of times, then started giving me this error message.

user-47bcf3 27 February, 2018, 06:48:49

Can someone help me with getting started? When I try to move the camera to center on my eye, the video output only works for about 15 seconds then freezes for about 15-30 seconds, then works for 10 seconds, freezes for 10.. on and off. Is there a setting I can turn on so that it stays on until I get the camera centered on my eye?

wrp 27 February, 2018, 06:49:44

@user-47bcf3 Do you have a monocular or binocular system?

user-47bcf3 27 February, 2018, 06:49:48

mono

user-47bcf3 27 February, 2018, 06:50:16

I have the cable with 2 jacks on it, but only 1 camera plugged in

wrp 27 February, 2018, 06:54:36

@user-47bcf3 your cable has two female jacks on it?

wrp 27 February, 2018, 06:54:46

I believe that you're working with quite an old system.

user-47bcf3 27 February, 2018, 06:54:56

umm, I think the cable has male and the camera is female

user-47bcf3 27 February, 2018, 06:55:08

yup

wrp 27 February, 2018, 06:55:22

Is the camera "freezing" when you do not touch/move it?

wrp 27 February, 2018, 06:55:53

I am asking these questions, because I want to try to understand if this is an electrical/wiring issue

user-47bcf3 27 February, 2018, 06:55:55

let me check

user-47bcf3 27 February, 2018, 06:56:43

hmm, I clicked on the big T on the screen by accident and now the software just freezes

wrp 27 February, 2018, 06:57:03

T is for testing after calibration

wrp 27 February, 2018, 06:57:19

Any information regarding the crash in your pupil_capture_settings/capture.log file?

user-47bcf3 27 February, 2018, 06:58:24

how do I invert the image? I found it earlier

wrp 27 February, 2018, 06:59:15

To flip the eye image go to the Eye window General > Flip image display

user-47bcf3 27 February, 2018, 06:59:57

how do I get the eye window to open again? it was auto opening before, now just the capture window opens

wrp 27 February, 2018, 07:00:17

World window General > Detect eye 0

user-47bcf3 27 February, 2018, 07:00:22

thanks for the help, this is making me feel dumb

wrp 27 February, 2018, 07:00:36

No problem. I am happy to help out

user-47bcf3 27 February, 2018, 07:04:04

ok, got it working. eye 0 is plugged in and inverted now

user-47bcf3 27 February, 2018, 07:04:11

and it freezes even if I don't move it

wrp 27 February, 2018, 07:05:37

@user-47bcf3 what are the specs of your computer?

user-47bcf3 27 February, 2018, 07:08:05

AMD FX-6300 (3.5 GHz) six-core, 16GB RAM, with a Radeon R7 200 2GB GPU

wrp 27 February, 2018, 07:09:36

OK, this machine is certainly powerful enough. This sounds like it could be a hardware issue. Please send us an email at info@pupil-labs.com so we can follow up. If you can also get the original order_id associated with this headset (you would need to ask the individual who gave you this hardware for this information), that would be appreciated.

user-91325a 27 February, 2018, 07:10:38

ok, thanks

wrp 27 February, 2018, 07:42:56

Welcome πŸ˜„

papr 27 February, 2018, 07:54:05

@user-8779ef yes, a gaze datum has a field called base_data. It is a list of the 1-2 pupil positions it is based on. These include all the information you need.

papr 27 February, 2018, 07:55:09

@user-13ea21 This might also happen if the path to the recording includes unicode characters. Could you make sure that this is not the case?

user-47bcf3 27 February, 2018, 07:56:09

Thanks for the help!! Restarting with default settings reset everything and it's working well now.

user-933e7a 27 February, 2018, 10:17:19

hello pupil

papr 27 February, 2018, 10:17:32

Hey @user-933e7a πŸ‘‹

user-933e7a 27 February, 2018, 10:22:11

@papr πŸ‘ I have some questions about my eye tracker. I can't quite configure it to work.

papr 27 February, 2018, 10:22:38

Happy to help, what are your questions?

user-933e7a 27 February, 2018, 10:28:09

I use an old tracker version, from 2015-2016, and can't get a correct result. I don't know what is wrong. Calibration goes well, but after it the tracker doesn't track my gaze.

papr 27 February, 2018, 10:29:34

Do you mean that the gaze is not accurate or that there is no gaze point visible?

user-933e7a 27 February, 2018, 10:30:37

mostly the gaze is not accurate.

user-933e7a 27 February, 2018, 10:31:28

confidence >0.9

papr 27 February, 2018, 10:31:43

Does it have a consistent offset or is it jumping a lot? Could you check if your pupil confidence is high? You can do so by checking the confidence graphs in the world window.

papr 27 February, 2018, 10:31:49

Ok, thanks.

papr 27 February, 2018, 10:32:13

Do you use 2d or 3d detection/mapping?

user-933e7a 27 February, 2018, 10:32:43

It's jumping or mirroring. I use 3d.

papr 27 February, 2018, 10:33:49

Could you test 2d detection/mapping and see if the issue remains?

user-933e7a 27 February, 2018, 10:34:28

Yes, I tested 2d too, but got the same result.

papr 27 February, 2018, 10:35:24

Which operating system, Capture version and what type of eye tracker (monocular/binocular) are you using?

user-933e7a 27 February, 2018, 10:35:24

Also, I tried different software versions.

user-933e7a 27 February, 2018, 10:35:58

Mainly Linux, but I also tried macOS and Windows.

user-933e7a 27 February, 2018, 10:37:06

Capture versions 1.3, 1.1, and 0.9, with a monocular tracker.

papr 27 February, 2018, 10:38:18

Alright, thank you for this information. So, just to see if I understood you correctly: You have been able to use the system as expected for a while but now you are having trouble getting accurate gaze tracking?

user-933e7a 27 February, 2018, 10:39:16

yes, correct.

papr 27 February, 2018, 10:40:26

And you cannot think of anything that changed in your setup?

user-933e7a 27 February, 2018, 10:43:06

I'm new to this, and I made a new installation and configuration. I tried to reproduce the setup from the older systems, but with no results.

papr 27 February, 2018, 10:44:09

Ah ok, so a colleague of yours was able to get accurate results, but not yourself?

papr 27 February, 2018, 10:46:30

Could you make an example recording that includes the calibration procedure and a recording of your eyes, and upload and share it with me? I might be able to give tips on the setup/procedure πŸ™‚

user-933e7a 27 February, 2018, 10:50:46

My colleague doesn't get a correct result either. Yes, I can.

papr 27 February, 2018, 14:06:36

@user-8779ef You can also see it in the gaze topic field. These are the gaze topics in <= v1.4: gaze.2d.0, gaze.2d.1, gaze.2d.2, gaze.3d.0, gaze.3d.1, gaze.3d.2. They will be changed to gaze.2d.0., gaze.2d.1., gaze.2d.01., gaze.3d.0., gaze.3d.1., gaze.3d.01. in >= v1.5.

.2d/.3d denotes the mapping method. .2 (or .01. in the new scheme) indicates binocularly mapped gaze; .0 (or .0.) indicates monocular gaze based on pupil.0, and .1 (or .1.) monocular gaze based on pupil.1, respectively.
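
For illustration, a small sketch of classifying a gaze datum by its source eye(s) via the base_data field described above (assuming each pupil datum carries an id field of 0 or 1, as in this version range):

```python
def gaze_source(gaze_datum):
    # base_data holds the 1-2 pupil positions this gaze datum was mapped from;
    # each pupil datum carries the id of the eye it came from (0 or 1)
    eye_ids = sorted(p["id"] for p in gaze_datum["base_data"])
    if eye_ids == [0, 1]:
        return "binocular"
    return "monocular (eye %d)" % eye_ids[0]
```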

user-072005 27 February, 2018, 15:20:27

I took the glasses outside and the world video is totally washed out. How do I correct this?

papr 27 February, 2018, 15:20:44

@user-072005 what do you mean by washed out?

user-072005 27 February, 2018, 15:21:43

Everything is white. I can sort of make out the horizon but I can't really see anything

papr 27 February, 2018, 15:25:04

This means that your world video is overexposed. Did you disable auto exposure priority in the UVC Source settings menu? It is enabled by default and prevents overexposure.

user-072005 27 February, 2018, 15:25:44

I was using the mobile app. Would that setting be in the app or the player?

papr 27 February, 2018, 15:26:48

Ah, you will have to set this in the mobile app.

user-072005 27 February, 2018, 15:30:12

Ok so I will do that in the pupil cam1 ID2 settings?

papr 27 February, 2018, 15:30:47
  1. Select the Pupil Cam1 ID2 camera
  2. Click on the three bars on the top left
  3. Select Auto-exposure priority
  4. Set it to frame rate may be dynamically
user-072005 27 February, 2018, 15:31:13

ah ok it's working, thank you

user-cf2773 27 February, 2018, 16:32:06

Hi there, I am sure this has already been discussed, but I have followed all the steps on your website to install Pupil Capture on an Ubuntu 16.04 machine, and I get this error: ImportError: /opt/pupil_capture/cv2.so: undefined symbol: _ZN2cv2ml6RTrees4loadERKNS_6StringES4_. I did uninstall the python-opencv package with pip, but with no success. When trying to install pyuvc, I also get an error: /usr/lib/gcc/x86_64-linux-gnu/5/../../../x86_64-linux-gnu/libturbojpeg.a: error adding symbols: Bad value

papr 27 February, 2018, 16:37:40

@user-cf2773 Please see today's pyuvc issue on how to solve the latter problem: https://github.com/pupil-labs/pyuvc/issues/44

papr 27 February, 2018, 16:38:10

In short: sudo rm /usr/lib/x86_64-linux-gnu/libturbojpeg.a

papr 27 February, 2018, 16:38:24

And reinstall pyuvc

user-cf2773 27 February, 2018, 16:39:03

Thanks, pyuvc now works!

papr 27 February, 2018, 16:39:29

/opt/pupil_capture/cv2.so is the bundled cv2 lib. This should not be loaded at runtime if you are running from source...

papr 27 February, 2018, 16:39:50

This indicates that your opencv installation is not correct

user-cf2773 27 February, 2018, 16:43:09

πŸ€” I followed every step, maybe it got installed in another way?

papr 27 February, 2018, 16:45:31

Please uninstall the bundle to make sure that the bundle cv2 version does not interfere with your manual installation

user-cf2773 27 February, 2018, 16:59:58

How can I check if my opencv installation is correct? I think I've followed all the steps

papr 27 February, 2018, 17:00:39

run python3 in the terminal and execute import cv2. This should work without errors

user-cf2773 27 February, 2018, 17:01:38

It does, and the version is 3.3.0-dev

papr 27 February, 2018, 17:02:52

perfect. That looks like a successful setup

user-cf2773 27 February, 2018, 17:03:09

But I still get that issue when launching pupil_capture

user-cf2773 27 February, 2018, 17:03:31

Could not find platform independent libraries <prefix>
Could not find platform dependent libraries <exec_prefix>
Consider setting $PYTHONHOME to <prefix>[:<exec_prefix>]
MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "launchables/world.py", line 105, in world
  File "/home/pupil-labs/.pyenv/versions/3.6.0/envs/general/lib/python3.6/site-packages/PyInstaller-3.3.dev0+g8998a9d-py3.6.egg/PyInstaller/loader/pyimod03_importers.py", line 631, in exec_module
  File "shared_modules/methods.py", line 19, in <module>
  File "/home/pupil-labs/.pyenv/versions/3.6.0/envs/general/lib/python3.6/site-packages/PyInstaller-3.3.dev0+g8998a9d-py3.6.egg/PyInstaller/loader/pyimod03_importers.py", line 714, in load_module
ImportError: /opt/pupil_capture/cv2.so: undefined symbol: _ZN2cv2ml6RTrees4loadERKNS_6StringES4_

world - [INFO] launchables.world: Process shutting down.

papr 27 February, 2018, 17:06:21

Wait, I think you are mixing things up. There are two ways to run Pupil. One is running from bundle: you can download the releases from GitHub and install the contained .deb files. You do not need to install the dependencies for that; they come with the bundle.

user-cf2773 27 February, 2018, 17:06:52

Oh OK. I did install the .deb

papr 27 February, 2018, 17:07:04

To run from source you need to install the dependencies, download the source and run python main.py in your terminal from the <repository path>/pupil_src folder

user-cf2773 27 February, 2018, 17:07:08

But then how would that cause the issue?

papr 27 February, 2018, 17:07:39

You execute the bundle from terminal by running pupil_capture

user-cf2773 27 February, 2018, 17:07:55

That's what I am doing

papr 27 February, 2018, 17:08:22

Running the bundle should not fail. If it fails it is not installed or executed correctly

user-cf2773 27 February, 2018, 17:13:52

I have installed the bundle with sudo dpkg -i pupil_capture_linux_os_x64_v1.4-1.deb and then run pupil_capture and I get that error

papr 27 February, 2018, 17:38:22

As a follow-up: @user-cf2773 is using an Intel Xeon CPU, which is a different architecture than the one the bundle is compiled for. The bundles do not run on Intel Xeons. The solution is to run from source as described in https://docs.pupil-labs.com/#developer-setup

user-bd0840 27 February, 2018, 20:03:10

@papr could you elaborate a bit? Does this only apply to the Linux bundles? Or only a particular generation of Xeons? thanks πŸ˜ƒ

papr 27 February, 2018, 20:03:37

This applies to macos bundles as well

user-bd0840 27 February, 2018, 20:09:03

@papr, I already asked this in vr-ar, but maybe it fits better here: I'm writing some calibration code for the trackers at the moment, and was wondering whether there is more documentation available on what the reference points should be, etc.?

user-bd0840 27 February, 2018, 20:09:55

(talking about HMD calibration)

papr 27 February, 2018, 20:59:03

@user-bd0840 in the hmd case, you provide the ref positions. The calibration will be within the coordinate system of these reference points

user-bd0840 27 February, 2018, 21:00:34

@papr thanks! As far as I understand, 2D calibration would give me x/y positions on the HMD screen, and 3D an actual world position (in case I calibrate with 3D points) - is this correct?

papr 27 February, 2018, 21:01:04

correct

user-bd0840 27 February, 2018, 21:01:45

do you have any recommendations on the distribution of the reference points?

papr 27 February, 2018, 21:02:32

They should cover the field of view of the subject
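
For illustration (one common choice, not an official recommendation): a regular grid of normalized positions over the screen, repeated at more than one depth when calibrating in 3d. The depth_m key below is a placeholder of our own:

```python
import itertools

# 3x3 grid of normalized HMD screen positions (0..1), kept away from the
# extreme corners, repeated at two depths for a 3d calibration.
xs = ys = (0.1, 0.5, 0.9)
depths_m = (1.0, 2.0)
ref_targets = [
    {"norm_pos": (x, y), "depth_m": d}
    for d, x, y in itertools.product(depths_m, xs, ys)
]
print(len(ref_targets), "reference targets")  # 18
```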

user-bd0840 27 February, 2018, 21:03:19

great, that was the basic assumption. thank you πŸ‘

user-ca2a2b 27 February, 2018, 21:07:09

Hi there. When using the Capture remote recorder with the glasses connected to the mobile app, can you do online calibration? Or is it only offline calibration?

wrp 28 February, 2018, 01:59:39

Hi @user-ca2a2b If you select Pupil Mobile as a capture source in the UVC manager in Pupil Capture, you can calibrate "online" and save calibration data and video data on the computer running Pupil Capture. The Remote Recorder plugin is designed only to start/stop recordings on each connected Pupil Mobile source.

user-ca2a2b 28 February, 2018, 06:33:00

Great. ThanksπŸ‘

user-e7102b 28 February, 2018, 06:38:10

Hi - I'm setting up my pupil mobile headset to track a screen, using four surface markers to define the screen as a surface (one in each corner). The markers seem to be momentarily lost and found, causing the dimensions of the surface to be disrupted (for an example, see here: https://www.dropbox.com/s/pip8qzdxf9nsxim/IMG_4062.MOV?dl=0). Does anyone have tips for how I can improve consistency? The markers in this example are printed and physically stuck on the monitor, but I've also done this with markers drawn to the screen via psychtoolbox, and I see similar effects. The background needs to be white for this study, so no option to lower the screen brightness. Any tips would be appreciated. Thanks!

mpk 28 February, 2018, 07:20:06

@tombullock#3146 Can you try smaller markers? Those are quite big. Otherwise I think it should be fine.

user-e7102b 28 February, 2018, 08:05:14

@mpk The screen is about 1.2 m from the participant... I could probably make the markers a bit smaller, but if I go much smaller they are not detected. If it's not a big deal then I won't worry about it.

user-e7102b 28 February, 2018, 08:05:20

Thanks

mpk 28 February, 2018, 08:05:41

@user-e7102b check the min marker size slider. I think you can make the markers a bit smaller.

user-e7102b 28 February, 2018, 08:12:44

@mpk Yes, I'll give that a try and see if it helps.

user-933e7a 28 February, 2018, 08:14:02

hi there!

user-933e7a 28 February, 2018, 08:15:24

@papr i'm ready

user-8b1388 28 February, 2018, 08:46:34

Can anyone help me configure the tracker to work accurately?

user-8b1388 28 February, 2018, 09:23:57

@mpk hi. you can't?

user-c6b893 28 February, 2018, 12:07:30

Hi, we’d like to use the 3D gaze feature of our pupil tracker (binocular + fisheye world cam). However, the Z values we are currently receiving from gazepoint_3d seem to be not in millimeters (sometimes they are even negative). Do we need to calibrate on a 2D plane when using the 3D Gaze Point, and if yes, does the distance to the plane make any difference? Is it even possible to do a separate 3D calibration? And how can we have the raw Z values visualized or printed within the pupil UI?

We already tried to dig through the source code to understand the various coordinate systems and extrinsics that are involved - at the moment we assume that metric values for gazepoint_3d should be available, with their origin being the optical center of the world camera (+z to front, +x to the right, +y up). However, we still cannot really tell where involved transformations e.g. the eye camera intrinsics, or extrinsics eye-to-world, or extrinsics eye-to-eye are coming from.

We’d be grateful for any help or hints - thanks in advance!

papr 28 February, 2018, 12:09:54

@user-c6b893 The 3d gaze mappers have their own side menu. It should include a button to open the debug window. It includes a 3d visualization of the calibrated 3d gaze.

user-8b1388 28 February, 2018, 12:15:04

@papr can we continue?

user-c6b893 28 February, 2018, 12:16:51

Thanks @papr - we already stumbled upon this window, however it only seems to give us a visualization of the 3d gaze mapping, but no numerical Z values or (also numerical) eye positions to check if we are actually getting "sane" values in millimeters somehow corresponding to our current gaze point (right now the visualizer looks quite good, but we are receiving values of -900 (mm) for Z - relative to the world cam this would mean we are looking behind our own head, right?).

papr 28 February, 2018, 12:20:31

@user-c6b893 I am not exactly sure about the coordinate system's orientation. Be aware that gaze data contains a confidence value that gives a quality estimation. How are you getting the data? Are you subscribing to it or do you use the exporter in Player?

user-c6b893 28 February, 2018, 12:35:19

Thanks @papr , we'll have a look at the confidence values. We are getting the values through subscribing via zmq in Unity as well as via the exporter, they are identical. According to the visualizer (given the axis colors are green = y, red = x, blue = z) the z values should be positive in front of the camera.
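
For reference, a minimal subscription sketch of the kind discussed here, assuming Pupil Remote at its default tcp://127.0.0.1:50020; gaze_point_3d is only present for the 3d gaze mappers:

```python
import zmq
import msgpack

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote's default address
req.send_string("SUB_PORT")           # ask where the IPC backbone publishes
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:" + sub_port)
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.unpackb(payload, raw=False)
    if datum["confidence"] < 0.8:  # skip low-quality estimates
        continue
    # gaze_point_3d: millimeters, world-camera coordinate system (3d mappers)
    print(topic.decode(), datum.get("gaze_point_3d"))
```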

user-2798d6 28 February, 2018, 15:43:41

This may be more of a general question, but is there an easy way to see dispersion in real time? We are having participants look at sheet music and trying to figure out how much music "fits" into 1 degree.

papr 28 February, 2018, 15:47:09

Do you mean dispersion over time?

user-2798d6 28 February, 2018, 17:23:39

@papr - I don't think so. I want to figure out, for example, if I look straight ahead and then 1 degree to the right, how much of the sheet music I will have covered with that sweep.

papr 28 February, 2018, 17:25:27

This depends on the distance of the sheet, if I am not mistaken.

user-2798d6 28 February, 2018, 17:31:15

Right! But do you happen to know a way for me to figure that out in the moment?

papr 28 February, 2018, 17:32:52

You could add a surface marker of a specific size to it and calculate its 3d position. There is some initial work on that: https://github.com/pupil-labs/pupil/pull/872
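
Once the distance to the sheet is known, the span covered by a given visual angle follows from basic trigonometry. A small sketch:

```python
import math

def span_per_angle(distance_mm, degrees=1.0):
    # a visual angle theta at viewing distance d covers 2 * d * tan(theta / 2)
    return 2.0 * distance_mm * math.tan(math.radians(degrees) / 2.0)

print(span_per_angle(600))   # sheet at 60 cm: ~10.5 mm per degree
print(span_per_angle(1000))  # sheet at 1 m:  ~17.5 mm per degree
```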

user-f1eba3 28 February, 2018, 17:33:55

Hi guys, noob question in 1.2.3: after I got the add-on for the HTC Vive and installed the hardware, is there any software already available to see that it works?

papr 28 February, 2018, 17:35:02

Yes, start Pupil Capture and see if the eye window is able to detect an image

papr 28 February, 2018, 17:35:23

Pupil Capture needs to run on the computer to which the add-on is connected.

user-f1eba3 28 February, 2018, 17:45:18

Is it normal for it to detect something even when no one is using the headset?

papr 28 February, 2018, 17:45:58

Yes, the pupil detection algorithm works on a best-effort basis.

user-f1eba3 28 February, 2018, 17:46:45

Also, has anyone noticed whether glasses have any influence on the eye tracking?

papr 28 February, 2018, 17:48:28

Depends on how you position the glasses. If the eye cameras have a free view onto the eyes, the glasses do not have any influence. If the eye cameras record through the glasses, you might encounter issues with the glasses' refraction. Especially after they have been moved slightly.

user-e7102b 28 February, 2018, 17:57:16

@mpk Reducing the marker size helped improve stability. I think I found an optimal size for my screen distance https://www.dropbox.com/s/mmka0msf62g4340/IMG_4063.MOV?dl=0

user-e7102b 28 February, 2018, 17:57:27

Thanks for the advice

user-b4a71f 28 February, 2018, 17:58:49

Follow-up question to our questions above - is it possible to get the gaze point coordinates (x/y) in the world camera coordinate frame for each eye separately?

papr 28 February, 2018, 18:00:10

Yes, we have a dual-monocular gaze mapper. But there is no user-option yet to enable it explicitly. Please create a github issue and I will see to it next week. πŸ™‚

user-ed537d 28 February, 2018, 19:47:08

Is there a way to pull or calculate where the user is looking in terms of visual angle?

papr 28 February, 2018, 20:05:38

@user-ed537d do you mean in relation to the world camera or in relation to an object that is visible in the world camera?

End of February archive