πŸ‘ core


user-a6b05f 01 September, 2018, 02:20:46

@mpk do you know where I could get the plugin? I'm not sure it's possible to get Pupil and HoloLens to talk to each other yet. I was thinking of building a server for the HoloLens but was told that it wouldn't work very well.

papr 01 September, 2018, 11:42:35

@user-a6b05f mpk was referring to the HololensRelay plugin that is already being shipped in Capture if I am not mistaken

user-af87c8 02 September, 2018, 10:17:13

@papr A question regarding the new serialized objects: is it possible to have a "fake serialized object" that I can use the same way as the legacy lists of dicts? I used to call several Pupil functions (e.g. recalibrations) directly from Python. With the new serialized objects introduced in 1.8, I now need to use "Incremental_Legacy_Pupil_Data_Loader" to load my file. But with this, I cannot e.g. change pupil['gaze_positions'] and then apply the recalibration function to the changed object. So I probably need a fake serializer, because I do not want to write my data to disk every time I change something just so that the serializer can re-read it. Or alternatively, is there a way to easily switch between the "lists" and a serialized msgpack object?

I don't know anything about your new way to save data. All my recorded data is pre 1.8

papr 02 September, 2018, 15:53:23

@user-af87c8 I am a bit confused about what the exact issue is. Your data is pre-1.8 but you want to load it in the 1.8 format? Incremental_Legacy_Pupil_Data_Loader is only for conversion to the new format. You can still use load_object on the old format/pupil_data file.

user-decc04 02 September, 2018, 18:16:46

Hello everyone, I was wondering if someone has done a DIY hardware setup for the VIVE Pro yet using Pupil?

user-b70259 03 September, 2018, 02:00:04

@papr I'm using the monocular headset.

user-b70259 03 September, 2018, 02:00:16

Could you tell me more about the angle data? Angle data for the 'ellipse' and 'projected sphere' are provided, but I'm not sure what 0 deg is based on. And could I get only the angle data without string searching?

user-af87c8 03 September, 2018, 07:55:15

@papr Thanks for your response! I was myself a bit confused, so hopefully this is now a bit clearer: Say I want to call the Pupil Labs calibration function. It expects the new 1.8 data format (it tries to unserialize in gaze_producer). But I want to change my data before putting it into the calibration function (e.g. remove some portions, split it up, fix timing, etc.). Now my data is in list format, but the function expects serialized data and throws an error. I want something that allows me to use lists in new versions (e.g. a list2serialize function).

user-ad8e2d 03 September, 2018, 09:55:02

Hey @papr , do you by any chance know when the new release will be out?

papr 03 September, 2018, 09:56:48

@user-ad8e2d Hey, unfortunately the release is being delayed and there is no release date yet.

papr 03 September, 2018, 09:57:36

What is it exactly that you need from the new release? Maybe I can find a work-around for you.

papr 03 September, 2018, 10:05:35

@user-af87c8 Try this before passing the values into the function.

import msgpack
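# gaze_list is assumed to be the legacy list of gaze dicts (pre-1.8 format); packing each datum yields the serialized form the newer functions expect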
gaze_serialized = [msgpack.packb(gaze, use_bin_type=True) for gaze in gaze_list]
papr 03 September, 2018, 10:08:36

@user-b70259 Then my guess is that the failed to init call came from the eye1 process. Just close the second (gray) eye window.

The 3d pupil data is relative to the eye coordinate system. The data is msgpack encoded. There should be no need for string searching.
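
For illustration (not part of the original exchange), a minimal sketch of reading those 3d angles from a pre-1.8 recording. It assumes Pupil's file_methods module (pupil_src/shared_modules) is importable and that the 3d detector datums carry the usual 'method', 'theta' and 'phi' keys; the recording path is a placeholder:

from file_methods import load_object

pupil_data = load_object("/path/to/recording/pupil_data")  # placeholder path

for datum in pupil_data["pupil_positions"]:
    if datum.get("method", "").startswith("3d"):
        # spherical angles of the pupil normal in the eye camera coordinate system (radians)
        print(datum["theta"], datum["phi"])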

user-af87c8 03 September, 2018, 10:09:28

@papr thanks! I knew there had to be an "easy" way to become compatible. But I was lacking the msgpack voodoo for it πŸ˜‰

papr 03 September, 2018, 10:10:50

@user-af87c8 let me know if there are any other incompatibilities πŸ™‚

user-ad8e2d 03 September, 2018, 10:17:55

@papr it is just subscribing to the frame.world topic on the Frame Publisher on Windows

papr 03 September, 2018, 10:21:50

@user-ad8e2d Did you see the temporary fixed plugin? Were you able to get it working?

user-ad8e2d 03 September, 2018, 10:33:28

@papr No I didn't get it working. I couldn't find the file pupil_capture_settings, but there is a file called plugins in \torch\utils\trainer\plugins. Do I have to build it from source?

papr 03 September, 2018, 10:34:42

pupil_capture_settings is a folder and it should exist directly under your home directory. This folder includes another folder called plugins, in which the temporarily fixed version needs to be placed.

papr 03 September, 2018, 10:35:18

The \torch directory is not related to our project.

user-ad8e2d 03 September, 2018, 10:37:55

@papr found it thanks for your help πŸ˜ƒ

user-ad8e2d 03 September, 2018, 10:56:23

@papr I have included your file and the help message shows up in the capture UI, but I don't receive the frame.world when I subscribe to the frame publisher.

user-ad8e2d 03 September, 2018, 10:57:05

Could it have anything to do with the recent_events function as it still says frame (just guessing, I have no clue)

papr 03 September, 2018, 10:58:22

There should be two "Frame Publisher"s listed. Make sure to load the "Frame Publisher Fixed" plugin from the plugin manager

user-ad8e2d 03 September, 2018, 10:59:48

@papr That's it! Thanks for your help

user-ad8e2d 03 September, 2018, 10:59:57

Works great!

papr 03 September, 2018, 11:00:10

Nice! Happy to help πŸ™‚

user-3f0708 03 September, 2018, 17:05:59

How do I check the accuracy of the tracking? Does the pupil labs platform have any plugins or scripts for this type of verification?

papr 03 September, 2018, 18:26:50

@user-3f0708 check out the accuracy visualizer. It is loaded by default

user-3f0708 03 September, 2018, 18:33:49

ok

user-648a71 03 September, 2018, 18:52:54

Hi, I am trying to build boost.python, but I got an error when running bootstrap.bat. The error is "'cl' is not recognized as an internal or external command, operable program or batch file.". Does anybody know a solution?

papr 03 September, 2018, 19:43:01

@user-648a71 am I right that you are trying to run from source on windows?

user-6aef01 04 September, 2018, 10:31:33

Guys can anyone help me making my own JST to USB cable for pupil cam? I'm hoping it's the standard pinout of USB but sometimes it's different.

user-4b5226 04 September, 2018, 10:33:05

@user-6aef01 does this video help ? https://youtu.be/nrWDFt_sLDs

user-6aef01 04 September, 2018, 10:34:09

I don't think the JST connector to USB pinout is universal among all USB cameras. So I want Pupil-Labs to verify the pinout.

user-4b5226 04 September, 2018, 10:35:11

aah. i'll test some things out sometime this week & get back to ya with an update

user-6aef01 04 September, 2018, 11:29:30

Better ask the devs; if you mix up the pins you can damage the PCB.

user-4b5226 04 September, 2018, 18:12:08

Has anyone answered at which angle the Pupil circuitry needs to be placed in an Oculus Go to [email removed] ?

user-4b5226 04 September, 2018, 18:12:55

Also, does the distance of the Pupils matter when placed within a VR hmd ? Or does the calibration take that into account & adjust accordingly ?

user-648a71 04 September, 2018, 18:30:33

@papr Yes, I am trying to run the source code from Windows.

papr 04 September, 2018, 22:44:51

@user-648a71 may I ask for the particular reason? Running from source is very difficult to get right and is often not required. I am sure that I can find an alternative solution for you.

user-a6b05f 05 September, 2018, 06:16:05

@papr Thank you! I shall try this out

user-6aef01 05 September, 2018, 08:00:56

@user-4b5226 That video is unrelated by the way.

user-6aef01 05 September, 2018, 08:01:39

@mpk @papr What is the JST connector pinout for the pupil cam?

papr 05 September, 2018, 08:14:29

@user-6aef01

Chat image

papr 05 September, 2018, 08:15:32

From top to bottom: Ground, D+, D-, 5V

user-ad8e2d 05 September, 2018, 08:41:28

Hey @papr , I am using C++ on Windows, and when I get gaze.3d.01. data I unpack it into a msgpack::object, then use cout to get this: {"topic":"gaze.3d.01.","eye_centers_3d":{0:[18.8821,13.5578,-25.179],1:[-38.8884,13.211,-24.1582]}... Then, when I go to put it into a string, it seems to have a problem with the 0 after "eye_centers_3d" and believes it is the null terminator. As a workaround I use a stringstream and put "" around the 0. Just wondering if there is a way you could put the "" around the 0 before you send it, or if you know of a better way to do this, as my solution is not that great. Thanks, Jack

papr 05 September, 2018, 08:45:36

@user-ad8e2d It is totally legal to use numbers as keys in msgpack-key-value-mappings and unpacking is working correctly as well. Therefore I suspect the issue to be specific to how msgpack::object converts the object into a string representation.
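
To illustrate the point with a standalone Python sketch (not part of the original exchange): integer keys survive a msgpack round trip without any quoting.

import msgpack

datum = {"topic": "gaze.3d.01.", "eye_centers_3d": {0: [18.88, 13.56, -25.18], 1: [-38.89, 13.21, -24.16]}}
packed = msgpack.packb(datum, use_bin_type=True)
unpacked = msgpack.unpackb(packed, raw=False)
print(unpacked["eye_centers_3d"][0])  # the key is still the integer 0; no string conversion involved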

user-ad8e2d 05 September, 2018, 08:46:30

That's great! I'll look into that thank you

user-b70259 05 September, 2018, 09:12:27

@papr I thought that it was a hardware problem, so I checked the cable, and Pupil Capture works well now. Then, after calibration, I got the two angles 'theta' and 'phi'. The eye coordinate system is a 3D polar coordinate system, right? The angle I need is theta, so my problem is solved. Thank you so much!!

user-ad8e2d 05 September, 2018, 10:32:36

Hey, could anyone give me a hint of how they parse the gaze data in c++. The way I am doing it is not great and would love if someone could help. πŸ˜ƒ

user-d9bb5a 05 September, 2018, 12:21:14

Good afternoon. Please tell me what the data means - base data?

user-d9bb5a 05 September, 2018, 12:21:54

I'm worried that I'm not analyzing them correctly ...

papr 05 September, 2018, 12:23:27

The base_data field includes the data on which the current datum is based. E.g. two pupil datums can be combined and mapped to one binocular gaze datum. That gaze datum's base_data would include the two pupil datums.

user-d9bb5a 05 September, 2018, 12:28:03

Can you advise where to find information, how to use this data correctly?

user-d9bb5a 05 September, 2018, 12:28:19

or examples

user-d9bb5a 05 September, 2018, 12:34:17

please...

papr 05 September, 2018, 12:41:41

What do you mean by "use [...] correctly"? Correctly in which use case?

user-d9bb5a 05 September, 2018, 12:48:20

In the case of analyzing data obtained during a study with the Pupil equipment. I analyze fixations, saccades, study time, and pupil diameter... and the question arose of how to use "base data".

papr 05 September, 2018, 12:52:37

This is the process chain: pupil -> gaze -> fixations. base data is just used to go back within this chain: e.g. you have one fixation and want to know which pupil positions belong to it.

base data is not new data but a reference to existing data. There is no "This is how you analyze base_data" because it is context-dependent data.
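
As a loose illustration of that chain (hypothetical variable names; fixations is assumed to be a list of fixation dicts as loaded from a recording):

for fixation in fixations:               # hypothetical list of fixation dicts
    for gaze in fixation["base_data"]:   # the gaze datums the fixation was classified from
        for pupil in gaze["base_data"]:  # the pupil datums that were mapped into that gaze datum
            print(fixation["id"], pupil["id"], pupil["timestamp"], pupil["diameter"])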

user-d9bb5a 05 September, 2018, 12:54:38

Thank you very much! I just confused myself) Tomorrow is a very important study, so here I am preparing...

user-d9bb5a 05 September, 2018, 12:56:55

Can you still tell me what data you can use?

user-d9bb5a 05 September, 2018, 12:57:13

My research will be in the store

papr 05 September, 2018, 13:06:30

Please see my personal message to you.

user-d9bb5a 05 September, 2018, 13:06:51

Ok

user-bf07d4 06 September, 2018, 22:05:15

help needed for manual marker calibration... please

wrp 07 September, 2018, 03:50:47

Hi @user-bf07d4 please let us know what steps you have taken for manual marker calibration so that we can provide you with concrete feedback.

user-799634 07 September, 2018, 05:22:47

Hi, I want to get the angle between the binocular gaze vectors at the point of convergence (where I am looking). I used 3d calibration mode and computed scipy.spatial.distance.cosine(s0_normal, s1_normal); I think it is a radian value, so I converted it to an angle using np.arccos(result). Is this the right approach to get the angle of the binocular vectors? @papr
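
For reference, a small standalone sketch of the angle between the two gaze normals. Note that scipy.spatial.distance.cosine returns 1 - cos(angle), not an angle in radians, so arccos should be applied to the dot product of the normalized vectors instead:

import numpy as np

def vergence_angle_deg(s0_normal, s1_normal):
    n0 = np.asarray(s0_normal, dtype=float)
    n1 = np.asarray(s1_normal, dtype=float)
    n0 /= np.linalg.norm(n0)
    n1 /= np.linalg.norm(n1)
    cos_angle = np.clip(np.dot(n0, n1), -1.0, 1.0)  # clip guards against rounding noise
    return np.degrees(np.arccos(cos_angle))

print(vergence_angle_deg([0.0, 0.0, 1.0], [0.1, 0.0, 1.0]))  # ~5.7 degrees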

mpk 07 September, 2018, 07:04:18

@user-799634 that data is already in the gaze datum. Just look at gaze point 3d.

user-799634 07 September, 2018, 07:15:35

My basic purpose is to get gaze distance, but when I get data using gaze_point_3d_z, that data is not similar to the actual distance where I am looking. In my test, when I was looking at 500mm, the z value showed approximately 60~80mm. I just printed ['gaze_point_3d'][2] @mpk

mpk 07 September, 2018, 07:43:41

@user-799634 yes, the value is not a perfect ruler. You would have to add an additional depth calibration if you need this kind of accuracy. Estimating true depth through vergence can be tricky. Maybe try adding different depth levels during your calibration?

user-799634 07 September, 2018, 08:02:22

I do not know that part. Actually, I'm a beginner in Python, so could you please advise me a little on how to add different depth levels during calibration, @mpk?

papr 07 September, 2018, 10:31:50

@user-799634 you do not need to be a programmer at all to try different calibration depths. Just position the marker at different depths during the calibration. This is easiest done using a printed marker and using the manual marker calibration procedure.

user-4b5226 08 September, 2018, 05:06:19

Hi ! What’s the minimum CPU power required on Linux | Windows | MacOS ?

mpk 08 September, 2018, 07:39:16

@user-4b5226 depends on the setup. 200hz ? binocular? marker tracking required?

mpk 08 September, 2018, 07:40:01

We have Pupil running on an Intel m at 1.2 GHz, but you can also max out an i5 if you want to run with all features active.

user-bb47e7 09 September, 2018, 17:41:54

I'm deleting this message because I find that I did not install the driver properly. I don't understand installation instructions. I should have contacted my peers who had experience with this sort of thing instead of writing here. I must apologize.

papr 09 September, 2018, 17:43:00

@user-bb47e7 let's try to get it working from bundle (not from source) first

papr 09 September, 2018, 17:43:27

If all cams are blocked it does mean that the drivers were not installed correctly

papr 09 September, 2018, 17:43:43

Please restart capture with administrator rights

user-bb47e7 09 September, 2018, 17:55:32

Eye cameras working fine. Problem is only with world camera

papr 09 September, 2018, 17:57:21

@user-bb47e7 please write an email to info@pupil-labs.com mentioning that you are having problems getting your realsense camera to work

user-bb47e7 09 September, 2018, 17:58:28

OK. Thank you for replies.

user-7a9710 10 September, 2018, 08:25:32

Hi, I have a question about how Pupil Labs' software works. How can a single eye camera know the 3d transformation of our eye? The software gives us eye center, eye diameter, pupil location, etc. How on earth could it figure all of that out just from an image of the eye?

user-6fba51 10 September, 2018, 09:07:00

What's the image sensor in the 200Hz Pupil cam? I'm familiar with the IC and can generate custom firmwares for experimenting with various exposures and modes if I know the sensor as well. I know this voids any warranty but I have enough funding for my research to afford that.

papr 10 September, 2018, 09:18:39

@user-6fba51 Please write an email to ppr@pupil-labs.com concerning this issue.

user-fdb87b 10 September, 2018, 09:29:54

What do you guys use to measure your camera's latency?

user-fdb87b 10 September, 2018, 12:25:33

And is it okay that my Pupil Cam resolution is set to 192x192 in Capture rather than 200x200?

papr 10 September, 2018, 12:29:55

@user-fdb87b yes, this is ok. In regard to the camera latency: We have access to two different timestamps: 1. Start of exposure 2. Frame availability in software

The difference between these two timestamps is our latency

user-fdb87b 10 September, 2018, 12:30:39

How do you know start of exposure?

papr 10 September, 2018, 12:31:17

it is provided in the frame header data. libuvc exposes this data to the user
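
A rough sketch of measuring that latency with pyuvc (an assumption-laden example: it presumes frame.timestamp holds the start-of-exposure time and uvc.get_time_monotonic() reads the same clock, and the frame mode is a placeholder to adapt to your camera):

import uvc

devices = uvc.device_list()      # note: also lists cameras that are already in use
cap = uvc.Capture(devices[0]["uid"])
cap.frame_mode = (640, 480, 30)  # placeholder mode; pick one your camera supports

for _ in range(30):
    frame = cap.get_frame_robust()
    latency = uvc.get_time_monotonic() - frame.timestamp
    print("latency: {:.1f} ms".format(latency * 1000))

cap = None                       # release the camera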

user-fdb87b 10 September, 2018, 12:31:46

hm, cool. Any way I can add the extra 8 pixels to be used?

papr 10 September, 2018, 12:32:28

Sorry, but I am not sure what you mean by that.

user-fdb87b 10 September, 2018, 12:32:41

I mean can I use 200x200 frames?

papr 10 September, 2018, 12:34:42

No, I think there are only 192x192 and 400x400 frame sizes available

user-fdb87b 10 September, 2018, 12:35:26

OK, maybe you should update the webpages that say frames at 200 Hz are 200x200.

papr 10 September, 2018, 12:41:43

I will forward your feedback. πŸ‘ Thank you very much!

user-fdb87b 10 September, 2018, 12:42:07

have a good day

papr 10 September, 2018, 12:42:14

You too!

user-a3341c 11 September, 2018, 07:36:16

Does the Pupil code use DShow, MSMF or WindowsRT to access the cameras?

papr 11 September, 2018, 07:38:00

@user-a3341c we use libuvc.

user-a3341c 11 September, 2018, 07:39:55

I guess I need to check what that uses internally. I currently use DSHOW in my code to list cameras by both their names and indexes to choose which camera to use, and I want to integrate the Pupil Labs code with my code. I'll need to modify either my code or the Pupil code more; not sure which. If they both used the same frontend (and therefore the same indexes), things would be easier.

papr 11 September, 2018, 07:41:22

@user-a3341c we also use the libusbk drivers on Windows, fyi.

user-a3341c 11 September, 2018, 07:42:26

isn't libusbk just a general interface, don't you still need a library for accessing UVC cameras specifically?

papr 11 September, 2018, 07:43:25

@user-a3341c yes, this is where libuvc comes in

papr 11 September, 2018, 07:44:35

(if I understood it correctly)

user-a3341c 11 September, 2018, 07:44:45

Well, I guess I need to modify my code then to use libuvc as well; otherwise I have to modify the Pupil library to not use libuvc, which can lead to many issues.

papr 11 September, 2018, 07:45:27

@user-a3341c yes, I recommend that as well

user-a3341c 11 September, 2018, 07:46:05

Does libuvc provide methods to list available cameras?

user-a3341c 11 September, 2018, 07:46:49

found it, uvc_get_device_list

papr 11 September, 2018, 07:47:22

@user-a3341c be aware that this fn also lists devices that are in use

user-a3341c 11 September, 2018, 07:48:24

I think a simple try/except on uvc_open() would do the trick for figuring out which are available

papr 11 September, 2018, 07:48:52

Yes, I think so too

user-a3341c 11 September, 2018, 07:49:00

okay, thanks for the help

papr 11 September, 2018, 07:49:24

No problem! Let me know if you have any other questions :+1:

user-a3341c 11 September, 2018, 07:50:18

um, I actually forgot one. Any ideas why a libusbk pupil cam isn't recognized by any other USB camera program?

user-a3341c 11 September, 2018, 07:50:30

I thought the driver shouldn't matter

papr 11 September, 2018, 08:16:04

I don't know any details about that, but my guess would be that other programs look for specific driver categories and libusbk is not one of them.

user-a3341c 11 September, 2018, 08:52:08

ok

user-a3341c 11 September, 2018, 08:53:04

any suggested camera software to check the pupil cam without launching the Pupil programs?

papr 11 September, 2018, 08:54:15

You can use pyuvc and write a small test utility if you installed the source dependencies.

papr 11 September, 2018, 08:54:45

This is the official pyuvc example: https://github.com/pupil-labs/pyuvc/blob/master/example.py
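
Condensed into the "which cameras are free" check discussed above (a sketch only; it catches exceptions broadly since the exact error type may vary):

import uvc

for dev in uvc.device_list():          # lists all UVC devices, including busy ones
    try:
        cap = uvc.Capture(dev["uid"])  # opening fails if another process holds the camera
        print("available:", dev["name"])
        cap = None                     # release it again
    except Exception as err:
        print("unavailable:", dev["name"], "-", err)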

user-a3341c 11 September, 2018, 08:55:11

neat. thanks

user-a3341c 11 September, 2018, 08:56:20

Here's a less important question: why do DIY cams also need to be reflashed with the libusbk driver? Looking into libuvc, that doesn't seem to be a requirement.

papr 11 September, 2018, 08:58:55

Unfortunately, I do not know that. @user-0f7b55 do you have insight on that?

user-0f7b55 11 September, 2018, 09:14:01

Reflashing? If you mean modification of the firmware, no, this is not required at all

user-0f7b55 11 September, 2018, 09:15:35

As long as the camera is UVC compliant

user-a3341c 11 September, 2018, 09:50:25

I mean swapping the driver from libusb? to libusbk. Dunno if that changes the firmware on the cameras themselves. Why is that required? The moment someone does that, their camera won't work with any other program.

papr 11 September, 2018, 09:55:47

We use libuvc as a cross platform way to list and access cameras. AFAIK it is not possible to list and access devices from other driver categories other than libusbk on Windows. Therefore we need you to install the libusbk drivers for the DIY cams as well.

Installing the drivers does not flash the firmware. "Flashing drivers" is a somehow confusing term. I suggest we use "flash firmware" and "install drivers" as two separate terms to avoid confusion.

user-a3341c 11 September, 2018, 09:56:56

"AFAIK it is not possible to list and access devices from other driver categories other than libusbk on Windows."

user-a3341c 11 September, 2018, 09:57:06

Not really true, let me find some code

user-a3341c 11 September, 2018, 10:00:09

Here: https://pastebin.com/aUv51bfD . This uses Qt to list all cam names and their indexes on Windows and Linux, but if a Python wrapper were generated for MSMF, we maybe wouldn't even need the Qt dependency. Still better than making webcams not work with other programs, imo.

papr 11 September, 2018, 10:03:00

You are right. I need to specify my statement: "AFAIK it is not possible to list and access devices from other driver categories other than libusbk on Windows [using libuvc]."

papr 11 September, 2018, 10:05:27

And yes, we could do one of the following things:

  1. Write a Qt backend. The question is whether this would allow us to access all camera features (e.g. 200 Hz frame rate, etc.)
  2. Write a platform-specific backend. Unfortunately we do not have the resources to do so.
user-a3341c 11 September, 2018, 10:06:07

I don't know if it would provide the features of libuvc. Like libuvc can access info such as start of exposure time? Didn't know cameras provide that info at all.

papr 11 September, 2018, 10:07:37

Yes, that would be another issue to consider.

papr 11 September, 2018, 10:09:23

So summarizing: We sacrifice non-libuvc cam access on Windows in order to get code maintainability and all camera features

user-a3341c 11 September, 2018, 10:10:20

The Pupil cam, though, worked for me without requiring me to install the libusbk drivers. How is this possible?

user-0f7b55 11 September, 2018, 10:15:24

Pupil capture takes care of driver install on windows

user-a3341c 11 September, 2018, 10:16:13

You mean in the background when connecting pupil cam? Because I didn't install Pupil Capture, just unzipped.

user-0f7b55 11 September, 2018, 10:17:55

when you run pupil capture the first time it will replace the driver with libusbk

user-a3341c 11 September, 2018, 10:18:16

ok. So this is only a Windows thing?

user-0f7b55 11 September, 2018, 10:19:38

yes

user-0f7b55 11 September, 2018, 10:19:46

libusbK is strictly Windows

user-a3341c 11 September, 2018, 10:19:52

ok thank you

user-0f7b55 11 September, 2018, 10:21:08

sure

user-eafbc8 12 September, 2018, 00:17:37

Looking to use pupil with a portable, small form factor computer (like raspberry pi/beaglebone). Is there any such computer hardware suggested for pupil? (Also, is that at all likely to work and have enough processing power?)

user-ec0ec0 12 September, 2018, 06:34:34

Hello, I'm getting this error when using my DIY camera as eye camera.

This is the log that popped up in the command line window:

eye0 - [WARNING] video_capture.uvc_backend: Capture failed to provide frames. Attempting to reinit.
eye0 - [INFO] video_capture.uvc_backend: Found device. SIT USB2.0 Camera.
eye0 - [ERROR] uvc: Could not init 'Roll absolute control'! Error: Error: Pipe error.
eye0 - [INFO] video_capture.uvc_backend: Hardware timestamps not supported for SIT USB2.0 Camera. Using software timestamps.
eye0 - [INFO] camera_models: No user calibration found for camera SIT USB2.0 Camera at resolution (320, 240)
eye0 - [INFO] camera_models: No pre-recorded calibration available
eye0 - [WARNING] camera_models: Loading dummy calibration
eye0 - [WARNING] uvc: Could not set Value. 'Absolute Exposure Time'.
eye0 - [WARNING] uvc: Could not set Value. 'White Balance temperature'.

Any ideas?

user-9945c8 12 September, 2018, 08:38:07

Hi, I am performing offline calibration on, and exporting, a large number of eye tracking files that range from 20-35 minutes in length. I am having an issue where it takes around an hour and a half to process each video, and often my computer freezes. How powerful does a computer need to be to process these files, and is there an OS that you would recommend? Thanks, Sara

papr 12 September, 2018, 08:39:13

hey @user-9945c8 Which version of Player do you use and which OS did you try?

user-9945c8 12 September, 2018, 09:18:47

I am using 1.7 on Windows and Mac

papr 12 September, 2018, 09:19:43

Please upgrade to v1.8. We made major improvements in regards to memory requirements in this version.

user-9945c8 12 September, 2018, 09:20:48

Yes, I just did and processed a video and it is quite a bit faster/didn't freeze. Thanks and that is a huge step in a better direction! About the OS, how is processing time on a Linux compared to the others?

papr 12 September, 2018, 09:21:19

I don't think that the OS matters much in this case.

user-9945c8 12 September, 2018, 09:21:33

Also, I would like to get the processing/export time for each video down to about 30 min. Would I be able to achieve this with a regular desktop computer?

papr 12 September, 2018, 09:22:42

This is difficult to say since export time depends on multiple things like exact recording length and if audio export is active.

user-9945c8 12 September, 2018, 09:23:36

Ok. So exporting without audio brings the time down significantly then? It would be helpful for us if we could see a layout of what factors contribute to processing time and by how much, as we are working backwards to try to figure this out now.

user-9945c8 12 September, 2018, 09:24:48

I also have an unrelated data collection question. If we have already calibrated, can we adjust the angle of the world cam without affecting the calibration in-session? I know we can't move the eye cam, just not sure about the world cam.

papr 12 September, 2018, 09:27:42

Unfortunately, we do not have any exact numbers on these factors. The biggest factor is the video encoding itself. You will not get around that. Audio export is definitely a factor as well. Audio is automatically exported if an audio.mp4 file is found. Exporting raw data etc. does not consume much time.

You should not change any of the camera relations (incl. world cam) after calibration.

user-9945c8 12 September, 2018, 09:28:28

Ok, thanks for your help! Might follow up soon with some questions about offline processing but that gives me a good start.

papr 12 September, 2018, 09:29:04

Sure thing πŸ™‚

user-4580c3 14 September, 2018, 04:18:22

Hi

user-4580c3 14 September, 2018, 04:18:34

Hello?

user-4580c3 14 September, 2018, 04:19:14

@papr hello?

user-4580c3 14 September, 2018, 04:21:25

I have a queation at pupil-lab

user-4580c3 14 September, 2018, 04:21:40

Pupil-lab function

wrp 14 September, 2018, 04:24:26

go ahead with your question @user-4580c3

user-4580c3 14 September, 2018, 04:34:24

Can you measure the slight tremors of the eye that occur when you gaze at one place, and the movements of the eye that occur to keep the object clearly in view?

wrp 14 September, 2018, 04:48:16

@user-4580c3 these are referred to as micro-saccades (if I am not mistaken). We do not have a saccade classifier built into Pupil software, however, we do have a fixation classifier. You might want to look at the fixation classifier plugin and the supporting gaze data that is used to classify a fixation and then calculate movements based on the associated gaze data for each fixation.

user-4580c3 14 September, 2018, 05:04:10

@wrp thank u :) Let me ask you one more question. Does it include fixations, saccades, gaze pursuit, and gaze path? And can you output data with that function?

user-5787a9 14 September, 2018, 09:57:37

Hey everyone! Sorry to bother you @papr but I have some questions about running pupil on a docker container. I've downloaded the docker files from https://github.com/pupil-labs/pupil-docker-ubuntu, the building go fine and then after it's built I log inside the container by using the commands on that repo. When I start main.py though everything fails with the error:

user-5787a9 14 September, 2018, 09:57:40

world - [INFO] pupil_detectors.build: Building extension modules...
world - [INFO] calibration_routines.optimization_calibration.build: Building extension modules...
world - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend
world - [INFO] launchables.world: Session setting are from a different version of this app. I will not use those.
world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "/root/pupil/pupil_src/launchables/world.py", line 317, in world
    main_window = glfw.glfwCreateWindow(width, height, "Pupil Capture - World")
  File "/root/pupil/pupil_src/shared_modules/glfw.py", line 522, in glfwCreateWindow
    raise Exception("GLFW window failed to create.")
Exception: GLFW window failed to create.

papr 14 September, 2018, 10:36:13

Actually, I do not know if this is supposed to work. The docker image is mostly meant for building the bundle. @wrp do you have further insight?

wrp 14 September, 2018, 10:56:08

@user-5787a9 you can "run" Pupil in Docker, but you will not be able to access sensor streams from within a docker container

wrp 14 September, 2018, 10:56:29

The objective of the docker container was for building Pupil apps within a container/reproducible env

user-5787a9 14 September, 2018, 11:09:11

Aww too bad. Thanks!!

papr 14 September, 2018, 11:10:13

@user-5787a9 you can use the bundle though if you are looking for an easy way to run Pupil.

user-5787a9 14 September, 2018, 11:15:58

Yep, I was trying to build it from source cause I wanted to add a custom plugin

user-5787a9 14 September, 2018, 11:16:19

But I had some problems with dependencies (damn ros)

papr 14 September, 2018, 13:32:25

@user-5787a9 you do not need to run from source if you want to add a custom plugin.

user-5787a9 14 September, 2018, 15:10:25

'kay, I've followed the docs to install the dependencies and everything went fine. I also installed tensorflow as required for my plugin, then downloaded the pupil source as described in the docs (https://docs.pupil-labs.com/#download-and-run-pupil-source-code). But when I run main.py I get:

user-5787a9 14 September, 2018, 15:10:56

world - [INFO] pupil_detectors.build: Building extension modules...
cc1plus: warning: command line option β€˜-Wstrict-prototypes’ is valid for C/ObjC but not for C++ (this warning repeated 10 times)
world - [INFO] calibration_routines.optimization_calibration.build: Building extension modules...
cc1plus: warning: command line option β€˜-Wstrict-prototypes’ is valid for C/ObjC but not for C++
world - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend
world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "/home/optolab/Desktop/pupil/pupil_src/launchables/world.py", line 134, in world
    from remote_recorder import Remote_Recorder
  File "/home/optolab/Desktop/pupil/pupil_src/shared_modules/remote_recorder.py", line 26
    sensor: ndsi.Sensor
          ^
SyntaxError: invalid syntax

world - [INFO] launchables.world: Process shutting down.

papr 14 September, 2018, 15:17:35

Which version of python do you use @user-5787a9 ?

papr 14 September, 2018, 15:18:19

Please be aware that Pupil requires Python 3.5 or higher

user-5787a9 14 September, 2018, 15:36:10

import sys; print(sys.version)
3.5.2 (default, Nov 23 2017, 16:37:01) [GCC 5.4.0 20160609]

user-5787a9 14 September, 2018, 15:37:52

(and of course I run python3 make.py)

user-5787a9 14 September, 2018, 16:29:45

OK, let's try it another way. I've downloaded the Pupil apps from the website, latest version. I want to add this plugin: https://github.com/jesseweisberg/pupil, the object_detector_app specifically. I've read in the docs that I just need to create a folder like this: /home/pupil_capture_settings/plugins/object_detector_app

user-5787a9 14 September, 2018, 16:30:32

where basically i just copy the folder from the original repo by jesseweisberg to that plugins folder

user-5787a9 14 September, 2018, 16:32:11

I've also noticed that in the init.py of the object_detector_app there's a path to Jesse's system (/home/jesse/dev/pupil/pupil_src/shared_modules/object_detector_app), so I've changed it to the path pointing to where the object_detector_app is on my machine

user-5787a9 14 September, 2018, 16:32:27

but when starting the pupil app the plugin is not in the list

user-5787a9 14 September, 2018, 17:07:09

@papr If I change "sensor: ndsi.Sensor" to "sensor = ndsi.Sensor", the problem on that line seems to go away, but I need to comment out the ["sensor"] part. Same thing for the similar instances after that one. But another problem arises with msgpack: "msgpack.exceptions.ExtraData: unpack(b) received extra data." 😩

papr 14 September, 2018, 18:58:47

@user-5787a9 yeah the sensor stuff are type annotations but I thought they were compatible with 3.5

papr 14 September, 2018, 18:59:20

@user-5787a9 the other stuff might be related to the Capture version that you are running

user-5787a9 14 September, 2018, 19:27:45

When I ran the docker with Python 3.6 those errors didn't show up. I'll try a fresh install with it next week to be sure of that... In the meantime I can try to port Jesse's object detector in a plugin fashion so it can be loaded by the normal app

user-5787a9 14 September, 2018, 19:28:15

Thank you for your help ☺ I'll update you as soon as I have news

user-2798d6 16 September, 2018, 01:50:00

Hello all! I've tried searching this, but Discord crashes when I do, so I'll ask here. I'm trying to get more information on the Accuracy Visualizer. Do any of you all have ways to improve accuracy with manual marker calibration?

user-591137 16 September, 2018, 02:45:03

Hello, guys

user-2798d6 16 September, 2018, 18:54:00

One other question - when I use fingertip calibration in Capture, my frame rate drops drastically. The scene is moving in slow motion. Is there a fix for that? Thank you!

wrp 17 September, 2018, 04:31:36

@user-2798d6 how exactly are you performing the manual marker calibration? How many positions? What accuracy are you seeing? Have you tried the single marker calibration method using a manual marker (select this option from the drop down menu)?

wrp 17 September, 2018, 04:31:59

@user-591137 Hi πŸ‘‹ welcome to Pupil community chat!

wrp 17 September, 2018, 04:32:37

@user-2798d6 fingertip calibration frame rate will drop if you are using a machine without an NVIDIA GPU. If you click V to turn off the visualization, the frame rate will speed up even on non-NVIDIA-GPU machines.

wrp 17 September, 2018, 04:33:32

V visualizes the fingertip points within the world view. Clicking V in the world GUI will enable/disable the visualization. You may want to start with the visualization on so that you can see that it is working, and then disable it when you calibrate if you want to speed up the frame rate, for example.

user-2798d6 17 September, 2018, 14:39:06

@wrp We are performing manual marker calibration with the marker printed on an 8.5 x 11 page. We are standing about 6 feet away from the participant and using about 9 positions (a square and then a center point).

user-2798d6 17 September, 2018, 14:49:31

Not sure if this is the place to ask this, but my left eye camera is not working correctly. It started by freezing a lot, and then today was flickering and cutting off entirely. I'm not well versed in working on electronics, so I'm hesitant to mess with it too much myself. Do you all have any suggestions for this? Thank you!

user-072005 17 September, 2018, 15:17:27

Hi, I'm having a problem with the Pupil Mobile app where the world camera recording has some problem. It produces a file of, say, 500 MB, but the video says it is 00:00:00 long. This causes Player to not be able to load the files. In effect, I lose that entire run. Is it possible to recover these files somehow?

papr 18 September, 2018, 07:32:18

@user-2798d6 please write an email to info@pupil-labs.com concerning the hardware issue

papr 18 September, 2018, 07:32:58

@user-072005 which version of Pupil Mobile did you use to make the recording?

papr 18 September, 2018, 07:40:38

@user-2798d6 also to clarify @wrp's statement: What lowers the frame rate is not the visualization itself but the processing of the neural network that does the fingertip detection. The NN processing is turned on if the calibration is active or if the visualization is enabled. Therefore you will probably experience frame drops during the calibration as well. This might result in a worse calibration result.

user-072005 18 September, 2018, 11:46:40

@papr when I posted, the version before the current one, but I went out yesterday after updating the app and tested again, and it happened again

user-072005 18 September, 2018, 11:52:10

Has anyone else reported this problem? I see it isn't on GitHub. I didn't encounter it earlier when I was collecting data in March, but it became a big problem when I went out again in June, and it continues. But if no one else has this problem, could it be that I'm using a Samsung Galaxy S8?

user-29e10a 18 September, 2018, 11:56:23

@user-072005 @papr maybe this is related to my issue? https://github.com/pupil-labs/pupil/issues/1275 ... I'm not using Pupil Mobile, but perhaps something in the network pipeline is broken...? We use the Frame Publisher, which generates a high load, too.

user-29e10a 18 September, 2018, 11:56:54

this issue is driving us nuts

user-072005 18 September, 2018, 12:19:41

I'm not using capture and I don't see a timestamps file in the files that work, but I'm not the most savvy in this field. It does sound like a similar situation. The problem started at the same time as I started getting a bunch of folders that just have two files key_00040000.data and key_00040000.time in them. But this happens both when the video works and when it doesn't. Not sure if that could be related

user-29e10a 18 September, 2018, 12:23:22

@user-072005 do you have time to test it with v1.6? I'm pretty sure our issue exists after 1.7 was introduced...

user-072005 18 September, 2018, 12:23:43

v1.6 of player?

user-072005 18 September, 2018, 12:23:54

or... mobile?

user-072005 18 September, 2018, 12:24:49

The issue is with the recorded files in my phone that I made with pupil mobile

user-29e10a 18 September, 2018, 12:30:56

ok, I see, nevermind πŸ˜ƒ

user-019256 18 September, 2018, 16:34:28

Hey everyone, I am trying to change the gaze mapper to Dual_Monocular_Gaze_Mapper to get gaze data for each of the eyes separately. Is there an easy way to do that? And would it even be the right thing to do? In the end, I would like to have gaze data on a surface for each eye separately. Will the export function in Pupil Player create a fitting gaze_positions.csv or gaze_positions_on_surface.csv with columns for each eye if I just change the gaze mapper? Thank you very much in advance, Andreas

papr 18 September, 2018, 19:26:26

@user-019256 Yes, this would work. You will need to adjust finish_calibration.py since this is the file that selects the Binocular_Mapper over the Dual_monocular_Mapper

user-a08c80 18 September, 2018, 22:53:02

What are the principles of using a new scene camera together with the Pupil headset's eye cams? We have a USB out from our camera, but it is not being detected as a source in Capture.

user-6dcf68 19 September, 2018, 07:03:27

Heya folks - could anyone be of assistance at this hour?

papr 19 September, 2018, 07:08:56

@user-6dcf68 hey, what's up? πŸ™‚

papr 19 September, 2018, 07:09:36

@user-a08c80 on which operating system is that?

user-6dcf68 19 September, 2018, 07:36:08

Howdy! Trying to get this installed on windows 10 for class. Https://github.com/pupil-labs/pupil/releases/tag/v1.8

user-6dcf68 19 September, 2018, 07:38:48

Attempted to unzip. I installed python too. It's not running

papr 19 September, 2018, 08:15:46

@user-6dcf68 You do not need to install python explicitly

papr 19 September, 2018, 08:16:44

The zip files that you can download on the linked github page are bundled applications. They should run without having to install anything manually.

papr 19 September, 2018, 08:17:24

Just unzip the 7zip file and execute Capture.exe. The first time with admin rights to install drivers

user-6dcf68 19 September, 2018, 13:59:13

sir. you are the greatest. I cannot possibly thank you enough. This app is great!

user-a7d017 20 September, 2018, 01:02:56

@wrp @papr Hi, for a class project I want to modify the offline fixation detector into an offline saccade detector, using a basic saccade detection algorithm. Can someone explain/point to where the offline fixation detector reads the data for a previously recorded video, i.e. when using the fixation detector in Pupil Player rather than Pupil Capture? I've gone over the source code of the fixation detector plugin but I'm not sure.

papr 20 September, 2018, 09:05:49

@user-a7d017 Hey, this sounds like a cool project! πŸ™‚ The fixation_detector.py consists of multiple parts:

  • Online Fixation Detector
  • Offline Fixation Detector
  • Shared code for both above

This is the detection function that the fixation detector calls in a background process: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/fixation_detector.py#L152

It takes gaze data as input and yields detected fixations. You should be able to modify it to yield saccades.
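
For orientation only, a minimal velocity-threshold saccade sketch over gaze data. This is not the Pupil implementation; gaze_data is assumed to be a time-sorted list of gaze dicts with 'timestamp' in seconds and 'norm_pos' in normalized coordinates, and the threshold value is arbitrary:

import numpy as np

def detect_saccades(gaze_data, velocity_threshold=1.5):
    """Yield (start_ts, stop_ts) spans where normalized gaze speed exceeds the threshold."""
    ts = np.array([g["timestamp"] for g in gaze_data])
    xy = np.array([g["norm_pos"] for g in gaze_data])
    speed = np.linalg.norm(np.diff(xy, axis=0), axis=1) / np.diff(ts)  # normalized coords per second

    start = None
    for i, fast in enumerate(speed > velocity_threshold):
        if fast and start is None:
            start = ts[i]
        elif not fast and start is not None:
            yield start, ts[i]
            start = None
    if start is not None:
        yield start, ts[-1]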

user-5787a9 20 September, 2018, 13:56:57

@papr Hey, I've followed the installation instructions on the pupil-docs page; I'm running Ubuntu 16. The OpenCV that gets installed is 4, so running main.py fails because it tries to load opencv2/core.hpp. With OpenCV 4 the correct path is /usr/local/include/opencv4/opencv2, but if I change it to point there, it fails again because core.hpp includes other files in the same way... I don't think it is a good idea to change every path in the local files, so is there a clean way to correctly load the opencv2 headers?

papr 20 September, 2018, 13:58:28

@user-5787a9 Check out an older opencv version after cloning the git repository

papr 20 September, 2018, 13:58:37

e.g. v3.2

user-5787a9 20 September, 2018, 14:16:05

I resolved it by making a copy of the entire opencv2 folder on top; now the headers are found

user-5787a9 20 September, 2018, 15:03:18

@papr ok now the error is the following, don't know what to do:

user-5787a9 20 September, 2018, 15:24:38

MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
world - [INFO] launchables.world: Application Version: 1.8.46
world - [INFO] launchables.world: System Info: User: optolab, Platform: Linux, Machine: stark, Release: 4.15.0-34-generic, Version: #37~16.04.1-Ubuntu SMP Tue Aug 28 10:44:06 UTC 2018
world - [INFO] pupil_detectors.build: Building extension modules...
world - [INFO] calibration_routines.optimization_calibration.build: Building extension modules...
world - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend
Exception in thread Thread-3:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "main.py", line 160, in log_loop
    topic,msg = sub.recv()
  File "/home/optolab/pupil/pupil_src/shared_modules/zmq_tools.py", line 99, in recv
    payload = self.deserialize_payload(remaining_frames)
  File "/home/optolab/pupil/pupil_src/shared_modules/zmq_tools.py", line 110, in deserialize_payload
    payload = serializer.loads(payload_serialized, encoding='utf-8')
  File "msgpack/_unpacker.pyx", line 208, in msgpack._unpacker.unpackb
msgpack.exceptions.ExtraData: unpack(b) received extra data.

user-5787a9 20 September, 2018, 15:24:52

the only difference is that I've installed tensorflow-gpu now

user-5787a9 20 September, 2018, 15:25:15

don't know if something broke because of that, a minute ago I was able to start everything

user-a08c80 20 September, 2018, 15:26:38

@papr we have access to both OSX and Windows 10. Currently mac seems to be working better

user-5787a9 20 September, 2018, 16:01:49

yeah it seems that tensorflow breaks something in msgpack. If I uninstall it everything works again, problem is I need tensorflow...

user-e267d1 20 September, 2018, 20:51:50

Hi everyone, I'm trying to sync two pupil labs eye-trackers together wirelessly. I purchased two mobile bundles and have the app running on both phone but cannot get them to see each other. Is there a setting in the pupil service application that allows the simultaneous viewing of two phones?

papr 20 September, 2018, 21:29:27

@user-e267d1 What do you mean see each other?

papr 20 September, 2018, 21:31:24

@user-a08c80 On windows you need to manually install the libusbk drivers for your other camera

papr 20 September, 2018, 21:31:45

@user-5787a9 Are you sending anything to the ipc via pupil remote or something similar?

user-e267d1 20 September, 2018, 22:25:01

@papr I wish to view and record the output of two pupil mobile phones simultaneously. Pupil capture (Mac) is configured in a way that only allows me to see the output of one phone (first person view and eyecam). The end goal is to hit record and have both phones capturing footage while synced.

user-591137 21 September, 2018, 03:11:00

hey, a friend told me about this server

user-591137 21 September, 2018, 03:11:08

and... he said that I could make my own eye thingy

user-591137 21 September, 2018, 03:11:11

How can I do that?

wrp 21 September, 2018, 03:15:25

@user-591137 please see https://docs.pupil-labs.com/#diy

user-591137 21 September, 2018, 03:15:44

Oh, that looks pretty cool

user-591137 21 September, 2018, 03:15:53

Unfortunately looks like I must be smart to make that kind of stuff 😦

wrp 21 September, 2018, 03:16:38

If you want to go DIY route, the link above provides instructions for building hardware. If you want to get an eye tracking headset (e.g. not build yourself) please see - https://pupil-labs.com/store and then download software from https://github.com/pupil-labs/pupil/releases/latest

user-5787a9 21 September, 2018, 06:13:39

@papr nope nothing. I've tried yesterday with version 1.5 of pupil and it works with both msgpack and tensorflow installed. Today I'll try with version 1.7

user-5787a9 21 September, 2018, 07:54:26

'kay, versions 1.8 and 1.7 of pupil do not work, but from 1.6 and below the problem goes away. In my opinion it is just a call to a function that exists in both tensorflow and msgpack; downgrading both does not resolve the issue, unfortunately.

user-81d8b5 21 September, 2018, 11:31:22

I received my Pupil Labs glasses, but I can't get them to work. I get a sound that the device is connected, but there is no response afterwards (i.e., no further installation of any driver). I tried both Apple and Windows machines, but to no avail. Anyone have an idea?

wrp 21 September, 2018, 11:33:51

@user-81d8b5 what headset configuration are you using?

wrp 21 September, 2018, 11:34:37

@user-81d8b5 What versions of macos and windows?

user-81d8b5 21 September, 2018, 11:37:14

the binocular and high speed

user-81d8b5 21 September, 2018, 11:37:40

using latest version of windows and macOS

wrp 21 September, 2018, 11:39:21

Can you ensure that the USB-C cable is firmly connected to the collar clip?

user-81d8b5 21 September, 2018, 11:39:32

yes, I even tried different cables

wrp 21 September, 2018, 11:39:37

In windows please run pupil capture as admin

user-81d8b5 21 September, 2018, 11:40:16

The glasses just don't show up, not even in the device manager

user-81d8b5 21 September, 2018, 11:40:42

It gives a connected sound, but there's no further action

user-81d8b5 21 September, 2018, 11:41:56

the drivers don't get installed

wrp 21 September, 2018, 11:47:22

Are you running the latest version of Pupil Capture, and did you right-click and run it as admin?

user-81d8b5 21 September, 2018, 11:47:29

yes i do

user-81d8b5 21 September, 2018, 11:49:23

could it be that the usb clip is broken?

wrp 21 September, 2018, 11:50:43

Ok, we will follow up via email to discuss further

user-81d8b5 21 September, 2018, 11:51:37

which email?

user-67e255 21 September, 2018, 12:02:13

@user-81d8b5 we've sent further instructions via our info@ email, thanks!

user-e267d1 21 September, 2018, 15:18:39

Hi, I'm trying to connect my Pupil Mobile headset to my Mac using the Pupil Labs software; however, when I view the stream, the headcam view keeps entering ghost mode (every ~3 seconds). Any tips on how to prevent this?

user-e267d1 21 September, 2018, 17:16:36

I am also trying to use an external microphone connected to the mobile phones, is there a setting where it uses my external microphone and not the phone's internal mic?

user-e267d1 21 September, 2018, 17:41:19

Another question that we have is that when using two phones, we want to be able to start recording at the same time. We can do independent recordings, but would like a way to sync the output of both phones, either using timestamps or some other method. Does anyone have experience in this?

papr 21 September, 2018, 19:55:10

@user-e267d1 which version of Pupil Mobile and Capture are you using?

user-3f0708 21 September, 2018, 20:22:46

In Pupil, is it possible to obtain the fixation coordinates for each frame during tracking, with the corresponding time?

papr 21 September, 2018, 20:27:03

@user-3f0708 yes, this information should be available after exporting

user-3f0708 21 September, 2018, 20:27:59

I will analyze the files after export

papr 21 September, 2018, 20:30:46

It might be possible that the exported data only includes start and stop frame indices πŸ€” but I will have to look up the details

user-3f0708 21 September, 2018, 20:31:28

        # smoothing out the gaze so the mouse has smoother movement
        smooth_x += 0.35 * (raw_x-smooth_x)
        smooth_y += 0.35 * (raw_y-smooth_y)
        x = smooth_x
        y = smooth_y

Will changing this value 0.35 have much impact on mouse movement during the execution of the mouse_controll.py script?
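
For context (not part of the original thread): the snippet applies an exponential moving average, and 0.35 is its smoothing factor. A larger value makes the cursor follow the raw gaze more closely (less smoothing, less lag); a smaller value gives smoother but laggier movement. A toy illustration with made-up values:

raw = [0.0, 1.0, 1.0, 1.0, 1.0]         # hypothetical raw gaze x values after a sudden jump

for alpha in (0.1, 0.35, 0.9):
    smooth = 0.0
    trace = []
    for r in raw:
        smooth += alpha * (r - smooth)  # same update rule as in the snippet above
        trace.append(round(smooth, 3))
    print(alpha, trace)                 # larger alpha converges to the new position faster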

papr 21 September, 2018, 20:34:14

@user-3f0708 sorry, is this related to your previous question about fixations?

user-3f0708 21 September, 2018, 20:39:05

No

user-965faf 22 September, 2018, 08:29:10

Hi all, could someone provide me with a list of psychological experiments using Pupil Labs equipment that investigated changes in pupil dilation in sustained attention tasks? Someone provided this a few months ago and I can't for the life of me find it again

papr 22 September, 2018, 10:20:22

@user-965faf do you mean our citation list? https://docs.google.com/spreadsheets/u/1/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/htmlview#gid=0

user-965faf 22 September, 2018, 11:50:50

Thanks a lot @papr

user-c6a3b8 22 September, 2018, 17:44:30

hi

user-c6a3b8 22 September, 2018, 17:44:54

Can we make a heatmap video from a saved Pupil eye tracker file?

user-c6a3b8 22 September, 2018, 17:45:12

if yes please help me how

user-965faf 22 September, 2018, 17:46:20

Hi again.

user-c6a3b8 22 September, 2018, 17:47:58

hi jahoda

user-965faf 22 September, 2018, 17:48:19

Has anyone here managed to successfully integrate PsychoPy and Pupil? I've been looking at the old discussions about this at the start of the year, and have looked at the helper scripts, but I'm a bit lost. If anyone's made a psych experiment analysing PD with PsychoPy and Pupil, I'd appreciate a brief rundown of how you went about it.

user-c6a3b8 22 September, 2018, 17:49:00

please help me

user-19417c 22 September, 2018, 22:09:05

Hi all, just want to know if anyone works with Pupil here? I'm from Canada and want details about ordering one.

mpk 23 September, 2018, 05:11:39

@user-19417c I think most people in this forum have Pupil headsets. Ordering is pretty straightforward: just go to the website and check out the store. If you want more info about the order process, just write an email to info@pupil-labs.com

user-f99340 24 September, 2018, 10:55:58

Hi everyone, I just bought two Pupil Labs headsets and I have some serious issues when I run Capture on a MacBook Pro

user-f99340 24 September, 2018, 10:56:10

Is this the right place to ask?

wrp 24 September, 2018, 10:56:32

Hi @user-f99340 πŸ‘‹ - Yes this is the correct place to ask questions.

wrp 24 September, 2018, 10:57:40

Please could you provide us details about the headset configuration (e.g. high speed world camera + 200hz binocular eye cameras) as well as details about the macOS version you are using. Please also describe the behavior you are observing or the issue you are facing so that we can provide concrete feedback.

user-f99340 24 September, 2018, 10:58:06

Ok

user-f99340 24 September, 2018, 10:58:45

high speed world camera + 200hz binocular eye cameras

user-f99340 24 September, 2018, 10:59:13

On MacBook pro

user-f99340 24 September, 2018, 10:59:42

With High Sierra 10.13.6

user-f99340 24 September, 2018, 11:00:19

Graphics: Intel HD Graphics 630, 1536 MB. I am mentioning it because it might be related

user-f99340 24 September, 2018, 11:00:40

Problem

user-f99340 24 September, 2018, 11:01:30

When I start Capture everything looks all right

user-f99340 24 September, 2018, 11:01:32

When I make the window Full Screen

user-f99340 24 September, 2018, 11:02:34

Capture becomes all blue

user-f99340 24 September, 2018, 11:03:46

If I minimize again it works again

Chat image

user-f99340 24 September, 2018, 11:04:09

When I press calibration

wrp 24 September, 2018, 11:04:10

@user-f99340 thanks for the detailed description and screenshot

user-f99340 24 September, 2018, 11:04:26

It goes all white in full screen

user-f99340 24 September, 2018, 11:04:35

And then when I cancel calibration

user-f99340 24 September, 2018, 11:04:42

All windows are messed up

wrp 24 September, 2018, 11:05:35

@user-f99340 just to confirm, you are using the latest version of Pupil Capture - v1.8 - correct?

user-f99340 24 September, 2018, 11:06:49

Yes

wrp 24 September, 2018, 11:06:58

@user-f99340 I am not able to reproduce this issue on macOS, however this could be related to graphic drivers and Retina display. @papr please could you look into this behavior and make an issue based on @user-f99340 feedback.

papr 24 September, 2018, 11:16:06

@wrp Sure πŸ‘

papr 24 September, 2018, 16:14:27

@user-f99340 Just out of interest, how do you connect your headset to the laptop?

papr 24 September, 2018, 16:28:15

I tried to reproduce your issue -- my setup:
- 13" MacBook Pro, early 2015, Intel Iris Graphics 6100, 1536 MB, Retina Display
- Bundled Pupil Capture, v1.8.26

Neither of the above issues (1. blue screen on fullscreen, 2. white screen during calibration) happens for me. I think this issue is a duplicate/similar to https://github.com/pupil-labs/pupil/issues/1214

user-db5f49 25 September, 2018, 10:02:56

anybody free for a Chat ?

papr 25 September, 2018, 10:03:17

Hey @user-db5f49 Whats up? πŸ™‚

user-db5f49 25 September, 2018, 10:03:49

we're in class

user-db5f49 25 September, 2018, 10:03:55

from Germany

wrp 25 September, 2018, 10:36:09

Hi @user-db5f49 welcome to Pupil community chat πŸ‘‹

user-8779ef 25 September, 2018, 13:13:33

Hey folks. I'm looking over some old data collected by a student, and see some high frequency noise in more than one dataset. Any idea where this might have come from? Camera gain too high, perhaps?

Chat image

user-8779ef 25 September, 2018, 13:13:51

High frequency noise = black dots

papr 25 September, 2018, 13:33:15

@user-8779ef was this data recorded using the 200 Hz cameras?

papr 25 September, 2018, 13:35:19

@user-8779ef this might be related to this https://github.com/pupil-labs/pupil/issues/1116

user-52cbe2 25 September, 2018, 13:42:06

Hello. I got my eyetracker a little while ago, but am just starting to use it now.

user-52cbe2 25 September, 2018, 13:45:18

I got pupil_capture, etc. from the github repo, and everything seems to be working fine. Now I would like to do some programming: write a Python program to consume the pupil and gaze data being sent out, I presume, by pupil_capture. I've been looking through the documentation, but can't seem to find a simple working example or instructions to do this. (If I've missed it, I apologize.) Where are the Python libraries I need to install? Are there examples?

user-52cbe2 25 September, 2018, 13:46:03

NB: I don't want to modify your application or library, I just want to consume data from it using a Python program. Thanks a lot!

papr 25 September, 2018, 13:48:12

This can be used to consume pupil data

user-52cbe2 25 September, 2018, 13:50:24

Thank you.

user-52cbe2 25 September, 2018, 13:51:00

Is there a higher-level API that would allow you to do things like

user-52cbe2 25 September, 2018, 13:51:19

import eyetracker

user-52cbe2 25 September, 2018, 13:51:39

tracker = eyetracker.Eyetracker()

papr 25 September, 2018, 13:51:49

No, unfortunately not

user-52cbe2 25 September, 2018, 13:51:53

tracker.get_gaze()

papr 25 September, 2018, 13:52:14

The highest level api would be to use the network interface as in the example

user-52cbe2 25 September, 2018, 13:53:34

You mean the filter_messages.py example?

papr 25 September, 2018, 13:54:15

Yes

user-52cbe2 25 September, 2018, 13:54:38

OK, I'll try. Thank you.
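
For reference, a minimal sketch of that approach, assuming Capture is running locally with the default Pupil Remote request port (50020) and following the same pattern as filter_messages.py:

import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote (REQ/REP, default port 50020) for the SUB port of the IPC backbone
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to all gaze topics
subscriber = ctx.socket(zmq.SUB)
subscriber.connect("tcp://127.0.0.1:{}".format(sub_port))
subscriber.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.unpackb(payload, raw=False)
    print(topic.decode(), gaze["norm_pos"], gaze["confidence"])

Subscribing to "pupil." instead of "gaze." yields the raw pupil data; both payloads are msgpack-encoded dicts.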

user-61ae37 25 September, 2018, 13:57:33

Hell all. So I have a question: is there any way to get the software to output an error coordinate, or just all 0s, when a frame is dropped? In the data I'm working with from a lab, the data is shorter than the recorded task, and chunks of the data are gone; we lost 20 sec in one case (and it's a hand task that only takes 10 sec to do and then repeat). From what I understand, if the surface you define is gone from the frame, or the eyes go somewhere else, that coordinate isn't recorded and it hops to the next one. I need the data to cover the full time it's being recorded, even if some cells have no data, so the timing matches up. Since we are trying to match the eye tracking with kinematics, 20 sec missing from the data shifts the eye data and then it won't line up with the kinematic movement data.

user-61ae37 25 September, 2018, 13:57:46

Hello not hell whoops

papr 25 September, 2018, 14:04:43

@user-61ae37 could you describe your setup a bit further? You mentioned that you use surfaces? Which data are you missing exactly?

user-019256 25 September, 2018, 14:07:53

Hi! I am trying to run pupil from source on a MacBook Air, but get the following error: ModuleNotFoundError: No module named 'uvc'. I traced the error back to the installation of pyuvc, which shows "Failed building wheel for uvc" and later "ld: library not found for -luvc.0.0.7" along with "error: command 'clang' failed with exit status 1". I think the reason for this is that the installation of libuvc shows the following: "MACOSX_RPATH is not specified for the following targets: uvc".

How could I solve this? And is my assumption right that the whole problem starts at the installation of libuvc?

Thank you very much in advance!

papr 25 September, 2018, 14:24:15

@user-019256 Yes, please make sure that libuvc was installed correctly. The second step is to make sure that pyuvc can find the libuvc installation.

user-019256 25 September, 2018, 14:51:22

Thank you for the quick answer. Do you have any idea how I could fix my installation of libuvc? The error I get is: Policy CMP0042 is not set: MACOSX_RPATH is enabled by default. Run "cmake --help-policy CMP0042" for policy details. Use the cmake_policy command to set the policy and suppress this warning.

MACOSX_RPATH is not specified for the following targets:

uvc

papr 25 September, 2018, 14:52:46

I would start by following the instructions and run cmake_policy πŸ™‚

user-019256 25 September, 2018, 15:33:43

OK, I found the mistake. After installing libuvc again, the error "ld: library not found for -luvc.0.0.7" still appeared when trying to install pyuvc. I noticed that only libuvc 0.0.8 was installed on my system, so I changed the line libs = ['turbojpeg', 'uvc.0.0.7'] to libs = ['turbojpeg', 'uvc.0.0.8'] in setup.py in the pyuvc folder. After that everything works. Could it be that the current version of libuvc is not suitable for the current version of pyuvc? Or is this just a problem on my system? Either way, it works now 😃

user-61ae37 25 September, 2018, 15:44:25

@papr From what I can tell, full sets of eye coordinates get dropped. Say the video shows a recording of 5:00 min; the Excel export, when you take the full time it outputs, only covers 4:30 min. The problem is I'm not sure which parts of the data drop. We do eye tracking during an upper-extremity prosthetic task. There are two sides of the table; they have to pick up a disc and transport it. Because of head rotation, the world view camera will lose half the table at points. So what this set of PhDs did was create three surfaces as AOIs: the right side of the table (start), the left side of the table (end), and the prosthetic. What I think happens is: if the head moves so the prosthetic is out of the world view, and the eyes go to that segment that isn't in the world view, the data doesn't populate and the export just cuts the output short. I'm not 100% sure.

papr 25 September, 2018, 16:04:57

@user-019256 Lib version naming is very tricky to get right on all platforms unfortunately.

marc 26 September, 2018, 08:01:57

Hey @user-61ae37 ! Could you specify in more detail what is missing in which file? Let me summarize what data you can expect to get:

  • If you are recording the scene with say 30 Hz and the eyes with 200 Hz, you should of course get multiple gaze estimates per scene frame. In the gaze_positions.csv file of your recording you should therefore always see multiple rows with the same index value. The index value identifies the scene video frame that the gaze estimate belongs to.

  • For every gaze estimate that is being made, the software tries to map it to all surfaces. The necessary requirement for this mapping is that the surface was detected in the corresponding world frame. All successful mappings are saved in surfaces/gaze_positions_on_surface_<surface_name>_<surface_id>.csv. If your surface is always visible, you should again see multiple rows with the same world_frame_idx. However, if your surface is ever not detected in the scene, no further rows will be added to this file until it reappears in the scene. Therefore there can be gaps between the world_frame_idx values or timestamp values. If you subtract the set of world_frame_idx values from the set of index values, you get the indices of the frames where the surface detection was not successful (see the sketch after this message).

  • The file surfaces/srf_positons__<surface_name>_<surface_id>.csv contains detection results for the surface, i.e. there is exactly one row for each scene frame with a successful surface detection. The frame_idx value identifies the scene frame this detection belongs to.

Maybe this clears things up a bit? If not, please let me know what data exactly is missing in which files so we can trace the source of the problem!
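
A minimal sketch of that set-difference check, assuming the column names described above; the export path and surface name below are hypothetical, so adjust them to your recording:

import csv

# Hypothetical paths inside an exported recording -- adjust surface name/id to yours
GAZE_CSV = "exports/000/gaze_positions.csv"
SURFACE_CSV = "exports/000/surfaces/gaze_positions_on_surface_Start_0.csv"

def frame_indices(path, column):
    # Collect the set of scene-frame indices referenced in one CSV column
    with open(path, newline="") as f:
        return {int(float(row[column])) for row in csv.DictReader(f)}

all_frames = frame_indices(GAZE_CSV, "index")
mapped_frames = frame_indices(SURFACE_CSV, "world_frame_idx")

# Scene frames with gaze data but no successful mapping onto this surface
missing = sorted(all_frames - mapped_frames)
print(len(missing), "scene frames without a surface mapping, e.g.", missing[:10])

Each missing index is a scene frame in which no gaze was mapped onto the surface, which is where the apparent gaps in the exported timeline come from.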

user-f2aed7 26 September, 2018, 12:01:42

Hi! I am using the pupil headset (200 Hz binocular) for research purposes and I would like to know if you have an EC declaration of conformity (necessary for ethical approvals..)

papr 26 September, 2018, 12:03:53

We have an EC declaration

user-f2aed7 26 September, 2018, 12:04:31

@papr thanks for your quick answer! can I get this file somehow?

papr 26 September, 2018, 12:10:00

Please contact info@pupil-labs.com for that

user-f2aed7 26 September, 2018, 12:13:56

@papr done! thank you

user-b0c902 26 September, 2018, 13:03:48

Chat image

user-b0c902 26 September, 2018, 13:04:31

Hi, we are not sure about how to resolve this error.

papr 26 September, 2018, 13:04:39

Hey @user-b0c902 You should have gotten a pop-up asking for permissions. Did you deny it?

papr 26 September, 2018, 13:05:14

I think the easiest way would be to reinstall the app.

user-b0c902 26 September, 2018, 13:06:31

Hmm, not sure about the pop up. Never saw it before. Will check again and re-install it. Thanks!

user-b4961f 27 September, 2018, 11:45:26

Hi, I have a question regarding the mobile app use. is there a way for me to calibrate the headset via a connection to a computer, then taking it to the field (far from wifi), and still use the calibration? can I use the user_calibration_data file somehow with the pupil player offline tracking?

papr 27 September, 2018, 12:13:27

@user-b4961f I recommend to use offline calibration in this case. Just record the calibration procedure on site.

user-b4961f 27 September, 2018, 12:21:11

and is there a way to validate it, before running a long recording session?

user-b4961f 27 September, 2018, 12:22:18

could I take an offline calibration from one recording and use it for a different recording with offline tracking?

papr 27 September, 2018, 12:22:26

No, unless you run it in parallel.

papr 27 September, 2018, 12:22:40

This is not possible yet but is planned

papr 27 September, 2018, 12:23:18

I highly recommend to recalibrate in regular intervals though.

user-b4961f 27 September, 2018, 12:26:47

I see. You mean every 10 minutes or so run a calibration procedure, and the offline tracking will know to disregard an old calibration? Or should I start the player again for each calibration?

papr 27 September, 2018, 12:27:45

You have two options: 1. One long recording including all calibration procedures. 2. Start a new recording before each calibration.

papr 27 September, 2018, 12:28:26

The offline calibration allows you to define multiple calibration and mapping ranges, which is helpful for case 1.

user-b4961f 27 September, 2018, 12:29:49

OK thank you. And you probably can't give me a definite answer, but if I save a user_calibration_data file, might it be useful in the future? That is, is it in your near-future plans to implement it?

papr 27 September, 2018, 12:33:22

Actually, the user calibration file is not useful and is only meant as temporary storage of the calibration data.

Capture stores all calibrations during a recording as notifications. And at the start of a recording, the content of an existing user_calibration_data file will be republished as a notification.

user-b4961f 27 September, 2018, 13:00:57

And the player does it differently? Could a plugin in the player use this republished notification as a calibration?

papr 27 September, 2018, 13:03:53

You mean as import into an other recording? No.

As I said, we are planning to add the functionality to export and import calibrations. This is the open issue as reference: https://github.com/pupil-labs/pupil/issues/1003

user-b4961f 27 September, 2018, 13:07:04

Thank you. One more thing, do you see any issue that would not allow me to use the mobile app with the HMD add-on to make it wireless? I asked it in the HMD group, but no one answered..

papr 27 September, 2018, 13:28:38

@user-b4961f You mean by streaming the HMD add-on's cameras to the PC running Capture?

papr 27 September, 2018, 13:28:44

This should work.

user-ecb916 27 September, 2018, 14:14:35

Hi,

user-ecb916 27 September, 2018, 14:16:23

I want to use my heatmaps for a congress abstract and need a legend to explain the colors of the map. I checked the export folder and couldn't find anything like that. Is there a special way to create one?

papr 27 September, 2018, 14:32:08

The values for each pixel are normalized by the maximum value for each surface. This means that surface heatmaps are not comparable to each other.
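
If a relative legend is sufficient for the abstract, here is a minimal matplotlib sketch. It assumes a jet-like colormap (check it against your exported image) and the per-surface normalization described above; it is not part of the official export:

import matplotlib as mpl
import matplotlib.pyplot as plt

# Relative scale: 0 = no gaze samples, 1 = most-fixated bin of this particular surface
norm = mpl.colors.Normalize(vmin=0.0, vmax=1.0)
mappable = mpl.cm.ScalarMappable(norm=norm, cmap="jet")
mappable.set_array([])  # required by older matplotlib versions

fig, ax = plt.subplots(figsize=(1.2, 4))
fig.colorbar(mappable, cax=ax, label="relative gaze density (per surface)")
fig.savefig("heatmap_legend.png", bbox_inches="tight")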

papr 27 September, 2018, 16:08:04

@user-8779ef We were able to reproduce your PyCharm multiprocessing debug issue!

papr 27 September, 2018, 16:11:16

For reference: https://github.com/pupil-labs/pupil/pull/1306

user-988d86 27 September, 2018, 16:18:07

Is there any way to send messages through ZMQ about whether Pupil Capture is in the process of recording?

papr 27 September, 2018, 17:15:05

@user-988d86 No, you cannot request the current state. Currently, you can only listen for state changes: rec start and stop.
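
A minimal sketch of listening for those state changes, assuming Capture's default Pupil Remote port (50020) and the notify.recording.started / notify.recording.stopped notification topics:

import zmq
import msgpack

ctx = zmq.Context()

# Request the SUB port from Pupil Remote
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Listen only for recording start/stop notifications
subscriber = ctx.socket(zmq.SUB)
subscriber.connect("tcp://127.0.0.1:{}".format(sub_port))
subscriber.setsockopt_string(zmq.SUBSCRIBE, "notify.recording.started")
subscriber.setsockopt_string(zmq.SUBSCRIBE, "notify.recording.stopped")

is_recording = None  # unknown until the first notification arrives
while True:
    topic, payload = subscriber.recv_multipart()
    notification = msgpack.unpackb(payload, raw=False)
    is_recording = topic.endswith(b"started")
    print("recording:", is_recording, notification.get("rec_path", ""))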

user-ae156c 28 September, 2018, 14:15:10

Hi, I am trying to use the surface plugin to track whether a worker is looking at a robot during the interaction. As stated in the docs, surface hits are streamed, but are they also recorded? I can't find an array with them in the recordings. I believe it produces a csv with hit counts, but I would be more interested in hit timestamps. Thanks for your help.

user-bf07d4 29 September, 2018, 14:31:12

hello, I'm using a DIY setup and trying to detect the pupil properly. Can somebody tell me what's wrong with the image from the eye camera?

Chat image

mpk 29 September, 2018, 18:07:44

@user-ae156c we actually don't record them. Instead you should use the offline surface tracker for that.

user-ae156c 29 September, 2018, 18:17:18

@mpk Thanks for your reply. Is there a documentation on how to use the offline tracker?

papr 29 September, 2018, 20:14:13

@user-ae156c yes, please see our documentation for details docs.pupil-labs.com

user-81fd53 30 September, 2018, 14:39:47

Hello, How do we stream a video from the world camera of the pupil glasses?

papr 30 September, 2018, 14:40:27

Just to clarify: Do you see the video feed in Capture?

user-81fd53 30 September, 2018, 15:30:39

I do, but how can I use it in my Python code? I want to do object detection on the live video.
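
One possible approach (a sketch, not an answer given in this thread): enable Capture's Frame Publisher plugin, set its format to BGR, and subscribe to the frame.world topic. The assumptions here are the default Pupil Remote port (50020) and the payload keys used by the pupil-helpers frame-receiving examples:

import zmq
import msgpack
import numpy as np

ctx = zmq.Context()

# Request the SUB port from Pupil Remote (default port 50020)
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect("tcp://127.0.0.1:{}".format(sub_port))
subscriber.setsockopt_string(zmq.SUBSCRIBE, "frame.world")

while True:
    # The raw image bytes arrive as extra message frames after the msgpack payload
    topic, payload, *extra = subscriber.recv_multipart()
    msg = msgpack.unpackb(payload, raw=False)
    if msg["format"] != "bgr":
        continue  # this sketch only handles the BGR format
    frame = np.frombuffer(extra[0], dtype=np.uint8).reshape(msg["height"], msg["width"], 3)
    print("got scene frame", msg["index"], frame.shape)  # run your object detector on frame here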

user-c828f5 30 September, 2018, 21:09:13

Hey guys,

user-c828f5 30 September, 2018, 21:09:35

Which file in offline_data holds the calibration points from a natural features calibration?

End of September archive