πŸ‘ core


user-fbd0bd 02 September, 2017, 00:31:36

Hi all. I have just received Pupil hardware. It is good to see you here in Discord! Now I am having an installation problem in a Windows environment, mainly due to Ceres compilation issues. I raised an issue on GitHub ( https://github.com/pupil-labs/pupil/issues/822 ), so if you have time, please share your kind advice. Thank you!

mpk 02 September, 2017, 11:39:38

Hi Injoon, do you have to make changes to the code? Otherwise I would recommend trying the bundle version first: https://github.com/pupil-labs/pupil/releases/tag/v0.9.14

user-a95584 02 September, 2017, 14:59:24

Hi! Yes, unfortunately i need to change the code. 😦

user-27b8ef 02 September, 2017, 22:32:30

Hi all .. just a quick question regarding the data which is being saved in the pupil_data file. Is there any way to extract the eye ball position in the reference frame of the camera? In a dream world, I'd want the 3D position of the eye ball surface when looking straight πŸ˜ƒ

user-27b8ef 02 September, 2017, 22:33:08

Not sure whether that's possible at all, or whether you only have the eye ball position w.r.t. the eye camera(s)

user-41f1bf 02 September, 2017, 23:01:45

Hi Injoon, have you checked plugins out? They will allow you to do great many things quickly without directly changing the source.

user-2d6071 04 September, 2017, 15:57:05

Hi, we are having some trouble with the pupil tracking. Sometimes it is only processing half of the pupil, especially with big pupils. I attach a picture where only half of the pupil is being processed. We have been playing with the minimum and maximum sizes, camera position, and the ROI without success. Do you have any tips to avoid this problem?

Chat image

wrp 04 September, 2017, 15:59:02

@user-2d6071 this is a very distal angle

wrp 04 September, 2017, 15:59:17

Please try moving the camera

papr 04 September, 2017, 16:00:07

@user-2d6071 Please send us an example recording. This looks like a bug. The video would help us to debug the problem.

wrp 04 September, 2017, 16:00:10

@user-2d6071 if you haven't already, please see the good and bad eye images section in docs for reference: https://docs.pupil-labs.com/#3-check-pupil-detection

user-2d6071 04 September, 2017, 16:00:19

we have tried moving it, but we can try again. Do you think that is the reason for the half detection?

user-2d6071 04 September, 2017, 16:00:30

ok, we will record a video πŸ˜ƒ

papr 04 September, 2017, 16:00:52

Simply enable the eye recording option in the recorder menu and hit R

papr 04 September, 2017, 16:01:58

The reason for the half detection seems to be that the secondary ROI (the smaller blueish squares) are not set correctly

wrp 04 September, 2017, 16:02:43

@user-2d6071 are you aware that you can slide the eye camera arm: https://docs.pupil-labs.com/#pupil-headset-adjustments

papr 04 September, 2017, 16:03:19

@wrp The pupil is huge in the picture. This is probably a thresholding issue

papr 04 September, 2017, 16:04:17

@user-2d6071 Do you run from source or from bundle?

user-2d6071 04 September, 2017, 16:04:30

from bundle

user-2d6071 04 September, 2017, 16:09:07

@wrp yes, we move the arm to center the pupil, we reached that off axis image while trying to fix the pupil tracking

papr 04 September, 2017, 16:09:28

We will investigate the issue as soon as we have the recording. In the meantime, I would recommend installing the source requirements so that you can apply changes yourself. This way you will not need to wait for the next bundle

user-2d6071 04 September, 2017, 16:23:59

@papr we have recorded two videos, one in each of the recording modes (big file, less CPU and small file, more CPU); they are available at this link: https://1drv.ms/f/s!AiyjKz0X3Zomg-RL-WIIhnwTy0g-yw The eye which fails is eye 0. Thank you very much for your help and responsiveness.

papr 05 September, 2017, 07:07:15

@Juan#7767 I will have a look at your recording now

user-6419ec 05 September, 2017, 07:54:23

Hi everyone πŸ˜ƒ I was wondering if someone has a ready C# implementation of pupil remote client for getting real time data (ZMQrequest)?

papr 05 September, 2017, 08:04:46

@user-6419ec in case you find an implementation or build one yourself, please do not hesitate to make a pull request to pupil-helpers. This way other people will have easier access to it.

user-6419ec 05 September, 2017, 08:08:34

@papr I'll upload the implementation once it starts working

papr 05 September, 2017, 14:04:08

@Juan#7767 Please set the ROI like this:

Chat image

papr 05 September, 2017, 14:06:30

The combination of dark eyelashes and a huge pupil makes the algorithm fail. It works if you set the ROI such that it excludes the eyelashes. Make sure to increase the pupil max setting if the pupil is as dilated as in your case.

user-2d6071 06 September, 2017, 07:32:10

@papr thank you very much, we will try setting a smaller ROI later, which luckily should work with our experiment, where the pupil position should be stable.

papr 06 September, 2017, 07:47:24

@user-2d6071 Keep in mind that you can always run the offline pupil detection for existing recordings in Pupil Player. This way you can fix old recordings.

user-2d6071 06 September, 2017, 08:32:52

@papr it works now, even with this particular subject!

papr 06 September, 2017, 08:45:29

Great!

user-006924 06 September, 2017, 13:47:36

@wrp sorry for the late response. I'm using the latest Pupil Labs software, version 13, on Windows 10. It appears when I'm using Pupil Player. It doesn't interfere with the workings of the software, at least it seems so. It just closes the software when I want to export data, but the data is already exported and saved in the folder.

user-d811b9 06 September, 2017, 14:26:37

How can I use OFFLINE CALIBRATION if: 1. the calibration process has not been included in my recording, and 2. the calibration process was made with a screen marker?

mpk 06 September, 2017, 16:31:19

@mattiaparentela#1655 you will need to know where the subject was looking for a few locations in the video frame. This can be markers but could also be anything else.

user-d811b9 06 September, 2017, 16:52:35

@mpk so if I asked the subject to look at a particular point in the visual field, and the instrument, because of incorrect calibration, does not locate that point, can I use that element to do an offline calibration?

mpk 06 September, 2017, 17:01:00

That's right but you will need 5 points in the scene

user-d811b9 06 September, 2017, 17:02:11

Ok thanks @mpk

mpk 06 September, 2017, 17:02:46

Let us know how that goes. It's a new feature and feedback is appreciated!

user-ed537d 07 September, 2017, 01:14:35

Anyone available to help with my libuvc import issues?

user-2798d6 07 September, 2017, 01:51:04

Is there a way to get a csv file or something similar that lists fixations and their durations? I've loaded the fixation plugin, but would like to have a file with all of that information if possible!

user-ed537d 07 September, 2017, 03:28:29

The libuvc and pyuvc master versions don't work. I posted my issue

user-ed537d 07 September, 2017, 03:28:39

And a fix/workaround

user-10fa32 07 September, 2017, 05:42:18

Good morning. I have just started to work with the Pupil Labs eye tracker in the HTC Vive. Can somebody guide me on what software I need for the best accuracy?

user-d811b9 07 September, 2017, 11:03:10

@mpk in the offline calibration, must the five points be indicated in the same frame of the recording, or can I establish a series of points over the entire recording?

mpk 07 September, 2017, 11:03:29

a series of points. They need to be looked at by the subject.

user-d811b9 07 September, 2017, 11:12:53

@mpk Sorry, but I don't understand. I'll try to explain better. If I see that the subject is looking at a single element but the instrument misplaced the pointer, can I use that element to recalibrate the instrument? For example, the subject is driving on a road and realizes that there is a STOP sign. From the recording the subject seems to be watching it, but there is an error in the instrument and the pointer does not fall on the stop sign. Can I use this sign to recalibrate the instrument?

papr 07 September, 2017, 11:13:35

Yes, it is possible.

user-d811b9 07 September, 2017, 11:16:49

ok thanks @papr

papr 07 September, 2017, 11:17:25

Create a new section in the offline calibration plugin. Set the calibration type to "natural features". This will allow you to click within the picture to define the location where you think the subject is looking. Set the calibration range such that it includes the frame that you chose to define the point. Repeat for multiple locations where you know that the subject looked, preferably in different parts of the field of view. Afterwards click calibrate.

user-d811b9 07 September, 2017, 11:20:48

Great! Thanks again @papr

papr 07 September, 2017, 11:21:49

No problem

user-6419ec 07 September, 2017, 14:09:38

Hi everyone, I am trying to receive specific parameters in real time via Pupil Remote and a ZMQ subscriber. I found the file filter_messages.py in pupil-helpers/pupil_remote. However, when running the script it always stops here https://github.com/pupil-labs/pupil-helpers/blob/b472415133a1c2c65317c332af0930588ebe8f68/pupil_remote/filter_messages.py#L33 without an exception. Do you have any clue what happened? Or a guess at what I am doing wrong?

papr 07 September, 2017, 14:12:19

@user-6419ec the script waits for incoming messages. If it stops indefinitely at this point, it means that you are not receiving any pupil data. Pupil data is produced and published if at least one eye window is open and its video source is active.

papr 07 September, 2017, 14:13:05

Use the Test Image source to generate zero-confidence test data if you do not have a headset around to test with

user-6419ec 07 September, 2017, 14:15:32

thanks a lot :) Only the world camera was running, this was my fault

papr 07 September, 2017, 14:16:00

Good to hear that the problem could be resolved πŸ˜ƒ
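For anyone stuck at the same point: the flow that filter_messages.py implements can be sketched in a few lines. This is a sketch of the pupil-helpers pattern, not an official snippet; it assumes Pupil Capture is running locally with Pupil Remote on its default port 50020, and it adds receive timeouts so the script fails loudly instead of blocking forever when no pupil data is being published.

```python
"""Sketch of the Pupil Remote subscription flow (pupil-helpers style).
Assumes Pupil Capture runs locally with Pupil Remote on port 50020."""
import msgpack
import zmq


def decode_payload(payload: bytes) -> dict:
    # Pupil publishes msgpack-encoded dicts on its PUB socket.
    return msgpack.loads(payload, raw=False)


def read_one_pupil_datum(host="127.0.0.1", req_port=50020, timeout_ms=2000):
    ctx = zmq.Context.instance()

    # Ask Pupil Remote for the port of the PUB socket.
    req = ctx.socket(zmq.REQ)
    req.setsockopt(zmq.RCVTIMEO, timeout_ms)
    req.setsockopt(zmq.LINGER, 0)
    req.connect(f"tcp://{host}:{req_port}")
    req.send_string("SUB_PORT")
    sub_port = req.recv_string()

    # Subscribe to pupil data. This only yields messages while at
    # least one eye window is open and its video source is active.
    sub = ctx.socket(zmq.SUB)
    sub.setsockopt(zmq.RCVTIMEO, timeout_ms)
    sub.setsockopt(zmq.LINGER, 0)
    sub.connect(f"tcp://{host}:{sub_port}")
    sub.setsockopt_string(zmq.SUBSCRIBE, "pupil.")

    topic, payload = sub.recv_multipart()
    return topic.decode(), decode_payload(payload)


if __name__ == "__main__":
    try:
        topic, datum = read_one_pupil_datum()
        print(topic, datum.get("confidence"))
    except zmq.Again:
        print("No pupil data received - is an eye window open?")
```

If the call times out even with an eye window open, the Test Image source mentioned above is an easy way to generate (zero-confidence) data without a headset.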

user-a49a87 07 September, 2017, 14:23:29

Hi, I tried to follow the Pupil docs to install the Windows dependencies, but an error occurred when I tried to install msgpack_python: File "c:\python36\lib\site-packages\pip\compat\__init__.py", line 73, in console_to_str return s.decode(sys.stdout.encoding) UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 116: invalid start byte

wrp 07 September, 2017, 14:24:09

Hi @user-a49a87 - can I ask you a quick question; do you need to run from source?

wrp 07 September, 2017, 14:24:51

What features do you require from source?

user-a49a87 07 September, 2017, 14:25:37

I didn't manage to make things work with capture

wrp 07 September, 2017, 14:26:36

If you don't mind me asking, what are you trying to achieve? Perhaps there is an easy way to do so with a plugin or via the IPC backbone?

user-a49a87 07 September, 2017, 14:27:49

I try to get the eyetracking working to get the gaze info into Unity to be able to get eye-tracking in the HTC Vive

wrp 07 September, 2017, 14:28:26

@user-a49a87 in this case there is no necessity to build Pupil from source

papr 07 September, 2017, 14:28:26

You do not need to install the source requirements for that. You can simply use the bundle from our release page.

user-a49a87 07 September, 2017, 14:28:38

I try Capture, but it doesn't detect the pupils

papr 07 September, 2017, 14:29:15

@user-a49a87 @wrp maybe it would be better to move this discussion to the πŸ₯½ core-xr channel?

wrp 07 September, 2017, 14:29:48

Yes, πŸ₯½ core-xr is appropriate channel for this discussion. @user-a49a87 please migrate.

user-a49a87 07 September, 2017, 14:30:06

ok

user-ed537d 07 September, 2017, 18:03:50

Can anyone help with getting my time sync master to send commands to time sync follower that's running my capture software?

user-2798d6 07 September, 2017, 18:11:34

Is there a way to get a csv file or something similar that lists fixations and their durations? I've loaded the fixation plugin, but would like to have a file with all of that information if possible!

mpk 07 September, 2017, 18:58:34

Hi @user-2798d6 if you run the fixation detector in player and hit export (e) you should get such a csv file.

user-2798d6 07 September, 2017, 19:04:14

The 2D or 3D fixation detector?

mpk 07 September, 2017, 19:04:26

I think it should work for both.

user-2798d6 07 September, 2017, 19:05:16

Ok, lately when I've done that, the program has shut down

user-2798d6 07 September, 2017, 19:06:33

Should I try re downloading the program? Or is there another fix?

user-ed537d 07 September, 2017, 19:07:32

@mpk @wrp did you guys see my issue with pyuvc and libuvc?

user-ed537d 07 September, 2017, 19:07:41

@papr

user-a95584 07 September, 2017, 23:01:31

Hi all. Does anybody know the reference work for the 3D search part using the Kalman filter in the code? I am looking for papers and documents explaining the algorithms of Filtercircle3 and predictPupilState in EyemodelFitter.cpp.

user-5874a4 08 September, 2017, 05:50:41

@wrp @mpk questions migrated from the πŸ₯½ core-xr channel. I posted the calibration error as an issue on the GitHub page for Pupil Capture 0.9.14. They acknowledged the bug and closed it after fixing it. Here is the reference to that issue page: https://github.com/pupil-labs/pupil/issues/825. Does anyone know how to use the version with the fixed code? I see the main download still contains the faulty version.

wrp 08 September, 2017, 05:56:55

We can make a new Windows bundle today @user-5874a4

user-2798d6 08 September, 2017, 15:44:46

Hello! When I try to export after loading the 2D fixation visualizer, the Player program shuts down before it exports. Is there a fix for this? Or do I need to redownload the program?

mpk 08 September, 2017, 15:45:35

This might be a bug. Are you on the most recent version of Player?

user-2798d6 08 September, 2017, 15:46:33

I believe so, I'm on 9.13-48

mpk 08 September, 2017, 15:47:03

I think the latest is 9.14

user-2798d6 08 September, 2017, 15:47:51

Ok, I'll download that and try again. Thank you! Is there a way to get notification of new versions or should I just keep checking the website?

user-2798d6 08 September, 2017, 15:47:54

Thank you!

mpk 08 September, 2017, 15:48:26

We announce it in this chat, but I think we'll add an in-app notification soon

user-2798d6 08 September, 2017, 15:52:19

The new version doesn't shut down! Thank you for your help!

user-2798d6 08 September, 2017, 15:52:34

I'll make sure to keep a closer eye on the new releases

mpk 08 September, 2017, 15:54:31

Glad this was so quickly fixed:-). Have a good weekend!

user-6419ec 08 September, 2017, 16:06:15

Hi everyone, it's me again πŸ™ˆ I am trying to set the PUPIL EPOCH to the UNIX EPOCH because I have different measuring instruments. I found https://github.com/pupil-labs/pupil/issues/392 and thought that editing the lines (250-253) in pupil_remote.py with time.time() would fix this. However, because of the method get_timestamp() in world.py, this isn't that easy. Is there any way to implement the UNIX EPOCH? I know that there is also time_sync, but in my use case this isn't the best way. Thanks a lot

Chat image

user-6419ec 08 September, 2017, 16:06:26

Chat image

mpk 08 September, 2017, 16:08:56

I think my response to the issue is still current and should work fine: https://github.com/pupil-labs/pupil/issues/392

mpk 08 September, 2017, 16:09:22

basically set the offset to the difference of pupil time and your unix time.

mpk 08 September, 2017, 16:09:33

either with pupil remote or a small plugin.
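A small sketch of that suggestion via Pupil Remote rather than a plugin, assuming Capture runs locally with Pupil Remote on the default port 50020. It relies on Pupil Remote's `T <timestamp>` command, which resets Pupil time to the given value; timestamps produced afterwards are then approximately Unix-epoch based (transmission latency still adds a small offset).

```python
"""Sketch: align the Pupil clock with the Unix epoch via Pupil Remote.
Assumes Pupil Capture runs locally with Pupil Remote on port 50020."""
import time
import zmq


def make_set_time_cmd(unix_time: float) -> str:
    # Pupil Remote's "T <timestamp>" command resets Pupil time.
    return "T {:f}".format(unix_time)


def sync_pupil_to_unix(host="127.0.0.1", port=50020, timeout_ms=2000):
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.setsockopt(zmq.RCVTIMEO, timeout_ms)
    req.setsockopt(zmq.LINGER, 0)
    req.connect(f"tcp://{host}:{port}")
    req.send_string(make_set_time_cmd(time.time()))
    return req.recv_string()  # Capture acknowledges the command


if __name__ == "__main__":
    try:
        print(sync_pupil_to_unix())
    except zmq.Again:
        print("Pupil Remote not reachable")
```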

user-6419ec 08 September, 2017, 16:12:47

thanks, I will try. Nice weekend to you

mpk 08 September, 2017, 16:14:11

To you too. Once you have something to show, or questions, don't hesitate!

user-13c487 11 September, 2017, 20:29:30

Howdy, I can't get Pupil Capture v0.9.14-7 to detect the HTC Vive display in the Pupil Capture - World window. I get the following error when I launch Pupil Capture: [ERROR] video capture.uvc_backend: Init failed. Capture is started in ghost mode. No image will be supplied. Capture is detecting each individual eye camera but I cannot get the virtual environment as seen through the Vive to be a source for World.

papr 11 September, 2017, 20:30:53

@user-13c487 I do not think that this is possible. =/

papr 11 September, 2017, 20:32:25

The display is an output device. Capture looks for an input device. You would need to stream a single picture into Pupil Capture by using the NDSI protocol for example.

papr 11 September, 2017, 20:33:25

The Vive Display is actually two displays, one for each eye. You would need to unify these into one picture. That is what I meant by the "single picture"

user-13c487 11 September, 2017, 21:20:51

Makes sense about the Vive Display being two displays. I'd be interested in seeing how to get the Game window in Unity Editor to show up in World.

papr 11 September, 2017, 21:22:56

I think the usual workflow is the other way around: get the gaze data into Unity. You have everything there. Getting a single rendered image into the world process does not help you much.

user-13c487 11 September, 2017, 21:51:57

So how do I go about recording the virtual environment with the eye tracking overlay? The PupilGaze plugin has a "Recording" button that only works if a recording is in-progress in Pupil Capture. It creates a video file called "Unity_MainCamera.mp4" but it only shows the skybox and not the virtual environment I built.

papr 11 September, 2017, 22:07:12

Sorry I cannot help you with that. I have no experience with the unity setup.

papr 11 September, 2017, 22:07:46

Maybe some of the guys over at πŸ₯½ core-xr can help you out.

user-13c487 11 September, 2017, 22:07:53

Thanks.

user-2798d6 13 September, 2017, 02:18:49

I may just be missing this on the website, but how do these glasses track the eye and measure pupil size? For example, another tracker I was using used infrared and corneal reflection. Is that what is happening here or is it something different?

wrp 13 September, 2017, 02:21:52

Pupil captures eye video in IR

wrp 13 September, 2017, 02:22:50

@user-2798d6 Corneal reflections are not used by the pupil detection algorithms

user-2798d6 13 September, 2017, 02:23:24

Thanks! So it's just the IR?

wrp 13 September, 2017, 02:24:13

The algorithm just needs IR video of the eye region, yes

user-2798d6 13 September, 2017, 02:25:34

Thanks! Trying to make sure I'm explaining correctly to my advisor

wrp 13 September, 2017, 02:25:59

I can send you a link to an old (2014) technical report that provides an overview of the 2d detection steps

wrp 13 September, 2017, 02:26:06

One moment

user-2798d6 13 September, 2017, 02:26:11

sure! that would be great!

wrp 13 September, 2017, 02:27:37

https://docs.pupil-labs.com/#ubicomp-2014-paper

wrp 13 September, 2017, 02:27:51

Please see the paper linked via the docs link above

wrp 13 September, 2017, 02:28:26

Note - this paper is from 2014 and does not include information about the 3d detection and mapping algorithm

user-a49a87 13 September, 2017, 13:23:00

Hi, I can't manage to have the two cams working at the same time

user-a49a87 13 September, 2017, 13:25:06

In the terminal, it says: video_capture.uvc_backend: Capture failed to provide frames. Attempting to reinit.

user-a49a87 13 September, 2017, 13:26:08

one cam is working, then glitches, then the other works but the first one fails

papr 13 September, 2017, 13:26:46

@user-a49a87 I have a few questions for you. Which Capture version do you use? Which OS do you use? Did it work before? When did you buy the headset? Did you try to disconnect the headset for multiple seconds? Is it always the same camera that fails? πŸ˜ƒ

user-a49a87 13 September, 2017, 13:30:44

Pupil 0.9.14-7 on Win10. It was working last week, but I already had this problem with my previous computer. I bought the cams to add to an HTC Vive. No, I didn't try to disconnect the cams. It is always the same: cam 1 is working and cam 0 isn't, I get some glitches, then cam 0 is working and cam 1 isn't (there is always only one cam working at a time)

user-a49a87 13 September, 2017, 13:31:47

if it might help, here is a screenshot of the terminal

user-a49a87 13 September, 2017, 13:32:11

Chat image

user-a49a87 13 September, 2017, 13:35:39

Now I tried to disconnect the cams and the problem is still there

wrp 13 September, 2017, 13:36:47

Hey @user-a49a87 please send an email to info@pupil-labs.com with the order_id associated with this hardware and we can diagnose further and determine if we need to initiate a repair/replacement.

papr 13 September, 2017, 13:39:40

@user-a49a87 are you connecting the headset to a usb 2 or 3 port?

user-a49a87 13 September, 2017, 13:40:48

USB 3

papr 13 September, 2017, 13:59:36

Ok, then please follow @wrp 's instructions πŸ˜ƒ

wrp 13 September, 2017, 13:59:55

@user-a49a87 and I are already in contact via email - thanks!

user-2798d6 14 September, 2017, 00:22:36

Hello - My audio is not merging with the video when I export from Player. The audio file will play, so I know it was recorded; it just isn't being included in the video export.

user-2798d6 14 September, 2017, 00:53:25

Also, what is the green circle around the eye that appears in addition to the red circle and dot around the pupil? Is there a way to adjust that or is it an automatic process?

user-2798d6 14 September, 2017, 01:24:05

One more issue I'm running into tonight - when I drop a recording into Player it says "There is no gaze data for this video. Aborting." This is happening even with videos that used to have gaze data and would show gaze visualizations with a plugin. I tried re-downloading all of the software, but it's still happening.

wrp 14 September, 2017, 01:25:45

@user-2798d6 the green circle is a visualization of your eyeball as estimated by the 3d detector.

wrp 14 September, 2017, 01:26:25

@user-2798d6 are you supplying the correct directory to Pupil Player?

user-2798d6 14 September, 2017, 01:26:41

I've tried several different three-digit folders from various recordings

wrp 14 September, 2017, 01:26:49

@user-2798d6 re audio - what OS and version are you using?

user-2798d6 14 September, 2017, 01:27:21

OS Sierra 10.12.6

wrp 14 September, 2017, 01:28:04

@user-2798d6 please reset pupil player settings. Delete the directory named pupil_player_settings in your user directory

wrp 14 September, 2017, 01:28:12

And try opening the files again

user-2798d6 14 September, 2017, 01:29:25

That worked! I'm curious - what did that do and why did that work?!

wrp 14 September, 2017, 01:29:56

Likely you had set pupil player to use offline calibration

wrp 14 September, 2017, 01:30:09

You can switch back to using the calibration from the recording

user-2798d6 14 September, 2017, 01:30:59

Oh you are so right!

wrp 14 September, 2017, 01:32:49

@user-2798d6 I will try to reproduce audio muxing behavior you noted and get back to you a bit later

user-2798d6 14 September, 2017, 01:33:24

ok, thank you!

wrp 14 September, 2017, 08:22:17

I just wanted to follow up and report that @user-a49a87's issue was a driver issue. To all those on Windows - if you are experiencing issues with Pupil cameras not being accessible in Pupil Capture (and you recently updated Windows), please open Device Manager, go to View > Show Hidden Devices, and delete all drivers for Pupil Cam - especially those listed under the Imaging Devices group. This resolved @user-a49a87's issue.

user-72b0ef 14 September, 2017, 09:27:05

Hey everyone, I was wondering if Pupil makes an options/init file after running for the first time that is saved somewhere other than the folders pupil_capture_v, pupil_player_v or pupil_service_v.

user-a49a87 14 September, 2017, 11:26:12

@wrp I got the same problem as before, but I managed to solve it again by running as administrator

user-a49a87 14 September, 2017, 11:26:28

Is it a known issue?

papr 14 September, 2017, 11:28:23

@user-72b0ef _v? It should be *_user_settings

wrp 14 September, 2017, 11:35:39

@user-a49a87 this is not common behavior

wrp 14 September, 2017, 11:36:25

I will try to recreate this - but not sure that I will be able to do so as it may be related to your system setup.

mpk 14 September, 2017, 11:40:53

@papr @user-72b0ef you cannot just transfer the settings file. They are not meant to be portable. It may work between Service and Capture, but that's really a hack solution.

mpk 14 September, 2017, 11:41:30

however you can move the eye settings files. Those should be portable between service<->capture.

user-a49a87 14 September, 2017, 11:46:22

I have 2 users on my laptop. To avoid any problems, I used the administrator account while working with you. But I usually work on another user account, and when I tried again with my non-admin user after you "left", the bug was still there. But running as administrator makes the bug disappear

user-a49a87 14 September, 2017, 11:47:35

Is there a way to store a config file somewhere to set the recording folder and other settings in a text file?

user-72b0ef 14 September, 2017, 12:15:12

@mpk @papr I'm not really trying to transfer the file from Capture to Service or anything like that. I'm trying to find the settings file so that I can transfer it from a laptop to a PC, because in between tests Pupil went goofy on me (I didn't change any settings in Pupil). One eye started returning only the value infinity, and the other "0.2". I got Pupil on another laptop and it worked fine there; the correct info is being sent again (coordinates of where the eye is looking), which shows that it's not a hardware issue or an issue with my project itself. But after deleting Pupil from the PC and unpacking it again from the .rar file, it has still kept its settings from before (I already tried re-installing the cam drivers as well).

papr 14 September, 2017, 12:16:23

Just delete the settings folder to test if the issue persists

user-a49a87 14 September, 2017, 12:35:25

Hi, is there a way to store the path of my recordings? When I play back the captured pupil videos to see if the algorithms are correctly finding the pupil+gaze, I have to type the path of the recording again (and it's quite frustrating)

papr 14 September, 2017, 12:36:59

@user-a49a87 you can start the recording via a notification that includes the session name (which can be a path). You send notifications via Pupil Remote or from within a plugin

papr 14 September, 2017, 12:37:46

If you have experiment software, it can trigger the start of the recording as well as specify where to store it, using this notification
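To make the notification route concrete, here is a sketch following the pupil-helpers pattern, assuming Pupil Remote listens on the default port 50020: a notification is a dict with a `subject` key, sent as a two-frame message whose topic is `notify.<subject>`. The session name used below is a made-up example.

```python
"""Sketch: start a Pupil Capture recording with a custom session name
by sending a 'recording.should_start' notification via Pupil Remote."""
import msgpack
import zmq


def notification_frames(note: dict):
    # Notifications travel as two frames: the topic string
    # 'notify.<subject>' and the msgpack-encoded payload.
    topic = "notify." + note["subject"]
    return [topic.encode(), msgpack.dumps(note, use_bin_type=True)]


def start_recording(session_name, host="127.0.0.1", port=50020):
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.setsockopt(zmq.RCVTIMEO, 2000)
    req.setsockopt(zmq.LINGER, 0)
    req.connect(f"tcp://{host}:{port}")
    note = {"subject": "recording.should_start",
            "session_name": session_name}  # may also be a path
    req.send_multipart(notification_frames(note))
    return req.recv_string()  # Pupil Remote acknowledges the notification


if __name__ == "__main__":
    try:
        print(start_recording("my_experiment/trial_001"))  # example name
    except zmq.Again:
        print("Pupil Remote not reachable")
```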

user-a49a87 14 September, 2017, 12:47:17

I don't understand your answer, so I guess that my question wasn't clear ^^ I just got the cams working today and I am trying to see if the algorithm can track my pupils. I launch pupil_capture.exe and record my two eyes to file to be able to replay the videos as sources for the algorithm. The question was: when I switch one of the eyes in pupil_capture.exe from Local USB to Video File Source, I have to type the path of the recordings; is there a way to save this path?

user-a49a87 14 September, 2017, 14:37:15

Hi, I can't manage to make the pupil detection work. Here is a screenshot of the "algorithm mode" view

user-a49a87 14 September, 2017, 14:37:26

Chat image

papr 14 September, 2017, 14:37:48

Ah I understand now. After selecting Video File Source as manager you should be able to drag and drop the videos on top of the eye windows.

user-a49a87 14 September, 2017, 14:38:09

great :D Thx

papr 14 September, 2017, 14:38:10

Please move the ROI such that it does not include the eye lashes

papr 14 September, 2017, 14:38:27

and increase the maximum pupil size

papr 14 September, 2017, 14:38:38

these are a pair of huge pupils πŸ˜„

user-a49a87 14 September, 2017, 14:38:53

the green square seems too small, but I didn't find a way to increase their size

user-a49a87 14 September, 2017, 14:39:45

the pupils are big because it's inside the HTC Vive and I have no images atm (I am testing the eye tracking first)

papr 14 September, 2017, 14:41:25

As I said, try to change the ROI to not include eye lashes. You can also simply show the Steam VR environment to reduce pupil size

user-a49a87 14 September, 2017, 14:55:27

It works better, but it is not stable, as soon as I look at the edge of the "screen", the pupil is lost

papr 14 September, 2017, 14:56:03

Could you post a screenshot of that?

user-a49a87 14 September, 2017, 14:58:07

Chat image

papr 14 September, 2017, 14:58:55

The reason why the algorithm fails is because of the bad contrast

user-a49a87 14 September, 2017, 14:59:18

OK, what should I do?

papr 14 September, 2017, 14:59:55

Could you try to use the distance slider of the Vive and move the eye cameras a bit further away?

papr 14 September, 2017, 15:01:09

The IR light needs to be able to spread evenly

papr 14 September, 2017, 15:01:46

If this does not work, try to increase the aperture

user-a49a87 14 September, 2017, 15:05:15

here are the values for the aperture

user-a49a87 14 September, 2017, 15:05:23

Chat image

papr 14 September, 2017, 15:05:45

Try to increase exposure time

papr 14 September, 2017, 15:06:37

be aware that very high exposure times can result in problems for the camera. In this case it will turn itself off.

user-a49a87 14 September, 2017, 15:07:43

Yes, when I increase the exposure time, it gets lighter and then suddenly darker

papr 14 September, 2017, 15:08:03

try to find a value where it is lighter

papr 14 September, 2017, 15:08:15

e.g. 75 should work

user-a49a87 14 September, 2017, 15:20:19

Chat image

user-a49a87 14 September, 2017, 15:20:28

it seems much better

user-a49a87 14 September, 2017, 15:20:50

Do you have any other advice?

papr 14 September, 2017, 15:22:58

Not at the moment. Looks very good! It is really bright though. But as long as it works ✨

user-a49a87 14 September, 2017, 15:24:18

I can decrease the aperture... You suggested I make it brighter... I did so

user-a49a87 14 September, 2017, 15:24:31

Do you think I should decrease a bit?

papr 14 September, 2017, 15:25:40

Yes. But the exact values always depend on your setup. Try different values and look how robust the detection is

papr 14 September, 2017, 15:25:57

The goal is to get a strong contrast between iris and pupil

user-a49a87 14 September, 2017, 15:26:39

Ok, thanks so much... Next step, get it to work with Unity

papr 14 September, 2017, 15:29:51

hehe Good luck with that. πŸ˜ƒ Keep an eye on the Unity plugin repo. There should be improvements coming soon πŸ˜ƒ

mpk 14 September, 2017, 16:22:28

@Djinn#0083 one last thing to improve is to turn the lenses a little bit until the focus is on the pupil and not the eyelashes.

user-2798d6 14 September, 2017, 16:25:47

I don't see @wrp signed in, but I wanted to check in again about audio issues. I am getting an audio file that works after using Capture, but the audio isn't merging with the video during export in Player. I'm on Mac OS Sierra 10.12.6.

papr 14 September, 2017, 16:30:57

@user-2798d6 I guess you already made sure to use the most recent version of Pupil Player?

user-2798d6 14 September, 2017, 16:39:34

I did - I re-downloaded last night, but it's still happening

papr 14 September, 2017, 16:40:57

ok. Then please have a bit of patience until we are able to reproduce the issue.

user-2798d6 14 September, 2017, 19:42:23

I had asked earlier about audio, and I am fine being patient while you all try to recreate the issue! But may I check my understanding? Is the audio supposed to merge with the video file during export, or does that still need to be done manually?

papr 14 September, 2017, 19:44:58

It is supposed to merge during export

papr 14 September, 2017, 19:45:10

In the newest version

user-2798d6 14 September, 2017, 19:51:44

Ok, just wanted to make sure I wasn't trying to do something it wasn't supposed to do. I'll be looking forward to your answer and I appreciate your help!

papr 14 September, 2017, 19:52:28

πŸ‘

user-e04f56 15 September, 2017, 14:40:13

Hello there

wrp 20 September, 2017, 03:18:12

[email removed]

user-cf2773 21 September, 2017, 10:36:24

Hello everyone! I am having problems in loading the pupil_data pickle in python: it raises me the error "ValueError: unregistered extension code 28847" in both python 2 and 3. Does any of you know how to solve this issue?

user-cf2773 21 September, 2017, 10:39:53

Oh right, I have to use msgpack. Thanks a lot @wrp!
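For reference, a minimal sketch of the fix discussed above: recent Pupil versions serialize pupil_data with msgpack rather than pickle, which is why pickle.load() fails with the "unregistered extension code" error. The key names in the sample dict below are made up for illustration and do not claim to match the real pupil_data layout.

```python
# Hedged sketch: load a msgpack-serialized pupil_data file.
# Assumes the file contains a single msgpack-encoded dict.
import msgpack

def load_pupil_data(path):
    """Deserialize a msgpack-encoded pupil_data file into a Python dict."""
    with open(path, "rb") as f:
        # raw=False decodes keys/strings to str (msgpack-python >= 0.5.2)
        return msgpack.unpack(f, raw=False)

if __name__ == "__main__":
    import os
    import tempfile

    # Write a tiny msgpack file that mimics the idea (hypothetical keys).
    sample = {"pupil_positions": [{"timestamp": 1.0, "norm_pos": [0.5, 0.5]}]}
    fd, path = tempfile.mkstemp()
    with os.fdopen(fd, "wb") as f:
        f.write(msgpack.packb(sample, use_bin_type=True))

    data = load_pupil_data(path)
    assert data["pupil_positions"][0]["norm_pos"] == [0.5, 0.5]
    os.remove(path)
```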

wrp 21 September, 2017, 10:41:47

Welcome @user-cf2773

user-ed537d 21 September, 2017, 21:08:47

how do you get two Pupils to time sync? I try changing the name from clock master to clock follower and it just reverts back to master. I also tried changing the bias and that doesn't work either

papr 21 September, 2017, 21:14:30

@user-ed537d Do both instances see each other?

papr 21 September, 2017, 21:14:36

They should list each other

user-ed537d 22 September, 2017, 00:28:32

they don't

user-ed537d 22 September, 2017, 00:28:52

i'm unsure why they don't @papr how do i need to set it up?

papr 22 September, 2017, 05:32:08

Same wifi, same time sync group, same time sync protocol number. Do you use 0.9.14?

user-ed537d 22 September, 2017, 05:33:25

Yeah I believe so I'll check tomorrow morning. Do I need to change the node info?

user-ed537d 22 September, 2017, 05:34:05

Also how do I get the network time sync in pupil helpers to be the master?

papr 22 September, 2017, 05:36:58

Do you mean the bias? No, only if you want to control which of the Pupil capture instances is the clock master.

user-ed537d 22 September, 2017, 05:38:16

Yeah it wasn't working. What I'm trying to do is sync up the neural timestamps we get to the pupil. I'm having problems even getting the pupil helpers time sync script to even send a time to pupil capture. Any advice/direction?

papr 22 September, 2017, 05:43:19

So wait, I thought you were using two Pupil capture instances? What is your exact setup? If you are using the helpers, do the Pupil Remote ones work?

papr 22 September, 2017, 05:46:30

In case that you want to have a bit more insight on how the time sync works under the hood : https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/time_sync_spec.md
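As a quick sanity check that the Pupil Remote side is reachable at all: the helper scripts talk to it over a plain zmq REQ/REP socket, and sending the string 't' returns Capture's current clock as text (the default port, 50020, is shown in Capture's Pupil Remote menu). The sketch below is standalone, so a mock REP socket stands in for a running Capture instance; against the real thing you would only keep `get_pupil_time`.

```python
# Hedged sketch of querying Pupil time via Pupil Remote (zmq REQ/REP).
import threading

import zmq  # pyzmq

def get_pupil_time(addr="tcp://127.0.0.1:50020"):
    """Ask Pupil Remote for its current clock ('t' command) and parse the reply."""
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect(addr)
    req.send_string("t")          # 't' asks Pupil Remote for its clock
    reply = req.recv_string()
    req.close()
    return float(reply)

def mock_capture(addr="tcp://127.0.0.1:50020"):
    """Stand-in for Pupil Capture: answers a single 't' request with a fake clock."""
    ctx = zmq.Context.instance()
    rep = ctx.socket(zmq.REP)
    rep.bind(addr)
    if rep.recv_string() == "t":
        rep.send_string("1234.5678")
    rep.close()

if __name__ == "__main__":
    server = threading.Thread(target=mock_capture)
    server.start()
    print(get_pupil_time())  # with the mock: 1234.5678; with Capture: the Pupil clock
    server.join()
```

If this round trip works but the two Capture instances still do not list each other, the problem is in the time sync discovery (wifi/group/protocol number), not in zmq itself.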

user-d7b89d 25 September, 2017, 08:56:23

Hi everybody, has someone used Pupil Labs at night, especially in the car / while driving?

mpk 25 September, 2017, 08:57:12

@user-d7b89d don't think so... Pupil dilation may become a problem... But I think you can make it work. We have used Pupil in flight simulators and those are quite dark as well.

user-d7b89d 25 September, 2017, 08:58:11

sounds good, do you have some recommendation for the settings?

mpk 25 September, 2017, 08:58:26

Pupil min and max. also ROI.

user-d7b89d 25 September, 2017, 09:04:47

thanks, I'll try this

user-29e10a 26 September, 2017, 09:50:14

Hi, we're using uncalibrated pupil positions. So we get in pupil_data the circle3d_norm_pos. Works great so far, but is there a possibility to calculate angles from this easily? I know there is a theta and phi in the data, but I haven't found any docs about them. And the "center" of the eye should be zero angles, shouldn't it? ... Secondly, can you name the torsional rotation of the Vive cameras? So the rotation around the line of sight. It is something between 20 and 30 degrees, but I would like to know the exact amount. Best wishes, Matthias

papr 26 September, 2017, 09:52:55

Theta and Phi are the polar coordinates of the pupil on the 3d model

papr 26 September, 2017, 09:54:38

IIRC the convention is that the 'looking straight forward' eye position does not correspond to the zero angle. This is mostly to prevent zero crossings in the data.

papr 26 September, 2017, 09:55:27

In tests, one of the two values ranged from 0-90 degrees and the other from 180-360 degrees

user-29e10a 26 September, 2017, 09:56:02

ah ok, sounds like a tangent convention πŸ˜ƒ

user-29e10a 26 September, 2017, 09:56:28

so phi is the "left/right" and theta the "up/down" angle?

papr 26 September, 2017, 10:00:39

not sure. I am not even sure if it is consistent with both eyes since one camera is rotated. You should be able to infer the correlation using two 2d norm_pos entries

papr 26 September, 2017, 10:15:11

@user-29e10a This is not important if you are only interested in the angle difference between two pupil positions. Simply calculate the Euclidean distance in theta-phi space to get the angle difference. E.g., the new fixation detector makes use of this.
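The suggestion above can be sketched in a few lines. This assumes theta and phi are in radians (as polar coordinates on the 3d eye model) and that the movement stays small and away from the poles of the spherical parametrization, where Euclidean distance in angle space stops approximating the true angle.

```python
# Hedged sketch: approximate angular difference between two pupil positions
# via the Euclidean distance in theta-phi space (field names as in the
# pupil_data 3d datums; exact semantics of theta/phi are an assumption).
import math

def angular_difference(p1, p2):
    """Approximate angle (radians) between two pupil datums with 'theta'/'phi' keys."""
    d_theta = p1["theta"] - p2["theta"]
    d_phi = p1["phi"] - p2["phi"]
    return math.hypot(d_theta, d_phi)

a = {"theta": 1.20, "phi": 3.10}
b = {"theta": 1.20, "phi": 3.15}  # motion along a single angle only
print(math.degrees(angular_difference(a, b)))  # ~2.86 degrees
```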

user-29e10a 26 September, 2017, 11:57:04

right, the absolute value is not important for us, but a true horizontal motion of the pupil should only be seen in the phi angle. If I calculate the Euclidean distance between the two directions, in my opinion I get an equipotential line around the center, and therefore I'm not sure "how" the pupil has moved, only how far. So it would be nice to rotate the coordinate system as the cameras are rotated... does that make sense? πŸ˜ƒ

user-29e10a 26 September, 2017, 12:05:21

another question ... we have a Microsoft LifeCam 3000 HD and we would like to use this as a world cam. It is definitely UVC compatible and works with libUSBk drivers. But Pupil Capture tells me the device is blocked or not available. Is there a way to make this run? We are working under Windows 10, latest version.

papr 26 September, 2017, 12:07:55

I think the polar coordinate system is actually fixed to the camera coordinate system

user-29e10a 26 September, 2017, 12:09:48

yes, there are, but it would be no problem to apply some coordinate system rotation. But what's the exact angle? And is the camera rotated in the theta direction, too?

papr 26 September, 2017, 12:09:54

I think the driver has some kind of internal list of USB ids that it assumes to be compatible. Currently this list only includes Pupil Labs cameras. @wrp should be able to answer this in more detail

mpk 26 September, 2017, 12:11:56

@user-29e10a the rotation/translation of each camera is estimated during calibration. You can extract this data if you modify the source.

mpk 26 September, 2017, 12:12:23

the eye pos and gaze normal for each eye (in world coordinates) are also available in the gaze datum.

wrp 26 September, 2017, 12:13:45

@papr is correct re the windows driver. It has a list of PID and VIDs for Pupil Labs hardware (cameras)

wrp 26 September, 2017, 12:14:33

@user-29e10a you would need to install libusbk drivers for this camera manually

user-29e10a 26 September, 2017, 12:14:34

ok, then I will search for some calibration matrices to "post-calibrate" my data

wrp 26 September, 2017, 12:15:04

You can use a tool like Zadig to install libusbK drivers for your device

user-29e10a 26 September, 2017, 12:15:06

@wrp can I modify the list? the driver for libusbk is already installed and working in other applications

user-29e10a 26 September, 2017, 12:15:09

yep

user-f1d099 26 September, 2017, 18:23:08

Hi, I'm currently attempting to replace the test image capture with footage I captured using OBS for a project in Unity, and I'm finding it difficult to sync up the recordings in Pupil Player. Currently I'm trying to match the capture fps to that of the eye cameras. I'm wondering if there is a simpler and more elegant solution to what I am trying to do.

user-7323ec 26 September, 2017, 21:58:49

Hi, I am wondering if it is possible to record from more than one eye tracker on the same laptop, while having both trackers plugged into the laptop's USB drive? Thanks!

user-7323ec 26 September, 2017, 21:59:30

I mean USB ports. Thanks

user-7fea1c 26 September, 2017, 23:35:50

Hi Everyone, anyone here officially from Pupil? Been trying to get in touch with the company for a couple weeks now, and haven't gotten any replies on info@pupil-labs.com

wrp 27 September, 2017, 01:04:01

Hi @user-7fea1c apologies if we have not replied. We usually are quite quick with responses. I will follow up via PM/Email

user-7fea1c 27 September, 2017, 01:05:58

please follow up via email. Lets set up a call if possible

wrp 27 September, 2017, 01:18:16

@user-7323ec this is possible if your computer has more than one USB bus. You could also try reducing the resolution of the world and eye cameras. Please check out the docs for more on this topic here: https://docs.pupil-labs.com/#usb-bandwidth-and-synchronization

user-3902f9 28 September, 2017, 08:16:01

what's the quickest way to get my pupil-labs vive insert working in unity?

user-3902f9 28 September, 2017, 08:16:08

any sample unity projects?

mpk 28 September, 2017, 13:01:21

@cmanders#0602 I would recommend asking this in hmd-eyes; the devs can point you to a Unity3d example.

user-2798d6 28 September, 2017, 16:23:03

Hello! Is there a way to get data about saccade lengths throughout a recording? I'm able to get a csv file of fixation durations, but would like to get saccade lengths as well if possible. Thank you!

papr 28 September, 2017, 16:30:56

We are working on such a feature

user-2798d6 28 September, 2017, 16:46:54

Oh great! Thank you!

user-2798d6 28 September, 2017, 18:36:49

re: the saccade feature - will the data also tell the direction of the saccades in addition to the length?

papr 28 September, 2017, 18:41:46

Gaze positions that belong to a saccade are marked as such. You have all information available

user-893cb5 28 September, 2017, 18:48:02

Hey! Thinking about picking up one of the HMD add-ons to try out. How reliable is the calibration? Does it typically work every time? Having some issues with tracking calibration with FOVE HMD I've been using

papr 28 September, 2017, 19:27:11

Calibration is stable if pupil detection is good and the subject is good at fixating the markers :)

mpk 28 September, 2017, 19:30:03

@user-893cb5 to elaborate on @papr: I have not tried FOVE. With Pupil you have full access to the pipeline. If calibration should fail, you can check and find out why. Generally, all that's needed is good pupil detection, and you can see that in the eye video stream debug monitor.

mpk 28 September, 2017, 19:31:38

We are actively developing methods and tools. The unity integration is getting a refactor right now and we will add a new eye tracking algorithm that extends our current 3d model.

user-6419ec 29 September, 2017, 14:01:45

hi there, is there any possibility to get something like fixation_on_srf, analogous to gaze_on_srf, in realtime via pupil remote (zmq)? I thought about using the position, but I think the position from subscribing to "fixations" (pupil remote) is not relative to the surface.
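For the surface-relative part of this question: Pupil Capture publishes surface-mapped data on the 'surfaces' topic of its IPC backbone as msgpack-encoded dicts. The field names below ('gaze_on_srf', 'on_srf', 'norm_pos') follow the surface tracker's datum format at the time, but treat them as assumptions and inspect a live message before relying on them. A crafted payload stands in here for one received via a zmq SUB socket, so the sketch runs standalone.

```python
# Hedged sketch: extract surface-relative gaze from a 'surfaces' datum.
import msgpack

def gaze_inside_surface(surface_datum):
    """Return norm_pos of all gaze points that actually hit the surface."""
    return [g["norm_pos"] for g in surface_datum.get("gaze_on_srf", [])
            if g.get("on_srf")]

# Crafted message mimicking one surface datum off the IPC backbone:
payload = msgpack.packb(
    {"name": "screen",
     "gaze_on_srf": [{"norm_pos": [0.4, 0.6], "on_srf": True},
                     {"norm_pos": [1.3, 0.2], "on_srf": False}]},
    use_bin_type=True)
datum = msgpack.unpackb(payload, raw=False)
print(gaze_inside_surface(datum))  # [[0.4, 0.6]]
```

A fixation_on_srf equivalent would then amount to mapping a fixation's norm_pos through the same surface homography that produces gaze_on_srf; this did not ship as a realtime topic at the time of this exchange.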

user-29e10a 29 September, 2017, 14:27:58

Short question: Is there a time frame when we will be able to record audio in pupil capture in windows? πŸ˜ƒ

user-fe23df 30 September, 2017, 16:30:37

@papr hi, why does the 3d fixation detection plugin only give fixations for individual eyes and nothing on the cyclopean eye? Is there a way to get cyclopean fixations? Thanks a lot! πŸ˜ƒ

papr 30 September, 2017, 16:36:21

In the upcoming version it will use cyclopean fixations if no 3d data is available. It then uses the camera model to distort the 2d norm gaze positions. This assumes a valid calibration

user-fe23df 30 September, 2017, 16:38:04

yeah my problem right now is I need cyclopean fixations and the fixations on left and right eye are not synchronized, so I haven't figured out how to use this data

papr 30 September, 2017, 16:39:49

You could adapt the code such that it calculates fixations for both eyes separately instead of the one with more data.

user-fe23df 30 September, 2017, 16:40:06

oh interesting

user-fe23df 30 September, 2017, 16:40:48

do you mind showing me how/where I can do this? I haven't played around with the python code yet

papr 30 September, 2017, 16:41:42

In the fixation detector file in the shared modules folder.

papr 30 September, 2017, 16:42:06

The bottom class should be the online detector

user-fe23df 30 September, 2017, 16:42:29

oh it's interesting that I can't find the shared modules folder on our lab's computer

user-fe23df 30 September, 2017, 16:42:53

we just downloaded the three packages (service, capture and player)

papr 30 September, 2017, 16:43:32

Yeah, the bundles are a different story...

papr 30 September, 2017, 16:43:44

On which platform do you work on?

user-fe23df 30 September, 2017, 16:43:53

I work on windows 10

papr 30 September, 2017, 16:45:30

Oh mmh. Modifying code on windows is a bit more difficult. If it is not too urgent, I will implement this before the next release.

user-fe23df 30 September, 2017, 16:45:55

oh, okay, thanks

user-fe23df 30 September, 2017, 16:46:08

when will the next release be?

papr 30 September, 2017, 16:46:10

Else you will have to go through the instructions on how to run the application from source on windows

user-fe23df 30 September, 2017, 16:46:27

I'm doing data analysis on our pilot testing batch

papr 30 September, 2017, 16:46:28

My guess is 2-3 weeks.

user-fe23df 30 September, 2017, 16:46:57

and the data is somewhat useless right now for me because we can't get cyclopean fixations

papr 30 September, 2017, 16:47:26

So you need it for interaction?

user-fe23df 30 September, 2017, 16:48:08

yes, the original plan is to get the hit data from Unity, but that seems not too reliable

user-fe23df 30 September, 2017, 16:48:18

so now we switch to fixations and gazes

user-fe23df 30 September, 2017, 16:48:37

to get the fixations and then see the 3d gaze points of gazes that lie within that fixation

user-fe23df 30 September, 2017, 16:49:42

and will it be possible to get the positions of the cyclopean fixation's gaze points in the next release?

papr 30 September, 2017, 16:51:55

Would it be enough for you to know the gaze positions that belong to a fixation?

papr 30 September, 2017, 16:52:38

Their norm positions are relative to the world camera. That would be your cyclopean eye

user-fe23df 30 September, 2017, 16:53:41

the norm positions of the fixations?

user-fe23df 30 September, 2017, 16:53:50

or of the gazes?

user-fe23df 30 September, 2017, 16:54:11

for both, there's only norm x and norm y though, so we're missing the z dimension

papr 30 September, 2017, 16:54:26

Both. The fixation norm pos is just the mean of all its base gaze positions
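The relationship described above is simple enough to apply directly in offline analysis: average the norm_pos of a fixation's base gaze positions. The 'norm_pos' field name follows the pupil_data export format; the sample values are invented for illustration.

```python
# Hedged sketch: a fixation's norm_pos as the mean of its base gaze positions.
def fixation_norm_pos(base_gaze):
    """Mean norm_pos over a list of gaze datums with a 'norm_pos' [x, y] entry."""
    xs = [g["norm_pos"][0] for g in base_gaze]
    ys = [g["norm_pos"][1] for g in base_gaze]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

gaze = [{"norm_pos": [0.40, 0.52]},
        {"norm_pos": [0.42, 0.50]},
        {"norm_pos": [0.44, 0.48]}]
print(fixation_norm_pos(gaze))  # ~(0.42, 0.50)
```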

user-fe23df 30 September, 2017, 16:55:09

I thought I could use "gaze_point_3d_x" (and y, z) as the position of where the person looked at in world camera, no?

user-fe23df 30 September, 2017, 16:57:38

we don't have a recording of world camera (because the frame rate is way too low for VR rendering), that's why there will be no world camera picture for reference (and therefore having only norm_pos_x and norm_pos_y is not sufficient)

papr 30 September, 2017, 17:00:40

I understand. I will think about how to implement this best.

user-fe23df 30 September, 2017, 17:00:49

thank you very much!

user-fe23df 30 September, 2017, 17:02:18

can you please show me the instructions on running the application from source?

user-fe23df 30 September, 2017, 17:02:24

I would like to try it out if possible

papr 30 September, 2017, 17:14:32

https://docs.pupil-labs.com/#windows-dependencies

papr 30 September, 2017, 17:14:38

Good luck :)

user-fe23df 30 September, 2017, 17:14:41

thanks! πŸ˜ƒ

papr 30 September, 2017, 17:15:23

I will be afk for now but will occasionally check in case that you have questions :)

user-fe23df 30 September, 2017, 17:16:21

thank you so much πŸ˜„

user-fe23df 30 September, 2017, 17:16:56

I should go now and have dinner too so you won't need to check for my questions over the next few hours :))

End of September archive