core


user-f1eba3 01 May, 2018, 00:03:21

More or less. It is still in the works. Basically, data can be received from Pupil Service and a very basic raycast can be done. Now I am struggling to make a scene similar to hmd-eyes in order to calibrate the device

user-3d9cdd 01 May, 2018, 00:06:42

Would you mind sending me what you have?

user-4fbb59 01 May, 2018, 05:40:58

Hi everyone, I just received my pupil in the mail and I'm trying to run Pupil Capture but nothing seems to work. My device is correctly plugged in, but it tells me that "eye0 init failed. Capture is started in Ghost Mode." and all I see is a grey screen. Has anyone seen anything similar to this happen?

Also, I'm wondering about surface tracking: how big do the surface markers have to be for Pupil to detect them? Can I go as small as 1 inch by 1 inch, for example? And what kind of coordinates does the surface tracking functionality return? Is it a pixel offset from the top left of the surface, or does it know absolute distance?

Thanks for the help! This device is so cool and I hope to get started with it successfully soon 😃

user-4fbb59 01 May, 2018, 05:41:31

Here's the image of what I see when I open pupil capture

Chat image

wrp 01 May, 2018, 05:42:31

@user-4fbb59 what operating system are you using?

user-4fbb59 01 May, 2018, 05:42:51

I'm using a macOS High Sierra

wrp 01 May, 2018, 05:43:21

Pupil Capture version is v1.6.x?

user-4fbb59 01 May, 2018, 05:43:40

Yup, 1.6.14 to be exact

wrp 01 May, 2018, 05:43:50

Please try restarting with default settings: General > Restart with default settings

user-4fbb59 01 May, 2018, 05:44:02

Yeah, I gave that a try but it still does the same thing

wrp 01 May, 2018, 05:44:14

Are there any other Pupil applications open like Pupil Service?

user-4fbb59 01 May, 2018, 05:44:32

No

user-4fbb59 01 May, 2018, 05:44:38

though I could restart my computer to see if that helps?

wrp 01 May, 2018, 05:45:05

@user-4fbb59 do you see any world video?

user-4fbb59 01 May, 2018, 05:45:19

if by that you mean the pupil's video feed, then no

wrp 01 May, 2018, 05:45:22

Are the cables connected fully?

user-4fbb59 01 May, 2018, 05:45:22

I just see a grey screen

user-4fbb59 01 May, 2018, 05:45:26

I believe so

wrp 01 May, 2018, 05:45:41

You may need to push the USB-C connector further into the cable clip on the headset

wrp 01 May, 2018, 05:45:50

it may require a firm push

user-4fbb59 01 May, 2018, 05:46:06

ahh! yes that did it.

user-4fbb59 01 May, 2018, 05:46:09

I was afraid to break the device haha

wrp 01 May, 2018, 05:46:35

The USB-C connector is a bit stiffer/tighter than other USB connectors

user-4fbb59 01 May, 2018, 05:46:51

Great, works amazing now. Thank you so much for the fast response and help!

user-4fbb59 01 May, 2018, 05:47:03

Do you have any insight on my second question regarding the surface tracking?

wrp 01 May, 2018, 05:47:39

ok regarding your other questions

Also, I'm wondering for the surface tracking- how big do the surface markers have to be so the pupil can detect it? 

A: This depends on how far you are away from the marker. If you are very close, you can use a very small marker. You can adjust the min marker perimeter setting in Pupil Capture/Player to reduce the search area.

wrp 01 May, 2018, 05:49:28

Surface tracking will provide you with a transformation matrix which will allow you to map coordinates into the space of the surface

user-4fbb59 01 May, 2018, 05:50:33

Hmmm, I see.

user-4fbb59 01 May, 2018, 05:50:57

So the Pupil will give me a transformation matrix and coordinates in terms of what it sees

user-4fbb59 01 May, 2018, 05:51:07

and then I can use those two to map to coordinates within the paper?

user-4fbb59 01 May, 2018, 05:51:09

does that sound right?

wrp 01 May, 2018, 05:53:03

@user-4fbb59 Pupil software will take care of the mapping of gaze coordinates --> surface coordinates - you don't have to do anything further. For example: If you defined a surface as the cover of a magazine, then Pupil will calculate the position of the gaze relative to this surface. Gaze positions within a surface are normalized coordinates (0,0 for bottom left of the surface and 1,1 for top right of the surface).

wrp 01 May, 2018, 05:54:22

here is an example of surface tracking used within a driving study - https://vimeo.com/266868006

wrp 01 May, 2018, 05:55:01

At the beginning of the video they show a heat map for the surface; afterwards they disable the heat map visualization and just show the surface bounding box. You can see how markers are used to define a surface in this example.

user-4fbb59 01 May, 2018, 05:55:11

I see, that makes a lot of sense

user-4fbb59 01 May, 2018, 05:55:15

let me make sure I understand this correctly

user-4fbb59 01 May, 2018, 05:55:29

the pupil will return coordinates between (0,0) and (1,1) depending on where you're looking on the surface

user-4fbb59 01 May, 2018, 05:55:33

where (0,0) is bottom left?

user-4fbb59 01 May, 2018, 05:55:39

as defined by the markers

user-8e7659 01 May, 2018, 06:01:06

Hi all, I am looking to extract from pupil capture via UDP the x-axis angle of the eye gaze, with calibration center at 0deg and +/- from this point depending on gaze direction. Any advice would be appreciated, thanks.

wrp 01 May, 2018, 06:41:09

@user-4fbb59 gaze positions are normalized between 0 and 1 relative to the world frame. When you're using surface tracking plugins, you can also get gaze positions relative to surfaces in normalized coordinates. If the gaze position is outside the surface, it will be outside of the range (0,0 - 1,1) (e.g. you can have a point at (-0.2, -0.2), which would be outside the surface but with coordinates still reported relative to the surface).
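
A tiny sketch of what this means in practice, assuming you already have surface-normalized gaze values from the export or network API (the function names here are illustrative, not part of the Pupil API):

```python
# Converting surface-normalized gaze (0,0 = bottom left, 1,1 = top right)
# to pixel coordinates on a surface of known pixel size, and flagging
# off-surface points. Illustrative helper names, not the Pupil API.

def norm_to_pixels(x_norm, y_norm, width_px, height_px):
    """Convert surface-normalized gaze to pixel coords (origin top left)."""
    x_px = x_norm * width_px
    # Flip y: the surface origin is bottom left, image origins are top left.
    y_px = (1.0 - y_norm) * height_px
    return x_px, y_px

def on_surface(x_norm, y_norm):
    """A point outside [0, 1] on either axis lies off the surface."""
    return 0.0 <= x_norm <= 1.0 and 0.0 <= y_norm <= 1.0

print(norm_to_pixels(0.5, 0.5, 1920, 1080))  # center of a 1080p surface
print(on_surface(-0.2, -0.2))                # the off-surface example above
```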

user-4fbb59 01 May, 2018, 06:49:39

okay excellent. That definitely helps, thanks so much! Last thing: do you know what type of screws are used on the Pupil headset? The little buggers are so tiny, I already lost one while moving the eye camera.

user-dc2842 01 May, 2018, 10:18:05

@papr @user-b116a6 - I've found a new proposal for a saccade detection algorithm; what do you think about it? From http://dx.doi.org/10.1016/j.visres.2017.03.001 Bargary et al., Individual differences in human eye movements: An oculomotor signature? (2017): "All saccades in each of the tasks were detected with the same purpose-built saccade algorithm. This algorithm used both eye acceleration and eye velocity criteria to detect and profile a saccade. The presence of a saccade was detected if the eye acceleration exceeded a relative threshold value (six times the median value of the standard deviation of the acceleration signal during the first 80 ms of all trials for a particular person), or if the eye velocity exceeded an absolute threshold of 50°/s (the latter criterion was used very rarely). After detection, the saccade was profiled using the eye velocity record: borders of the saccade were defined as the regions where the eye velocity dropped below three times the median value of the standard deviation of the eye velocity record during the first 80 ms of all trials for a particular person."
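
A rough sketch of the detect-then-profile logic described in that quote, simplified to a single saccade and caller-supplied baseline statistics (this is an illustration of the thresholding idea, not the authors' code):

```python
import numpy as np

# Detection: acceleration above 6x a person-specific baseline SD, or velocity
# above an absolute threshold (50 deg/s in the paper). Profiling: walk outward
# from the velocity peak until velocity drops below 3x the baseline velocity SD.

def detect_saccade(velocity, accel_baseline_sd, vel_baseline_sd,
                   abs_vel_thresh=50.0, dt=1.0):
    """Return (start, end) sample indices of a saccade, or None."""
    acceleration = np.gradient(velocity, dt)
    acc_thresh = 6.0 * accel_baseline_sd          # relative acceleration criterion
    onset = (np.abs(acceleration) > acc_thresh) | (np.abs(velocity) > abs_vel_thresh)
    if not onset.any():
        return None
    peak = int(np.argmax(np.abs(velocity)))
    border = 3.0 * vel_baseline_sd                # border criterion on velocity
    start = peak
    while start > 0 and abs(velocity[start - 1]) > border:
        start -= 1
    end = peak
    while end < len(velocity) - 1 and abs(velocity[end + 1]) > border:
        end += 1
    return start, end
```

With a flat velocity trace it returns None; with a synthetic 80 deg/s bump it returns the bump's borders.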

user-723401 01 May, 2018, 17:08:54

Hello, I need assistance finishing the Pupil Labs plugin for the Unreal game engine that Tudor started, ASAP! Sorry to sound demanding, but we have a May 11 deadline for a presentation to NIST!

user-4fbb59 01 May, 2018, 17:38:16

Hi! Another quick question regarding Pupil- how does it get Z positions? Is the world camera able to detect depths? If not, how does it get gaze_point_3d_z?

user-c16fb4 01 May, 2018, 20:03:13

Anyone know if this is being used for medical testing? I've got a use case that could really help patients who might be having subtle, hard to detect strokes.

wrp 02 May, 2018, 04:05:57

@user-4fbb59 the z positions are estimated via binocular vergence using gaze angles from our 3d model.
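
For intuition, the triangulation behind binocular vergence can be sketched as finding the point nearest to both eyes' gaze rays. This is a generic geometric illustration with made-up eye positions, not Pupil's actual implementation:

```python
import numpy as np

# Given one gaze ray per eye (origin + direction), estimate the 3d gaze point
# as the midpoint of the shortest segment between the two rays (standard
# closest-points-between-lines formula).

def nearest_point_between_rays(o0, d0, o1, d1):
    d0 = d0 / np.linalg.norm(d0)
    d1 = d1 / np.linalg.norm(d1)
    w = o0 - o1
    a, b, c = d0 @ d0, d0 @ d1, d1 @ d1
    d, e = d0 @ w, d1 @ w
    denom = a * c - b * b                # zero only for parallel rays
    t0 = (b * e - c * d) / denom         # parameter along ray 0
    t1 = (a * e - b * d) / denom         # parameter along ray 1
    p0 = o0 + t0 * d0
    p1 = o1 + t1 * d1
    return (p0 + p1) / 2.0               # midpoint = estimated gaze point

# Two "eyes" 6 cm apart, both fixating a point 0.5 m straight ahead:
left = nearest_point_between_rays(np.array([-0.03, 0.0, 0.0]),
                                  np.array([0.03, 0.0, 0.5]),
                                  np.array([0.03, 0.0, 0.0]),
                                  np.array([-0.03, 0.0, 0.5]))
print(left)  # approximately [0, 0, 0.5]
```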

user-6e1816 02 May, 2018, 07:46:22

I want to use libuvc to get the video streams from the world camera (Ubuntu 16). The ID 05a3:9231 is Cam1 ID1 and the ID 05a3:9230 is Cam1 ID0, but Cam1 ID2's ID is the same as ID0: 05a3:9230. If I set the ID param to "05a3:9230", the video is from the right eye cam, so how do I get Cam1 ID2? And please tell me whether Cam1 ID2 is the world camera.

papr 02 May, 2018, 07:47:36

Correct, Pupil Cam1 ID2 refers to the world camera.

papr 02 May, 2018, 07:49:08

You should use the cameras' names to select the correct camera to initialize instead of the IDs
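
A sketch of name-based selection, assuming pyuvc's `uvc.device_list()` / `uvc.Capture(uid)` API (the helper function and example listing are mine):

```python
# Select a Pupil camera by its name rather than by vendor/product ID,
# since both eye cameras share the same product ID.

def find_device_by_name(devices, name_fragment):
    """Return the first device dict whose name contains name_fragment."""
    for dev in devices:
        if name_fragment in dev["name"]:
            return dev
    return None

# With real hardware (requires pyuvc and a connected headset):
# import uvc
# dev = find_device_by_name(uvc.device_list(), "Pupil Cam1 ID2")  # world cam
# cap = uvc.Capture(dev["uid"])

devices = [  # example of the kind of listing uvc.device_list() returns
    {"name": "Pupil Cam1 ID0", "uid": "1:4"},
    {"name": "Pupil Cam1 ID1", "uid": "1:5"},
    {"name": "Pupil Cam1 ID2", "uid": "1:6"},
]
print(find_device_by_name(devices, "ID2")["uid"])  # the world camera's uid
```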

user-6e1816 02 May, 2018, 08:00:19

@papr Thanks. I also found that when using IDs I can set the index value; the default 0 refers to the eye cam, and 1 refers to the world cam.

papr 02 May, 2018, 08:02:15

You mean the index value within the device list? I would advise against using the device list index as a way to select the correct camera.

user-6e1816 02 May, 2018, 08:07:40

The code is as follows:
<!-- Parameters used to find the camera -->
<param name="vendor" value="0x05a3"/>
<param name="product" value="0x9230"/>
<param name="serial" value=""/>
<!-- If the above parameters aren't unique, choose the first match: -->
<param name="index" value="0"/>
I think the index is under the same IDs.

user-d1b281 02 May, 2018, 08:08:41

Hi guys, I have a question regarding the DIY Pupil headset: because I could not find exposed (black) film negative, what other kind of material do you recommend I use as a filter for the eye camera?

wrp 02 May, 2018, 08:10:13

@user-d1b281 you could also source a bandpass filter, but exposed film is the best choice for DIY setup.

user-d1b281 02 May, 2018, 08:14:40

@wrp Thank you very much

user-d1b281 02 May, 2018, 08:18:44

another question: can I use undeveloped film?

wrp 02 May, 2018, 08:23:06

@user-d1b281 no, this will not work.

user-d1b281 02 May, 2018, 08:27:10

I'm sorry for asking so many questions, but can I use a floppy disk?

wrp 02 May, 2018, 08:27:56

@user-d1b281 questions are welcome. This will also not work (someone asked/and tried this earlier - search back and you might be able to find the discussion 😄 )

user-d1b281 02 May, 2018, 09:52:41

@wrp So, can I use this one?

Chat image

user-f2aed7 02 May, 2018, 10:33:52

Hi all, I just got Pupil and I'm trying to use it on my computer now (running Windows 7). I installed Pupil Capture, but there seems to be a problem with the drivers: 2 of the cams are installed under "Imaging devices" instead of "libusbK Usb Devices". I uninstalled, restarted, and additionally installed the drivers automatically. Still the same issue. I tried on a different PC and there was absolutely no problem. Any ideas on how to solve this issue?

user-f2aed7 02 May, 2018, 10:34:18

*manually I meant 😃

user-f2aed7 02 May, 2018, 10:40:44

As an (expected) result, I can only access one of the cameras with Pupil Capture

papr 02 May, 2018, 10:51:12

We only support Windows 10. Please upgrade your system to use Pupil Capture.

user-f2aed7 02 May, 2018, 11:02:50

I am aware of that, but for our experimental setups we were using Windows 7 - and I was trying this out before changing everything. Thank you anyway

user-4fbb59 02 May, 2018, 15:16:58

great, thanks!

user-4fbb59 02 May, 2018, 15:50:04

Can someone help me understand the exported data for surface tracker? What's the difference between world_timestamp and gaze_timestamp? And what's the difference between x_norm and x_scaled?

papr 02 May, 2018, 16:03:33

@user-4fbb59 Hey there. *_norm means that the values are normalised to the surface, meaning that (0,0) refers to the bottom left corner and (1,1) to the top right corner. *_scaled values are the *_norm values multiplied by the surface size (which you have to set manually in Capture/Player).
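
In code, the relationship is just a multiplication; the surface size below is made up for illustration:

```python
# *_scaled = *_norm multiplied by the manually configured surface size.
# Field names follow the surface export; the size here is an example only.

def scale_gaze(x_norm, y_norm, surface_width, surface_height):
    """Turn surface-normalized gaze into the surface's own units."""
    return x_norm * surface_width, y_norm * surface_height

# e.g. a surface defined as a 20 x 30 cm sheet:
x_scaled, y_scaled = scale_gaze(0.25, 0.5, 20.0, 30.0)
print(x_scaled, y_scaled)  # 5.0 15.0
```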

papr 02 May, 2018, 16:05:05

world_timestamp refers to the scene frame's timestamp to which the datum is correlated. You can use it to identify the correct scene frame. gaze_timestamp is the timestamp of the gaze datum that is mapped to the surface.

user-4fbb59 02 May, 2018, 16:05:56

@papr so each frame has multiple gaze recordings on it?

papr 02 May, 2018, 16:08:00

correct, since the gaze timestamps are based on the eye camera frame timestamps, and the eye cameras record at a higher frame rate than the world camera
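
The correlation can be sketched as assigning each (higher-rate) gaze timestamp to its closest world timestamp, which is why one world_timestamp repeats across several gaze rows in the export. The timestamps below are simplified placeholders, not real clock values:

```python
import bisect

# Group high-rate gaze timestamps under the nearest low-rate world timestamp.

def closest_world_frame(world_ts, gaze_t):
    """Return the world timestamp closest to a gaze timestamp (sorted input)."""
    i = bisect.bisect_left(world_ts, gaze_t)
    candidates = world_ts[max(0, i - 1):i + 1]
    return min(candidates, key=lambda w: abs(w - gaze_t))

world_ts = [0.0, 1.0, 2.0]                 # slow "world camera" clock ticks
gaze_ts = [0.0, 0.3, 0.6, 0.9, 1.2]        # faster "eye camera" clock ticks
groups = {w: [g for g in gaze_ts if closest_world_frame(world_ts, g) == w]
          for w in world_ts}
print(groups)  # several gaze timestamps per world frame
```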

user-4fbb59 02 May, 2018, 16:10:07

excellent. Thank you so much!

user-8944cb 02 May, 2018, 16:58:50

Hello, I have a question about the capture rate. When I record with a 200 Hz capture rate for the eye cameras (binocular) and 120 Hz for the world camera, I am getting ~400 timestamps per second (not sure it is seconds, but it aligns with the number of timestamps relative to the length of the recorded video in seconds), and about 100 different (a little fewer) 'index numbers' in the 'Gaze positions' spreadsheet. When I record at 120 Hz for both the eye cameras and the world camera, I am getting ~240 timestamps. The data in all variables (as far as I can see) is different under each timestamp. The number of timestamps seems to be double the capture rate of the eye cameras - does this mean that the eye cameras are not synced? Is there something I need to change in the settings in this case? Thanks for the help!

user-02ae76 02 May, 2018, 17:10:31

Hi all, I have been working on a project using pupil groups and wanted to see if anyone here has had experience operating the groups feature on a standalone laptop. I have consistently run into issues getting my separate instances of Capture to recognize my two separate headsets (I have two 300hz trackers, connected to a '16 MacBook Pro via USB 3.1 cables). I don't have any issues getting the first camera to recognize, however I have found myself unable to reliably connect each of the sets. Sometimes I will succeed at getting both cameras recognized, however when I begin recording one or both cameras freeze up fairly fast.

user-02ae76 02 May, 2018, 17:11:50

Just wondering if anyone has experienced these quirks; my current theory is that my laptop CPU simply cannot handle the load of four (counting each eye cam) video streams. If anyone has had success with a similar setup, I'd be excited to hear how!

user-e12082 02 May, 2018, 17:22:26

I have found a publication from Tobii explaining the sampling frequency. Maybe it could be useful (the references are also useful for a deep understanding of rates in eye tracking). This is the link: https://www.tobiipro.com/learn-and-support/learn/eye-tracking-essentials/eye-tracker-sampling-frequency/

user-88ecdc 02 May, 2018, 19:24:16

Hello folks. I would like to know whether I'm doing the right thing before taking on a major analysis chore :). I want to compare the fixations on a surface vs the total number of fixations. What I have done is enable both the offline fixation detector and the offline surface tracker. Having exported, I can see the values in the exported document named "Surface_gaze_distribution" in Excel. It gives me the following info: http://piclair.com/pksnf Note that the name of my surface is "unnamed". Based on this, I would assume that I am able to compare the fixations on the surface against the total fixations, right? Lastly, what are these numbers a measure of? Fixations measured in time? Sorry for the long message and thanks in advance :).

user-8b414f 03 May, 2018, 07:22:36

Hi all! Do you know if, with the calibration files (eye 0, eye 1 and world camera), it's possible to transform the position data (in normalised units) to centimetres? Or at least to know the position in normalised data at which the calibration points appear? (We controlled distance from the screen and head movements, and we know at what distance from the centre of the screen the different stimuli appear.) Thank you very much! 😃

user-6e1816 03 May, 2018, 07:47:55

@papr I calibrated the world camera and got the cam.yaml, but I cannot find where to use it. Do you know how to change the world camera params in uvc?

mpk 03 May, 2018, 12:23:33

@here please check out our latest release. Improved pupil detection included! https://github.com/pupil-labs/pupil/releases/tag/v1.7

user-1bcd3e 03 May, 2018, 18:05:45

@mpk I tried to use the new Pupil 1.7 for Mac, but after downloading, when I start the program it says "file damaged"

mpk 03 May, 2018, 18:06:50

@user-1bcd3e I just saw that too. I'm uploading a new bundle. Can you post the full error message? We have not seen this before and it does not happen deterministically.

mpk 03 May, 2018, 18:24:18

ok. I'm codesigning the mac bundles again now and will upload. It should work then!

user-1bcd3e 03 May, 2018, 18:25:43

@mpk thank you ... I'm excited to see the new release!! I'm sure it will be a great job 😉

mpk 03 May, 2018, 18:38:58

@user-1bcd3e alright I uploaded the bundle please try the new one.

user-bcf3aa 03 May, 2018, 19:11:34

Hey folks, just got the Pupil eye tracker, and wondering what's the easiest way to integrate with Python. I'd like to be able to stream estimated eye position in screen coordinates and draw that estimated eye position via, e.g. PsychoPy, PyGame, PyGaze, etc. Is there a higher level API than the bare metal ZeroMQ API? Thanks!

user-41f1bf 03 May, 2018, 19:15:12

For streaming of screen coordinates you will need to use Pupil surfaces. The easiest way is to write a plugin that will talk to the online surface tracker.
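
For anyone wanting the bare-metal route instead: a hedged sketch of subscribing to the realtime stream over ZeroMQ/msgpack, assuming Pupil Capture is running with Pupil Remote on its default port 50020 ("gaze." is the raw gaze topic prefix and "surfaces." the surface-mapped one; the helper names are mine):

```python
# Ask Pupil Remote for the SUB port, then subscribe to a topic prefix.
# Requires pyzmq and msgpack, and a running Pupil Capture instance.

def norm_pos(datum):
    """Pull the normalized (x, y) position out of a gaze datum dict."""
    x, y = datum["norm_pos"]
    return x, y

def subscribe(topic="gaze.", host="127.0.0.1", remote_port=50020):
    """Generator yielding (topic, datum) pairs from Pupil Capture."""
    import zmq       # third-party: pyzmq
    import msgpack   # third-party: msgpack
    ctx = zmq.Context()
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://{}:{}".format(host, remote_port))
    req.send_string("SUB_PORT")           # Pupil Remote command
    sub_port = req.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.connect("tcp://{}:{}".format(host, sub_port))
    sub.setsockopt_string(zmq.SUBSCRIBE, topic)
    while True:
        topic_bytes, payload = sub.recv_multipart()
        yield topic_bytes.decode(), msgpack.unpackb(payload, raw=False)

# Usage with a running Capture instance:
# for topic, datum in subscribe("surfaces."):
#     print(topic, datum)
```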

user-41f1bf 03 May, 2018, 19:17:08

@user-bcf3aa I am guessing you want a real time stream.

user-41f1bf 03 May, 2018, 19:18:31

You also have an alternative option for offline screen coordinates.

user-1bcd3e 03 May, 2018, 19:54:44

@mpk it doesn't seem to be solved 😦 same "file damaged"!

user-1bcd3e 03 May, 2018, 19:58:15

@mpk I downloaded by Chrome

user-1bcd3e 03 May, 2018, 19:59:52

Chat image

user-bcf3aa 03 May, 2018, 20:36:00

@user-41f1bf yes a realtime stream would be ideal

user-728529 03 May, 2018, 21:15:39

Hello, my Pupil headset has very good accuracy with a fixed head and distance (the distance between the subject's head and the screen/working area). However, if the subject's head moves, it produces bad eye tracking accuracy. Has anyone encountered this? Thanks. 😑

user-c87ed5 03 May, 2018, 23:40:00

Hi! I am trying to send a custom message from a Raspberry Pi to a Pupil Labs PC and then record this message in a text file. I am able to send the standard messages over zmq - for instance socket.send("R") to start the recording - but I need help with sending and recording a custom message that would allow us to synchronize different machines. Best, Pablo

wrp 04 May, 2018, 02:01:47

@user-1bcd3e I confirm that the most recent macOS bundle is not currently able to install/run. We will work on this today to resolve any signing issues in the deployment process.

mpk 04 May, 2018, 06:37:12

@wrp @mpuccioni#0374 sorry about that. Somehow Pupil Capture was still not signed. I just uploaded again. Please try it!

user-b571eb 04 May, 2018, 07:12:45

Chat image

user-b571eb 04 May, 2018, 07:13:25

I have tried a few times. It remains the same. At the moment, I can't use it. Does it work on your side?

user-b571eb 04 May, 2018, 07:18:11

Ah. Just saw the previous message. Hope it will be solved soon.

user-29e10a 04 May, 2018, 07:32:28

@user-1bcd3e As a quick and dirty workaround you can disable the macOS signing check by running "sudo spctl --master-disable" in the terminal. It is sufficient to run the app once; then re-enable the check with "sudo spctl --master-enable" and it will continue to work.

user-29e10a 04 May, 2018, 07:33:16

we all trust Pupil Labs that they're not a Trojan horse 😉

user-29e10a 04 May, 2018, 07:33:40

@user-b571eb see my answer for a quick solution

mpk 04 May, 2018, 07:35:25

@user-29e10a yes, that works, but I hope that the version from this morning is properly signed 😃

user-29e10a 04 May, 2018, 07:36:26

@mpk I just downloaded this: "https://github.com/pupil-labs/pupil/releases/download/v1.7/pupil_v1.7-42-g7ce62c8_macos_x64_signed_confirmed.zip" and the error persists, I'm sorry ...

mpk 04 May, 2018, 07:36:35

...

mpk 04 May, 2018, 07:36:38

I have no words.

mpk 04 May, 2018, 07:36:54

what's the output of codesign -dv --verbose=1 /Applications/Pupil\ Capture.app

mpk 04 May, 2018, 07:37:00

and for the others as well..

user-29e10a 04 May, 2018, 07:37:42

*processing

user-29e10a 04 May, 2018, 07:39:42

Executable=/Applications/Pupil Capture.app/Contents/MacOS/pupil_capture
Identifier=Pupil Capture
Format=app bundle with Mach-O thin (x86_64)
CodeDirectory v=20200 size=27237 flags=0x0(none) hashes=1355+3 location=embedded
Signature size=8568
Timestamp=04.05.2018, 08:21:33
Info.plist entries=10
TeamIdentifier=R55K9ESN6B
Sealed Resources version=2 rules=12 files=292
Internal requirements count=1 size=176

for capture

user-29e10a 04 May, 2018, 07:39:50

Executable=/Applications/Pupil Player.app/Contents/MacOS/pupil_player
Identifier=Pupil Player
Format=app bundle with Mach-O thin (x86_64)
CodeDirectory v=20200 size=26716 flags=0x0(none) hashes=1329+3 location=embedded
Signature size=8567
Timestamp=04.05.2018, 08:25:34
Info.plist entries=11
TeamIdentifier=R55K9ESN6B
Sealed Resources version=2 rules=12 files=286
Internal requirements count=1 size=172

for player

user-29e10a 04 May, 2018, 07:39:56

i didn't install the service

mpk 04 May, 2018, 07:41:32

ok. thanks. I'll investigate some more.

user-29e10a 04 May, 2018, 07:42:10

as we're speaking - I didn't notice this before, but is the coarse pupil detection disabled in Pupil Player, or has that been the case since 1.7?

papr 04 May, 2018, 07:45:00

@user-29e10a coarse pupil detection has been disabled for low resolutions since at least v1.6

user-1bcd3e 04 May, 2018, 07:59:11

@mpk @user-29e10a thank you, I'll wait for updates.... I also didn't install Pupil Service, as I think it's not the problem

user-29e10a 04 May, 2018, 10:52:35

@papr why is that? it makes it easier to tune the intensity range imho

mpk 04 May, 2018, 11:10:53

@user-1bcd3e @user-29e10a @Eli#9432 the macOS release is finally fixed.

papr 04 May, 2018, 11:10:59

@user-29e10a It was mainly used for a speed increase. But the speed gain is not high enough at lower resolutions to be worth it.

papr 04 May, 2018, 11:12:17

Your argument is something to consider though

user-29e10a 04 May, 2018, 11:14:48

@papr understandable - I'm working on a project which includes the pupil detection completely inside a C++ and C# environment ... it works fine so far, but I had to implement the coarse pupil detection in C++; if this is interesting for you, I'm ready to share

user-29e10a 04 May, 2018, 11:15:31

I wrote a wrapper to get the results of the pupil detection from unmanaged C++ in managed C# code - all without Python

papr 04 May, 2018, 11:23:25

This is definitely something that would be beneficial for the Pupil project

user-29e10a 04 May, 2018, 11:31:37

ok, I can prepare a pull request, or I'll send you the relevant code snippet (just a method)

user-9d7bc8 06 May, 2018, 22:02:09

Is there any kind of minimum hardware recommendation? Would a Raspberry Pi Zero be sufficiently powerful for capturing?

user-c1f766 07 May, 2018, 11:37:02

any suggestions for first-time settings of the eye camera?

Chat image

user-5543ca 07 May, 2018, 12:43:05

Hello @papr, I'm trying to analyze how gaze positions shift between two surfaces (left and right in this case). I couldn't find any documentation on how to understand these exported files. Before making my own assumptions, I would like to ask for your help. Thanks in advance!

Chat image

user-e38712 07 May, 2018, 12:56:39

Hello guys, is there info somewhere about how calibration is done? What algorithms are used to detect the eye ball, pupil, pupil movement etc. for both 2d and 3d detection?

papr 07 May, 2018, 12:58:43

@user-e38712 This paper describes the pupil detection algorithm: https://dl.acm.org/citation.cfm?id=2641695

user-e38712 07 May, 2018, 13:02:36

@papr thanks! When I reach my PC I'll check if I have access to this online library as a student

papr 07 May, 2018, 13:03:16

pupil-labs-paper.pdf

papr 07 May, 2018, 13:03:58

This is the pdf for the paper. The paper is publicly available as well but I do not have the link at hand.

user-e38712 07 May, 2018, 13:05:40

Ok thanks! I'll look at this

user-ba98c6 07 May, 2018, 14:42:35

Hello, I need help using Pupil Mobile and Capture on my PC. I have a Nokia 6, a Samsung Galaxy 8, and Windows 10. I have installed Pupil Mobile on my phones and started Pupil Capture on my PC, but nothing happens. The Windows machine and cellphones are connected to my ADSL modem. I have not connected the Pupil headset yet; I want to test the platform first and then connect the headset.

user-9d7bc8 07 May, 2018, 15:19:38

Is there any kind of minimum hardware recommendation? Would a Raspberry Pi Zero be sufficiently powerful for capturing?

wrp 07 May, 2018, 15:20:18

@user-9d7bc8 I think RPI might not be powerful enough

wrp 07 May, 2018, 15:20:56

@user-9d7bc8 you should get in touch with @user-c494ef to discuss - as they have Pupil running on NVIDIA Jetson TX2

wrp 07 May, 2018, 15:21:41

@user-ba98c6 please post your questions - I am about to go AFK, but will respond when back

user-3d9cdd 07 May, 2018, 15:30:56

Hello, I'm trying to setup eye tracking for vive and whenever I launch the 3d calibration unity project, my computer blue screens with the stop code "MULTIPLE_IRP_COMPLETE_REQUESTS"

user-6930d5 08 May, 2018, 19:29:26

@papr Hello, I'm struggling to figure out the basics of Pupil Remote; forgive me, as I don't have much networking experience :p I'm just testing the Remote feature -- I have Pupil Capture running and I've enabled the Remote plugin, then I'm running python3 pupil-helpers-master/python/pupil_remote_control.py but I get this error. What else do I need to do?

Also, a previous link you posted to https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/remote_annotations.py no longer exists -- is there somewhere I could still see the code? Thank you!

Chat image

papr 09 May, 2018, 08:24:25

Hey @user-6930d5 The traceback indicates an interruption by the user. This happens if you hit ctrl-c within the terminal.

papr 09 May, 2018, 08:25:52

The output should look like this:

4452.78368767
Round trip command delay: 0.00037980079650878906
Timesync successful.
OK
OK
papr 09 May, 2018, 08:28:52

The linked script moved to https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py
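
For later readers, the gist of that helper can be sketched as follows. This is a hedged sketch loosely following the linked pupil-helpers script: the datum fields match the annotation format, but the function names are mine, and sending requires a PUB socket connected to the port Pupil Remote reports for the "PUB_PORT" command:

```python
# Build and publish custom annotation messages that Capture records alongside
# the gaze data (useful for syncing external machines, as asked above).

def new_annotation(label, timestamp, duration=0.0, **extra):
    """Build an annotation datum; extra keys (e.g. a trial id) are kept."""
    datum = {"topic": "annotation", "label": label,
             "timestamp": timestamp, "duration": duration}
    datum.update(extra)
    return datum

def send_annotation(pub_socket, datum):
    """Publish one annotation as a two-frame (topic, msgpack payload) message."""
    import zmq       # third-party: pyzmq
    import msgpack   # third-party: msgpack
    pub_socket.send_string(datum["topic"], flags=zmq.SNDMORE)
    pub_socket.send(msgpack.packb(datum, use_bin_type=True))

print(new_annotation("sync", 12.5, trial=3))
```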

user-88ecdc 09 May, 2018, 17:56:58

Hello folks, 1. Is there any way to get a clean screenshot of a scanpath through Pupil Player? 2. Is there any way to make small clips of findings for presentations through the software? Thanks in advance

user-fa3706 09 May, 2018, 19:01:42

Hi everyone, I am trying to get surface tracking working for multiple monitors and I'm not quite sure how the markers are supposed to work. If I attach the markers to each monitor, should the eye tracker just be able to detect them, or will I need to specify each surface through the Pupil Capture application?

papr 09 May, 2018, 19:02:43

@user-88ecdc The recommended way is to set the trim marks such that they encapsulate the section that you want to present, and export the recording afterwards. The exported video includes all visualizations.

papr 09 May, 2018, 19:04:06

@user-fa3706 The markers are detected automatically, but the markers alone are not a surface yet. You will need to specify which markers belong to which surface. You do so by adding a surface while showing the markers that belong to it to the camera.

user-fa3706 09 May, 2018, 19:05:15

@papr so I will need to use the capture application to present the marked surface to the world camera and then I can specify the surface while it is viewing the marked surface?

papr 09 May, 2018, 19:06:39

correct. To summarize: 1. enable the surface tracker. 2. point the world camera at the monitor with the markers. 3. hit A to add a surface. The surface will automatically use the recognized markers as its surface definition. 4. edit the surface as required

user-fa3706 09 May, 2018, 19:13:43

Perfect. Thank you!

user-fa3706 09 May, 2018, 19:59:05

@papr I have been testing what you said, and Pupil Player doesn't seem to recognize the surface in the recording even though Pupil Capture seems to recognize it.

user-8889de 09 May, 2018, 20:36:48

Hello!

user-8889de 09 May, 2018, 20:37:43

I am using Motion on a Raspberry Pi to create a stream at its IP address on two ports. Is there a way I can capture those feeds using Pupil on another machine?

user-8889de 09 May, 2018, 20:44:48

@user-c494ef @user-9d7bc8 @wrp I am using an RPi 3B and a DIY headset with the two webcams recommended by the DIY documentation.

user-88ecdc 10 May, 2018, 07:36:18

Thank you for your brilliant answer yesterday @papr. On another note, does anyone get this error: "player_methods: No valid dir supplied" when opening Pupil Player? And does anyone know a fix? Ty in advance 😃

wrp 10 May, 2018, 07:36:57

@user-88ecdc do you happen to have any non-ascii chars in your info.csv file within the dataset?

wrp 10 May, 2018, 07:39:06

@papr @user-8889de question seems relevant for pyndsi

papr 10 May, 2018, 07:40:58

@user-88ecdc Are you using Windows?

user-88ecdc 10 May, 2018, 07:41:34

Yes I am. What are non-ASCII chars? 😃

papr 10 May, 2018, 07:42:32

@user-88ecdc Does the error appear before you drop the recording onto the window or afterwards?

wrp 10 May, 2018, 07:43:14

@user-88ecdc please share the info.csv file - I have a feeling that this might be related to the username of the machine listed in the info.csv file

user-88ecdc 10 May, 2018, 07:43:50

MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
player - [ERROR] player_methods: No valid dir supplied.
This is what appears when I open Pupil Player. The window where I am supposed to be able to put in clips does not appear.

user-88ecdc 10 May, 2018, 07:44:39

Ok, is info.csv found within the Pupil Player folder?

user-88ecdc 10 May, 2018, 07:46:40

Chat image

papr 10 May, 2018, 08:18:35

@user-88ecdc could you share your recording with data@pupil-labs.com? The easiest way to do so is to upload it to Google Drive and share the folder with the email address above.

user-6930d5 10 May, 2018, 08:19:09

Thanks @papr, I think I had an issue with a module, but Remote works now!

I want to use Pupil Labs with jsPsych, a JavaScript framework for running behavioral experiments in the browser. I'm trying to find a way to sync up the timestamps of the experiments and the Pupil Labs recordings. Any suggestions for a good/easy approach? Some ideas: using Pupil Remote to somehow reset the timestamp to 0 when the experiment starts/make an annotation, or have Pupil Capture/Player recognize a special marker in the experiment and make the timestamp 0 from that point on.

I've tried setting timestamps to 0 using Pupil Remote during a recording and it messed up the videos, probably because of anachronistic timestamps. How would you recommend handling this? Should time sync always be done before a recording starts?

papr 10 May, 2018, 08:21:20

You are exactly right! Time sync needs to be done before starting a recording, otherwise the assumption of time being monotonic will break.

papr 10 May, 2018, 08:21:45

I suggest simply setting the time to whatever clock your experiment uses.
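
Concretely, Pupil Remote accepts a "T &lt;timestamp&gt;" command that sets Capture's time base; sent before recording starts, it aligns Pupil's clock with the experiment clock. A hedged sketch (the helper name is mine; the zmq usage assumes Pupil Remote on its default port):

```python
# Format the Pupil Remote time-set command for a given experiment clock value.

def make_timesync_command(experiment_time):
    """e.g. make_timesync_command(0.0) -> 'T 0.0'"""
    return "T {}".format(experiment_time)

# Usage (requires pyzmq and a running Pupil Capture):
# import zmq
# req = zmq.Context().socket(zmq.REQ)
# req.connect("tcp://127.0.0.1:50020")
# req.send_string(make_timesync_command(0.0))  # sync BEFORE recording
# print(req.recv_string())
# req.send_string("R")                         # then start the recording
# print(req.recv_string())
```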

user-88ecdc 10 May, 2018, 08:22:27

@papr Roger, will do. I have 24 recordings which are about 6 GB each, but I assume it is OK if I just upload one?

papr 10 May, 2018, 08:22:43

Do all of them trigger the issue?

papr 10 May, 2018, 08:24:08

Never mind, you said the issue is that the drop-recording window does not appear, correct?

user-88ecdc 10 May, 2018, 08:24:16

correct

papr 10 May, 2018, 08:24:42

This issue is independent of the recording then. Which version of Pupil Player do you use?

user-88ecdc 10 May, 2018, 08:26:44

Chat image

user-88ecdc 10 May, 2018, 08:27:02

System Info,"User: HumLab, Platform: Windows, Machine: AAU109897, Release: 10, Version: 10.0.10586"

user-88ecdc 10 May, 2018, 08:28:04

Capture Software Version,1.4.1
Data Format Version,1.4.1

papr 10 May, 2018, 08:31:47

Could you please update your software to the newest version https://github.com/pupil-labs/pupil/releases/tag/v1.7 and try again?

user-88ecdc 10 May, 2018, 09:08:57

It worked, thanks :). When exporting a trimmed sequence of my clip, is there any way to make the playback speed setting follow, or can an exported trimmed section only be visualized in real time when being exported?

papr 10 May, 2018, 09:11:19

Are you saying that playback in Pupil Player is slower while video exporting a section?

user-88ecdc 10 May, 2018, 09:12:55

I want to show a sequence at half speed, but when I set it to half speed and export, the exported version plays in real time. And is there a way to improve the video quality when exporting? Thanks for such quick replies!

papr 10 May, 2018, 09:14:35

Ah, now I understand. Setting the playback speed in Pupil Player only affects playback within Pupil Player. You can use a media player like VLC to play back the exported video more slowly. Alternatively, you can slow down the video permanently using a tool like ffmpeg.

papr 10 May, 2018, 09:15:32

And there are no export quality settings.

user-88ecdc 10 May, 2018, 09:26:00

It is 1280 x 720 at 30 FPS as a source file when looking at File Source; I assume that is just how it is. Thanks for the VLC advice.

user-88ecdc 10 May, 2018, 10:02:14

Wow, the new version of Pupil Player gets a different export result than the previous one, with all settings still on default. Can anyone tell me what Total Gaze Points can be interpreted as a sign of? TY for your help

papr 10 May, 2018, 10:03:07

What differences do you see? And what do you mean by Total Gaze Points can be interpreted as a sign of?

user-88ecdc 10 May, 2018, 10:08:47

Total_gaze_point_count is a result I get from looking into "Gaze Distributions" after exporting with the offline surface tracker + offline fixation detector. What I mean is that I assume gaze points are related to how much time they look at something, but also that there is not necessarily a causal relationship.

papr 10 May, 2018, 10:18:09

This statistic is definitely related to time. total_gaze_point_count is the total number of gaze points in the exported section. Below it follow all registered surfaces and how many of the gaze samples were detected on the corresponding surface. Be aware that summing all surface gaze counts does not necessarily add up to the total count. It can be less (the subject does not look at any surface) or more (the subject looks at overlapping surfaces) than the total count.

user-88ecdc 10 May, 2018, 10:24:24

I see, thank you for that explanation! For our study, we only had 1 surface and calculated the number of gaze point counts that this one surface had against the total amount of gaze points recorded. But what exactly is the definition of a single gaze point?

papr 10 May, 2018, 10:46:07

A single gaze point is a mapping of a single pupil datum or a pair of pupil datums. Each eye video frame generates exactly one pupil datum. A meaningful metric would be to calculate the average number of gaze points per second over the whole recording.
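
That average can be computed directly from the Raw Data Exporter output. A small sketch, assuming the exported csv has a `timestamp` column (column names can differ between versions):

```python
import csv

def load_timestamps(path):
    """Read the timestamp column (seconds) from an exported gaze csv."""
    with open(path, newline="") as f:
        return [float(row["timestamp"]) for row in csv.DictReader(f)]

def gaze_points_per_second(timestamps):
    """Average number of gaze samples per second over the recording."""
    ts = sorted(timestamps)
    duration = ts[-1] - ts[0]
    return (len(ts) - 1) / duration if duration > 0 else 0.0

# Usage (the export path is hypothetical):
# print(gaze_points_per_second(load_timestamps("exports/000/gaze_positions.csv")))
```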

user-42b39f 10 May, 2018, 10:52:18

Hi all, one of our research topics is to monitor the pupil diameter with regard to an audio input. I would be glad to have any comments from the experts on the following points:
1- We have the feeling that the pupil diameter measurement is more accurate with the 2d model than with the 3d one. Is that correct?
2- Could you confirm that in a csv export the timestamp unit is seconds, and that the rate of capture is given by the cameras, so the number of diameter recordings might depend on cpu load or other external issues?
3- The blink process induces a physiological variation in the eye diameter; we would like to delete these non-valuable data. Does it make sense to set the confidence value to 0.9, or would you advise another value (1?)?
4- Is it possible to add some triggers or marks to the timeline in Pupil Player that could be retrieved in a csv export? I have not yet tested version 1.7 with the new audio wave visualization. With this new feature we could perhaps use the trimmers in Pupil Player and export part by part, but for a long file that is a bit tedious.
5- Is it possible to add audio export to the video export plugin in Pupil Player?
Thanks (sorry for the long list!)

papr 10 May, 2018, 10:55:00

5 - This has already been implemented for several versions. Audio playback might not work with your default media player, though. We recommend VLC.

papr 10 May, 2018, 10:58:10

1 - The 2d pupil diameter is a direct result of the pupil detection, while the 3d diameter is based on the 3d model. If the 3d model is not well fit or is unstable, the 3d diameter will vary. If you are looking for relative pupil size changes, it might be better to use the 2d diameter for the moment. We are actively working on improving the 3d model and its stability.

papr 10 May, 2018, 10:58:56

2 - Yes, timestamp units are always seconds. And yes, the frame rate can drop under high load and result in fewer pupil diameter samples.

papr 10 May, 2018, 10:59:34

3 - I would suggest using the blink detection plugin to find blinks and using them to filter your data accordingly.
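
A sketch of that filtering step, assuming you already have blink intervals as start/end timestamps (e.g. from the blink detector's export; the exact export format is version-dependent):

```python
def outside_blinks(samples, blinks):
    """Drop (timestamp, value) samples that fall inside any blink interval.

    samples: iterable of (timestamp, diameter) tuples
    blinks:  iterable of (start_timestamp, end_timestamp) tuples
    """
    blinks = sorted(blinks)
    kept = []
    for t, v in samples:
        if not any(start <= t <= end for start, end in blinks):
            kept.append((t, v))
    return kept
```

In a real pipeline you might also widen each interval slightly, since the pupil estimate is unreliable just before and after the lid closes.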

papr 10 May, 2018, 11:00:41

4 - You can use the annotation plugin to add such triggers either manually during the recording, automatically during the recording via notifications, or manually in Player
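
For the automatic case, the pupil-helpers examples send an annotation notification through Pupil Remote. A sketch along those lines; the payload keys mirror those helper scripts and may differ between versions:

```python
def annotation_notification(label, timestamp, duration=0.0):
    """Build an annotation notification payload (keys assumed from pupil-helpers)."""
    return {
        "subject": "annotation",
        "label": label,
        "timestamp": timestamp,  # must be in Capture's clock
        "duration": duration,
        "record": True,          # ask Capture to save it with the recording
    }

def send_notification(pupil_remote, note):
    """Send a notification through Pupil Remote (a connected zmq.REQ socket)."""
    import msgpack  # pip install msgpack
    import zmq      # pip install pyzmq
    topic = "notify." + note["subject"]
    payload = msgpack.dumps(note, use_bin_type=True)
    # Notifications are multipart: topic frame first, then msgpack payload.
    pupil_remote.send_string(topic, flags=zmq.SNDMORE)
    pupil_remote.send(payload)
    return pupil_remote.recv_string()
```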

user-42b39f 10 May, 2018, 11:04:58

@papr thanks for the answers;

user-42b39f 10 May, 2018, 11:06:15

5- yes the export with vlc has audio (I used the quicktime players unsuccessfully)

user-88ecdc 10 May, 2018, 11:06:33

Thank you so much for the answers @papr Mucho appreciato ๐Ÿ˜ƒ have a great day.

papr 10 May, 2018, 11:06:50

@user-88ecdc you too!

user-42b39f 10 May, 2018, 11:08:29

1- We noticed that stability of the 3d model was the major issue, especially after a blink. Did version 1.7 address this point, or should we try again with the next one?

papr 10 May, 2018, 11:09:08

The stability improvements are still work-in-progress.

user-42b39f 10 May, 2018, 11:14:52

ok thanks papr; last question (for today!): are you aware of any visualization plugin from the community to display a parameter (i.e. diameter) over time?

papr 10 May, 2018, 11:15:46

Pupil Player shows the 2d pupil diameter in a timeline by default

user-42b39f 10 May, 2018, 11:17:22

yes, but it is not scalable, and it is difficult to assess minor changes (absolute or relative) to compare between recordings

papr 10 May, 2018, 11:19:16

I understand. This will be fixed as soon as we introduce timeline zooming. Until then I would recommend exporting the pupil data via the Raw Data Exporter and using something like Google Spreadsheets to visualize the diameter column of the csv file.
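
A small scriptable alternative to a spreadsheet, assuming the exported `pupil_positions.csv` has `timestamp` and `diameter` columns (names may vary by version):

```python
import csv

def read_columns(path, x_col="timestamp", y_col="diameter"):
    """Read two numeric columns from an exported pupil csv."""
    xs, ys = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            xs.append(float(row[x_col]))
            ys.append(float(row[y_col]))
    return xs, ys

# Plotting (requires matplotlib; the export path is hypothetical):
# import matplotlib.pyplot as plt
# xs, ys = read_columns("exports/000/pupil_positions.csv")
# plt.plot(xs, ys, linewidth=0.5)
# plt.xlabel("time (s)"); plt.ylabel("2d pupil diameter (px)")
# plt.show()
```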

user-42b39f 10 May, 2018, 11:21:28

ok it is what I am doing for the moment, waiting for the introduction of this new feature!

user-88ecdc 10 May, 2018, 12:32:45

Ohh, now it happened again (even on the new version). I cannot open Pupil Player now; it says: player - [ERROR] player_methods: No valid dir supplied (C:\Users\HumLab\Desktop\pupil_v1.7-42-7ce62c8_windows_x64\pupil_player_windows_x64_v1.7-42-7ce62c8\pupil_player.exe)

papr 10 May, 2018, 12:34:50

Try the following:
1. Make sure Player is not running
2. Look for the pupil_player_settings folder in your user/home directory
3. Delete the player_user_settings file
4. Start Player

user-88ecdc 10 May, 2018, 12:45:36

Worked perfectly, thanks once again ๐Ÿ˜„

papr 10 May, 2018, 12:46:27

Ok, thank you for the report. I will investigate the cause of this issue.

papr 10 May, 2018, 12:46:46

Deleting the user settings is just a bad work-around for now.

user-4fbb59 10 May, 2018, 16:00:06

Hey all, so when I use the pupil to do some simple gaze tracking it seems like the predicted gaze is jumping around a lot, very inaccurate and noisy. I'm trying to keep my head relatively still and the accuracy seems to work fine when I'm calibrating. Any reason why this could be? Thanks!

user-4fbb59 10 May, 2018, 16:00:40

Also- when the gaze jumps around, is that a consequence of not being able to see my pupil correctly, or is it misidentifying the pupil?

papr 10 May, 2018, 16:08:12

@user-4fbb59 This can have multiple reasons. Could you share an example recording with data@pupil-labs.com such that we can have a look?

user-fa3706 10 May, 2018, 16:59:01

@papr Hello, I have attempted the surface tracking setup that you explained to me yesterday. It seems that Pupil Capture is registering the surface, but the Pupil Player application doesn't seem to be recognizing the surface when I open the test recording. Is there a different setting that needs to be used? I can send the recordings if that helps.

papr 10 May, 2018, 17:17:08

Did you open the offline surface tracker?

user-b27aef 10 May, 2018, 17:20:06

Hi guys, can you please tell me the resolution of your 200Hz cameras, and the latency from camera to PC and from PC to dumped values?

user-b27aef 10 May, 2018, 17:20:34

also the angular resolution/precision

user-fa3706 10 May, 2018, 18:12:47

@papr It is enabled in the plug-in settings on both Capture and Player.

papr 10 May, 2018, 18:14:00

Mmh. Normally Capture should save the surface definition. You can simply add a new surface in Player, though.

user-fa3706 10 May, 2018, 18:14:28

Ok. Is that done in the same process? Just tell Player when the world camera is viewing a marked surface?

papr 10 May, 2018, 18:15:09

Yeah, you seek to a frame that shows your surface, pause playback, add the surface, edit if necessary

user-fa3706 10 May, 2018, 18:15:30

Ok. I just tried that and it seems to be working now. Thank you.

user-4fbb59 10 May, 2018, 18:21:55

@papr Semi-resolved; it seems it just has to do with the accuracy of Pupil Capture. Recalibrating and being more deliberate about eye movements seems to help.

user-4fbb59 10 May, 2018, 18:23:04

I do have another issue though: I'm trying to use Pupil to find out where exactly I'm looking in this maze (attached below), but the surface tracker seems to pick up the maze as skewed. I'm afraid this will affect my accuracy, because I need to match these gaze tracking points up with pen data. Any thoughts on how to make the surface tracking more accurate?

user-4fbb59 10 May, 2018, 18:23:06

Chat image

wrp 11 May, 2018, 02:55:02

@user-4fbb59 the image you attached is a screenshot from pupil capture? Can you turn on surface tracking so that we can see the quality of the surface or send a sample/example dataset to data@pupil-labs.com so that we can provide you with some concrete feedback?

user-4fbb59 11 May, 2018, 02:55:33

@wrp the image is a screenshot of the surface debug tool

wrp 11 May, 2018, 02:55:51

ok, thanks for the clarification

user-4fbb59 11 May, 2018, 02:56:38

I essentially need to get the location of my gaze on the page in (x,y), but since the image is skewed I'm not sure how to translate the gazepoint into actual gaze coordinates

user-4fbb59 11 May, 2018, 02:56:45

(as in, where I'm looking on the page)

wrp 11 May, 2018, 02:57:13

by skewed, you mean distorted due to the lens distortion from the world camera, correct?

user-4fbb59 11 May, 2018, 02:57:20

yup

user-4fbb59 11 May, 2018, 02:57:30

(sorry for the imprecise wording)

wrp 11 May, 2018, 02:57:32

Have you tried re-calibrating your world camera?

wrp 11 May, 2018, 02:58:57

You can do so by using the camera intrinsics estimation plugin

user-4fbb59 11 May, 2018, 02:59:11

Hm, okay I'll look into that

user-4fbb59 11 May, 2018, 02:59:21

thanks so much!

wrp 11 May, 2018, 03:00:03

@user-4fbb59 final question - where are the corners of the surface that you defined?

user-4fbb59 11 May, 2018, 03:00:28

Well, I have 4 tracker images on the paper

user-4fbb59 11 May, 2018, 03:00:31

they should be in the corners

wrp 11 May, 2018, 03:00:32

Did you edit the surface at all or no?

user-4fbb59 11 May, 2018, 03:00:53

I haven't tried editing

user-4fbb59 11 May, 2018, 03:01:02

would that provide consistent results even when I move my head around?

wrp 11 May, 2018, 03:02:19

You might want to try editing the surface so that the corners of the surface correspond more closely to the content

user-4fbb59 11 May, 2018, 03:02:29

Okay. I'll try that as well

user-4fbb59 11 May, 2018, 03:02:35

Thanks for the pointers!

wrp 11 May, 2018, 03:03:25

You're welcome. Looking forward to your feedback.

user-3742dc 11 May, 2018, 10:22:52

Hello, I want to use the world camera with Python to record video. How can I do it? Thank you.

papr 11 May, 2018, 10:24:22

@user-3742dc https://github.com/pupil-labs/pyuvc this is what you need
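
A rough recording sketch with pyuvc; the device name and frame mode here are assumptions, so check `uvc.device_list()` for your own headset:

```python
def pick_device(devices, name_hint="ID2"):
    """Return the uid of the first device whose name contains name_hint.

    On Pupil headsets the world camera usually enumerates as
    'Pupil Cam1 ID2' (an assumption; verify with uvc.device_list()).
    """
    for dev in devices:
        if name_hint in dev["name"]:
            return dev["uid"]
    return None

def record(path="world.avi", seconds=5, fps=30, size=(1280, 720)):
    """Grab world-camera frames with pyuvc and write them with OpenCV."""
    import uvc  # pip install pyuvc
    import cv2  # pip install opencv-python
    cap = uvc.Capture(pick_device(uvc.device_list()))
    cap.frame_mode = (size[0], size[1], fps)  # width, height, fps
    writer = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"MJPG"), fps, size)
    for _ in range(seconds * fps):
        frame = cap.get_frame_robust()
        writer.write(frame.bgr)  # pyuvc frames expose .bgr / .gray arrays
    writer.release()
    cap.close()
```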

user-d9bb5a 11 May, 2018, 11:29:52

Good morning. I would like to know about your application on Android: what the technical requirements are, what functions will be supported, etc. Thank you in advance.

user-48da9b 11 May, 2018, 11:37:03

Good morning everyone. Does anyone know if it is possible to access the raw image data from a Pupil headset outside of Python?

user-48da9b 11 May, 2018, 11:37:44

We could do it before using the default DirectShow drivers, but it seems that in the latest version of pupil capture these drivers are overwritten with some custom ones and we can't find the cameras anymore

user-48da9b 11 May, 2018, 11:38:04

btw, this is in Windows 10

user-48da9b 11 May, 2018, 11:38:25

and using the latest headset (i.e. 200Hz cameras)

user-48da9b 11 May, 2018, 11:39:05

we've tried it on a number of different computers. as long as we don't run pupil capture, everything runs fine

user-48da9b 11 May, 2018, 11:39:27

but as soon as we run it the first time, there are some funky PowerShell scripts that try to overwrite driver certificates, etc.

user-48da9b 11 May, 2018, 11:39:37

and then the headset becomes "bricked" to using pupil capture only

mpk 11 May, 2018, 12:29:44

@user-48da9b To revert back to the normal drivers, just remove the libusbK drivers from the cameras in the device manager and allow the system to install the default UVC drivers. We use these custom drivers to run 3 cameras on one USB bus.

user-48da9b 11 May, 2018, 12:40:34

@mpk ok, thanks for the heads-up

user-48da9b 11 May, 2018, 12:40:46

is there a way to access the custom drivers via some C/C++ API?

mpk 11 May, 2018, 12:55:03

@user-48da9b you should be able to do it with this: https://github.com/pupil-labs/libuvc/

mpk 11 May, 2018, 12:55:22

our fork has changes that are required.

user-48da9b 11 May, 2018, 12:57:30

@mpk thanks, this sounds perfect

user-48da9b 11 May, 2018, 13:03:58

which version of libusb are you targeting?

mpk 11 May, 2018, 13:42:38

@user-48da9b libusb1.0

user-42b39f 11 May, 2018, 14:57:56

Hi all, can anyone explain to me how audio visualization can be turned on in the new version?

user-42b39f 11 May, 2018, 14:58:22

and how can I retrieve an annotation in a csv file ? TIA

mpk 11 May, 2018, 15:28:36

@user-42b39f this feature is enabled by default and will show in the timeline when audio was recorded.

papr 11 May, 2018, 15:30:07

To get annotations into a csv file you need to 1) start Player, 2) load the recording, 3) enable the Annotation plugin, and 4) hit export.

user-42b39f 11 May, 2018, 15:32:04

@mpk it is not showing up, I should try perhaps to delete player_settings ?

papr 11 May, 2018, 15:32:51

You might need to scroll or resize the timeline view

user-42b39f 11 May, 2018, 15:33:12

@papr it is what I have done but I can't find it in pupil_positions.csv?

papr 11 May, 2018, 15:33:13

Is there a speaker icon visible in the right icon bar?

papr 11 May, 2018, 15:33:47

@user-42b39f Annotations are not listed in pupil_positions.csv but in annotations.csv, if I remember correctly

user-42b39f 11 May, 2018, 15:39:01

Oops, obvious, but I didn't see the scroll control of the timeline view, sorry! However, it is a big timeline view now. Is there any way to select what I would like to be present in the timeline view, and is there a control to zoom into the envelope? Because if you unfortunately get a click in the recording, the automatic scaling is not really appropriate.

papr 11 May, 2018, 15:42:33

Timeline selection and zooming are not available yet.

user-42b39f 11 May, 2018, 15:43:29

ok perhaps in a future version ? It would be useful

user-42b39f 11 May, 2018, 15:43:35

have a good day

papr 11 May, 2018, 15:43:52

Yes, it is on our (long) todo list ๐Ÿ˜ƒ

user-42b39f 11 May, 2018, 15:45:16

yes but the new audio features are really of great value ๐Ÿ‘Œ

papr 11 May, 2018, 15:45:40

Great to hear ๐Ÿ˜ƒ

user-0f7b55 11 May, 2018, 16:24:04

Indeed

user-90270c 11 May, 2018, 19:42:47

@papr We would love to use the Annotation plugin to mark audio events, but for some reason we cannot hear the audio in Pupil Player (v1.5-12). We can see gaze on world with audio in VLC, but again, we would like to annotate with respect to audio as well as visual events. Any suggestions? Thanks in advance!

wrp 11 May, 2018, 23:38:44

@user-90270c audio playback is only available in v1.6 and up. Please upgrade to the latest release (v1.7)

user-90270c 13 May, 2018, 04:38:51

@wrp Many thanks!

wrp 13 May, 2018, 06:42:08

@user-90270c you're welcome ๐Ÿ˜„

user-3742dc 14 May, 2018, 06:51:42

Hello, my world camera is not working; I just get a grey interface, but the two eye cameras work correctly. How do I fix it? Thanks.

user-29e10a 14 May, 2018, 06:52:25

@user-3742dc try to pull out the usb cable and plug it in again... does it help?

user-3742dc 14 May, 2018, 07:08:39

It doesn't work i try it many times

papr 14 May, 2018, 07:17:14

@user-3742dc did you try to restart with defaults? This can be done by clicking the corresponding button in the general settings.

user-3742dc 14 May, 2018, 07:31:31

I tried restarting with default settings too. I get this in the cmd line of Pupil:
world - [WARNING] launchables.world: Resetting all settings and restarting Capture.
world - [INFO] launchables.world: Process shutting down.
eye0 - [INFO] launchables.eye: Process shutting down.
eye1 - [INFO] launchables.eye: Process shutting down.
Clearing C:\Users\incia\pupil_capture_settings\user_settings_eye0...
Clearing C:\Users\incia\pupil_capture_settings\user_settings_eye1...
Clearing C:\Users\incia\pupil_capture_settings\user_settings_world...
world - [INFO] launchables.world: Application Version: 1.7.42
world - [INFO] launchables.world: System Info: User: incia, Platform: Windows, Machine: phyb1, Release: 10, Version: 10.0.16299
world - [INFO] launchables.world: Session setting are from a different version of this app. I will not use those.
......
world - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
world - [INFO] camera_models: No user calibration found for camera Ghost capture at resolution (1280, 720)
world - [INFO] camera_models: No pre-recorded calibration available
world - [WARNING] camera_models: Loading dummy calibration
world - [WARNING] launchables.world: Process started.

user-3742dc 14 May, 2018, 08:30:32

I have another question about synchronization: can I synchronize the world camera with my webcam or another camera? Thanks for your help.

user-d9bb5a 14 May, 2018, 11:18:00

Hello everyone, how do I calculate saccade protocols?

user-d9bb5a 14 May, 2018, 13:22:01

Could someone please explain how to work with fixation and saccade data?

user-673c72 14 May, 2018, 19:55:29

I am using Pupil to track singers' gaze while watching a video conductor stimulus. I have found that calibration is rarely consistent throughout the 4 minute duration. What is the expectation for how long a single calibration should last? Should I plan to recalibrate every minute or so?

user-ecbbea 14 May, 2018, 22:19:59

hey, just curious if when using the surface tracker -- is it possible to bound your gaze tracking to be only within the tracked surface? I have an experiment where the participant is looking at a monitor and I want to be able to determine only when they are looking at the screen. I have created the surface using the QR codes, but it doesn't look like it is bounding their gaze to just the surface

user-d9bb5a 15 May, 2018, 07:29:55

Could someone please explain how to work with fixation and saccade data?

user-e38712 15 May, 2018, 07:55:31

Hi guys, is there a paper or blog post describing the 3D pupil detection? I wonder what algorithms are used there

user-f1eba3 15 May, 2018, 08:19:00

Hi Guys

user-f1eba3 15 May, 2018, 08:21:26

Can someone TeamViewer with me to help me adjust the eye settings in Pupil Capture for optimal pupil detection?

user-ef1c12 15 May, 2018, 08:44:35

Hi! I would also need some advice on how best to set up Pupil Capture, as pupil detection has been quite unstable under the current settings. Is it possible to speak with someone via screen share so that settings can be adjusted in real time?

user-e38712 15 May, 2018, 08:46:01

I've never had problem with that guys. Did you adjust camera focus?

user-e38712 15 May, 2018, 08:46:37

there is step by step guide on how to adjust that in documentation

user-ef1c12 15 May, 2018, 08:52:08

Thanks @user-e38712 ! Yes, I did adjust the camera focus and made sure the video capture is as per the documentation (i.e., camera close enough to pupil, etc)

user-e38712 15 May, 2018, 08:55:13

detection & mapping mode are set to 3d?

user-ef1c12 15 May, 2018, 08:55:30

Yes

user-e38712 15 May, 2018, 08:56:06

hmmm.. your pupil is detected or there is a problem just with mapping?

wrp 15 May, 2018, 08:56:27

Hi @user-ef1c12 perhaps you could share a demo/sample dataset (recording) with data@pupil-labs.com and we can provide you with some concrete feedback. Please include the calibration sequence in the recording.

user-e38712 15 May, 2018, 08:59:01

@wrp hello, good to see you here. Are you able to provide me with information what algorithms are used to 3D pupil detection in Pupil Labs?

user-ef1c12 15 May, 2018, 08:59:36

@wrp ok I will do this now. Thank you!

user-d9bb5a 15 May, 2018, 09:00:16

Good afternoon. Please tell me, does anyone work with the data obtained using the eye tracker? Can anybody help with interpreting it (fixations, etc.)? Help me please...

user-81072d 15 May, 2018, 18:02:47

Hi, all! I received my Pupil Labs eye tracker yesterday (binocular + 3d), and I'm having a hard time with drivers on Windows 10 ๐Ÿ˜ฆ

user-81072d 15 May, 2018, 18:03:28

Both pupil cams seem to work, but none of the other cameras even seem to be detected by the PC

user-81072d 15 May, 2018, 18:04:25

Does anybody have experience troubleshooting this? I've uninstalled, reinstalled, and rebooted more times than I can count on 3 different Windows 10 PC's

user-81072d 15 May, 2018, 19:14:00

Same thing on Debian (buster). At least in linux I can see the RealSense hardware in lsusb, although it doesn't appear in the pupil_capture UI. I installed pyrealsense (via git), but still see this in the pupil_capture output: "world - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend"

user-81072d 15 May, 2018, 21:33:19

Is it necessary to calibrate the RealSense world camera using Intel's Camera Calibrator software? I can't move past the DS_INVALID_ARG error.

user-81072d 15 May, 2018, 21:33:22

Chat image

user-e7102b 16 May, 2018, 02:37:36

Hi. I'm having some trouble getting audio recording working. I'm using Pupil Capture 1.7.42 on a MacBook Pro (Sierra). My internal microphone is definitely switched on and functioning. I've activated the audio plugin and set the source to "Built-in Microphone" (and I don't get any errors when I do this). When I record, an audio.mp4 file gets produced, along with an audio_timestamps.npy file. However, if I try to open the audio.mp4 in QuickTime, the file plays but I don't hear any sound. When I play the recording in Pupil Player (also latest version), I also don't hear any sound. Do you have any suggestions for getting this working?

user-e7102b 16 May, 2018, 02:37:40

Thanks

mpk 16 May, 2018, 06:31:17

@user-81072d we pre-calibrate the devices during QC; it should not be necessary to calibrate again.

mpk 16 May, 2018, 06:31:53

@user-e7102b thanks for the report. Would you mind making an issue on github?

papr 16 May, 2018, 06:32:52

@user-e7102b could you please also send the recording to [email removed]

user-20c486 16 May, 2018, 09:17:52

Hi I have a silly question about the pupil eye tracking in VR headstes. Are they documented to be safe for the eyes?

user-49149c 16 May, 2018, 12:41:54

Hello everyone I am looking for the source code for "pupil capture" application. someone please share the link if there is anything out there. thanks

papr 16 May, 2018, 12:42:18

https://github.com/pupil-labs/pupil

user-49149c 16 May, 2018, 12:45:31

Thanks Papr, is this GUI ?

papr 16 May, 2018, 12:47:08

Please follow these instructions if you want to run from source: https://docs.pupil-labs.com/#developer-setup

user-49149c 16 May, 2018, 12:51:24

Many thanks papr

user-8779ef 16 May, 2018, 13:00:54

@user-e7102b Have you tried VLC? I believe it does a better job.

user-29e10a 16 May, 2018, 13:21:26

Help. I'm trying to run pip3 install git+https://github.com/pupil-labs/PyAV on macOS 10.13.4 and it fails with: src/av/_core.c:637:10: fatal error: 'libavfilter/avfiltergraph.h' file not found #include "libavfilter/avfiltergraph.h"

I installed ffmpeg etc. as in the docs... the weird thing is, it did work a few weeks ago... :( anybody?

mpk 16 May, 2018, 13:57:57

@user-20c486 Hi, the VR add-on gets warm, which some users find a bit uncomfortable. We have tested for IR safety and confirmed that it is safe for use.

user-20c486 16 May, 2018, 13:58:29

@mpk Thanks a lot! ๐Ÿ˜ƒ

user-81072d 16 May, 2018, 16:04:27

@mpk - that's good to know, thank you. Unfortunately, however, we're still unable to use the device. When I pick the R200 as the backend manager, the screen appears gray. Standard output shows this:

world - [INFO] camera_models: No user calibration found for camera Intel RealSense R200 at resolution (640, 480)
world - [INFO] camera_models: No pre-recorded calibration available
world - [WARNING] camera_models: Loading dummy calibration
user-81072d 16 May, 2018, 16:04:43

This is what the UI looks like.

Chat image

papr 16 May, 2018, 16:05:15

@user-81072d Please select the Realsense Backend

papr 16 May, 2018, 16:05:40

And then the R200 from the menu

papr 16 May, 2018, 16:06:27

Sorry, if you have done this already

user-81072d 16 May, 2018, 16:06:40

I have, yes

user-81072d 16 May, 2018, 16:06:50

That's the screen which displays immediately after doing so

user-81072d 16 May, 2018, 16:07:50

Any other suggestions, @papr?

papr 16 May, 2018, 16:08:11

Yes, I would suggest to try the pyrealsense examples and see if they work

papr 16 May, 2018, 16:08:37

Any of these: https://github.com/pupil-labs/pyrealsense/tree/master/examples

user-81072d 16 May, 2018, 16:10:38

Ok, will give it a go. Thanks.

user-81072d 16 May, 2018, 16:28:58

@papr - does the latest release of the RealSense SDK 2.0 (Build 2.11.0) support the R200 that came with my Pupil headset? It's not listed on https://github.com/IntelRealSense/librealsense/releases

None of the RealSense software (Viewer and Depth Quality Tool) seem to be able to detect my hardware, despite it showing up fine in the device manager

user-04d904 16 May, 2018, 16:40:55

Hi all, I'm using Pupil Capture to track my gaze on a table. I can't seem to get the calibration right. What's the best way to calibrate when looking down at the surface? I tried doing this with the manual calibration feature.

papr 16 May, 2018, 16:59:35

@user-81072d no it does not!... Unfortunately

user-81072d 16 May, 2018, 17:23:09

So would we need an older version of the SDK then?

papr 16 May, 2018, 17:37:30

The Pupil repositories point to the old versions already

user-ecbbea 16 May, 2018, 17:53:12

(reposted from May 15, 2018) hey, just curious: when using the surface tracker, is it possible to bound your gaze tracking to be only within the tracked surface? I have an experiment where the participant is looking at a monitor, and I want to be able to determine only when they are looking at the screen. I have created the surface using the QR codes, but it doesn't look like it is bounding their gaze to just the surface.

papr 16 May, 2018, 18:06:00

@user-ecbbea no

papr 16 May, 2018, 18:08:43

But the surface tracker will mark which gaze points are on the surface and which are not
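
Player's surface exports include a per-surface gaze csv with an `on_srf` column (the column name is assumed from the export format and may differ by version); filtering on it keeps only the on-screen samples:

```python
def on_surface(rows):
    """Filter exported gaze rows down to those that fell on the surface.

    Assumes each row is a dict (e.g. from csv.DictReader) with an
    'on_srf' column holding 'True'/'False' strings.
    """
    return [r for r in rows if r.get("on_srf") == "True"]

def on_surface_ratio(rows):
    """Fraction of exported gaze samples that were on the surface."""
    return len(on_surface(rows)) / len(rows) if rows else 0.0
```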

user-81072d 16 May, 2018, 18:51:42

Are gaze/fixation coordinates normalized to the total screen resolution? E.g., calibration occurs on just one monitor, but if I have a triple-monitor configuration, will the center of the center monitor be [0.5, 0.5]?

papr 16 May, 2018, 18:54:19

@user-81072d gaze is normalized to the world camera's frame, not to a specific screen. If you want screen coordinates, you need to set up surfaces.

mpk 16 May, 2018, 19:38:45

@user-81072d the reason the RealSense is not showing up is a Windows driver issue. Please remove the RealSense drivers, install system updates, and install the drivers again. No need to install the librealsense framework.

user-81072d 16 May, 2018, 19:52:13

Thanks @papr and @mpk. I'm up and running now on one machine. The RealSense drivers have been a real headache in Windows

user-4fbb59 16 May, 2018, 20:20:57

Hi everyone, how do we convert between the timestamps given in the data to UNIX timestamps?

papr 16 May, 2018, 20:44:30

@user-4fbb59 you will need to set Capture's clock to the Unix clock via Pupil Remote or the Time Sync protocol.

user-4fbb59 16 May, 2018, 20:45:57

I see- is there a detailed tutorial on how to do this?

user-4fbb59 16 May, 2018, 20:46:04

Also- are the units still in seconds?

papr 16 May, 2018, 20:46:32

All timestamps in Capture are floats in seconds.

user-4fbb59 16 May, 2018, 20:47:23

Okay, I'll take a look at that

papr 16 May, 2018, 20:47:40

See the time sync protocol definition in the Pupil repository (linked above) or the Pupil helper remote control file.

user-f1eba3 16 May, 2018, 20:57:39

When sending Pupil Service the current timestamp: do I have to send it on the same request socket, or does it not really matter?

papr 16 May, 2018, 20:58:04

No it does not matter.

papr 16 May, 2018, 20:59:38

The Remote plugin does not differentiate incoming requests at all. The only important thing: each request gets exactly one response, and the client needs to receive the response; otherwise the socket will block.

user-4fbb59 16 May, 2018, 20:59:58

i'm somewhat confused- do i have to write my own code to sync pupil's clock with UNIX?

papr 16 May, 2018, 21:00:59

Yes

papr 16 May, 2018, 21:01:24

Why do you want to use the Unix clock?

user-4fbb59 16 May, 2018, 21:05:23

i'm trying to sync it with another utility (a pen) that uses unix time stamps

papr 16 May, 2018, 21:08:36

The reason why Capture does not use Unix timestamps: time.time() is not monotonic, and Unix time is not precise because it uses a lot of the available digits for the time passed since 1970.
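
The precision point can be quantified: a double carries roughly 15-16 significant digits, so the larger the timestamp value, the coarser the representable step between adjacent floats:

```python
import sys

def float_resolution(t):
    """Approximate spacing between adjacent float values around t."""
    return abs(t) * sys.float_info.epsilon

# Near a 2018 Unix timestamp (~1.5e9 s) the step is a few hundred
# nanoseconds; near a freshly reset clock it is many orders finer.
print(float_resolution(1.5e9))
print(float_resolution(100.0))
```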

papr 16 May, 2018, 21:10:41

Therefore we do not want to encourage users to use the Unix timestamp. You need to know what you are doing if you want to sync Capture to a different data source.

user-4fbb59 16 May, 2018, 21:16:39

I see

user-4fbb59 16 May, 2018, 21:16:51

What would you recommend if I was in need of time aligning the pupil with another data source (such as the pen?)

papr 16 May, 2018, 21:20:01

Line 42: pass the string representation of the current Unix timestamp instead of 0.0
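
If resetting Capture's clock is not an option, an alternative sketch is to measure the offset between the two clocks once and apply it to the exported timestamps afterwards. This assumes Pupil Remote's "t" request returns the current Pupil time as a string:

```python
import time

def clock_offset(pupil_time, unix_before, unix_after):
    """Estimate (Unix clock - Pupil clock) from one round trip.

    The round-trip midpoint is used as the best guess for the moment
    Capture sampled its clock.
    """
    midpoint = (unix_before + unix_after) / 2.0
    return midpoint - float(pupil_time)

def measure_offset(pupil_remote):
    """pupil_remote: a connected zmq.REQ socket to Pupil Remote."""
    t0 = time.time()
    pupil_remote.send_string("t")  # ask for the current Pupil time
    pupil_time = pupil_remote.recv_string()
    t1 = time.time()
    return clock_offset(pupil_time, t0, t1)

# Later: unix_ts = exported_pupil_timestamp + offset
```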

user-4fbb59 16 May, 2018, 21:20:16

okay- perfect! Thanks @papr I'll take a look into that

user-4fbb59 16 May, 2018, 21:20:22

And this is over zmq, I assume?

papr 16 May, 2018, 21:20:23

This should be precise enough for now

papr 16 May, 2018, 21:21:15

Yes, that is correct. Alternatively you could write a plugin that resets the clock directly in Capture. But I do not have example code for that
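
A minimal sketch of the zmq-based approach discussed above. The "T <seconds>" command and the default port 50020 are the Pupil Remote conventions; the function names here are illustrative, and the network part assumes pyzmq and a running Pupil Capture instance:

```python
import time


def time_sync_command(t=None):
    """Return the Pupil Remote command string that resets Capture's clock.

    Sending "T <seconds>" makes Capture's internal clock jump to <seconds>;
    passing the current Unix time aligns Capture timestamps with other
    Unix-stamped data sources (like the pen mentioned above).
    """
    if t is None:
        t = time.time()
    return "T {}".format(t)


def sync_capture_clock(address="tcp://127.0.0.1:50020"):
    """Send the command to a running Pupil Remote instance.

    Requires pyzmq and a running Capture; the address/port are the
    Pupil Remote defaults and may differ on your setup.
    """
    import zmq
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect(address)
    req.send_string(time_sync_command())
    # Every request gets exactly one response; always receive it,
    # otherwise the REQ socket blocks on the next send.
    return req.recv_string()
```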

user-81072d 16 May, 2018, 21:45:20

Ok, one more challenge for @papr, @mpk, and any other experts.

On one of my machines, 2 of the RealSense devices (RGB and DEPTH) show up as cameras, and they appear to have a default Windows driver installed. The third device (Left-Right) appears as an imaging device with the Intel driver.

I believe the drivers are part of the RealSense Depth Camera Manager software, which I've installed and uninstalled numerous times. I've also tried some driver cleanup tools, but no matter what I do, I can't seem to make the RGB and DEPTH devices use the actual driver from Intel.

user-81072d 16 May, 2018, 21:45:50

The devices show up and work fine on a different machine, so I'm really scratching my head

user-4fbb59 16 May, 2018, 23:21:13

okay, thank you!

user-bd2540 17 May, 2018, 06:45:46

Question about 3D eye position detection: 1. What does the green circle mean?

Chat image

papr 17 May, 2018, 06:46:20

Hey @user-bd2540 The green circle indicates the outline of the fitted 3d eye model

papr 17 May, 2018, 06:46:34

This looks like a very good fit in this case

user-bd2540 17 May, 2018, 06:52:15

Question 2: Thank you very much. For the 3D eye model, you have a sphere for the whole eyeball and one sphere for the cornea. Is this correct? How many infrared lights are needed at least for 3D eye position detection?

papr 17 May, 2018, 06:53:42

The current model does not model the cornea. The second/red circle shows the pupil. The 3d model is fitted using a glint-free method that is based on a series of pupil ellipses

user-bd2540 17 May, 2018, 06:53:56

Question 3: What does the blue wireframe mean, and what does the red wireframe outside mean?

Chat image

papr 17 May, 2018, 06:54:46

We fit multiple models in parallel if new observations do not fit the old model anymore.

papr 17 May, 2018, 06:55:08

Blue is the current model, red are potential new candidates

user-bd2540 17 May, 2018, 07:03:30

Question 5: How do you measure the maturity of the model as shown in the following picture? What do the purple points in the image mean?

Chat image

user-bd2540 17 May, 2018, 07:25:30

Question 6: Why don't you use the glint-based 3D eye position detection method?

user-bd2540 17 May, 2018, 08:15:24

May I ask when the glint-based 3D eye detection method was replaced by the glint-free method? (https://pupil-labs.com/blog/2016-10/open-source-3d-pupil-detection-and-mapping-for-windows/) On this webpage, you published 3D eye position detection based on glints. So I thought that the current 3D eye position is detected using glints

papr 17 May, 2018, 08:15:59

To Q6: Our pipeline needs to work as well as possible outside. Glints are not very reliable outdoors. Therefore we chose the pupil-ellipse-based method.

papr 17 May, 2018, 08:17:05

This is a third-party implementation and is not part of our software

papr 17 May, 2018, 08:18:28

To Q5: This is kind of difficult to say. We have a model confidence based on how well new data fits the old model.

papr 17 May, 2018, 08:18:57

The purple points are older pupil positions

user-bd2540 17 May, 2018, 08:26:55

Thank you very much for your patient answer. Thanks

user-bd2540 17 May, 2018, 08:27:05

@papr

papr 17 May, 2018, 08:27:32

You are welcome. Let me know if you have any other questions. 👍

user-bd2540 17 May, 2018, 08:28:02

For users wearing glasses, how does it perform?

papr 17 May, 2018, 08:29:20

It works quite well as long as the glasses' frame does not occlude the eye cameras.

papr 17 May, 2018, 08:30:34

Although glasses often introduce reflections that affect the pupil detection. We recommend using contact lenses.

user-bd2540 17 May, 2018, 08:31:01

Thanks a lot.

user-29e10a 17 May, 2018, 09:15:55

ok got it – my issue about PyAV above is due to the ffmpeg 4 version. They removed some things that PyAV 0.4 relies on... (PyAV 0.4.1 will have resolved this issue, but isn't released yet) .... if you just brew install ffmpeg, the new version will be installed.

papr 17 May, 2018, 09:16:44

Thank you for the update

user-49ac90 17 May, 2018, 14:23:30

Greetings room. Looking to buy a Pupil headset. On the checkout page, I can't get the "country" drop down to appear in Firefox. Switching over to Chrome, that works, but when I submit the form (I intend to use a PO), the form just hangs and says it is "submitting" ad infinitum

mpk 17 May, 2018, 14:25:57

@user-49ac90 sorry to hear that and thanks for your feedback! We will debug and fix this ASAP. In the meantime, please feel free to write your quote request for PO via email to sales[at]pupil-labs.com we will get back to you with a Quote for PO processing.

user-49ac90 17 May, 2018, 14:27:45

@mpk Thanks, I'll send that email

user-06a050 17 May, 2018, 14:52:25

Hey! I've been trying to repair my Pupil installation for a few days now. The problem started when I updated OpenCV and/or ffmpeg via brew (I'm on macOS). But even after going back to the (assumed) previous versions of both, starting main.py won't work. Now, with the most recent versions (OpenCV 3.4.1 and ffmpeg 4.0), I get the following error: ld: library not found for -lboost_python3. When downgrading ffmpeg to 3.4.2, I get this error: ImportError: dlopen(/Users/patrick/Code/offis/pupil_labs_leap_plugin/leap_plugin_env/lib/python3.6/site-packages/cv2.cpython-36m-darwin.so, 2): Library not loaded: /usr/local/opt/ffmpeg/lib/libavcodec.58.dylib, downgrading only OpenCV results in the following error: ImportError: dlopen(/Users/patrick/Code/offis/pupil_labs_leap_plugin/leap_plugin_env/lib/python3.6/site-packages/cv2.cpython-36m-darwin.so, 2): Library not loaded: /usr/local/opt/ilmbase/lib/libImath-2_2.12.dylib. Downgrading both results in the same error. Compiling OpenCV 3.2 (the version that you're supposed to install for Windows) didn't work either. I also tried simply symlinking some libs from the second error (libavcodec.58.dylib to libavcodec.57.dylib, for example), but this resulted in only more errors, and a seemingly endless chain of not-found libs.

user-06a050 17 May, 2018, 14:59:48

Wait, something is wrong. Now that I upgraded both to the most current version I get the last error instead of the first... I'm sorry, but all this doesn't make any sense anymore. Fact is, no matter how often I reinstall everything (including the other things from the installation docs), it doesn't work. Does anybody have a clue what I could be doing wrong?

papr 17 May, 2018, 15:08:34
  1. error: ld: library not found for -lboost_python3 indicates that you need to run brew install boost-python3
  2. Library not loaded: /usr/local/opt/ffmpeg/lib/libavcodec.58.dylib indicates that you need to install an older version of ffmpeg. I use version 3.4.2
  3. Library not loaded: /usr/local/opt/ilmbase/lib/libImath-2_2.12.dylib. I have not seen this error before.
user-06a050 17 May, 2018, 15:10:26
  1. Did that.
  2. Exactly the version I downgraded to, didn't work.
  3. Then this could be a real problem?
user-06a050 17 May, 2018, 15:10:50

Which version of OpenCV have you installed?

papr 17 May, 2018, 15:11:29
  1. please reinstall pyav after downgrading
papr 17 May, 2018, 15:13:32

Interestingly my cv2.cpython-36m-x86_64-linux-gnu.so does not point to a libImath-2_2.12.dylib or anything similar

papr 17 May, 2018, 15:15:02

This issue seems to be related to opencv 2.4.x. Can you use brew to look up which version of opencv was installed?

papr 17 May, 2018, 15:15:31

I have 3.3.1 installed

user-06a050 17 May, 2018, 15:15:46

I have 3.4.2 installed

user-06a050 17 May, 2018, 15:17:15

Reinstalling pyav doesn't seem to work

papr 17 May, 2018, 15:17:35

python3 -c "import cv2; print(cv2.__version__)"

papr 17 May, 2018, 15:17:41

Could you run this for me please?

papr 17 May, 2018, 15:18:12

I want to make sure that the python version points to the same version as you expect

user-06a050 17 May, 2018, 15:19:16

3.4.1 is installed. The code doesn't work, I get this error:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
ImportError: dlopen(/Users/patrick/Code/offis/pupil_labs_leap_plugin/leap_plugin_env/lib/python3.6/site-packages/cv2.cpython-36m-darwin.so, 2): Library not loaded: /usr/local/opt/ffmpeg/lib/libavcodec.58.dylib
  Referenced from: /usr/local/Cellar/opencv/3.4.1_5/lib/libopencv_videoio.3.4.dylib
  Reason: image not found

papr 17 May, 2018, 15:20:07

Ah, interesting. So it is not necessarily pyav's fault

user-e0d3fa 17 May, 2018, 15:26:59

Hello, we have been practising our calibrations and are seeing some variability in the accuracy of the eye tracking after the calibration. We are getting a good calibration window (in the corners of the frame) and pupil seems to be detected fine in the debug window when the participant looks up, down, left and right (and center). So we aren't sure why the eye tracking sometimes isn't accurate. One thing we want to know is: Is there a particular distance of the eye cam from the pupil that is optimal? Another question is: is there a particular angle of the camera to the eye that is optimal? And finally, can and should we be manually adjusting the green circle that encircles the eye? Thanks! Sara

papr 17 May, 2018, 15:27:53

can and should we be manually adjusting the green circle that encircles the eye?

This is not possible

papr 17 May, 2018, 15:30:29

Is there a particular distance of the eye cam from the pupil that is optimal?

Not that I know of.

Another question is: is there a particular angle of the camera to the eye that is optimal?

Position the eye cameras such that the 2d pupil detection is best. Did you try 2d calibration?

user-29e10a 17 May, 2018, 15:37:54

@user-06a050 The exact problem I had the last few days. You need to install the expected version of ilmbase (and you will see you need another version of openexr too). And remember to use brew switch

user-29e10a 17 May, 2018, 15:38:29

Or you wait until pyav 0.4.1 is available as a wheel

user-06a050 17 May, 2018, 15:39:03

Oh nice! Which version of ilmbase would that be? Or how do I find that out?

user-29e10a 17 May, 2018, 15:39:45

I think 2_2.12 as it is in the error

user-06a050 17 May, 2018, 15:40:36

Oh okay, so simple 😄 Thanks, I will try that out!

user-29e10a 17 May, 2018, 15:41:18

Took me hours 🙂

user-29e10a 17 May, 2018, 15:47:12

@papr @mpk maybe it would be a good idea to open a source-related channel – the mixture of threads in this channel is overwhelming

papr 17 May, 2018, 15:48:10

You are right. Which categories would you suggest?

user-29e10a 17 May, 2018, 15:50:14

Pupil Source // Installation // Hardware // Usage

papr 17 May, 2018, 15:50:56

Source in the sense of software functionality/how the software works under the hood?

user-29e10a 17 May, 2018, 15:51:52

For everyone working with the source code.

user-29e10a 17 May, 2018, 15:53:04

How it works under the hood is of main interest for all and fits in the main channel

papr 17 May, 2018, 15:53:15

Mmh, I will talk to @wrp about this. Thank you for your input on this. I think that this will be useful to everyone. We just need to figure out details on how to split the channels.

user-29e10a 17 May, 2018, 15:53:44

Of course 👍🏼

user-380f66 17 May, 2018, 16:08:16

We are using the 3D calibration. Should we be using 2D calibration?

wrp 17 May, 2018, 16:10:48

We can restructure/add channels. I will do this tomorrow.

user-bd2540 17 May, 2018, 16:46:03

What does using binocular vergence mean?

Chat image

user-29e10a 17 May, 2018, 17:17:09

@user-bd2540 You are able (in principle) to tell if a subject focuses on something in 3D space by looking at the crossing gaze vectors of both eyes. It is different from pure eye accommodation because Pupil cannot track the cornea. It is also a little different from a fixation, which concerns how the consciousness of a subject is directed. Hope this helps

user-29e10a 17 May, 2018, 17:18:09

If something very near to you appears in your line of sight, your eyes have to accommodate to focus on it

user-29e10a 17 May, 2018, 17:19:03

Two things are necessary: changing the refraction of your eye lens and moving the eyes to adjust the gaze

user-bd2540 17 May, 2018, 17:32:27

@user-29e10a Thanks. The thing that I don't understand is why the eye accommodation is mentioned. Why should I take this into consideration? It seems that it doesn't influence the accuracy.

user-29e10a 17 May, 2018, 17:51:35

Depends on your kind of research. And it is only valid for binocular data. It is a "high level" eye movement and does not affect accuracy (unless you misinterpret your data)
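
The crossing-gaze-vectors idea above can be sketched numerically: given one gaze ray per eye, a binocular vergence point can be estimated as the midpoint of the shortest segment between the two rays. This is an illustration, not Pupil's implementation; the eye positions and directions in the usage below are made up:

```python
def dot(a, b):
    """Dot product of two 3-tuples."""
    return sum(x * y for x, y in zip(a, b))


def vergence_point(p0, d0, p1, d1):
    """Midpoint of the shortest segment between two gaze rays.

    Each ray is p + t*d with origin p and direction d as 3-tuples.
    For perfectly crossing rays this is their intersection; for real,
    noisy gaze vectors it is the closest-approach midpoint.
    """
    w = [a - b for a, b in zip(p0, p1)]
    a, b, c = dot(d0, d0), dot(d0, d1), dot(d1, d1)
    d, e = dot(d0, w), dot(d1, w)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("gaze rays are parallel")
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    c0 = [p + s * x for p, x in zip(p0, d0)]  # closest point on ray 0
    c1 = [p + t * x for p, x in zip(p1, d1)]  # closest point on ray 1
    return [(u + v) / 2.0 for u, v in zip(c0, c1)]
```

For example, two eyes 6 cm apart both aimed at a target 1 m straight ahead: `vergence_point((-0.03, 0, 0), (0.03, 0, 1), (0.03, 0, 0), (-0.03, 0, 1))` recovers a point at roughly (0, 0, 1).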

user-bbf709 17 May, 2018, 19:40:46

Hi

user-bbf709 17 May, 2018, 19:41:03

I'm having problems trying to run Pupil from source

user-bbf709 17 May, 2018, 19:42:07

When I run python main.py I get /usr/bin/ld: can't find -lboost_python-py36

user-bbf709 17 May, 2018, 19:42:50

It's the same issue as https://github.com/pupil-labs/pupil/issues/874 - it's closed, but there's no solution

user-bbf709 17 May, 2018, 19:43:20

does anyone know how it gets fixed?

user-bbf709 17 May, 2018, 19:44:51

I've already reported it with a note on the same issue with the whole output

user-8944cb 17 May, 2018, 23:29:31

Hello, I will be happy for some help with sampling rate. When I record with the eye tracker connected to a laptop with 120 Hz for the world camera and 200 Hz for the eye cameras, the frame rate I get in the combined video after exporting the recording from Pupil Player is about 70 (it is different every recording) - if I understand correctly this corresponds to the "Index" column in the "Gaze_Positions" spreadsheet. However, when I record with the same values with Pupil Mobile the frame rate I get is 20-30. The questions I have are: 1. Why is the overall frame rate lower than specified in general? 2. Why is the frame rate even lower when recorded with Pupil Mobile with the same settings? 3. If I want to filter the gaze position data to find angular eye movements, should I treat the capture frequency as the 'eye camera sampling rate x 2'? If I understood correctly, this is the total number of gaze points I receive. Thanks in advance for your help!

papr 18 May, 2018, 07:18:36

@user-8944cb recorded frame rate can be lower than expected if the system was experiencing high load during the recording. The index column just indicates which gaze datum belongs to which world frame. To the third question: I would recommend to use the recorded timestamps to calculate time instead of assuming a fixed frame rate.
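
A small sketch of what "use the recorded timestamps" means in practice: derive the effective frame rate and per-frame intervals from the exported timestamps instead of assuming the configured 120/200 Hz. The function names are illustrative; the timestamps would come from the exported timestamp column:

```python
def effective_rate(timestamps):
    """Mean frame rate in Hz implied by a monotonic list of timestamps (seconds)."""
    if len(timestamps) < 2:
        raise ValueError("need at least two timestamps")
    return (len(timestamps) - 1) / (timestamps[-1] - timestamps[0])


def frame_intervals(timestamps):
    """Per-frame durations in seconds; spikes here reveal dropped frames."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]
```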

user-b91aa6 18 May, 2018, 08:32:15

Hi, What's the reason for this error?

user-b91aa6 18 May, 2018, 08:33:06

File "C:\work\PupilLab\pupil\pupil_src\launchables\world.py", line 96, in world
    import glfw
File "C:\work\PupilLab\pupil\pupil_src\shared_modules\glfw.py", line 78, in <module>
    raise RuntimeError('GLFW library not found')
RuntimeError: GLFW library not found

user-b91aa6 18 May, 2018, 08:35:23

In the tutorial, we are only told to copy glew32.dll to pupil_external

user-b91aa6 18 May, 2018, 08:37:55

Question 2: How do I run Pupil from source? Just run main.py?

papr 18 May, 2018, 08:39:46

Be aware that there are two libs: GLEW and GLFW. These are two separate steps!

papr 18 May, 2018, 08:41:20

You run python3 main.py from within the pupil_src folder to start Capture, or python3 main.py player to start Player

user-b91aa6 18 May, 2018, 08:44:01

Yes, I have both libs

Chat image

user-b91aa6 18 May, 2018, 08:45:16

Still doesn't work

Chat image

papr 18 May, 2018, 08:55:34

Sorry, I do not have any experience with the dev setup on Windows. The only thing I know is that we highly recommend to use macOS/Ubuntu for your dev setup.

user-b91aa6 18 May, 2018, 09:10:09

In the tutorial on the website, we are only told to copy the file for GLFW. I think something is missing. Of course Python can't import GLFW then. Can you update the information on the website?

papr 18 May, 2018, 09:14:06

Dynamic libs are required to be stored in pupil\pupil_external so that you do not have to add further modifications to your system PATH.

papr 18 May, 2018, 09:16:12

Nonetheless glfw.py calls find_library('glfw3'). Not sure if this is intended. Maybe the pupil_external path needs to be added to the system PATH?

papr 18 May, 2018, 09:16:27

But as I said, I have no idea about the windows dev setup.

user-b91aa6 18 May, 2018, 09:35:49

Question 1: While running main.py I get the following error. Can't import pupil detectors

Chat image

papr 18 May, 2018, 09:37:48

Aaaah, my bad. There is a hint that you need to execute run_capture.bat instead of python main.py

papr 18 May, 2018, 09:37:59

This will probably fix the other issues as well

user-b91aa6 18 May, 2018, 09:43:42

The same error happens

Chat image

papr 18 May, 2018, 09:45:03

See the box with the question mark and build the pupil detectors explicitly https://docs.pupil-labs.com/#modify-pupil-detectors-setup-py

user-b91aa6 18 May, 2018, 09:48:20

I think that's the problem. It seems that the pupil detector is not built correctly. But what's the reason for this?

Chat image

papr 18 May, 2018, 09:51:50

I get output from the compiler when I run this command. Not sure why you do not get any

user-b91aa6 18 May, 2018, 09:54:01

This is the content in build, is this correct? It seems that build doesn't run setup.py

Chat image

papr 18 May, 2018, 09:55:31

This is build.py. It is different from what would be run by calling python setup.py build

papr 18 May, 2018, 09:57:42

Maybe switch https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/pupil_detectors/setup.py#L120 from quiet=True to quiet=False

user-b91aa6 18 May, 2018, 09:59:22

It doesn't work. This is really weird.

Chat image

user-b91aa6 18 May, 2018, 09:59:58

When it says running build, what is the code doing?

Chat image

papr 18 May, 2018, 10:01:54

This is cython/dist tools territory and not related to our software project

user-b91aa6 18 May, 2018, 10:02:40

When we run python setup.py build, what is the code doing?

user-b91aa6 18 May, 2018, 10:02:44

This is the content of setup.py

Chat image

user-29e10a 18 May, 2018, 10:08:40

@papr I just added a pull request with the coarse pupil detection in C++ ... hope it helps

user-b91aa6 18 May, 2018, 10:09:09

Thanks. What does the pull request with the coarse pupil detection mean?

user-29e10a 18 May, 2018, 10:10:04

@user-b91aa6 never mind, it has nothing to do with your issue 😃

wrp 18 May, 2018, 10:10:32

@user-29e10a software-dev channel has been created for software development related discussions

user-b91aa6 18 May, 2018, 10:11:48

What's the possible reason that I can't do python setup.py build in pupil detectors?

user-b91aa6 18 May, 2018, 10:16:55

I see one image from other people's messages. It seems that my extension is not built. Why might this happen?

Chat image

wrp 18 May, 2018, 10:37:26

@user-b91aa6 let's migrate this to software-dev

user-d9bb5a 18 May, 2018, 13:38:39

Good afternoon. Tell me, please, how to calculate saccades in Pupil?

user-b571eb 18 May, 2018, 13:49:15

(Heatmap) How do I define the surface? I see the A button on the left and 'add surface (No.)' under the plus in the Tracker. Did I miss anything? How do I define the surface? Or do I need to do it in Pupil Capture already? Thanks

wrp 18 May, 2018, 14:01:18

@user-d9bb5a currently Pupil software does not have a saccade classifier built in. @papr can you provide @user-d9bb5a with further information on this topic.

wrp 18 May, 2018, 14:05:30

@user-b571eb You can define surfaces in Pupil Capture or Pupil Player. First turn on the Surface Tracker plugin in Pupil Capture, or Offline Surface Tracker in Pupil Player. Next, ensure that the markers that you want to use to define your surface are visible and detected - you will see detected markers highlighted in the World window in Pupil Capture or the main Pupil Player window. Press a on your keyboard or the A button in the GUI to add a surface. Once you have added a surface, you can then name that surface in the Surface Tracker plugin GUI and set its x and y dimensions.

wrp 18 May, 2018, 14:06:48

If markers are not detected, please ensure the following: 1. That markers have enough of a white border around them 2. min marker perimeter in the Surface Tracker plugin is set such that your marker is detected in the scene. If you have small markers you will need to reduce the default min marker perimeter setting

wrp 18 May, 2018, 14:07:04

Again, you can define surfaces in either Pupil Capture or Pupil Player.

wrp 18 May, 2018, 14:10:14

Now on to heatmaps in Pupil Player. Once you have defined a surface in Pupil Player (or using surfaces defined already in Pupil Capture), ensure that your surface has an X size and Y size defined. Size can be real-world size - the sizes will determine the size of the png image output when you export in Pupil Player and the number of bins for the histogram.

wrp 18 May, 2018, 14:10:20

I hope these notes are helpful

wrp 18 May, 2018, 14:10:33

Please refer to https://docs.pupil-labs.com for more information.

user-959f29 18 May, 2018, 14:26:55

2018-05-18 13:27:05,857 - player - [ERROR] launchables.player: Process Player crashed with trace: Traceback (most recent call last): File "launchables\player.py", line 478, in player File "site-packages\OpenGL\error.py", line 232, in glCheckError OpenGL.error.GLError: GLError( err = 1281, description = b'ung\xfcltiger Wert', baseOperation = glViewport, cArguments = (0, 0, 1280, 720) )

2018-05-18 13:27:05,857 - player - [INFO] launchables.player: Process shutting down.

user-959f29 18 May, 2018, 14:27:28

what does this mean? WIN10 Issue?

user-8944cb 18 May, 2018, 18:21:57

@papr Thanks for clarifying! One last thing about the sampling frequency, and differences between the exported "Pupil Positions" and "Gaze Positions" spreadsheets. I noticed that the number of gaze positions differs from the number of pupil positions for a given second. Additionally, the timestamps within the same second are also different. My question is what I need to do, or how I can match the timestamps to calculate a certain measure from the "Pupil Positions" data (for example pupil diameter) and a time-synced measure from the "Gaze Positions" information? Thank you!

user-959f29 20 May, 2018, 12:06:23

Howdy,

I found this closed thread: #1097

I tried minimizing the Pupil Player window; upon restore the error appears. Might not be as unsystematic as I first thought. Okay. So, a potential workaround is not to minimize the window. But it doesn't seem to be fixed - at least not in the latest release 1.7.42.

user-959f29 20 May, 2018, 12:21:42

is it fixed in the source code?

user-f1eba3 21 May, 2018, 10:42:06

Regarding the notify procedure to send pupil notify.eye_process.should_start.0

user-f1eba3 21 May, 2018, 10:43:14

m.Append ("notify." + data ["subject"]); m.Append (MessagePackSerializer.Serialize<Dictionary<string,object>> (data));

user-f1eba3 21 May, 2018, 10:43:27

Is the first part of the message always not serialized?

user-81072d 21 May, 2018, 16:40:54

I'm looking at buying another headset. Can anyone tell me which sensor is being used for the high-speed world cam?

user-f1eba3 22 May, 2018, 00:30:47

Does it matter on which subport I send the calibration handshaking data? If I send all the calibration data on one subport and then read the data on another, would that influence anything?

mpk 22 May, 2018, 07:26:31

@user-f1eba3 Yes. The first part is just the topic; it's raw. The second part is the payload, msgpack-serialized.

mpk 22 May, 2018, 07:26:58

https://docs.pupil-labs.com/#message-format
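
The two-part format can be sketched in Python, mirroring the C# Append calls quoted above. This assumes the msgpack package is installed; the function name is illustrative:

```python
import msgpack


def pack_notification(payload):
    """Serialize a Pupil notification as a two-frame ZMQ message.

    Frame 1 is the raw (unserialized) topic string "notify.<subject>";
    frame 2 is the msgpack-serialized payload dict, matching the
    message format described in the docs linked above.
    """
    topic = "notify." + payload["subject"]
    return [topic.encode("utf-8"), msgpack.packb(payload, use_bin_type=True)]
```

The resulting frames would then be sent with `socket.send_multipart(frames)` on a zmq socket.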

user-f1eba3 22 May, 2018, 09:41:46

Does it matter on which subport I send the calibration handshaking data?

papr 22 May, 2018, 11:27:58

@user-8944cb Sometimes two pupil positions are used to generate a binocular gaze point. Use the timestamps in the base data column to identify which pupil positions were used for each gaze point
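
For the exported files, that matching can be done by parsing the base_data column. This sketch assumes the cell contains space-separated "timestamp-eyeid" tokens (e.g. "2149.639-0 2149.640-1"); check your export's actual format before relying on it:

```python
def parse_base_data(cell):
    """Split a gaze_positions base_data cell into (timestamp, eye_id) pairs.

    Assumes tokens like '2149.639-0 2149.640-1'. A binocular gaze datum
    lists two pupil positions, a monocular one a single token; the
    timestamps can then be looked up in pupil_positions to fetch, e.g.,
    the matching pupil diameter.
    """
    pairs = []
    for token in cell.split():
        ts, _, eye = token.rpartition("-")
        pairs.append((float(ts), int(eye)))
    return pairs
```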

user-8779ef 22 May, 2018, 15:40:54

Hey guys, has anyone spent the time to open up the Vive and put a camera behind the optics? My lab has been considering giving this a go this summer and it would be a shame not to learn from previous experiences

user-8779ef 22 May, 2018, 15:44:07

@papr Was your decision not to put the cameras behind the optics due to manpower? Or a desire to keep it as a simple snap-in, non-invasive installation? Or is there some other reason that I haven't considered yet and should be aware of?

mpk 22 May, 2018, 16:28:15

@user-8779ef We tried putting it inside. It's not easy to film through the Fresnel lens, and the modifications of the HMD are not reversible.

user-fa3706 22 May, 2018, 16:55:53

Hello, I am trying to implement surface tracking on a driving simulator, and after trying a few times resizing the markers used to mark surfaces, I haven't been able to get it to register the markers. I've been doing it in OS X Preview since I do not have much experience in Python coding. Our set-up is a vehicle cab with monitors surrounding the cab to provide a 360-degree field of view. The monitor directly in front of the cab is about 4.5-5 feet from where the drivers will be sitting, depending on height. I was hoping you guys might have some recommendations.

user-8944cb 22 May, 2018, 18:27:41

@papr Thanks for answering. I have an additional quick question about the reduced frame rate I am getting compared with the selected capture rate - what factors might contribute to the system experiencing high load during the recording, resulting in the reduced frame rate, and is there something I can do to improve this?

papr 22 May, 2018, 20:24:16

@user-fa3706 Try to reduce the min marker perimeter value in the settings to recognize smaller markers.

papr 22 May, 2018, 20:25:06

@user-8944cb Other programs running, your computer simply being too slow, etc... What are your hardware specs?

user-8779ef 23 May, 2018, 02:01:16

@mpk Ok, thanks. Well then, full steam ahead, it seems. Our hope is to design a modified housing for the lenses, cameras, and hotplate that can be put in the helmet. Yes, this will be irreversible

user-e9f4b4 23 May, 2018, 05:14:22

Hi, does anyone know if it's possible to add a hardware trigger to synchronize frame acquisition between cameras? The docs suggest it's not possible, but... maybe it can be done?

user-6ef9ea 12 February, 2021, 20:11:41

Any updates on this? We are trying to assess whether we can use pupil core to measure spike-triggered pupil dynamics and would need the ability to either hardware trigger the cameras, or alternatively get a shutter exposure strobe signal output to send into our acquisition board

user-b116a6 23 May, 2018, 11:00:44

Hello guys, I seem to have a problem, when I try to edit a surface the window freezes and the point does not drag to the position I want it to due to this freeze. I tried with different versions of the Pupil Capture application (1.6 and 1.7) but still no luck. I tried to open the application through the source code as well (1.6 and 1.7) but again the window freezes. My current setup is the Macbook Pro Retina running MacOS High Sierra 10.13.4 with 16GB Ram and 2.2GHz i7 processor. I tried running the applications on other PCs as well (Macbook Pro Retina 10.13.2 and a HP Windows All-in-one PC) with the same versions of applications and the surface editing works perfectly which makes me think that it's my Macbook that has some problem. Can it be a storage problem or something else? Thanks in advance for any help.

user-78dc8f 23 May, 2018, 14:44:55

@papr @wrp Hi all. We are furiously processing data for a conference next week. We've successfully processed multiple participants, but the last 3 are causing problems. We first load up multiple plugins. Everything processes fine. Then we try to set the minimum fixation duration from 300 to 100. Pupil Player starts to reprocess fixations, but halfway through, it halts and then the software crashes. Can you help?

user-f1eba3 23 May, 2018, 15:31:41

Hey

user-f1eba3 23 May, 2018, 15:32:10

Is the position.z = PupilTools.CalibrationType.vectorDepthRadius[0]; in the 2D calibration mode just for specifying something on the immediate plane?

mpk 23 May, 2018, 15:54:18

@user-78dc8f Can you share the logfile from Player after the crash?

user-9b14a1 23 May, 2018, 16:18:35

@mpk Hi, I have a codec problem with Pupil Player on a 10-year-old Intel Core 2 Duo, Windows 10 64-bit. The following is how it looks if I run it from CMD. Might it be different with a different OS? I was recording the folder on my Android mobile, which sometimes runs out of storage. Could that end up in buggy mpeg files that are read by ffmpeg on OS X but not on Windows 10?

player - [INFO] launchables.player: Process shutting down.

C:\Program Files\pupil_player_windows_x64_v1.7-42-7ce62c8>pupil_player.exe
. . .
player - [ERROR] player_methods: No valid dir supplied (pupil_player.exe)
player - [INFO] launchables.player: Session setting are from a different version of this app. I will not use those.
player - [INFO] launchables.player: Starting new session with 'C:\Users\Admin\Desktop\20180408143738803'
. . .
player - [INFO] launchables.player: Session setting are a different version of this app. I will not use those.
player - [ERROR] libav.mov,mp4,m4a,3gp,3g2,mj2: moov atom not found
player - [ERROR] launchables.player: Process Player crashed with trace:
Traceback (most recent call last):
  File "launchables\player.py", line 393, in player
  . . .
  File "shared_modules\audio_playback.py", line 70, in init
  . . .
  File "av\container\core.pyx", line 182, in av.container.core.ContainerProxy.err_check
  File "av\utils.pyx", line 103, in av.utils.err_check
av.AVError: [Errno 1094995529] Invalid data found when processing input: 'C:\Users\Admin\Desktop\20180408143738803\audio.mp4' (libav.mov,mp4,m4a,3gp,3g2,mj2: moov atom not found)

user-c1458d 23 May, 2018, 17:09:07

@papr Looking to buy a Pupil Labs headset. I came here probably a week (?) ago to report I was having difficulties with the web checkout form. It was suggested I email [email removed] I did that. I have not received a response. It shouldn't be this hard to give you my money 😃

user-78dc8f 23 May, 2018, 17:09:34

@mpk how do I do that?

mpk 23 May, 2018, 17:18:10

In the Pupil Player settings dir there is the logfile. Right after the crash it will contain the crash report

user-78dc8f 23 May, 2018, 17:19:35

@mpk player.log?

mpk 23 May, 2018, 17:19:51

I think so

user-78dc8f 23 May, 2018, 17:20:11

@mpk can I attach that here? or send by email?

user-78dc8f 23 May, 2018, 17:20:58

player.log

user-78dc8f 23 May, 2018, 17:21:07

@mpk there you go...

mpk 23 May, 2018, 17:21:37

I'll look at this tomorrow. I'm not in the office now.

user-78dc8f 23 May, 2018, 17:22:05

@mpk ok. thx. Zip me an email with any observations: [email removed]

user-bfecc7 23 May, 2018, 21:17:10

Timestamp question here. I am looking to see the computer clock timestamp of the first frame of my videos, to as many decimal places of a second as possible. Looking in the info.csv file there are 3 different start times: "Start Time", "Start Time (System)" and "Start Time (Synced)". What are the units of System and Synced, and is there any way to calculate/see when the first frame was recorded using this information? Thanks.

mpk 24 May, 2018, 06:21:39

@user-bfecc7 Units are always seconds. Start Time (System) is Unix epoch. Start Time (Synced) is the Pupil internal clock and the one that we use for timestamping.

mpk 24 May, 2018, 06:22:37

If you take the diff of both and add it to the timestamps, you have them in Unix epoch. (Take note that you may lose decimals when doing this without changing type.)

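mpk's diff-and-add step can be sketched as follows. This is an illustrative helper, not Pupil code; it assumes the "Start Time (System)" and "Start Time (Synced)" values from info.csv have already been parsed into floats.

```python
import numpy as np

def to_unix_epoch(timestamps, start_time_system, start_time_synced):
    """Shift Pupil-clock timestamps into Unix epoch seconds.

    timestamps        : iterable of 'Synced'-clock timestamps
    start_time_system : 'Start Time (System)' from info.csv (Unix epoch)
    start_time_synced : 'Start Time (Synced)' from info.csv (Pupil clock)
    """
    offset = start_time_system - start_time_synced
    # float64 keeps sub-millisecond precision even at ~1.5e9 seconds
    return np.asarray(timestamps, dtype=np.float64) + offset
```

float64 carries roughly 15-16 significant digits, so epoch-sized values still resolve well below a millisecond; doing the same addition in float32 would lose decimals, which is the "changing type" caveat mpk mentions.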
user-603c57 24 May, 2018, 09:08:33

Hi, does it make sense to do an offline pupil detection but use gaze from the recording? My guess would be that the gaze points in the video output won't be affected by the offline pupil detection, correct? Vice versa: if I use pupil from the recording and start an offline gaze calibration, gaze points will be computed based on the recorded pupil data. So that workflow would make sense, correct?

user-bfecc7 24 May, 2018, 13:49:33

@mpk Thank you very much. Follow up question: I use Pupil Mobile and stream it over wifi to my computer where I start the recording and save them directly to my computer. Would the Start Time(Synced) be the Computer's internal clock at time of recording or the Phone's internal clock.

mpk 24 May, 2018, 13:59:06

@user-bfecc7 if you turn on pupil time sync (just start the plugin) Pupil mobile will sync with Pupil Capture and you can consider Pupil Capture clock to be used everywhere.

user-9b14a1 24 May, 2018, 14:04:28

SOLVED my crashing Pupil Player:

  File "av\container\core.pyx", line 182, in av.container.core.ContainerProxy.err_check
  File "av\utils.pyx", line 103, in av.utils.err_check
av.AVError: [Errno 1094995529] Invalid data found when processing input: 'C:\Users\Admin\Desktop\20180408143738803\audio.mp4' (libav.mov,mp4,m4a,3gp,3g2,mj2: moov atom not found)

SOLVED it by:
- running it from CMD as Admin
- giving write permissions to the pupil_player application folder
- granting write permissions to the project folder

After that, I got another error:

Traceback (most recent call last):
  File "launchables\player.py", line 393, in player
  File "shared_modules\plugin.py", line 294, in init
  File "shared_modules\plugin.py", line 321, in add
  File "shared_modules\audio_playback.py", line 131, in init
  File "site-packages\pyaudio.py", line 750, in open
  File "site-packages\pyaudio.py", line 441, in init
OSError: [Errno -9996] Invalid output device (no default output device)
player - [INFO] launchables.player: Process shutting down.

SOLVED by - Connecting HEADPHONE to the DELL windows 10 CORE 2 D

user-bfecc7 24 May, 2018, 14:06:10

@mpk And the Pupil Capture gets its clock from the computer the application is running on. Great, thank you very much.

user-c1458d 24 May, 2018, 14:31:10

@here As per a couple of postings here, can anyone official with PupilLabs help me purchase a headset??? The web form is broken and the sales team hasn't responded to my email!

wrp 24 May, 2018, 14:34:05

Hi @user-c1458d apologies for the delay in response. Out of curiosity what browser were you using to submit the web form?

wrp 24 May, 2018, 14:35:29

We try to be quick with responses - please send us another email or let me know your email address via DM so I can sort this out for you

user-c1458d 24 May, 2018, 14:40:09

I can't see the "Country" drop down box in Firefox

user-c1458d 24 May, 2018, 14:40:22

I can complete that field using Chrome, but it hangs on form submission

wrp 24 May, 2018, 14:46:13

Ok, thanks for the feedback. We will run some more tests and try to recreate this behavior.

user-8944cb 24 May, 2018, 19:41:46

Hello, I am looking for a way to synchronize Pupil Capture and a motion capture system (Qualisys Miqus), and to get information on where people are looking. If I understand correctly, there is a difference between triggering them to start recording at the same time and synchronizing them. I found the middleman script by @user-e7102b and @user-dfeeb9, and was wondering if this would be a good solution to trigger the recording of Pupil Capture? Would triggering them at the same time enable us to get information on where people are looking in the motion capture coordinate system in some way? Will greatly appreciate any tips on how I should approach this... Thank you!

user-dfeeb9 24 May, 2018, 19:44:18

hi @user-8944cb. I would like to ask about the temporal sensitivity of your motion capture system and data

user-dfeeb9 24 May, 2018, 19:44:40

using the middleman you will likely end up with some small milliseconds of delay

user-dfeeb9 24 May, 2018, 19:46:00

the best solution would be to use some kind of resynchronisation or time correction like Pupil has natively. we haven't implemented it in our middleman though.

user-8944cb 24 May, 2018, 19:48:06

@user-dfeeb9 thanks for the fast reply! The motion capture can record at a capture rate of up to 640 Hz. Is there any way to account for the delay, meaning will it be constant, or will it increase over time? Where can I find information regarding the time correction? Thank you!

user-dfeeb9 24 May, 2018, 19:51:19

It's been a while since I looked; there are docs in the Pupil GitHub. I'll try to find them in a minute. A 640 Hz rate is certainly not something you want to be using our raw middleman for; you may end up needing a tool fit for purpose. The delays are not consistent and vary with context, but they don't get much worse over time. I haven't tested the middleman as extensively as I would have liked to, but in 20-minute experiments they consistently range from 0-40 ms on average.

user-dfeeb9 24 May, 2018, 19:55:20

@user-8944cb please try taking a look at these, https://github.com/pupil-labs/pupil-helpers/tree/master/network_time_sync

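As a complement to the time-sync docs linked above: a minimal, hedged sketch of estimating the offset between a local clock and Pupil Capture's clock over Pupil Remote (default port 50020, "t" request), assuming the request/reply delay is roughly symmetric. The zmq import is deferred so the pure math helper works without pyzmq installed.

```python
import time

def estimate_offset(t_before, t_pupil, t_after):
    """Midpoint estimate of (local clock - pupil clock),
    assuming the round-trip delay is split evenly."""
    return (t_before + t_after) / 2.0 - t_pupil

def measure_pupil_offset(host="127.0.0.1", port=50020):
    import zmq  # deferred: only needed for the live measurement
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://{}:{}".format(host, port))
    t_before = time.time()
    req.send_string("t")               # Pupil Remote: request current pupil time
    t_pupil = float(req.recv_string())
    t_after = time.time()
    req.close()
    return estimate_offset(t_before, t_pupil, t_after)
```

This one-shot estimate still carries the 0-40 ms class of uncertainty discussed above; for sub-millisecond needs the native Pupil Time Sync protocol from the linked repo is the better route.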
user-8944cb 24 May, 2018, 19:57:31

@user-dfeeb9 Thank you! I will read it.

user-9d7bc8 24 May, 2018, 22:56:57

Hey all. I'm trying to get pupil to run on a Raspberry Pi 3 B+, but I don't have much experience at all with c++. How would I go about building the project from the git repo for the Pi?

wrp 25 May, 2018, 02:21:10

Hi @user-9d7bc8 you will need to look through instructions on building from source here https://docs.pupil-labs.com/#linux-dependencies - You do not need to have experience with C++, but do need to have experience in setting up dependencies and being familiar with developing in linux. Unfortunately I will not be able to give you feedback/tips for RPI - but hopefully the notes in the linux dependencies dev notes will help. Additionally I think this conversation might be best migrated to software-dev ๐Ÿ˜„

user-9d7bc8 25 May, 2018, 02:23:22

Awesome, thanks ๐Ÿ˜€ If I have any more questions, I'll ask there

wrp 25 May, 2018, 02:23:49

Thanks @user-9d7bc8

user-29e10a 25 May, 2018, 11:20:32

Hi, if I use the h264 encoder to record eye videos, which ffmpeg quality preset is used for this? I'm asking so I can convert the non-h264 recordings to the same size and quality as if they had been recorded in h264 in the first place.

user-8779ef 25 May, 2018, 13:45:52

@mpk Any reason there has been no attempt to integrate the 200 Hz cameras into the Vive?

mpk 25 May, 2018, 13:46:15

We are working on it :-)

mpk 25 May, 2018, 13:46:28

Will be an addon too.

mpk 25 May, 2018, 13:46:47

It is a bit tricky with the lens

user-8779ef 25 May, 2018, 13:47:07

Ah, OK. Great. Is there any interest / mutual benefit in working with Jeff Pelz and me on our summer project to move the camera behind the lens?

mpk 25 May, 2018, 13:47:37

Let me discuss this here. I'm not in the office right now.

user-8779ef 25 May, 2018, 13:48:31

Sounds good! ...is the Unity integration totally ready for the 200 Hz swap-out, or will additional work need to be done to the software?

mpk 25 May, 2018, 13:51:03

No software changes necessary

user-8779ef 25 May, 2018, 13:51:08

Beautiful.

user-603c57 25 May, 2018, 17:19:18

Experienced this now multiple times: what to do if one eye has a "constant shadow" (either everything is very light or everything is very dark)? Recorded with Pupil Mobile. Changing camera settings in the preview didn't help, nor did resetting the app or changing the USB cable. It doesn't seem to be lasting, but occurs from time to time. Opened an issue on GitHub: https://github.com/pupil-labs/pupil/issues/1200

Neue_Bitmap.bmp

user-c7a20e 27 May, 2018, 08:48:02

Hi guys, so I'm trying to get Pupil Capture to recognize a camera and it doesn't work with any: not my integrated webcam, not a webcam by Logitech, not 3 different UVC OEM cameras (Raspberry, etc.) ranging from 30 fps to 120 fps, while all of these cameras otherwise work fine. I used Zadig on all of them to change their drivers to "libusbK (v3.0.7.0)" as the docs mention, but nothing. What might be the issue?

mpk 27 May, 2018, 13:29:21

@user-c7a20e this is a driver setup issue. Can you use Linux or Mac? Fewer issues on those platforms...

user-c7a20e 27 May, 2018, 14:20:16

Afraid not...

user-c7a20e 27 May, 2018, 14:20:28

What do you mean though? If you mean using Linux only for updating the camera drivers and then using the cameras in a Windows environment, then that's possible.

papr 27 May, 2018, 18:46:55

No, he meant if you can use Linux/Mac to run the Pupil applications. You cannot use Linux to install windows drivers.

user-c7a20e 27 May, 2018, 19:40:55

Well, I can't. Does this also apply to the python SDK?

papr 27 May, 2018, 19:48:17

You mean running from source if you say SDK? The bundled application uses the same drivers as the software that runs from source. As mpk said, this is a driver setup issue and not related to our application. Did you verify in the Device Manager that the cameras are listed as expected?

user-c7a20e 27 May, 2018, 19:51:11

how should they be listed? It's listed as "usb camera"

papr 27 May, 2018, 19:51:42

They should be listed as libusbK devices

user-c7a20e 27 May, 2018, 19:52:13

yup, they are, named "Usb Camera"s

papr 27 May, 2018, 19:52:21

This means that the manual driver install procedure was not successful yet

papr 27 May, 2018, 19:53:28

Beware that you will not be able to use your camera in a different program, e.g. Skype, as long as you do not roll back the drivers to default

user-c7a20e 27 May, 2018, 19:53:46

Sure. This is the error: "EYE0: The selected camera is already in use or blocked"

papr 27 May, 2018, 19:54:21

Yes, as long as the cameras are not listed in the libusbK category, Capture will not recognize them

user-c7a20e 27 May, 2018, 19:54:36

they are

papr 27 May, 2018, 19:54:57

Ah my bad, I misunderstood.

papr 27 May, 2018, 19:56:05

Then please open Capture, and select the UVC Manager icon on the right of the eye process window and select the cameras. There should be cameras listed, correct?

user-c7a20e 27 May, 2018, 19:57:19

yes, that's the error I get when I try to select my camera from there, in "Activate Source"

papr 27 May, 2018, 20:00:52

Mmh, weird. This is unexpected. I will ask my colleague about it tomorrow. Maybe he has an idea what the issue might be.

papr 27 May, 2018, 20:02:26

But generally we recommend not to use custom cameras on Windows. The whole driver situation is very unstable and we only support automated driver installation for our supported cameras.

papr 27 May, 2018, 20:03:02

Mac and Linux do not have such problems.

user-c7a20e 27 May, 2018, 20:03:23

okay, I'll wait

user-af87c8 28 May, 2018, 16:03:40

Hey! Due to other reasons, we use an old Pupil Labs version (0.9) (not possible to upgrade, sorry) on a Dell Latitude laptop. Unfortunately, during recording, every ~1.5 minutes (± a minute) we have a world-cam-only frame drop of ~500-1000 ms (the eye cameras seem to work fine). Any ideas?

papr 28 May, 2018, 16:04:58

@user-af87c8 Hey there! ๐Ÿ™‚ What plugins do you use? I guess that you are using a custom plugin as well?

user-af87c8 28 May, 2018, 16:05:55

yes, we send parallel port trigger every 2s

user-af87c8 28 May, 2018, 16:06:32

@papr also hey ๐Ÿ˜ƒ

papr 28 May, 2018, 16:09:20

A world-cam only frame-drop indicates a blocking call in the world event loop. This is very likely related to your plugin then ๐Ÿ˜›

papr 28 May, 2018, 16:09:31

Ah wait, minutes, not seconds

user-af87c8 28 May, 2018, 16:10:11

and sometimes for 500 ms, sometimes for 1000 ms. Will check the plugin. Do I get it right: plugins are not started in independent processes?

papr 28 May, 2018, 16:11:48

No, plugins are run within the world loop. This means that if your plugin blocks, the UI and everything within the world window will freeze as well.

papr 28 May, 2018, 16:12:20

If you have any blocking calls you will have to move your code to a background thread

papr 28 May, 2018, 16:14:56

Possible other reasons are system-dependent, e.g. if your OS pages memory

user-af87c8 28 May, 2018, 16:17:54

ok. but I would need to manage the background thread?

user-af87c8 28 May, 2018, 16:18:11

I have an idea of what could block the loop, and also block it in 500ms chunks

papr 28 May, 2018, 16:20:41

Manage the thread in which sense?

user-af87c8 28 May, 2018, 16:20:53

start it, stop it, call it from time to time

user-af87c8 28 May, 2018, 16:22:37

I'm more wondering if I have to do it for myself, or if there is an ecosphere already there

papr 28 May, 2018, 16:23:26

I would recommend to do the following: Have a thread started with a function that has its own loops and runs your serial communication. Then add a mechanism (e.g. a list for messages) that can be used by recent_events to read/write from/to the thread

user-af87c8 28 May, 2018, 16:24:27

Can the thread asynchronously read data from pupil? Or does it need to wait until world is in the right state?

user-af87c8 28 May, 2018, 16:25:16

I did find one thing that could give rise to the issue in the trigger plugin somebody wrote in our group - so I will try fixing that first. The external thread makes a lot of sense as well

papr 28 May, 2018, 16:31:42

If I remember correctly, you were using a thread-based approach before. I recommended switching to the recent_events version, assuming that you would not have any blocking calls. If you can remove the blocking call, you are fine with the current implementation. Else you will have to go for the thread approach. In this case, I would recommend a message-based design: have two queues (simple list objects as plugin attributes), one for reading and one for writing from/to the background thread. Put any message objects into them that you need to communicate to the background thread.

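The pattern papr describes might look roughly like the standalone sketch below: the blocking serial I/O lives in a background thread, while a never-blocking recent_events() only exchanges queue messages. This is not a real Pupil plugin; `send` stands in for whatever blocking serial write you use, and `queue.Queue` is used as the thread-safe stdlib equivalent of the plain lists papr mentions.

```python
import threading
import queue

class SerialBridge:
    """Background-thread message bridge: world loop in, serial I/O out."""

    def __init__(self, send):
        self.to_worker = queue.Queue()    # world loop -> thread
        self.from_worker = queue.Queue()  # thread -> world loop
        self._send = send                 # blocking call, e.g. serial write
        self._running = True
        self._thread = threading.Thread(target=self._work, daemon=True)
        self._thread.start()

    def _work(self):
        while self._running:
            try:
                msg = self.to_worker.get(timeout=0.1)
            except queue.Empty:
                continue
            self._send(msg)                       # blocking happens here only
            self.from_worker.put(("sent", msg))   # acknowledge back

    def recent_events(self, trigger=None):
        # called once per world frame; must never block
        if trigger is not None:
            self.to_worker.put(trigger)
        acks = []
        while not self.from_worker.empty():
            acks.append(self.from_worker.get_nowait())
        return acks

    def close(self):
        self._running = False
        self._thread.join()
```

Even if a single `send` stalls for 500 ms, the world loop keeps turning; the acknowledgment simply arrives a few frames later.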
papr 28 May, 2018, 16:33:17

You can access your plugin's attributes (incl. g_pool!) from the background thread. Threads are not truly concurrent in Python, therefore it should be rather safe to access these attributes directly.

user-af87c8 28 May, 2018, 16:33:40

ah cool, so getting self.g_pool.capture._recent_frame.timestamp (if it exists) should not be a problem

papr 28 May, 2018, 16:33:45

If you run into any concurrency problems I recommend using https://docs.python.org/3/library/threading.html

papr 28 May, 2018, 16:33:57

Correct

user-af87c8 28 May, 2018, 16:34:21

mh. OK, not sure I want to rewrite (or have the time ;-)) but we'll see if what I did already helps

user-af87c8 28 May, 2018, 16:40:36

So, do I get it right: if, in a plugin, I run time.sleep(0.01), I will have no data for 10 ms? Or is there a buffer for the world cam?

papr 28 May, 2018, 16:42:50

@user-af87c8 you are correct. There is no such buffer since the world loop does not move but is being blocked

papr 28 May, 2018, 16:43:27

@user-af87c8 if I remember correctly, you had to add sleeps in order for the serial message to go through

user-af87c8 28 May, 2018, 16:45:49

yes exactly. Ok, I will need to check this. With 60 Hz, this would mean I would drop 1 or 2 frames - not perfect but also not terrible. But I fear the trigger plugin might not be the problem. 80% of the way through the function we write out how long the function took, and it's <1 ms in all cases (and never 500/1000 ms)

user-af87c8 28 May, 2018, 16:48:34

after sending a notify_all, there is no need to sleep - correct?

papr 28 May, 2018, 16:48:49

@user-af87c8 correct

user-af87c8 28 May, 2018, 17:24:57

ok. Looking at the world timestamps, I'm finding my trigger plugin to introduce a constant 40 ms delay (which it should be; looking at np.diff(timestamps)). But the plugin is clearly not the cause of the delays

user-af87c8 28 May, 2018, 17:36:42

Ok, I checked another recording (new Pupil Labs 1.6.x binary, USB-C clip, different Pupil Labs eye tracker). Same phenomenon. We employ a completely different plugin there (zmq), and the commonalities (besides being plugins) are a call to self.g_pool.capture._recent_frame.timestamp and notify_all

user-8fe915 28 May, 2018, 17:38:06

Many eye-frames on the same world-frame

Chat image

user-af87c8 28 May, 2018, 17:39:57

besides, plugin is uncorrelated to delay onset

user-af87c8 28 May, 2018, 17:46:28

and confirmed in another experiment (but same setup as the Pupil Labs 1.6 binary USB-C one). We record with 120 Hz binocular and a 60 Hz world cam

user-af87c8 28 May, 2018, 17:46:57

We use USB3.0 throughout up until the clip

papr 28 May, 2018, 17:54:19

@user-8fe915 it is expected to get multiple Pupil positions for a single world frame if you had frame drops

user-af87c8 28 May, 2018, 17:59:09

exactly, just wanted to add a visual ๐Ÿ˜‰

user-bd0840 29 May, 2018, 09:40:09

hello everyone! i have just updated the pupil software to the latest version (1.7-42), but now my hmd calibration routine (based on the one in hmd-eyes) is broken ๐Ÿ˜ฆ pupil capture is bailing out with

world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "launchables\world.py", line 445, in world
  File "shared_modules\accuracy_visualizer.py", line 133, in on_notify
  File "shared_modules\accuracy_visualizer.py", line 147, in recalculate
  File "shared_modules\accuracy_visualizer.py", line 187, in calc_acc_prec_errlines
IndexError: boolean index did not match indexed array along dimension 0; dimension is 1340 but corresponding boolean dimension is 1675

i checked recent update to hmd-eyes, but couldn't find any relevant changes. any ideas what could be the problem?

user-bd0840 29 May, 2018, 09:40:55

here's the full log from pupil capture: https://gist.github.com/skalarproduktraum/441382e90c26ba80874564901e79c2dd

user-58d5ae 29 May, 2018, 14:15:56

Hello, I am trying to implement eye tracking in my HoloLens app, but when I do seemingly the same operations as in the example given, it does the eye tracking but does not compensate for any head movement. Any idea what I could have forgotten?

user-8944cb 29 May, 2018, 16:37:18

Hello, I have a question about accuracy in 2D mode - with respect to what point is the accuracy calculated? In other words, how is the error in degrees of visual field calculated if there is only X,Y coordinate information? Thank you!

user-f68ceb 30 May, 2018, 10:39:36

Dumb question: for usability testing/research of websites on desktop computers - is one camera sufficient? Like the 2D approach? Or do I need two cameras for pupil tracking? Thanks for the help.

user-58d5ae 30 May, 2018, 10:44:25

I didn't try with only one camera, but the dual one is really very precise and it might be overkill for that kind of ergonomics research

papr 30 May, 2018, 12:47:10

@user-8944cb we use the camera intrinsics to deproject the 2d gaze and reference positions into 3d vectors within the world camera space. Afterwards we can calculate the pair-wise cosine distance, resulting in the angular error. See https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/accuracy_visualizer.py#L157-L211 for details

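The deprojection step aside, the error computation papr links reduces to a cosine-distance calculation. A hedged numpy sketch of that math (not the actual accuracy_visualizer code), operating on already-deprojected, row-wise 3D vectors:

```python
import numpy as np

def angular_error_deg(v_gaze, v_ref):
    """Pair-wise angular error in degrees between 3D gaze and
    reference direction vectors (one vector per row)."""
    # normalize each row to unit length
    v_gaze = v_gaze / np.linalg.norm(v_gaze, axis=1, keepdims=True)
    v_ref = v_ref / np.linalg.norm(v_ref, axis=1, keepdims=True)
    # row-wise dot product = cosine of the angle; clip for safety
    cos = np.clip(np.sum(v_gaze * v_ref, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))
```

Accuracy then falls out as the mean of these per-sample angular errors over the validation samples.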
papr 30 May, 2018, 12:59:37

@user-f68ceb The problem with the monocular headset is that it does not cover all eye positions, i.e. if the eye camera is mounted on the right side and the subject looks to the left, it will be difficult to detect the pupil from this extreme angle. Looking straight ahead should be covered well enough though. 2d is more precise than 3d but is more prone to slippage. You should be fine with 2d detection/mapping if your subject only looks at a screen.

user-c7a20e 30 May, 2018, 13:02:10

any updates on my libusbK issue? (""EYE0: The selected camera is already in use or blocked"")

papr 30 May, 2018, 13:09:05

@user-c7a20e You run the bundle and not from source, correct?

user-c7a20e 30 May, 2018, 13:09:17

yeah

user-c7a20e 30 May, 2018, 13:09:46

by source you mean git repo or running a python script in the folder?

papr 30 May, 2018, 13:10:33

Correct

papr 30 May, 2018, 13:10:44

Under which names are the cameras listed in Capture?

user-c7a20e 30 May, 2018, 13:12:31

"USB Camera"

papr 30 May, 2018, 13:13:05

And there is only one? Which camera is active in the world window?

user-c7a20e 30 May, 2018, 13:13:33

theres also the notebooks webcam labelled as "unknown"

user-c7a20e 30 May, 2018, 13:14:03

there is no camera in the world window. That said Ive also tried using that camera for the world camera but got the same error

papr 30 May, 2018, 13:23:55

I consulted my colleague. His guess is an implicit incompatibility between the drivers and pyuvc/uvc. Please understand that the driver situation on Windows is very tricky and that we cannot support hardware that we do not have access to. I am confident that you would not have this issue on Mac/Linux. Therefore my recommendation is to try these operating systems. You can easily install Ubuntu on a USB stick, boot from it, and test the Linux Capture bundle from the live USB stick.

user-c7a20e 30 May, 2018, 13:25:40

That's not a practical solution. I can ask the manufacturer to provide a different firmware/driver with the UVC camera. Will that work, and what driver should I ask him to use? libusbK?

user-c7a20e 30 May, 2018, 13:26:24

By the way, this has been the case with every single camera I've tried. Unless nobody has gotten this working on Windows, I think the issue can be pinned down

mpk 30 May, 2018, 13:27:47

@user-c7a20e our video uvc backend is pretty fine-tuned and will not work with all uvc-compatible hardware. It's all open source; feel free to build it from source and modify it to make it work with your HW. Please understand that it's not possible for us to debug and support without having access to this HW.

user-c7a20e 30 May, 2018, 13:28:52

I can get them to send you one as well. The main thing that got me interested in this OEM camera is that it can do 330 fps at QVGA. I wanted to study how much of an improvement that could provide to saccade tracking

user-c7a20e 30 May, 2018, 13:29:36

(but again I cannot get any camera to work at all)

mpk 30 May, 2018, 13:29:59

@user-c7a20e as a hint, your camera needs to support mjpeg compression or you need to modify the uvc backend to support other video modes.

user-c7a20e 30 May, 2018, 13:30:18

(yeah its mjpeg and usb2)

mpk 30 May, 2018, 13:30:41

then it should work and you should try linux.

user-c7a20e 30 May, 2018, 13:32:14

Dunno what to say. The ones they sent me are configured to run at 120 fps VGA, mjpeg stream... (but can also run at 330 fps QVGA)

papr 30 May, 2018, 14:40:13

For everyone that followed @user-af87c8 's discussion in regard to frame drops during recordings: https://github.com/pupil-labs/pupil/issues/1204

user-1fb6d9 31 May, 2018, 06:22:16

Hi, this is Lakshman. I'm trying to build a gaze-contingent experiment where the trial proceeds to the next one when the subject fixates for some duration on a target among distractors. I want to use PsychoPy for this but couldn't find the right sources. Can anyone help with this? I used PyGaze, which has an AOI function that achieves my objective, but the Pupil Labs tracker isn't compatible with PyGaze. I'll be grateful for any help.

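No PsychoPy-specific answer appears in the log, but the AOI-dwell check at the heart of such an experiment is simple. A hedged sketch, where the field names ('norm_pos', 'duration') are modeled on Pupil's fixation messages and should be verified against your actual data:

```python
def fixation_in_aoi(fixation, aoi, min_duration_ms):
    """True if a fixation lands inside the AOI and lasts long enough.

    fixation        : dict with 'norm_pos' (x, y in [0, 1]) and
                      'duration' (ms) -- hypothetical field names,
                      check your own message format
    aoi             : (x_min, y_min, x_max, y_max), normalized coords
    min_duration_ms : dwell threshold in milliseconds
    """
    x, y = fixation["norm_pos"]
    x_min, y_min, x_max, y_max = aoi
    inside = x_min <= x <= x_max and y_min <= y <= y_max
    return inside and fixation["duration"] >= min_duration_ms
```

In a live experiment you would feed this from a subscription to Pupil Capture's fixation messages over the network API, and advance the trial as soon as it returns True.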
user-16d429 20 May, 2021, 23:17:18

Did it work? I just joined the group and did a search for "gaze-contingent" and your message from 2018 popped up

user-bfecc7 31 May, 2018, 14:01:27

@wrp @user-006924 I saw that both of you are mentioning the Pixel 2 and its capability to run Pupil Mobile. All of my work with Pupil Labs over the past several months has used a Pixel 2 and Pupil Mobile, and I've had no issue. If it's not running on the phone, my guess is bad cables. Hope this helps.

papr 31 May, 2018, 14:02:11

@user-bfecc7 Thank you very much! This is indeed helpful.

user-006924 31 May, 2018, 14:19:47

@user-bfecc7 thank you for your comment. I'm going to get the cables @wrp suggested and hopefully the problem will go away.

user-f68ceb 31 May, 2018, 14:38:06

Hi @user-006924 The cable will fix the problem. I have a Nexus 5X with Android 7.0.

user-9b14a1 31 May, 2018, 14:56:29

Hi, my Pupil Player 1.7.42 (on Lubuntu 18.04) is exiting with the known error message: "Session setting are a different version of this app." My Android info.csv tells me data format v 1.4 ... But what can I do to load my recording?

I started processing the same recording on my Mac with Pupil Player 1.7.42. There I didn't get that error. It seems to me that mobile recordings are bound to the Mac and Windows platforms. Is that right?

user-8944cb 31 May, 2018, 15:51:30

Hi @papr, regarding your response from last week - I am sorry that I missed it. I was asking about the reduced frame rates after importing a recording into Pupil Player, compared to the frame rate that was selected (120 Hz for the world camera). The raw data videos before importing into Pupil Player are at the right frequency. Trying this on different computers, I get a final frame rate between 70-90 on laptops/stationary computers, and with Pupil Mobile (running on one Note 8 phone) a frame rate between 20-30 fps. The best frame rate I got (90 when 120 was specified; the frame rate is inconsistent throughout the recording) was when recording on a computer with the following hardware specs: Processor: Intel(R) Core(TM) i7-6700HQ [email removed] 2.59 GHz; installed memory (RAM): 16.0 GB; system type: 64-bit operating system, x64-based processor. I tried running it with no other programs open.
1. Do you have suggestions as to why this might be happening?
2. Is there any preferable computer for recording?
3. How does the inconsistent frame rate affect the exported gaze and pupil positions?
Thank you so much for your help!

user-f1eba3 31 May, 2018, 18:40:22

Chat image

user-f1eba3 31 May, 2018, 18:40:37

where is the calibration_routines folder ?

user-f1cf33 31 May, 2018, 19:50:13

Hi, we recently exported a video where throughout the recording there are two fixations - one appears accurate with respect to where the individual would be looking, and the other is definitely not correct. Everything else about the recording appears relatively normal, with the eyes being tracked, etc. Is there any reason why this would happen, and any way to prevent it in the future? Thank you!

user-f1eba3 31 May, 2018, 21:15:42

uvc: Turbojpeg jpeg2yuv: b'Invalid JPEG file structure: two SOI markers'

user-f1eba3 31 May, 2018, 21:15:55

Does anyone know what that means ?

End of May archive