core


user-0a2ebc 01 March, 2019, 02:58:44

Hi Friends

user-0a2ebc 01 March, 2019, 02:58:55

I am a newbie

user-0a2ebc 01 March, 2019, 02:59:38

Could anyone share some dummy data so that I can use it to play around with Pupil Player?

wrp 01 March, 2019, 04:11:54

@user-08f2c2 In Pupil Capture the most recent calibration will be used to estimate gaze positions. If you want to use multiple calibrations, I suggest that you start recording first, calibrate, continue recording, and then do other calibrations as desired. You can then choose which calibration to use for sections of the recording post-hoc in Pupil Player.

wrp 01 March, 2019, 04:12:51

@user-08f2c2 please see offline calibration topic within this section of the docs: https://docs.pupil-labs.com/#data-source-plugins

user-f27d88 01 March, 2019, 04:17:54

Hello @user-0a2ebc, you may have a look at https://drive.google.com/file/d/0Byap58sXjMVfQnAteVlQUlBJRjA/view?usp=sharing - it is an old demo for marker tracking. Here is also an old recording (from v0.7): https://drive.google.com/file/d/0Byap58sXjMVfRzhzZFdZTkZuSDA/view?usp=sharing

user-0a2ebc 01 March, 2019, 06:03:35

@user-f27d88 thanks a lot for this data. I had some issues with Pupil Player: when I loaded the data, the window disappeared :)

user-0a2ebc 01 March, 2019, 06:04:15

I will try with your data @user-f27d88

user-f27d88 01 March, 2019, 06:04:50

You're welcome.

user-c1923d 01 March, 2019, 14:36:52

Quick question, are the gaze vectors filtered?

papr 01 March, 2019, 14:37:24

Hi @user-c1923d In Player, yes, by confidence.

user-c1923d 01 March, 2019, 14:37:58

Ah sorry, I meant if it's raw data or there is any type of filtering going on

user-c1923d 01 March, 2019, 14:38:17

Like a median filter, kalman filter, etc

papr 01 March, 2019, 14:39:24

Oh, I understand. Not explicitly.

papr 01 March, 2019, 14:42:31

The 3d gaze vectors are implicitly filtered by the 3d model. 2d pupil observations are per-frame observations (and therefore deterministic) while 3d pupil positions are based on 2d observations being applied to the current 3d model. Different 3d models therefore generate different pupil data. This sounds obvious but plays an important role during offline pupil detection.

user-bd800a 01 March, 2019, 14:44:01

Hello, I've been using the script for batch exporting (works great, I just added the annotations too, thank you). However, I sometimes get an error from msgpack:

File "C:\Python34\lib\site-packages\msgpack\fallback.py", line 459, in _read_header
    raise ValueError("%s exceeds max_bin_len(%s)" % (n, self._max_bin_len))
ValueError: 1201675 exceeds max_bin_len(1048576)

user-bd800a 01 March, 2019, 14:44:30

If I open the file in pupil player then the folder becomes readable

papr 01 March, 2019, 14:44:43

@user-bd800a I have not encountered this error before and also do not know how to interpret it.

user-c1923d 01 March, 2019, 14:45:59

@papr Makes sense. Is it just the same effect as from the original Swirski model (i.e., the eye-center-through-pupil-center constraint), or do you also have anything additional in there that would affect gaze precision?

papr 01 March, 2019, 14:48:55

@user-c1923d As far as I know, the current implementation is equivalent to the original Swirski model. We are working on a successor version that will be able to correct for corneal refraction, improving gaze depth and pupil diameter estimations.

user-bd800a 01 March, 2019, 14:51:32

@papr I just checked: it is due to the latest version of msgpack; downgrading to 0.5.6 works fine
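
For anyone hitting the same ValueError later: a sketch of an alternative to downgrading, assuming a newer msgpack release, is to raise the limit explicitly when unpacking (the 2**31 - 1 value is an arbitrary generous choice, not an official recommendation):

import msgpack

# Load a pupil_data file with the default 1 MiB bin limit lifted
with open("pupil_data", "rb") as f:
    data = msgpack.unpack(
        f,
        raw=False,              # decode strings as str instead of bytes
        max_bin_len=2**31 - 1,  # avoid "exceeds max_bin_len" on large blobs
    )
print(data.keys())  # typically pupil_positions, gaze_positions, notifications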

user-c1923d 01 March, 2019, 14:54:08

@papr ok, thanks!

papr 01 March, 2019, 14:56:45

@user-bd800a that is good to know! Thank you!

user-c1923d 01 March, 2019, 14:56:58

Any idea when the successor is coming out? (I assume that's the one from Kai, really cool stuff)

papr 01 March, 2019, 14:59:26

You are right. Next week we will publish v1.11. An initial version of Kai's implementation is planned for v1.12. It will not include the ability to update the model position on the fly though.

user-c22e3a 01 March, 2019, 18:21:47

@papr thanks to you I can retrieve the gaze position in MATLAB when I'm using monocular calibration, but what I can retrieve is either the normalized position with respect to the scene or a 3d position with respect to the scene camera. Is there a way to get the position of the gaze with respect to a tracked surface online? We need it for an on-screen experiment.

papr 01 March, 2019, 18:30:52

@user-c22e3a Mmh, yes, technically this is possible, but you would have to modify the receiving Matlab code to subscribe to the surfaces topic.
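
For reference, a minimal Python sketch of the flow papr describes (a Matlab client would make the equivalent zmq requests; tcp://127.0.0.1:50020 is Pupil Remote's default address and the field names follow the surface datum format):

import zmq
import msgpack

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote
req.send_string("SUB_PORT")           # ask for the IPC subscription port
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:{}".format(sub_port))
sub.setsockopt_string(zmq.SUBSCRIBE, "surfaces")  # surface-mapped gaze events

while True:
    topic, payload = sub.recv_multipart()
    surface = msgpack.unpackb(payload, raw=False)
    # gaze in surface-normalized coordinates, only while the surface is tracked
    for gaze in surface.get("gaze_on_surfaces", []):
        print(surface["name"], gaze["norm_pos"], gaze["on_srf"])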

user-a50aee 02 March, 2019, 14:11:00

Does Pupil Labs work with a webcam?

papr 02 March, 2019, 15:37:40

@user-a50aee Assuming that you want to know if the Pupil Software supports remote eye tracking: No, it does not.

user-2be752 02 March, 2019, 19:27:47

hi! what does the yellow color mean in the visualization of the eye in pupil player/algorithm?

user-f27d88 03 March, 2019, 00:30:16

@user-2be752 Would you mind adding a screenshot of the yellow color you are talking about?

user-21d960 03 March, 2019, 20:21:03

This extender doesn't even work

user-21d960 03 March, 2019, 20:22:22

No matter how I adjust it, it's always too low

user-21d960 03 March, 2019, 20:22:37

Chat image

user-2be752 03 March, 2019, 23:31:43

@user-f27d88

Chat image

user-0a2ebc 04 March, 2019, 02:14:07

Hi friends,

Does anyone have a sample of an eye tracker report for a client, please?

user-0a2ebc 04 March, 2019, 02:15:18

We would like to get an idea of how to present and create an eye tracker report

user-0a2ebc 04 March, 2019, 02:15:27

Thanks in advance

user-6997ad 04 March, 2019, 02:36:19

@papr sorry for the late reply. continuing convo from 2/22. Capture version 1.8.26. I tried auto mode (with and without auto exposure priority), and it seems to not affect the exposure differential between the eyes. How are the eyes being illuminated? Could there be an issue with one of the lights?

wrp 04 March, 2019, 04:45:39

Hi @user-21d960 could you please explain what you mean by "too low"? You can angle the eye camera up by rotating it within the ball joint to look "up" towards the eye when using this extender.

user-21d960 04 March, 2019, 05:06:35

It is at max angle currently

wrp 04 March, 2019, 05:06:55

can you send a screenshot of the eye image

wrp 04 March, 2019, 05:07:00

as reference

user-96755f 04 March, 2019, 10:55:31

Hello. There is no AOI tool in Pupil at the moment, is there? I'm currently working with surface markers, but what I need is to count fixations within a smaller area. How can I do this without resorting to manual methods?

papr 04 March, 2019, 10:59:31

@user-96755f Surface Tracking is the only AOI method at the moment. What kind of AOIs are you looking for?

user-96755f 04 March, 2019, 11:02:44

We are using surface tracking to box off a flyer that we'll show on a PC monitor. We have a target in that flyer and we want to know how many times people watch and re-watch it.

user-96755f 04 March, 2019, 11:07:11

So the heatmap is a great help in showing us how much people watch our target, but I don't know if we can get the number of fixations present in that specific area. I don't know if there is any tool to set boundaries and extract this kind of data from them.

user-f81efb 04 March, 2019, 11:45:26

Hello, I am facing a problem with a decreased frame rate of the video. It is only 20 Hz in world_viz.mp4 (instead of the 60 Hz that was selected, with 480x640 resolution on the RealSense R200 world camera), and accordingly the original world.mp4 video is fast-forwarded and only a third of the expected length. Thus we still have 1 video frame per 12 gaze data points. I am not sure why this is happening. Could you please help me?

user-f81efb 04 March, 2019, 11:45:55

@papr

user-f81efb 04 March, 2019, 11:48:45

@wrp

wrp 04 March, 2019, 11:58:54

@papr can you respond to @user-f81efb when you are available?

papr 04 March, 2019, 12:14:48

@user-96755f If the offline fixation detector is running, there should be a fixations_on_surface*.csv file for each surface after export. Also, you can edit surfaces. This way you can set up surface markers around the screen but have the actual surface be just around the target.
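
As a sketch of the counting step after such an export (the file and column names here are assumptions based on a typical Player export; check the header of your own CSV):

import pandas as pd

# "Flyer" stands in for your surface's name
df = pd.read_csv("exports/000/surfaces/fixations_on_surface_Flyer.csv")
on_target = df[df["on_srf"] == True]  # keep only fixations inside the surface
print("fixations on target:", len(on_target))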

papr 04 March, 2019, 12:15:30

Hello @user-f81efb Could you please share the recording with [email removed]

wrp 04 March, 2019, 12:23:56

@user-96755f you might also find this tutorial useful: https://github.com/pupil-labs/pupil-tutorials/blob/master/03_visualize_scan_path_on_surface.ipynb

user-f27d88 04 March, 2019, 13:27:36

@user-2be752 "The yellow color are reflections. I think the paper mentions them. In this case, the eye image seems to be overexposed"- Thanks to @papr

papr 04 March, 2019, 13:28:21

This is the paper that @user-f27d88 is referring to: https://pupil-labs.com/blog/2014-05/pupil-technical-report-on-arxiv-org/

user-96755f 04 March, 2019, 13:44:49

@wrp @papr thank you so much! I'll test it soon

user-f81efb 04 March, 2019, 13:46:31

@papr yes I will share it now

user-f81efb 04 March, 2019, 13:46:44

Thanks @wrp

wrp 04 March, 2019, 13:46:54

Welcome!

user-f81efb 04 March, 2019, 14:02:37

I have sent the data

user-f81efb 04 March, 2019, 14:02:39

😃

user-21d960 04 March, 2019, 17:21:06

Hey i want to buy a dedicated computer to run the capture software

user-21d960 04 March, 2019, 17:21:10

is this good enough?

user-21d960 04 March, 2019, 17:21:11

https://ca.pcpartpicker.com/list/Jzp7sZ

user-21d960 04 March, 2019, 17:21:25

is there any specific things i should get or monitors?

papr 04 March, 2019, 17:22:22

This looks very good!

user-21d960 04 March, 2019, 17:31:00

are there certain monitors people use for testing or does a standard gaming 144hz one work

papr 04 March, 2019, 17:31:49

That should work. I personally use a DELL U2515H Display at work.

user-21d960 04 March, 2019, 17:52:44

What CPU and GPU do you use?

papr 04 March, 2019, 17:54:51

Chat image

user-21d960 04 March, 2019, 17:56:04

Wait, what? I use an Intel 7700Q laptop with a 1050 Ti and it only gets 30 fps on the eye camera

papr 04 March, 2019, 17:56:17

This machine has an above-average amount of memory since I need to run virtual machines to create the bundled applications.

papr 04 March, 2019, 17:56:50

The graphics card does not play much of a role for pupil detection

papr 04 March, 2019, 17:57:05

I don't know this cpu model

user-21d960 04 March, 2019, 17:57:38

Chat image

papr 04 March, 2019, 17:58:05

Looks fine to me indeed

user-21d960 04 March, 2019, 17:58:13

it only gets 30 fps :/

user-21d960 04 March, 2019, 17:58:21

on 192x192

user-21d960 04 March, 2019, 17:58:29

which should be 200

user-21d960 04 March, 2019, 17:58:44

I think that's why I am getting bad detection

papr 04 March, 2019, 17:59:05

Less than 60fps is indeed bad for the 3d model.

papr 04 March, 2019, 17:59:50

And yes, you should be able to see 200Hz, especially since you only use a monocular headset, if I remember correctly?

user-21d960 04 March, 2019, 18:00:00

Yes thats correct

papr 04 March, 2019, 18:00:26

Which OS do you use? Did you check your CPU monitor? Is the CPU at maximum?

user-21d960 04 March, 2019, 18:00:43

Windows 10

user-21d960 04 March, 2019, 18:00:50

CPU is around 80%

papr 04 March, 2019, 18:02:14

From where do you get the fps number?

user-21d960 04 March, 2019, 18:04:05

Top left of the screen

user-21d960 04 March, 2019, 18:04:17

Chat image

papr 04 March, 2019, 18:06:55

That is in the world or the eye window?

papr 04 March, 2019, 18:07:21

Both windows have separate fps graphs.

user-21d960 04 March, 2019, 18:08:27

Eye

papr 04 March, 2019, 18:10:08

Can you connect your headset, and disable the detection and mapping mode in the world window's general settings? By how much does the cpu usage drop in the cpu monitor? Do the eye fps increase?

user-21d960 04 March, 2019, 18:15:03

Chat image

user-21d960 04 March, 2019, 18:15:08

yep

user-21d960 04 March, 2019, 18:15:13

but there are lag spikes

user-21d960 04 March, 2019, 18:15:24

it dips to 2 fps for a few milliseconds

user-21d960 04 March, 2019, 18:15:28

then back to 200

papr 04 March, 2019, 18:15:53

How long are these lags? These lags are a separate issue though.

user-21d960 04 March, 2019, 18:16:07

only for a split second

user-2be752 04 March, 2019, 18:17:24

@user-f27d88 is this because i'm testing outdoors, and the sunlight is hitting the eye?

papr 04 March, 2019, 18:18:16

@user-2be752 Yes, this is very likely. Did you try to enable the auto exposure mode in the eye window?

papr 04 March, 2019, 18:18:35

@user-21d960 mmh ok. This is not too concerning for now.

papr 04 March, 2019, 18:19:13

@user-21d960 So now we know that the pupil detection is the bottleneck on your machine and not the access to the eye images

user-21d960 04 March, 2019, 18:20:08

The tracking is decent now, but since I am calibrating on a screen, it covers only a small part of my total field of view and only works for that distance. How can I calibrate so that I can walk around and it is still accurate?

papr 04 March, 2019, 18:22:20

@user-21d960 Check out the single marker calibration. It displays a single marker which the subject needs to fixate while moving their head. This allows you to calibrate a flexible area.

papr 04 March, 2019, 18:22:58

If you do not need realtime data, you can leave the pupil detection disabled in Capture and use the Offline Pupil Detection in Player to do the analysis post-hoc.

user-21d960 04 March, 2019, 18:25:04

so for example

user-21d960 04 March, 2019, 18:25:06

Chat image

user-21d960 04 March, 2019, 18:25:11

i did the single marker calibration

user-21d960 04 March, 2019, 18:25:17

I am looking at the E in DELL

user-21d960 04 March, 2019, 18:25:20

but it's a bit off

user-21d960 04 March, 2019, 18:25:25

how can I make this more accurate?

papr 04 March, 2019, 18:27:57

This is already very accurate. What does the accuracy in the Accuracy Visualizer plugin say?

papr 04 March, 2019, 18:28:45

There is always an expected estimation error.

papr 04 March, 2019, 18:30:05

The orange lines show that this error is not fixed either and depends on the region.

user-21d960 04 March, 2019, 18:33:50

Hm, that's about 2 degrees of accuracy

user-21d960 04 March, 2019, 18:34:00

the maximum accuracy is 0.6 degrees?

user-21d960 04 March, 2019, 18:35:07

also what does this CPU number refer to

user-21d960 04 March, 2019, 18:35:09

Chat image

papr 04 March, 2019, 18:42:00

0.6 is very, very low. Less than 1 degree is already considered very accurate. Please be aware that accuracy is not the only metric. Check out this paper for a thorough evaluation: https://www.biorxiv.org/content/10.1101/536243v1

user-21d960 04 March, 2019, 20:10:46

ok

user-21d960 04 March, 2019, 20:11:25

@papr Thank you for your help. It's much better now, and with a chinrest it will be under a degree, as I have around 2 degrees currently

user-21d960 04 March, 2019, 20:11:44

do you use a chinrest for screen calibrating and testing?

papr 04 March, 2019, 22:27:14

@user-21d960 No, I do not

user-21d960 04 March, 2019, 22:36:28

interesting

user-21d960 04 March, 2019, 22:36:39

It seems that for screen testing, if I move my head, it becomes very inaccurate

user-2ee432 05 March, 2019, 08:13:18

Hi guys. Can you please let me know if I can safely flash the firmware of the Pupil cam to use it as a USB webcam with other program I have and then flash back when I need to use it with Pupil Capture?

papr 05 March, 2019, 08:16:24

@user-2ee432 Can you clarify if you mean firmware or the Windows drivers?

user-2ee432 05 March, 2019, 08:17:55

I think driver. The thing you set in Zadig.

papr 05 March, 2019, 08:18:17

Yes, these are drivers. Yes, you can change the drivers.

user-2ee432 05 March, 2019, 08:19:32

Cool. What if I want to change the resolution to 400x400 at 120Hz for my other program - is that a firmware thing or can it be set with the same firmware?

user-2ee432 05 March, 2019, 08:20:38

(as opposed to 200x200 at 200Hz)

papr 05 March, 2019, 08:20:40

Higher frame rates (>30Hz) are likely to not be available with other drivers

user-2ee432 05 March, 2019, 08:21:32

I can get another USB camera to work at 120 fps but it's not a global shutter camera, so I want to use the Pupil Labs cam

user-2ee432 05 March, 2019, 08:30:48

do you maybe mean worse latency instead of fps?

papr 05 March, 2019, 08:39:33

No, I mean fps. But I do not have a lot of experience with other drivers, or drivers in general. You can try your luck. Just do not be surprised if things do not work as expected 🙂

user-2ee432 05 March, 2019, 08:40:03

As long as I can flash back, it's okay for me to try

user-f81efb 05 March, 2019, 11:11:27

Hello @papr did you find some time to look at the data that I had sent yesterday?

papr 05 March, 2019, 11:13:57

@user-f81efb not yet

user-f81efb 05 March, 2019, 12:05:56

ok

user-86c436 05 March, 2019, 12:38:10

Hey everyone, I need to detach my HTC Vive Eye Tracking Add On from one HMD to be able to attach it on another. Are there any tips on how to detach the rings from the lenses rather than "brute force"? Any help is very appreciated, thanks in advance 😃

papr 05 March, 2019, 12:40:05

@user-755e9e do you have any tips for @user-86c436 ?

user-c22e3a 05 March, 2019, 12:58:11

Hi, we're thinking of buying the Pupil Labs add-on for a VR headset and I wanted to inquire about some points before that, if possible. First, is it possible for subjects to wear glasses while using a VR headset equipped with the add-on (I fear that the glasses frame would be in contact with the eye cameras)? Also, I want to know if you have an SDK to retrieve information in Unity3D (C#) in order to use it within the scene. Thanks in advance.

papr 05 March, 2019, 13:46:03

Hi @user-c22e3a Even if the glasses do not touch the cameras, they might obstruct the eye camera's view. Please check out the vr-ar channel and project. It contains a Unity plugin to access the data remotely.

user-755e9e 05 March, 2019, 13:59:29

Hi @user-86c436 , to remove the add-on you have to pull it from the bottom of the camera (you will still have to apply some force), being careful not to break the LED ring.

user-86c436 05 March, 2019, 14:02:08

@user-755e9e thanks a lot, that worked fine 😃 I first didn't dare to do that because everything looks so fragile. Yeah, about breaking the LED ring... I actually did break one of the thin plastic rings on my first try 😔 .

user-755e9e 05 March, 2019, 14:08:39

@user-86c436 is the LED ring still working?

user-86c436 05 March, 2019, 14:33:28

@user-755e9e well, in the Service app I can see pictures of both eyes - so I think yes?

user-755e9e 05 March, 2019, 14:42:21

@user-86c436 yes, good to hear!

user-86c436 05 March, 2019, 14:47:25

@user-755e9e definitely 😃 Thousand thanks for your support!

user-755e9e 05 March, 2019, 15:29:23

you're welcome @user-86c436 🙂

user-888056 05 March, 2019, 15:29:41

hello, are there any tips to increase accuracy? I have tried multiple calibration techniques but the detection is really offset

user-888056 05 March, 2019, 15:31:00

Chat image

user-888056 05 March, 2019, 15:31:14

I am looking at the red dot...

papr 05 March, 2019, 15:33:15

@user-888056 Please make a recording during the calibration and send it to [email removed] We might be able to give tips. Steps:
1. Start Capture
2. Start recording
3. Calibrate
4. Look at some example targets, e.g. your red dot
5. Stop recording
6. Upload the whole recording folder to e.g. Google Drive
7. Share with data@pupil-labs.com

user-888056 05 March, 2019, 15:33:35

ok!

user-888056 05 March, 2019, 15:40:11

If I want to use the print-out markers

user-888056 05 March, 2019, 15:40:21

should I print them on A4 as they are?

papr 05 March, 2019, 15:42:08

That depends on the distance from which you want to show them to the subject.

user-888056 05 March, 2019, 15:42:30

So for recordings "in the wild"

user-888056 05 March, 2019, 15:42:41

how should we use them?

user-888056 05 March, 2019, 15:43:46

Is the question too open?

papr 05 March, 2019, 15:43:55

Are you asking about the procedure or how big they are supposed to be?

user-888056 05 March, 2019, 15:44:34

both? 😄

papr 05 March, 2019, 15:45:22

Checkout this example: https://drive.google.com/open?id=1qf17stN-ssDVLGA-KgAvD7nNVpcb4Gov This could give you an idea.

user-888056 05 March, 2019, 15:46:17

Thank you. Yes, I have watched this video and I noticed that the markers were smaller

user-888056 05 March, 2019, 15:46:30

so I need to adjust the size of the marker

user-888056 05 March, 2019, 15:46:48

based on some expectation of the distance? Of the field of view?

user-888056 05 March, 2019, 15:47:33

Since this is going to be in a building, could I use the surface markers on walls?

papr 05 March, 2019, 15:47:54

The markers can be bigger. Bigger is better for detection in this case.

papr 05 March, 2019, 15:51:16

Yes, you can place them wherever you want. Make sure that there is good contrast between the white and black areas.

user-2be752 05 March, 2019, 19:03:59

@user-f27d88 can I set this auto-exposure mode offline? Or does it have to be set during the recording?

papr 05 March, 2019, 20:01:33

@user-2be752 this needs to be set before the recording

user-2be752 05 March, 2019, 20:21:08

oh... so I guess too late

papr 05 March, 2019, 20:22:36

@user-2be752 is the pupil detection that bad? What confidence values do you get?

user-2be752 05 March, 2019, 20:49:58

@papr they drop below .5 at some point, and the validation (I told them to follow my finger on 9 points, to see how good the calibration is ) shows that it's pretty bad

user-286246 05 March, 2019, 23:32:42

I'm new here. Has anyone been able to use this library to track pupil responses (movement, size) with a standard web camera with an IR light source? Sorry if this has been asked before.

user-f27d88 06 March, 2019, 05:10:41

@user-286246 "You can not use the webcam really unless your eye was right up next to it with IR light and IR bandpass filter a standard webcam (unmodified) would not work - needs to be IR", you may have a look at the DIY section in https://docs.pupil-labs.com/#diy

user-41c874 06 March, 2019, 09:20:21

@papr I will also follow the same suggestion you gave to zou_al of sending a calibration recording to the email/drive. I am currently performing analyses of accuracy and precision with different methods to decide which one to use. I am not convinced that the accuracy (as I judge it visually) is going to be good enough for our purposes. I should be able to share this with you soon. 😃 I do manual marker calibration, manual marker calibration with a chin rest, and then the single marker calibration, and validate each with dots shown on a surface. I extract the surface coordinates and then use those to perform my analyses. I will provide details of each file and how I calculate accuracy and precision, and ask for input on that as well.

papr 06 March, 2019, 09:21:48

@user-41c874 Please be aware that calibrations do not stack but overwrite each other. During the validation, you will only see the estimation based on the single marker calibration.

user-41c874 06 March, 2019, 09:23:08

Yes ! I perform new validations after every calibration protocol I follow. Thanks! 😃

user-41c874 06 March, 2019, 09:25:24

I will also share my analyses plots studying different calibration methods once I am completely done collecting data and analysing. This might help other users also.

papr 06 March, 2019, 09:26:08

That would be great!

papr 06 March, 2019, 09:27:08

@user-41c874 Have you seen https://www.biorxiv.org/content/10.1101/536243v1 ? It might be related to your research

user-41c874 06 March, 2019, 09:50:45

Ah! That looks very useful for my work ! I hadn't seen this before. Thanks a lot !! 😃

user-2ee432 06 March, 2019, 09:59:43

Hi guys. I'm considering purchasing the VR add-on for my HTC Vive, but I have a question regarding eye tracking vs gaze tracking. I believe I will need to calibrate the eye position/rotation relative to the screen by having the user view points popping up on each edge of the screen in my VR program, and then use the rotation angle of the eyeball to know which pixel(s) the eye is focused on each frame, correct? I believe the Pupil source code does "2d eye tracking" too, but I don't know what those 2d coordinates represent, what they could be used for, and how they differ from "gaze tracking". For my use case I need to know where on the screen or field of view the eye is focused.

papr 06 March, 2019, 10:33:04

@user-2ee432 Please reach out to the vr-ar channel for this question.

user-2ee432 06 March, 2019, 10:33:17

ok

papr 06 March, 2019, 13:30:33

@user-f81efb Hey, I had a look at your recording. This is a graph of the recorded FPS over time (open the image in your browser if the legend does not show up on Discord). It shows that the recorded fps vary a lot. This is most likely due to high cpu usage/high system load during recordings. In these cases, Capture starts dropping frames to keep up.

Do you have this issue with all of your recordings? Can you run a recording in Capture and check if the cpu and fps graphs (top left in the world window) confirm this?

Chat image
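
A graph like this can be reproduced from any recording; a minimal sketch, assuming the recording folder contains the standard world_timestamps.npy:

import numpy as np
import matplotlib.pyplot as plt

ts = np.load("recording/world_timestamps.npy")  # one timestamp per world frame
fps = 1.0 / np.diff(ts)                         # instantaneous frame rate
plt.plot(ts[1:] - ts[0], fps)
plt.xlabel("time since recording start [s]")
plt.ylabel("recorded FPS")
plt.show()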

user-888056 06 March, 2019, 15:50:07

hello, when using the mobile, how do i do the calibration?

papr 06 March, 2019, 16:11:30

@user-888056 You record the procedure and run the calibration post-hoc using the Offline Calibration plugin in Player.

user-e0772f 06 March, 2019, 16:42:45

Hello! I'm subscribed to the gaze position ZMQ topics and I'm receiving three different topics: 'gaze.3d.01', 'gaze.3d.0' and 'gaze.3d.1'. What is the difference between these topics? I'm asking because I thought the information in these topics was the same, but the timestamps are not in order between the different topics.

papr 06 March, 2019, 16:47:50

@user-e0772f .01. indicates a binocular gaze point (merged from two pupil positions), while .0. and .1. are monocular gaze points.

papr 06 March, 2019, 16:49:07

The data fields also have subtle differences.
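
A short sketch of telling the three apart on the wire (standard Network API setup; the topic-string parsing assumes the layout exactly as described above):

import zmq
import msgpack

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote
req.send_string("SUB_PORT")
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:{}".format(req.recv_string()))
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.3d.")  # all three gaze topics

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.unpackb(payload, raw=False)
    eyes = topic.decode().split(".")[2]  # "01" = binocular, "0"/"1" = monocular
    print(eyes, datum["timestamp"], datum["norm_pos"])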

user-888056 06 March, 2019, 17:15:34

@papr although both my Capture and my mobile are on the same WiFi, Capture does not seem to connect to the mobile. Should it be automatic? Is there something I need to set up? (thank you for all your answers so far!)

user-2c338d 06 March, 2019, 17:40:40

Howdy! New user here. Having problems with the Player (only recently)... a black box opens with no code, and the grey box for the data folder does not fully open - just the outline. I re-installed the software - no change. Running Windows 10 (PC less than 1 yr old). Any ideas on a fix are greatly appreciated!!

user-06c1e0 06 March, 2019, 23:03:18

@papr could you please share some of the tips you had for @user-888056 on this thread, regarding accuracy of pupil data?

user-06c1e0 06 March, 2019, 23:03:54

I’ve also been having a lot of trouble with doing calibrations out in the wild (in a car)

user-f27d88 07 March, 2019, 04:24:20

@user-2c338d Which Player version are you using? Would you mind adding a screenshot to illustrate your issue?

user-54376c 07 March, 2019, 08:31:55

Hey Guys. I'm trying to use the fingertip calibration. I'm running macOS with NVIDIA GPU. The world camera frame rate drops to < 5fps, which indicates that HW acceleration is not working, right?

papr 07 March, 2019, 08:36:54

@user-54376c The bundled application does not support HW acceleration. If you want to run with HW acceleration you need to install the dependencies (including pytorch with CUDA) and run the application from source.

papr 07 March, 2019, 08:37:30

@user-2c338d Please try deleting the user_settings_* files in the pupil_player_settings folder

user-54376c 07 March, 2019, 08:52:55

Thanks. Unfortunately there won't be HW acceleration for me, as I'm already on macOS Mojave (10.14) and NVIDIA doesn't get their CUDA drivers released. (https://devtalk.nvidia.com/default/topic/1042520/driver/-when-will-the-nvidia-web-drivers-be-released-for-macos-mojave-10-14-/71)

user-a69044 07 March, 2019, 10:54:41

Hi @papr I am having trouble calibrating my Intel RealSense D415 with the pupil tracker. I have tried multiple different methods and constantly get an increasing offset as I move my gaze away from the center towards the left/right. However, the system works perfectly when using the Pupil Labs world cam. Does this have anything to do with some hard-coded values for this stock world cam? I did notice the stock world cam is heavily distorted, so there may be some hard-coded intrinsic values that I am not aware of. Is there anything you are aware of that I should do to fix this?

papr 07 March, 2019, 10:57:19

@user-a69044 Yes, this might be due to the camera intrinsics used. Please run the Camera Intrinsics Estimation plugin/procedure with the D415 selected as world camera.

user-a69044 07 March, 2019, 11:17:27

awesome, thanks @papr will give that a go

user-b2766f 07 March, 2019, 14:37:10

I am going to be recording some data this weekend using Psychtoolbox in Matlab. I am going to use our Pupil device as well, but the problem is the Pupil device is brand new to us. I can capture the Matlab current System Time (likely using the now() command) when events occur. Before I even bother, is there a way to be able to somehow sync the timestamps after the fact?
I would only need to sync a single timestamp - from there I can find the difference in time between events from the Matlab output, and then find the corresponding difference in the Pupil timestamps. At least, I hope that I can...

Edit: Perhaps I could watch the Player output video until the video frame that the first event happens. I get the Matlab system time of that event. Then I could find the Pupil timestamp associated with that frame of the video. If I assign 0 to that time in both outputs, then the timestamp outputs start at the same baseline, at least. I should be able to work from there...
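
One possible route, sketched in Python (Matlab can make the same request over zmq): Pupil Remote answers a plain "t" request with Capture's current clock, so sampling both clocks back to back yields an offset that maps Pupil timestamps to system time.

import time
import zmq

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote default port

req.send_string("t")                  # ask Capture for its current time
pupil_time = float(req.recv_string())
system_time = time.time()             # sample both clocks back to back

offset = system_time - pupil_time     # system time = pupil time + offset
print("clock offset:", offset)
# later: any recorded Pupil timestamp ts maps to system time as ts + offset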

user-1603a2 07 March, 2019, 14:43:20

Hello, my left eye tracker camera has a focus problem. It is always a bit blurred and darker, and it also seems to not recognize the pupil (or it recognizes it but lags, jumps out of the pupil and goes back in), which creates artifacts on my baseline... When changing the camera settings there is no difference, or it becomes worse than before. Could anyone help me fix this?

user-87fec3 08 March, 2019, 00:01:49

If I have recorded videos using Pupil Capture, is there a way for me to go back and get the camera intrinsic parameters for that file? I am recording videos and then going back to analyze the files separately. Additionally how much distortion do the lenses of the camera cause? If I remember correctly, Pupil Capture does not use this for 2D position but does for 3D position. Why is this the case?

user-f27d88 08 March, 2019, 02:15:10

@user-1603a2 Did you follow the guides at https://docs.pupil-labs.com/#pupil-headset-adjustments and https://docs.pupil-labs.com/#4-calibrate for adjustment and calibration?

wrp 08 March, 2019, 03:15:12

@user-1603a2 200hz eye cameras cannot be focused. Please send an image from the left and right eye cameras so that we can give you concrete feedback. You can either send them here or to info@pupil-labs.com so we can further diagnose

user-42b39f 08 March, 2019, 08:11:56

Has anybody solved a ghost issue with the cameras (a warning to install pyrealsense2 for use of Intel RealSense D400 series cameras) and an audio issue with audio capture from the built-in microphone (no libav.avfoundat)? I am using macOS 10.14.3 with Capture 1.11.4, 200Hz cameras with a world camera. I tried to move back to previous Capture versions but no change; it looks like my OS version is the issue... Some months ago, everything was OK with my hardware.

user-42b39f 08 March, 2019, 12:25:42

update: having no image was just a trivial cable issue. The connection looked good but in fact it was not. I still have the warning with pyrealsense2. However, the audio input issue is still there.

user-42b39f 08 March, 2019, 12:31:13

I don't know how to make my FFMPEG recognize the device:

[AVFoundation input device @ 0x7fcd43504d20] AVFoundation video devices:
[AVFoundation input device @ 0x7fcd43504d20] [0] Pupil Cam2 ID1
[AVFoundation input device @ 0x7fcd43504d20] [1] FaceTime HD Camera
[AVFoundation input device @ 0x7fcd43504d20] [2] Pupil Cam2 ID0
[AVFoundation input device @ 0x7fcd43504d20] [3] Pupil Cam1 ID2
[AVFoundation input device @ 0x7fcd43504d20] [4] Capture screen 0
[AVFoundation input device @ 0x7fcd43504d20] AVFoundation audio devices:
[AVFoundation input device @ 0x7fcd43504d20] [0] Built-in Microphone : Input/output error

user-2968b9 08 March, 2019, 13:26:51

Hi, we're currently having an issue with the new version of Pupil Capture, where the application closes after performing a manual marker calibration, this happens as soon as the calibration finishes. Has this been reported? And is there a way to stop it from happening?

user-f27d88 08 March, 2019, 14:18:29

@user-2968b9 Could you tell me how to reproduce it?

user-2968b9 08 March, 2019, 15:58:23

@user-f27d88 Sure, so we have 2 instances of Pupil open, streaming from the mobile application to the desktop application. We are only using the eye camera on one of these instances. We select the manual marker calibration method, then run the calibration using the sheet, usually getting between 9-12 points. We then stop it via the stop marker; as soon as it finishes, the application closes. We get the error "Pupil capture has closed unexpectedly". We are using a MacBook Pro when this happens. If you need any further info, please let me know.

user-2be752 08 March, 2019, 16:37:19

@user-f27d88 @papr I have now tried to set the auto-exposure mode in Pupil Mobile to see if I can decrease the impact of light on the cameras (I'm recording outdoors), but it says that my camera does not support such a mode.

user-2be752 08 March, 2019, 21:36:43

Also, another question: so far I have been analyzing my eye tracking data using Pupil Player, since I'm doing offline calibration. Is there any way I can automate this process? I have large amounts of data collected and doing it manually is very time consuming.

user-e45dc9 08 March, 2019, 22:18:09

Hi everyone, we are piloting our new Pupil Labs system, and I'm wondering if anyone has used Pupil Labs to track ocular torsion responses?

user-85ba9b 09 March, 2019, 09:38:42

Hi @user-e45dc9 I haven't used it, as I am a newbie, but ocular torsion responses would be amazingly interesting to study. Do the glasses work in the dark?

wrp 09 March, 2019, 09:44:39

@user-85ba9b yes the eye cameras do work in the dark as they capture in IR and eye region is illuminated with IR

user-85ba9b 09 March, 2019, 09:45:59

Excellent... from a brief look at the website, the only option for simulating a dark environment would be the adaptation kit for the HTC Vive

user-85ba9b 09 March, 2019, 09:46:03

Is that correct?

user-85ba9b 09 March, 2019, 09:46:27

I.e., all the other options are glasses, which wouldn't give someone a dark environment

user-85ba9b 09 March, 2019, 09:47:04

I would like to be able to record eye movements in a dark environment without turning off lights

user-85ba9b 09 March, 2019, 09:47:40

I’d be interested in an open source version of goggles if it exists or is available

user-85ba9b 09 March, 2019, 09:47:55

If anyone knows of any or can recommend pls do!

user-85ba9b 09 March, 2019, 09:48:01

😀

wrp 09 March, 2019, 09:49:31

@user-85ba9b the Vive add-on could work for you. Alternatively we (Pupil Labs) could supply you with cable harness and eye cameras so you can prototype a setup that works for your research requirements

user-85ba9b 09 March, 2019, 09:52:01

Awesome thanks @wrp I need to look into options... ideally I would be able to open and close the front - so I could simulate dark environment and light environment easily

user-85ba9b 09 March, 2019, 09:52:15

Thanks for the feedback 👍

wrp 09 March, 2019, 09:53:08

Welcome. If you have any further questions on hardware feel free to get in touch via email

user-85ba9b 09 March, 2019, 10:02:58

ok to which email pls @wrp ?

wrp 09 March, 2019, 10:04:39

info@pupil-labs.com

user-85ba9b 09 March, 2019, 10:04:52

thanks

user-0eef61 10 March, 2019, 21:02:31

Hi, I have the HTC Vive Pro add-on and I have created a VR application in Unity. If anyone has experience with the Pupil Labs HTC Vive add-on, it would be very helpful if you could tell me the steps to call the eye tracker's functions from Unity C# scripts. Or are there any projects already on GitHub using Unity and Pupil Labs?

user-f09f8c 11 March, 2019, 09:44:33

Hi devs, the sidebar buttons in the Player as well as Capture Mode become invisible on a frequent basis.

In the Player it happens when I press Player from Recording or Gaze From recording. In Capture it is when hitting UVC Manager.

I can fix it by making the screen smaller/bigger, but I have to do that every time after I hit any of those buttons. Is this a known bug?

wrp 11 March, 2019, 10:55:43

@user-f09f8c please could you make an issue for this on https://github.com/pupil-labs/pupil - please include also your OS/OS version

user-ca2a2b 11 March, 2019, 14:15:16

Hi guys, looking for some help with calibration for golf putting. What if I want to calibrate for both the area of the hole and the ball? I'm having difficulties getting both calibrated. Any ideas?

user-e7102b 11 March, 2019, 21:00:09

Hi @papr , I've been working with a set of Batch Export scripts that user @user-bd800a kindly shared with me last week (https://github.com/PierreYvesH/batchExport). These scripts work for data recorded using more recent versions of Pupil Capture, but not for my data, which were predominantly recorded using an earlier version (v1.5.12). It seems I have two options to adapt the script for my needs: 1) add the "Incremental_Legacy_Pupil_Data_Loader" into the script to load the old-style data, or 2) add "update_methods.py" to convert the old-style data. Option 1 seems like it should be straightforward, but as a relatively inexperienced Python user I have no idea how to do this. Option 2 would also be straightforward if I could get pyglui to install properly... however, I've tried this on mac and linux machines and ran into issues with building and installing libuvc. I feel like I've hit a wall. Do you have any suggestions for how best to proceed? Thanks!

user-2be752 11 March, 2019, 23:19:47

Hi, is there any way to detect head turns? For example, if you wanted to detect when the subject is moving their whole head to the left or to the right. Thanks!!

user-4ef865 11 March, 2019, 23:35:47

Hi - we are about to purchase a mobile device for our eyetracking glasses, but don't want to spend a great deal of money. Has anyone used theirs successfully with the Motorola Moto G6? Or alternatively, can anyone recommend a phone that isn't expensive and works with the glasses?

user-2be752 11 March, 2019, 23:36:18

@user-4ef865 I use the Motorola and works fairly well

user-4ef865 11 March, 2019, 23:37:19

Thanks @user-2be752 !

user-5c2b34 11 March, 2019, 23:42:53

Hi everyone! We are trying to use the Pupil Labs eye tracking device for our school project. For that we need to be able to do some processing on the live video feed we are getting from the cameras. Does anyone know how to do it? I've looked everywhere in the app and so far it seems like I can only record and save the video, but I don't know how we can feed it into our Python script.

user-5c2b34 12 March, 2019, 05:27:53

would it be possible with OpenCV? Does PupilLabs have some kind of integration with it?
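
A sketch of one possible route, assuming Capture's Frame Publisher plugin is enabled and set to BGR format; the frames then arrive over the same Network API as gaze data and can be handed straight to OpenCV:

import zmq
import msgpack
import numpy as np

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote
req.send_string("SUB_PORT")
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:{}".format(req.recv_string()))
sub.setsockopt_string(zmq.SUBSCRIBE, "frame.world")  # scene camera frames

while True:
    topic, payload, img = sub.recv_multipart()
    meta = msgpack.unpackb(payload, raw=False)
    frame = np.frombuffer(img, dtype=np.uint8).reshape(
        meta["height"], meta["width"], 3)  # BGR image, OpenCV-ready
    # ... process frame with cv2 here ...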

user-2be752 12 March, 2019, 06:38:47

Hi, is there any way to detect head turns?

user-f09f8c 12 March, 2019, 11:00:27

@user-4ef865 I have just managed to get it working with the Samsung S7 edge so that might be an option for you too.

user-0c583a 12 March, 2019, 15:20:24

Hello, everyone. I'm at a research university in the States (Chicago, to be exact) and we're looking at Pupil Labs for a one-shot experiment with a non-standard configuration. Hoping someone will be able to answer a few questions for us. All of this is in the context of a pupil dilation experiment only, with no gaze tracking at all.

user-0c583a 12 March, 2019, 15:22:57

Question 1: Is it possible to use the software and some pupil-facing cameras with a mount of our own, rather than the 3d-printed frames? We're being asked to keep monetary expenditures as low as possible, substituting our own labor and expertise. I'm concerned that the software might be taking advantage of the known geometry of the frames, though.

user-0c583a 12 March, 2019, 15:28:15

Question 2: We believe we understand the purpose of the IR diode substitution in the Do-It-Yourself kit, i.e., to improve contrast. But we are wondering about the possibility of using an external source of IR with unmodified web cams.

user-ac3779 12 March, 2019, 18:21:11

Hello guys, how can I convert a Pupil EPOCH timestamp to a readable time? I could have used a normal epoch converter, but it is stated on the Pupil website that the Pupil epoch is different from the normal Unix epoch.
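
One common approach, sketched under the assumption that the recording's info.csv exposes "Start Time (System)" (Unix epoch) and "Start Time (Synced)" (Pupil clock); the numbers below are placeholders:

from datetime import datetime

start_system = 1552406520.123  # "Start Time (System)" from info.csv (placeholder)
start_synced = 2305.417        # "Start Time (Synced)" from info.csv (placeholder)
offset = start_system - start_synced  # shift from Pupil clock to Unix epoch

pupil_ts = 2310.001            # any exported Pupil timestamp (placeholder)
print(datetime.fromtimestamp(pupil_ts + offset))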

user-b64d72 12 March, 2019, 18:33:31

Has anyone converted pupil_data to a text file and worked their way through extracting the relevant parameters from it, without exporting csv files from the Player?
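
A minimal sketch of that direct route, assuming a legacy msgpack-encoded pupil_data file and the old datum field names (adjust to your recording version):

import csv
import msgpack

with open("pupil_data", "rb") as f:
    data = msgpack.unpack(f, raw=False, max_bin_len=2**31 - 1)

# dump a few pupil datum fields to a csv without going through Player
with open("pupil_positions.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["timestamp", "confidence", "diameter"])
    for p in data["pupil_positions"]:
        writer.writerow([p["timestamp"], p["confidence"], p["diameter"]])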

user-3b1e99 12 March, 2019, 23:31:32

Hello, we just bought the Pupil Labs eye tracker but seem to be running into a minor problem. How do we know when manual calibration has worked successfully? Thanks!

wrp 13 March, 2019, 05:51:53

@user-3b1e99 you will get accuracy and precision estimates after calibration

user-42b39f 13 March, 2019, 09:05:03

Hello, has anyone encountered difficulties with macOS Mojave and the Audio Capture plugin in the latest Capture software? I can't use this plugin anymore. I think it is OS dependent, related to the new TCC security system of Mojave, but I can't find a solution.

user-9bc047 13 March, 2019, 12:01:31

Hello everyone! I am having an issue with Pupil Capture: basically, whenever I turn both eyes on (0,1), the video of one of the two eyes (0) very often suddenly stops for a few seconds or twitches and then restarts running. I checked the fps and CPU indices in Capture and in TASK MANAGER > Details, and whenever it stops it says: "Deferred Procedure Calls and Interrupt Service Routines". I TRIED: changing the CPU affinity (1 to 5 for one eye and all the rest for the other one) and setting the priority to High for both eyes. I didn't try multithreading. Can anyone help me? Thank you in advance.

user-8779ef 13 March, 2019, 15:01:33

Hey folks, any way to simulate data capture using Pupil Capture? For example, can I load in eye images via movie file?

user-9d6943 13 March, 2019, 16:06:40

protected

wrp 14 March, 2019, 04:57:57

@user-8779ef You can load video files as a source in Pupil Capture. Select the backend manager > Video File Source and then navigate to the directory and file.

wrp 14 March, 2019, 05:11:31

@user-42b39f I will try to reproduce your observed audio behavior on macOS Mojave today. Thanks for the report.

wrp 14 March, 2019, 05:26:11

@user-42b39f I can reproduce this behavior on Mojave both with built-in mic and external mic

wrp 14 March, 2019, 06:52:40

@user-42b39f can you try restarting Pupil Capture with default settings after accepting the permission request from Pupil Capture/macOS to allow microphone access? This enabled me to resolve this behavior/issue.

user-95608a 14 March, 2019, 07:23:28

Hi! I recently bought a monocular Pupil headset. I downloaded Pupil Capture and was able to start it, but I am not able to see the pupil as a red dot as explained. I got a pop-up screen with red and green circles only. I tried to calibrate and started recording, but an empty recording of the eye camera was observed. Please help me.

wrp 14 March, 2019, 07:29:44

@user-95608a Welcome to the chat 👋 - In the eye window you should be able to see your eye. Can you confirm this? You may need to adjust the headset to position the camera correctly. To give more concrete feedback you could send a screenshot of the eye window.

user-95608a 14 March, 2019, 07:32:25

no am not able to see the eye

wrp 14 March, 2019, 07:32:53

Hi @user-95608a can you take a screenshot of the eye window and share it here

wrp 14 March, 2019, 07:32:57

so we can provide you with feedback

user-95608a 14 March, 2019, 07:33:04

yes i will

user-95608a 14 March, 2019, 07:35:28

Chat image

wrp 14 March, 2019, 07:37:42

@user-95608a thank you for the screenshot. Based on what I can see, it looks like the eye camera needs to be manually adjusted. It looks like it may be looking at the skin on your cheek and not your eye. Please see: https://docs.pupil-labs.com/#pupil-headset-adjustments

user-95608a 14 March, 2019, 07:39:12

okay will check the adjustments

user-95608a 14 March, 2019, 07:39:25

thank you

wrp 14 March, 2019, 08:12:41

@user-95608a has this been resolved?

user-95608a 14 March, 2019, 09:54:32

I can see my eye in the eye window now. I calibrated and then tried to do a recording. It recorded data which I have to analyse further. I actually want to have someone read a text and capture the eye movements. Once recording has started, can I open the document to be read on the system? Please let me know whether my understanding is right or not.

wrp 14 March, 2019, 11:16:42

Hi @user-95608a I think I understand your question. You are trying to understand where a participant is looking while reading text in a book/magazine/on screen. Is this correct? If so, you might want to look at surface tracking: https://docs.pupil-labs.com/#surface-tracking

wrp 14 March, 2019, 11:17:30

Once you have made a recording, you can play back the recording using Pupil Player and can build up visualizations and perform "light weight" analysis: Please see: https://docs.pupil-labs.com/#pupil-player

wrp 14 March, 2019, 11:17:33

I hope this is helpful

user-434a86 14 March, 2019, 12:25:44

Hi, is there any possibility to hear the audio track when you click frame by frame through a video in Pupil Player? I need to identify a cue sound, after which I analyze the eye tracking data. However, I don't get an audio signal if I click frame by frame through the video.

wrp 14 March, 2019, 13:53:25

@user-434a86 audio is only played during real-time playback, not while frame stepping or scrubbing

wrp 14 March, 2019, 13:53:55

Can the audio visualization waveform help you identify the onset of the cue?

user-f09f8c 14 March, 2019, 15:23:01

One of the eyes is upside down on the eye cam (both on mobile and Capture) , does that affect the recording?

wrp 14 March, 2019, 15:26:43

@user-f09f8c one eye camera sensor is physically upside-down. It doesn't affect the recording or gaze estimation. You can flip the image in the Pupil Capture software if you like.

user-f09f8c 14 March, 2019, 15:30:29

I see, I'm glad it is supposed to be that way and not something I messed up 😉

user-d9affa 14 March, 2019, 16:20:11

Hi everyone. I'm currently working on a research project with pupil mobile where I want to send some basic triggers and start/stop recording annotations via another android app. Sadly the provided docs focus on a python implementation of ndsi, which makes sense if streaming to pupil capture. I would require "offline" recording via the pupil mobile and hence I am looking for an android/java way to send those commands. Any help is appreciated.

user-2be752 14 March, 2019, 19:26:24

Hi guys, what is the difference between the 3d pupil detection and the 3d calibration in Pupil Player?

user-8779ef 14 March, 2019, 20:37:01

@wrp It looks like you can load scene videos, but not eye videos into capture. Am I wrong?

user-8779ef 15 March, 2019, 00:17:13

I would have thought i could load a recording folder (scene+eyes).

papr 15 March, 2019, 06:54:09

@user-8779ef you can run a video in the eye process in the same way as in world. Just change the backend to Video File and drop the video onto the window. There is no support for synchronized playback between all three processes though.

user-dd52c0 15 March, 2019, 08:20:01

Heyo! I have a question concerning a corrupted recorded session from Pupil Capture. We had the notebook running on battery power and unfortunately weren't able to stop the recording properly before it ran out of power. No data files have been written; just the world and the two eye videos are there - but corrupted. I already tried to repair them with several tools, obviously not successfully. ffmpeg, for example, reports the moov atom missing. Does anyone have experience with repairing the video files?

user-42b39f 15 March, 2019, 09:44:12

@wrp I tried but still have the same issue, and I did not get the permission prompt. I also tried to reset the TCC database with tccutil. I got the permission prompt and checked that microphone access was enabled. Capture is enabled but I can't get access. I installed the OverSight utility to monitor access to the microphone. As soon as I select the built-in microphone in the plugin, it becomes active but then inactive almost immediately. I also tried with a USB wireless microphone (Samson Stage PXD1). I get no error but the recording is bad. Please note that both mics (built-in and external) work well for recording with QuickTime. Hope it helps...

papr 15 March, 2019, 09:46:31

@user-42b39f What do you mean by "the recording is bad"? Audio quality wise?

papr 15 March, 2019, 09:47:09

@user-dd52c0 I am sorry but I do not think that this recording can be recovered.

papr 15 March, 2019, 09:49:52

@user-2be752 3d pupil detection generates 3d pupil data from the eye videos. The 3d calibration uses 3d pupil data (eye video coordinate system) and reference locations (world video coordinate system) to learn a gaze mapping function that maps 3d pupil data (eye video coordinate system) to 3d gaze data (world video coordinate system). Detection and calibration/mapping are two steps of our gaze estimation pipeline.

user-42b39f 15 March, 2019, 10:15:49

@papr Yes audio quality with the external microphone. If needed I can send a short sample.

papr 15 March, 2019, 10:20:28

@user-42b39f No need, thanks. While QuickTime uses a native API to access microphones, we use pyaudio, a Python wrapper for cross-platform audio recording. My first guess is that there is something wrong with the mic config by pyaudio. Could you try a wired USB microphone?

user-42b39f 15 March, 2019, 10:34:22

@papr no I don't have any. I will try with pyaudio to record from both sources.

user-e7102b 16 March, 2019, 21:36:25

Hi @papr , I've been working with a set of Batch Export scripts that user @user-bd800a kindly shared with me last week (https://github.com/PierreYvesH/batchExport). These scripts work for data recorded using more recent versions of Pupil Capture, but not for my data, which were predominantly recorded using an earlier version (v1.5.12). It seems I have two options to adapt the script for legacy recordings: 1) add the "Incremental_Legacy_Pupil_Data_Loader" into the script to load the old-style data, or 2) add "update_methods.py" to convert the old-style data. Option (1) should be straightforward, but it is unclear to me how this chunk of code should interact with the other code. Option (2) seems even more straightforward, but I'm unable to install pyuvc (I run into this issue: https://github.com/pupil-labs/pyuvc/issues/55). I feel like I've hit a wall. Do you have any suggestions for how best to proceed? Thanks!

user-42b39f 16 March, 2019, 21:48:43

@papr just for info, the built-in microphone works like a charm with a simple pyaudio script. I can't figure out how to progress.
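
For reference, a simple pyaudio test like the one described might look like this (default input device, a few seconds recorded to a wav file):

import wave
import pyaudio

RATE, CHUNK, SECONDS = 44100, 1024, 3

pa = pyaudio.PyAudio()
stream = pa.open(format=pyaudio.paInt16, channels=1, rate=RATE,
                 input=True, frames_per_buffer=CHUNK)
frames = [stream.read(CHUNK) for _ in range(int(RATE / CHUNK * SECONDS))]
stream.stop_stream()
stream.close()
pa.terminate()

with wave.open("mic_test.wav", "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(pyaudio.get_sample_size(pyaudio.paInt16))
    wf.setframerate(RATE)
    wf.writeframes(b"".join(frames))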

papr 16 March, 2019, 22:06:03

@user-42b39f we will try to reproduce and fix the issue. Feel free to look at the Audio Capture plugin code and start debugging.

papr 16 March, 2019, 22:06:54

@user-e7102b pyuvc should not be required if you use the upgrade procedure code directly from your code.

user-36fbf3 16 March, 2019, 22:27:17

Are there compatibility issues with Win7? I can never get the drivers to install such that libUSBK shows up in device manager

user-e7102b 17 March, 2019, 01:57:21

@papr Ok thanks. Now I've bypassed the pyuvc stuff it appears to be working 😃

papr 17 March, 2019, 07:28:05

@user-36fbf3 We do not support Windows 7. Please upgrade to Windows 10. On an unrelated note: do I correctly recognize Boxxy in your profile picture? 🙃

papr 17 March, 2019, 07:28:31

@user-e7102b great!

user-36fbf3 17 March, 2019, 07:43:35

"upgrade" 😭 and you do :3

papr 17 March, 2019, 08:04:47

@user-36fbf3 Feel free to make a dual boot installation with Linux on your computer. 👍

user-42b39f 17 March, 2019, 10:47:04

@papr Ok thanks. I installed a utility called OverSight to monitor access to the microphone. It looks like after choosing the built-in microphone in the plugin, the microphone becomes active for a short time, then inactive, with the error message. Looking more into the plugin.

user-95608a 18 March, 2019, 05:02:50

Is there a surface tracking plugin for the Windows operating system? I could find it only for Ubuntu. I am using Windows 10 and I want to track the eye movements of a user while he is reading text from a paper.

wrp 18 March, 2019, 05:31:13

@user-95608a There are no differences in the Pupil software features (from a user perspective) between Windows, macOS, and Linux. All plugins are available regardless of your OS.

wrp 18 March, 2019, 05:31:52

@user-95608a please check in the plugin manager in the right hand sidebar of Pupil Capture or Pupil Player to enable the Surface Tracking plugin.

user-95608a 18 March, 2019, 05:52:43

okay i shall check it out

papr 18 March, 2019, 10:15:18

@user-42b39f I was mistaken. We use pyaudio only to list the microphone devices in Capture. The actual recording is done via pyav.

papr 18 March, 2019, 10:35:45

@user-42b39f I do not think this is a macOS permissions issue. I am able to get good quality recording with an analog external headset but not with an external usb microphone. For now, I am blaming pyav.

user-54a6a8 18 March, 2019, 13:55:11

@papr , I'll lurk in discord today if you want to talk about #1449. If it turns out that the old hmd-eyes was incorrectly resetting the clock but the new one doesn't do that, then I'm fine without there being a fix. However, if the problem is with our setup or with pupil capture, then that's something we need to fix.

papr 18 March, 2019, 13:56:27

@user-54a6a8 I think we need to make sure that the new hmd-eyes does not inherit this bad behavior (assuming that the old hmd-eyes is at fault).

papr 18 March, 2019, 13:57:24

@user-54a6a8 Can you elaborate on your procedure? Do you start calibration before or after starting the recording? Do you run multiple calibrations during a recording?

papr 18 March, 2019, 13:57:49

Also, Capture needs to be more robust against such time changes during a recording.

user-54a6a8 18 March, 2019, 13:59:54

Our general philosophy is to record everything. So we start recording before calibration. We sometimes do multiple calibrations if we find the original calibration to be unsatisfactory, but this is not a systematic part of our protocol, and if we do this then there is no break in recording.

user-86c436 18 March, 2019, 14:02:04

@papr this is the same problem as mine, so I'll also give some info on my procedure: I start recording before starting game mode in unity (and therefore before calibration). For now, I have never done more than one calibration during a recording.

papr 18 March, 2019, 14:02:35

@user-86c436 @user-54a6a8 Thank you! In how many recordings have you noticed the time jump?

user-86c436 18 March, 2019, 14:04:26

@papr I don't have access to the files right now, but for me it's about 17 of 20 recordings during the last weeks.

user-54a6a8 18 March, 2019, 14:04:53

I'll have to ask the student but he's not here right now. I think for us it's 60-70% of the recordings, so not too far off from Lisa's experience.

user-54a6a8 18 March, 2019, 14:06:17

It's entirely possible that for those recordings where we didn't see this problem, he started a recording, did a calibration, then closed it for some unrelated reason, then started again (and did the calibration again).

user-54a6a8 18 March, 2019, 14:06:58

We're doing EEG at the same time so sometimes we take off the HMD to adjust the electrodes and start over.

papr 18 March, 2019, 14:17:59

After talking to @fxlange we highly suspect that the issue is a time reset before each calibration. We introduced this as a "bugfix" at the end of last year. We will ship a fix for both hmd-eyes versions, old and new, as well as a script to fix the timestamps of existing recordings.

user-54a6a8 18 March, 2019, 14:18:59

Thanks!

user-86c436 18 March, 2019, 14:20:14

Indeed, thanks a lot!

user-54a6a8 18 March, 2019, 14:21:55

@papr , I will continue to lurk here today in case you want to discuss my PR for simpler Windows setup to run pupil from source, but no expectations. https://github.com/pupil-labs/pupil/pull/1455

user-54a6a8 18 March, 2019, 14:26:41

but if so then maybe we should take that conversation to pupil-software-dev. Sorry for the noise.

papr 18 March, 2019, 14:31:13

@user-54a6a8 In case you have a recording with multiple calibrations/negative time jumps, could you please share it with me? It would help make the fixing script more robust.

user-54a6a8 18 March, 2019, 14:36:38

We might have to generate one. Most of our recordings are in the 1-3GB range. I'll ask the student when he gets in.

papr 18 March, 2019, 14:37:40

I would be fine with the size of the recording. 😅

user-54a6a8 18 March, 2019, 14:38:23

Our internet wouldn't deal with it well. The transmission would fail multiple times. It's an issue for us.

papr 18 March, 2019, 14:39:20

Ok. Don't worry. We will probably be able to make a recording on our own tomorrow as well.

user-97591f 18 March, 2019, 17:32:08

@papr @user-54a6a8 The following videos have calibration(s) triggered during pupil_capture recording in Unity using the pupil plugin. They fail during pupil_player import. https://drive.google.com/drive/folders/1t1JjeeGqbBNGSEp1wWmfxdw5d0qyncVy?usp=sharing

user-2be752 18 March, 2019, 22:28:34

Hi everyone, can you guys help me understand what the .IMU file is recording? how can I read it?

user-2be752 18 March, 2019, 23:14:38

@papr thank you!!!

user-6997ad 19 March, 2019, 04:38:26

@papr we're still having trouble with wildly different luminance between the eyes that is probably causing very poor calibration performance (in hmd-eyes). The 5 eye-facing LEDs are visible in the brighter eye, but only 1-2 can be seen in the darker eye leading me to suspect illumination is the problem. Is this an issue that's been seen before? Is there a way to run a diagnostic on the illumination LEDs?

wrp 19 March, 2019, 04:54:04

Hi @user-6997ad - can you send a screenshot of each of the eye windows to info@pupil-labs.com so that we can diagnose - this might be a hardware issue.

user-0a2ebc 19 March, 2019, 07:26:34

I am playing around with Pupil Player..there are so many features here

user-0a2ebc 19 March, 2019, 07:26:57

Does anyone know where I can get a guide to explore it further?

papr 19 March, 2019, 07:30:54

@user-0a2ebc have you been able to check out our documentation? There is a link at the top right of our website

user-0a2ebc 19 March, 2019, 07:31:53

Yup

user-0a2ebc 19 March, 2019, 07:31:57

Is this the one?

user-0a2ebc 19 March, 2019, 07:31:59

https://docs.pupil-labs.com/#pupil-player-demo-video

user-0a2ebc 19 March, 2019, 07:32:34

I don't think it explains every feature...I am still confused

wrp 19 March, 2019, 07:33:08

If you have any specific questions, please ask and we can point you to relevant sections of the docs or respond directly here

user-0a2ebc 19 March, 2019, 07:33:36

Sure

user-0a2ebc 19 March, 2019, 07:33:40

thanks Will

user-0a2ebc 19 March, 2019, 07:57:49

I would like to analyze the exported data with the following code: https://github.com/pupil-labs/pupil-tutorials/blob/master/02_load_exported_surfaces_and_visualize_aggregate_heatmap.ipynb

user-0a2ebc 19 March, 2019, 07:58:11
  1. After I exported the data, which file should I choose?
user-0a2ebc 19 March, 2019, 07:58:42

Chat image

user-0a2ebc 19 March, 2019, 07:59:04

Let's say I choose the gaze_positions file; then which columns should I choose?

wrp 19 March, 2019, 07:59:13

@user-0a2ebc you need to ensure that you had surfaces defined in your recording and that you had the surface tracking (offline surface tracker) plugin enabled

user-0a2ebc 19 March, 2019, 07:59:59

I see...

user-0a2ebc 19 March, 2019, 08:00:10

I don't know how to define it

user-0a2ebc 19 March, 2019, 08:00:19

I used the data that you sent me the other day during the demo

wrp 19 March, 2019, 08:01:24

maybe try this one: https://drive.google.com/file/d/0Byap58sXjMVfZUhWbVRPWldEZm8/view?usp=sharing - it's an old recording with surfaces

user-0a2ebc 19 March, 2019, 08:03:18

Ok let me download

user-0a2ebc 19 March, 2019, 08:07:14

So, for every surface, e.g. a magazine, that has a QR code placed in every corner (say 4 corners), once I activate the surface tracking plugin, the system will track it, right? Please correct me if I am wrong.

wrp 19 March, 2019, 08:08:16

Markers will be automatically detected, but you need to "define" surfaces (i.e. which markers should make up a surface) and specify a surface size for each surface.

user-0a2ebc 19 March, 2019, 08:15:38

@wrp This is something that I still don't know

user-0a2ebc 19 March, 2019, 08:15:47

how to define and specify

wrp 19 March, 2019, 08:21:16

please see: https://docs.pupil-labs.com/#surface-tracking

papr 19 March, 2019, 08:47:22

@user-97591f Great, thank you!

user-0a2ebc 19 March, 2019, 09:12:25

@wrp Thanks

wrp 19 March, 2019, 09:17:55

welcome @user-0a2ebc

user-0a2ebc 19 March, 2019, 09:48:41

@wrp I would like to ask about defining surface in Pupil Player

user-0a2ebc 19 March, 2019, 09:49:16

As you can see on the right side (defining a surface), how do I get the X and Y coordinates?

Chat image

user-0a2ebc 19 March, 2019, 09:49:42

I mean X size & Y size ?

papr 19 March, 2019, 09:51:22

@user-0a2ebc The X and Y sizes are arbitrary numbers. They define the resolution of the exported heatmap.

user-0a2ebc 19 March, 2019, 09:52:13

@papr Then if I want to add a new surface, should I just leave the X and Y sizes blank?

papr 19 March, 2019, 09:56:16

The default values are 1x1. You can set it to its physical dimensions in centimeters for example.

user-0a2ebc 19 March, 2019, 09:59:00

@papr I see... so I should measure it first, right?

papr 19 March, 2019, 10:01:45

@user-0a2ebc Yes, but it is not that important. Gaze is mapped in normalized coordinates anyway (origin bottom left, height == width == 1)
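
To make that concrete, a toy example of scaling normalized surface gaze to physical units (the A4 page size here is just an assumption):

```python
# Normalized surface coordinates: origin bottom left, width == height == 1.
surface_w_cm, surface_h_cm = 21.0, 29.7  # assumed A4 page, in centimeters

norm_x, norm_y = 0.25, 0.8    # a hypothetical gaze point on the surface
x_cm = norm_x * surface_w_cm  # 5.25 cm from the left edge
y_cm = norm_y * surface_h_cm  # 23.76 cm from the bottom edge
```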

user-0a2ebc 19 March, 2019, 10:04:18

Sorry, I still don't completely understand how to define the surface after I put on the stickers

user-0a2ebc 19 March, 2019, 10:05:13

Let's say I have two magazine covers, each of which has been given a QR sticker. Then in Pupil Player, how do I add a new surface, or define each of them?

user-0a2ebc 19 March, 2019, 10:05:30

Referring to my screenshot above

papr 19 March, 2019, 10:10:01
  1. Player detects surface markers (what you call QR codes) for each world frame
  2. While the surface that you want to define is visible: Hit A or the A button on the left. (A for add)
  3. Player will use all detected markers for the current frame(s) to define a surface. It will use the bounding box of the marker vertices.
  4. Clicking Edit Surface allows you to move the surface definition in relation to the markers.
  5. Clicking add/remove markers allows you to assign or remove associated markers from a surface.
user-0a2ebc 19 March, 2019, 10:16:10

@papr Thanks for the explanation...let me try this

user-4943c4 19 March, 2019, 18:32:38

Hi guys! im brand new here trying to understand the software needs hahaha

wrp 19 March, 2019, 23:49:06

@user-4943c4 hi 👋 - if you have any specific questions please feel free to ask them in this channel

user-6997ad 20 March, 2019, 00:18:35

@wrp thanks, just sent.

wrp 20 March, 2019, 00:19:46

Thanks we will take a look today @user-6997ad

user-4771db 20 March, 2019, 10:03:52

Hi guys, hey @wrp thanks for the invitation. Is there a specific age a person should be before using the Pupil Labs headset? I mean in terms of safety and potential risks of using the headset. We are thinking about letting school children try out Pupil Labs.

wrp 20 March, 2019, 10:06:42

@user-4771db we do not have any specific age suggestions, the hardware has been tested and passed IR irradiance testing for human use. There are people in the community that use Pupil cameras with very young children (1-2 year olds even) - for this they develop a custom mount for the cameras. For children age 3-8 we have a child sized frame.

user-4771db 20 March, 2019, 10:09:10

@wrp alright. Thanks for the quick reply! See you 😃

user-42b39f 20 March, 2019, 12:51:18

@papr As requested, I tested another USB microphone which works well with other software. I get no error when choosing it from the list, but the recording is absent or very bad in Pupil Player. I had to authorize the microphone for macOS TCC purposes. Note that after this attempt, I tried again with the built-in microphone and got no error this time, but could not record anything. Is it a combined issue of the Audio Capture plugin and the TCC authorization of macOS Mojave? Any help appreciated!

papr 20 March, 2019, 12:53:42

@user-42b39f To be honest, I do not think it is an authorization issue. The application asks once for general microphone access, in my experience. The garbage recorded by USB microphones is definitely a problem which needs solving.

user-e7102b 21 March, 2019, 01:28:51

@papr Batch Exporter functions are here if anyone else needs them: https://github.com/tombullock/batchExportPupilLabs

user-e7102b 21 March, 2019, 01:30:00

Still a work in progress (need to eventually figure out how to export surface gaze data...but that shouldn't be too hard)

papr 21 March, 2019, 08:50:36

@user-e7102b That is great! I added your repository to https://github.com/pupil-labs/pupil-community/blob/master/README.md#scripts

user-4943c4 21 March, 2019, 15:24:19

Hi guys! @papr thanks for the welcome!

user-4943c4 21 March, 2019, 15:25:11

im completely new to this and i was wondering if anyone had any tips to start

papr 21 March, 2019, 15:30:05

@user-4943c4 https://docs.pupil-labs.com/#getting-started

papr 21 March, 2019, 15:31:16

If you own a device already try to start recording a calibration and reviewing it in Player.

user-4943c4 21 March, 2019, 16:10:51

@papr thanks! actually the device is the part i was wondering about. right now i can't afford the eyeset yet and i was wondering if i could theoretically build a more basic one myself

user-4943c4 21 March, 2019, 16:11:48

from arduino for example

user-4943c4 21 March, 2019, 16:12:14

and use the open source software to get it running

user-4943c4 21 March, 2019, 16:13:45

trying to get a grasp of the level of complexity that i'll be facing

user-dbadee 21 March, 2019, 19:03:51

I am using Pupil on one laptop and watching a video on another computer. How can I best calibrate the device in this setup? I tried single marker calibration by downloading the marker file and displaying it on the video computer, but it is not as accurate as I want it to be. I'm wondering if I'm missing a step.

user-daa4af 21 March, 2019, 20:28:13

Hi, I have a Yeti Nano Premium USB mic plugged into a Mac computer, and, although Pupil Capture detects it, it does not record any audio. It will record audio from a cheap Logitech USB mic. Is there a way to get audio recording with my Yeti Nano? Also, where do I find the Audio Visualization Waveform?

user-daa4af 21 March, 2019, 20:49:49

I just downloaded the newest version of Pupil Capture and it recorded the audio using my Yeti Nano.

user-14d189 21 March, 2019, 22:49:06

Hi, friends of mine want to purchase a PL kids frame and an adult frame, and they want to swap the cameras between them to save some cost. With the eye tracking cameras I do not see huge trouble, but the world camera... Does anyone have experience with that? In the best case they would just need to swap it a few times.

papr 21 March, 2019, 22:51:10

@user-14d189 Please tell your friends to contact info@pupil-labs.com I am sure we can make something work. 👍

user-14d189 21 March, 2019, 22:51:43

thanks!!!

user-0a2ebc 25 March, 2019, 09:44:08

https://github.com/pupil-labs/pupil-tutorials/blob/master/02_load_exported_surfaces_and_visualize_aggregate_heatmap.ipynb (Section 3 - Visualize aggregate gaze on surface for single participant)

user-0a2ebc 25 March, 2019, 09:44:18

I would like to ask about the code

user-0a2ebc 25 March, 2019, 09:45:11

grid = (1024, 800) # this should match the real world size of the image ... How do I know the real size of the image? In this case it is the cover of a magazine.

user-0a2ebc 25 March, 2019, 09:45:35

Let's say I have another object of interest, e.g. a bottle

wrp 25 March, 2019, 10:00:32

@user-0a2ebc in this example it is the exact dimensions of the underlying source image in pixels.
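
If it helps, one way to read off those dimensions programmatically (a sketch assuming Pillow is installed; the filename is a placeholder, and you should check whether the tutorial expects (width, height) or (height, width)):

```python
from PIL import Image

img = Image.open("magazine_cover.png")  # placeholder filename
grid = img.size  # (width, height) in pixels, e.g. (1024, 800)
```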

user-0a2ebc 25 March, 2019, 10:02:03

@wrp The data is in the video recording

user-0a2ebc 25 March, 2019, 10:02:12

then how can I get the picture?

user-0a2ebc 25 March, 2019, 10:02:23

Do I need to convert it or something?

wrp 25 March, 2019, 10:03:43

You can either use your own pre-prepared source image or a crop you make from a video still.

user-0a2ebc 25 March, 2019, 10:04:17

I see

user-0a2ebc 25 March, 2019, 10:05:12

Cropping image from video still may work

user-0a2ebc 25 March, 2019, 10:05:44

A pre-prepared source image would probably have a different size compared to a crop from a video still?

wrp 25 March, 2019, 10:15:27

Since you are tracking a 2d surface, it should not make a difference as long as the surface area you defined and the source image are the same source material and proportion.

user-cfa47d 25 March, 2019, 16:07:46

Hello, where can I get information about the colored images from the depth camera? Is there a distance scale that matches the colors? I have been through the whole docs on the website but couldn't find anything... TIA

papr 25 March, 2019, 17:34:35

Hi @user-cfa47d The RealSense camera provides depth data as 16-bit unsigned integers. Colored images require 8-bit unsigned integer pixel values, so one cannot convert depth data loss-free to colored images. The easiest way to convert would be to apply a color map. The problem is that the mapping is difficult to get visually right unless one provides additional UI elements to scale the mapping. We decided to go with an alternative visualization method that is also used by the RealSense examples: we normalize the colors based on the image's cumulative histogram. This results in a high-contrast visualization but has the disadvantage that the color values do not map to a fixed depth value. If you want to access the depth data, we recommend doing this directly via a plugin: https://gist.github.com/papr/0f13943e2aebd768ab6b1508d466caae
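
For reference, a rough sketch of that cumulative-histogram normalization (the gist above is the authoritative implementation; this assumes `depth` is a 2D uint16 NumPy array and that OpenCV is installed):

```python
import cv2
import numpy as np

def visualize_depth(depth: np.ndarray) -> np.ndarray:
    """Map 16-bit depth to a high-contrast 8-bit color image via the
    cumulative histogram of the depth values."""
    hist, _ = np.histogram(depth.ravel(), bins=0x10000, range=(0, 0x10000))
    cdf = hist.cumsum().astype(np.float64)
    cdf /= cdf[-1]  # normalize cumulative histogram to [0, 1]
    equalized = (cdf[depth] * 255).astype(np.uint8)
    return cv2.applyColorMap(equalized, cv2.COLORMAP_JET)
```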

user-f3048f 25 March, 2019, 21:56:15

Hello, will Pupil work fine with a USB 2.0 hub?

user-e938ee 26 March, 2019, 08:49:51

Hello! I'm trying to get heatmaps from Pupil Player. I've captured the footage with 4 markers to specify a surface, but whenever I try to show heatmaps, I only get a white rectangle over the surface.

papr 26 March, 2019, 08:55:23

@user-f3048f That depends on the headset configuration that you are using. If you are using a 3d world camera, it will definitely not work.

papr 26 March, 2019, 08:55:49

@user-e938ee Did you set up the x and y sizes of your surfaces?

user-cfa47d 26 March, 2019, 11:16:37

I'll check it out. Thanks @papr

user-f3048f 26 March, 2019, 11:45:22

thanks [email removed] It is a 2d world camera.

papr 26 March, 2019, 12:23:52

@user-f3048f Yes, that should work.

user-f3048f 26 March, 2019, 15:50:53

Thanks

user-ed537d 26 March, 2019, 22:47:13

how do you correct for y drift w/ the binocular setup?

user-ed537d 26 March, 2019, 22:47:22

i used 2d tracking

papr 26 March, 2019, 22:48:11

@user-ed537d We do not have any slippage compensation in 2d mode

user-ed537d 26 March, 2019, 22:49:08

hmm ok thank you!

user-ed537d 26 March, 2019, 22:49:23

any post processing tips for correcting it?

papr 26 March, 2019, 22:50:01

@user-ed537d if it is an option, you can try 3d mode during offline pupil detection if you have recorded eye videos

user-ed537d 26 March, 2019, 22:50:56

i do but i don't have the calibration markers

user-ed537d 26 March, 2019, 22:50:57

😦

user-ed537d 26 March, 2019, 22:51:05

recorded in the video

user-ed537d 26 March, 2019, 22:52:14

@papr i assume there's no way around this correct? i tried natural features calibration and it was not accurate enough.

papr 26 March, 2019, 22:52:17

You can try to manually annotate reference locations, but this is difficult without explicit knowledge of where the subject was looking

papr 26 March, 2019, 22:52:32

Ah ok, so you tried that already

user-ed537d 26 March, 2019, 22:52:52

i know where they were looking at but its 4 locations quite close to one another

papr 26 March, 2019, 22:53:14

Are they spread in time at least?

user-ed537d 26 March, 2019, 22:53:34

yeah

user-ed537d 26 March, 2019, 22:54:19

i'll try the newest pupil capture. I tried on version 0.3.13.48

user-ed537d 26 March, 2019, 22:54:27

0.9.13.48*

papr 26 March, 2019, 22:55:08

I do not think this will help much :/ The basic calibration procedure did not change.

user-ed537d 26 March, 2019, 22:55:28

😦 ok

papr 26 March, 2019, 22:55:37

You could try to correct the exported csv data by fitting a 1d polynomial to the y-offset at the known locations and interpolating the error in between
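
A hypothetical sketch of that correction, assuming a few annotated reference times with known y-offsets (the column names follow a typical gaze_positions.csv export and should be checked against your own):

```python
import numpy as np
import pandas as pd

gaze = pd.read_csv("exports/000/gaze_positions.csv")

# Hypothetical reference points: times at which the true gaze location is
# known, and the measured-minus-true offset in normalized y at those times.
ref_t = np.array([10.0, 90.0, 170.0, 250.0])
y_offset = np.array([0.02, 0.035, 0.05, 0.07])

# Fit a low-order polynomial to the offset over time, then subtract the
# interpolated drift from every gaze sample.
coeffs = np.polyfit(ref_t, y_offset, deg=1)
drift = np.polyval(coeffs, gaze["gaze_timestamp"].to_numpy())
gaze["norm_pos_y_corrected"] = gaze["norm_pos_y"] - drift
```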

user-ed537d 26 March, 2019, 22:56:43

i like that! i have the times too so i can use those as well

user-ed537d 26 March, 2019, 23:00:47

i believe detrend in matlab does that

user-0a2ebc 27 March, 2019, 02:22:45

@wrp Thanks for the explanation

user-0a2ebc 27 March, 2019, 04:36:30

I would like to ask about this code

user-0a2ebc 27 March, 2019, 04:36:58

this section 2 - Loading exported fixation data on surface

user-0a2ebc 27 March, 2019, 04:37:25

in the example the file shows these columns as available

user-0a2ebc 27 March, 2019, 04:37:27

Columns present in exported fixation on surface data: Index(['id', 'start_timestamp', 'duration', 'start_frame', 'end_frame', 'norm_pos_x', 'norm_pos_y', 'x_scaled', 'y_scaled', 'on_srf'], dtype='object')

user-0a2ebc 27 March, 2019, 04:37:40

However, when I run the same file I get these columns

user-0a2ebc 27 March, 2019, 04:37:57

Index(['world_timestamp', 'world_frame_idx', 'gaze_timestamp', 'x_norm', 'y_norm', 'x_scaled', 'y_scaled', 'on_srf', 'confidence'], dtype='object')

user-0a2ebc 27 March, 2019, 04:38:40

In short, my file with the same file name does not have the columns "duration" and "id"

user-0a2ebc 27 March, 2019, 04:39:16

Why does the file from Will's demo not have the two columns mentioned above?

user-0a2ebc 27 March, 2019, 04:39:29

Is it because of a newer version of the Pupil software?

user-0a2ebc 27 March, 2019, 04:39:31

thanks

user-82488e 27 March, 2019, 05:12:15

Hi, I'm new to Pupil Labs. When I'm using the glasses in the dark, the eye camera can no longer detect eye movement and the confidence of the pupil diameter drops to 0. Do I not have the infrared function turned on? And if so, how do I turn it on?

wrp 27 March, 2019, 05:21:22

@user-82488e IR is always on. Please look at the eye windows in algorithm mode and ensure that the max pupil size parameter is set to a large enough value. My conjecture is that in a dark environment your pupil diameter is very large and outside of the default setting range.

user-e938ee 27 March, 2019, 13:15:48

@papr Haven't set the x and y values. What are the units? Centimeters, millimeters, pixels?

papr 27 March, 2019, 13:16:30

It's the unit of your choice. The x and y values are used to scale the normalized gaze positions.

papr 27 March, 2019, 13:16:40

They are also used as the frame size for the heatmaps

user-e91538 27 March, 2019, 14:11:06

Small question: Is it possible to get the eye rotation out of the raw data? I would like to integrate the gaze vector into a mocap environment. However, the raw data doesn't seem to provide anything like an eyeball angle?

papr 27 March, 2019, 14:24:07

@user-e91538 The calibrated 3d gaze data provides the 3d locations and normals of the 3d eye model in world coordinates. This should be what you need.
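
A small sketch (not an official API) of turning the exported 3d gaze normals into rotation angles; the column names follow a gaze_positions.csv export with 3d gaze data and should be verified against your own export:

```python
import numpy as np
import pandas as pd

gaze = pd.read_csv("exports/000/gaze_positions.csv")
n = gaze[["gaze_normal0_x", "gaze_normal0_y", "gaze_normal0_z"]].to_numpy()

# Azimuth and elevation of eye0's gaze normal in world camera coordinates.
azimuth = np.degrees(np.arctan2(n[:, 0], n[:, 2]))
elevation = np.degrees(np.arcsin(np.clip(n[:, 1], -1.0, 1.0)))
```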

user-e91538 27 March, 2019, 19:15:34

Thanks @papr , its working now 😃

user-e74388 28 March, 2019, 04:05:53

Hi,

user-e74388 28 March, 2019, 04:07:51

Has anyone ported this algorithm onto an embedded Linux system?

user-0a2ebc 28 March, 2019, 04:18:41

Hello, I would like to ask: can anyone share how to measure pupil dilation?

user-0a2ebc 28 March, 2019, 04:19:16

In my experiment, we are trying to detect whether a participant feels positive or not toward a certain product

wrp 28 March, 2019, 04:32:36

@user-0a2ebc Pupil diameter is exported in pupil_positions.csv

wrp 28 March, 2019, 04:33:21

for further analysis you might want to take a look at this example/tutorial: https://github.com/pupil-labs/pupil-tutorials/blob/master/01_load_exported_data_and_visualize_pupillometry.ipynb
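
As a minimal starting point along the lines of that tutorial (paths are examples; column names such as "timestamp" vs "pupil_timestamp" vary between Pupil versions, so check your export):

```python
import matplotlib.pyplot as plt
import pandas as pd

pupil = pd.read_csv("exports/000/pupil_positions.csv")
high_conf = pupil["confidence"] > 0.8  # drop low-confidence samples

plt.plot(pupil["timestamp"][high_conf], pupil["diameter"][high_conf])
plt.xlabel("timestamp [s]")
plt.ylabel("pupil diameter [px]")
plt.show()
```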

user-0a2ebc 28 March, 2019, 04:37:22

@wrp Thanks a lot for this code... I am digesting it now

user-24270f 28 March, 2019, 05:17:27

just updated from 1.10 to 1.11 pupil capture/player/service, windows 64, and now im getting this, never got it before

Chat image

user-24270f 28 March, 2019, 05:18:22

(when running pupil capture)

user-24270f 28 March, 2019, 05:18:44

and pupil player

user-f27d88 28 March, 2019, 05:20:56

@user-24270f Did you try https://stackoverflow.com/questions/38132755/importerror-no-module-named-encodings

user-24270f 28 March, 2019, 05:36:23

this doesn't make sense as an answer for me, i've never had to install python to run pupil capture et al

user-24270f 28 March, 2019, 05:37:16

is there now a requirement to install python that isn't mentioned here https://github.com/pupil-labs/pupil/releases/tag/v1.11 ?

user-f27d88 28 March, 2019, 05:38:28

Did you get the error when running Player and Capture? Which version of Python are you using? Are you using pipenv or virtualenv?

user-24270f 28 March, 2019, 05:38:45

ive never had to install python to run pupil capture et al

user-24270f 28 March, 2019, 05:38:50

i am not running python

user-24270f 28 March, 2019, 05:38:59

following here

user-24270f 28 March, 2019, 05:39:00

https://github.com/pupil-labs/hmd-eyes

user-24270f 28 March, 2019, 05:39:07

nothing about needing to install python

user-24270f 28 March, 2019, 05:40:24

wait one

user-24270f 28 March, 2019, 05:40:38

i shall try the age-old "turn it off and on again" (re-download, reinstall)

user-24270f 28 March, 2019, 05:40:54

should have done that first anyway

user-24270f 28 March, 2019, 05:41:05

give me 5

user-24270f 28 March, 2019, 05:46:20

so i re-downloaded and did a search for "encodings" in the file system, and base_library.zip came up. i'm guessing the first time i downloaded it, there was a corrupt file there

user-24270f 28 March, 2019, 05:46:35

i wonder if it would be a good idea to include an md5 checksum with the downloads? they are quite large

user-f27d88 28 March, 2019, 05:47:28

Yes, it is a good idea. Does it work after you re-download and run it?
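
For what it's worth, a tiny sketch of how such a verification could look on the user's side (the filename and expected checksum are placeholders):

```python
import hashlib

def md5sum(path, chunk_size=1 << 20):
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "d41d8cd98f00b204e9800998ecf8427e"  # placeholder value
assert md5sum("pupil_capture_windows_x64.zip") == expected
```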

user-24270f 28 March, 2019, 06:10:43

yup, works now

user-f27d88 28 March, 2019, 07:40:48

Great. I thought you were running from source, but it looks like you are running from the exe file

user-0a2ebc 28 March, 2019, 08:19:14

@wrp Is there a certain threshold of pupil dilation that determines whether a participant feels positive or negative?

user-0a2ebc 28 March, 2019, 08:20:29

@wrp I ran the script... but there are some figures that I still don't understand, as follows

user-0a2ebc 28 March, 2019, 08:20:47

What does that figure mean?

Chat image

user-0a2ebc 28 March, 2019, 08:21:36

Does that mean that there were 5 major fixations during the experiment?

Chat image

user-0a2ebc 28 March, 2019, 08:21:51

Thanks in advance

user-ee433b 28 March, 2019, 08:31:31

Hello everyone,

user-ee433b 28 March, 2019, 08:32:38

I tried to launch an older version of Pupil Capture, but it didn't work. And the problem is that now version 1.11.4 doesn't work either.

user-ee433b 28 March, 2019, 08:33:47

The buttons detect eye 0 and detect eye 1 are green, but there are no eye windows open and the buttons are not responsive

user-ee433b 28 March, 2019, 08:36:32

Chat image

papr 28 March, 2019, 09:08:57

@user-0a2ebc
- Regarding the fixations: Correct, these 5 fixations occurred during the calibration procedure that was recorded.
- Regarding pupil dilation: No, there is no fixed pupil dilation threshold for a specific feeling. Pupillometry is a vast and complex topic; it definitely goes beyond this chat's scope.
- Regarding the upper graphs: They show the normalized pupil position over time within each eye camera's coordinate system.

user-0a2ebc 28 March, 2019, 09:13:45

@papr Thanks for the explanation... I still have a question about the upper graph: does it mean that eye0's (right eye) pupil position is more toward the upper left than the left eye's?

papr 28 March, 2019, 09:15:25

@user-0a2ebc Yes, but it is kind of meaningless since the eye cameras are flexible in position. In which script was the graph generated?

user-ee433b 28 March, 2019, 09:17:17

Ok, I've found a solution: I reinstalled the drivers and it works fine. Is it normal that running an older version of Pupil Capture leads to trouble? Is there a way to run an older version without trouble? thx

user-0a2ebc 28 March, 2019, 09:17:44

@papr That is what I am thinking too. But I am trying to make sense of the script below that generates the graph

user-0a2ebc 28 March, 2019, 09:17:45

```python
plt.figure(figsize=(16, 5))

plt.subplot(1, 2, 2)
plt.scatter(eye0_df['norm_pos_x'][eye0_high_conf],
            eye0_df['norm_pos_y'][eye0_high_conf],
            c=eye0_df['timestamp'][eye0_high_conf])
plt.colorbar().ax.set_ylabel('Timestamps')
plt.xlabel('norm_pos_x')
plt.ylabel('norm_pos_y')
plt.xlim([0, 1])
plt.ylim([0, 1])
plt.title('eye0')

plt.subplot(1, 2, 1)
plt.scatter(eye1_df['norm_pos_x'][eye1_high_conf],
            eye1_df['norm_pos_y'][eye1_high_conf],
            c=eye1_df['timestamp'][eye1_high_conf])
plt.colorbar().ax.set_ylabel('Timestamps')
plt.xlabel('norm_pos_x')
plt.ylabel('norm_pos_y')
plt.xlim([0, 1])
plt.ylim([0, 1])
plt.title('eye1')
```

papr 28 March, 2019, 09:18:23

@user-0a2ebc Sorry, I meant from where did you get the script? I guess you did not write it yourself?

user-0a2ebc 28 March, 2019, 09:19:09

particularly the section "Plot Pupil Positions"

papr 28 March, 2019, 09:19:19

@user-ee433b No, running older versions usually does not lead to driver issues. There might have been a coincidental Windows update which usually resets our drivers.

papr 28 March, 2019, 09:22:30

@user-0a2ebc Ok, the graph is more of a proof-of-concept at that point that showcases how to visualize data. See the third tutorial, which shows how to visualize a scanpath on a surface.

user-0a2ebc 28 March, 2019, 09:24:08

@papr I see...I haven't reached the third tutorial yet...:) thanks

user-0a2ebc 28 March, 2019, 23:59:49

Is the Pupil Labs eye tracker suitable for testing website layouts?

user-0a2ebc 29 March, 2019, 00:00:52

Say somebody sits in front of a laptop and we track where they gaze mostly

user-0a2ebc 29 March, 2019, 00:00:57

Thanks

user-0a2ebc 29 March, 2019, 00:18:04

And also tracking things such as product design?

wrp 29 March, 2019, 04:22:56

@user-0a2ebc you can use surface tracking for screen-based studies, as well as physical materials

user-0a2ebc 29 March, 2019, 11:29:08

@wrp ah I see...thanks Will

wrp 29 March, 2019, 13:01:31

Welcome

user-2686f2 29 March, 2019, 14:32:52

Is there a way to compress the videos Pupil Capture produces (world, eye0, and eye1) and still maintain the same results?

papr 29 March, 2019, 19:10:15

@user-2686f2 yes, as long as the compressed video has as many frames as the original and can be read by pyav
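
A quick sanity check for the frame-count condition, assuming PyAV is installed (it decodes both files, so it can take a while for long recordings):

```python
import av

def count_frames(path):
    container = av.open(path)
    n = sum(1 for _ in container.decode(video=0))
    container.close()
    return n

assert count_frames("world.mp4") == count_frames("world_compressed.mp4")
```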

user-2686f2 29 March, 2019, 19:48:51

@papr Do you know if the accuracy of pupil location or calibration suffers at all?

papr 29 March, 2019, 20:15:11

@user-2686f2 I do not think that either would suffer significantly. This of course depends on the type and strength of compression you are applying.

user-bf07d4 29 March, 2019, 21:15:49

The Pupil mobile eye tracking headset just arrived from Shapeways. Can somebody help me with calibration?

user-52b112 29 March, 2019, 22:54:24

I see from the literature (e.g. Hayes & Petrov, Behav Res Methods, 2016) that gaze position can affect pupil size measurements as a consequence of the angle between the eye-camera line and the eye-target line. Does Pupil's pupil size code do this correction? If not, has anyone written code to take care of this? (If not, we will.) Thanks!

user-52b112 30 March, 2019, 00:04:26

OK, never mind. Found the answer in the docs: for 2D gaze mapping it is not corrected; for 3D it is.

user-4a4def 31 March, 2019, 14:17:14

Hello, I am having a very basic problem. I have a recording that I had previously opened (it has all the correct files), and now when I try to open it, Pupil Player closes and nothing happens, and I can't figure out why. Anyone have any ideas?

papr 31 March, 2019, 22:02:12

@user-4a4def please share the player.log file in the pupil_player_settings folder after attempting to open the recording

End of March archive