πŸ‘ core


wrp 01 June, 2017, 00:52:14

@user-f7028f yes, please use the dev branch of hmd-eyes; we will be merging it to master soon. Apologies for any confusion this has caused

user-f7028f 01 June, 2017, 03:26:59

Ok! )

user-8c1de2 07 June, 2017, 16:14:19

@papr - thanks for your answer on gitter - "The code is very out of date though. But you can integrate the Pupil helper example for time sync in your psychopy experiment in the "coder mode"" - a few questions, as I am learning about Pupil Labs, PsychoPy, Python, and more all at the same time. Can I just add this code from the pupil helper into the Coder and that's it? Or do I need to add anything else somewhere else?

papr 07 June, 2017, 18:01:42

No, just copying and pasting doesn't work. Where to put the time sync part depends on the generated code.

user-13e455 08 June, 2017, 21:53:32

hello everyone, I am new here. I am working on a project, "Gaze plot using webcam", drawing a diagram of the gaze points in C++. Can anyone help me?

user-8c1de2 09 June, 2017, 14:54:14

@papr - thank you! I think I am almost there. One of the imports, "from network_time_sync import Clock_Sync_Master", does not work on my Mac - any advice or an alternative route?

papr 09 June, 2017, 15:00:15

@user-8c1de2 Did you copy the network_time_sync.py file into the same directory as your other Python code?
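To illustrate papr's point: Python resolves `import network_time_sync` by searching `sys.path`, which includes the directory of the script you run, so the helper file just needs to sit next to your experiment code. A minimal, generic sketch (the module name `my_helper` is a stand-in, not the actual Pupil helper):

```python
import importlib
import os
import sys
import tempfile

# Sketch: a module file placed on sys.path (e.g. next to the script you run,
# since that directory is added to sys.path automatically) becomes importable
# by name -- this is why network_time_sync.py must live in the same directory
# as your PsychoPy experiment code.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "my_helper.py"), "w") as f:
        f.write("GREETING = 'hello'\n")   # stand-in for network_time_sync.py
    sys.path.insert(0, d)                 # stand-in for the script's directory
    my_helper = importlib.import_module("my_helper")
    print(my_helper.GREETING)
    sys.path.remove(d)
```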

papr 09 June, 2017, 15:03:41

@user-13e455 Could you provide a bit more context please? Which C++ code are you referring to? Pupil software is designed for head-mounted eye tracking only. This is different from the remote eye tracking that one does with a webcam in front of the subject.

user-8c1de2 09 June, 2017, 18:42:47

at the moment I am just running the script directly

user-239bce 10 June, 2017, 18:45:21

Hi all! Is there code documentation for Unity Project "unity_integration_calibration"?

user-006924 11 June, 2017, 22:51:37

Hi all. This is a picture of my eye feed, and I barely get a pupil detection in 3D mode. Is it because I'm using a monocular setup and 3D detection can't be very accurate with one eye camera, or is it the lighting?

Chat image

mpk 12 June, 2017, 08:25:57

hi,

mpk 12 June, 2017, 08:26:21

this should work. Just reduce the pupil min diameter and max diameter.

mpk 12 June, 2017, 08:26:26

@NahalNrz#1253

user-006924 12 June, 2017, 12:19:51

@mpk But isn't this used to change the settings for 2D detection only?

mpk 12 June, 2017, 12:20:23

Also affects 3d detection

user-13e455 12 June, 2017, 15:17:45

@papr hello buddy ... I am working with C++ and OpenCV to detect gaze and plot a diagram on the monitor showing where the person is looking. For example, I show the user a photo, and while they look at it I can tell where they spent more time and less time - something like a heat map, but not exactly that.

papr 12 June, 2017, 15:21:24

@user-13e455 Ok, and what exactly do you need help with? Do you want to know how to access the gaze data?

user-13e455 12 June, 2017, 15:22:22

yes please

mpk 12 June, 2017, 17:51:53

@user-13e455 if you have a Pupil headset you can use the marker tracker together with pupil remote: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_gaze_on_surface.py

mpk 12 June, 2017, 17:52:34

For remote eye tracking setups and webcam eyetracking Pupil is not the right project.

user-13e455 13 June, 2017, 09:55:30

@mpk thanks for the answer, buddy. Let me explain what my supervisor wants from me: 1. set up a USB cam; 2. the user sits in front of this cam and the program detects their gaze; 3. after detecting the gaze, it plots a diagram of where the person looked (starting from pixel 1 through pixel 10, and so on). And he wants this program in C++ with OpenCV.

mpk 13 June, 2017, 11:20:18

@user-13e455 for this setup Pupil is not the right project. I recommend a Google search with "webcam gaze mapping" as keywords.

user-7b41a3 13 June, 2017, 13:11:39

Hello, I want to build this kind of eye tracking system for my 3-year-old daughter. She is suffering from cortical blindness, and it could help to improve her gaze, so I wanted to work on it. But I can't find the .stl file to print the headset on GitHub or on the website. Is it available or not? The classic headset will be too large for her head, so I need to reduce the size.

user-41f1bf 13 June, 2017, 23:32:01

Hi, Baptist. The camera mounts are open source, but the headset frame is not. Please get in contact at sales@pupil-labs.com for a custom-sized headset for your daughter.

wrp 14 June, 2017, 08:12:35

Hi @user-7b41a3 - I will reply via email. Thanks!

user-13e455 14 June, 2017, 15:10:05

hello, this is what I have done. How can I get the gaze points and draw a diagram πŸ™„ for them?

Chat image

user-13e455 14 June, 2017, 15:14:53

another test ..

Chat image

user-13e455 14 June, 2017, 15:14:55

Chat image

mpk 14 June, 2017, 15:56:34

@Aliov#1311 that's nice work. But it's really not related to mobile eye tracking πŸ˜ƒ

user-99e72e 14 June, 2017, 18:43:43

Hi everybody, I was wondering how I can get three-dimensional spatial information from a SIFT descriptor. Can epipolar geometry help me?

user-99e72e 14 June, 2017, 18:44:49

Or better yet, is it possible?

wrp 15 June, 2017, 10:21:02

@user-99e72e - what are you attempting to achieve?

user-99e72e 15 June, 2017, 10:21:55

@wrp I am trying to detect the three dimensional coordinates of an object in a scene

user-99e72e 15 June, 2017, 10:24:37

The initial thought was to extract the SIFT descriptors from two images of the same scene, match the descriptors, and then use epipolar geometry (if possible) to triangulate and extract the depth. But I don't know if this is correct.

user-99e72e 16 June, 2017, 10:29:47

@wrp any suggestions? What I would like to get is the depth of every SIFT descriptor in the image

wrp 16 June, 2017, 11:40:35

@user-99e72e I do not have any recommendations for you at this time. This approach seems like it might be quite noisy though
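For reference, the pipeline sketched above (match descriptors between two calibrated views, then triangulate each matched pair) hinges on the triangulation step. A minimal linear (DLT) triangulation sketch with NumPy, assuming you already have the two 3x4 projection matrices; the function name and toy cameras below are illustrative:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one matched point pair.

    P1, P2 : 3x4 camera projection matrices
    x1, x2 : (x, y) image coordinates of the match in each view
    Returns the 3D point in world coordinates.
    """
    # Each view contributes two rows of the homogeneous system A X = 0.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector for the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Toy check: two identity-intrinsics cameras, the second shifted along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
x1 = (0.0, 0.0)    # projection of the point (0, 0, 5) in camera 1
x2 = (-0.2, 0.0)   # projection of the same point in camera 2
print(triangulate(P1, P2, x1, x2))
```

As wrp notes, results from SIFT matches tend to be noisy; in practice one would add outlier rejection (e.g. a RANSAC-estimated fundamental matrix) before triangulating.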

user-f896c6 19 June, 2017, 16:10:31

Hello, I am a student doing summer research using Pupil as an additional tool in my project. I installed the driver, but I don't see it in my Device Manager. Is something wrong?

wrp 20 June, 2017, 01:27:34

@NDstudent4 what version of Windows are you using? Do you have admin privileges?

wrp 20 June, 2017, 01:29:57

Please note that we only officially support Win 10 64 bit.

user-34bbb4 21 June, 2017, 21:56:32

hello, I've been trying to run the simple Python script "notification overview" ( https://github.com/pupil-labs/pupil-docs/blob/master/developer-docs/ipc-backbone.md ) to communicate with the IPC backbone, but a lot of errors come up when I run it (I had to replace send with send_string, and send_multipart raised "invalid argument"), so I assume I don't have the right version of ZeroMQ, even though I installed it with: "sudo pip3 install git+https://github.com/zeromq/pyre". So far I only get the answer to requester.send('SUB_PORT') and nothing more.

Since I want to write my code in C++ I tried to install this library ( https://github.com/zeromq/cppzmq ) to continue the project, but now I can't even get the answer to requester.send('SUB_PORT')... I tried to check which port is in use, but 50020 is not in use anymore.

By the way, I'm on Ubuntu 16.04 and I can run the Pupil application from source. Does anyone have any ideas? And which version of ZeroMQ does Pupil use? Thank you
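On the send vs send_string point: recent pyzmq versions only accept bytes in send(), so text requests like "SUB_PORT" must go through send_string(). A minimal sketch of the request/reply handshake; note the REP stub below merely stands in for Pupil Remote (which normally listens on tcp://127.0.0.1:50020), and the "50021" reply is a made-up port number:

```python
import zmq

ctx = zmq.Context.instance()

# Stub REP socket standing in for Pupil Remote (normally tcp://127.0.0.1:50020).
rep = ctx.socket(zmq.REP)
rep.bind("inproc://pupil_remote_stub")

req = ctx.socket(zmq.REQ)
req.connect("inproc://pupil_remote_stub")

# With recent pyzmq, send() only accepts bytes; use send_string() for text.
req.send_string("SUB_PORT")
assert rep.recv_string() == "SUB_PORT"
rep.send_string("50021")          # Pupil Capture would answer with its SUB port
sub_port = req.recv_string()
print(sub_port)
```

Against a real Pupil Capture instance you would replace the inproc address with tcp://127.0.0.1:50020, drop the stub, and then connect a SUB socket to the returned port.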

user-d811b9 21 June, 2017, 22:50:13

hi, guys

user-d811b9 21 June, 2017, 22:51:13

can you help me with a problem in Pupil Capture? I'm writing here because it is very important

user-006924 21 June, 2017, 22:52:36

I think you should download it from here: https://docs.pupil-labs.com/#python-wheels and this one for msgpack: http://www.lfd.uci.edu/~gohlke/pythonlibs/#msgpack

user-006924 21 June, 2017, 22:52:56

@user-34bbb4

user-d811b9 21 June, 2017, 22:55:12

My problem is this. It does not happen every time, but I have to close and re-open the application and change the USB port about 5 times before the eye1 camera works.

Chat image

user-d811b9 21 June, 2017, 22:59:00

I don't understand the reason, because I think it is not broken. I hope for your help.

user-006924 21 June, 2017, 23:00:46

In your eye window capture selection, does it allow you to choose that camera manually?

user-d811b9 21 June, 2017, 23:04:42

Yes I can.

user-006924 21 June, 2017, 23:05:06

what happens when you choose it manually? are you still in ghost mode?

user-d811b9 21 June, 2017, 23:05:39

Yes. The message is: "The camera is already in use or blocked".

user-d811b9 21 June, 2017, 23:06:36

But if I switch the USB cable from a 2.0 port to a 3.0 port 4-5 times, by some miracle eye1 works.

user-006924 21 June, 2017, 23:08:00

I tried this method in previous versions of Pupil, so I'm not sure whether it will help, but try opening your Device Manager and check which driver your eye camera is under. If it's not under libusbK, download a tool called Zadig and use it to change that camera's driver to libusbK.

user-d811b9 21 June, 2017, 23:19:54

Ok, I'll try that. Is it possible that the problem exists because I installed the bundle version of Pupil?

user-d811b9 21 June, 2017, 23:21:08

That is my supposition, because I am a novice with this instrument and I don't know Ubuntu or the software very well.

user-006924 21 June, 2017, 23:22:26

I'm not using ubuntu so I don't know about that but I'm guessing other people might have a better idea.

user-d811b9 21 June, 2017, 23:23:55

Ok . Thank you @user-006924

wrp 22 June, 2017, 08:52:20

@user-d811b9 do you have multiple instances of Pupil Capture running on your device? Or multiple instances of Pupil Service running?

wrp 22 June, 2017, 08:53:00

Multiple instances or "zombie" processes of Pupil Service will block the cameras (as they are already in use by another process)

wrp 22 June, 2017, 08:53:12

Please also try clearing your pupil_capture_settings folder

wrp 22 June, 2017, 08:53:16

and restarting Pupil Capture

wrp 22 June, 2017, 08:54:34

by clearing I mean deleting the entire pupil_capture_settings folder

user-d811b9 22 June, 2017, 08:57:12

@wrp No, no, only one. The Pupil Service application is closed, so only Pupil Capture is running.

wrp 22 June, 2017, 08:57:56

did you ever run Pupil Service?

user-d811b9 22 June, 2017, 08:59:43

Maybe only once, to see what the application is.

wrp 22 June, 2017, 09:00:04

ok, please check System Monitor

wrp 22 June, 2017, 09:00:09

to see if other processes are still running

wrp 22 June, 2017, 09:00:20

and stop other processes

wrp 22 June, 2017, 09:00:25

if they are running

user-d811b9 22 June, 2017, 09:05:30

I see in the system processes that only Pupil Capture is running now

user-d811b9 22 June, 2017, 09:06:21

I'll try clearing the pupil_capture_settings folder and restarting

user-d811b9 22 June, 2017, 09:13:14

@wrp after deleting the pupil_capture_settings folder, the problems have doubled 😦. Now both the world camera and the eye1 camera are in ghost mode.

wrp 22 June, 2017, 09:40:29

@user-d811b9 Did you select the cameras from the Capture selection?

user-d811b9 22 June, 2017, 09:47:21

@wrp Yes. I removed the application and re-installed it. Now both the eye1 and world cameras work, but the folder that was created is locked. Could this be a problem? I'll show you what I mean.

Chat image

wrp 22 June, 2017, 09:50:50

chmod 777 pupil_capture_settings

wrp 22 June, 2017, 09:50:57

then delete the dir and Pupil Capture will re-create

user-d811b9 22 June, 2017, 09:53:50

why ?

Chat image

wrp 22 June, 2017, 09:55:17

do you have sudo permissions?

user-d811b9 22 June, 2017, 09:57:08

Ah.. ok. I missed this . Done . Thanks @wrp

user-d811b9 22 June, 2017, 09:59:17

Now both the eye and world cameras work

wrp 22 June, 2017, 09:59:54

πŸ‘

user-d811b9 22 June, 2017, 10:47:59

@wrp can you help me with an older problem? It has come back: https://github.com/pupil-labs/pupil/issues/737

wrp 22 June, 2017, 10:51:29

Just re-read the issue. Could you be more specific re the current issue?

user-d811b9 22 June, 2017, 10:53:38

When I try to export the current image relative to a surface (created before in Pupil Capture), the only image saved is the heat map.

wrp 22 June, 2017, 10:55:16

Did you load @user-41f1bf's plugins before exporting?

user-d811b9 22 June, 2017, 10:55:54

Yes, but the result is that the export image folder is created but empty

wrp 22 June, 2017, 10:57:17

Unfortunately I am not familiar with the plugins that @user-41f1bf has written

wrp 22 June, 2017, 10:57:37

@user-41f1bf when you get online could you get in touch with @user-d811b9 ?

user-d811b9 22 June, 2017, 10:58:30

But the question is: is it normal that in the bundle version I can export only the heatmap as a .png file?

wrp 22 June, 2017, 11:21:30

Offline surface tracker only exports heat maps based on dimensions given and does not export images from the scene

user-d811b9 22 June, 2017, 11:24:46

Ah ok. Looking at the Pupil channel on YouTube, the Pupil Player video tutorial showed both images being exported. Probably there is a step to obtain this result.

wrp 22 June, 2017, 11:28:30

Perhaps there is a misunderstanding

user-d811b9 22 June, 2017, 12:04:43

I'll send you a photo from the Pupil Player offline marker tutorial video

user-d811b9 22 June, 2017, 12:05:19

Chat image

user-d811b9 22 June, 2017, 12:06:40

I want to export both the heatmap and the surface, as shown in this video.

wrp 22 June, 2017, 14:16:03

@user-d811b9 this demo video is from 2014 - the exporter only exports the heatmap as png in current codebase

user-d811b9 22 June, 2017, 14:42:01

Ah ok. Thanks. Do you know any way to obtain this? I ask because, for my thesis, I have to export 952 curves and superimpose them on their heatmaps πŸ˜…

user-d811b9 22 June, 2017, 19:12:26

@wrp the problem of eye camera in ghost mode came back πŸ˜”

user-d811b9 23 June, 2017, 12:01:53

Hi, how can I solve the problem that during the recording the eyes go out of calibration? Or rather: if I ask the subject to fixate on a particular element before and after the recording, the result is not the same, even though the confidence level is often 1.0

mpk 23 June, 2017, 12:47:26

@user-d811b9 I think you are referring to drift through slippage. Make sure the headset is worn without touching the eyebrows and that the collar clip is used.

user-d811b9 23 June, 2017, 13:02:43

@mpk thanks. I will pay attention to this.

user-d811b9 23 June, 2017, 13:34:28

@mpk Is this normal? eye0 - [WARNING] uvc: Could not set Value. 'Auto Exposure Mode'.

user-d811b9 23 June, 2017, 13:35:23

also for the world camera and the eye1 camera

mpk 23 June, 2017, 13:47:16

yes. this is normal.

user-d811b9 23 June, 2017, 14:43:58

@mpk So I cannot use this, right?

mpk 23 June, 2017, 14:44:23

if you change the exposure mode you can change it.

user-d811b9 23 June, 2017, 14:44:47

How ?

user-d811b9 23 June, 2017, 14:53:20

I can change only these

Chat image

mpk 23 June, 2017, 14:59:38

change Auto Exposure Mode to manual and you can change the exposure time.

user-d811b9 23 June, 2017, 15:07:26

sorry, maybe I'm not being clear. I want to change the auto exposure mode from "manual" to "auto".

Chat image

user-d811b9 25 June, 2017, 09:40:15

Hi guys, if I want to analyze eye tracking in a "simulation guide", what are the best calibration methods? And what frame rates should I use for the eye and world cameras?

user-d811b9 26 June, 2017, 08:02:51

hi, how should I read this result?

user-d811b9 26 June, 2017, 08:02:55

2017-06-26 11:36:21,819 - world - [INFO] calibration_routines.accuracy_test: Starting Accuracy_Test
2017-06-26 11:37:12,612 - world - [INFO] calibration_routines.accuracy_test: Stopping Accuracy_Test
2017-06-26 11:37:12,664 - world - [INFO] calibration_routines.accuracy_test: Collected 4923 data points.
2017-06-26 11:37:19,051 - world - [INFO] calibration_routines.accuracy_test: Gaze error mean in world camera pixel: 20.288190
2017-06-26 11:37:19,052 - world - [INFO] calibration_routines.accuracy_test: Error in degrees: [ 3.17522709 3.05694029 1.82212379 ..., 0.44822774 0.44400252 0.46188828]
2017-06-26 11:37:19,052 - world - [INFO] calibration_routines.accuracy_test: Outliers: (array([2799, 2802, 2803, 2804, 2805, 3008, 3009, 3010, 3011]),)
2017-06-26 11:37:19,052 - world - [INFO] calibration_routines.accuracy_test: Angular accuracy: 1.361990699164217
2017-06-26 11:37:19,212 - world - [INFO] calibration_routines.accuracy_test: Angular precision: 0.01974614055573268

user-d811b9 26 June, 2017, 08:04:22

and again, when I detect eye1 on the screen, the world view shows this:

wrp 26 June, 2017, 08:04:39

@user-d811b9, could you please define "simulation guide" or provide an example so that we can understand what you're referring to?

wrp 26 June, 2017, 08:05:15

@user-d811b9 in the accuracy test you can look at results in the GUI of the plugin in the World window

user-d811b9 26 June, 2017, 08:05:50

2017-06-26 11:38:21,039 - eye1 - [WARNING] eye: Process started.
2017-06-26 11:38:30,696 - eye1 - [WARNING] uvc: Turbojpeg jpeg2yuv: b'Corrupt JPEG data: 149 extraneous bytes before marker 0xd7'
2017-06-26 11:38:30,699 - world - [WARNING] uvc: Turbojpeg jpeg2yuv: b'Corrupt JPEG data: 345 extraneous bytes before marker 0xd4'
2017-06-26 11:38:30,704 - eye1 - [WARNING] uvc: Turbojpeg jpeg2yuv: b'Corrupt JPEG data: premature end of data segment'
2017-06-26 11:38:30,729 - eye1 - [WARNING] uvc: Turbojpeg jpeg2yuv: b'Corrupt JPEG data: 220 extraneous bytes before marker 0xd2'
2017-06-26 11:38:30,734 - world - [WARNING] uvc: Turbojpeg jpeg2yuv: b'Corrupt JPEG data: premature end of data segment'
2017-06-26 11:38:30,742 - eye1 - [WARNING] uvc: Turbojpeg jpeg2yuv: b'Corrupt JPEG data: 163 extraneous bytes before marker 0xd7'
2017-06-26 11:38:30,753 - eye1 - [WARNING] uvc: Turbojpeg jpeg2yuv: b'Corrupt JPEG data: 154 extraneous bytes before marker 0xd0'

wrp 26 June, 2017, 08:06:05

@user-d811b9 according to your accuracy test, the results were:

Angular accuracy: 1.361990699164217
Angular precision: 0.01974614055573268

user-d811b9 26 June, 2017, 08:06:23

are these results good?

user-d811b9 26 June, 2017, 08:06:45

to understand better

wrp 26 June, 2017, 08:07:19

@user-d811b9 1.36 degrees of angular accuracy is close to physiological limitations, but in our tests we have been able to achieve 0.6 degrees of angular accuracy
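To put such numbers in context, a hedged back-of-envelope conversion (the 500 mm viewing distance is just an example figure): the on-surface gaze error is roughly distance × tan(angular error), so 1.36° at half a metre is about 12 mm on the target.

```python
import math

def gaze_error_on_surface(distance_mm, error_deg):
    """Approximate on-surface gaze error (mm) for a given angular error."""
    return distance_mm * math.tan(math.radians(error_deg))

# 1.36 degrees of angular accuracy at a 500 mm viewing distance:
print(round(gaze_error_on_surface(500, 1.36), 1))   # roughly 11.9 mm
```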

user-d811b9 26 June, 2017, 08:08:33

what could the problem be?

user-d811b9 26 June, 2017, 08:08:46

a bad detection ?

wrp 26 June, 2017, 08:09:52

@user-d811b9 accuracy is not bad in your test, but could be better πŸ˜„

wrp 26 June, 2017, 08:10:25

Without seeing a recording of the eye videos (or a whole data set) it would only be a guess on my part in regards to what contributes to this result

wrp 26 June, 2017, 08:11:22

low confidence pupil detection often leads to poor results

wrp 26 June, 2017, 08:11:29

or slippage of the headset

wrp 26 June, 2017, 08:12:25

The TurboJpeg error - does this happen every time? What are the specs of your machine (CPU, RAM, etc)?

user-d811b9 26 June, 2017, 08:13:32

yes. Every time I open only the eye1 camera.

user-d811b9 26 June, 2017, 08:18:56

Chat image

user-d811b9 26 June, 2017, 08:19:40

RAM 12 Gb

wrp 26 June, 2017, 08:24:49

Hi @user-d811b9 but the cameras display images, correct?

user-d811b9 26 June, 2017, 08:26:10

Yes, both are on aperture priority and the pupil is always detected well

wrp 26 June, 2017, 08:26:38

ok, these are warnings and can be ignored for the time being

wrp 26 June, 2017, 08:26:50

we can adjust bandwidth settings, to reduce the warnings

user-d811b9 26 June, 2017, 08:27:24

how ?

wrp 26 June, 2017, 08:28:48

This is something that can be done on our end in future releases

user-d811b9 26 June, 2017, 08:31:04

but you should know that, after these messages, sometimes the eye1 camera shuts down by itself and restarts from the beginning, and every time I have to change the exposure, ROI, Hz, and other parameters again

mpk 26 June, 2017, 08:33:53

@user-d811b9 can you send us the full log of one of your sessions?

user-d811b9 26 June, 2017, 08:35:03

@wrp Regarding the first question about the simulation guide: I have to analyze the eye movements of a number of people while driving in a simulated environment.

user-d811b9 26 June, 2017, 08:35:32

ok, I'll try to copy a log of a session now

mpk 26 June, 2017, 08:36:00

the logfile is in the pupil_capture_settings folder btw.

user-d811b9 26 June, 2017, 08:41:28

@mpk I opened the application, changed the exposure from manual to aperture priority, and moved the eyes from right to left only... no calibration, nothing else

Chat image

mpk 26 June, 2017, 08:42:22

Hi, can you post the same with default settings and without changing eye camera settings?

mpk 26 June, 2017, 08:42:34

please also do a calibration.

mpk 26 June, 2017, 08:42:36

thank you!

user-d811b9 26 June, 2017, 08:47:28

Chat image

user-d811b9 26 June, 2017, 08:47:36

Chat image

user-d811b9 26 June, 2017, 08:48:59

@mpk I followed your indications

mpk 26 June, 2017, 08:49:20

can you post the logfile instead of a photo of it?

mpk 26 June, 2017, 08:49:32

thank you!

mpk 26 June, 2017, 08:50:57

you can attach files. Please don't paste the log like this!

mpk 26 June, 2017, 08:51:17

I deleted your last post.

user-d811b9 26 June, 2017, 08:51:26

ah ok ..

mpk 26 June, 2017, 08:51:33

no worries πŸ˜ƒ

user-d811b9 26 June, 2017, 08:51:53

i send again the log files

user-d811b9 26 June, 2017, 08:52:16

this is the first page

user-d811b9 26 June, 2017, 08:52:20

Chat image

user-d811b9 26 June, 2017, 08:52:36

Chat image

user-d811b9 26 June, 2017, 08:52:37

you see the photo now ?

wrp 26 June, 2017, 08:52:46

@user-d811b9 - I think Moritz is asking you to literally drop the .log file here

wrp 26 June, 2017, 08:52:49

not screenshots

wrp 26 June, 2017, 08:53:03

so that we can read the log file in a text editor on our computers

user-d811b9 26 June, 2017, 08:53:15

capture.log

user-d811b9 26 June, 2017, 08:56:55

I'm sorry, but now I have to run a driving simulation with a test driver for 40 minutes, so I will read your response afterwards. Thank you for your help

mpk 26 June, 2017, 09:30:29

@user-d811b9 thank you for your log. I can see that the eye1 camera is not functioning with 100% stability. We will send you a replacement camera if you want.

user-d811b9 26 June, 2017, 12:47:27

@mpk sorry for my late reply, and thanks for your analysis. This afternoon I will have a meeting with my professor and we will also talk about this; afterwards, I think he will contact you.

user-d811b9 26 June, 2017, 12:47:55

By email

mpk 26 June, 2017, 12:48:16

@user-d811b9 ok, sounds good. The replacement will be free, in case that was not clear.

user-d811b9 26 June, 2017, 12:56:58

@mpk ok thanks, but do we have to send the whole Pupil headset? Because if you send only the eye camera, the problem is that we don't know how to mount it on the headset πŸ˜…

mpk 26 June, 2017, 12:57:26

Hi @user-d811b9 no we just send you the camera. Replacement is super easy.

mpk 26 June, 2017, 12:57:46

to be more precise, we send you the left camera arm with the camera.

user-d811b9 26 June, 2017, 12:59:01

Ok. I will talk with my professor and contact you this afternoon

mpk 26 June, 2017, 12:59:09

[email removed]

user-d811b9 26 June, 2017, 16:53:51

@mpk sorry, but the meeting with my professor will be on the 28th of this month. I will discuss this headset problem with him, and he will write to you. Thanks for your help

user-f0de5d 26 June, 2017, 19:55:30

hey everyone~ I started working with a Pupil Labs eye tracking device a couple of weeks ago, and have a quick question: when readjusting the eye cameras by hand for optimal calibration, I noticed that they seem to get extremely hot. I'm worried that this might damage the device. Is it normal for the eye cameras to become so hot? thanks in advance πŸ˜ƒ

mpk 26 June, 2017, 19:58:29

@user-f0de5d the cameras do get hot. This is normal for 120fps capture devices of this size with this sensor.

user-006924 27 June, 2017, 13:03:49

Hello everyone, my Pupil Player app has stopped working. The .exe file opens for a couple of seconds and shows several messages, but closes itself before I can read anything. Does anybody have suggestions for how I could fix this issue? It was working just fine a couple of weeks ago.

mpk 27 June, 2017, 15:10:33

@NahalNrz#1253 please delete settings and try again!

user-db4664 27 June, 2017, 16:28:28

Hi all,

I encounter a problem when using the binocular Pupil Labs eye tracker. Specifically, when I use the gaze_point_3d data published over the network, the gaze_point_3d is not as reliable as the 2-D gaze. See the attached video. https://youtu.be/QQo5TEK95qE

The left view shows the eye_position_3d with respect to the scene camera (white plane) and the gaze normals of the two eyes. The cyan line segments correspond to the gaze_normal_3d. It can be seen that the gaze_point_3d keeps flipping from time to time; however, the gaze normals of both eyes are much more stable, as is the predicted gaze (represented by the red dot in the right view).

user-db4664 27 June, 2017, 16:28:32

I wonder why that is? Also, can I get the 2-D gaze (represented by the red dot) directly from the published data? If yes, which field corresponds to it?

I also raised the same question through Discord. Thanks in advance.

{"confidence":0.99, "norm_pos":[0.550466, 0.274002], "topic":"gaze", "gaze_point_3d":[48.9605, 238.669, 1351.43], "gaze_normals_3d":{0:[0.0211797, 0.122824, 0.992202], 1:[0.0628398, 0.199657, 0.977849]}, "base_data":[{"method":"3d c++", "confidence":0.99, "model_birth_timestamp":2.06285e+06, "id":0, "sphere":{"center":[0.98457, 3.73961, 38.1541], "radius":12}, "phi":-1.96718, "timestamp":2.06302e+06, "model_confidence":0.00419915, "projected_sphere":{"center":[335.999, 300.768], "angle":90, "axes":[389.997, 389.997]}, "diameter_3d":4.2213, "norm_pos":[0.371559, 0.306229], "topic":"pupil", "model_id":607, "diameter":97.2117, "circle_3d":{"center":[-3.64677, 4.06414, 27.0886], "normal":[-0.385945, 0.0270438, -0.922125], "radius":2.11065}, "ellipse":{"center":[237.798, 333.01], "angle":-14.2401, "axes":[83.2818, 97.2117]}, "theta":1.59784}, {"method":"3d c++", "confidence":0.99, "model_birth_timestamp":2.06283e+06, "id":1, "diameter":103.541, "phi":-2.01272, "timestamp":2.06302e+06, "model_confidence":0.461091, "projected_sphere":{"center":[327.654, 190.336], "angle":90, "axes":[403.601, 403.601]}, "diameter_3d":4.34627, "norm_pos":[0.330862, 0.729719], "topic":"pupil", "model_id":428, "sphere":{"center":[0.455143, -2.95328, 36.8681], "radius":12}, "circle_3d":{"center":[-4.62459, -4.6641, 26.1318], "normal":[-0.423311, -0.142569, -0.894696], "radius":2.17314}, "ellipse":{"center":[211.752, 129.735], "angle":24.7067, "axes":[81.7466, 103.541]}, "theta":1.42774}], "eye_centers_3d":{0:[20.7631, 15.2501, -17.6744], 1:[-39.8838, 13.3539, -20.7896]}, "timestamp":2.06302e+06}

-- Shuda Li

mpk 27 June, 2017, 16:35:42

Hi @user-db4664, the gaze intersection logic does not filter out intersections that (wrongly) end up on the wrong side. We should probably improve on that! You should use the gaze normals for more stable results. The 2D gaze point would be "norm_pos"; the units here are fractions of the scene camera width and height. If you multiply by those you will get the position in pixels.
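For illustration, that mapping can be written as a small helper (the function name and the 1280x720 resolution are just examples; use your scene camera's actual resolution, and only flip y if you want image coordinates with the origin at the top left):

```python
def norm_pos_to_pixels(norm_pos, frame_size=(1280, 720)):
    """Map Pupil's normalized gaze (origin bottom left, (1, 1) top right)
    to image pixel coordinates (origin top left)."""
    w, h = frame_size
    x = norm_pos[0] * w
    y = (1.0 - norm_pos[1]) * h   # image rows grow downward, so flip y
    return x, y

# Example: the "norm_pos":[0.550466, 0.274002] datum quoted above
print(norm_pos_to_pixels((0.550466, 0.274002)))
```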

user-db4664 28 June, 2017, 07:43:47

Many thanks for the quick reply, I'll give it a try. Also, does norm_pos take the lens distortion into account? Is the lens distortion pre-calibrated, and how can I retrieve the distortion parameters?

mpk 28 June, 2017, 07:44:49

norm_pos takes lens distortion into account, meaning that the 3d point gets distorted according to the lens distortion during backprojection onto the world camera view.

user-db4664 28 June, 2017, 07:45:12

I see

user-db4664 28 June, 2017, 07:46:13

What about the distortion parameters? Or do I have to do a calibration myself?

mpk 28 June, 2017, 07:47:03

we use a pre-calibration, but you can improve on that if you run the camera calibration plugin.

user-db4664 28 June, 2017, 07:49:39

Oh I see. I haven't tried the calibration plugins yet. Will do. Many thanks, it's very helpful!

user-d811b9 28 June, 2017, 08:27:29

Hi guys, if I want to use monocular detection, Pupil Capture (for 3D detection) shows a parameter called gaze distance mm. How should I read this?

mpk 28 June, 2017, 09:15:33

since the monocular mapper cannot know depth, we assume a distance of 500mm. You can change this here, but I recommend leaving it as is.
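A hedged sketch of what such a parameter implies (an illustration of the idea, not the actual Pupil implementation): with a single eye there is no vergence, so depth is unobservable and the mapper places the 3D gaze point at a fixed distance along the gaze direction.

```python
import math

def monocular_gaze_point(eye_center_mm, gaze_normal, distance_mm=500.0):
    """Place a 3D gaze point at a fixed distance along the gaze direction.

    Illustrative only: with one eye, depth cannot be measured, so a fixed
    viewing distance (500 mm by default here, matching Pupil's default)
    stands in for vergence.
    """
    norm = math.sqrt(sum(c * c for c in gaze_normal))
    unit = [c / norm for c in gaze_normal]                 # unit direction
    return [e + distance_mm * u for e, u in zip(eye_center_mm, unit)]

# Looking straight ahead from the eye center:
print(monocular_gaze_point([0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))
```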

user-d811b9 28 June, 2017, 09:18:18

@mpk thanks.

user-d811b9 28 June, 2017, 09:19:59

@mpk I wrote to you in private about the eye1 camera problem that we discussed on Monday

user-13e455 28 June, 2017, 15:35:49

hello everyone... does anyone have an idea how I can calculate the iris size using OpenCV?

user-db4664 28 June, 2017, 16:15:01

Hi All,

I encounter another issue while using norm_pos, which is supposed to correspond to the 2-D gaze of Pupil. See the attached video: https://youtu.be/2PCmgwLAXy8

The left view is the gaze drawn at the 2-D position transformed from norm_pos, and the right view is the original view of Pupil Capture (I'm using version v0912_window_x64). Theoretically, the gaze in both views should be well synchronised; however, this is obviously not the case.

I interpret norm_pos according to the Pupil documentation:

norm_pos is the 2-D normalized coordinate: "We use a normalized coordinate system with the origin 0,0 at the bottom left and 1,1 at the top right."

x_pixel = norm_pos[0] * screen_width (1280), y_pixel = (1 - norm_pos[1]) * screen_height (720)

user-db4664 28 June, 2017, 16:15:26

I also tried what mpk said on Discord: "The 2d gaze point would be 'norm_pos'; the units here are the scene camera width and height. If you multiply by that you will get the position in pixels."

x_pixel = norm_pos[0] * screen_width (1280), y_pixel = norm_pos[1] * screen_height (720)

Neither way can I get the predicted gaze synchronised with the red dot of Pupil Capture. I wonder why that is?

Best regards,

mpk 29 June, 2017, 07:59:03

@user-db4664 this does not only look like a different position but also a different temporal correlation. How do you choose which gaze point goes with which frame in your export?

user-db4664 29 June, 2017, 09:49:41

@mpk. Yes, there is a small lag of the gaze that causes the temporal correlation issue, but the poor spatial correlation is the main problem. For example, norm_pos[0] can never reach values close to 1, which correspond to the area close to the right boundary of the images.

mpk 29 June, 2017, 09:50:46

the code that draws the point does nothing but take "norm_pos" from gaze and draw it on the screen more or less as you described.

user-db4664 29 June, 2017, 09:52:03

yeah, that's right.

user-db4664 29 June, 2017, 09:52:39

I also have a synchronisation check to discard data that is too old (older than 0.1 second)

user-db4664 29 June, 2017, 09:54:03

I also notice that the timestamp published by Pupil Capture has limited precision: "timestamp":2.06302e+06

mpk 29 June, 2017, 09:54:17

I meant that our code does nothing but that.

user-db4664 29 June, 2017, 09:55:07

my code just does: x_pixel = norm_pos[0] * screen_width (1280), y_pixel = (1 - norm_pos[1]) * screen_height (720)

user-db4664 29 June, 2017, 09:55:21

why can't I get the same result as you do?

mpk 29 June, 2017, 09:55:21

seems correct.

mpk 29 June, 2017, 09:55:39

are you using gaze or pupil data here?

user-db4664 29 June, 2017, 09:56:42

{"confidence":0.99, "norm_pos":[0.550466, 0.274002], "topic":"gaze", "gaze_point_3d":[48.9605, 238.669, 1351.43], "gaze_normals_3d":{0:[0.0211797, 0.122824, 0.992202], 1:[0.0628398, 0.199657, 0.977849]}, "base_data":[{"method":"3d c++", "confidence":0.99, "model_birth_timestamp":2.06285e+06, "id":0, "sphere":{"center":[0.98457, 3.73961, 38.1541], "radius":12}, "phi":-1.96718, "timestamp":2.06302e+06, "model_confidence":0.00419915, "projected_sphere":{"center":[335.999, 300.768], "angle":90, "axes":[389.997, 389.997]}, "diameter_3d":4.2213, "norm_pos":[0.371559, 0.306229], "topic":"pupil", "model_id":607, "diameter":97.2117, "circle_3d":{"center":[-3.64677, 4.06414, 27.0886], "normal":[-0.385945, 0.0270438, -0.922125], "radius":2.11065}, "ellipse":{"center":[237.798, 333.01], "angle":-14.2401, "axes":[83.2818, 97.2117]}, "theta":1.59784}, {"method":"3d c++", "confidence":0.99, "model_birth_timestamp":2.06283e+06, "id":1, "diameter":103.541, "phi":-2.01272, "timestamp":2.06302e+06, "model_confidence":0.461091, "projected_sphere":{"center":[327.654, 190.336], "angle":90, "axes":[403.601, 403.601]}, "diameter_3d":4.34627, "norm_pos":[0.330862, 0.729719], "topic":"pupil", "model_id":428, "sphere":{"center":[0.455143, -2.95328, 36.8681], "radius":12}, "circle_3d":{"center":[-4.62459, -4.6641, 26.1318], "normal":[-0.423311, -0.142569, -0.894696], "radius":2.17314}, "ellipse":{"center":[211.752, 129.735], "angle":24.7067, "axes":[81.7466, 103.541]}, "theta":1.42774}], "eye_centers_3d":{0:[20.7631, 15.2501, -17.6744], 1:[-39.8838, 13.3539, -20.7896]}, "timestamp":2.06302e+06}

user-db4664 29 June, 2017, 09:57:13

here is a message I received. I am not sure what gaze or pupil data you mean

papr 29 June, 2017, 09:57:15

@user-db4664 do you get your data using Pupil Remote?

user-db4664 29 June, 2017, 09:57:31

no, I get data using pupil capture

papr 29 June, 2017, 09:57:44

Using a plugin?

user-db4664 29 June, 2017, 09:58:10

frame buffer publisher plugin

user-db4664 29 June, 2017, 09:59:14

yes, the Pupil Remote plugin

papr 29 June, 2017, 09:59:15

so you modified it such that it draws gaze positions into the frames before publishing them?

mpk 29 June, 2017, 09:59:38

can you post the code in a gist?

papr 29 June, 2017, 09:59:49

http://gist.github.com

user-db4664 29 June, 2017, 10:05:47

@papr, no, I receive both original scene image and data on my client-side applications and then draw the gaze position into the scene image and display it in my application.

user-db4664 29 June, 2017, 10:06:01

I'll provide my code, I'm working on it

papr 29 June, 2017, 10:07:02

Ok, the frame has its own timestamp. You can use it to correlate it with the gaze position's timestamp

user-db4664 29 June, 2017, 10:11:18

@papr do you mean that if the frame and its own timestamp are packed together in a multipart message?

user-db4664 29 June, 2017, 10:11:46

and it is different from the timestamp of the data

papr 29 June, 2017, 10:13:07

so you subscribe to two topics, frame.world and gaze, correct?

papr 29 June, 2017, 10:13:34

gaze should be a 2-part message, frame a 3-part message

user-db4664 29 June, 2017, 10:13:41

yeah

user-db4664 29 June, 2017, 10:13:55

I see

papr 29 June, 2017, 10:14:02

where the first part is the topic, the second part a msgpack-serialized dictionary, and the third one the image buffer
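A minimal sketch of splitting such a 3-part frame message into its components. Pupil serializes the payload with msgpack; JSON is used here only as a stand-in so the sketch is self-contained, and the sample message is fabricated:

```python
import json  # stand-in for msgpack in this self-contained sketch


def parse_frame_message(parts):
    """Split a 3-part frame message into (topic, payload, image_bytes).
    parts[0]: topic string, parts[1]: serialized dict, parts[2]: raw image."""
    topic = parts[0].decode()
    payload = json.loads(parts[1].decode())
    return topic, payload, parts[2]


# a fabricated example message
msg = [b"frame.world", b'{"timestamp": 2063021.123456}', b"\x00\x01"]
topic, payload, image = parse_frame_message(msg)
```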

papr 29 June, 2017, 10:14:16

and the dict in the 2nd part always contains a timestamp

papr 29 June, 2017, 10:14:40

this timestamp can be used to correlate multiple datums of different kinds
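The correlation papr describes can be sketched as a nearest-timestamp match, combined with the 0.1 s staleness check user-db4664 mentioned earlier (the field names follow the gaze datum format above):

```python
def match_gaze_to_frame(frame_ts, gaze_datums, max_age=0.1):
    """Return the gaze datum whose timestamp is closest to frame_ts,
    or None if the closest one is more than max_age seconds away."""
    if not gaze_datums:
        return None
    best = min(gaze_datums, key=lambda g: abs(g["timestamp"] - frame_ts))
    if abs(best["timestamp"] - frame_ts) > max_age:
        return None
    return best


gaze = [{"timestamp": 10.00}, {"timestamp": 10.04}, {"timestamp": 10.90}]
print(match_gaze_to_frame(10.05, gaze))  # {'timestamp': 10.04}
print(match_gaze_to_frame(10.50, gaze))  # None: closest datum is 0.4 s away
```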

user-db4664 29 June, 2017, 10:16:02

BTW, is the unit of the timestamp seconds? "timestamp":2.06302e+06

mpk 29 June, 2017, 10:16:24

yes. the unit is seconds. The EPOCH is settable

mpk 29 June, 2017, 10:16:34

usually by default it's the start time of the computer.

user-db4664 29 June, 2017, 10:17:53

Is that to say, if I set it to the start time of the Pupil application, I should be able to get much higher precision?

user-db4664 29 June, 2017, 10:18:12

I didn't reboot my pc for weeks

mpk 29 June, 2017, 10:19:03

I think you should already have enough. It's a 64-bit float.

user-db4664 29 June, 2017, 10:20:12

no, the message that I received just gives me this much precision, "timestamp":2.06302e+06, nothing more

user-db4664 29 June, 2017, 10:22:00

obviously, not all of the significant digits of the 64-bit float timestamp have been packed into the multipart message.
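One likely explanation (an assumption, not confirmed in the thread): "2.06302e+06" is a display artifact rather than lost data. Default `%g` formatting prints only 6 significant digits, while the underlying 64-bit float still carries the full value:

```python
t = 2063021.123456  # a Pupil-style timestamp in seconds

print("%g" % t)    # "2.06302e+06" -- only 6 significant digits shown
print("%.6f" % t)  # "2063021.123456" -- the double holds the full value
```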

papr 29 June, 2017, 10:23:31

you can set the epoch via pupil remote

user-db4664 29 June, 2017, 10:23:41

cool, thanks

user-db4664 29 June, 2017, 10:25:59

@papr

Chat image

user-db4664 29 June, 2017, 10:26:22

sorry, where can I set the epoch time? I'm using v0912 Windows x64

user-db4664 29 June, 2017, 10:41:36

don't worry, I'll just reboot my system and see if it improves things. Also, I would like to investigate the synchronisation between the gaze and frame publisher to ensure the temporal correlation, but I doubt that is the reason. I'll get back to you tomorrow. Anyway, thanks for your timely response.

papr 29 June, 2017, 12:10:02

You set it remotely using the Pupil Remote protocol.

user-db4664 29 June, 2017, 14:20:08

@papr Many thanks for the hint! The timestamp precision problem is now solved. See the attached image.

user-db4664 29 June, 2017, 14:20:20

Chat image

papr 29 June, 2017, 14:20:47

nice

user-db4664 29 June, 2017, 14:20:55

I am now working on the synchronisation between frame publisher and gaze data

user-db4664 29 June, 2017, 14:21:03

I'll get back to you

user-db4664 29 June, 2017, 14:21:10

thanks again papr!

papr 29 June, 2017, 14:21:18

no problem, happy to help

user-db4664 29 June, 2017, 14:21:24

πŸ˜ƒ

user-d7b89d 30 June, 2017, 08:11:17

A non-technical question here: is there a serial number for the Pupil devices?

papr 30 June, 2017, 08:12:05

@user-d7b89d No, unfortunately not.

user-d7b89d 30 June, 2017, 08:12:25

ok thanks for the quick answer

papr 30 June, 2017, 08:12:38

No problem. May I ask for the use-case?

user-d7b89d 30 June, 2017, 08:19:24

Pretty simple, we need to inventory the glasses as well as the HTC Vive integration

papr 30 June, 2017, 08:24:59

I see. An idea would be to add a sticker to the USB clip or to add a tag around the cable near the clip

papr 30 June, 2017, 08:25:50

Chat image

user-d7b89d 30 June, 2017, 08:28:17

thanks for the recommendation, I think the bureaucracy can be kept to a minimum in this case

user-d7b89d 30 June, 2017, 08:28:41

the device already got some inventory sticker, that should be enough

mpk 30 June, 2017, 08:29:02

@user-d7b89d please use the order ID as the SN if possible.

papr 30 June, 2017, 08:30:11

@mpk There can be multiple devices in an order, can't there?

mpk 30 June, 2017, 08:30:23

in that case please add a digit.

user-d7b89d 30 June, 2017, 08:30:55

ok, I'll use orderID-1 and orderID-2 for each of the devices

mpk 30 June, 2017, 08:35:04

cool

user-db4664 30 June, 2017, 14:14:12

Hi All,

Thanks to papr's help, I have now synchronised the gaze with the frame publisher. But, as I predicted, the problem of gaze interpretation remains. See the attached two videos:

https://youtu.be/xfob8t3t0MY

https://youtu.be/5vDu6PCGB_Y

In the videos, I show both the Pupil Capture views and my application. In my application, I use the messages published by Pupil Capture (not Pupil Remote) to visualise the 3-D gaze and 2-D gaze motion. As you can see, the synchronisation cannot be the problem because the 3-D gaze is well synchronised with Pupil Capture.

However, the 2-D gaze does not seem right.

I also notice that there are 3 norm_pos fields in the gaze multipart message, and they all have different values; probably I used the incorrect norm_pos. Which one should I use? Many thanks!

user-db4664 30 June, 2017, 14:15:00

Chat image

papr 30 June, 2017, 14:16:39

@user-db4664 The gaze object is a mapped point based on one or two pupil positions. These pupil positions are added under the base_data field of the gaze datum. They contain norm_pos fields as well, but those are relative to the eye window.

user-db4664 30 June, 2017, 14:16:59

I see,

papr 30 June, 2017, 14:17:00

Therefore my guess is that you are visualizing pupil positions and not gaze positions

papr 30 June, 2017, 14:17:11

These are of course meaningless in the world view

user-db4664 30 June, 2017, 14:17:12

okay

user-db4664 30 June, 2017, 14:18:03

so the third should be the correct norm_pos that I should use?

papr 30 June, 2017, 14:18:52

Yes. The gaze datum is a hierarchical dictionary. The correct gaze norm_pos is in the top level.
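A sketch of that hierarchy, with values taken from the datum posted above (base_data entries abbreviated): the top-level norm_pos is the gaze point in scene-camera coordinates, while the base_data norm_pos values are pupil positions relative to the eye windows.

```python
gaze_datum = {
    "topic": "gaze",
    "confidence": 0.99,
    "norm_pos": [0.550466, 0.274002],  # use this one for the world view
    "base_data": [
        {"topic": "pupil", "id": 0, "norm_pos": [0.371559, 0.306229]},
        {"topic": "pupil", "id": 1, "norm_pos": [0.330862, 0.729719]},
    ],
}

gaze_xy = gaze_datum["norm_pos"]  # scene-camera coordinates
pupil_xys = [p["norm_pos"] for p in gaze_datum["base_data"]]  # eye-window coordinates
```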

user-db4664 30 June, 2017, 14:22:13

Because I am using C++, I can't make use of a hierarchical dictionary to parse the message. So the norm_pos that immediately follows topic:"pupil" is the pupil centre position, yeah?

papr 30 June, 2017, 14:23:11

So is your print-out above a python or c++ output?

papr 30 June, 2017, 14:24:11

the keys are not guaranteed to follow the pupil topic. Python dictionaries do not guarantee ordered entries.

user-db4664 30 June, 2017, 14:24:14

it's c++ print-out

user-db4664 30 June, 2017, 14:24:40

cool I think I get it

papr 30 June, 2017, 14:25:01

{ "base_data" : [{...}, {...}] }

user-db4664 30 June, 2017, 14:25:12

okay

user-db4664 30 June, 2017, 14:25:23

many thanks papr, it's very helpful. I'll get back to you if I manage to do it in the right way

papr 30 June, 2017, 14:25:40

so you do have hierarchical access?

user-db4664 30 June, 2017, 14:26:29

no, I don't, but I can just use some trick to pick out the right norm_pos

papr 30 June, 2017, 14:26:47

ok, whatever works in c++ πŸ˜ƒ

user-db4664 30 June, 2017, 14:27:09

yeah πŸ˜ƒ

user-db4664 30 June, 2017, 14:50:58

It works now

user-db4664 30 June, 2017, 14:51:00

https://youtu.be/6IXHDuTX1ko

user-db4664 30 June, 2017, 14:51:28

I presume you also have a filter to remove unreliable norm_pos values?

user-db4664 30 June, 2017, 14:52:12

How do you do that?

papr 30 June, 2017, 14:52:20

yes, we clip the value to some multiple of 1, since the values are supposed to be between 0 and 1

papr 30 June, 2017, 14:53:04

this happens often when the confidence is low, e.g. you could use only gaze positions with a confidence >= 0.7
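The two checks papr describes (a confidence threshold and clipping to [0, 1]) can be sketched as:

```python
def usable_gaze(datum, min_confidence=0.7):
    """Return the gaze norm_pos clipped to [0, 1], or None if the
    datum's confidence is below the threshold."""
    if datum.get("confidence", 0.0) < min_confidence:
        return None
    x, y = datum["norm_pos"]
    clip = lambda v: min(1.0, max(0.0, v))
    return clip(x), clip(y)


print(usable_gaze({"confidence": 0.99, "norm_pos": [1.3, 0.5]}))  # (1.0, 0.5)
print(usable_gaze({"confidence": 0.2, "norm_pos": [0.5, 0.5]}))   # None
```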

user-db4664 30 June, 2017, 14:53:17

I see!

user-db4664 30 June, 2017, 14:53:30

cool, thanks papr!

End of June archive