πŸ‘ core


user-51a3b1 02 January, 2019, 05:36:16

Hi, I'm trying to use the Pupil cam for HoloLens, but the USB device is not recognized on the computer. I think this problem is related to the USB driver. I read the guide, and I know it should work when I execute Pupil Capture, but it doesn't.. Is there anyone who can solve this problem??

user-51a3b1 02 January, 2019, 05:48:30

I think I found the problem.. a cable was cut. Thanks

papr 02 January, 2019, 11:03:49

@here We are pleased to announce the latest release of Pupil software v1.10! We highly recommend downloading the latest application bundles: https://github.com/pupil-labs/pupil/releases/tag/v1.10

user-41c874 02 January, 2019, 14:59:49

Hey. I am trying to use the natural features calibration method. The computer on which I use the eye tracking system is separate from the one which runs our psychophysics task. On the other screen I show 9 red dots, one at a time as stimuli, whose coordinates we know in our screen space. (The 9-dot pattern shown in this link is what I am trying to use: http://www.okazolab.com/faq-blog/how-to-calibrate-an-eye-tracker-within-eventide) The subject is asked to fixate on those dots and I click on those 9 dots in the world camera view. How are pupil positions and gaze positions normalized in this case? I am a little confused about mapping the coordinate system of the eye tracker onto the coordinate system of the screen for our psychophysics task. Could you help me out with that? And do you recommend a better method of calibration, if not this? Also, where is the readable calibration file saved for each recording? All the Pupil Capture calibration file gives is the angular accuracy and precision. Where can I get the gain and offset values?

papr 02 January, 2019, 15:03:36

Hey @user-41c874 Please do not repost your questions. A small reminder is usually ok. To your question: Pupil gaze is calibrated to the scene camera's coordinate system. Use the Surface Tracker plugin to get gaze in reference to an area of interest. Are you using the offline calibration feature of Player, or are we talking about online calibration with Capture?

papr 02 January, 2019, 15:07:37

@user-d9bb5a I still need to review your recording. This might be a bug in Player. If this is true, we will create an issue on Github where its solution can be tracked.

user-41c874 02 January, 2019, 15:18:56

@papr Thanks for your reply. I tried the online calibration with Capture using the natural features calibration method. Is the data in pupil_positions.csv also calibrated to the world camera's coordinate system at the time of calibration? (Or is that the case only for the data in gaze_positions.csv?) I will try to use the Surface Tracker plugin now and get back to you about that soon. Thanks for the suggestion.

papr 02 January, 2019, 15:23:59

@user-41c874 Pupil (pupil_positions.csv) data is relative to the eye camera's coordinate system and therefore not calibrated. Only gaze (gaze_positions.csv) and data based on gaze (e.g. fixations/gaze on surfaces) are calibrated.

papr 02 January, 2019, 15:26:56

Please be aware that the online natural feature calibration can result in low accuracy since it depends on a second person clicking the correct positions in the Capture window while the subject might be moving her/his head

papr 02 January, 2019, 15:28:35

In your case, I recommend using the offline calibration feature or one of the online calibration methods that depend on automatic marker detection (e.g. screen marker/manual marker/single marker calibration)

user-41c874 02 January, 2019, 15:39:52

So, pupil data is immune to movement of the head, but gaze data isn't. Right?

papr 02 January, 2019, 15:40:57

What do you mean by immune?

user-41c874 02 January, 2019, 15:57:35

I need to combine the normalised x and y positions of gaze/pupil with the pixel coordinates of the stimulus I show on a screen. So, if a particular x-y position of the pupil corresponds to a particular x-y position on the screen used to show the stimulus, and the subject moves his head in between, then the same x-y pupil positions will still correspond to the same coordinates on my screen. But, since the gaze x-y position is dependent on the world camera coordinates, I presume gaze x-y positions will change if the head position is changed, even though the subject is still looking at the same place. (Or is this not the case?)

papr 02 January, 2019, 16:11:59

No, this is not the case. We have 3 independent coordinate systems here: eye camera image space, world camera image space, and the monitor pixel grid. We need to learn their relationships before we can translate locations between them. A calibration learns the mapping function between the eye and world cameras. This assumes that the cameras' positions relative to each other are fixed.

The relation between the world camera and the monitor is not fixed, since the subject is allowed to move his/her head. Therefore we need to re-estimate the current relationship for each recorded world frame. To do this we need the square/surface markers.

Afterwards we can translate pupil positions (eye image coordinates) to gaze (world image coordinates) to mapped surface gaze (monitor coordinates)
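
For illustration, a minimal sketch of that last translation step, assuming a Surface Tracker export file named gaze_positions_on_surface_Screen.csv with x_norm/y_norm/on_srf columns (names can vary slightly between versions) and an assumed 1920x1080 monitor outlined by the surface markers:

```python
# Hypothetical sketch: surface-normalized gaze -> monitor pixels.
import csv

MONITOR_W, MONITOR_H = 1920, 1080  # assumed monitor resolution

with open("gaze_positions_on_surface_Screen.csv") as f:
    for row in csv.DictReader(f):
        if row["on_srf"] != "True":
            continue  # gaze fell outside the surface
        # x/y_norm originate at the surface's lower-left corner
        x_px = float(row["x_norm"]) * MONITOR_W
        y_px = (1.0 - float(row["y_norm"])) * MONITOR_H  # flip to a top-left origin
        print(x_px, y_px)
```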

user-41c874 02 January, 2019, 18:06:53

Ah! That really helps clear up the confusion in my head! Thanks a lot. I will try to use surface tracking with these markers (online and offline) and then get back to you. Thanks!!

user-d9bb5a 03 January, 2019, 06:36:07

@papr I sent a message to your mail a week ago.

papr 03 January, 2019, 11:05:04

@user-d9bb5a I was not able to reproduce the issue (no exported heatmaps) on mac. I will try to reproduce on Windows. Attached is the exported heatmap.

Chat image

user-d9bb5a 03 January, 2019, 11:24:37

thanks

papr 03 January, 2019, 12:48:47

@user-d9bb5a I was also not able to reproduce the issue on our Windows 10 machine. Please make sure that you are using v1.9. The export included in your recording that you sent to us indicates that you were using v1.8

user-adc157 03 January, 2019, 20:32:32

Hi, when I use my mobile phone, I can record with audio. However, when I use my MacBook Pro, I can't record audio. I opened the audio plugin.

papr 04 January, 2019, 08:21:19

@user-adc157 hi, which versions of Capture and macOS do you use?

user-6fdb19 04 January, 2019, 12:20:31

hello... can anyone spare a video of a pupil camera recording?

papr 04 January, 2019, 12:21:13

@user-6fdb19 do you need a single eye video recording or a complete recording that you open in Player?

user-6fdb19 04 January, 2019, 12:21:28

single eye recording!

user-6fdb19 04 January, 2019, 12:23:25

can you send it to my email please? [email removed]

papr 04 January, 2019, 12:23:38

Sure

user-6fdb19 04 January, 2019, 12:24:01

preferably a video that is at least 3 minutes long

user-6fdb19 04 January, 2019, 12:24:13

thanks mate!! πŸ˜ƒ

papr 04 January, 2019, 12:24:45

Not sure if our public example is that long. I will check

user-6fdb19 04 January, 2019, 12:25:25

im going to buy the pupil-labs for my phd... i want to get things started already

papr 04 January, 2019, 12:26:45

@user-6fdb19 In this case I would recommend downloading Player from the github release page and playing around with this dataset: https://drive.google.com/open?id=1OugotQQHsrO42S0CXwvGAa0HDvZ_WChG

user-6fdb19 04 January, 2019, 12:27:28

thanks mate!

user-6fdb19 04 January, 2019, 12:27:46

with pupil player can i extract the pupil camera video to .avi?

papr 04 January, 2019, 12:28:51

You can export the eye and world videos to mp4 using the World Video Exporter plugin (loaded by default) and the Eye Video Exporter plugin (activate in the plugin manager).

user-c1923d 04 January, 2019, 12:29:09

Quick question, regarding pupil player. When loading videos that were recorded with opencv, I get a warning about the pts not being evenly spaced despite having a timestamp file. Is the evenly spaced pts assumption used solely to retrieve the images on playback or is it going to ignore the actual timestamps and screw up the gaze mapping?

papr 04 January, 2019, 12:29:10

Also, have a look at our user docs on our website

papr 04 January, 2019, 12:31:34

@user-c1923d The evenly spaced pts are required for well-behaved playback. The gaze mapping itself is not affected by this. Be aware that the visualization might be incorrectly shown.
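
As a side note, a quick hedged way to inspect how uneven the timestamps actually are (assumes the recording's world_timestamps.npy, which Player reads):

```python
import numpy as np

ts = np.load("world_timestamps.npy")  # one timestamp per world frame, in seconds
dt = np.diff(ts)
print("mean frame interval:", dt.mean())
print("min/max interval:", dt.min(), dt.max())  # large spread = unevenly spaced pts
```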

user-c1923d 04 January, 2019, 12:37:52

Ok. So exported gaze positions later on should map correctly to the timestamps file

papr 04 January, 2019, 12:38:03

yes

user-c1923d 04 January, 2019, 12:38:24

Thanks!

user-6fdb19 04 January, 2019, 12:42:43

thanks for the email papr πŸ˜ƒ

user-adc157 04 January, 2019, 17:50:59

Hi, I had v1.8; today I downloaded the new version.

user-4878ad 04 January, 2019, 18:58:01

Just wondering if there was a place where the pupil dilation data would be stored during recording and how I could access that data.

user-4878ad 04 January, 2019, 18:58:15

Thanks in advance πŸ˜ƒ

user-4878ad 04 January, 2019, 18:59:02

Do I have to have a certain configuration for that information to be recorded?

papr 04 January, 2019, 19:05:27

@user-4878ad do you need to access the information during the recording or after the recording?

user-4878ad 04 January, 2019, 19:05:50

@papr after the recording

papr 04 January, 2019, 19:07:34

@user-4878ad open the recording, export it with the raw data exporter, and check the pupil_positions.csv
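
A minimal sketch of reading the dilation data from that export (column names as assumed for the 1.x Raw Data Exporter; "diameter" is in image pixels from the 2d detector, and "diameter_3d", if present, is an estimate in mm from the 3d eye model):

```python
# Hedged sketch: extract pupil diameter per sample from pupil_positions.csv.
import csv

with open("exports/000/pupil_positions.csv") as f:
    for row in csv.DictReader(f):
        if float(row["confidence"]) < 0.6:
            continue  # skip low-confidence detections
        print(row["timestamp"], row["diameter"])
```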

user-4878ad 04 January, 2019, 19:09:44

Chat image

user-4878ad 04 January, 2019, 19:10:35

This is what I have from the recording... Sorry I'm not that great with computer stuff... not sure why they assigned this task to me πŸ˜…

user-4878ad 04 January, 2019, 19:10:59

@papr

papr 04 January, 2019, 19:13:27

@user-4878ad did you download Pupil Player?

user-4878ad 04 January, 2019, 19:13:39

yes!

papr 04 January, 2019, 19:14:05

Great, open it, and drag the 000 folder on top of the gray window

papr 04 January, 2019, 19:15:32

After opening the recording, hit e. Check the recording folder in the file explorer window. It should have an export folder containing the exported pupil data

user-4878ad 04 January, 2019, 19:17:23

I pressed 'e'

user-4878ad 04 January, 2019, 19:19:30

there is an export file!! πŸ˜ƒ

user-4878ad 04 January, 2019, 19:21:47

sorry, I am working between computers.... the new laptop has pupil labs but no MS office so I am transferring the file over to the other computer

user-c1923d 05 January, 2019, 10:19:21

I see in the docs that the recommended threshold for pupil confidence is 0.6. Similarly, is there any recommended threshold for the gaze estimation confidence?

papr 05 January, 2019, 10:27:15

@user-c1923d gaze confidence is inferred from pupil confidence. Take a look at the accuracy visualizer plugin if you want to judge the gaze estimation quality.

user-c1923d 05 January, 2019, 10:30:48

@papr this is post hoc for evaluating the gaze quality at the end of the experiment. So I'm just looking for a threshold to consider samples as data loss or for actual measurements

user-adc157 05 January, 2019, 10:31:38

Hi, when I use my mobile phone, I can record with audio. However, when I use my MacBook Pro, I can't record audio. I opened the audio plugin. I selected voice and sound. I have got v1.8 Capture and Player.

papr 05 January, 2019, 10:39:18

@user-adc157 I can try to reproduce the issue on Monday.

user-41c874 07 January, 2019, 10:19:35

@wrp and @papr Hey. I am having similar issues as @user-7bc627 on 02.10.2018. Pupil Cam ID1 is not appearing in the "select to activate" list; instead it says "unknown". It shows "Capture initialization failed" for eye1 and shows Ghost capture. The world camera and Pupil Cam ID0 work fine. I have tried reinstalling drivers and running as an administrator, and I am using the latest version of Pupil Capture (1.10). Any suggestions on what I should try apart from these troubleshooting options?

papr 07 January, 2019, 10:21:11

@user-41c874 Did Pupil Cam ID1 work before, or has it never been working for you?

user-41c874 07 January, 2019, 10:26:55

It worked fine before and then suddenly stopped working.

papr 07 January, 2019, 10:31:28

@user-41c874 Please contact info@pupil-labs.com with the information above as well as your order id. My colleague will take over support from there.

user-41c874 07 January, 2019, 10:42:02

Okay. Thanks.

papr 07 January, 2019, 11:19:13

@here starting on January 11 we will be making changes to how this public server is moderated. In order to post messages you will need to meet the following criteria: 1. That you have a verified email address for your Discord account. 2. That you have been a member of the server for more than 5 minutes.

These changes follow Discord's recommendations for public servers. The primary aim is to reduce the potential for spam/bots from posting off-topic/explicit content.

If, for any reason, you disagree with these changes, please let us know and we can discuss.

user-6fdb19 07 January, 2019, 11:55:09

hello everyone

user-6fdb19 07 January, 2019, 11:55:40

does anyone have a link to the pupil detection algorithm used by pupil-labs?

user-c1923d 07 January, 2019, 12:10:04

The original pupil paper has a description, but it seems they also use some tracking now

user-c1923d 07 January, 2019, 12:11:05

You can always check the source code

user-c1923d 07 January, 2019, 12:11:59

Alternatively, I wrote a brief description in the PuReST paper (don't know if I'm allowed to link that here)

user-6fdb19 07 January, 2019, 12:12:31

do you mean "Pupil: An Open Source Platform for Pervasive Eye Tracking and Mobile Gaze-based Interaction

user-c1923d 07 January, 2019, 12:13:34

Yes, there is a PETMEI paper and a technical report on arxiv

user-c1923d 07 January, 2019, 12:14:17

The other paper I mentioned is "PuReST: robust pupil tracking for real-time pervasive eye tracking"

user-6fdb19 07 January, 2019, 12:15:38

thanks mate!

papr 07 January, 2019, 12:45:38

@user-c1923d I did not realise that it was you. Welcome to the channel. I think we met personally at ETRA last year. Yes, the "minimum data confidence threshold" (by default 0.6, adjustable in the Player general settings) is the threshold that you are looking for. There is a second, higher threshold that we use to filter pupil data when calibrating.

user-bd800a 07 January, 2019, 12:51:10

Hi, I'm trying to run Pupil Capture from source (windows 10) but I have 2 problems: From the command prompt I get an error in the build.py process for the pupil_detectors (I also get warnings about /W3 being overridden by /W and the c++11 option being ignored). If I try to start main.py I get an error from git (which works fine in the command prompt...). Any help? Thanks

papr 07 January, 2019, 12:55:39

@user-bd800a Hi, - Regarding the first issue: Are you building the pupil detectors directly by running python setup.py build? This often gives a more detailed error message. - Regarding the second issue: Did you download the source as a zip from github? It does not include the necessary version information that we require. Please use git clone to download the source.

Generally, we recommend running the bundle (especially on Windows). Nearly everything can be done via a custom plugin or the Pupil Remote network api. This avoids the tedious source dependency installation.

user-bd800a 07 January, 2019, 12:57:14

Ok, I will try the setup.py. I downloaded it using github desktop, so that should have included it, I guess

papr 07 January, 2019, 12:58:19

@user-bd800a Yes, I agree that should be equivalent to git clone. Could you pm me the git error message that you get by executing main.py?

user-c1923d 07 January, 2019, 13:18:09

@papr thanks! I'm going with two thresholds based on your docs: >0.6 ("useful data") and >0 (all pupils for which at least something is known)

papr 07 January, 2019, 13:19:36

@user-c1923d The only useful information in pupil data with confidence==0. is the timestamp, I guess. Everything else is default values IIRC

user-c1923d 07 January, 2019, 13:24:26

@papr makes sense. Anyway, thanks for sponsoring ETRA again! Hopefully I'll see you guys around this year as well

papr 07 January, 2019, 13:27:05

@user-c1923d Yes, we are planning on coming, too! πŸ™‚

user-c8c8d0 07 January, 2019, 14:59:38

hello everyone! I've a question about the Offline Surface Tracker... neither the docs nor the player interface explains well what the X and Y size mean, apart from being very important for the heatmap visualization

papr 07 January, 2019, 15:04:02

@user-c8c8d0 It is the number of bins (resolution) which will be used for the heatmap generation. E.g. if your surface has a size of 30x20 cm and you set the size to width=15, height=10, the resulting heatmap bins will be of size 2cm x 2cm

user-c8c8d0 07 January, 2019, 15:05:19

cool, the heatmap bins! πŸ˜ƒ thank you very much @papr .. it wasn't very clear

papr 07 January, 2019, 15:07:22

@user-c8c8d0 Yeah, our upcoming version will separate surface unit scaling from heatmap bins. This will make it clearer to use.

user-c8c8d0 07 January, 2019, 15:09:48

Oh, ok.. and is there already a planned release date?

papr 07 January, 2019, 15:10:15

No, there are still some implementation issues that need to be resolved first.

user-c8c8d0 07 January, 2019, 15:10:34

πŸ‘Œ thank you again!

user-026f3a 07 January, 2019, 23:51:58

Hello everyone. I'm trying to set up Pupil Capture using the latest Windows 10 bundle. Both eye capture cams are working fine, but I'm getting no world camera feed. If I click calibration I get an error: world - [ERROR] calibration_routines.screen_marker_calibration: Calibration requiers world capture video input. If I then go to the Realsense Manager section and select Intel RealSense R200 from the Activate source dropdown I get an error with trace:

world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "launchables\world.py", line 596, in world
  File "launchables\world.py", line 391, in handle_notifications
  File "shared_modules\plugin.py", line 340, in add
  File "shared_modules\video_capture\realsense_backend.py", line 213, in __init__
  File "site-packages\pyrealsense-2.2-py3.6-win-amd64.egg\pyrealsense\core.py", line 27, in __init__
  File "site-packages\pyrealsense-2.2-py3.6-win-amd64.egg\pyrealsense\core.py", line 36, in start
  File "site-packages\pyrealsense-2.2-py3.6-win-amd64.egg\pyrealsense\utils.py", line 46, in _check_error
pyrealsense.utils.RealsenseError: rs_create_context(api_version:11203) crashed with: IKsControl::KsProperty(...) returned 0x800706be

Any clues on what's going on here?

papr 08 January, 2019, 08:21:23

@user-026f3a please contact info@pupil-labs.com regarding the use of the R200 on Windows.

papr 08 January, 2019, 08:22:42

@user-026f3a you do have a R200 realsense camera as World camera, don't you?

user-c8c8d0 08 January, 2019, 09:33:43

hello again! I've another question about the Offline Surface Tracker...is it possible to export the heatmaps over the world video?

papr 08 January, 2019, 09:34:08

@user-c8c8d0 No, this is not possible, unfortunately.

user-c8c8d0 08 January, 2019, 09:35:12

because they are rendered with opengl?

papr 08 January, 2019, 09:40:02

Correct. To be more precise: They are drawn with opengl. But all world video visualizations need to be rendered into the frame data. Therefore, rendering heatmaps into the world video is not supported.

user-c8c8d0 08 January, 2019, 09:45:01

@papr Thank you!

user-c8c8d0 08 January, 2019, 09:46:51

maybe replacing opengl with opencv can work

papr 08 January, 2019, 09:49:13

@user-c8c8d0 Rendering the heatmaps with opencv is much slower than drawing with opengl. Nonetheless, we could look into adding this feature for the export. What exactly would be your use case/benefit of having the heatmaps rendered into the world video? The heatmaps are aggregated over the complete recording. They would not change their values.

user-c8c8d0 08 January, 2019, 09:53:42

@papr It would be for pure visualization purposes, like the circle or other visualizations. But I understand that it isn't worth the effort; instead of exporting the video, one could simply reproduce it in the Player.

user-41c874 08 January, 2019, 10:55:57

Hello again! Quick question. Can the world camera be detached from the headset and mounted elsewhere?

papr 08 January, 2019, 10:58:14

@user-41c874 Do you mean if one can mount it to a different headset?

user-41c874 08 January, 2019, 13:06:29

No. Just whether the world camera can be taken off the headset and mounted somewhere else. (We plan on doing behavioural studies with head-fixed monkeys, and it would be simple to mount it on top of the primate chair.)

papr 08 January, 2019, 13:09:09

@Tarana#6043 yes, technically it is possible. But I am not sure what our repair policy is, if something breaks in this case. Please contact info@pupil-labs.com for details.

user-41c874 08 January, 2019, 13:22:02

Thanks!

user-c5bbc4 08 January, 2019, 18:31:34

Hello, I am following the doc about windows dependencies https://docs.pupil-labs.com/#opencv-to-pupil-external . It shows 'opencv\build\x64\vc14\bin\opencv_world320.dll', but opencv has been updated. Can I use 'opencv_world400.dll' instead? Can I do the same for other dependencies or wheels? Thanks so much!

user-c5bbc4 08 January, 2019, 18:33:40

like, I am thinking of using boost 1.65 instead of 1.64 to avoid an 'unknown compiler' warning. Would this be fine?

papr 08 January, 2019, 18:34:08

Hi @user-c5bbc4, the windows dependency setup is very fragile and we recommend following the instructions as closely as possible. It might be possible to get it running with other versions, but there are no guarantees. Actually, it is often enough to run the bundled application instead of running from source. Did you try running the bundle?

user-c5bbc4 08 January, 2019, 18:36:04

Ah, the software works fine for me. I am trying to work with the code

papr 08 January, 2019, 18:37:06

Is there a specific change you have in mind? Do you need access to specific data?

user-c5bbc4 08 January, 2019, 18:37:23

I

user-c5bbc4 08 January, 2019, 18:37:51

Oops. Pushed the wrong button

user-c5bbc4 08 January, 2019, 18:38:21

I am an MRes student now, trying to improve the performance of pupil detection

user-c5bbc4 08 January, 2019, 18:38:48

but still new. Now I'm stuck on running the code

user-c5bbc4 08 January, 2019, 18:40:04

C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Tools\MSVC\14.16.27023\include\type_traits(1357): note: see reference to class template instantiation 'std::aligned_union<1,_Ty>' being compiled
        with [ _Ty=singleeyefitter::Detector2DResult ]
C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Tools\MSVC\14.16.27023\include\memory(1820): note: see reference to alias template instantiation 'std::aligned_union_t<1,singleeyefitter::Detector2DResult>' being compiled
C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Tools\MSVC\14.16.27023\include\memory(1866): note: see reference to class template instantiation 'std::_Ref_count_obj<_Ty>' being compiled
        with [ _Ty=singleeyefitter::Detector2DResult ]
c:\work\pupil\pupil_src\shared_modules\pupil_detectors\detect_2d.hpp(76): note: see reference to function template instantiation 'std::shared_ptr<singleeyefitter::Detector2DResult> std::make_shared<singleeyefitter::Detector2DResult,>(void)' being compiled
error: command 'C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Tools\MSVC\14.16.27023\bin\HostX86\x64\cl.exe' failed with exit status 2

user-c5bbc4 08 January, 2019, 18:40:42

this is what I got after trying to run 'python setup.py build' with pupil_detectors

user-c5bbc4 08 January, 2019, 18:40:45

any ideas?

papr 08 January, 2019, 18:41:17

To change the pupil detection algorithm you indeed need to change the code. Unfortunately, that output does not contain an actual error message/failure reason as far as I can see.

user-c5bbc4 08 January, 2019, 18:41:48

oh, you mean I can ignore this message?

papr 08 January, 2019, 18:42:24

No, there is definitively something going wrong. πŸ˜…

user-c5bbc4 08 January, 2019, 18:42:44

πŸ˜‚ yes. I believe so

user-c5bbc4 08 January, 2019, 18:43:47

cl : Command line warning D9025 : overriding '/W3' with '/w'
cl : Command line warning D9002 : ignoring unknown option '-std=c++11'
detector_2d.cpp
Unknown compiler version - please run the configure tests and report the results
C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Tools\MSVC\14.16.27023\include\type_traits(1271): error C2338: You've instantiated std::aligned_storage<Len, Align> with an extended alignment (in other words, Align > alignof(max_align_t)). Before VS 2017 15.8, the member type would non-conformingly have an alignment of only alignof(max_align_t). VS 2017 15.8 was fixed to handle this correctly, but the fix inherently changes layout and breaks binary compatibility (only for uses of aligned_storage with extended alignments). Please define either (1) _ENABLE_EXTENDED_ALIGNED_STORAGE to acknowledge that you understand this message and that you actually want a type with an extended alignment, or (2) _DISABLE_EXTENDED_ALIGNED_STORAGE to silence this message and get the old non-conformant behavior.

user-c5bbc4 08 January, 2019, 18:44:34

does this make any sense? The output is over Discord's length limit, so it's a bit tricky to send it all

papr 08 January, 2019, 18:45:52

Please define [...] (2) _DISABLE_EXTENDED_ALIGNED_STORAGE to silence this message and get the old non-conformant behavior.

☝ This looks like the solution

user-c5bbc4 08 January, 2019, 18:47:00

yes. I believe the other way to deal with it is to use BOOST 1.65, which is why I am asking whether other versions are acceptable

user-c5bbc4 08 January, 2019, 18:48:03

A bit embarrassing to say, but I do not know how to pass this _DISABLE_EXTENDED_ALIGNED_STORAGE define to the compiler 😷
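
One way to pass such a define is the MSVC compiler flag /D_DISABLE_EXTENDED_ALIGNED_STORAGE (MSVC's equivalent of -D on gcc/clang). A hedged sketch of where that could go in a setuptools build; the file names and extension layout here are illustrative, not the actual pupil_detectors build script:

```python
# Hypothetical setup.py fragment for an MSVC-built extension.
from setuptools import setup, Extension

ext = Extension(
    "detector_2d",
    sources=["detector_2d.cpp"],
    language="c++",
    extra_compile_args=["/D_DISABLE_EXTENDED_ALIGNED_STORAGE"],  # preprocessor define
)

setup(name="pupil_detectors", ext_modules=[ext])
```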

user-c5bbc4 08 January, 2019, 18:51:18

Oh, I did not notice this issue. Thanks a lot!!

user-c5bbc4 08 January, 2019, 19:07:56

Hello, 'Player' is working now, thanks very much! But 'capture' and 'service' run into the same problem

user-c5bbc4 08 January, 2019, 19:07:57

C:\work\pupil\pupil_src>python main.py service
service - [ERROR] launchables.service: Process Service crashed with trace:
Traceback (most recent call last):
  File "C:\work\pupil\pupil_src\launchables\service.py", line 98, in service
    import pupil_detectors
  File "C:\work\pupil\pupil_src\shared_modules\pupil_detectors\__init__.py", line 21, in <module>
    from .detector_3d import Detector_3D
  File "detector_3d.pyx", line 35, in init pupil_detectors.detector_3d
ModuleNotFoundError: No module named 'visualizer_3d'

user-c5bbc4 08 January, 2019, 19:08:40

any ideas about this 'visualizer_3d' module?

papr 08 January, 2019, 19:13:45

My only guess is that there is still something wrong with the detector_3d compilation

user-c5bbc4 08 January, 2019, 19:46:05

it should be, but there is no message now after running setup.py under pupil_detectors

user-c5bbc4 08 January, 2019, 19:46:13

C:\work\pupil\pupil_src\shared_modules\pupil_detectors>python setup.py build
running build
running build_ext

that's it

user-c5bbc4 08 January, 2019, 19:47:10

so does optimization_calibration. I would assume they are working fine?

user-c5bbc4 08 January, 2019, 20:07:25

I copied the 'visualizer_3d' file from '/singleeyefitter' to '/pupil_detectors'

user-c5bbc4 08 January, 2019, 20:07:32

and it is working now

user-54bbd5 08 January, 2019, 22:51:07

Hey fellow developers, I've followed the instructions in the developer's docs as closely as possible to try to get the capture module running from the source code

user-54bbd5 08 January, 2019, 22:51:56

Unfortunately I've run into an error that has been posted on the pupil labs github help page but has yet to be addressed

user-54bbd5 08 January, 2019, 22:52:02

https://github.com/pupil-labs/pupil/issues/1208

user-54bbd5 08 January, 2019, 22:52:11

any help will be greatly appreciated

user-54bbd5 08 January, 2019, 23:04:12

This error appeared when I attempted to build the pupil_detector module through 'python setup.py build'

papr 09 January, 2019, 08:08:05
user-bd800a 09 January, 2019, 10:58:55

Hi, following up on @user-c5bbc4's issue, how can _DISABLE_EXTENDED_ALIGNED_STORAGE be passed from detector_3d.pyx? Does it fix the build issue? I tried another version of boost, but it was unsuccessful.

user-bc5d02 09 January, 2019, 14:33:37

Hi! I have a problem with the second eye camera using Pupil Mobile. The eye video window (on my laptop) is constantly switching between "not available. Running in a ghost mode" and settings without a video image. On the app it also does not load. The world camera and one eye camera are ok. I use the binocular (200hz) eye-tracker and a Blackview BV 9500 smartphone. I understand that the problem could be in my smartphone, but maybe somebody has faced this kind of problem with other smartphones too?

user-64b0d2 09 January, 2019, 15:14:44

Hi all, I have some questions with regards to the Pupil LSL Relay Plugin; is anyone using it? I have tried installing it following the steps described on the github page (https://github.com/labstreaminglayer/App-PupilLabs); however, when I open Pupil Capture I cannot find the plugin among the other plugins. Any suggestions?

wrp 09 January, 2019, 15:18:17

@here We are very excited to announce our newest product - Pupil Invisible.

The first eye tracking device that truly looks and feels like a normal pair of glasses. Machine learning powered. No setup, no adjustments, no calibration.

Learn more and sign up for the closed beta program at https://pupil-labs.com/blog/2019-01/pupil-invisible-beta-launch/

Chat image

wrp 09 January, 2019, 15:18:24
user-c5bbc4 09 January, 2019, 15:19:03

@user-bd800a Hi, this link shows how to pass the define; it works for me: https://github.com/pupil-labs/pupil/issues/1331#issuecomment-430418074. The warnings about /W3 being overridden with /w and the following one (which I can't recall) can be ignored, I think.

user-bd800a 09 January, 2019, 15:48:46

@user-c5bbc4 but to have the .cpp files you need to build them, which fails. Do you have to have a first failure to generate the .cpp files, then edit them, then rerun the .bat files?

user-c5bbc4 09 January, 2019, 16:02:15

I did not change the boost settings. I was thinking of using another version but I solved it didn't

user-c5bbc4 09 January, 2019, 16:02:41

oops. unfinished. I mean I solved the problem before I reached that far

user-bd800a 09 January, 2019, 16:03:19

do you run it with the .bat or through the main?

user-c5bbc4 09 January, 2019, 16:03:37

I think I got the .cpp files after trying to run 'python setup.py build'

user-2968b9 09 January, 2019, 16:14:34

Hi there, we're currently experiencing an issue with Pupil Mobile where the mobile app stops recording and enters ghost mode on the desktop application, before coming back and continuing to stream and record to our computer; the mobile recording, however, does not continue. We use 2 pupil cameras, one with an eyetracker and one without, and we find that the issue only occurs on the camera with the eyetracker. Do you know what might be causing this issue, and how we can alleviate it? I can also obtain the capture.log file if that's of any assistance?

papr 09 January, 2019, 16:23:21

@user-2968b9 My guess is that the phone's resources are being strained. Did you try recording without streaming?

user-2968b9 09 January, 2019, 17:24:04

We have not yet; we'll give that a shot. In practice though, we would not be able to record solely on the phone/laptop. Is there a way of decreasing the strain that the app puts on the phone, or would it perhaps be a matter of using a more powerful phone? We currently use an LG Nexus (I believe it's the 4). Is there anywhere I can find the specs required to run Pupil Mobile, so I could compare them against the phone's specs?

user-8779ef 09 January, 2019, 18:45:23

@wrp Great! Can't wait for the HMD version πŸ˜›

user-4878ad 10 January, 2019, 19:21:03

Has anyone published any papers using pupil labs eye tracking with additional stimuli mediated by a lab streaming layer (matlab)?

user-54bbd5 10 January, 2019, 21:35:01

Any idea on how to solve this issue: https://github.com/pupil-labs/pupil/issues/1208

user-07d4db 10 January, 2019, 22:07:02

Hello πŸ˜ƒ does anybody have a recommendation for which values to choose for the minimum/maximum duration in the offline fixation detector, in an experiment with human-robot interaction? Human and robot have to solve a task together. Any recommendations based on experience or concrete literature are welcome! Thank you πŸ˜ƒ

wrp 11 January, 2019, 04:46:26

@user-4878ad publications/projects using Pupil are collected in this spreadsheet: https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?usp=sharing

user-07d4db 11 January, 2019, 09:31:37

Hey! I have another question: Is it possible to use a velocity- or area-based algorithm for defining fixations in pupil labs, instead of the dispersion-based one? Thank you!

wrp 11 January, 2019, 09:33:30

Hi @user-07d4db there is no velocity based algorithm in the codebase/app bundles, however if you want to implement a plugin for velocity based fixation detector you might want to start by looking at https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/fixation_detector.py

wrp 11 January, 2019, 09:34:01

or copying fixation_detector.py as a start then going from there

user-07d4db 11 January, 2019, 09:38:58

Thank you! Where do I have to copy "fixation_detector.py" to? And did I understand correctly that there is no preprogrammed velocity-based algorithm in pupil labs at the moment, but that it is possible to create such a plug-in on my own?

papr 11 January, 2019, 10:34:54

@user-07d4db Hey, yes, it is correct that Pupil does not come with a velocity-based fixation detector. We use a dispersion-duration-based algorithm. Yes, you can extend Pupil's functionality with custom plugins. As wrp said, a good place to start is to copy the linked file into the corresponding plugin folders:
- ~/pupil_capture_settings/plugins for realtime plugins in Capture
- ~/pupil_player_settings/plugins for offline plugins in Player that have access to the whole recording

user-07d4db 11 January, 2019, 10:43:51

Thank you very much @papr πŸ˜ƒ Just one further question: How can I measure the transition time of gazes between different areas of interest?

papr 11 January, 2019, 10:45:18

@user-07d4db each gaze point has a timestamp. Just calculate the difference between the last gaze point before exiting a surface and the first gaze point entering a new surface.

user-07d4db 11 January, 2019, 10:48:13

Okay! And is there a possibility that pupil labs does this calculation for me, something like an automatic setting? Because doing that for every participant of my experiment on my own would take a lot of time.

papr 11 January, 2019, 10:53:42

@user-07d4db no, this is not calculated automatically. But if you export the surface data, you will get csv files whose data would allow you to calculate the metric in question in an automated way
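
For reference, a rough sketch of that automation (hedged: assumes two Surface Tracker export files named gaze_positions_on_surface_<name>.csv with gaze_timestamp and on_srf columns; adjust names to your export version):

```python
# Hypothetical sketch: transition time from surface A to surface B,
# i.e. last gaze on A before the first gaze entering B.
import csv

def on_surface_timestamps(path):
    with open(path) as f:
        return [float(r["gaze_timestamp"]) for r in csv.DictReader(f)
                if r["on_srf"] == "True"]

ts_a = on_surface_timestamps("gaze_positions_on_surface_A.csv")
ts_b = on_surface_timestamps("gaze_positions_on_surface_B.csv")

first_b = min(t for t in ts_b if t > ts_a[0])  # first gaze entering B
last_a = max(t for t in ts_a if t < first_b)   # last gaze on A before that
print("transition time (s):", first_b - last_a)
```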

user-6fdb19 11 January, 2019, 17:06:54

how many fps will the pupil invisible work with? how much $$?

user-8903eb 11 January, 2019, 21:59:13

Hey there, so we've been having some issues with getting Pupil set up on our computer here at the university. It runs on Windows 10 and is proving to be difficult to set up. I've spent the last couple of days troubleshooting, uninstalling and reinstalling drivers, and attempting to get pupil capture to run, all of which have been unsuccessful.

However, I have a MacBookAir that I installed the pupil application and plugged in the headset. This worked wonderfully, which allows us to use mine for the time being.

My concern is: is there any way to completely uninstall everything pupil related from the windows computer, and is there a process that I can follow that will successfully allow me to run pupil capture? Any help with this is appreciated, thanks!

user-4878ad 11 January, 2019, 22:41:18

@wrp thanks!!

mpk 12 January, 2019, 13:10:09

@user-8903eb the best way to reset the windows machine is to use the 'restart with default settings' button in the world window.

user-8903eb 12 January, 2019, 18:19:15

@mpk Pupil Capture won't launch, is the issue. Nothing will display once it is run as an administrator, so I can't reset it to default settings.

user-f68ceb 13 January, 2019, 18:05:37

Hi There, has anyone discovered an easy way to print the fixation points onto a matching screen? So the results look something like this:

user-f68ceb 13 January, 2019, 18:05:46

Chat image

user-54bbd5 13 January, 2019, 21:53:09

Hey @papr, do you or your team happen to have any solutions regarding this issue: https://github.com/pupil-labs/pupil/issues/1208

user-af87c8 14 January, 2019, 20:44:46

@wrp Just saw the pupil invisible announcement! Very exciting. Is there any information on the tracking technology used? Neither pupil nor glint tracking, and no cameras in sight, but with an inertial sensor... very mysterious - but very cool! Keep it up πŸ˜ƒ

papr 14 January, 2019, 21:01:08

@user-54bbd5 no, unfortunately not.

user-24e31b 15 January, 2019, 01:31:24

Hi all, is there a way to get playback speed back up to x1.0? (or render out the video to enable smooth playback) I am recording via the mobile interface, if that makes a difference. Thanks for the help.

wrp 15 January, 2019, 03:16:57

@user-af87c8 we will be releasing more information about Pupil Invisible and the core technology that powers the gaze estimation pipeline via the beta program newsletter and our website/blog in the near future. The glasses are still video based eye tracking, but with tiny eye cameras and an entirely new approach. The IMU is not involved in gaze estimation - it is a bonus hardware feature! Hopefully that is slightly less mysterious 😸

user-94ac2a 15 January, 2019, 03:21:47

Is the eye camera in the store a regular RGB camera or an IR camera?

wrp 15 January, 2019, 03:23:34

@user-94ac2a this is an IR camera. It is the same camera used for our Pupil headsets. It is sold separately so that people with older headsets (with 120hz eye cameras) can upgrade, or if someone has a binocular frame with only one eye camera, they can upgrade and make their headset binocular by buying another eye camera

user-94ac2a 15 January, 2019, 03:24:50

@wrp Thanks. Does that mean I can replace my own IR cameras with the one that comes with the headset?

wrp 15 January, 2019, 03:25:53

@user-94ac2a maybe you could provide a bit more context, I'm not sure I understand/follow.

user-94ac2a 15 January, 2019, 03:26:48

@wrp Say if I have my own IR cameras. Can I use my own IR cameras instead of the original cameras in the headset?

wrp 15 January, 2019, 03:29:26

So you would like to build your own headset with your own IR eye cameras, correct? The short answer is that UVC compliant cameras are supported by our software, but you may need to make some customizations to source code to use your cameras. If your cameras are not UVC compliant, then it would mean writing your own backend for the cameras.

user-94ac2a 15 January, 2019, 03:32:07

@wrp Got it. The DIY tutorial is about building my own headset, right?

wrp 15 January, 2019, 03:32:13

correct

user-94ac2a 15 January, 2019, 03:33:06

@wrp Any specific tilt angle for the cameras? Or can the cameras be placed at any position under the eye?

wrp 15 January, 2019, 03:33:09

you might also want to read: https://docs.pupil-labs.com/#jpeg-size-estimation-and-custom-video-backends you might also want to see custom video backends in source here: https://github.com/pupil-labs/pupil/tree/master/pupil_src/shared_modules/video_capture

wrp 15 January, 2019, 03:33:56

@user-94ac2a position of the eye camera needs to be near the eye (with the pupil visible in the eye video frame), the system is designed to accommodate different camera positions.

user-94ac2a 15 January, 2019, 03:34:34

@wrp Thanks a lot!

wrp 15 January, 2019, 03:49:41

you're welcome!

user-94ac2a 15 January, 2019, 03:59:25

@wrp Any requirements for the IR camera other than being UVC compliant?

wrp 15 January, 2019, 04:00:24

what camera are you considering?

wrp 15 January, 2019, 04:00:39

perhaps the community could give you some feedback on your DIY work

user-94ac2a 15 January, 2019, 04:02:22

@wrp Maybe [email removed]

user-94ac2a 15 January, 2019, 04:03:24

I mean fps

wrp 15 January, 2019, 04:06:46

I will let the rest of the community respond re DIY setups - as there are likely others with some suggestions 😸

user-94ac2a 15 January, 2019, 04:07:03

OK

wrp 15 January, 2019, 04:26:08

@user-24e31b You note that you are recording with Pupil Mobile? Are you recording onto the Android device or are you using the Android device for streaming and then recording via desktop/laptop that is running Pupil Capture?

user-94ac2a 15 January, 2019, 06:04:08

What is the minimum resolution requirement for the eye camera?

user-94ac2a 15 January, 2019, 06:12:01

The one on the store shows: Sensor Global Shutter. [email removed] [email removed] Are these two cameras or one?

wrp 15 January, 2019, 06:28:26

@user-94ac2a this is one camera, with multiple spatial/temporal resolutions that can be selected via software

user-94ac2a 15 January, 2019, 06:28:50

OK

user-94ac2a 15 January, 2019, 06:52:41

@wrp So this is a stereo camera

wrp 15 January, 2019, 07:07:13

no, a single camera per eye

wrp 15 January, 2019, 07:07:15

not stereo camera

user-94ac2a 15 January, 2019, 07:32:18

πŸ‘Œ

user-94ac2a 15 January, 2019, 07:46:11

@wrp two separate cameras connected through one single USB2.0 port?

papr 15 January, 2019, 07:47:27

@user-94ac2a It is a USB 3 hub that connects the USB 2.0 cameras.

user-94ac2a 15 January, 2019, 07:49:07

@papr So it's two separate USB 2.0 cameras connected to a USB 3.0 hub, and the USB 3.0 hub connects to the PC?

papr 15 January, 2019, 07:49:45

@user-94ac2a correct

user-94ac2a 15 January, 2019, 07:49:54

Thanks!

user-82e7ab 15 January, 2019, 10:54:00

Hi, may pupil be affected by external IR light sources? We want to use the pupil headset in an environment with optitrack motion capture cameras - which emit IR light for tracking, just like the pupil cameras. I'm asking because I don't know how pupil's algorithms work, but afaik many algorithms in eye tracking use corneal reflections (of known IR light sources) and there will be more than one in our case. Thanks for any help πŸ˜ƒ

wrp 15 January, 2019, 10:54:53

@user-82e7ab corneal reflections are not used. So, other IR light sources should not be a problem

wrp 15 January, 2019, 10:55:58

However, you may notice some artifacts from IR emitters in the eye video (e.g. if you had the HTC Vive lighthouse system running you might see what looks like flicker/banding due to the frequency of the IR emitters in the lighthouse system)

user-82e7ab 15 January, 2019, 10:56:17

perfect, thx!

user-82e7ab 15 January, 2019, 10:57:01

as long as these artifacts do not affect your tracking, this is no problem at all

wrp 15 January, 2019, 10:58:11

this is kind of old, but https://github.com/mdfeist/OptiTrack-and-Pupil-Labs-Python-Recorder

wrp 15 January, 2019, 10:59:36

@user-82e7ab you might also want to check out the citation list for other research/papers that use optitrack (or other pose tracking systems) with Pupil: https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?usp=sharing

user-82e7ab 15 January, 2019, 11:00:29

oh, we already have distinct optitrack and pupil integrations for our framework. We just haven't tested both in combination yet. But I will have a look at the citation list πŸ˜ƒ

wrp 15 January, 2019, 11:01:06

sounds good @user-82e7ab

user-b0c902 15 January, 2019, 15:07:24

@papr in regards to your response to @user-2968b9's query - we tried to stream the eye tracker directly with the laptop and it seems to work fine. But when we connect it via the phone and stream it on the laptop, the eye camera shows a 0 rate even though it streams on the phone. Do you suggest we change the phone? Jordan has already mentioned the phone model that we are using

user-81072d 15 January, 2019, 16:26:58

Can anyone tell me exactly what the difference is with Robust Detection enabled for surface tracking?

user-24e31b 15 January, 2019, 18:43:41

@wrp Yep, I am recording to the Android device then copying the files to the laptop for playback + using offline circle calibration. I can only seem to get playback at x0.5 speed?

papr 15 January, 2019, 18:44:15

@user-24e31b Which version of Player do you use?

user-24e31b 15 January, 2019, 18:44:37

pupil_v1.9-7

papr 15 January, 2019, 18:44:59

You can increase and decrease playback speed by hitting the right/left arrow keys during playback

user-24e31b 15 January, 2019, 18:45:21

@papr ^ I shall go and test those hotkeys πŸ˜ƒ

user-24e31b 15 January, 2019, 18:46:34

@papr Yep! Was as easy as that, thanks! - is there a reference sheet for all the hotkeys for Capture/Player?

papr 15 January, 2019, 18:49:10

@user-24e31b no, unfortunately not. But there are not many:
- space: play/pause
- left/right during playback: decrease/increase playback speed
- left/right during pause: previous/next frame
- e: export
- any other letter as shown in the half-transparent thumb icons on the left

user-24e31b 15 January, 2019, 18:57:54

Copy that, should be able to memorize those, they are pretty logically obvious. Thanks again, will no doubt be back here soon as I've just begun testing/bench-marking.

user-e8a795 15 January, 2019, 21:34:41

I couldn't find anything in the documentation, but is there a recommended way to clean the lenses of the newer 200hz cameras? One camera feed is much blurrier than the other, and I think it's from some dust that got into the lens or a smudge from adjusting it. I wanted to use rubbing alcohol on a q-tip but didn't know if the material the HoloLens add-on kit is printed from would react to the rubbing alcohol.

user-94ac2a 16 January, 2019, 06:34:36

@papr Is the USB 3.0 hub that connects the two USB 2.0 eye cameras a customized USB 3.0 hub, or is it just a regular USB 3.0 hub?

mpk 16 January, 2019, 08:01:20

@user-94ac2a It's a custom-made hub in terms of form factor and connectors, but in function it's "just" a usb3.0 hub.

user-755e9e 16 January, 2019, 08:29:10

@user-e8a795 to clean the lens I would recommend removing the lens and carefully spraying canned air at a distance of 2-3 cm from the sensor.

user-94ac2a 16 January, 2019, 08:33:47

@mpk That means I can use any usb3.0 hub and connect any two usb2.0 IR cameras to achieve the same?

mpk 16 January, 2019, 08:34:49

in terms of hardware, yes. We do some special things in the driver/usb user-space layer that might not work with your cameras though.

mpk 16 January, 2019, 08:35:51

Check out libuvc and pyuvc forks from us for that: https://github.com/pupil-labs/libuvc https://github.com/pupil-labs/pyuvc

user-94ac2a 16 January, 2019, 08:36:31

Thx!

user-a951f2 16 January, 2019, 10:05:19

hello all, i have downloaded the latest pupil software (1.10) but in the Player, I can't find the Vis Scan Path plugin - any clues what I need to do? It's running on a Mac

wrp 16 January, 2019, 10:07:01

Hi @user-a951f2 vis scan path has been temporarily disabled in Pupil Player since v1.8. Please see notes here: https://github.com/pupil-labs/pupil/releases/tag/v1.8

user-a951f2 16 January, 2019, 10:07:57

@wrp thanks! Hopefully it will be enabled soon πŸ˜ƒ

user-e2056a 16 January, 2019, 13:31:15

@papr In the exported files, are the fixation and gaze "x_scaled" and "y_scaled" values in the surface folder using the lower-left corner of the surfaces as their origin? Or are they using the lower-left corner of the world camera image?

papr 16 January, 2019, 13:37:01

@user-e2056a x_scaled = x_norm * surface_width, where surface_width is set by the user in the surface settings. Therefore, the x/y_scaled values originate in the surface origin (lower left corner).
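
For example, with a surface width set to 30 (in the user's chosen units, e.g. cm), a gaze point with x_norm = 0.5 gives x_scaled = 0.5 * 30 = 15, measured from the surface's lower-left corner.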

user-e2056a 16 January, 2019, 13:38:22

@papr Thank you. There is another gaze position file in the exports, though, not in the surface folder; what is the origin of that one?

papr 16 January, 2019, 13:39:42

The gaze_positions.csv file contains gaze whose norm_pos values originate in the world camera lower left image corner.

user-e2056a 16 January, 2019, 13:42:09

@papr, Thank you. In the gaze position file, I saw multiple gazes (with different x y positions) occurring at the same timestamp; may I ask why?

user-e2056a 16 January, 2019, 13:44:37

and why is the world timestamp different from the gaze timestamp?

papr 16 January, 2019, 13:51:09

@user-e2056a Gaze timestamps are inferred from pupil timestamps. If it is monocular gaze, its timestamp equals the pupil timestamp it is based on. If it is binocular gaze, its timestamp is the average of both pupil timestamps it is based on. Pupil timestamps are inferred from the eye video frame on which they were detected. Eye images are acquired at a much higher frequency than world images. Therefore, there are multiple gaze points that belong to a single world frame, and it is normal to see gaze data with the same world timestamp. The gaze timestamp, however, is very unlikely to be the exact same for two different gaze points.
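
To make that concrete, a small sketch grouping exported gaze by the world frame it belongs to (hedged: column names differ between Player versions; world_index and gaze_timestamp are assumed here):

```python
# Hypothetical sketch: several gaze samples usually map to one world frame,
# because the eye cameras run at a higher frame rate than the world camera.
import csv
from collections import defaultdict

gaze_by_frame = defaultdict(list)
with open("exports/000/gaze_positions.csv") as f:
    for row in csv.DictReader(f):
        gaze_by_frame[int(row["world_index"])].append(float(row["gaze_timestamp"]))

for frame, stamps in sorted(gaze_by_frame.items())[:5]:
    print(f"world frame {frame}: {len(stamps)} gaze samples")
```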

user-8b1528 16 January, 2019, 16:06:30

Is there any documentation about playing with the pupil sensor settings? For example, the default resolution is 320x240, which is very small; shouldn't using a higher resolution help the pupil algorithm perform better (apart from running at a slower frame rate)?

papr 16 January, 2019, 16:17:09

@user-8b1528 We have never made exact measurements, but we have never seen a systematic drop in detection performance when using 320x240. Increasing eye video resolution does not only mean a lower frame rate but also higher cpu requirements/detection time.

user-29e10a 16 January, 2019, 17:49:01

@user-8b1528 we found that the 320 resolution is superior due to the coarse detection which happens at higher resolutions (which backfires sometimes), and thin eyelashes are not as sharp at 320 (which is good). Far more important are the settings for brightness etc... we use 64 for both brightness and contrast, 140 for gain, 100 gamma, 0 sharpness. We use the hmd integration, so no guarantee that these are the best settings for the normal headset. For the hmd version it is also important to slightly increase the distance between eye and screen.

user-5c7218 16 January, 2019, 20:07:04

Is there a way to get simulated eye tracker data sent over zeromq? I want to work on the API when I am home and don't have access to the pupil hardware.

user-8b1528 16 January, 2019, 20:07:31

@user-29e10a Hi, thanks for the numbers! It gives a really white image; is that what we want? A gain of 140... the interface max is at 100, could you confirm the gain value?

Thanks!

papr 16 January, 2019, 20:09:12

@user-5c7218 you can use the video file source and play back recorded eye videos instead of using the USB camera

user-5c7218 16 January, 2019, 20:11:09

@papr but there is no way to generate simulated data?

papr 16 January, 2019, 20:16:47

You could write a plugin that feeds recorded data into Capture... But this does not exist yet.

user-5c7218 16 January, 2019, 20:17:12

@papr okay, thank you
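
In the meantime, a toy stand-in can help with developing subscriber code: a ZeroMQ PUB socket sending messages shaped like Pupil's (topic frame + msgpack payload). A hedged sketch, not Pupil's actual IPC backbone; the port and field values are made up:

```python
# Hypothetical gaze simulator: publishes fake gaze data over zmq.
import math
import time

import msgpack
import zmq

ctx = zmq.Context()
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://*:5577")  # arbitrary port for the simulation

t0 = time.monotonic()
while True:
    t = time.monotonic() - t0
    datum = {
        "topic": "gaze.2d.0.",
        "norm_pos": (0.5 + 0.25 * math.cos(t), 0.5 + 0.25 * math.sin(t)),
        "confidence": 1.0,
        "timestamp": t,
    }
    # Pupil-style two-frame message: topic string, then msgpack payload.
    pub.send_multipart([datum["topic"].encode(), msgpack.packb(datum, use_bin_type=True)])
    time.sleep(1 / 120)  # roughly an eye-camera rate
```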

user-e2056a 16 January, 2019, 21:10:28

@papr, the default setting of minimum data confidence is 0.5; is it ok for us to use a value lower than 0.5? What is the lowest value allowed without compromising the data accuracy?

user-29e10a 16 January, 2019, 21:51:54

@user-8b1528 oh I'm sorry, I meant Gamma at 144 and leave gain at 0

user-26fef5 17 January, 2019, 08:41:11

Hello all,

papr 17 January, 2019, 08:45:07

@user-e2056a I think it is actually 0.6. Low confidence usually means less accurate. Depending on your use case, you should use different thresholds.

user-26fef5 17 January, 2019, 08:47:07

I got a small question regarding the frame publisher plugin. I managed to grab the world frames using zmq in c++ and store them as opencv Mat objects - so far so good. Unfortunately the publisher only outputs the raw video format, which is distorted. Even when I calibrate using the camera intrinsics estimation plugin, still the raw format is published. So I tried to get the intrinsics from the file that your software saves (camera.intrinsics), but I can't get my head around the encoding of that file. Is there a way of a) transmitting the undistorted (corrected) video stream through the publisher? or b) getting the intrinsic camera parameters from the saved file? (that is preferred, since I can just integrate them using opencv). Any hints regarding that topic? Best Regards

papr 17 January, 2019, 08:49:54

@user-26fef5 The file is encoded using msgpack. You should be able to decode it in a similar fashion as Pupil network messages

user-26fef5 17 January, 2019, 08:51:20

Ahhh, I see. Hadn't thought of that. Thanks
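
For reference, a minimal sketch of that decoding (hedged: the key layout inside the .intrinsics file varies between Pupil versions, so inspect the decoded dict to locate the camera matrix and distortion coefficients):

```python
import msgpack

with open("camera.intrinsics", "rb") as f:
    data = msgpack.unpackb(f.read(), raw=False)

print(data)  # e.g. a dict per resolution containing camera matrix / distortion coefs
```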

user-8b1528 17 January, 2019, 14:45:45

@user-29e10a Concerning the post about the sensor settings: you finished the post talking about the distance between eye and screen for the HMD version. Is it preferable to have the eyes closer to or farther from the screen? From the tests I've done, it seems to be better when close to the screen, could you confirm?

Thanks!

papr 17 January, 2019, 14:49:08

@user-8b1528 Regarding the eye-screen distance: It is most important that the eye is well visible in the eye cameras' fields of view. Since the cameras are attached to the screen, the eye-screen distance is one parameter to adjust eye visibility and therefore pupil detection quality.

user-e2056a 17 January, 2019, 15:05:11

@papr Thank you! Another question related to gaze: can I find the duration of each gaze in the gaze position file?

papr 17 January, 2019, 15:07:37

@user-e2056a A gaze datum does not have a duration -- at least not conceptually in Pupil.

papr 17 January, 2019, 15:08:14

Just to be sure, you were not referring to fixations, correct?

user-8b1528 17 January, 2019, 15:52:21

@papr Still trying to fine-tune my setup. Even with the camera lenses carefully adjusted to get a sharp image, I noticed that if I press the HMD helmet a bit more onto my face or pull it away, even just a bit, tracking of the pupil is easily lost when looking at the left/right edges. So I'm concerned about how to use the setup with a lot of different persons without having to fine-tune everything (adjust lens focus, for instance) each time, since it seems that sensitive to the distance.

Any trick or info about that?

Thanks!

user-e2056a 17 January, 2019, 20:55:05

@papr correct, Thank you!

user-41c874 18 January, 2019, 07:12:00

Hey. I'm trying to use the manual marker calibration method, displaying the marker on a screen as an image. Does it matter if we change the size of the circular marker? And since it says that it is optimal for 1.5-2 metres, is it okay if we use it at a 30 cm eye-to-screen distance? (Also, where can I set the eye-to-screen distance?)

user-41c874 18 January, 2019, 07:20:36

And the document says something about scaling down the markers for the screen marker calibration: "You can adjust the scale of the pattern for a larger or smaller calibration target." Can I also control where on the screen these markers are visible? (Since our screen is 120 cm x 68 cm, subjects can't fixate on the corner of the screen while sitting 30 cm away from the screen - with the full screen mode)

papr 18 January, 2019, 08:17:36

@user-41c874 Hi - no, that's not a problem: the markers can be scaled down, especially if your screen is that close. You should position and scale the markers such that they are still visible in the world camera's field of view. The world process gives you feedback about visibility and detection of the markers during the Manual Marker calibration.

user-41c874 18 January, 2019, 09:43:46

Perfect ! Thanks a lot!

user-a08c80 18 January, 2019, 15:10:39

Has anyone built a custom physical apparatus modifying the physical Pupil headset? We need to design an interface that allows users to use the pupil glasses during robotic surgery. Currently the pupil apparatus interferes with the user headset of the robotic surgery console. We are interested in speaking with anyone who has physically customized Pupil to work with a head-interface device.

user-94ac2a 18 January, 2019, 23:25:11

Any minimum resolution requirement for eye camera?

user-adc157 19 January, 2019, 12:54:04

Hi again, when I use my mobile phone, I can record with audio. However, when I use my MacBook Pro, I can't record audio. I opened the audio plugin. I am using the new version 1.10.20

wrp 21 January, 2019, 04:41:20

@user-94ac2a I would not recommend going below 200x200 for spatial resolution

user-94ac2a 21 January, 2019, 04:41:44

Thx

wrp 21 January, 2019, 04:42:17

@user-adc157 I will try to replicate the behavior with audio recording on macOS. BTW, what version of macOS?

papr 21 January, 2019, 11:57:33

@user-adc157 And which input device is selected as default input device? The integrated microphone or an external one?

user-41c874 21 January, 2019, 14:18:34

Hey. We ran into the issue that sometimes the pupil doesn't get detected automatically at 192x192 eye camera resolution. I tried to play around with the exposure time, pupil intensity range, and pupil min and max. Could you explain what these settings exactly change, and what is ideal to get a good and stable detection of the pupil?

user-adc157 21 January, 2019, 15:02:31

@wrp @papr macOS version is Mojave 10.14.2. I selected the integrated microphone as Audio Source. I can't see any audio file when using Pupil Player. However, when I use a mobile phone with offline recording, I can easily see the audio file in Pupil Player.

papr 21 January, 2019, 15:03:38

@user-41c874
- Higher exposure time naturally increases the image brightness. It has an upper limit based on the selected frame rate.
- The 2d pupil detection algorithm searches for black areas in the image. The pupil intensity range is a threshold that specifies which pixels belong to these black areas and which do not.
- Min and max pupil sizes are lower and upper bounds for filtering potential pupil candidates.

Generally, you want the pupil to be as dark as possible and everything else as bright as possible.
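For intuition, here is a minimal sketch of that thresholding idea in Python (OpenCV assumed). This is not the actual Pupil detector; the parameter names mirror the Capture settings, but the logic is heavily simplified:

```python
import cv2

# Hypothetical illustration: "pupil intensity range" acts as a darkness
# threshold, while "pupil min/max" bound the size of accepted candidates.
def find_pupil_candidates(eye_gray, intensity_range=23, pupil_min=10, pupil_max=100):
    darkest = int(eye_gray.min())
    # Pixels within `intensity_range` of the darkest value count as "black"
    _, mask = cv2.threshold(
        eye_gray, darkest + intensity_range, 255, cv2.THRESH_BINARY_INV
    )
    # OpenCV 4.x signature; 3.x returns an extra value first
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for contour in contours:
        if len(contour) < 5:  # fitEllipse needs at least 5 points
            continue
        ellipse = cv2.fitEllipse(contour)
        major_axis = max(ellipse[1])
        # pupil min/max filter candidates by apparent size (pixels)
        if pupil_min <= major_axis <= pupil_max:
            candidates.append(ellipse)
    return candidates
```

This also shows why exposure matters: if the image is over-exposed, the darkest pixels are no longer the pupil, and the threshold starts admitting the wrong regions.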

papr 21 January, 2019, 15:04:48

@user-adc157 Thanks for the feedback. I will try to reproduce the issue with an external mic, since the Mac Mini that I am working with does not have an integrated mic.

user-4878ad 21 January, 2019, 17:10:23

Is there a reason that it says "no audio"?

Chat image

user-8950d7 21 January, 2019, 17:41:02

@aduckingah, I believe 'no audio' is an option you can select. If you want to record audio, select one of the options listed above the 'no audio' line.

papr 21 January, 2019, 17:48:53

Hi @user-4878ad, @user-8950d7 is correct.

user-4878ad 21 January, 2019, 18:04:38

@user-8950d7 @papr you were correct. thanks!

user-4878ad 21 January, 2019, 18:04:40

πŸ˜ƒ

user-adc157 21 January, 2019, 19:50:13

@user-4878ad Yes, I can select Built-in Microph from Audio Source. But I can't record an audio file.

Chat image

user-4878ad 21 January, 2019, 20:35:48

I didn't get one either, as far as I can see.

papr 21 January, 2019, 20:37:09

@user-4878ad @user-adc157 please check your capture.log files for any hints to what the problem could be

user-4878ad 21 January, 2019, 20:46:27

@papr ... where might I find this capture.log file? Would it be in the recording folder? πŸ˜…

papr 21 January, 2019, 20:47:20

It is in the pupil_capture_settings folder

papr 21 January, 2019, 20:48:29

Please check it directly after a failed audio recording. Do not restart Capture since this would overwrite the previous log file

user-4878ad 21 January, 2019, 20:50:08

Will go try a new recording now

user-4878ad 21 January, 2019, 21:10:18

@papr I figured out the recording... but there is a time delay between the video and the audio ☹️

user-94ac2a 22 January, 2019, 01:39:19

What kind of exposed (black) film negative should I purchase for a DIY project?

user-94ac2a 22 January, 2019, 01:40:07

Does this exposed film negative work? https://www.amazon.com/gp/product/B0152OSM4Y/ref=ppx_yo_dt_b_asin_title_o00__o00_s01?ie=UTF8&psc=1

user-14d189 22 January, 2019, 01:50:45

Hi @user-94ac2a, I think the purpose is to filter all daylight out and transmit just IR light through the filter. I used a color film: pulled it out in daylight and sent it off for development. The dark brown part of the film did the job for me.

user-14d189 22 January, 2019, 01:52:58

You need to cut off the IR filter of the camera.

user-94ac2a 22 January, 2019, 02:11:09

@user-14d189 You mean cutting off the Coated glass?

user-94ac2a 22 January, 2019, 02:11:57

Microsoft HD6000

user-14d189 22 January, 2019, 02:13:16

Yes, the back of the lens of the eye tracking camera usually has an IR filter; that needs to go. I wasted one eye tracking camera btw.

user-94ac2a 22 January, 2019, 02:13:31

I did that

user-94ac2a 22 January, 2019, 02:13:54

The film doesn't seem to be working?

user-94ac2a 22 January, 2019, 02:14:29

Which side should I put the film on?

user-14d189 22 January, 2019, 02:15:39

I'm not sure about the black and white negative film. CCD chips are sensitive to daylight and IR light. In this case we just want IR.

user-14d189 22 January, 2019, 02:15:55

and you would put the cut out where the IR filter was.

user-14d189 22 January, 2019, 02:16:08

so in between lens and ccd

user-14d189 22 January, 2019, 02:16:43

and the film must be exposed and developed.

user-94ac2a 22 January, 2019, 02:17:03

How can I expose and develop the film?

user-94ac2a 22 January, 2019, 02:17:42

Or is there any available one I can just purchase?

user-14d189 22 January, 2019, 02:18:05

Just pull the film out of the case, expose it to a light source, and then send it in to a photo shop.

user-14d189 22 January, 2019, 02:19:26

There should be daylight filters available, though none known to me. I would assume they are made of glass and most likely come in the wrong size.

user-14d189 22 January, 2019, 02:20:02

That's why the colour film is an easy solution.

user-94ac2a 22 January, 2019, 02:20:21

Which one did you purchase?

user-14d189 22 January, 2019, 02:20:42

any will do.

user-14d189 22 January, 2019, 02:22:25

You can start as well with some recordings without the daylight filter. The only problem is that reflections from surrounding daylight might influence the quality of the recording.

user-94ac2a 22 January, 2019, 02:23:15

How about this one?

user-14d189 22 January, 2019, 02:29:44

Looks alright to me. You just need a piece the size of a hole-punch hole; even one would be enough. I had an ASA 100; the grain is a bit smaller than with ASA 400.

user-14d189 22 January, 2019, 02:31:07

Grain in negative film is the size of the salt particles that detect light. 400 is more sensitive, with bigger particles.

user-94ac2a 22 January, 2019, 02:34:44

Thanks!

user-94ac2a 22 January, 2019, 02:35:00

For this film, do I still need to expose and develop it?

user-14d189 22 January, 2019, 02:35:32

Yes. If you do not expose and develop it, the daylight will come through.

user-94ac2a 22 January, 2019, 02:37:02

I see. Probably I need to go to a photo shop nearby?

user-14d189 22 January, 2019, 02:38:27

I asked a photographer who did film back in the day, and he gave me some cut-offs of developed film. That was enough.

user-94ac2a 22 January, 2019, 02:39:28

Cool!

user-41c874 22 January, 2019, 06:05:10

@papr Thanks! I'll try to work with this and a few other things and get back to you!

papr 22 January, 2019, 08:36:47

@user-4878ad so there is an actual audio recording? What changed compared to the previous recordings?

user-14f5df 22 January, 2019, 14:40:00

Hi all, new to Pupil Labs (using the Vive drop-in). I was wondering how people are saving their recordings in Unity or accessing data directly from the first-party software. I see a SaveRecording in the PupilTools script (under networking) but am not sure of the appropriate way to call it. Suggestions?

user-4878ad 22 January, 2019, 16:50:13

@papr the mic was on mute ..... πŸ™ƒ

user-4878ad 22 January, 2019, 16:54:01

But yes, there is a recording now. The only issue is that there is latency between the audio and the video. Do you know if there is a fix for this, or if there are things that I should change in the settings?

papr 23 January, 2019, 13:07:58

@user-4878ad @user-adc157 I tried to reproduce the audio recording issue. I used Apple headphones that have an integrated microphone as input. When I selected the microphone in the Audio Capture menu, I got a system permissions dialogue which asked me to grant access to the mic. After that, I was able to make a successful audio recording that was in sync with the video. I can recommend filming a device playing this kind of video to test audio/video synchronization: https://www.youtube.com/watch?v=ucZl6vQ_8Uo

user-adc157 23 January, 2019, 19:58:38

@papr Thanks, I will try with an external microphone; I understand that the integrated microphone does not work on macOS systems. I will let you know when I get audio.

user-07d4db 23 January, 2019, 22:53:07

Hello πŸ˜ƒ Does pupil labs provide, additional to the dispersion based algorithm, an area based algorithm in order to define fixations with respect to specific areas of interest?

papr 23 January, 2019, 23:03:10

@user-07d4db no, we don't. Do you have a specific algorithm in mind? I would be interested in how it works.

user-14d189 23 January, 2019, 23:35:17

@user-07d4db Personally, I prefer a velocity-based fixation identification filter. Your project sounds a bit different from fixation identification only: you probably need to identify the area of interest in the world camera image first, to exclude movements between the head and the area of interest. From there you can determine whether gaze is in this area and apply a binary fixation decision, yes or no. cpicanco posted a screen detection algorithm earlier here, if you run tests on a screen.

user-14d189 23 January, 2019, 23:40:23

@user-07d4db @papr I prefer a velocity-based fixation filter because it includes fixations on possibly moving objects too. Not sure what the trade-off is; do you have any experience? The filter is easy to design and takes out data related to blinks and to rapid, short eye movements (saccades). It identifies 60-80% of eye movements as fixations. Does someone have experience whether that is about right?
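For reference, a velocity-based filter of the kind discussed here can be sketched in a few lines. This is a generic I-VT-style illustration, not a Pupil plugin; the 80 deg/s threshold is a common textbook choice rather than a recommended value, and the Euclidean norm over theta/phi is a small-angle approximation:

```python
import numpy as np

def ivt_fixation_mask(timestamps, angles_deg, velocity_threshold=80.0):
    """Label each inter-sample interval: True = fixation, False = saccade."""
    timestamps = np.asarray(timestamps, dtype=float)  # seconds
    angles_deg = np.asarray(angles_deg, dtype=float)  # shape (N, 2): theta, phi
    dt = np.diff(timestamps)
    # Angular displacement between consecutive samples (small-angle approx.)
    displacement = np.linalg.norm(np.diff(angles_deg, axis=0), axis=1)
    velocity = displacement / dt  # deg/s per interval
    return velocity < velocity_threshold
```

Runs of consecutive True intervals can then be merged into fixation events with start times and durations; low-confidence samples (e.g. during blinks) should be dropped beforehand.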

user-07d4db 24 January, 2019, 00:08:55

Thank you very much for your reply! Unfortunately, I am not an expert in programming, so I am not able to design a plugin for a velocity-based algorithm. I am conducting an experiment measuring how visual attention allocation (number and duration of visual intakes/fixations) shifts between different pre-defined AOIs in human-robot interaction (so I have a dynamic task). Do you have any recommendations on how to deal with that using the dispersion-based algorithm?

user-14d189 24 January, 2019, 00:30:23

What are you using currently? Matlab? Excel? Did you have a look at the output file of the fixation detector? You do not necessarily need a plugin. Velocity can be calculated from the gaze data within the 3D model detection; 2D should be possible as well.

user-14d189 24 January, 2019, 00:31:54

There will be 15-20 blinks and 20-60 fixations per minute. At some point you want that to be determined automatically.

user-4878ad 24 January, 2019, 04:44:47

@papr thanks!! I will take a look and get back to you with my results

user-41c874 24 January, 2019, 09:42:16

Hello. I am running into the issue that my surface markers aren't being detected consistently. (I am displaying the markers on a screen as an image. It is visible in the world camera (1280 x 1024 frame size).) Do you think I should increase the frame size of the world camera, or maybe increase the size of the surface markers?

Chat image

papr 24 January, 2019, 10:35:29

@user-41c874 you have two options: increase the physical size of the markers, or reduce the minimum marker perimeter setting in the surface tracker menu.

user-41c874 24 January, 2019, 12:23:08

The minimum marker perimeter is already set as low as it can go, i.e. 30. I think I will try to increase the size of the markers. The image in the world camera is quite pixelated with the current settings, so maybe increasing the resolution should help. I'll try both and get back to you. Thanks a lot!!

user-c5bbc4 24 January, 2019, 22:50:52

Hi, I am an MRes student in VR focusing on eye tracking and currently working with a Pupil headset. I wonder whether anyone here has done some work on error quantification/characterisation for Pupil? Apart from the original Pupil Labs paper, is there any evaluation of the current Pupil's performance? Thanks very much!

user-14d189 24 January, 2019, 22:55:53

@user-c5bbc4 Hi Yifan, check this out: "Wearable Eye-tracking for Research: Automated dynamic gaze mapping and accuracy/precision comparisons across devices" by Jeff J. MacInnes, Institute for Learning and Brain Sciences, University of Washington, Seattle, WA, USA (not Pupil Labs related).

user-14d189 24 January, 2019, 22:56:26

I'm also interested if someone else has done some studies or comparisons. Cheers.

user-c5bbc4 24 January, 2019, 22:58:36

Thanks! Yes, I wonder whether there is some existing work. Otherwise I plan to do it myself.

user-c5bbc4 24 January, 2019, 23:01:23

One more question: are there any intro documents for Pupil's code?

user-14d189 24 January, 2019, 23:04:05

@user-c5bbc4 Pupil Labs has as well a research-and-publication channel. or the citation listing https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?ts=576a3b27#gid=0

user-c5bbc4 24 January, 2019, 23:05:57

Right. I have checked through it. Not many relevant results though.

user-c6ccfa 25 January, 2019, 00:25:16

Good day, if one eye camera feed is working and the other isn't, could it be a settings issue or do I need to fix my camera?

user-c6ccfa 25 January, 2019, 00:25:37

I have already updated the drivers

user-b70259 25 January, 2019, 02:56:45

Hello. I'd like to calibrate two eye cameras without a world camera, so I need a transformation matrix to convert camera coordinates to world coordinates. In 3D detection, the software provides three-axis data such as 'circle_3d' or 'sphere'. What point does the origin of this three-axis space represent? Is it the origin of the camera coordinate system?

wrp 25 January, 2019, 03:04:24

Hi @user-c6ccfa If you use Pupil Capture and "restart with default settings" and run Pupil Capture with admin privileges, what do you see in the eye windows?

user-82e954 25 January, 2019, 03:06:20

Hi, is there any work using Pupil Labs that allows eye gaze to interact with the computer? Like Windows Eye Control using the Tobii eye tracker.

wrp 25 January, 2019, 03:07:44

Hi @user-82e954 would simple mouse control be enough of a good starting point?

user-82e954 25 January, 2019, 03:12:28

Yes, as long as the eye movement can interact with objects.

user-82e954 25 January, 2019, 03:14:10

I am thinking of using it as an input for a Unity project.

wrp 25 January, 2019, 03:14:41

This is not Unity, but does demonstrate simple mouse control: https://github.com/pupil-labs/pupil-helpers/blob/master/python/mouse_control.py

user-82e954 25 January, 2019, 03:22:22

Is there any example of the implementation of that function? Sorry, I am new at this, so I don't really know how to use it.

wrp 25 January, 2019, 03:28:05

@user-82e954 this script runs independently. You start Pupil Capture, define a surface and name it "screen" according to this code, and then run the script via a terminal/cmd prompt; it will use gaze coordinates on the screen surface to control the mouse position.
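In condensed form, such a script does roughly the following. This sketch assumes the v1.x network API and its surface datum fields (gaze_on_srf, on_srf, norm_pos); see mouse_control.py above for the version maintained by Pupil Labs:

```python
import zmq
import msgpack
import pyautogui

ctx = zmq.Context()
# Ask Pupil Remote (default port 50020) for the subscription port
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to gaze mapped onto the surface named "screen"
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "surfaces.screen")

screen_w, screen_h = pyautogui.size()
while True:
    topic, payload = sub.recv_multipart()
    surface = msgpack.loads(payload, raw=False)
    for gaze in surface.get("gaze_on_srf", []):
        if gaze["on_srf"] and gaze["confidence"] > 0.8:
            x, y = gaze["norm_pos"]  # normalized surface coordinates
            # Surface origin is bottom-left; screen origin is top-left
            pyautogui.moveTo(x * screen_w, (1 - y) * screen_h)
```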

user-b70259 25 January, 2019, 03:32:14

Hi @wrp, could you tell me what point the origin of the three-axis data like 'circle_3d' represents in 3D detection?

user-b70259 25 January, 2019, 03:32:51

Is it the origin of the camera coordinate system?

user-82e954 25 January, 2019, 04:01:28

@wrp Oh, I get it. Thanks!

user-619198 25 January, 2019, 20:20:48

Can anyone help me with the tutorial https://github.com/pupil-labs/hmd-eyes ?

user-619198 25 January, 2019, 20:35:36

Is this a project where you need to have already set up the Vive with Unity? "This would be a good point to put said device on your head.

Use the displayed realtime videos of your eyes to make sure they are as centered as possible and in focus."

papr 25 January, 2019, 20:40:25

@user-619198 That is correct. This instruction assumes that the connection between the hmd-eyes plugin and Capture has been established already.

user-619198 25 January, 2019, 22:30:55

Awesome, thanks

user-af87c8 26 January, 2019, 20:44:09

@user-c5bbc4 @user-14d189 we are very close (think weeks) to publishing a preprint on an extensive visual test battery (many tasks) with 15 subjects, with direct (simultaneous) recording of an Eyelink-1000 as a "gold standard". This might be the study you are looking for!

user-c5bbc4 26 January, 2019, 20:50:26

Hi, thanks for replying. Could you explain a bit about the 'visual test battery'? I guess you mean a study with 15 participants testing the accuracy and precision of an Eyelink-1000?

user-c5bbc4 26 January, 2019, 20:51:48

Not sure whether I get the point. 'Gold standard' refers to the best performance, I guess?

user-af87c8 26 January, 2019, 20:57:57

@user-c5bbc4 not every eye tracking parameter can be pinned down to "ground truth", i.e. subjects do not fixate fixation crosses perfectly (e.g. microsaccades), do not track smooth pursuit targets perfectly, etc. We therefore recorded both Pupil Labs glasses and an Eyelink 1000 at the same time. So yes, "gold standard" means the best possible performance in a sense; of course dual Purkinje or the like would be even better, so it is the best (currently) possible performance with a video eye tracker. The test battery contains multiple tasks for "classical" measures like accuracy and precision, but also tasks for smooth pursuit, blinks, microsaccades, pupil dilation and head motion. Hope that helps!

user-14d189 26 January, 2019, 23:51:22

@user-af87c8 looking forward to it! 'Dual Purkinje' would be an ideal reference point. May I ask what you are referring to with 'the likes'?

user-af87c8 27 January, 2019, 10:05:02

@user-14d189 I was thinking of scanning laser ophthalmoscope trackers, where you can record single photoreceptors of the eye. I guess at least for small eye movements these will be even more accurate than dual Purkinje eye trackers.

user-c5bbc4 27 January, 2019, 10:14:01

@user-af87c8 sounds very good!! So the work will be published in a few weeks? Is the preprint open to the public?

papr 27 January, 2019, 10:15:02

@user-af87c8 I am also looking forward to your work!

user-c5bbc4 27 January, 2019, 10:17:08

@papr Hi, may I ask whether Pupil Labs has done some evaluation work for Pupil, apart from the original paper, as the product has evolved?

papr 27 January, 2019, 10:25:49

@user-c5bbc4 There is a paper by us that evaluates the 3d eye model in regards to the effect of refraction: https://perceptual.mpi-inf.mpg.de/files/2018/04/dierkes18_etra.pdf

Also, check out our Pupil Citation List: https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?ts=576a3b27#gid=0

If I remember correctly, it includes some independent work that evaluates the Pupil pipeline.

user-c5bbc4 27 January, 2019, 13:51:04

Thanks very much!! Is the glint-free model in the first paper the one used in Pupil Invisible?

papr 27 January, 2019, 13:52:48

The glint-free model is currently used for the 3d detection and mapping pipeline in Pupil Capture.

user-c5bbc4 27 January, 2019, 13:53:29

oh, okay. ThanksπŸ˜€

user-2798d6 27 January, 2019, 22:07:38

Hello! Does anyone know what it means when I get this error in Player?

Chat image

user-2798d6 27 January, 2019, 22:07:48

Thanks in advance!

papr 28 January, 2019, 06:49:23

@user-2798d6 if I remember correctly, this is related to the mjpeg video format and can most likely be ignored.

user-97bf6d 28 January, 2019, 14:47:00

Hello everybody!

I have a few questions about finding the best setup. I am using an HTC Vive for VR and I am not happy with my calibration. At the moment I am trying to figure out whether 2D or 3D detection works best. There are multiple other variables, like the intensity range, pupil min, pupil max, and all the image post-processing variables, that I am playing around with at the moment. Is there a guide somewhere on how to set these variables? For instance, what do the different rectangles in the debug window mean? How can I use the debug information for my purpose? Any help is appreciated! Thanks a lot!

user-97bf6d 28 January, 2019, 14:47:47

And also, is there an easy way to change the calibration pattern in VR?

papr 28 January, 2019, 14:55:38

Hi @user-97bf6d Do you have an example eye video recording that you could share with us? I could have a look at it and give specific feedback.

user-97bf6d 28 January, 2019, 14:56:32

Hello papr! Not yet. I'll create one and share it with you

user-97bf6d 28 January, 2019, 14:56:55

Sorry... You need the eye videos only?

papr 28 January, 2019, 14:57:56

Yes, that is correct. Please share the recording with [email removed]

user-97bf6d 28 January, 2019, 15:07:16

Done! Thanks a lot

user-af87c8 28 January, 2019, 16:43:13

@user-c5bbc4 @papr yes, the preprint will be openly availabe in the next weeks πŸ˜ƒ

user-9e1c96 28 January, 2019, 18:04:58

I recently purchased a couple of monocular glasses -- while they work great, I realize my study design would require accurate gaze beyond the calibrated depth. My question is: is it possible to upgrade the monocular glasses to binocular by simply adding another eye camera, or do I need to purchase a separate glasses frame (i.e., get the binocular model)?

papr 28 January, 2019, 18:28:30

@user-9e1c96 please contact info@pupil-labs.com with that question

user-9e1c96 28 January, 2019, 18:46:39

@papr thanks! just emailed them.

user-4c85cf 29 January, 2019, 23:08:14

Hello! When looking at my gaze position data, I noticed that the difference in gaze timestamps between consecutive sample points is sometimes 4.17 ms instead of 8.33 ms, as would be expected from my sampling rate of 120 Hz. Can someone help explain why this is occurring?

papr 29 January, 2019, 23:10:06

@user-4c85cf if you are mapping monocularly, then you might see interleaved samples from both eyes and therefore an effective sampling rate of 240 Hz: two eyes at 120 Hz each yield gaps of 1/240 s β‰ˆ 4.17 ms.

papr 29 January, 2019, 23:12:29

@user-4c85cf You can check by looking at the gaze topic. If it ends in 01 it is binocular, if it is 0 or 1, then it is monocular. In case you have binocular data, please send a copy of the recording to data@pupil-labs.com such that I can have a closer look.

user-4c85cf 29 January, 2019, 23:31:47

@papr thanks for the quick reply! However, I'm not exactly sure where to find this gaze topic you mentioned.

user-c550fd 30 January, 2019, 00:41:29

Sorry if this is a FAQ, but is it possible to experiment with Pupil and get it working without first buying the expensive glasses?

papr 30 January, 2019, 07:06:59

@user-4c85cf have you been looking at the exported gaze positions csv or at real-time ZMQ data?

user-4c85cf 30 January, 2019, 16:49:07

@papr I've been looking at the csv files

papr 30 January, 2019, 16:55:01

@user-4c85cf Check the base_data field. It should include data in one of these formats: "xxxxxx-0", "xxxxxx-1", or "xxxxxx-0 xxxxxx-1" (the suffix is the eye id; a single entry means monocular, a pair means binocular).
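For completeness, a small pandas sketch that classifies each gaze datum and inspects the inter-sample intervals discussed above. It assumes the v1.x export layout with gaze_timestamp and base_data columns; the column names are worth double-checking against your own export:

```python
import pandas as pd

df = pd.read_csv("exports/000/gaze_positions.csv")

def classify(base_data):
    # base_data looks like "<timestamp>-0 <timestamp>-1" for binocular pairs,
    # or a single "<timestamp>-0" / "<timestamp>-1" entry for monocular data
    eyes = {entry.rsplit("-", 1)[1] for entry in str(base_data).split()}
    return "binocular" if eyes == {"0", "1"} else f"monocular (eye {eyes.pop()})"

df["mapping"] = df["base_data"].apply(classify)
print(df["mapping"].value_counts())
# Interleaved monocular data shows ~4.17 ms gaps at 120 Hz per eye
print(df["gaze_timestamp"].diff().describe())
```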

user-c550fd 30 January, 2019, 17:40:52

Did I ask a bad question? 😦

user-a6a5f2 30 January, 2019, 18:51:06

@user-c550fd check out https://docs.pupil-labs.com/#diy. The key requirement is that you use UVC-compliant cameras. Alternatively, if you want to bypass hardware altogether, you could run the software against a dataset of recorded eye movements.

user-41f1bf 31 January, 2019, 04:27:49

@user-c550fd, you can build your own DIY Pupil headset to make your own recordings. Also, you can use the sample data available on the site.

user-a39804 31 January, 2019, 04:36:26

@user-41f1bf can you provide any link to build my own recordings? Let's say I want to stream the world camera to a remote website. Is this possible?

user-41f1bf 31 January, 2019, 04:38:03

docs.pupil-labs.com

user-41f1bf 31 January, 2019, 04:38:53

Try writing DIY in the search box

user-a39804 31 January, 2019, 04:39:50

Thanks. I will try that

wrp 31 January, 2019, 07:41:32

@user-a39804 streaming the world camera (or any camera feed) can be done on a local WiFi network with minimal latency. However, streaming over the internet (Pupil Capture (desktop) --> Server --> Client) would likely experience high latency.
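As a starting point on the capture side, frames can be received over the local network with the Frame Publisher plugin enabled in Capture. The sketch below assumes the v1.x frame topic layout (a metadata frame followed by raw image bytes) and the BGR format setting; relaying the frames to a website would need an extra server hop, which is where the latency mentioned above comes in:

```python
import zmq
import msgpack
import numpy as np

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote on the capture machine
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "frame.world")

while True:
    topic, payload, img_bytes = sub.recv_multipart()
    meta = msgpack.loads(payload, raw=False)
    # With format "bgr", the raw bytes form a height x width x 3 image
    frame = np.frombuffer(img_bytes, dtype=np.uint8).reshape(
        meta["height"], meta["width"], 3
    )
    # frame is now ready for display or re-encoding for streaming
```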

user-1609ba 31 January, 2019, 12:30:09

@papr I would like to ask if I can read the confidence value after calibration. I know there is a threshold which decides whether the calibration succeeded or not, but I don't know how to get/modify that threshold, and I don't know how to check my confidence value right after calibration.

papr 31 January, 2019, 13:29:30

Hi @user-1609ba There are a few concepts that need to be distinguished carefully here:
- Confidence: Quality assessment of a pupil datum (eye camera space). You can see the pupil confidence at all times in the top left graphs of the world window.
- Calibration success: A calibration requires two things to be successful: pupil (eye camera space) and reference (world camera space) data. If this is given, the calibration tries to find the best possible mapping function between pupil data (eye camera space) and gaze data (world camera space). There is no quality assessment in regards to the mapping accuracy (see below).
- Calibration confidence threshold: In order to get the most accurate mapping, we filter the pupil data by confidence before applying the calibration, i.e. we only use high-quality data for calibration.
- Calibration accuracy + precision: This is measured after the calibration by the Accuracy Visualizer plugin. See the docs for details on this.

Could you specify your questions regarding these terms?
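As a starting point for an application-side check, one could listen for the calibration notifications on the network API. A hedged sketch; the topic names are assumed from the v1.x notification scheme (notify.calibration.successful / notify.calibration.failed) and should be verified against your Capture version:

```python
import zmq
import msgpack

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "notify.calibration")

while True:
    topic_bytes, payload = sub.recv_multipart()
    topic = topic_bytes.decode()
    notification = msgpack.loads(payload, raw=False)
    if topic.endswith("successful"):
        print("Calibration succeeded:", notification)
    elif topic.endswith("failed"):
        print("Calibration failed; adjust the HMD and recalibrate")
```

Note that success/failure only tells you that a mapping could be fitted at all; the mapping accuracy itself is reported by the Accuracy Visualizer plugin, as described above.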

user-1609ba 31 January, 2019, 14:03:46

@papr thanks for your explanation. Here is the situation: I am trying to develop an application where I need to know whether the calibration was successful enough. If the calibration is not good enough (which means the pupil data and gaze data are not consistent enough), then I will probably ask the user to redo the calibration with a slight adjustment of the HMD setting. Therefore, I would like to ask how I can quantify the calibration's quality. Also, how can I improve the calibration's quality if it is not optimal?

user-a39804 31 January, 2019, 15:40:34

@wrp that will be an issue then, because we want to stream in real time with low latency.

user-a08c80 31 January, 2019, 23:56:15

Looking at heatmaps in Pupil Player: we are seeing a static heatmap. Is there a way to show the heatmap as it is generated in Pupil Player? We have screen-recorded while collecting data to produce this effect, but feel this is not ideal. Ty

End of January archive