core


user-4bf830 01 July, 2019, 16:51:53

Hello! Is there a way to create interest areas so that my output will note when the pupil is fixated on a specific point? I'm guessing there's a way to draw those windows in the capture screen?

papr 01 July, 2019, 16:55:56

@user-4bf830 check out the surface tracker plugin. It does what you need. Check the documentation for information on the setup.

user-4bf830 01 July, 2019, 16:56:18

Thank you!

user-bd800a 02 July, 2019, 12:01:24

Hi, I've experienced, on at least 2 different headsets, a freeze of one eye video or one eye camera no longer being recognised in the middle of recordings, and could only solve this with a different USB-C cable. Has this ever occurred before? On another note, with 1.8 and 1.10 on my machine (Windows 10) I've experienced the disappearing of the sidebar icons. They still work and the label also shows up, but not the icon. Resizing restores them, but maybe you need to know about this. I haven't been able to replicate this on any other machine though.

user-bd800a 02 July, 2019, 12:14:45

Also, do you plan on maybe releasing a headset suitable for children and/or for different populations, with a flatter nose bridge for example? In the latter case we experienced the glasses falling very low on the person's face 😅

wrp 03 July, 2019, 07:08:07

@user-bd800a thanks for the feedback. Notes:
1. Cameras not being recognized - Have you been able to reproduce this issue on another Win 10 machine? Are you using the USB-C to A cable that we supplied?
2. Resize issue on Windows - yes, we are aware of this and have an open issue on GitHub.
3. Child-sized headset - Yes, we do have a child-sized headset. While not listed on the store, we can make this custom if you email sales@pupil-labs.com
4. Low nose bridge - Have you tried the clear silicone nose pads?

user-067553 03 July, 2019, 11:43:41

Hi, I would like to insert the time into the world.mp4 produced by Pupil Player, is that possible? Thanks

user-0cf021 03 July, 2019, 15:02:05

Hi, we've been using Pupil Capture v0.9 with the 120Hz HTC Vive eye tracker successfully until now, when we've tried to upgrade to a 200Hz eye tracker and Pupil Capture v1.12. Unfortunately, when we try to calibrate, there are no markers shown. Now, when trying to use the 120Hz eye tracker, we get an error "unexpected keyword argument 'exposure mode'" or "unexpected keyword argument 'check stripes'". Could you please advise us on what we could look into?

user-bc5d02 03 July, 2019, 16:13:50

Hello to everybody. Does anybody know what to do if my recording session in Pupil Capture was interrupted by the start of sleep mode? After my laptop woke up, the recording continued. But the recording folder does not play in Pupil Player. Actually, it does play, but only a grey screen. Has anyone faced such a problem?

user-ffdd08 03 July, 2019, 20:26:34

I'm having trouble viewing exported heatmaps. When I open them up (Windows Photo Viewer) they appear briefly and then disappear. Has anyone else had this problem? I'm also unsure if this is a Pupil issue or a Windows issue, although I can view other .png images just fine.

wrp 04 July, 2019, 04:59:56

@user-067553 You want to be able to write the timestamp directly into the pixels of the world video, correct? While this is not currently possible with Pupil Player, you could do this with ffmpeg after exporting world.mp4 like so:

ffmpeg -i "world.mp4" -vf \
"[in]drawtext=fontsize=15:fontfile=UbuntuMono-R.ttf:text='Frame\: %{n}':fontsize=48:fontcolor='white':x=(w-tw)/2:y=(h-50)-(2*lh):box=1:boxcolor=0x00000099,drawtext=fontsize=15:fontfile=UbuntuMono-R.ttf:text='Time\: %{pts\:hms}':fontsize=48:fontcolor='white':x=(w-tw)/2:y=(h)-(2*lh):box=1:boxcolor=0x00000099[out]" \
"world_with_time.mp4"

(not the "prettiest" one liner I know, but should get the job done relatively quickly. Please ensure that you have the font installed, or specify a different font that is installed on your system. in this example I am using UbuntuMono-R.ttf)

wrp 04 July, 2019, 05:00:33

@user-0cf021 please raise the issue regarding VR in the vr-ar channel.

wrp 04 July, 2019, 05:02:46

@user-bc5d02 You should try to install an application that would enable your laptop to not sleep even if the lid is closed. What I think you saw with the "gray screen" is the result of Pupil Player trying to recover from a "split recording" behavior, where you had video recorded for part of the recording, but not all of it. In cases like these, Pupil Player will be able to play back data, but will fill in gaps with blank world video frames (gray screen).

wrp 04 July, 2019, 05:03:36

@user-ffdd08 did you define a size for the surfaces before exporting? Have you tried opening in another photo viewer?

user-0cf021 04 July, 2019, 10:15:20

@wrp Should I move my question there even though we are not using the hmd-eyes? Our experiment is running on Unity 2017 so we cannot use it. Thank you 🙂

user-0cf021 04 July, 2019, 10:37:10

Hi again, is it possible to get access to a pre-compiled Pupil Capture v0.9.12? We desperately need it for running our experiment, and you have already removed it from your GitHub

wrp 04 July, 2019, 11:11:19

@user-d16d74 you can always download earlier bundles if desired -- https://github.com/pupil-labs/pupil/releases?after=v1.4 - however, it is recommended to always use the latest release.

user-0cf021 04 July, 2019, 12:35:15

@wrp Thank you very much!

user-a6cc45 04 July, 2019, 16:03:56

Hi, I have a question about timestamps. How do I convert "Start Time (Synced)" to a datetime? I have two recordings (from Capture versions 1.11 and 1.12) and I tried to convert this timestamp on https://www.epochconverter.com/ and the results are in the picture

Chat image

user-a6cc45 04 July, 2019, 16:06:04

I need to convert it in order to make plot like below:

Chat image

papr 04 July, 2019, 20:41:14

@user-a6cc45 do you really need the exact date? You can use pandas.to_datetime(timestamps, unit="s") to convert timestamps to arbitrary datetime objects. What matters is the time difference between the events, isn't it?
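
A minimal sketch of that suggestion (the timestamp values here are hypothetical; real ones come from e.g. world_timestamps.npy). Pupil's synced timestamps are typically a monotonic clock rather than the Unix epoch, which is why epochconverter shows arbitrary dates; only the differences between events are meaningful:

import pandas as pd

timestamps = [3674.80, 3675.05, 3675.30]  # hypothetical Pupil timestamps in seconds

# Arbitrary datetime objects; the absolute dates carry no meaning here
datetimes = pd.to_datetime(timestamps, unit="s")

# Time relative to the start of the recording, in seconds
seconds_since_start = [t - timestamps[0] for t in timestamps]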

user-e91538 04 July, 2019, 22:26:50

Hello guys, we bought a Pupil Labs eye tracking device and tried to install it on our Win 10 machine. However, even when following the instructions about the drivers, none of the exe files wants to open and start installation. Please advise us on what to do to install the software on our laptop. https://docs.pupil-labs.com/#troubleshooting

wrp 05 July, 2019, 02:16:13

Hi @user-e91538 If you haven't already, please try running pupil_capture.exe as admin. When you say "none of the exe files wants to open and start installation" what do you mean? What do you see when you run pupil_capture.exe with admin permissions?

user-e91538 05 July, 2019, 13:23:36

Hello, I am attaching some pictures for you to check. I downloaded the latest version from GitHub for 64-bit Win, and in all the folders I am checking the PyInstaller folder as I don't see any exe files anywhere else. In the capture folder for Win 64-bit, I see 4 files: run, run_d, runw and runw_d (d stands for drivers I assume, and w for Windows?). I clicked all of them but no success (for the "run" file I even get that it is compatible with Win 8 only)

user-e91538 05 July, 2019, 13:30:02

Here are the images attached - unfortunately I couldn't send them in a rar as the file is too big, @wrp

user-e91538 05 July, 2019, 13:30:54

hm, even now although my images are 5MB, it does not allow me to send them (with a limit of 8MB?)

user-e91538 05 July, 2019, 13:32:32

In general the error messages are: "Cannot open the ....exe";

user-e91538 05 July, 2019, 13:32:38

[11524] PyInstaller Bootloader 3.x;

user-e91538 05 July, 2019, 13:32:39

[11524] LOADER executable is:...exe

user-e91538 05 July, 2019, 13:33:02

[11524] LOADER MEIPASS is NULL

papr 05 July, 2019, 13:33:12

@user-e91538 just for verification, you downloaded the 7zip file from the release page, correct?

user-e91538 05 July, 2019, 13:34:27

yes, pupil_v1.13-29-g277ac8c_windows_x64.7z 476 MB

user-e91538 05 July, 2019, 13:35:12

from here : https://github.com/pupil-labs/pupil/releases/tag/v1.13

papr 05 July, 2019, 13:35:52

ok, and the output above is shown in the console window which opens when double-clicking the Capture.exe file?

user-e91538 05 July, 2019, 13:36:16

yes

papr 05 July, 2019, 13:37:01

Can you paste the complete log?

user-e91538 05 July, 2019, 13:37:37

sec, as it is on another comp

user-e91538 05 July, 2019, 13:37:58

just to show in which folder I am

Chat image

user-e91538 05 July, 2019, 13:38:51

I will try to provide pictures, sorry

user-e91538 05 July, 2019, 13:39:32

error1

Chat image

user-e91538 05 July, 2019, 13:41:03

error2

Chat image

user-e91538 05 July, 2019, 13:41:09

error3

Chat image

user-e91538 05 July, 2019, 13:41:20

error4

Chat image

user-e91538 05 July, 2019, 13:41:27

error5

Chat image

user-e91538 05 July, 2019, 13:41:58

is that helping?

user-e91538 05 July, 2019, 13:42:33

@papr ? Is that ok? Shall I send any other info?

user-e91538 05 July, 2019, 13:52:04

logs checked - I don't even see a proper error log there

papr 05 July, 2019, 14:03:15

@user-e91538 That is the wrong exe! In the unpacked 7zip folder, there is a Pupil Capture.exe, which needs to be executed. It has a green icon

papr 05 July, 2019, 14:03:29

Do not execute anything from within the Pyinstaller folder

user-e91538 05 July, 2019, 14:03:41

perfect. let me check as I don't see it

user-e91538 05 July, 2019, 14:07:37

wonderful! I so much hoped I was in the wrong folder. That is why I wanted to send those screenshots

user-e91538 05 July, 2019, 14:07:58

Thank you so much!!!!! @papr

user-c4e9fb 05 July, 2019, 17:05:44

Hello, I get the following error when building: uvc.c:15514:20: error: too many arguments to function 'uvc_stream_start'

user-c4e9fb 05 July, 2019, 17:06:03

Im trying to install on ubuntu 18.04

user-1dd00a 05 July, 2019, 17:11:09

Hi, I've just set up Pupil Labs eye tracking. My first recording worked fine, but now when I try to open Pupil Player, the Pupil Player window doesn't open properly. The command window reads: 'player - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend'. The other window shows in the task bar but doesn't display on the screen...

papr 05 July, 2019, 17:13:27

@user-1dd00a did you record with Capture or Pupil Mobile?

user-1dd00a 05 July, 2019, 17:14:07

I recorded with Capture, viewed it once with Pupil Player successfully but since then Pupil Player hasn't displayed properly.

user-c4e9fb 05 July, 2019, 17:17:02

Ok nvm I got it working

papr 05 July, 2019, 17:19:20

@user-1dd00a the error that you see is not an actual error. What is the last output that you see? Is it the same message?

user-1dd00a 05 July, 2019, 17:21:36

I just don't see the other window at all (the one that says 'Drop directory here...'). It is shown in my task bar but it doesn't appear...

user-1dd00a 05 July, 2019, 17:27:30

This is the only window that is actually visible - the Pupil Player window is displayed when I hover over the task bar

Untitled_Recovered.bmp

user-f247c1 05 July, 2019, 17:47:39

Noob question. I've got Ubuntu on a laptop. I've downloaded pupil-1.13.zip from the GitHub site, but there don't seem to be any instructions on how I install it. What do I do next?

papr 05 July, 2019, 17:48:53

@user-1dd00a interesting. And you can't click onto the thumbnail?

papr 05 July, 2019, 17:49:23

@user-f247c1 unzipping it yields three .deb files. Just double-click them to install

user-f247c1 05 July, 2019, 17:49:43

hmm, can't see any .deb files

user-1dd00a 05 July, 2019, 17:50:58

@papr I can but nothing happens... I just tried re-installing the Player folder but same issue persists (restarting doesn't fix it either).

papr 05 July, 2019, 17:51:50

@user-1dd00a please try deleting the user settings files in the pupil_player_settings folder

papr 05 July, 2019, 17:52:30

@user-f247c1 what files do you see when unzipping the zip?

user-f247c1 05 July, 2019, 17:53:12

directories called: deployment, pupil_external, pupil-src,

user-f247c1 05 July, 2019, 17:53:51

files called COPYING, COPYING.LESSER, README.md and update_license_header.py

papr 05 July, 2019, 17:55:09

@user-f247c1 ah, you downloaded the wrong zip. You probably downloaded the source zip. Try to download the other zip that has Linux in the name.

user-f247c1 05 July, 2019, 17:55:50

Where do I find that zip? All the links seem to bring me back to the source zip.

user-1dd00a 05 July, 2019, 17:56:22

@papr Brilliant - that worked. Thanks!!! (no idea why it happened in the first place but now I know how to fix it....)

papr 05 July, 2019, 17:57:15

@user-1dd00a I do not know why this happens either 🙁 windows 🤷

user-f247c1 05 July, 2019, 17:58:46

ahh thank you, I was clicking on the 1.13 tag under latest release, which took me to the source files.

user-c4e9fb 05 July, 2019, 18:16:33

hi, I set up Pupil Capture but my eye image looks inverted, is this normal?

papr 05 July, 2019, 18:17:22

@user-c4e9fb yes, the camera is physically flipped. This has no impact on pupil detection.

user-c4e9fb 05 July, 2019, 18:17:51

Cool thanks

user-a6cc45 06 July, 2019, 20:13:55

@papr You're right, instead of the exact time on the X axis I've decided to put there the amount of seconds/minutes since the beginning of the recording 😃 However, I have another question: when I play my recording (recorded in Capture v1.11) in Player v1.13 I get the message: "player - [WARNING] surface_tracker.surface: You have loaded an old and deprecated surface definition! Please re-define this surface for increased mapping accuracy!" How can I re-define the surface? Will editing the old surface in Player help?

papr 06 July, 2019, 20:18:11

@user-a6cc45 You can add new surfaces in Player as you do it in Capture. Just editing the old ones should work as well.

user-0cf021 07 July, 2019, 14:49:30

Hi again, I'm still experiencing issues after switching from the 120Hz to the 200Hz eye tracker. Because our Unity project doesn't currently work with the 200Hz tracker, we want to go back to using the 120Hz eye tracker, but we get the error "unexpected keyword argument 'exposure mode'" or "unexpected keyword argument 'check stripes'". @fxlange from the hmd-eyes channel recommended to "restart with default settings after switching hardware". Could you please advise me how to restart Pupil Capture with default settings, when right now I cannot launch it at all?

user-0cf021 07 July, 2019, 14:53:46

This is what I get when I try to launch the exe. I'm using Pupil Capture v0.9.12 and due to our project, I cannot use a newer version

Chat image

papr 07 July, 2019, 14:55:24

@user-0cf021 Please delete the user_setting_* files in the pupil_capture_settings folder in your users' home folder and start the application again.

user-406c8b 07 July, 2019, 15:17:28

Hello! I am trying to track users' gaze while they walk in a park/in a street so that I can later assess what they are looking at and when. Would this be possible using a Pupil Labs headset, and, if so, what calibration method would you recommend?

user-0cf021 07 July, 2019, 15:17:39

@papr Thank you so much for replying on Sunday! I've tried that and the problem persists, and it did not create new user setting files

user-0cf021 07 July, 2019, 15:30:23

Alternatively, I get this error message. I haven't figured out why I sometimes get "exposure mode" and sometimes "check stripes" errors. I launch the exe file the same way, and one time I get one error and the second time I get the other error.

Chat image

papr 07 July, 2019, 15:33:45

@user-0cf021 I do not think that the correct files were deleted.

papr 07 July, 2019, 15:35:46

@user-0cf021 could you please also check the capture.log file next to the user_setting files (or where they were before), and see if it includes further details?

user-0cf021 07 July, 2019, 15:52:55

@papr You were right about the incorrect files! I was able to successfully start the old Pupil Capture now. It doesn't connect to the hardware though. I'm enclosing the capture log

capture.log

papr 07 July, 2019, 15:57:14

@user-0cf021 Nice! The log shows that the cameras can be found but the process is not able to start the camera. Please disconnect the hardware, restart your computer and try again.

user-0cf021 07 July, 2019, 15:59:39

@papr Perfect! I will do that, I just need to wait until some process that's currently running finishes. Thank you so much for helping, I really appreciate it!

user-e7102b 07 July, 2019, 16:59:26

Hey @papr , I've created a DIY hardware setup comprised of two Logitech C615 webcams and one C930e webcam, all sampling at 30 Hz. When I load the data into Pupil Player and play back the recording with both eye overlay videos enabled, it's obvious that the cameras are all out of sync. Comparing the timestamps from the different cameras also confirms that they are several frames out of sync, so this doesn't seem to be just a visualization issue. I was wondering if you have any suggestions for rectifying this issue?

user-e7102b 07 July, 2019, 17:04:32

Side note - I'm running pupil capture 1.5, which I realize is a bit outdated now (I'm still using this because the more recent versions of pupil capture won't work with our "pupil middleman" server due to the changes in the annotation format). Also, version 1 of my DIY setup was comprised of 3 identical C615 webcams, and I don't recall seeing this issue (in the current version we upgraded one camera to a C930e for better video quality).

papr 07 July, 2019, 17:08:02

@user-e7102b if I remember correctly, we use software timestamps instead of hardware timestamps for the C930. This is due to the camera not producing sane hardware timestamps. The same might apply to the C615. You might need to change the source code to add the same exception for the C615

user-e7102b 07 July, 2019, 17:38:42

@papr thanks for the quick reply! That makes a lot of sense. I've tracked down mentions of the C930e webcam in lines 143 and 381 of the uvc_backend.py source code. Can you advise on the most straightforward way to make this exception for the C615?

user-0cf021 07 July, 2019, 18:11:00

@papr I've restarted the PC and the problem prevails. I've also tried reinstalling the driver following these steps but nothing changed: https://www.bullseye.uni-oldenburg.de/unity-and-pupil-capture-installation/

papr 07 July, 2019, 18:34:35

@user-e7102b mmh, actually, it looks like the exception is made the other way around: always use software timestamps unless it is a Pupil cam (l.180-188).

Could you share the timestamp files of the recording with me? I will have a look at them in the coming week.
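
For context, a rough sketch (in Python, an assumption based on this thread rather than the verbatim Pupil source) of the per-camera decision being described:

def use_hardware_timestamps(camera_name: str) -> bool:
    # Only Pupil Labs cameras are known to deliver sane hardware timestamps;
    # third-party cameras such as the C615/C930e fall back to software
    # timestamps taken at frame arrival.
    return camera_name.startswith("Pupil Cam")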

papr 07 July, 2019, 18:37:26

@user-0cf021 Sorry, I am out of ideas here. Even though you cannot use the current version of Capture for your project, did you give it a try? Maybe its automatic driver installation does something different which makes it work?

user-e7102b 07 July, 2019, 19:11:52

@papr ok thanks. Here are the timestamp files:

eye0_timestamps.npy

user-e7102b 07 July, 2019, 19:12:00

eye1_timestamps.npy

user-e7102b 07 July, 2019, 19:12:01

world_timestamps.npy

user-0cf021 07 July, 2019, 19:28:43

@papr I've resolved the issue! I followed the steps mentioned here by willpatera https://github.com/pupil-labs/pupil/issues/1011 (especially showing the hidden devices) and uninstalled all of them. Then I restarted the PC and meticulously followed these steps: https://www.bullseye.uni-oldenburg.de/unity-and-pupil-capture-installation/ I realised I had missed the step where I was supposed to unselect ignoring the composite parents. Such a silly mistake! There, I found that different drivers had been installed for these. After reinstalling the libusbK 3.0.7.0 drivers, everything worked fine. Thank you so much for all your help! 💯 I absolutely couldn't have done it without you, and you've just saved a lot of people's research! 😁

user-4bf830 08 July, 2019, 15:42:34

@papr Do you happen to know the size of the screws used on the ball joints for the eye cameras on the Pupil mobile headsets?

user-909a69 08 July, 2019, 16:25:48

Hi everyone, we have your 200Hz camera in my lab and I am wondering if someone can send me the reference of the infrared LEDs you use. Thanks!

papr 08 July, 2019, 16:40:11

@user-4bf830 @user-755e9e might know that

papr 08 July, 2019, 16:54:11

@user-909a69 @user-755e9e might be able to help here as well

user-ffdd08 08 July, 2019, 17:53:13

Hi. What are some tips to increase the accuracy of the eye tracker? I have a single 200Hz eye camera. The gaze data always seems to be a few inches off from where I was actually looking. But I see much better results in online videos, so I feel like I am missing something.

user-ffdd08 08 July, 2019, 17:55:17

^ and the error is not consistent enough across the FOV to just adjust for drift.

user-ffdd08 08 July, 2019, 18:15:14

Also, is there a way to change the default X and Y size for new surfaces? I will be analyzing a lot of videos that all have the same surface size and this would save a lot of time.

user-c539ee 08 July, 2019, 22:05:20

allo

user-755e9e 09 July, 2019, 06:53:19

Hi @user-909a69 please find attached the datasheet of the infrared LEDs.

datasheet_led.pdf

user-755e9e 09 July, 2019, 07:41:01

Hello @user-4bf830 , the screws used for the ball joints on the 200Hz eye cameras are M1.5/8mm.

user-067553 09 July, 2019, 08:44:58

hi @wrp thanks for the answer. I've tried the code you sent me for ffmpeg, but it gives me an error. Here is the screenshot; I've just changed the font to arial.ttf

Chat image

wrp 09 July, 2019, 08:47:33

Might be missing flags for the output format. I will try this again later and get back to you with an updated ffmpeg script

user-067553 09 July, 2019, 09:44:24

thanks @wrp, I've resolved it, now it works. Is it possible to take the time from the recording? Like, if the video was taken at 1 PM, is it possible to show the time at which I was recording with Pupil Capture?

papr 09 July, 2019, 15:45:31

@user-067553 I think you would have to write a custom player plugin to accomplish this.

user-067553 09 July, 2019, 16:00:03

Hi @papr Can we create it together? I think it could be a good functionality for all users of Pupil.

papr 09 July, 2019, 16:07:22

Might be a nice project, indeed.

wrp 10 July, 2019, 00:42:38

You might be able to do this just with ffmpeg, but it would likely be a huge one-liner

user-14d189 10 July, 2019, 01:10:48

Can I ask for some experience with Pupil Mobile please? Can I use saved calibrations that have been obtained with a direct USB connection? Or would you recommend post-processing of calibrations?

wrp 10 July, 2019, 01:14:20

@user-14d189 what is your network setup like? Do you have a dedicated router? I ask because high network latency can lead to dropped frames, which could in turn lead to fewer scene camera frames for calibration marker detection and might yield worse calibration results due to less data

wrp 10 July, 2019, 01:16:15

I would recommend starting recording prior to calibration, that way you can decide if you want to do calibration post-hoc or use the real time calibration provided by pupil capture

wrp 10 July, 2019, 01:18:08

Btw @user-14d189 I saw your other question in software-dev ; I will take a look at that later today

user-14d189 10 July, 2019, 01:21:47

We use a dedicated router, a Netgear CG3100D, nothing else on there. Works well at close range. We run a custom Pico Flexx app on top of Pupil Mobile. At close range it uses 40Mbps. It does not seem to have much lag, but the point inclusion during calibration is about half.

user-14d189 10 July, 2019, 01:22:17

thanks for looking into the auto exposure.

wrp 10 July, 2019, 01:24:32

In this case I would recommend starting recording prior to calibration

user-14d189 10 July, 2019, 01:24:39

the 40Mbps --- the point cloud data is a bit chunky.

user-14d189 10 July, 2019, 01:25:30

I will have a closer look to the data and try again. Thanks!!

user-21aacd 10 July, 2019, 10:24:48

Hi, is there a way to detect fixations using your implementation (the Offline_Fixation_Detector in fixation_detector.py) by passing a csv file instead of the capture? I have a csv file containing info about some points (index of the frame in which each point was registered, norm_pos_x, norm_pos_y) and I'd like to detect fixations, if any, among those points. Is it possible, or do I have to modify the script in some way? Thanks in advance

user-405421 10 July, 2019, 15:13:43

Hi everyone, I'm interested to know the expected release date for Pupil Invisible.

papr 10 July, 2019, 15:20:19

@user-405421 You can preorder starting today. We plan to start shipping during Q4 of this year.

user-405421 10 July, 2019, 15:28:15

@papr thank you. We will be needing it for a research project about to launch in the Fall, so we will need a more specific timeline to decide whether this is a good plan for us.

user-a7dea8 10 July, 2019, 15:31:50

Hello, is there a way to get the location coordinates (in the video) of the circle targets used for offline calibration in the csv file that is exported by Pupil Player?

wrp 10 July, 2019, 15:34:03

@user-405421 We will follow up with more information in 1-2 weeks with a more concrete shipping date estimate for Pupil Invisible.

papr 10 July, 2019, 15:35:41

@user-a7dea8 Currently, this information is not exposed as a csv export, but this can be changed for the next version.

user-a7dea8 10 July, 2019, 15:36:50

It would be great if that feature could be added; currently I am independently finding the pixel coordinates of the targets after the fact.

user-405421 10 July, 2019, 15:36:57

@wrp great, thanks

papr 10 July, 2019, 15:37:26

@user-a7dea8 you could extract this information using Python already

user-4bf830 10 July, 2019, 15:53:04

Hey, @papr, I've been playing with the surface tracker plugin (with success) for a few days now and just yesterday and today it's been freezing my capture window the second I try to enable it. I'm trying to attach some screenshots of the operations in the event that those help. Any suggestions? I've tried some of the obvious, like restarting the window with the default settings, closing and reopening the player, etc.

Chat image

user-4bf830 10 July, 2019, 15:53:08

Chat image

user-4bf830 10 July, 2019, 16:01:10

So three reboots later it's working again, but I'm not sure what the issue was.

papr 10 July, 2019, 16:44:55

@user-764f72 please have a look at the traceback that was posted by @user-4bf830 Do you see a way to reproduce it?

user-4bf830 10 July, 2019, 17:16:35

Unfortunately (?) no, it appears to just be working now.

papr 10 July, 2019, 17:22:15

@user-4bf830 sorry, that question was meant for @user-764f72. I forgot to add the "."

user-72cb32 12 July, 2019, 04:29:07

@papr Hi! I'm sorry for bothering you once again. So, I've got msgpack for Java ready, and when I receive and decode the second frame (a msgpack-encoded dictionary), either it does not print or it prints an unclear string.

user-72cb32 12 July, 2019, 04:29:11

print("TOPIC : "); byte[] bytes = subscriber.recv(); MessageUnpacker unpacker = MessagePack.newDefaultUnpacker(bytes);
ff(unpacker); println();

print("PAYLOAD : "); byte[] bytes2 = subscriber.recv(); MessageUnpacker unpacker2 = MessagePack.newDefaultUnpacker(bytes2);
ff(unpacker2); println();

user-72cb32 12 July, 2019, 04:29:38

ff() is the function for decoding according to the dictionary that pupil labs provided

user-72cb32 12 July, 2019, 04:29:56

Is there any other dictionary available for decoding?

papr 12 July, 2019, 05:58:57

@user-72cb32 it should not be necessary to deserialize the first frame btw. It is a simple string.

papr 12 July, 2019, 05:59:42

Could you also please provide some example output?

user-72cb32 12 July, 2019, 06:43:11

It looks like this!

[email removed] [email removed] [email removed] &?u???`+ D?radius???d??YC??confidence??? [email removed][email removed]i?}?sphere??center??@B?ya??????T[email removed][email removed][email removed] X??model_id??model_birth_timestamp?A"a?Y??p?theta???n??? ?phi??^6??*??method?3d c++?id?norm_pos????G???????6Yf?x

papr 12 July, 2019, 06:51:36

@user-72cb32 this looks like the raw msgpack data instead of the decoded result. Could you share your ff function? E.g. via gist.github.com

user-72cb32 12 July, 2019, 07:09:59

Oh, ok!

user-72cb32 12 July, 2019, 07:13:17

this is it

papr 12 July, 2019, 07:16:51

It looks like you are missing a case for dictionaries

papr 12 July, 2019, 07:18:04

Or mappings

user-72cb32 12 July, 2019, 07:51:29

Could you elaborate that a little more specifically?!

papr 12 July, 2019, 07:59:58

Your function is deserializing content based on its type, see the big switch statement. It includes cases for all types but one: dictionary/mapping. The second frame sent by Capture is always a dictionary, i.e. it is always ignored by your code.

user-72cb32 12 July, 2019, 08:08:19

Ok. It could be either Dictionary or Mapping? Because there is a type for dictionary and also a type for mapping

user-72cb32 12 July, 2019, 08:08:25

as long as I know

user-72cb32 12 July, 2019, 08:13:19

Probably something like this?

papr 12 July, 2019, 08:15:07

Mmh, I am not sure if there is a difference. Usually both terms are used interchangeably. In case one of both does not require the keys to be strings, take that one.

papr 12 July, 2019, 08:17:29

@user-72cb32 also, you might need to call the FF function recursively to unpack arrays and maps correctly
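
For illustration, here is that recursion sketched in Python (msgpack-python's unpackb already recurses internally; the explicit walker just shows what the Java ff() needs to do when it hits map and array headers):

import msgpack

# Sample bytes standing in for the second ZMQ frame
raw_frame = msgpack.packb({"topic": "pupil.0", "norm_pos": [0.5, 0.5]})

def decode(obj):
    if isinstance(obj, dict):  # map: recurse into keys and values
        return {decode(k): decode(v) for k, v in obj.items()}
    if isinstance(obj, (list, tuple)):  # array: recurse into each element
        return [decode(e) for e in obj]
    if isinstance(obj, bytes):  # raw bytes: decode to string
        return obj.decode("utf-8", errors="replace")
    return obj  # ints, floats, bools, None pass through

print(decode(msgpack.unpackb(raw_frame, raw=True)))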

user-72cb32 12 July, 2019, 08:27:58

Yes! That was the point of the ff function! I will try, and if there's another issue, I will find you again. Thank you so much!

user-4d0769 12 July, 2019, 22:18:26

Hi all, does one of the Pupil Labs devices give the 3D geometric parameters of the eye?

papr 14 July, 2019, 12:51:26

@user-4d0769 Capture implements the Swirski 3D model, which gives you the orientation and position of the model in relation to the eye camera. Is this what you meant?

user-94ac2a 15 July, 2019, 08:32:40

Hey. How can we make the Pupil Labs software run on ARM64 Linux?

user-067553 15 July, 2019, 10:48:55

Hi, last week we did recordings for our study; today I'm trying to open them with Pupil Player to assemble the eye recordings with the world recording. Two problems: 1. When we open the recording (1 hour) with Pupil Player it is grey (I think it's a Pupil Player error, because the world.mp4 file is there and opens in VLC). 2. The eye videos are desynchronized.

papr 15 July, 2019, 13:10:38

@user-067553 could you please share the *_timestamps.npy files of the recording?

papr 15 July, 2019, 13:10:52

As well as the info.csv file, please.

user-067553 15 July, 2019, 15:26:21

Hi @papr, here it is

eye0_timestamps.npy

user-067553 15 July, 2019, 15:26:25

eye1_timestamps.npy

user-067553 15 July, 2019, 15:26:26

info.csv

user-067553 15 July, 2019, 15:26:27

gaze_timestamps.npy

user-067553 15 July, 2019, 15:26:27

notify_timestamps.npy

user-067553 15 July, 2019, 15:26:29

pupil_timestamps.npy

user-067553 15 July, 2019, 15:26:29

world_timestamps.npy

papr 15 July, 2019, 15:26:36

@user-067553 Thank you very much!

user-067553 15 July, 2019, 15:28:47

Here it is a screenshot from pupil player @papr

Chat image

papr 15 July, 2019, 15:30:09

@user-067553 Thank you. This is an issue I have been working on today. It will be fixed with the next release of Pupil Player. I will be able to send you updated timestamp files that remove the issue later today 👍

user-067553 15 July, 2019, 17:10:09

Thank you @papr

user-88b704 15 July, 2019, 20:01:29

hi, I am doing offline calibration and would like to access the x/y values of the reference points. Is there a way that I can extract the data from the .plcal file?

user-dae891 15 July, 2019, 21:13:26

hi, I have a problem with the heatmap size. I am using the most recent version of Pupil Player, and in the offline surface tracker I have set the size of the heatmap (3840x2160). However, the exported heatmap has a size of 31x17. This is very strange, since the older version of Pupil Player exported the size that I wrote.

Chat image

user-dae891 15 July, 2019, 21:13:27

Chat image

papr 15 July, 2019, 21:16:24

@user-dae891 the logic for the heatmap size changed in v1.13. Try changing the heatmap smoothness.

user-dae891 15 July, 2019, 22:35:39

Sorry, I don't understand the heatmap smoothness setting. I have tried to adjust the value, and it gives me a different heatmap size every time. Would you please explain to me what heatmap smoothness is? And how can I get a 4K resolution heatmap?

user-dae891 16 July, 2019, 17:25:59

@papr

user-dae891 16 July, 2019, 17:27:18

@papr Do you know how I can get a heatmap of a certain resolution?

papr 16 July, 2019, 17:38:17

@user-dae891 Apologies, I didn't have time to look that up today. I will come back to you tomorrow.

user-dae891 16 July, 2019, 17:49:07

@papr Thank you very much! Additionally, I was wondering, do you guys have any source on how to choose 'heatmap smoothness'? I am working towards a publication and may need a supporting reference for the parameter selection.

papr 17 July, 2019, 09:55:17

@user-dae891 I talked to @marc (the author of the surface tracker changes). He will look into it.

papr 17 July, 2019, 11:14:43

@user-88b704 The *.plcal files do not include any reference information. But the "reference_locations.msgpack" file in the "offline_data" folder does!

This is an example of how to read and extract the data: https://gist.github.com/papr/655cc5f005ca032b0eb602317e89f9ba
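
A rough sketch of the reading side (the gist above is the authoritative version; the exact payload layout here is an assumption, so inspect what you get back):

import msgpack

with open("offline_data/reference_locations.msgpack", "rb") as f:
    reference_locations = msgpack.unpack(f, raw=False)

for ref in reference_locations:
    # each entry is expected to carry a screen position plus a frame
    # index/timestamp
    print(ref)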

marc 17 July, 2019, 12:07:52

@user-dae891 The best smoothness value depends on your application. The generated heatmap is a 2D histogram over the surface. The smoothness value influences how many bins are used for the histogram, thus influencing the resolution of the generated image, which is exactly this histogram. In the current implementation the maximum number of bins you can get is 1000. Getting a 4K heatmap is currently not possible (although you can of course get a histogram with 1K resolution over a 4K display). With the previous version of the surface tracker people were often confused with setting the resolution of the heatmap, so we tried to simplify it this way. Since this is a recent change we are looking for feedback on this!
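
To make the histogram idea concrete, a small sketch (not Player's actual code) of how the bin count trades smoothness against resolution:

import numpy as np

rng = np.random.default_rng(0)
gaze_x, gaze_y = rng.random(1000), rng.random(1000)  # synthetic gaze in normalized surface coordinates

bins = 200  # fewer bins -> smoother, lower-resolution heatmap; Player caps this at 1000
hist, _, _ = np.histogram2d(gaze_x, gaze_y, bins=bins, range=[[0, 1], [0, 1]])

# hist can then be color-mapped and upscaled to any display resolution
print(hist.shape)  # (200, 200)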

user-9c3078 17 July, 2019, 14:54:30

Hi! I have a question about the .pldata and .csv files. For example, after using Pupil Capture we get a file called fixations.pldata, and after using Pupil Player we get a file called fixations.csv. So they should contain the same information, since Player actually just synchronizes the data with the same video. But after loading the .pldata file, I found the data is totally different from the .csv data. Is that because of synchronization? So if I want to get the gaze position on the surface, does that mean I don't have to care about the file from Pupil Capture?

user-dae891 17 July, 2019, 17:11:18

@marc Thanks for answering! However, I am still confused about how to extract the 1K resolution heatmap image from the histogram bins. The histogram itself is a three-channel image.

user-dae891 17 July, 2019, 17:12:35

@marc If I want to extract a heatmap with a specific width and height, what should I do with the histogram bins?

user-dae891 17 July, 2019, 17:20:27

@marc Ok, now I understand

user-dae891 17 July, 2019, 17:29:34

@marc Thank you very much! It all makes sense now! So the width and height values in the surface tracker control the width-to-height ratio of the heatmap, and I can upscale the heatmap later to any resolution. Is this correct?

marc 17 July, 2019, 17:30:29

@user-dae891 That is correct!

marc 17 July, 2019, 17:30:52

(any resolution up to 1K width)

user-dae891 17 July, 2019, 18:31:15

@marc I am slightly confused about the width and height. I have set the width to 1.777 times the height. However, in the resulting heatmap the height is 1.777 times the width

user-dae891 17 July, 2019, 18:31:24

Chat image

user-dae891 17 July, 2019, 18:31:55

@marc width should be larger than height

papr 17 July, 2019, 18:32:57

@user-dae891 this might be a simple bug

wrp 17 July, 2019, 18:38:45
user-dae891 17 July, 2019, 18:39:42

@papr Right now, can I just switch the w and h?

papr 17 July, 2019, 18:40:01

I think so, yes

user-dae891 17 July, 2019, 18:40:40

Could you please @ me if it's fixed?

papr 17 July, 2019, 18:40:55

Of course

user-dae891 17 July, 2019, 18:41:08

Thanks

user-dae891 17 July, 2019, 18:46:17

@papr Additionally, I feel like the surface tracker in the new version of Player is more unstable compared with the older version

user-dae891 17 July, 2019, 18:47:02

@papr Here are some screenshots of the same scene

Chat image

user-dae891 17 July, 2019, 18:47:06

Chat image

user-dae891 17 July, 2019, 18:47:09

Chat image

user-dae891 17 July, 2019, 18:47:12

Chat image

user-dae891 17 July, 2019, 18:47:15

Chat image

user-dae891 17 July, 2019, 18:47:17

Chat image

papr 17 July, 2019, 18:48:45

This is due to how distortion is being handled but @marc might be able to explain that correctly.

user-dae891 17 July, 2019, 18:50:28

@papr @marc Would you please tell me how to fix it, or make it more stable?

user-88b704 17 July, 2019, 18:55:48

Hi, we just downloaded the new version of pupil player. When we tried to open an already offline calibrated recording, the new version does not automatically pull out the offline calibration timelines. Does that happen to anyone else?

papr 17 July, 2019, 19:14:43

@user-dae891 if I remember correctly, the surface is displayed in the undistorted camera space, instead of the distorted image space. So the visualization might be a bit misleading. @marc please correct me if I am wrong.

marc 17 July, 2019, 19:15:45

No, that is not quite correct. The visualization should actually not be different compared to the previous version.

marc 17 July, 2019, 19:17:15

The handling of camera distortion only changed how gaze is mapped onto the surfaces. The visualization we see should be computed the same way.

papr 17 July, 2019, 19:17:54

OK, then let's investigate this tomorrow. @user-dae891 could you please share the recording with data@pupil-labs.com such that we can make sure that any solution that we come up with also works for your recordings?

user-dae891 17 July, 2019, 19:19:04

@papr Ok, thank you very much! I will send you right now

papr 17 July, 2019, 19:19:29

https://github.com/pupil-labs/pupil/issues/1542 for reference

user-dae891 17 July, 2019, 19:29:12

@papr Thanks, and I am sure I have set the same min_marker size. I have sent you the file. Thanks for your help!

user-dae891 17 July, 2019, 20:59:15

@papr @marc Hi, for the width and height bug I mentioned above, I just realized it cannot be solved by switching the w and h. When I swap them, it actually gives a different heatmap size. Would you please take a look? Right now, I am unable to start my work because of this problem.

Chat image

user-dae891 17 July, 2019, 20:59:16

Chat image

user-8a8051 17 July, 2019, 23:21:21

@user-dae891 @papr @marc I am also having this issue with the surface tracking, will try to update with more info later today.

user-8a8051 18 July, 2019, 03:48:46

We have recordings and a tracked screen with a 1920x1080 width/height ratio; in Player the heatmap shows correctly overlaid on the world camera image, but the exported heatmap file shows the inverted height-to-width ratio, similar to Doris above.

user-63c8c0 18 July, 2019, 06:54:09

Is there a public demo recording directory available that can be used to test the software capabilities, or is anyone willing to share a recording with quite some pupil dynamics (larger, smaller)? I want to make sure this phenomenal platform works for my specific application before buying the hardware. Thanks in advance!

marc 18 July, 2019, 07:40:29

@user-dae891 @user-8a8051 Thanks for bringing this up, we'll look into it today!

marc 18 July, 2019, 07:41:47

@user-63c8c0 there are demo recordings available on the website: https://pupil-labs.com/products/core/tech-specs Scroll down to Sample Recording!

papr 18 July, 2019, 07:43:37

@user-88b704 Pupil Player resets to default settings when opening a new version for the first time. The default is to load pre-recorded gaze from the recording. Just switch back to Offline Calibration and your previously calculated offline gaze will come back as usual.

user-63c8c0 18 July, 2019, 08:23:38

@marc thanks for your help. However, I am interested in loading not only a processed world video, but an entire project file/directory incl. raw pupil recordings, settings, and world video. Something I could load into the Pupil Player software to try and see if the recording of the pupil size matches my expectations.

user-3341e5 18 July, 2019, 08:36:19

Hey there, who should I contact here for possible hardware problems? The front camera stopped working while both eye cameras are still working.

The problem remains despite changing to Windows/Mac/Ubuntu, and to a fresh machine too. Windows' Device Manager does not detect the front camera (Cam ID2) at all, it only shows the 2 eye cameras (Cam ID0 and Cam ID1)

papr 18 July, 2019, 08:49:40

@user-3341e5 info@pupil-labs.com

user-3341e5 18 July, 2019, 08:55:39

Thank you @papr

papr 18 July, 2019, 10:05:28

@user-63c8c0 I am uploading the recordings for our Offline Calibration tutorial: https://www.youtube.com/watch?v=_Jnxi1OMMTc&list=PLi20Yl1k_57rlznaEfrXyqiF0sUtZMMLh

I will let you know when they are uploaded.

user-63c8c0 18 July, 2019, 10:23:45

@papr That is so helpful. Thank you very much.

user-9c3078 18 July, 2019, 10:28:44

Hi! Can anyone tell me where can I find the (Re)calculate gaze distributions button after specifying surface sizes in pupil player?

marc 18 July, 2019, 11:18:25

@user-9c3078 The re-calculation now happens automatically and does not have to be triggered, thus the button is gone. Whenever you change a parameter influencing the gaze distribution it will recalculate. You should see this in effect when you have the heatmap visualization enabled: if you make a change to a parameter, the heatmap turns gray while it is recomputed with the new parameters.

user-9c3078 18 July, 2019, 11:26:38

Thank you so much!!!

marc 18 July, 2019, 11:41:45

@user-dae891 @user-8a8051 Two bugs are fixed with the following PR: https://github.com/pupil-labs/pupil/pull/1555

1) A bug that led to worse performance in the marker detection in the offline surface tracker. @user-dae891 this is why the tracking accuracy was worse for your screen surface in the new version. Either way, I would recommend adding a fourth marker to your surface to further increase the robustness of the tracking.

2) Horizontal and vertical resolution of the heatmap were switched at one point, which led to the wrong aspect ratio in the exported heatmap.

Both issues will be fixed in the next release. If you run from source, they will be fixed as soon as the above PR is merged. Thanks for bringing these up! Let me know if you have further questions!

user-9c3078 18 July, 2019, 12:39:31

Hi! I am trying to get fixation information on a surface from fixations.csv. Apart from calculating gaze duration from the gaze point information on the surface to decide whether it's a fixation or not, I want to use the img_to_surf_trans matrix, but I cannot find the 3D information of the surface even though I am using the 3D model. Can anyone tell me where I can find it, or how I can get the fixations on a surface?

marc 18 July, 2019, 13:32:24

@user-9c3078 Previously, Pupil Player generated a CSV file for every surface with fixations mapped onto the surface. This functionality was unfortunately lost when we introduced the NSLR fixation detector. We are now restoring this functionality again in the next release.

In the meantime you might be able to use the following workaround: the fixation datums in fixations.csv all have a start_timestamp. You could look up this timestamp in each of your gaze_positions_on_surface_<SurfaceName>.csv files and see if the corresponding gaze datum is inside the surface. If it is, the fixation is too (see the sketch below).
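
A minimal sketch of that lookup with pandas (column names such as start_timestamp, gaze_timestamp and on_srf are taken from the export format discussed in this thread; check your CSV headers and replace "Screen" with your surface name):

import pandas as pd

fixations = pd.read_csv("exports/000/fixations.csv").sort_values("start_timestamp")
on_surface = pd.read_csv(
    "exports/000/surfaces/gaze_positions_on_surface_Screen.csv"
).sort_values("gaze_timestamp")

# Match each fixation start to the nearest gaze datum mapped onto the surface
matched = pd.merge_asof(
    fixations, on_surface,
    left_on="start_timestamp", right_on="gaze_timestamp",
    direction="nearest",
)

# Keep fixations whose matched gaze datum falls inside the surface
fixations_on_surface = matched[matched["on_srf"]]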

user-9c3078 18 July, 2019, 14:39:53

Oh, I thought the start_timestamp was a world timestamp. I was totally wrong. 😅 Thank you so much!!!

user-dae891 18 July, 2019, 17:09:05

@marc Thank you very much!

user-dae891 18 July, 2019, 17:38:05

@marc Sorry, I am new to GitHub. I don't know how to download the software that you have modified.

papr 18 July, 2019, 17:39:13

@user-dae891 In this case we recommend to use v1.12 until the fix has been released. It is scheduled for end of next week.

user-dae891 18 July, 2019, 17:48:10

@papr Thanks for letting me know. However, the heatmap from v1.12 is very different from v1.13, right? The smoothness is different, which affects the size of the salient area. So I think I cannot mix results from the new and old versions, and the new one provides better results.

user-8a8051 18 July, 2019, 22:13:16

@marc thanks very much, looking forward to the new release

user-eaab8a 19 July, 2019, 12:20:50

greetings, I'm having a hard time with the eye tracking in an environment with only artificial light. How could I improve it?

papr 19 July, 2019, 12:55:21

@user-eaab8a Could you send a screenshot of the eye window, with the Algorithm view enabled?

user-eaab8a 19 July, 2019, 13:21:16

Sure, here it is

Chat image

papr 19 July, 2019, 13:21:48

These look pretty good to me

user-eaab8a 19 July, 2019, 13:22:27

My bad, I've sent the natural light screenshot. I'll be right back with the non-natural light one

user-eaab8a 19 July, 2019, 13:25:02

Here it is, this one is captured with non-natural light

Chat image

papr 19 July, 2019, 13:27:01

I would change two things: Adjust eye0 such that the eye ball is positioned in the center of the frame. Additionally, I would decrease the pupil min value in eye0

papr 19 July, 2019, 13:27:18

Else the pupil detection looks very good.

user-eaab8a 19 July, 2019, 13:28:08

Thanks papr, are the adjustments required in both situations? Or just in the non-natural one?

papr 19 July, 2019, 13:28:27

Both situations.

user-eaab8a 19 July, 2019, 13:29:03

Thanks, I'll try that

user-eaab8a 19 July, 2019, 13:29:19

The light might be a problem anyway?

papr 19 July, 2019, 13:29:40

No, I do not see a problem in either of both screenshots.

papr 19 July, 2019, 13:29:50

What are your difficulties specifically?

user-eaab8a 19 July, 2019, 13:29:58

I mean in general, not those 2 cases specifically

user-eaab8a 19 July, 2019, 13:30:32

My difficulty is in tracking with non-natural light, in which the tracking seems to be lost and offset way too far from the actual gaze point

papr 19 July, 2019, 13:30:53

Can you share an example recording with [email removed]

user-eaab8a 19 July, 2019, 13:31:36

To be more clear, I'm trying to track gaze on a tablet, looking for button presses and decisions over the interface

papr 19 July, 2019, 13:31:48

Ideally, you include the calibration and test procedure in the recording

user-eaab8a 19 July, 2019, 13:32:10

Sure, I'll be recording some today, I'll be glad to share an example

user-4ef728 19 July, 2019, 15:50:20

Hi, I just started working with the Pupil glasses and I'm just running a few tests before I get started on my project, and I was wondering where I can get the data on my pupil dilation. I already have my recording finished and I have exported it, but I'm not sure where to find my pupil dilation data.

papr 19 July, 2019, 15:51:26

@user-4ef728 There should be a file called "pupil_positions.csv". Check out the diameter and diameter_3d columns
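
For a quick look (a sketch; the timestamp column name has varied across Pupil versions, so check your CSV header first):

import pandas as pd

df = pd.read_csv("exports/000/pupil_positions.csv")

# diameter is measured in image pixels, diameter_3d (from the 3d model) in mm
print(df[["diameter", "diameter_3d"]].describe())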

user-4ef728 19 July, 2019, 15:52:13

Found it! Thank you very much!

user-a6cc45 20 July, 2019, 13:22:31

Hi, could you explain to me the difference between the 2D and 3D detection & mapping modes? I didn't find much information about it in the docs :/

papr 22 July, 2019, 07:46:23

@user-a6cc45 Please see these links for reference:

Pupil Detection
- 2D pupil detection: Pupil: An Open Source Platform for Pervasive Eye Tracking and Mobile Gaze-based Interaction [1]
- 3D pupil detection: A fully-automatic, temporal approach to single camera, glint-free 3D eye model fitting [2]

Gaze Mapping
- 2D gaze mapping: This method fits a two-dimensional polynomial function during calibration to map the pupil center to a normalised pixel coordinate system.
- 3D gaze mapping: This method uses bundle adjustment [3] to estimate the physical relationship between eye and world camera during calibration.

[1] https://arxiv.org/pdf/1405.0006.pdf [2] https://www.researchgate.net/profile/Lech_Swirski/publication/264658852_A_fully-automatic_temporal_approach_to_single_camera_glint-free_3D_eye_model_fitting/links/53ea3dbf0cf28f342f418dfe/A-fully-automatic-temporal-approach-to-single-camera-glint-free-3D-eye-model-fitting.pdf [3] http://ceres-solver.org/nnls_tutorial.html?highlight=bundle#bundle-adjustment
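
To make the 2D case concrete, a toy sketch of fitting such a polynomial from calibration pairs (synthetic data, not Pupil's actual calibration code):

import numpy as np

rng = np.random.default_rng(0)
pupil = rng.random((50, 2))  # normalized pupil centers collected during calibration
gaze = 0.1 + 0.8 * pupil + 0.05 * pupil**2  # synthetic "true" mapping to world coordinates

def features(p):
    x, y = p[:, 0], p[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

# Least-squares fit of the polynomial coefficients, one set per output coordinate
coeffs, *_ = np.linalg.lstsq(features(pupil), gaze, rcond=None)

# Map a new pupil position to a gaze estimate
new_pupil = np.array([[0.4, 0.6]])
print(features(new_pupil) @ coeffs)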

user-72cb32 23 July, 2019, 02:16:24

@papr Hi! It seems like both Pupil Capture and Player crash whenever I try to use the Surface Tracker (both online and offline). The error output is the following:

world - [INFO] recorder: Started Recording.
eye1 - [INFO] launchables.eye: Will save eye video to: C:\Users\SY\recordings\2019_07_23\001/
eye0 - [INFO] launchables.eye: Will save eye video to: C:\Users\SY\recordings\2019_07_23\001/
world - [INFO] camera_models: Calibration for camera world at resolution (1280, 720) saved to C:\Users\SY\recordings\2019_07_23\001/world.intrinsics
world - [INFO] recorder: No surface_definitions data found. You may want this if you do marker tracking.
world - [INFO] recorder: Saved Recording.
eye0 - [INFO] launchables.eye: Done recording.
eye1 - [INFO] launchables.eye: Done recording.
world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "launchables\world.py", line 655, in world
  File "OpenGL\error.py", line 232, in glCheckError
OpenGL.error.GLError: GLError(
  err = 1281,
  description = b'invalid value',
  baseOperation = glViewport,
  cArguments = (0, 0, 1330, 720)
)
world - [INFO] launchables.world: Process shutting down.
eye0 - [INFO] launchables.eye: Process shutting down.
eye1 - [INFO] launchables.eye: Process shutting down.

user-72cb32 23 July, 2019, 02:16:48

I've checked several posts from github, yet I could not resolve the matter. Please help!

user-72cb32 23 July, 2019, 03:21:34

And, one more question! So, if I record the world view as undistorted in Pupil Capture, when I play the same file in Pupil Player, the video is still distorted. How can I keep the undistorted video and analyze it in Pupil Player?

papr 23 July, 2019, 06:02:53

@user-72cb32 what is the resolution of your computer screen?

papr 23 July, 2019, 06:04:42

@user-72cb32 Also, Capture does not save the undistorted video. But you can export the video in Player using the iMotions Exporter in order to get the undistorted video.

user-2a994d 23 July, 2019, 09:38:39

Hi all! Is the source code of the Mobile App available?

papr 23 July, 2019, 09:39:02

@user-2a994d No, the Pupil Mobile source code is not open source.

papr 23 July, 2019, 09:39:51

But the app has a network interface that can be accessed by other apps: https://github.com/pupil-labs/pyndsi/blob/master/ndsi-commspec.md

user-2a994d 23 July, 2019, 09:52:12

@papr Thank you! Is the source code planned to be released in the future? It looks like there is some room for improvement there...

papr 23 July, 2019, 09:52:50

Currently, it is not planned to make the Android code open source.

user-2a994d 23 July, 2019, 09:55:36

@papr Oh that's a pity... my understanding was that all the code is licensed under the LGPL v3.0 (according to the "License" section of the Pupil Mobile App)

papr 23 July, 2019, 09:59:08

@user-2a994d It looks like the app ships the license file for Pupil (https://github.com/pupil-labs/pupil) . This is a bug.

user-2a994d 23 July, 2019, 10:05:06

@papr I see, thank you anyway! Please consider licensing it as open source someday...

papr 23 July, 2019, 10:05:40

@user-2a994d May I ask what changes you have in mind?

user-2a994d 23 July, 2019, 10:14:06

@papr I've just installed it... however it looks like the performance is not so good on my device (it freezes from time to time). Looking at the issues on GitHub I see there are other users with open issues, so I was wondering if there was some way to help improve it.

user-2a994d 23 July, 2019, 10:21:21

@user-64de47 I'm considering using Pupil Core for a research project, I would be more comfortable knowing that possible issues could be fixed by directly modifying the code. I think that its open source approach is one of the reasons making Pupil an awesome project!

user-eaab8a 23 July, 2019, 12:33:33

Greetings, I'm having a hard time getting the fixation_on_surfaces.csv. Do I need to enable any plugin to get it?

papr 23 July, 2019, 12:34:19

@user-eaab8a The functionality has been disabled in v1.13 by accident. Please use v1.12 until we release the fix for this problem in our next release.

user-eaab8a 23 July, 2019, 12:35:34

Thanks a lot, I'll downgrade until that

user-eaab8a 23 July, 2019, 12:50:52

I get this error though, even with the 1.12 version

Chat image

papr 23 July, 2019, 12:53:11

In this specific case it looks like the surface was not detected well. Please try reducing the min marker perimeter

user-eaab8a 23 July, 2019, 12:54:40

I've tried with no luck, it seems to see only the upper markers

papr 23 July, 2019, 12:55:51

@user-eaab8a Could you share the recording with data@pupil-labs.com so I can have a look at it?

user-bb9207 23 July, 2019, 14:15:39

Hi All - I have a new setup and can't get the video working. Can anyone point me in the right direction?

papr 23 July, 2019, 14:16:24

@user-bb9207 That depends on which operating system you are using and what the exact problem is that you are facing 🙂

user-bb9207 23 July, 2019, 14:17:38

😀 Windows 10

papr 23 July, 2019, 14:18:38

Which application are you trying to run? What is different from what you expect?

user-bb9207 23 July, 2019, 14:19:33

trying to run Pupil Capture - have calibrated pupils but can't get the video working

user-bb9207 23 July, 2019, 14:19:53

"world - [ERROR] calibration_routines.screen_marker_calibration: Calibration requiers world capture video input."

papr 23 July, 2019, 14:20:08

So you can see the eye video working but not the world video?

user-bb9207 23 July, 2019, 14:20:15

yes

papr 23 July, 2019, 14:20:51

aka. the world window shows a gray background while the eye windows don't? What hardware do you use? Do you use the headset or one of the VR add-ons?

user-bb9207 23 July, 2019, 14:21:39

Using Pupil Core - USB mount with binocular

user-bb9207 23 July, 2019, 14:22:20

Yes, the world window shows a gray background - eye windows seem to work fine

papr 23 July, 2019, 14:23:06

@user-bb9207 When you go to the uvc manager menu on the right, in the drop down menu, how many cameras are being listed? Is one being listed as unknown?

user-bb9207 23 July, 2019, 14:24:34

I have "local USb" - there are two

user-bb9207 23 July, 2019, 14:24:44

Chat image

papr 23 July, 2019, 14:25:00

If you refer to a USB mount, do you mean the headset that exposes a USB-C connector for the world cam?

user-bb9207 23 July, 2019, 14:25:31

I think you have fixed it already 😃

user-bb9207 23 July, 2019, 14:25:51

I had not selected the right option

papr 23 July, 2019, 14:26:11

👍

user-bb9207 23 July, 2019, 14:26:19

thank you so much

user-72cb32 23 July, 2019, 14:54:08

@papr Hi! It is 2195x1235! But when I used the iMotions Exporter, it did not export the gaze and surface information... it only exported the undistorted images alone, without any pupil data shown on the exported video...!

papr 23 July, 2019, 14:58:05

@user-72cb32 That is correct. You currently only have one of two options: 1. Originally recorded world video (includes distortion) + visualizations 2. Exported iMotions video (without distortion) without visualizations

papr 23 July, 2019, 14:59:45

Usually, you only need undistorted video if you want to do object detection on the video. But in these cases the visualizations would interfere with the detection algorithms. i.e. we did not come across a use case yet where undistorted video + visualizations was necessary. If you have a specific use case in mind please let us know.

user-0c583a 23 July, 2019, 21:38:33

I'm trying (on Windows 10) to get PupilDrvInst.exe working, i.e., to install the drivers. When running it (as admin or otherwise) all that happens is a DOS window pops up and shuts down a few milliseconds later. It's so fast I can't even tell if text is in the window, much less try to read it.

user-72cb32 24 July, 2019, 01:04:02

@papr Thank you!

user-09f6c7 24 July, 2019, 08:29:33

I don't know why it's happening on Pupil Capture. Can someone help me?

Chat image

papr 24 July, 2019, 08:33:23

@user-09f6c7 We do not know what is causing this issue either. The next release will contain a more detailed error message. Until then, we found that deleting the user_settings_* files in the pupil_player_settings folder helps.

user-09f6c7 24 July, 2019, 08:34:30

@papr Ok I'll try

user-bb9207 24 July, 2019, 09:29:43

I have a RealSense D415 and I can't get it to work with Pupil Mobile - do you know which cameras the mobile app supports? I am running the app on a OnePlus 6.

papr 24 July, 2019, 09:32:00

@user-bb9207 Pupil Mobile does not support any of the 3D cameras since they require custom drivers. 😕 You will have to use Pupil Capture.

user-bb9207 24 July, 2019, 09:33:16

Does it support any video at all?

papr 24 July, 2019, 09:35:06

Yes, it supports the 200Hz eye cameras and the 120Hz World camera, as well as the Logitech C930e

papr 24 July, 2019, 09:35:35

Other USB cameras might work as well, but are not officially supported

user-bb9207 24 July, 2019, 09:42:32

the Logitech seems massive

user-bb9207 24 July, 2019, 09:47:40

Do you sell the world camera?

papr 24 July, 2019, 12:53:41

Yes we do, please contact info@pupil-labs.com for details.

user-762939 24 July, 2019, 15:17:13

Hello, I have questions about billing & product!

papr 24 July, 2019, 15:17:34

Hi @user-762939 Please contact info@pupil-labs.com in this case.

user-762939 24 July, 2019, 15:17:40

Actually I bought 2 Pupil trackers by mistake

user-762939 24 July, 2019, 15:17:42

Okay

user-762939 24 July, 2019, 15:17:45

Thanks!

user-bda130 24 July, 2019, 17:56:26

Hello, I am having an issue opening Pupil Player v1.13 on macOS El Capitan (10.11.6). The application installs fine, but once installed it will not open. I have read the threads on GitHub about deleting pupil_player_settings; however, this folder is not installed with the player. I have also tried uninstalling and reinstalling the application a few times, in addition to restarting the computer after a new installation. Is there another way to solve the issue?

user-e7102b 24 July, 2019, 18:18:44

Hey @papr, our lab is purchasing a new laptop exclusively for use with our Pupil headsets. I have a couple of questions. 1) Will Pupil Capture/Player work OK on Ubuntu 18.04? 2) Is it worth purchasing a system with a high-spec graphics card (e.g. GTX 1060 or above), or will something like the GTX 1650 be good enough? My priorities for the system are lots of RAM (32 GB) and hard drive space (4 TB). I don't think the graphics card is too important, but I figured it was worth checking before we buy the system. Thanks!

papr 24 July, 2019, 18:21:20

@user-e7102b Ubuntu 18.04 should work very well. A high-spec graphics card is only necessary if you want to do fingertip detection. I think your priorities regarding RAM and disk space are correct.

papr 24 July, 2019, 18:21:34

A good CPU is worth it, too.

papr 24 July, 2019, 18:24:22

@user-bda130 what CPU do you have exactly?

user-e7102b 24 July, 2019, 18:33:27

@papr great. Thanks!

user-bda130 24 July, 2019, 18:38:16

@papr my processor is 2.66 GHz Intel Core i5

papr 24 July, 2019, 18:39:06

@user-bda130 ok, thanks. Could you share the capture.log in the pupil_capture_settings folder after attempting to start Capture?

user-bda130 24 July, 2019, 18:58:41

@papr Capture is having the same issue

user-f0d261 24 July, 2019, 20:28:40

Hello. I ordered two Pupil Core Binocular headsets and one camera is missing from each package. My university's purchasing center is supposed to get in touch with you tomorrow, but I would like to know what happened with the order for such an error to occur.

papr 24 July, 2019, 20:30:57

@user-f0d261 Please contact info@pupil-labs.com in this case.

user-f0d261 24 July, 2019, 20:31:40

@papr ok. Thank you.

user-6ec304 24 July, 2019, 23:23:38

Hey everyone - any word on when Pupil Invisible is supposed to launch? It is a platform myself and colleagues are considering using for some upcoming projects. Earlier searches yield a possible Q4 release but I'm wondering if anything more specific is known at this time. Thanks!

user-42b09b 25 July, 2019, 03:59:31

@papr We're using Pupil Core with a single eye camera for research purposes. We have tried all adjustments, but the eyes are not captured correctly. While recording, we need to adjust the headset by hand and then hold it in place to capture. It's really hard to record this way. The extenders are also not fulfilling our needs. We need to adjust the eye camera top to bottom instead of left to right or front to back. Could anyone please help us?

user-09f6c7 25 July, 2019, 09:58:49

@papr Where is the 'pupil_player_settings' folder on Windows 10? I cannot find it.

user-124ee6 25 July, 2019, 11:04:17

Hello everyone, I was wondering if there is a way to change the default settings of Pupil Capture or, better, to save and load different setting configurations?

papr 25 July, 2019, 11:05:09

@user-124ee6 currently, this is not possible.

user-bb9207 25 July, 2019, 12:43:45

Hi - my world camera is not previewing in Pupil Mobile. Does anyone have a troubleshooting guide?

papr 25 July, 2019, 12:45:29

@user-bb9207 Are you running Android 9?

user-bb9207 25 July, 2019, 12:46:02

yes

user-bb9207 25 July, 2019, 13:15:15

Solved - applied for the beta and it's working

user-80a897 25 July, 2019, 15:35:13

Recording audio is not working on Mac. I installed libav and it worked, then stopped again.

user-9c3078 25 July, 2019, 15:41:59

Hi! I've already tried to get fixations from gaze points on the surface, but I found that many fixations cannot be found in the gaze points file using start_timestamp. Can you tell me why this happens? If this cannot be fixed, does that mean I lose most of the fixation information? @marc Also, I have a question about the heatmap. There are two heatmap modes; I tried both and found they produce totally the same heatmap. Is this a bug, or am I doing something wrong? The final question is about Camera Intrinsics Estimation. If I have this plugin, does that mean the data files I get, like the gaze point positions, have already been processed and mapped to the flat surface? Sorry for so many questions 😬

user-42b09b 26 July, 2019, 10:31:40

We're using Pupil Core with a single eye camera for research purposes. We have tried all adjustments, but the eyes are not captured correctly. While recording, we need to adjust the headset by hand and then hold it in place to capture. It's really hard to record this way. The extenders are also not fulfilling our needs. We need to adjust the eye camera top to bottom instead of left to right or front to back. Could anyone please help us?

papr 26 July, 2019, 10:35:35

@user-42b09b Is the headset moving down the subject's nose, or why do you have to readjust it constantly? Also, have you tried the silicone nose pads? These might help put the headset in the right position.

user-42b09b 26 July, 2019, 10:40:12

@papr Thanks for your reply. The headset is in the right position; it sits perfectly on top of the nose. The problem is with the eye camera: the eye is not captured correctly. If we adjust the headset manually, e.g. moving it towards the top, then it captures the eyes. This makes it really hard to record. Is there a way to adjust the eye camera alone up or down?

papr 26 July, 2019, 10:51:04

@user-42b09b you can rotate them by a few degrees at the ball joint

papr 26 July, 2019, 10:52:19

Can you share a picture of the eye window please? Just to get a feeling for how much of the eye is not being recorded.

user-42b09b 26 July, 2019, 12:01:53

@papr Sure, I will send it tomorrow - it's in the lab.

user-9c3078 26 July, 2019, 18:04:48

Hi! Can anyone tell me what the world_index in gaze_positions_on_surface_<>.csv is? I thought it might be the index of the world frame, but my results start from around 300.

user-0fde83 28 July, 2019, 09:39:05

Hello! I am new to Pupil Labs and am trying to understand some basics. I have multiple recordings I want to analyze; for that I am using version 1.10.20, because the defined surfaces get lost in the newer versions. When I export the fixations_on_surface files, they appear to be different every time I do so (for the same data with the same settings). Why is that, and how can I get consistent results?

papr 29 July, 2019, 09:38:54

@user-9c3078 Apologies for the delayed response. Let's see if I can answer your questions.

papr 29 July, 2019, 09:43:46

@user-9c3078 A) fixations on surfaces - Regarding this one, I am not sure if I understand. You extract the start_timestamp of each fixation and try to find the corresponding gaze timestamp in gaze_on_surface_X.csv? Please be aware that gaze_on_surface_X.csv only includes gaze during periods in which the surface was detected. If the surface was not detected at the fixation's start_timestamp, you won't be able to find it in the mentioned csv file. In this case, I would recommend looking for gaze data with timestamps between the fixation's start_timestamp and end_timestamp.

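For illustration, a minimal sketch of this interval-based lookup, assuming pandas, the default raw-data-export layout, and that fixations.csv stores duration in milliseconds (paths and column names are assumptions and may differ from your export):

```python
import pandas as pd

# Paths and column names are assumptions based on a typical raw-data export.
fixations = pd.read_csv("exports/000/fixations.csv")
gaze = pd.read_csv("exports/000/surfaces/gaze_positions_on_surface_Surface1.csv")

for fix in fixations.itertuples():
    # duration is assumed to be in milliseconds here
    end_ts = fix.start_timestamp + fix.duration / 1000.0
    # all surface-mapped gaze samples that fall inside this fixation;
    # an empty result means the surface was not detected during the fixation
    samples = gaze[gaze["gaze_timestamp"].between(fix.start_timestamp, end_ts)]
    print(fix.id, len(samples))
```
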
papr 29 July, 2019, 09:48:11

@user-9c3078 B) Heatmap modes - As you said, there are two modes: Gaze distribution per surface (1), and gaze distribution across multiple surfaces (2). While (1) calculates its distribution (and therefore heatmap colors) independently for each surface, (2) does so by aggregating gaze over all surfaces. If you only have one surface defined, both modes are equivalent.

papr 29 July, 2019, 10:00:04

@user-9c3078 C) Camera intrinsics estimation - Capture uses a set of prerecorded intrinsics by default. Just opening the camera intrinsics estimation plugin does not change that. You will have to run the estimation procedure in order to use custom intrinsics. Capture uses the intrinsics to calculate undistorted 3d gaze. For performance reasons, the intrinsics are not used to undistort the video itself. Pre-v1.13, gaze was not correctly undistorted for surfaces. This changed with v1.13.

papr 29 July, 2019, 10:01:53

@user-9c3078 D) And in regards to your question in software-dev please have a look at the NSLR paper and how it defines pso (post-saccadic oscillations) https://www.nature.com/articles/s41598-017-17983-x

papr 29 July, 2019, 10:03:20

If you have further questions or comments, please use the assigned letters for reference, else it might be difficult to keep the topics separate.

papr 29 July, 2019, 10:06:25

@user-9c3078 E) Regarding the world_index column in gaze_positions_on_surface_X: Yes, it should be the world frame index. As I said in A), these files only include data from when the surface was actually detected. If it starts at frame 300, then it is likely that the surface was detected for the first time in frame 300.

user-9c3078 29 July, 2019, 10:07:04

Thank you so much for the reply. 👍 @papr About the fixations, I found my error with the timestamp. So I checked my info.csv and export_info.csv. My relative time is 0-00:58.408, the absolute time is 25353.502126-25411.910753, the synced time is 25353.55836332, and the duration is 00:00:59. I think there is a discrepancy between those two values? I am trying to map the fixations to the video myself, so the timestamps really matter.

user-9c3078 29 July, 2019, 10:07:41

@user-9c3078 B) Yes, about heatmaps, I defined four different surfaces and in both modes I got the same result.

papr 29 July, 2019, 10:08:16

@user-9c3078 B) did you assign sizes for your surfaces?

user-9c3078 29 July, 2019, 10:09:21

@papr B) Only surface 1 has a size, of 96x128; the others are 1x1.

papr 29 July, 2019, 10:10:26

@user-9c3078 Regarding F) timestamps - Please use world_timestamps.csv to get the timestamps for each world frame. The start time (synced) in info.csv is only important if you want to calculate the offset to start time (system), which is not needed in your case.

papr 29 July, 2019, 10:12:12

@user-9c3078 B) The difference would only be visible in the assigned colors, not in the structure of the heatmap. The differences might be very subtle depending on the gaze distribution across surfaces.

user-9c3078 29 July, 2019, 10:30:34

@papr F) Sorry, I didn't understand that well. So actually, the timestamps in the files exported from Pupil Player are no different from those in Capture; they are still the same as in their xx_timestamps.csv. If I want to map everything onto the video myself, I need to use the timestamps in each file to match them, and calculate the relative time as world_timestamp - start absolute time. Then in what case would I need to care about the synced time?

user-9c3078 29 July, 2019, 10:32:02

@papr B) But the results I got are totally the same in color. Maybe because of the resolution? I found blocks in my heatmaps. I'll try it again to confirm this.

papr 29 July, 2019, 10:33:31

@user-9c3078 F) I would highly recommend getting rid of the idea of relative timestamps, since the sensors do not start recording at the same time. You will have to use the absolute timestamps of each sensor (world, fixations, etc.) in order to map everything correctly.

papr 29 July, 2019, 10:39:03

@user-9c3078 You do not have to care about synced time unless you have other sensors that use the Unix epoch for recording absolute timestamps

user-9c3078 29 July, 2019, 10:48:18

@papr But if I need to map them onto the video myself, I have to use the relative time. Is there any other way to do that? I can use only the world_timestamp to calculate the relative time of the video, and to get the fixation or gaze point I just need another mapping between the gaze_timestamp and the world_timestamp. Also, I noticed that there are indices in the files. Do you think using the index would be more accurate, if I can find the start frame I want?

papr 29 July, 2019, 10:54:12

@user-9c3078 First you do an n-to-1 mapping between gaze data and world timestamps [1]: indices = find_closest(world_timestamps, gaze_timestamps). This gives you a mapping between gaze index and world frame index. https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/player_methods.py#L136-L150

papr 29 July, 2019, 10:55:53

Then you can step through the world video frame by frame and find the appropriate data

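For reference, a self-contained sketch of such an n-to-1 matcher built on numpy.searchsorted, modeled loosely on the linked player_methods helper (the toy timestamps are made up):

```python
import numpy as np

def find_closest(target, source):
    """For each value in source, return the index of the closest
    value in target. target must be sorted (world timestamps are)."""
    idx = np.searchsorted(target, source)
    idx = np.clip(idx, 1, len(target) - 1)
    left, right = target[idx - 1], target[idx]
    idx -= source - left < right - source  # step back if the left neighbor is closer
    return idx

world_ts = np.array([0.000, 0.033, 0.066, 0.100])  # toy world timestamps
gaze_ts = np.array([0.001, 0.030, 0.070, 0.099])   # toy gaze timestamps
print(find_closest(world_ts, gaze_ts))  # -> [0 1 2 3]
```
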
user-9c3078 29 July, 2019, 10:58:42

@papr Thank you soooooo much for answering so many questions!!! That's so nice of you! I will try the way you mentioned! ☺

papr 29 July, 2019, 10:59:52

@user-9c3078 Sure thing. 👍

user-0fde83 29 July, 2019, 14:46:14

If that is not possible, as I think is the case, how could I get these "fixations_on_surface_" files with the new version of Pupil Player? They do not seem to appear in the folder as usual. Thanks in advance 😃

papr 29 July, 2019, 14:49:04

@user-0fde83 Hey, I think you also wrote an email to info@pupil-labs.com in this regard, correct? We are still investigating the issue of the non-reproducibility. Regarding fixations_on_surface: v1.13 does not include them due to a mistake. The upcoming version, v1.14, will reintroduce this feature.

user-0fde83 29 July, 2019, 21:16:06

@papr Yes, I sent an email about the same question today. Thank you for the quick answer. Is there already a date planned for the release of v1.14?

papr 29 July, 2019, 21:16:40

@user-0fde83 I will try to put out the release this week.

user-0fde83 29 July, 2019, 22:18:51

@papr That sounds great, I'm looking forward to it. But if I understood correctly, the reproducibility problem also remains in the new version. Is there still a possibility to use the collected data, or do you have any suggestions for how we could approach quantifying visual attention in defined areas of interest using other parts of the output?

user-1b0db9 30 July, 2019, 06:05:09

Hi all, just found Pupil today and it is awesome! We're currently doing research with stand-alone Tobii devices to track interaction with ads on mobile devices, and the quality is pretty awful. Has anyone seen real-life examples of Pupil Core at work with mobile devices? Some video of the final result would be awesome! Thanks, and hope everyone has a great day )

wrp 30 July, 2019, 09:42:22

Hi @user-1b0db9, if you haven't already, you might want to see this post from a community member: https://pupil-labs.com/news/pupil-for-usability-research -- additionally, you might want to see work that Eye Square did for Facebook using Pupil Core on multiple screens (both TV screen/monitor and mobile device): https://www.facebook.com/business/news/insights/measuring-multi-screening-around-the-world (this link is unfortunately broken but may be back online later). Hope this is helpful

wrp 30 July, 2019, 09:43:28

@user-1ece1e please migrate your question to the vr-ar channel.

user-1b0db9 30 July, 2019, 09:48:57

@wrp thanks for the links! I hadn't seen it yet, but all the videos from https://www.youtube.com/channel/UCccG1cRW5dUhDUi_yhogTkg look promising )

user-80a897 30 July, 2019, 12:46:27

Hello, can anyone help me with Mac audio recording? After I installed libav and rebooted the laptop, it worked fine the first time, but Pupil Capture asks me to install libav again when I reopen it.

user-31bce0 31 July, 2019, 08:09:09

Hello everyone!! I'd like to find out if I can obtain the data measured by Pupil Capture during its calibration sequence. We need to do some calibration for our own purposes, so getting this data would allow us to kill two birds with one stone.

user-96755f 31 July, 2019, 08:37:51

Has anyone used a Pupil Labs headset with PsychoPy? Is there a proper way to send digital timestamps to Pupil Capture via parallel port, or something better? I want to know when each image is shown on the screen for: - a better correlation between pupil diameter and images - help with trimming the section I need. Currently we're using two different computers; should we change our setup?

Thank you

papr 31 July, 2019, 08:40:04

@user-31bce0 The data is stored in a notification with the subject calibration.data, which is saved in the notify.pldata file. You can use this function to read the file: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/file_methods.py#L139-L155

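As a sketch of reading that file, assuming you have the Pupil source checked out so that file_methods is importable (the paths are hypothetical):

```python
import sys

# Make Pupil's shared modules importable; adjust to your checkout location.
sys.path.append("pupil/pupil_src/shared_modules")
import file_methods as fm

# load_pldata_file returns a (data, timestamps, topics) tuple for the topic.
notifications = fm.load_pldata_file("path/to/recording", "notify")
calibration_data = [
    n for n in notifications.data if n["subject"].startswith("calibration.data")
]
print(f"found {len(calibration_data)} calibration.data notification(s)")
```
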
papr 31 July, 2019, 08:43:07

@user-96755f You can use annotations to send information about your experiment to Capture. https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py They can be sent remotely. But make sure the script uses the current Pupil time for its timestamps.

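The linked helper boils down to something like this sketch (pyzmq + msgpack against Pupil Remote's default port; the label is hypothetical, and the Annotation plugin should be enabled in Capture):

```python
import time
import msgpack
import zmq

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")  # Pupil Remote, default port

remote.send_string("t")  # ask Capture for the current Pupil time
pupil_time = float(remote.recv_string())

remote.send_string("PUB_PORT")  # port on which Capture accepts published messages
pub_port = remote.recv_string()
pub = ctx.socket(zmq.PUB)
pub.connect(f"tcp://127.0.0.1:{pub_port}")
time.sleep(0.2)  # give the connection a moment before publishing

annotation = {
    "topic": "annotation",
    "label": "image_onset",  # hypothetical event name
    "timestamp": pupil_time,  # current Pupil time, as recommended above
    "duration": 0.0,
}
pub.send_string(annotation["topic"], flags=zmq.SNDMORE)
pub.send(msgpack.dumps(annotation, use_bin_type=True))
```
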
user-96755f 31 July, 2019, 08:45:38

So, Pupil Remote should work on the computer with PsychoPy, right?

papr 31 July, 2019, 08:50:35

@user-96755f Pupil Remote is a plugin within Capture. But yes, you should be able to connect to it via Python code running in PsychoPy.

user-96755f 31 July, 2019, 08:53:44

I'll give it a try! Thank you

user-31bce0 31 July, 2019, 08:53:46

Great, thanks!

user-31bce0 31 July, 2019, 09:03:52

Another question: after you're done with the calibration, do the vertices of the green contour that appears on screen represent the 4 targets or the 4 corners of the screen (which are a little further away than the targets)?

user-b13152 31 July, 2019, 09:15:14

Hi @papr. After I updated to 1.13, the same problem still appears for the duration and timestamp. Is there an error in the settings? The model I am using is Pupil w120 e200b. I have 3 eye trackers of the same model and they give me the same results. Please help! I want to make gaze plots and AOIs. Thanks.

Chat image

wrp 31 July, 2019, 11:23:23

@user-31bce0 The green contour is the boundary of the area used for calibration. Example: if you used the screen-based calibration method and all reference markers and associated pupil data were robustly detected, then you would have a green contour around the extreme-most reference markers (which should closely correlate with the screen boundaries, provided you are in the same physical position when the contour is shown).

papr 31 July, 2019, 11:26:16

@user-b13152 could you please make a new export and open the csv file in a text editor? Maybe there is something wrong with how the csv is being interpreted.

user-b13152 31 July, 2019, 11:37:35

I have done that, but there is no change. I have also recorded new data with Pupil v1.13, but the results are the same.

papr 31 July, 2019, 11:38:41

@user-b13152 Not sure if I asked you before, but could you share the exported file with me?

user-b13152 31 July, 2019, 11:47:37

Thanks @papr, the problem is solved. When I open the CSV file in Notepad, the numbers change - but how do I convert the file from Notepad to CSV?

user-b13152 31 July, 2019, 12:01:45

Thanks @papr.. The problem is in my Excel. 😀 👍

papr 31 July, 2019, 12:12:09

@user-b13152 Yeah, that happens if software thinks it is supposed to be clever. 😄

user-c1220d 31 July, 2019, 14:53:27

Hi, sometimes when I try to drop a folder into Pupil Player it doesn't work. The cmd says this:

user-c1220d 31 July, 2019, 14:54:14

Chat image

user-c1220d 31 July, 2019, 14:54:25

May I ask what it means?

papr 31 July, 2019, 14:57:30

@user-c1220d This is a bug that has been fixed in v1.13.

user-c1220d 31 July, 2019, 15:01:37

ok thanks, great

user-c87bad 31 July, 2019, 16:31:56

Hello! I have a question about timestamps. I have a Unix epoch time from another piece of software, and I want to sync it with the video timestamps. Is it right to use the difference between Start Time (System) and Start Time (Synced)? Just add/subtract this difference?

papr 31 July, 2019, 16:33:56

@user-c87bad That is correct!

papr 31 July, 2019, 16:35:24

@user-c87bad I also created a small Player plugin that renders the recording time in unix epoch into the exported video: https://gist.github.com/papr/7d84267e9e1284b5763ac3afb1732494

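A sketch of that add/subtract conversion, with the two start times copied by hand from a recording's info file (the values below are placeholders):

```python
# Placeholder values - substitute your recording's actual info.csv entries.
start_time_system = 1564576800.123456  # "Start Time (System)" - Unix epoch
start_time_synced = 25353.502126       # "Start Time (Synced)" - Pupil time

offset = start_time_system - start_time_synced

def pupil_to_unix(pupil_ts):
    """Convert a Pupil timestamp to Unix epoch time."""
    return pupil_ts + offset

def unix_to_pupil(unix_ts):
    """Convert a Unix epoch timestamp to Pupil time."""
    return unix_ts - offset
```
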
user-c87bad 31 July, 2019, 16:44:46

Thank you so much!!!! But I am not sure how I can use this plugin - do I need to run the software from source, or?

papr 31 July, 2019, 16:45:42

You can run it either way. It is easier to just use the bundle if you do not have the source dependencies installed. Just put it in the pupil_player_settings/plugins folder and start Player. It should appear in the Plugin Manager overview.

user-c87bad 31 July, 2019, 16:48:13

Okay, I'll try that. Thanks again. 🙂

user-a7dea8 31 July, 2019, 17:05:58

Is there a way to access visual angle or vergence information in a .csv file? How does Pupil Labs calculate the z distance in the gaze position? Does it know the working distance of the fixated object?

user-94ac2a 31 July, 2019, 17:07:56

In Capture or Service, why do the two eye cameras always conflict with each other? When choosing one camera, the other one seems not to work until I've tried several times.

user-81a601 31 July, 2019, 17:09:56

Hey guys, I'm new here - I don't know if it has been asked before, but do any of the Pupil HMD add-ons have positional tracking embedded?

user-81a601 31 July, 2019, 17:10:56

I want to record a session where I get the user's head rotation and translation over a certain amount of time, and record the eye rotations as well, so that afterwards I can test this recording against any 3D scene I want

user-81a601 31 July, 2019, 17:11:02

using raycast and stuff

papr 31 July, 2019, 17:30:18

@user-a7dea8 Pupil transforms the 3d orientation and position of the 3d eye models into the scene camera space, and uses the closest point between these two "lines of sight" as gaze_point_3d

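For the curious, a plain-numpy sketch of that closest-point construction between two 3D lines (not Pupil's actual implementation; the eye origins and directions below are made up):

```python
import numpy as np

def nearest_gaze_point(p0, d0, p1, d1):
    """Midpoint of the shortest segment between two 3D lines,
    each given by an origin p and a unit direction d."""
    w = p0 - p1
    b = d0 @ d1
    d, e = d0 @ w, d1 @ w
    denom = 1.0 - b * b  # directions are unit vectors
    if np.isclose(denom, 0.0):  # lines are (nearly) parallel
        t0, t1 = 0.0, e
    else:
        t0 = (b * e - d) / denom
        t1 = (e - b * d) / denom
    return (p0 + t0 * d0 + p1 + t1 * d1) / 2

# made-up eye-model origins/directions in scene-camera coordinates
p_left, d_left = np.array([-0.03, 0.0, 0.0]), np.array([0.1, 0.0, 1.0])
p_right, d_right = np.array([0.03, 0.0, 0.0]), np.array([-0.1, 0.0, 1.0])
d_left /= np.linalg.norm(d_left)
d_right /= np.linalg.norm(d_right)
print(nearest_gaze_point(p_left, d_left, p_right, d_right))  # ~ [0, 0, 0.3]
```
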
papr 31 July, 2019, 17:35:17

@user-94ac2a That should not happen with any of the Pupil Core eye cameras. Please select "Start with default devices" in the UVC manager to automatically start the correct cameras. If they are already in use, use the "Restart with defaults" button in the general settings. Please understand that it is not possible for two separate processes to access the same camera at the same time (which you might interpret as a conflict).

papr 31 July, 2019, 17:37:42

@user-81a601 The hmd add-ons do not have their own head tracking. Use the hmd's head tracking instead. e.g. @user-8779ef has experience on how to do that. If you want to do head tracking with one of the Pupil Core headsets, then you can use our head pose estimation plugin. See this youtube tutorial on how to use it: https://www.youtube.com/watch?v=9x9h98tywFI

user-94ac2a 31 July, 2019, 17:38:03

@papr If I use custom cameras, what could be the reasons behind this? It seems that sometimes it works but sometimes not, even with "Restart with defaults" pressed.

papr 31 July, 2019, 17:39:35

@user-94ac2a Ok, then this does not surprise me. Pupil Capture tries to start cameras with known names (e.g. the Pupil Core cameras). If it does not find them, it tries to select the next available one. Do your custom cameras have the same name?

user-81a601 31 July, 2019, 17:40:26

But how can I calibrate the eye position and rotation against the head position and rotation? Is there a built-in plugin for Unity, for example, that does this job?

user-94ac2a 31 July, 2019, 17:40:37

@papr Yes, they have the same name. Maybe changing the name helps? Is there any way to change the device name?

papr 31 July, 2019, 17:41:11

@user-81a601 In this case, please move the discussion to vr-ar and ask for examples there

user-81a601 31 July, 2019, 17:41:29

okay, thanks for the guidance

papr 31 July, 2019, 17:42:22

@user-94ac2a I think the camera names are usually burnt into their firmware. You might need to modify the UVC backend, in order to automatically select your custom cameras based on their serial number.

user-94ac2a 31 July, 2019, 17:43:12

@papr thanks. Which document should I look at in order to do that?

papr 31 July, 2019, 17:43:55

1. You will have to run from source. Check the developer documentation on our website for that.

user-94ac2a 31 July, 2019, 17:44:12

Ok

papr 31 July, 2019, 17:46:29

2. This block needs to be replaced with custom code to activate the correct cameras. https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/video_capture/uvc_backend.py#L82-L98

Please be aware that this code is called in three different processes. Therefore, you will need a way to identify which process you are in.

papr 31 July, 2019, 17:46:49

Capture solves this by passing different preferred_names lists for each process.

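A rough sketch of what such a replacement could look like with pyuvc; note that the "serialNumber" key is an assumption to verify against your own uvc.device_list() output, and the serial value is hypothetical:

```python
import uvc  # pyuvc, the library behind Capture's UVC backend

WANTED_SERIAL = "0123456789"  # hypothetical serial of the camera to open

devices = uvc.device_list()
match = next(
    (d for d in devices if d.get("serialNumber") == WANTED_SERIAL), None
)
if match is None:
    raise RuntimeError(f"no attached camera with serial {WANTED_SERIAL}")
capture = uvc.Capture(match["uid"])  # open the device via its uid
```
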
user-94ac2a 31 July, 2019, 17:48:22

Cool. Let me take a look

user-07d4db 31 July, 2019, 21:30:43

Hey! Do you know which files produced by the raw data exporter I have to use when I want to measure the dwell time and the gaze samples on defined AOIs? For the gaze samples I would use one of these: • gaze_positions_on_surface_XY • surface_events • surface_gaze_distribution

user-07d4db 31 July, 2019, 21:31:34

But where can I see the dwell time on the respective surface?

papr 31 July, 2019, 21:36:37

@user-07d4db You will have to calculate that time yourself based on the timestamps. You should have enter and exit events for each surface. Just accumulate exit_ts - enter_ts over all events for each surface.

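A sketch of that accumulation with pandas, assuming the surface_events.csv export; the column names (surface_name, event_type, world_timestamp) should be verified against your export:

```python
import pandas as pd

events = pd.read_csv("exports/000/surfaces/surface_events.csv")

dwell = {}
for surface, group in events.groupby("surface_name"):
    group = group.sort_values("world_timestamp")
    total, enter_ts = 0.0, None
    for _, row in group.iterrows():
        if row["event_type"] == "enter":
            enter_ts = row["world_timestamp"]
        elif row["event_type"] == "exit" and enter_ts is not None:
            total += row["world_timestamp"] - enter_ts  # seconds on surface
            enter_ts = None
    dwell[surface] = total

print(dwell)  # accumulated dwell time per surface, in seconds
```
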
user-07d4db 31 July, 2019, 22:28:42

Thank you for your response, papr 😃 And which file should I pay attention to in order to measure the gaze samples on the respective surface?

End of July archive