core


mpk 01 October, 2019, 08:08:50

@user-f497a5 I'm pretty sure your recordings should still be supported. Can you share an info.csv file?

papr 01 October, 2019, 08:31:31

@mpk He did so in a personal message to me. There are some required keys missing. One can add them manually, but it was not necessary in his case.

mpk 01 October, 2019, 08:59:14

ok great!

user-ac350a 01 October, 2019, 15:06:39

Hi, is the Android version of Pupil open source?

papr 01 October, 2019, 15:10:33

@user-ac350a Hey, Pupil Mobile is not open source. But it has an open source network API which you can use to access the sensor data in realtime and to control the app remotely.

papr 01 October, 2019, 15:10:50

See https://github.com/pupil-labs/pyndsi/ for details.

user-b0f1b6 01 October, 2019, 15:36:43

Hi, I have followed the instructions for installing pyndsi. Installation (on Windows) seems to have completed fine, but I receive errors when I import ndsi. The errors are:

    import ndsi
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "E:\MyWorkspace\pyndsi\ndsi\__init__.py", line 24, in <module>
        from ndsi.formatter import DataFormat
      File "E:\MyWorkspace\pyndsi\ndsi\formatter.py", line 10, in <module>
        from ndsi.frame import JPEGFrame, H264Frame, FrameFactory
    ImportError: DLL load failed: The specified module could not be found.

Any advice on this?

papr 01 October, 2019, 15:43:12

@user-b0f1b6 Please make sure that the turbojpeg and ffmpeg dlls are in your windows system path. See the pupil docs for links to the dependency downloads: https://github.com/pupil-labs/pupil/blob/master/docs/dependencies-windows.md#setup-pupil_external-dependencies
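
For reference, a minimal sketch (not from the docs) of making those DLLs findable for the current Python session only, without editing the system-wide PATH; the folder below is a placeholder for wherever the turbojpeg/ffmpeg DLLs were extracted, and this relies on the Python-3.6-era behaviour of resolving DLLs via PATH on Windows:

```python
import os

# Prepend the folder that contains the turbojpeg/ffmpeg DLLs (placeholder path).
os.environ["PATH"] = r"C:\work\pupil_external" + os.pathsep + os.environ["PATH"]

import ndsi  # should now import without "DLL load failed"
```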

user-b0f1b6 01 October, 2019, 15:48:34

Hi @papr, is there any specific version of libjpeg-turbo to be used?

papr 01 October, 2019, 15:50:01

@user-b0f1b6 pyuvc uses 1.5.1, so I would recommend that one.

user-b0f1b6 01 October, 2019, 15:52:22

Thanks @papr, I have resolved my issue. Indeed it was because of the system paths for libjpeg and ffmpeg.

user-9c3078 01 October, 2019, 16:01:28

@papr Hi! I've also encountered the frame drops in v1.16-71 and found that the eye0 FPS is normal, but there are gaps in the world FPS at the same times the frames drop. Is it normal that the eye frames are okay but the world frames are not?

papr 01 October, 2019, 16:06:24

@user-9c3078 No this is not normal. Does this happen for new recordings as well? If yes, please contact info@pupil-labs.com regarding a possible hardware issue.

user-9c3078 01 October, 2019, 17:32:00

@papr Yes, it happened in several videos and I've sent the data.

papr 01 October, 2019, 17:33:35

@user-9c3078 no need to send the data. Just refer to this conversation and say that your world camera might have a loose connection

user-9c3078 01 October, 2019, 17:37:59

@papr Okay, that makes sense. But how could I fix it?

user-9c3078 01 October, 2019, 17:38:28

Or is it just because I've used it for a long time? Otherwise, do I need a new one?

papr 01 October, 2019, 17:41:52

@user-9c3078 specific next steps will be determined by our support team. 👍

user-b0f1b6 02 October, 2019, 03:21:53

Hi, I have been trying to run the following ndsi example, but encountered some error during its execution:

    C:\MyWorkspace\pyndsi\examples>python uvc-ndsi-bridge-host.py
    Traceback (most recent call last):
      File "uvc-ndsi-bridge-host.py", line 210, in <module>
        Bridge(uuid).loop()
      File "uvc-ndsi-bridge-host.py", line 39, in __init__
        self.cap = uvc.Capture(uvc_id)
      File "uvc.pyx", line 455, in uvc.Capture.__init__
      File "uvc.pyx", line 507, in uvc.Capture._init_device
    uvc.OpenError
    Exception ignored in: <bound method Bridge.__del__ of <__main__.Bridge object at 0x000001D542FAD0B8>>
    Traceback (most recent call last):
      File "uvc-ndsi-bridge-host.py", line 115, in __del__
        self.note.close()
    AttributeError: 'Bridge' object has no attribute 'note'

user-b0f1b6 02 October, 2019, 03:46:56

Realized it was a mistake on my part, I did not connect a UVC source, which caused the program to stop executing

papr 02 October, 2019, 06:10:37

@user-b0f1b6 nonetheless, this looks like a typo in the example. Could you please create an issue in the pyndsi repository?

papr 02 October, 2019, 06:18:03

@user-b0f1b6 thank you :)

user-b0f1b6 02 October, 2019, 06:26:08

Hi @papr, I encountered a similar issue with the other example:

    C:\MyWorkspace\pyndsi\examples>python ndsi-gui-client-example.py
    14:24:54 [ INFO | OpenGL.acceleratesupport] OpenGL_accelerate module loaded
    14:24:54 [ INFO | OpenGL.arrays.arraydatatype] Using accelerated ArrayDatatype
    Basic GL Setup
    NDSI Setup
    14:24:55 [ WARNING | ndsi.network ] Devices with outdated NDSI version found. Please update these devices.
    Traceback (most recent call last):
      File "ndsi-gui-client-example.py", line 400, in <module>
        runNDSIClient()
      File "ndsi-gui-client-example.py", line 369, in runNDSIClient
        clear_gl_screen()
      File "ndsi-gui-client-example.py", line 78, in clear_gl_screen
        glClearColor(.9,.9,0.0,1.)
      File "errorchecker.pyx", line 53, in OpenGL_accelerate.errorchecker._ErrorChecker.glCheckError (src\errorchecker.c:1218)
    OpenGL.error.GLError: GLError(
        err = 1281,
        description = b'invalid value',
        baseOperation = glClearColor,
        cArguments = (0.9, 0.9, 0.9, 1.0)
    )

The OpenGL error is most likely from the drawing functions of cpu_g and fps_g. glClearColor does work, but I am just getting a blank grey screen.

papr 02 October, 2019, 06:54:55

I think we will have to revisit and revise the examples :)

user-08d134 02 October, 2019, 08:11:51

Hi, I use the eye tracker for my thesis. I have one question: how is it possible to show the scan path? I read that the plugin is not available... but it is very important for me.

papr 02 October, 2019, 09:07:14

Hi @user-08d134 Yes, the scan path plugin has been disabled due to technical reasons. We have an idea for a solution which we are working on. Unfortunately, I cannot give you more information on its release yet. 😕

user-08d134 02 October, 2019, 09:25:24

Thanks @papr

user-08d134 02 October, 2019, 10:59:44

I have another question... how can I connect the eye tracker with Pupil Mobile? I have a Nexus 5X and it doesn't recognize the eye tracker.

papr 02 October, 2019, 11:35:45

@user-08d134 This can have different reasons. The first one to check is if you need to enable OTG Storage. Search your settings, and if you can find it, enable it.

user-08d134 02 October, 2019, 12:47:43

I have enabled OTG storage

papr 02 October, 2019, 12:51:10

@user-08d134 Great! Are the devices recognized now?

user-08d134 02 October, 2019, 12:54:28

no...

user-08d134 02 October, 2019, 12:56:14

the device recognizes the eye tracker, but Pupil Capture doesn't recognize Pupil Mobile

papr 02 October, 2019, 12:56:37

@user-08d134 Ah, this is a different issue then

papr 02 October, 2019, 12:57:06

This can have multiple reasons, too 😅

papr 02 October, 2019, 13:10:35

@user-08d134 I think the issue here is that your wifi network may be blocking udp transport between Pupil Mobile and Pupil Capture. Please see: https://docs.pupil-labs.com/core/software/pupil-mobile/#streaming-to-subscribers

Have you tried using a dedicated router? The router does not need to be connected to the internet, just needs to create a local wifi network.

user-d77d0f 02 October, 2019, 13:51:03

Hi! I've been following this tutorial https://github.com/pupil-labs/hmd-eyes for getting started with the Pupil VR add-on. Unfortunately, I'm stuck at the second step: I've downloaded the Pupil software but can't figure out how to "extract the pupil capture app to desktop".

papr 02 October, 2019, 13:51:34

@user-d77d0f You will need the 7z software to do the extraction https://7-zip.org/

user-d77d0f 02 October, 2019, 13:53:16

When I extract it, should it appear as an app?

user-d77d0f 02 October, 2019, 13:53:26

Because it just appears as a folder

papr 02 October, 2019, 13:53:37

@user-d77d0f No, you should see a folder, with a bunch of files in it

user-d77d0f 02 October, 2019, 13:53:44

Oh ok perfect

papr 02 October, 2019, 13:53:58

One of them is the pupil_capture.exe which you will have to execute

user-d77d0f 02 October, 2019, 13:54:15

Aah indeed

user-d77d0f 02 October, 2019, 13:54:19

Thank you!!

papr 02 October, 2019, 14:59:46

@here 📣 Pupil Software Release Update v1.16-80 📣 This release addresses deprecated recordings and an issue with creating offline calibrations.

We highly recommend downloading the latest v1.16-80 bundle: https://github.com/pupil-labs/pupil/releases/tag/v1.16

user-d26a78 02 October, 2019, 17:01:57

Hi, I am using version 1.2.1 and I enabled the OTG setting on my phone. It worked, but today I have a problem: the eye tracker is not detected on the phone (none of the three cameras). The eye tracker is detected on the computer. I use a proper USB-C to USB-C cable and did not change it between yesterday and today. I tried another cable just in case it was the cable, but that did not work either. Could you please help me find a solution?

papr 02 October, 2019, 17:54:35

For reference https://github.com/pupil-labs/pupil-mobile-app/issues/32

user-90b36a 02 October, 2019, 22:35:48

Hi, does Pupil Labs have a sales representative in the New York City area? We would like to reach out to get some information about the device and possibly set up an evaluation demo if available. Thank you!

user-6d4ea5 03 October, 2019, 08:01:46

Hi, excuse me. I have a problem with "python setup.py build" in the pupil_detectors folder under Win10. It shows:

    C:\Users\admin\pupil\pupil_src\shared_modules\pupil_detectors>python setup.py build
    running build
    running build_ext
    building 'pupil_detectors.detector_2d' extension
    C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Tools\MSVC\14.14.26428\bin\HostX86\x64\cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MD -I. -IC:\Users\admin\Anaconda3\lib\site-packages\numpy\core\include -IC:\Users\admin\opencv\build\include -IC:\Users\admin\ceres-windows\Eigen -IC:\Users\admin\ceres-windows\ceres-solver\include -IC:\Users\admin\ceres-windows\glog\src\windows -IC:\Users\admin\ceres-windows -I../../shared_cpp/include -Isingleeyefitter/ -IC:\Users\admin\Anaconda3\include -IC:\Users\admin\Anaconda3\include "-IC:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Tools\MSVC\14.14.26428\include" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\ucrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\shared" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\um" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\winrt" "-IC:\Program Files (x86)\Windows Kits\10\include\10.0.16299.0\cppwinrt" /EHsc /Tpdetector_2d.cpp /Fobuild\temp.win-amd64-3.6\Release\detector_2d.obj -D_USE_MATH_DEFINES -std=c++11 -w -O2
    detector_2d.cpp
    C:\Users\admin\ceres-windows\ceres-solver\include\ceres/internal/port.h(50): fatal error C1189: #error: One of CERES_USE_OPENMP, CERES_USE_CXX11_THREADS or CERES_NO_THREADS must be defined.

I have no idea about it. Could I get help from anybody? Thanks very much!

user-08d134 03 October, 2019, 08:29:21

Hi, I tried Pupil Mobile on a Motorola G6 Plus and it works!! The problem is that Pupil Mobile on my Nexus 5X doesn't recognize the eye tracker... What can I do?

user-54376c 03 October, 2019, 09:09:01

@papr I just upgraded my Pupil version (I was a couple of versions behind) and now the 3D mapping doesn't seem to work anymore for me. I reset all settings in the menu, but I only get "gaze.2d." messages, e.g.:

    gaze.2d.0. : {
      'topic': 'gaze.2d.0.',
      'norm_pos': [0.5165608752073741, 0.4801285240183113],
      'confidence': 0.9991866616640561,
      'timestamp': 90103.466044,
      'base_data': [{
        'topic': 'pupil.0',
        'circle_3d': {'center': [0.49586033790509365, 0.4667618615398883, 59.52160397615749], 'normal': [-0.4517110829734998, 0.3169004307812576, -0.8339851404488939], 'radius': 1.4229377740165856},
        'confidence': 0.9991866616640561,
        'timestamp': 90103.466044,
        'diameter_3d': 2.845875548033171,
        'ellipse': {'center': [165.29948006635973, 124.7691542356053], 'axes': [24.7630653250132, 29.64819193133116], 'angle': -36.03045521359836},
        'norm_pos': [0.5165608752073741, 0.4801285240183113],
        'diameter': 29.64819193133116,
        'sphere': {'center': [5.9163933335870915, -3.336043307835203, 69.52942566154422], 'radius': 12.0},
        'projected_sphere': {'center': [212.7569993844032, 90.25220859832586], 'axes': [214.01010951007933, 214.01010951007933], 'angle': 90.0},
        'model_confidence': 1.0,
        'model_id': 1,
        'model_birth_timestamp': 90097.299134,
        'theta': 1.8932560159820968,
        'phi': -2.0671904631690197,
        'method': '3d c++',
        'id': 0
      }]
    }

Chat image

user-54376c 03 October, 2019, 09:09:15

Did I forget any setting?

papr 03 October, 2019, 09:12:22

@user-54376c This is after doing a 3D calibration, correct? I ask, because without a previous calibration, the dummy gaze mapper just outputs the 2d pupil position one to one.

user-54376c 03 October, 2019, 09:57:37

Without calibration

user-54376c 03 October, 2019, 09:57:50

like before updating

user-54376c 03 October, 2019, 09:59:51

alright after a calibration it's 3d, thank you

user-54376c 03 October, 2019, 10:00:18

(working on a new 3d calibration, hence no calibration before)

user-2be752 03 October, 2019, 17:14:06

Hi there! Has anybody tried to do a calibration for an experiment with three monitors?

papr 03 October, 2019, 17:37:52

Hi @user-2be752 The calibration is usually independent of the number of monitors used in your experiments.

user-2be752 03 October, 2019, 18:31:13

Thanks papr, but if I calibrate only for one monitor, then the fixations that fall on the other two monitors (to the left and right of the calibrated one) will have very low confidence, right?

papr 03 October, 2019, 18:37:34

@user-2be752 The calibration is relative to the subject's field of view/the scene camera's field of view, not to a physical object. Unless the subject's head is fixed via a head rest, it is likely the subject will turn their heads to look at the different monitors, and therefore moving their field of view to them. So the calibration should apply in the same way as it did for the first monitor.

papr 03 October, 2019, 18:40:34

Of course, if your subjects move their gaze outside of the calibration area, e.g. if their head is fixed and they need to look out of the corner of their eye, then the gaze estimation will likely be worse.

papr 03 October, 2019, 18:41:43

In the second case, I would recommend to do a manual marker calibration that is spread over the complete field of view of the subject

user-65df3f 03 October, 2019, 19:37:19

Hi all, our lab has the Pupil Vive add-on tracker. One of the eye cameras has stopped working: eye0 starts in ghost mode, while eye1 starts properly. Are there any suggestions on how to fix this, or what the problem could be?

papr 03 October, 2019, 19:37:52

@user-65df3f please contact info@pupil-labs.com in this regard.

user-65df3f 03 October, 2019, 19:38:11

Thank you for quick response

user-65b830 03 October, 2019, 20:18:28

I built a DIY headset with the Logitech HD Webcam C615 and Microsoft LifeCam HD-6000, and am trying to operate it with the pupil capture gui. However, neither camera is recognized by Pupil's software (the world view camera feed remains in Ghost mode). Any assistance would be greatly appreciated!

papr 03 October, 2019, 20:30:18

@user-65b830 Hey, which operating system are you using?

user-2be752 03 October, 2019, 23:03:47

@papr Thanks for the info! My subjects can move their heads freely. I guess then i'll test just calibrating in one screen and then see if the confidence drops when they look at their other screens

user-54376c 04 October, 2019, 09:04:55

what's the best way to verify the accuracy/measure the error of a calibration model? wasn't there a plugin for that? 🧐

papr 04 October, 2019, 09:06:02

@user-54376c In Pupil Capture, the Accuracy Visualizer; in Pupil Player, the Offline Calibration.

user-54376c 04 October, 2019, 09:07:49

The accuracy visualizer uses the data of the calibration process, right? Isn't there a way to independently "verify" it with new gaze points afterwards? Or do I get this wrong?

papr 04 October, 2019, 09:09:45

Yes, just hit the T on the left. It uses different target points. But the calculations are the same

user-54376c 04 October, 2019, 10:25:05

thank you!

user-65b830 04 October, 2019, 14:31:35

@papr I am using Windows

papr 04 October, 2019, 14:32:40

@user-65b830 in this case follow steps 1-7 from this guide https://github.com/pupil-labs/pyuvc/blob/master/WINDOWS_USER.md

user-65b830 04 October, 2019, 14:40:04

@papr thank you!

user-65b830 04 October, 2019, 14:52:27

@papr pupil capture works with my webcam now!! Thank you for such clear instructions

user-33ed6d 04 October, 2019, 19:25:49

Hi, thanks for making pupil work available. We are trying to use pupil player to parse the pupil size and location from preexisting video. We have constructed a directory including info.player.json, eye0.mp4, and eye0_timestamps.npy. Offline pupil detection works! But we have a problem: the pupil detection crashes after 39% of the video.

user-33ed6d 04 October, 2019, 19:27:40

It is always 39% regardless of how many frames the video has (5000-15000 is what we have tried). We have tried adjusting the recording start and end in info.player.json and writing different timestamps to eye0_timestamps.npy. None of that changes the error.

user-33ed6d 04 October, 2019, 19:29:05

The exception is in file_backend.py — line 400: av_frame is not defined. Looks to me like that means self.frame_iterator is None.

user-33ed6d 04 October, 2019, 19:29:21

Before I add more details- does anything here ring any bells? Thanks!!

papr 05 October, 2019, 09:42:24

@user-33ed6d Did you make sure that the timestamp file has exactly the same number of timestamps as there are video frames in the video file?

user-33ed6d 05 October, 2019, 11:46:50

@papr What is odd is that the GUI always stops at 39% even if we make the timestamps invalid, e.g. set all the timestamps in the eye video timestamp file to zero. Besides changing that timestamp file, is there anything else you would try?

user-6d4ea5 05 October, 2019, 12:13:07

Is there any requirement that Pupil Labs has for USB webcams? My USB webcam cannot be used with the Pupil software.

user-94edb6 05 October, 2019, 13:36:59

Chiming in on Mark H's comment - yes, we generate the npy file by creating an array of 0:num_frames and then dividing by the framerate. The resulting file has the exact number of timestamps and they correspond to the frame times. Also, we are getting this warning (which has a 'This should never happen' comment in the code) on what looks like every frame:

user-94edb6 05 October, 2019, 13:37:04

Chat image

papr 05 October, 2019, 22:31:55

@user-94edb6 the video's packet pts are different from the frame pts. You will have to reencode the video with new pts.

user-94edb6 05 October, 2019, 22:43:05

gotcha, thank you for the help!

user-06ca4e 06 October, 2019, 14:56:58

Eye 0 suddenly failed with my Pupil glasses. It says video_capture.uvc_backend: Init failed

user-06ca4e 06 October, 2019, 14:57:39

Capture starts in ghost mode. I uninstalled the drivers, reinstalled them, unplugged and replugged the connectors, and tried on a different machine.

papr 06 October, 2019, 16:58:42

@user-06ca4e This is likely a hardware issue. Please contact info@pupil-labs.com in this regard.

user-14d189 06 October, 2019, 22:44:39

Hi, I would like to do some post processing on the exported eye videos, but there might be an issue with exporting the videos. (win10, PL 1.13)

user-14d189 06 October, 2019, 22:44:53

    import imageio
    import pylab

    filename = 'eye1.mp4'
    vid = imageio.get_reader(filename, 'ffmpeg')
    nums = range(90, 120)

    pylab.imshow(vid.get_data(91))

    vid.get_length()
    vid.get_meta_data()

user-14d189 06 October, 2019, 22:45:15

Output:

user-14d189 06 October, 2019, 22:45:44

{'plugin': 'ffmpeg', 'nframes': inf, 'ffmpeg_version': '4.1 built with gcc 8.2.1 (GCC) 20181017', 'codec': 'mpeg4', 'pix_fmt': 'yuv420p', 'fps': 65535.0, 'source_size': (192, 192), 'size': (192, 192), 'duration': 34.42}

user-14d189 06 October, 2019, 22:48:43

the "fps" can not be read correctly and subsequently the read out of single images does not work correctly. Can you suggest a library to edit videos?

user-b0f1b6 07 October, 2019, 06:59:19

Hi, I am using Pupil Remote to communicate with Pupil Capture. I am activating the gaze calibration through the command 'C'. The calibration can be initiated in Pupil Capture, but I would like to know if there is any way for Pupil Capture to notify me that the calibration was successful.
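
One way to get notified is to subscribe to Capture's calibration notifications over the network API. A minimal sketch, assuming the standard Pupil Remote port (50020) and the notify.calibration.successful / notify.calibration.failed notification topics:

```python
import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")  # default Pupil Remote address

# Ask Pupil Remote for the SUB port and subscribe to calibration notifications.
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.setsockopt_string(zmq.SUBSCRIBE, "notify.calibration")

# Start the calibration, then wait for its result notification.
pupil_remote.send_string("C")
print(pupil_remote.recv_string())

while True:
    topic, payload = subscriber.recv_multipart()
    notification = msgpack.loads(payload, raw=False)
    if b"calibration.successful" in topic:
        print("calibration successful", notification)
        break
    if b"calibration.failed" in topic:
        print("calibration failed", notification)
        break
```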

papr 07 October, 2019, 12:00:22

@user-94edb6 Please make also sure to delete any *_lookup.npy files, since they include a cache of the packet pts. Pupil Player will simply regenerate the files on opening the recording.

papr 07 October, 2019, 12:06:04

@user-6d4ea5 Please see the first section of this document: https://gist.github.com/papr/b258e0e944604375752eae502b4ad3d5

papr 07 October, 2019, 12:08:38

@user-14d189 The easiest way to read images from a video is via opencv: https://docs.opencv.org/2.4/modules/highgui/doc/reading_and_writing_images_and_video.html#videocapture-read
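
A minimal example of the OpenCV approach linked above (the file name is a placeholder):

```python
import cv2

cap = cv2.VideoCapture("eye1.mp4")
n = 0
while True:
    ok, frame = cap.read()  # returns False once the video is exhausted
    if not ok:
        break
    # `frame` is a BGR numpy array; process it here
    n += 1
cap.release()
print("decoded", n, "frames")
```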

user-b2ed5c 07 October, 2019, 13:49:01

Hi everyone, I have a question regarding synchronizing the pupil capture with another simultaneously running recording system. I am using the HTC Vive add-on and acquiring EEG data at the same time. I run Pupil Capture and EEG recordings on two different computers. Every minute I send a 500ms TTL pulse to the EEG system and to an LED that is in the field of view of the pupil cameras to 'synchronize' the two systems but was wondering if there is a neater way of doing it on the Pupil side. Is there a way that I can read the TTL in the Pupil Capture software to mark the correct timing of the pulse just using software? Thanks in advance!

papr 07 October, 2019, 14:19:50

@user-b0f1b6 Have you, by any chance, sent an email to info@pupil-labs.com in this regard already?

papr 07 October, 2019, 14:25:19

@user-b2ed5c you can send triggers, so called annotations in Pupil: https://github.com/pupil-labs/pupil-helpers/blob/master/python/remote_annotations.py
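
A condensed sketch along the lines of that helper script (assumptions: Pupil Remote reachable on localhost:50020, and the Annotation plugin enabled in Capture so the annotations are actually recorded):

```python
import time
import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

# Ask Pupil Remote for the PUB port and the current Pupil time.
pupil_remote.send_string("PUB_PORT")
pub_port = pupil_remote.recv_string()
pupil_remote.send_string("t")
pupil_time = float(pupil_remote.recv_string())

pub = ctx.socket(zmq.PUB)
pub.connect(f"tcp://127.0.0.1:{pub_port}")
time.sleep(1.0)  # give the PUB socket a moment to connect before sending

annotation = {
    "topic": "annotation",
    "label": "ttl_pulse",   # hypothetical label, e.g. carrying the EEG trigger value
    "timestamp": pupil_time,
    "duration": 0.5,
}
pub.send_string(annotation["topic"], flags=zmq.SNDMORE)
pub.send(msgpack.dumps(annotation, use_bin_type=True))
```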

user-94edb6 08 October, 2019, 03:23:20

Thanks, @papr. I've been trying to clear the autogenerated files between runs but I'll be more diligent about it from now on. We are now extracting the pts from mp4, converting to seconds, then writing to npy. This has fixed the crashing bug, thanks! Interestingly, we are still getting the iterator warning, and only getting offline detected data for ~2/5 of the frames in a pattern: 0 1 2, then 7 8 9, then 15 16 17, etc. It's not a big deal as we can sync the timestamps but wondering if you've seen this before. If it's relevant, the pts from the lookup file are non-monotonic while the extracted video pts are.
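
The pts-to-timestamps step described above, as a rough sketch with PyAV (file names are placeholders):

```python
import av
import numpy as np

container = av.open("eye0.mp4")
stream = container.streams.video[0]

# Presentation timestamp of every packet, converted to seconds via the stream time base.
pts = [p.pts for p in container.demux(stream) if p.pts is not None]
timestamps = np.array(pts, dtype=np.float64) * float(stream.time_base)

np.save("eye0_timestamps.npy", timestamps)
```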

user-34de89 08 October, 2019, 11:17:35

Some questions about Pupil Invisible. Can eye videos be streamed? If so, is it possible to have some eye image samples? Are there any research papers about the algorithms? Is Pupil Cloud a free service? Thank you in advance.

papr 08 October, 2019, 11:30:54

@user-34de89 Hi,

Streaming the eye videos is not officially supported. What we do support is streaming the world/scene video and the realtime gaze data.

@marc should be able to respond to the remaining questions.

marc 08 October, 2019, 12:02:33

@user-34de89 To add to what @papr said: We do not support streaming the eye videos in real time, but the eye videos can be accessed post hoc. If the ability to stream eye videos is a feature that is wanted, we might consider adding it. Unlike with Pupil Core, it is not necessary to ever look at the eye videos for most use cases, which is why we have not added this yet. Let me know if your use case would require real-time streaming.

Regarding eye image samples: If you could let me know what your use-case is via PM, I can send you an appropriate demo recording.

Research paper on the algorithms: We are currently working on a white paper to describe the algorithms and give a comprehensive summary of the performance in various conditions. I cannot announce a release date for this yet though.

Pupil Cloud: Yes, this is a free service.

user-072005 08 October, 2019, 12:28:43

Hello, I'm using the Pupil Core glasses. They got unplugged during data collection, but I'm hoping to use what we had until then. Unfortunately, the files didn't generate properly (the timestamps and world intrinsics files are missing). Is there any way to recover this data?

papr 08 October, 2019, 12:39:43

@user-072005 The missing timestamp files indicate an incomplete recording. Unplugging the device does not abort a recording in such a way in newer versions of Pupil Capture.

The incomplete recording must result from another issue, or you have been using a very old version of Capture.

Unfortunately, in most cases the actual issue is not the missing timestamp files but the video files being broken beyond repair.

To check this, please attempt to open the recorded world and eye videos using the VLC Player. If they open without an error message, there is hope for recovery.

Also, could you let us know which version of Capture you are using?

user-072005 08 October, 2019, 12:40:28

Ok, I need to download VLC on this computer. I'm using

user-072005 08 October, 2019, 12:40:30

1.15

papr 08 October, 2019, 12:40:56

That is a fairly recent version. Did Capture crash during this recording?

user-072005 08 October, 2019, 12:41:11

But the videos of the world and eye were gone and I closed the rest of the windows, which may be what did it

papr 08 October, 2019, 12:43:16

@user-072005 Yes, the preview stops when the camera disconnects. But you can reconnect the device and after some seconds the preview should work again. I would recommend you to give it a try in a test recording. 👍

user-072005 08 October, 2019, 12:43:45

Ok, so if it happens again, how do I properly end the recording?

papr 08 October, 2019, 12:45:25

Either reconnect the device and continue, or hit the "R" button as if you wanted to stop the recording normally.

Nonetheless, at least the world video and timestamps should be present if you closed the windows during the recording.

papr 08 October, 2019, 12:45:59

Can you share a list of the files that were generated for this recording?

user-072005 08 October, 2019, 12:46:12

Even if I close the one that looks like the command prompt?

user-072005 08 October, 2019, 12:46:14

Yes

user-072005 08 October, 2019, 12:46:43

Chat image

user-072005 08 October, 2019, 12:50:31

They won't open in VLC. Luckily, I think I can get this person to participate again

papr 08 October, 2019, 12:54:01

Ok, in this case, the recording cannot be restored. Sorry for that.

We will improve situations where the recording was not stopped via the "R" button.

It is indeed possible, that closing the terminal-like window caused the recording to be interrupted abnormally.

In the future, always stop the recording normally first, and shut down the application by closing the world window.

user-072005 08 October, 2019, 12:54:33

And if the world window is closed already, I can still click R and it will end properly?

papr 08 October, 2019, 12:55:24

No, the R button is only available in the world window. Once the world window is closed, the recording is being stopped or aborted.

user-072005 08 October, 2019, 12:55:47

hm...so it happened by itself

papr 08 October, 2019, 12:56:03

Also, as recommended above, make a test recording where you disconnect and reconnect the device and check how it looks in Player. You should see gray frames during the period in which the device was disconnected.

user-072005 08 October, 2019, 12:56:06

I'll play with it. Maybe I just thought that was the eye video that doesn't exist

papr 08 October, 2019, 12:57:37

@user-072005 If any of the windows disappears by itself, the program crashed. In this case it is recommended to 1. Stop the recording if possible 2. Close the world window if possible 3. Make a copy of the capture.log file and share it with us for support purposes.

papr 08 October, 2019, 12:58:13

Only afterwards, restart Capture, since restarting overwrites the log file and any information about the crash is lost.

user-072005 08 October, 2019, 12:58:50

Oh darn, I just reopened capture. Ok, I will keep that in mind for the future

papr 08 October, 2019, 13:00:41

Do you know where to find the log file?

Next time you encounter a crash, feel free to open an issue on Github with the log file. This way we can track the issue and make sure the report is not lost in the amount of chat messages here. 🙂

user-072005 08 October, 2019, 13:01:53

You've actually told me how to before, but I forgot

user-072005 08 October, 2019, 13:02:15

wait, I found it

user-072005 08 October, 2019, 13:03:47

Oh, yeah it definitely crashed somehow. Hopefully that's the only time, but I'll keep the capture.log if it does happen again

user-072005 08 October, 2019, 13:04:08

(now that I've played with disconnecting in capture)

papr 08 October, 2019, 13:07:13

@user-94edb6 Sorry if I was not clear before:

the video's packet pts are different from the frame pts. You will have to reencode the video with new pts.

The mp4 file is foremost just a container for different types of streams, e.g. video in this case.

A stream is a series of packets. Each packet has a pts. You can access packets via "demuxing", which is different from "decoding". If you "decode" a packet, you get a "frame" with the actual image/audio/etc. data. A packet can yield multiple frames when decoded. Each frame has its own pts.

When we record videos in Capture and Pupil Mobile, we write the videos such that each packet yields one frame and that the frame has the same pts as the packet.

This does not seem to be the case for your video.

We use the packet pts to build the lookup table since this is quick. When processing the video we use the pts of the decoded video frames and compare it to the lookup table to check our current location within the video.

If the assumption of packet.pts == frame.pts is broken, the lookup fails and you get the above warning.
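
A rough way to check the packet.pts == frame.pts assumption described above, using PyAV (the file name is a placeholder):

```python
import av

container = av.open("eye0.mp4")
stream = container.streams.video[0]

violations = 0
for packet in container.demux(stream):
    frames = packet.decode()  # a packet may decode to 0..n frames
    if len(frames) > 1:
        violations += 1       # more than one frame per packet breaks the assumption
    elif len(frames) == 1 and frames[0].pts != packet.pts:
        violations += 1       # frame pts differs from packet pts

print("packets violating the assumption:", violations)
```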

papr 08 October, 2019, 13:07:36

@user-072005 were you able to reproduce the crash?

user-072005 08 October, 2019, 13:08:16

No, it is keeping the world video up but pausing when I disconnect which is not what happened

user-072005 08 October, 2019, 13:09:20

I had it on a USB extender though, and it seems my USB extender broke

papr 08 October, 2019, 13:11:20

Mmh, a broken extender should not result in a crash, but I cannot guarantee it.

user-072005 08 October, 2019, 13:12:49

It seems like it would be the same as any other disconnect. Well, if it happens again I'll post it in the problems page on github. Thanks

user-b2ed5c 08 October, 2019, 14:16:19

Thanks @papr, how do I integrate/run the annotations code with Pupil Capture? Do I run it on both the EEG and Pupil Capture computers? I am very new to Python, but it looks like the annotations run via a network? Due to lab regulations the EEG acquisition computer may not be allowed to go on a network, but I will check this.

Alternatively, is there a way to read an analog-in channel from the serial port in the Pupil Capture code? I think that might be the easiest way for me.

user-a7dea8 08 October, 2019, 15:33:14

Is there any way to totally remove the vis polyline when exporting a video from Pupil Player?

papr 08 October, 2019, 15:34:13

@user-a7dea8 Yes, you will have to turn off the plugin. You can do it from the plugin's menu.

papr 08 October, 2019, 15:35:18

@user-b2ed5c Yes, the script uses the Network API of Pupil Capture. Nonetheless, the API also works without an explicit network, i.e. on the same computer or on a computer connected directly via Ethernet.

user-e4aafc 08 October, 2019, 15:35:20

Does anyone know where I can find a detailed explanation of the export data on the new homepage/GitHub?

papr 08 October, 2019, 15:36:01

@user-e4aafc Which export data specifically? pupil_positions.csv and gaze_positions.csv? Or other files?

user-e4aafc 08 October, 2019, 15:38:49

@papr ideally both. we are trying to load the data into a visualization platform and we try to understand the relation between eye_center and gaze_point

papr 08 October, 2019, 15:42:15

@user-e4aafc I have just added it to the docs. It should be online soon.

It will be available below this section: https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter

user-e4aafc 08 October, 2019, 15:45:19

@papr awesome thanks!

papr 08 October, 2019, 15:46:57

@user-b2ed5c One way to synchronize EEG and Pupil Capture is to write a script that manages the recording.

It would be responsible for starting the Capture recording, as well as for sending triggers to EEG and annotations to Pupil.

The annotations would include the value of the EEG trigger, as well as the current timestamp.

Please be aware of the different clock used by Capture: https://docs.pupil-labs.com/core/terminology/#timing

papr 08 October, 2019, 15:48:09

For clarification: The example script is run additionally to Capture.
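
A minimal sketch of such a controller script (assumptions: Pupil Remote reachable on localhost:50020; the EEG triggering itself is left out). It starts/stops the Capture recording and estimates the offset between the script's clock and Pupil time, which is the "different clock" caveat mentioned above:

```python
import time
import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

def request(cmd):
    pupil_remote.send_string(cmd)
    return pupil_remote.recv_string()

# Estimate the clock offset so local event times can be mapped into Pupil time:
# pupil_time ~= local_time + offset
t0 = time.monotonic()
pupil_time = float(request("t"))  # 't' returns the current Pupil time
t1 = time.monotonic()
offset = pupil_time - (t0 + t1) / 2.0

request("R my_session")  # start a recording ('R' optionally takes a session name)
# ... run the experiment: send TTL pulses to the EEG system and
# ... annotations to Pupil (see the remote_annotations.py example above)
request("r")             # stop the recording
```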

user-e4aafc 08 October, 2019, 16:11:41

@papr Another question just arose.. we are using the pupil with a realsense depth camera but the resulting vector length between eye_center and gaze_point can be manually adjusted (in player --> vector gaze mapper --gaze distance) (which makes the depth camera obsolete) - is there an additional output such as actual distance between eye and gaze object?

papr 08 October, 2019, 16:16:48

@user-e4aafc No, the 3d gaze mapper does not use that option. One would have to either:

  1. write a custom gaze mapper that makes use of this information (difficult, since gaze mappers do not have access to the current frame), or
  2. write a custom plugin that modifies the existing gaze data (would only impact gaze data stored during a recording, not the gaze data published in the network API), or
  3. calculate custom gaze data post hoc by combining the gaze output by Pupil Player with the recorded depth data.

user-33ed6d 08 October, 2019, 16:19:08

@papr thanks for all your advice on pts and offline pupil detect. We have a good sense of what to try.
On another note, we have gotten pupil running from source on MacOS using Conda (so we don’t have to put everything into the system-level homebrew install in /usr/local). The patch to do this is somewhat rough- are you interested in it? If so we can try to clean up a bit; if not we won’t bother

papr 08 October, 2019, 16:25:37

@user-33ed6d Sorry, we are working on a different approach. We want to reduce the required system-wide libraries instead. 👍 e.g. by shipping them as part of the python module (wheel)

user-33ed6d 08 October, 2019, 16:42:32

Sounds fine thanks.

user-df9629 08 October, 2019, 20:14:14

Hi, I am using Pupil Invisible (Beta version) and I want to read the files with extensions .raw , .bin, .time and .time_aux. Any advice on how I can read them?

papr 09 October, 2019, 05:50:57

@user-df9629 Please check out our documentation in this regard https://docs.pupil-labs.com/developer/invisible/#recording-format
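
For a quick look at the binary sensor files with numpy, a hedged sketch following the format described in the docs above and in the message from user-df9629 further down (raw sensor values are little-endian float32, .time files are little-endian uint64; file names are placeholders):

```python
import numpy as np

timestamps = np.fromfile("imu.time", dtype="<u8")  # unsigned int 64, little-endian
raw = np.fromfile("imu.raw", dtype="<f4")          # float 32, little-endian

# The IMU stream interleaves 6 values per sample; note the length mismatch
# between raw values and timestamps that is discussed further down in this channel.
imu = raw[: (len(raw) // 6) * 6].reshape(-1, 6)
```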

user-9c3078 09 October, 2019, 08:35:11

Hi guys! I keep noticing an 'error dc' message and I don't know what it means. Can anyone explain this to me?

user-9c3078 09 October, 2019, 08:54:44

Also, I tried AprilTags and processed them to remove the white border. Obviously, they cannot be detected that way. So don't make this stupid mistake like I did.

user-9c3078 09 October, 2019, 08:55:46

Another question: what is 'pip install pupil-apriltags' for? Is it used to generate AprilTags?

papr 09 October, 2019, 09:02:45

@user-9c3078 could you give us more context in which the error appears? Is it during the usage of surface tracking?

papr 09 October, 2019, 09:03:50

pip install pupil-apriltags

This is only required if you run Pupil from source instead of using the released bundle.

This package wraps the original apriltag detection code: https://github.com/AprilRobotics/apriltag
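
A small usage sketch for that package (it detects markers, it does not generate them; the image path is a placeholder):

```python
import cv2
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")  # one of the supported tag families
gray = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)

for d in detector.detect(gray):
    print(d.tag_id, d.center)  # marker id and its pixel-space center
```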

user-9c3078 09 October, 2019, 09:06:54

@papr Yes, during surface tracking in Capture; maybe it also happens when using other functions. In Player, the 'error dc' happens with libav.mjpeg, and it also reports an error with some x/y coordinates.

papr 09 October, 2019, 09:11:16

@user-9c3078 thank you, could you let us know what hardware you are using? Are you using a Pupil Core headset? If yes, does it have the 120Hz or 200Hz eye cameras?

user-9c3078 09 October, 2019, 09:15:29

@papr I am now using Pupil Core headset with 200 hz camera

papr 09 October, 2019, 09:20:24

@user-9c3078 The error message is a result of a failed attempt to decode the video frames. This is often due to the frames not being fully transmitted. Are you using a USB cable extender by any chance? And are you using the cable that came with the Pupil Core headset?

user-9c3078 09 October, 2019, 09:21:16

@user-9c3078 I am using the cable with the pupil core headset

papr 09 October, 2019, 09:21:53

@user-9c3078 are you getting the error message very often or only once or twice?

user-9c3078 09 October, 2019, 09:35:48

@papr Yeah, I get it very often during recordings, and it happened several times before in September.

papr 09 October, 2019, 09:39:58

@user-9c3078 Can you try a different USB port? Do you have a dedicated USB 3 port that you could use?

user-9c3078 09 October, 2019, 10:03:33

@papr I tried a different port and there is no 'error dc' now, but there is a worse frame drop of around two seconds, which happened at the same point in time in two recordings. Is that normal? Also, I found 'video_capture.uvc_backend: Received frame with invalid timestamp. This can happen after a disconnect. Frame will be dropped!' in the terminal; can you explain what this is? So... I guess there is something loose between my cable and the port which results in the frame drops?

user-9c3078 09 October, 2019, 10:06:48

@papr BTW, I couldn't find how to define the surface rectangle as in the previous version with the square markers. How do AprilTags define the surface? Is it still meaningful for me to define the size (width and height)?

papr 09 October, 2019, 10:08:26

@user-9c3078

I guess there is something loose between my cable and the port which results in the frame drops?

Yes, this seems to be an issue with your headset. Please contact [email removed] in this regard and let them know which steps we have taken towards debugging the issue (using the original cable, different ports, resulting disconnects).

papr 09 October, 2019, 10:09:31

@user-9c3078 The surface definition works in the same way as with the legacy markers. A surface is independent of the markers it is using

user-9c3078 09 October, 2019, 10:11:47

@papr So how can I resize the surface with apriltags? I cannot edit the marker as before.

user-deafd0 09 October, 2019, 10:15:58

Hi, can I ask what is meant by the start frame and end frame in the fixation.csv? @papr

papr 09 October, 2019, 10:18:06

@user-9c3078 I do not understand. Given that you have defined a surface, do you see the two red dots at the top right of the surface? One saying "edit surface"?

papr 09 October, 2019, 10:19:15

@user-deafd0 These are the indices of the world frame during which the fixation started and the world frame during which the fixation ended

user-9c3078 09 October, 2019, 10:20:32

@papr No, I didn't see anything except the marker

papr 09 October, 2019, 10:21:18

@user-9c3078 Is the marker being detected? It should have a green overlay

papr 09 October, 2019, 10:21:31

You are using apriltag markers correct?

user-9c3078 09 October, 2019, 10:21:48

@papr Yeah, I can see the green overlay

user-9c3078 09 October, 2019, 10:22:09

Wait a minute, I can show you the screenshot

papr 09 October, 2019, 10:25:22

@user-9c3078 Maybe you just need to add an other surface definition? Just hit the "A" on the left side. A screenshot should be helpful. yes

user-9c3078 09 October, 2019, 10:26:22

Let me try add a new surface.

papr 09 October, 2019, 10:26:45

I see that there is a surface definition, but it might be an older one which is based on different markers.

papr 09 October, 2019, 10:27:22

I should say that a surface is either defined on the legacy markers or apriltag markers, not both at the same time.

user-9c3078 09 October, 2019, 10:28:37

@papr okay, I got it. It worked when defining a new surface. Thank you !!!

user-9c3078 09 October, 2019, 11:38:53

Hi! Is there any method that I can use to perfectly define my surface? For now, I know that even though we see the distorted image on the screen, the data we get are not distorted. But when we define the surface, it is actually defined on the distorted image, right?

user-df9629 09 October, 2019, 12:02:06

Thank you @papr !!

user-9c3078 09 October, 2019, 14:09:13

Hi! I just downloaded the newest version of Pupil Capture and found that my AprilTags, which could be detected before, cannot be detected now. Can anyone help me?

papr 09 October, 2019, 14:11:10

@user-9c3078 Which version were you using before?

papr 09 October, 2019, 14:12:06

Also, which operating system are you using? You deleted your screenshot from earlier, didn't you?

user-9c3078 09 October, 2019, 14:22:15

@papr Yeah, I deleted that. I used v1.16-71 before; now I am using v1.16-80.

user-9c3078 09 October, 2019, 14:24:04

Also, I found a weird thing: my world FPS and eye FPS look fine, but there are still some frame drops, which I think are because of the CPU or something? BTW, I use Linux.

papr 09 October, 2019, 14:25:20

@user-9c3078 Are you still seeing

video_capture.uvc_backend: Received frame with invalid timestamp. This can happen after a disconnect. Frame will be dropped!'

This means that there were frame drops due to a connection issue. Are you using a different headset by now?

papr 09 October, 2019, 14:25:48

But yes, frame drops may happen due to missing cpu resources

papr 09 October, 2019, 14:26:57

@user-9c3078 Re apriltag detection: You started the surface tracker again (it will be disabled after the update) and the green overlay does not show anymore?

user-9c3078 09 October, 2019, 14:30:47

@papr Yeah, I still get that, and it happened during my recording, which is really bad. And I also think the same problem happened again: in a 20-minute recording, for the last 5 minutes you can't get any gaze information even though it's not in ghost mode.

papr 09 October, 2019, 14:34:38

@user-9c3078 Did you contact info@pupil-labs.com already?

user-9c3078 09 October, 2019, 14:40:33

@papr No I haven't, do you need that?

user-9c3078 09 October, 2019, 14:45:35

And this is what it looks like

Chat image

user-9c3078 09 October, 2019, 14:47:04

But the first time I opened it via Player, there was a line for the world FPS with the same length as the pupil one.

papr 09 October, 2019, 14:48:15

@user-9c3078 There is a hardware issue with your headset that probably needs to be fixed. Otherwise you will keep getting disconnects. Please contact info@pupil-labs.com in this regard.

user-9c3078 09 October, 2019, 14:51:23

@papr okay, thank you so much!

user-9c3078 09 October, 2019, 14:52:00

BTW, is the order ID the same as the order number on the box? Cuz I guess I need that for the remote debugging.

papr 09 October, 2019, 14:53:39

Yes, that should help. You can find it on the shipping label. Alternatively, you can let us know the email address/name which was used to purchase the device. Please put the information into your mail instead of posting it here.

user-9c3078 09 October, 2019, 14:53:48

And my terminal keeps displaying 'DEBUG:av.buffered_decoder:frames to buffer = 1'. What does that mean?

papr 09 October, 2019, 14:55:05

It is a debug message for a Player specific implementation detail. It basically tells you how many new frames are being buffered during playback.

user-9c3078 09 October, 2019, 14:55:32

Cool! Thank you soooo much for helping!

user-14d189 10 October, 2019, 08:07:00

Hi, I was just thinking that everything in monochromatic light looks gray, doesn't it? And recording the eyes as a gray movie might bring the resource requirements down, mightn't it? On the other hand, compressed mp4 ...

papr 10 October, 2019, 08:58:48

@user-14d189 The cameras provide the data as an MJPEG stream. JPEG uses YUV as its color space. The Y (luma) channel is basically the gray image, making it inexpensive to extract.
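
A tiny illustration of that: one MJPEG frame can be decoded straight to a single-channel gray image, e.g. with OpenCV (the file stands in for one JPEG frame pulled from the camera stream):

```python
import cv2
import numpy as np

with open("frame.jpg", "rb") as f:  # placeholder: one frame from the MJPEG stream
    jpeg_bytes = f.read()

gray = cv2.imdecode(np.frombuffer(jpeg_bytes, dtype=np.uint8), cv2.IMREAD_GRAYSCALE)
print(gray.shape, gray.dtype)
```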

user-df9629 10 October, 2019, 15:53:18

Hi, I am using the Pupil Invisible (beta version) and I noticed a mismatch between the number of timestamps for the IMU [say A timestamps] and the number of IMU raw data values [say B raw values] after being segregated into their components [i.e., B/6 != A]. I unpacked the raw file as float32 little-endian and the timestamps file as unsigned int64 little-endian. I have observed this in multiple data recordings. Is there something I am missing?

papr 10 October, 2019, 15:54:34

Hey @user-df9629, can you check if there is a specific ratio between the number of B and the number of A? And if it is consistent between the different recordings?

user-df9629 10 October, 2019, 15:55:09

I will check it right away

papr 10 October, 2019, 15:56:31

@user-df9629 My guess is that it is around 6 * 80 * len(A) = len(B)

user-df9629 10 October, 2019, 16:07:00

@papr , The ratio of len(B) / len(A) is between 5.5 and 6. It is inconsistent between different recordings.

papr 10 October, 2019, 16:10:35

@user-df9629 oh ok, thank you for reporting the issue. I will forward it to the responsible team member 👍

user-df9629 10 October, 2019, 16:38:50

@papr , you're welcome! 👍

user-fd5a69 10 October, 2019, 20:03:22

Hi, I'm trying to use pupil mobile for recording audio. I enabled the audio capture plugin but it did not detect any audio source. Is it even possible to use pupil mobile for recording audio?

papr 11 October, 2019, 06:59:56

@user-fd5a69 hey, did you mean Pupil Capture by any chance? Pupil Mobile records the phone's built-in microphone by default.

user-516564 11 October, 2019, 09:42:33

Hello all,

I posted this on the research channel, but got no response, so I'll ask here too. I am currently involved in research that needs eye-tracking data from videos where the person wearing the tracker is moving (walking, running, driving, playing some sport, etc). Do you know of some recording repository that might help me?

Thanks!

user-516564 11 October, 2019, 09:43:30

They could be videos not necessarily recorded with pupil, but any other eye-tracking tech

papr 11 October, 2019, 09:44:26

Hey @user-516564 I am not aware of such a repository but you might get lucky by checking the papers that cite Pupil: https://docs.google.com/spreadsheets/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?ts=576a3b27#gid=0

user-516564 11 October, 2019, 10:00:29

Thanks @papr, I will check that

user-c5fb8b 11 October, 2019, 11:51:23

@user-fd5a69 In case you are trying to stream from Pupil Mobile into Capture: streaming audio is not supported. The only way to record audio with Pupil Mobile is to make a recording on the phone and then import this into Pupil Capture afterwards. Or you only stream video from Pupil Mobile and record audio via some other audio input device connected to your machine.

user-fd5a69 11 October, 2019, 12:04:22

@user-c5fb8b I see. Thanks for the info!

papr 11 October, 2019, 12:04:56

@user-fd5a69 I am curious. Were you indeed trying to stream the audio?

user-fd5a69 11 October, 2019, 12:11:39

No, I want to record audio from pupil mobile into pupil capture.

papr 11 October, 2019, 12:17:24

@user-fd5a69 Maybe to slightly correct @user-c5fb8b's comment:

The only way to record audio with Pupil Mobile is to make a recording on the phone and then import this into Pupil Capture afterwards.

He meant: then import this into Pupil Player afterwards.

The usual workflow is as follows:

  1. Make recording on phone using Pupil Mobile
  2. Copy recording from phone to computer
  3. Open recording in Player

Alternatively:

  1. Make recording on Computer using Pupil Capture
  2. Open recording in Player

The possibility to stream video from Pupil Mobile to Pupil Capture should only be used for monitoring. not for making a recording. Yes, it is technically possible, but not recommended because there might be frame drops if the connection quality is bad.

Protip: You can use the Remote Recorder plugin to remotely start and stop recordings on the phones, using Pupil Capture.

user-fd5a69 11 October, 2019, 12:26:24

I'm currently using the second way for recording the world and eye videos. The eye trackers are connected to phones with Pupil Mobile installed, and I use Pupil Capture on a local computer to start the recording (Pupil Remote) and store the recording in a recording folder on the computer. But the recorded video does not have any sound, and I couldn't find any audio file in the recording folder. So I'm wondering whether I can record audio using Pupil Capture.

papr 11 October, 2019, 12:29:17

@user-fd5a69 Maybe to clarify:

  1. Make recording on Computer using Pupil Capture
  2. Open recording in Player

This workflow expects the Pupil Core headset to be connected directly to the computer running Pupil Capture, not to the phones running Pupil Mobile.

papr 11 October, 2019, 12:31:10

Could you let me know if you are using the "Local USB" or the "Pupil Mobile" manager in Pupil Capture? This would help me to better understand your setup 🙂

user-fd5a69 11 October, 2019, 12:32:33

Sorry I wasn't clear. I am using pupil mobile.

papr 11 October, 2019, 12:36:32

@user-fd5a69 Ok, thank you for the clarification :)

The possibility to stream video from Pupil Mobile to Pupil Capture should only be used for monitoring. not for making a recording.

Then, if I understood correctly, you are doing this 👆 I just want to emphasise again, that this is not recommended due to the possibility of data loss during the recording.

papr 11 October, 2019, 12:38:28

My recommendation would be to make the recording on the phone directly. It records the phone's audio by default. You can start the recording remotely, by using the "Remote Recorder" plugin. https://docs.pupil-labs.com/core/software/pupil-capture/#network-plugins

user-fd5a69 11 October, 2019, 12:43:16

Ok I see. I'll try make recording on the phone and see if that solves my problem. Thank you for your help! @papr

papr 11 October, 2019, 12:43:51

@user-fd5a69 Sure thing! Let us know how it went 🙂

user-a82c20 11 October, 2019, 15:07:15

Hi, pupil_player is crashing because it cannot find eye0_lookup.npy. Looking in the recording from pupil_capture, eye0_lookup.npy is indeed missing. How do I get pupil_capture to create eye0_lookup.npy?

papr 11 October, 2019, 15:54:00

@user-a82c20 is there a eye0_timestamps.npy file?

user-7ba0fb 12 October, 2019, 09:10:20

Hi, I exported raw data from Pupil Player (using Pupil Core), but I couldn't fully understand all the items in those .csv files. I have looked through the user guides. Where can I get specific explanations of those items?

user-0e30fc 12 October, 2019, 19:16:43

Hello

papr 13 October, 2019, 08:57:25

@user-7ba0fb This is the relevant section in the docs: https://docs.pupil-labs.com/core/software/pupil-player/#raw-data-exporter

Additionally, I recommend to have a look at the terminology section: https://docs.pupil-labs.com/core/terminology/#terminology

papr 13 October, 2019, 08:57:54

Hey @user-0e30fc 👋 welcome to the channel.

user-14d189 13 October, 2019, 23:35:37

Hi, is Pupil Mobile compatible with the OnePlus 5? The OnePlus 3? Does anyone have experience with those devices? Do you have any recommendations?

wrp 14 October, 2019, 06:55:51

Yes @user-14d189, Pupil Mobile works on the OnePlus 5. Ensure OTG is on and the app is locked. Please note that the OTG setting and app lock only apply to OnePlus (not other devices AFAIK).

papr 14 October, 2019, 06:58:55

@user-14d189 I can confirm that it works on a One+ 3 as well.

user-14d189 14 October, 2019, 07:49:40

@wrp @papr Thanks, I'll pass that on.

user-5543ca 14 October, 2019, 11:45:05

Hello, I have a question regarding the timestamps -- I'm trying to sync the data from Pupil with an Empatica E4. When I looked at the timestamps in the Pupil data, they look like this:

Chat image

user-5543ca 14 October, 2019, 11:47:53

However, the timestamps from this other device (Empatica E4) look like "1570621762", which is the start_time as a Unix timestamp, i.e. seconds since 1970-01-01 UTC.

I noticed in the pupil_info file that you also show the start time (System) in a comparable format (Unix timestamp).

Chat image

user-5543ca 14 October, 2019, 11:49:20

Now, my question is how I can convert the time in "pupil_positions" to a Unix timestamp, i.e. 1570624738 and so on, for the whole "pupil_positions" export file.

user-5543ca 14 October, 2019, 11:49:30

Thanks a lot in advance! 🙂
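
A sketch of the usual conversion (assuming the "Start Time (System)" and "Start Time (Synced)" values from the recording's info file; the numbers below are placeholders): Pupil timestamps share their clock with the synced start time, so shifting them by the difference to the system start time yields Unix time.

```python
# Placeholders - read these two values from the recording's info file.
start_time_system = 1570621762.0  # "Start Time (System)": Unix epoch seconds at recording start
start_time_synced = 2305.3        # "Start Time (Synced)": Pupil clock at recording start

offset = start_time_system - start_time_synced

pupil_timestamp = 2310.7          # any timestamp from the pupil_positions export (placeholder)
unix_timestamp = pupil_timestamp + offset
print(unix_timestamp)
```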

user-df9629 14 October, 2019, 14:06:13

Hi, I am using Pupil Invisible and I am trying to collect data using Pupil Invisible Monitor. The gaze prediction on the monitor isn't the same as the preview on the phone; the centers of the two red circles are different. I am adding the image for reference. Also, hitting record on the Monitor doesn't record anything.

Chat image

papr 14 October, 2019, 14:09:13

@user-df9629 Hey, Pupil Invisible Monitor is not meant for data collection. It only displays the streamed data + applies an optional manual offset. You can apply the offset by focusing on an object in the real world and clicking on it in the preview. The app will calculate the offset between the predicted gaze and the clicked location. This offset is applied to all further gaze until a new offset is set, the offset is reset, or the app is closed. The "R" in the top left resets the manual offset.

papr 14 October, 2019, 14:10:28

Please be aware: Currently, the offset is only applied in Pupil Invisible Monitor, not in the app itself. If you want to make a recording with Pupil Invisible you have to start the recording in the Companion app.

user-df9629 14 October, 2019, 14:18:23

@papr , thank you for clarifying the functionality. I thought the "R" was for record. My bad. One more thing, I noticed the reset button on the companion app but couldn't figure out how to add the offset. Any advice on that?

user-a08c80 14 October, 2019, 21:10:30

The Pupil Player grey window does not seem to be working on Windows. When the Pupil Player icon is double-clicked to start, a black window appears; a grey window with the drag-recording-here prompt is shown in the taskbar preview when the Pupil icon is hovered over, but it is not accessible at all. Dragging a recording onto the taskbar icon or the desktop icon does not resolve this issue.

user-abc667 14 October, 2019, 21:43:17

Worked extensively with what's now called Core last winter and spring, and am now coming back to continue the research. Need to introduce a new batch of students to the system, data formats, etc. Followed the current documentation to this link: https://docs.pupil-labs.com/user-guide/data-format and when I tried to follow it, got a 404. Huh??! Where is the description of data formats? Thanks!

wrp 15 October, 2019, 06:39:30

Hi @user-abc667 We recently updated the docs and all the redirects are not yet in place. Apologies for the inconvenience.

The links I believe you are looking for are:

- Recording format: a list of the expected files in each recording: https://docs.pupil-labs.com/core/software/recording-format/
- Pupil data format: https://docs.pupil-labs.com/developer/core/overview/#pupil-datum-format
- Gaze data format: https://docs.pupil-labs.com/developer/core/overview/#gaze-datum-format

papr 15 October, 2019, 07:06:22

@user-a08c80 Hey, could you please try to 1. shut down Player, 2. delete the user_settings_* files in the pupil_player_settings folder (you can find it in your home directory), and 3. restart Player?

wrp 15 October, 2019, 07:50:26

@user-df9629 we will respond to your questions in invisible channel.

user-09f6c7 15 October, 2019, 08:22:52

We have a problem with Pupil Core; could you check this screenshot?

Chat image

papr 15 October, 2019, 08:23:09

Hey @user-09f6c7

user-09f6c7 15 October, 2019, 08:23:32

hello papr~ we got this messages..

papr 15 October, 2019, 08:23:52

Please try installing vc_redist.x64.exe from https://support.microsoft.com/en-ca/help/2977003/the-latest-supported-visual-c-downloads

papr 15 October, 2019, 08:24:03

Afterwards try launching the application again

user-09f6c7 15 October, 2019, 08:24:20

ok I'll try immediately, thanks!

user-df9629 15 October, 2019, 11:32:01

I look forward to it @wrp . Thank you!

papr 15 October, 2019, 11:34:24

@user-09f6c7 were you successful?

user-9c3078 15 October, 2019, 14:54:18

Hi! I am running pupil_capture on a new machine and there is an error:

    File "shared_modules/pupil_detectors/__init__.py", line 21, in <module>
    ImportError: /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.22' not found (required by /opt/pupil_capture/pupil_detectors/detector_3d.cpython-36m-x86_64-linux-gnu.so)

However, I've checked the version, which is the newest: 'libstdc++6 is already the newest version (5.4.0-6ubuntu1~16.04.11)'. What should I do about that?

papr 15 October, 2019, 14:56:30

@user-9c3078 I do not think libstdc++6 is the issue here but GLIBCXX_3.4.22... Do I see it correctly, that the new system is Ubuntu 16.04?

papr 15 October, 2019, 14:57:17

Which version of Capture have you installed?

user-9c3078 15 October, 2019, 14:58:43

@papr Yeah I am using Ubuntu 16.04.6 LTS and capture version is v1.16-80

user-9c3078 15 October, 2019, 15:01:29

Is that a software or a hardware issue?

papr 15 October, 2019, 15:08:53

@user-9c3078 Definitively a software issue. I am looking into it.

papr 15 October, 2019, 15:14:03

@user-9c3078

You wrote: "I've checked the version which is the newest: 'libstdc++6 is already the newest version (5.4.0-6ubuntu1~16.04.11).'" How did you check that?

user-9c3078 15 October, 2019, 15:15:43

@papr I went into the files under /usr/lib/x86_64-linux-gnu/ and I also tried the upgrade command

papr 15 October, 2019, 15:16:41

Could you please clarify what you meant by "the upgrade command"?

Have you run this already?

sudo apt-get install libstdc++6

user-9c3078 15 October, 2019, 15:18:50

@papr yeah, that's the command I use

user-9c3078 15 October, 2019, 15:22:22

And my gcc version is 5.4

user-9c3078 15 October, 2019, 15:44:25

Yes I have run it and I get: libstdc++6 is already the newest version (5.4.0-6ubuntu1~16.04.11)

user-90eebd 15 October, 2019, 15:51:11

Hi, we are looking into pupil diameter changes in response to certain light stimuli. Do you have parameters that allow us to look at horizontal and vertical diameters? Is it possible to calibrate the pupillometer to mm as opposed to pixels? Do you have a file with the technical specs of the hardware?

papr 15 October, 2019, 15:59:24

hey @user-90eebd 👋 The Pupil detection algorithm assumes the pupil to be a circle. Therefore, we only report one diameter. If you use 3d detection, you can get the diameter in mm (diameter_3d column in the pupil_positions.csv file exported by Pupil Player). I would recommend freezing the 3d eye model once it is well fitted, to avoid a noisy diameter signal due to model changes.
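
Here is a minimal sketch (not an official Pupil Labs script) of how one could inspect the exported 3d diameter with pandas. The column names (timestamp, id, confidence, diameter_3d) match this Pupil version's export but may differ in other releases:

```python
import pandas as pd
import matplotlib.pyplot as plt

# assumed path inside a Pupil Player export folder
df = pd.read_csv("exports/000/pupil_positions.csv")

# keep one eye and drop low-confidence samples (0.8 is an arbitrary threshold)
eye0 = df[(df["id"] == 0) & (df["confidence"] > 0.8)]

plt.plot(eye0["timestamp"], eye0["diameter_3d"])
plt.xlabel("pupil time [s]")
plt.ylabel("pupil diameter [mm]")
plt.show()
```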

papr 15 October, 2019, 15:59:40

@user-90eebd These are the tech specs: https://pupil-labs.com/products/core/tech-specs

papr 15 October, 2019, 16:01:11

@user-9c3078 can you try installing gcc 6? https://askubuntu.com/a/746480

user-90eebd 15 October, 2019, 16:07:20

hello PAPR,

user-90eebd 15 October, 2019, 16:09:29

hello papr, we found the diameter_3d column (N). The numbers are large (69-88) and alternate in sign (- vs +). It is hard to imagine pupils are 69 to 88 mm; is there a conversion formula? Does negative vs positive indicate the right vs left pupil? Do you know the resolution of pupil diameter change that is detectable?

papr 15 October, 2019, 16:13:58

@user-90eebd Please see the id column. It indicates which eye the diameter belongs to. 70-90 mm is definitively not correct, and most likely results from a not-well-fitted eye model.

user-9c3078 15 October, 2019, 16:15:30

@papr There is a new error: AttributeError: /opt/pupil_capture/libglfw.so: undefined symbol: glfwGetError

papr 15 October, 2019, 16:18:05

@user-9c3078 ok, thank you, that is valuable feedback! Can you try deleting the user_settings_* files in the pupil_capture_settings folder if there are any?

user-9c3078 15 October, 2019, 16:19:33

@papr And run the capture?

papr 15 October, 2019, 16:19:42

Afterwards, yes

user-9c3078 15 October, 2019, 16:19:55

@papr Same problem happens

papr 15 October, 2019, 16:20:35

Ok, can you try an older version, e.g. v1.13, as a temporary workaround? We will try to fix this issue with the next release.

user-9c3078 15 October, 2019, 16:24:48

The older version won't have the same problem? Why?

papr 15 October, 2019, 18:21:48

@user-9c3078 Two reasons:
1. We had to increase the gcc version for the apriltag detectors. We will have to check why this change is not compatible with Ubuntu 16.04.
2. The glfw error function call has been added to our wrapper recently. I guess we did not properly update the bundled glfw.

user-00cf0f 15 October, 2019, 19:25:44

I am trying to use the LSL Relay plugin, however it is not showing up in Pupil Capture. Therefore, I went into the capture.log in pupil_capture_settings to see what the problem was. This was the error message after "scanning pupil_lsl_relay.py" and "scanning pylsl": Failed to load 'pylsl'. Reason: 'dlsym(0x7fcebdd3b880, lsl_library_info): symbol not found'. Does anyone know how to fix this?

user-09f6c7 16 October, 2019, 00:45:43

@papr we installed the VC++ Redistributable you recommended, so the log messages changed, but there are still error messages. We are using the Pupil Core on a LattePanda Alpha (Windows 10 Pro) board.

Chat image

papr 16 October, 2019, 09:26:19

@user-09f6c7 This is the same error as @user-9c3078 has encountered. We will try to fix it in the next release. Until then, please try an older version of Pupil: https://github.com/pupil-labs/pupil/releases/tag/v1.13

user-7f1687 16 October, 2019, 12:36:56

I own a Pupil Core and wanted to know if I can bypass the Android app and connect the eye tracker directly to a smartphone with Windows 10.

user-c5fb8b 16 October, 2019, 12:50:59

Hi @user-7f1687 can you clarify what specific setup or use-case you mean?

Regarding "connect the eye tracker directly to a smartphone with Windows 10": do you mean running Pupil Capture on a non-Android smartphone running Windows 10 Mobile edition? This will most certainly not be possible.

user-7f1687 16 October, 2019, 12:53:31

Thanks, I meant running pupil capture on a windows 10 mobile edition. I've just realised that pupil doesn't work on 32-bit machines. Sorry 😉

user-7f1687 16 October, 2019, 12:56:33

Do you have experience running Pupil Capture on more capable tablets, such as the Microsoft Surface Go or others? Do you have any suggestions?

user-c5fb8b 16 October, 2019, 13:02:33

@user-7f1687 We've been able to run Pupil Capture with no problems on a Surface Pro. I can't say anything about the Surface Go though; I am not sure if it will provide enough resources.

user-7f1687 16 October, 2019, 13:08:49

@user-c5fb8b thanks a lot for your help. In the meantime, if others of you have experience with tablets, let me know. Thanks!

user-cccded 16 October, 2019, 14:23:45

Hello!

I have downloaded Pupil Capture, Player, and Service from GitHub, and I am trying to run them. Can I test with laptop cameras, or only with the Pupil Labs device?

user-c5fb8b 16 October, 2019, 15:16:49

@user-cccded You can use pupil with any usb camera that supports the uvc interface. Most likely your built-in laptop camera will also support uvc.

Keep in mind that you can use out-of-the-box cameras only as world camera and not for pupil detection (and thus the actual eye-tracking). We have a DIY workflow where you build a pupil detection camera from an ordinary webcam though, if you are interested in this: https://docs.pupil-labs.com/core/diy/#diy

Otherwise I'd recommend using any of our products: https://pupil-labs.com/products/

wrp 16 October, 2019, 15:26:20

@user-cccded to add on to @user-c5fb8b's response: Pupil Core software is designed for wearable/head-mounted eye tracking. I just wanted to make sure that we differentiate clearly between remote eye tracking systems vs wearable/head-mounted systems.

When you wrote "can I test with laptop cameras", the technical answer is "yes" because if your laptop camera is UVC compliant then you might be able to select it as a source in Pupil Capture. However, Pupil Core software will not work as a remote eye tracking system.

Pupil Core software requires wearable/head-mounted eye tracking hardware with eye cameras that capture close-up videos of the eye in IR, and a scene/world camera that captures the FOV of the wearer.

user-222750 16 October, 2019, 16:10:08

Hi, I was wondering if there's a way to overlay the RGB and RGBD videos (from a single recording) in Pupil Player? I'm trying to sync them up but apparently they were saved at different FPS.

user-a65263 16 October, 2019, 23:27:33

Hi, @papr. My lab is using a Pupil Core with children's frames to do research. We have already tried to increase the confidence value to 0.95 and to position the eyes in the center of the eye cameras. Are there any ways to improve the results of calibration?

user-09f6c7 17 October, 2019, 00:11:41

@papr ok. we'll check the next release. and always thanks for your effort.

papr 17 October, 2019, 00:49:14

@user-a65263 please be aware of the difference between the confidence (the quality of Pupil detection) and gaze accuracy (the result of the calibration). Feel free to send one of your recordings to data@pupil-labs.com s.t. we can have a look at it and maybe give further ideas for improvement. Ideally, include the calibration procedure in the recording s.t. we can reproduce it using offline calibration.

papr 17 October, 2019, 00:50:41

@user-222750 this is currently not supported in Pupil Player. You would have to align/playback the videos based on their recorded timestamps. It should be possible to write a custom plugin that implements the overlay.
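
If it helps, a rough sketch of that timestamp-based alignment could look like the following. world_timestamps.npy is part of every Pupil recording, while the depth timestamps file name here is an assumption for a RealSense recording:

```python
import numpy as np

world_ts = np.load("world_timestamps.npy")
depth_ts = np.load("depth_timestamps.npy")  # assumed name for the depth stream

# for every world frame, find the index of the depth frame closest in time
idx = np.searchsorted(depth_ts, world_ts)
idx = np.clip(idx, 1, len(depth_ts) - 1)
prev_is_closer = (world_ts - depth_ts[idx - 1]) < (depth_ts[idx] - world_ts)
idx[prev_is_closer] -= 1

# idx[i] is now the index of the depth frame to overlay on world frame i
```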

user-222750 17 October, 2019, 06:05:47

Alright, thanks, I seem to have synced them up (to a degree) by modifying their FPS and scaling the depth video. I'll see what I can do with the timestamps.

user-cccded 17 October, 2019, 11:18:50

@user-c5fb8b @wrp thank you for the answers! I understood the requirements of the software, I am going to see the DIY to build the pupil detection camera. Thanks again!

user-f086ad 17 October, 2019, 14:18:07

Hey everyone, I am having a hard time understanding the angle field (key) and its value. Pupil generates it for every gaze datum. What is its purpose? What does it really show us?

user-c5fb8b 17 October, 2019, 14:20:51

Hi @user-f086ad do you mean the ellipse-angle?

user-f086ad 17 October, 2019, 14:22:06

@user-c5fb8b exactly, actually there are two angle keys

user-c5fb8b 17 October, 2019, 14:26:20

@user-f086ad The angle refers to the angle of the ellipse. Here is a section from the OpenCV docs for drawing ellipses. At the bottom of the ellipse function you can see a graphic of what all the parameters represent. https://docs.opencv.org/2.4/modules/core/doc/drawing_functions.html#ellipse
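
For anyone reading along, here is a tiny self-contained sketch of what that angle parameter does in OpenCV (drawing only, not Pupil code):

```python
import numpy as np
import cv2

img = np.zeros((400, 400, 3), dtype=np.uint8)

center = (200, 200)  # ellipse centre in pixels
axes = (120, 60)     # half of the major/minor axis lengths
angle = 30           # rotation of the whole ellipse in degrees (the "angle" field)

cv2.ellipse(img, center, axes, angle, 0, 360, (0, 255, 0), 2)
cv2.imwrite("ellipse_angle_demo.png", img)
```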

user-c5fb8b 17 October, 2019, 14:29:17

@user-f086ad The projected_sphere_angle is only present when using the 3D pupil detection and refers to the ellipse that you get when you project the estimated 3D eyeball back onto the image. You can see it in the eye windows drawn as green ellipses.

user-f086ad 17 October, 2019, 14:30:10

You are my hero @user-c5fb8b

user-a65263 17 October, 2019, 20:45:39

@papr You are right. We want both high-quality pupil detection and good gaze accuracy. What we want to ask is whether you could give us some tips on operational steps or strategies. We have read through the official documents on your website, but they do not have much information about many of the functions listed in the Pupil Capture program.

user-37ea30 18 October, 2019, 18:22:21

Hello Guys, I am using the pupil core for the first time and I am interested in measuring the pupil size. I think I am doing something wrong because although I am recording the pupil size in 3d, when I export the data from the Pupil Player the measurement of the 'diameter_3d' is not in mm (it is probably in pixels, since it's a huge number). Am I doing something wrong? Thanks in advance. 🙂

user-f2c41a 18 October, 2019, 21:19:26

Hello, I've just installed Pupil on my Ubuntu 16.04 laptop and am trying to do offline calibration in Player. When I do pupil detection, I get the following error: "install pyrealsense to the intel realsense backend" - can someone point me to some instructions on how to resolve this?

user-c6717a 19 October, 2019, 01:40:43

We have a Pupil Labs Core device with the World Camera, Moto Phone, and two cameras. Is it possible to record all of the eye tracking data to the phone only without having to be on the same wifi network as a recording computer? I see that I can record the video, audio, IMU, etc. info to the phone, but I'm not getting any fixations when I load the data into Pupil Player. I've tried doing the calibration on the computer and then starting the recording on the phone, but it doesn't seem like that calibration information is transferred to the phone. Any guidance would be much appreciated. Thanks!

user-e1d9a6 19 October, 2019, 05:36:34

Chat image

user-e1d9a6 19 October, 2019, 05:37:29

Hi! May I ask what this "confidence" means? How is it calculated?

wrp 20 October, 2019, 01:10:30

@user-f2c41a this is just a warning log message. Pupil detection should be functional regardless.

wrp 20 October, 2019, 01:12:30

@user-c6717a Pupil Mobile only records raw data streams. You can run pupil detection on your Pupil Mobile recordings and calibrate post-hoc with Pupil Player. Please ensure that you record the calibration sequence.

user-6e3a7c 21 October, 2019, 02:56:54

Hi, I'm new to Pupil products, and have just started using it as part of my GIS courses for college. I was wondering if anyone is familiar with exporting data into a software like ArcPro to create a heat map?

user-a48e47 21 October, 2019, 03:23:20

Hello. I'm studying about "take-over" in autonomous driving,

I'm trying to use fixation data from Pupil Labs, but I'm writing because the coordinates are incorrect.

In my analysis last year, I found a place to do this by setting the coordinates on the right side of the pupil player (ex. (3,3), (8,8) → (0,0), (5,5)). However, I can't find it now. I would appreciate it if you guys could give me the answer.

user-6b3ffb 21 October, 2019, 11:46:02

Hi, I'm also facing the same issues as @user-09f6c7. I tried to run v1.13 and then got stuck on the following error:

Chat image

user-c5fb8b 21 October, 2019, 12:17:09

Hi @user-6b3ffb I see you are running from source. The dependencies changed a bit since v1.13; specifically, back then we had a dependency on the C++ Boost library, which we have since removed. Is it an option for you to run Pupil from a bundle? Then you can just download the v1.13 release bundle from GitHub here: https://github.com/pupil-labs/pupil/releases/tag/v1.13 Otherwise you will have to dig through the history of our documentation for how to set it up. I looked up the Windows setup instructions from the date of the v1.13 release: https://github.com/pupil-labs/pupil-docs/blob/1c4fb94743ff8430d4cef6bc4a1f157a3fbe88b6/developer-docs/windows.md

user-7f1687 21 October, 2019, 13:14:48

Hi there. One of the eye cameras of my binocular Pupil Core apparently doesn't work. I also tried swapping the left and right cameras, but the problem seems to be related to the right side of the system.

user-7f1687 21 October, 2019, 13:15:28

Can you help me? Should I contact the support?

papr 21 October, 2019, 13:22:04

@user-7f1687 Please contact info@pupil-labs.com

user-7f1687 21 October, 2019, 13:26:50

thanks

user-86a23a 21 October, 2019, 15:39:09

Hi there! Could you tell me if one of your eye trackers is compatible with a regular virtual reality head mounted device?

papr 21 October, 2019, 16:52:13

@user-86a23a Hey, check these out: https://pupil-labs.com/products/vr-ar/

user-86a23a 21 October, 2019, 17:16:12

Great! Thanks

wrp 22 October, 2019, 02:04:41

Hi @user-a48e47 I am not sure I understand exactly what you're talking about. Perhaps you're referring to manual offset? If this is what you want to do, then you can do so by performing a post-hoc calibration aka offline calibration: https://docs.pupil-labs.com/core/software/pupil-player/#gaze-data-and-post-hoc-calibration

user-072005 22 October, 2019, 09:48:18

Hello, in previous versions of Player I was able to set trim marks for the location of the calibration marker detection, but I can't find it (in 1.15). It's finding "calibration markers" in the world that aren't really markers. Is the ability to set trim marks still there? Or is there a way to remove the incorrectly detected markers?

papr 22 October, 2019, 10:12:54

@user-072005 Both are possible: 1. Set a calibration range via trim marks

Chat image

papr 22 October, 2019, 10:13:22

@user-072005 2. Remove false positives manually by clicking on them

Chat image

user-072005 22 October, 2019, 10:37:48

oops, I see I needed to click new calibration to see it. Thanks

user-00cf0f 22 October, 2019, 18:39:27

Hello @papr , I was wondering what the best way is to use the Pupil Core in real time? I have been trying to use the pupil_lsl_relay through Python, however, it is not showing up in the plugin menu in Pupil Capture. This is the error I get from looking at the capture.log in pupil_capture_settings: "Failed to load 'pupil_lsl_relay'. Reason: 'dlsym(0x7f9fcd5c0f10, lsl_library_info): symbol not found' ". Do you know how to fix this error or is there a different way that would be better for doing real time?

user-fa3706 22 October, 2019, 20:45:19

Hi, does anyone know how to transfer surface settings from one version of Pupil Capture to another? Our research team was using 1.10.20, but we've run into some technical issues, which seem to get resolved when we use the most current software version. However, our surface settings did not transfer over to the newer version as they have when we previously updated the software. I know that Capture generates a surface configuration file, but I'm not sure how to make the current software version see it as well.

user-fa3706 22 October, 2019, 21:00:31

There is a pupil_capture_settings directory in the user folder on our PC, and that's where I found the surface_definitions file, so I assume it can be transferred over to the new software somehow.

user-9d7bc8 22 October, 2019, 21:06:54

Hello. I'm trying to launch Capture, but I'm encountering an error I don't know how to resolve, seemingly involving its attempt to update drivers. The attached file is a copy of its output. I've tried completely removing and redownloading the software, but that had no effect at all. How would I fix this error?

out.txt

user-9d7bc8 22 October, 2019, 21:08:12

The previous file I uploaded was incorrect, I apologize. This one is correct.

out.txt

user-9d7bc8 22 October, 2019, 21:16:56

It doesn't show an error if I don't have the Core connected to my machine.

user-c5fb8b 23 October, 2019, 10:04:15

@user-00cf0f Hi, do you specifically want to use Pupil Core with the LSL framework? In general, Pupil Core already has an interface for real-time streaming. The pupil_lsl_relay plugin only makes sense if you want to hook up Pupil Core's own real-time streaming with other infrastructure that uses the LSL streaming framework. Anyway, it seems like your pylsl installation might be corrupted, which is why pupil_lsl_relay fails to load. Please try to reinstall pylsl. Also try running the examples that are shipped with pylsl first to check that your installation of pylsl is working.
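
For reference, the built-in real-time interface mentioned above is the ZMQ/msgpack IPC backbone. A minimal subscriber sketch (assuming Capture runs on the same machine with the default Pupil Remote port 50020) could look roughly like this:

```python
import zmq
import msgpack

ctx = zmq.Context()

# ask Pupil Remote (default port 50020) for the SUB port of the IPC backbone
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# subscribe to all gaze topics
subscriber = ctx.socket(zmq.SUB)
subscriber.connect("tcp://127.0.0.1:{}".format(sub_port))
subscriber.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.unpackb(payload, raw=False)
    print(topic.decode(), gaze["norm_pos"], gaze["confidence"])
```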

user-c5fb8b 23 October, 2019, 10:16:57

Hi @user-fa3706 the surface tracker has changed quite a bit from v1.10. The old surface definitions are unfortunately not compatible with the new version. An upgrade is also not possible since a couple of things changed that are not automatically inferable. You will have to redefine your surfaces in the new version of player. The surface definitions are stored separately for every recording in the recording folder.

papr 23 October, 2019, 10:24:47

@user-fa3706 To elaborate on what @user-c5fb8b said: In versions previous to v1.13, surfaces were defined in the distorted pixel space. Since v1.13, Pupil defines surfaces in the undistorted camera space. Theoretically, Player is able to upgrade your old surface definitions, but only approximately. Therefore, it is recommended to create new ones.

user-c5fb8b 23 October, 2019, 11:58:24

Hi @user-9d7bc8 It seems there is some trouble with the driver installation. Please have a look at the following section: https://docs.pupil-labs.com/core/software/pupil-capture/#windows Unfortunately the reference from point 8 is currently broken (we are working on it) but you can find the relevant information here: https://github.com/pupil-labs/pupil-docs/blob/1b2a9015b82dd45619fd3a50ab209905753059e8/developer-docs/win-driver-setup.md#install-drivers-for-your-pupil-headset

Please try following this trouble-shooting guide and report back if you are still experiencing this issue.

user-a6e48e 23 October, 2019, 12:53:21

Hello, I noticed that the framerate I am getting from frame publisher is lowered to 1 fps when I disconnect the monitor from a computer which is running the pupil capture. Can this be fixed somehow?

papr 23 October, 2019, 12:55:00

@user-a6e48e Are you disconnecting all monitors from this computer? Or only one of multiple?

user-a6e48e 23 October, 2019, 12:55:15

Actually I am closing the lid of a laptop

user-a6e48e 23 October, 2019, 12:55:36

@papr, the same thing happens when I am switching to another Ubuntu workspace

papr 23 October, 2019, 12:57:22

@user-a6e48e I think this is due to behaviour specific to GLFW (the cross-platform library we use for creating/handling windows). 😕

papr 23 October, 2019, 12:58:11

@user-a6e48e What happens if you minimize the app before switching the workspace on ubuntu?

user-a6e48e 23 October, 2019, 12:58:28

@papr, let me check

user-a6e48e 23 October, 2019, 13:00:18

@papr, it didn't help

user-a6e48e 23 October, 2019, 13:02:22

@papr, or actually it works

user-a6e48e 23 October, 2019, 13:02:37

I checked again after restarting Pupil Capture

papr 23 October, 2019, 13:02:53

oh ok...

user-a6e48e 23 October, 2019, 13:02:59

@papr, I am checking with the lid closed now

user-a6e48e 23 October, 2019, 13:03:31

@papr, wow, it seems to be OK now 😄

user-a6e48e 23 October, 2019, 13:03:57

@papr, thanks a lot!

papr 23 October, 2019, 13:04:08

Let's hope it stays that way 🙏

user-a6e48e 23 October, 2019, 13:04:45

@papr, I'll experiment a bit, but now I know there's a way

user-a6e48e 23 October, 2019, 13:04:51

thanks again, have a nice day

papr 23 October, 2019, 13:05:00

You too!

user-fa3706 23 October, 2019, 13:40:44

@papr @user-c5fb8b Thanks guys!

user-fd5a69 23 October, 2019, 14:23:21

Hi @papr , I'm using pupil capture's remote recorder plugin to start two phones' recordings and I want the two recorded videos to be synced. Is there any way to start both phones' recordings simultaneously?

papr 23 October, 2019, 14:24:30

@user-fd5a69 Unfortunately not, and even if you sent the start command with the same button click, it would not be guaranteed that both recordings start at exactly the same time, due to network delays.

papr 23 October, 2019, 14:25:24

@user-fd5a69 I recommend enabling the time sync plugin in Capture s.t. both Pupil Mobile devices sync to Capture. This way the data generated by both recordings is comparable in time.

user-fd5a69 23 October, 2019, 14:26:54

@papr I see. Then I'll use ffmpeg to sync the videos manually afterwards. Thank you for the info!

user-2ff80a 23 October, 2019, 14:44:51

Hi all, our lab is looking into buying another Pupil Core binocular eye-tracker, to use with a RealSense camera and a mini-PC. We hoped to get an idea of whether the systems we'd like to buy will be capable of working well with the eye-tracker, since we have had issues in the past with the phone-based setup we started with. Would a system like this work? https://www.tuxedocomputers.com/en/Linux-Hardware/Linux-Computers-/-PCs/Intel-Systems/TUXEDO-Nano-v8-Mini-PC-max-Intel-Core-i7-Quad-Core-max-32GB-RAM-max-2-HDD/SSD/M-2-NVMe-VESA-Mounting.tuxedo#!#configurator

user-31df78 23 October, 2019, 22:48:53

Hi, is it advisable to use the sliding connections between the eye camera arms and the Core headset as an adjustment area? Is it designed to be partially slid outwards when attempting to capture a better picture for that eye camera?

user-e2056a 24 October, 2019, 01:01:54

hi @papr, if surfaces were not defined during recording (but the markers were visible), can I later define them in Pupil Player and get the same results on the surfaces?

wrp 24 October, 2019, 04:41:34

@user-31df78 yes, the eye cameras are designed to be moved along this rail as well as orbited about the ball joint. Please see: https://docs.pupil-labs.com/core/hardware/#headset-adjustments Note Eye cameras in the animations are old 120Hz eye cameras, but the same principles of adjustment still apply.

wrp 24 October, 2019, 04:42:15

@user-e2056a Yes, if markers are present in the recording then you can define surfaces post-hoc in Pupil Player and still get data relative to surface(s).

user-e2056a 24 October, 2019, 14:57:59

thank you @wrp

user-8779ef 24 October, 2019, 15:00:11

also, check out this new pupil detection algo we've been working on 😛 We do plan on integrating it into the PL framework soon. the first authors are about to present it at the Facebook workshop at ICVP - they won first prize!

user-8779ef 24 October, 2019, 15:00:52

To be clear, this demo is just hijacking the eye cam imagery. Not yet integrated with the Pupil Labs software pipeline.

papr 24 October, 2019, 15:02:20

@user-8779ef Congratulations! We are also currently refactoring our pupil_detector code base, such that it should be easy to integrate it! Head over to software-dev for questions.

user-8779ef 24 October, 2019, 15:03:11

Oh, niiiice. I'll ask you for advice before we integrate, then. It may be more efficient for us to wait until the refactor.

user-8779ef 24 October, 2019, 15:03:27

...and then you guys get some help debugging!

user-663462 24 October, 2019, 15:12:37

hi, I just downloaded pupil_v1.16-80-g27f9153_windows_x64.7z. When I try to start Pupil Capture I get this error message: "2019-10-24 17:04:38,241 - MainProcess - [DEBUG] root: Unknown command-line arguments: [] 2019-10-24 17:04:38,242 - MainProcess - [DEBUG] os_utils: Disabling idle sleep not supported on this OS version. 2019-10-24 17:04:39,963 - world - [INFO] launchables.world: Application Version: 1.16.80 2019-10-24 17:04:39,963 - world - [INFO] launchables.world: System Info: User: Robin, Platform: Windows, Machine: Primebook, Release: 10, Version: 10.0.18362 2019-10-24 17:04:39,999 - world - [ERROR] launchables.world: Process Capture crashed with trace: Traceback (most recent call last): File "launchables\world.py", line 136, in world File "c:\python36\lib\site-packages\PyInstaller\loader\pyimod03_importers.py", line 627, in exec_module File "shared_modules\pupil_detectors\__init__.py", line 21, in <module> ImportError: DLL load failed: Das angegebene Modul wurde nicht gefunden. [Translation: The specified module could not be found.]

2019-10-24 17:04:40,008 - world - [INFO] launchables.world: Process shutting down." Any ideas on how to solve this?

user-c5fb8b 24 October, 2019, 15:16:51

Hi @user-663462 please download and run the vc_redist.x64.exe file from the official Microsoft support page. Afterwards, Pupil should start up as expected. https://support.microsoft.com/en-ca/help/2977003/the-latest-supported-visual-c-downloads

user-c5fb8b 24 October, 2019, 15:17:27

If this does not fix the issue, please report back to us!

mpk 24 October, 2019, 15:42:15

@here 📣 Pupil Software Release Update v1.17-6 📣 This release addresses Surface Tracker instabilities and issues. We have also enabled Pupil Invisible video streaming via the Pupil Mobile backend and removed the eye movement classifier.

Check out the release page for more details and downloads: https://github.com/pupil-labs/pupil/releases/tag/v1.17

user-e2056a 24 October, 2019, 17:32:58

Hi @wrp, what is the procedure for adding surfaces in Pupil Player? Is it the same as in Pupil Capture? In Pupil Player, I could not see whether the markers were all visible when adding surfaces. Thank you!

user-88b704 24 October, 2019, 20:00:40

Hi , does any OnePlus 6 phone work with the core eye tracker and pupil mobile, or does one have to purchase the Core Mobile Bundle?

wrp 25 October, 2019, 03:38:54

@user-e2056a yes, the procedure of adding surfaces should be the same in Pupil Capture and Player.

wrp 25 October, 2019, 03:39:45

@user-88b704 yes, you can use Pupil Mobile on a OnePlus 6; we do not modify the device at all. However, you must have a high-quality USB-C to USB-C cable to connect your Android device to the Pupil Core headset, and you must ensure USB OTG is enabled and the app is locked.

wrp 25 October, 2019, 03:56:53

@user-e2056a I strongly recommend that you upgrade to the latest version of Pupil for marker tracking if you are using the new AprilTag markers: https://github.com/pupil-labs/pupil/releases/tag/v1.17 -- if you are using the legacy square markers, let us know so that we can provide you with some feedback.

user-e2056a 25 October, 2019, 14:25:17

@wrp, thank you, we are using the square markers. We noticed that after switching to the latest version of Pupil, the previously defined surfaces did not transfer. Our colleague has reported this issue to Pupil Labs, so we decided to keep using an earlier version (v1.13) to be safe.

user-030f61 25 October, 2019, 14:41:03

Hi there

user-030f61 25 October, 2019, 14:41:15

anyone here from pupil labs?

papr 25 October, 2019, 17:58:41

@user-e2056a have you tried changing the detector mode to the legacy square markers in the newer versions? Your surface definitions should still be there, but by default we detect Apriltags. If you change the detection back, your surfaces should work as expected.

papr 25 October, 2019, 17:58:46

@user-030f61 yes

papr 25 October, 2019, 17:59:16

Please contact info@pupil-labs.com regarding hardware issues 🙂

user-4bf830 25 October, 2019, 20:12:11

@papr this is occurring on a new Windows 10 computer. This is the newest software update, any thoughts?

Chat image

user-31df78 25 October, 2019, 20:39:17

@user-4bf830 I'm going to guess that you need the visual C++ redist from here https://support.microsoft.com/en-ca/help/2977003/the-latest-supported-visual-c-downloads

wrp 26 October, 2019, 01:50:46

@user-4bf830  please download and run the vc_redist.x64.exe file from the official Microsoft support page (thanks @user-31df78 for the link 😸 ). Afterwards, Pupil should start up as expected.

user-123d16 28 October, 2019, 17:13:56

Hey, could anybody tell me how I can get the data for the full eye movement in a recording? For example, if I looked at an image with 4 points and I looked at them in sequence, I want to get the eyetracking data that shows my eyes tracing the shape of those 4 points.

papr 28 October, 2019, 17:14:47

@user-123d16 In which format would you like to have the data?

user-123d16 28 October, 2019, 17:15:03

I want it visualized as an image

papr 28 October, 2019, 17:15:03

Have you opened the recording in Pupil Player already?

user-123d16 28 October, 2019, 17:15:08

Yes, I have

papr 28 October, 2019, 17:15:40

And is the export of the visualization as a video an option for you?

user-123d16 28 October, 2019, 17:16:29

I have the World Video Exporter and Raw Data Exporter

user-123d16 28 October, 2019, 17:16:42

and Eye Video and iMotions

user-123d16 28 October, 2019, 17:17:13

Essentially what I want to be able to see is the heatmap, but connected with lines

user-123d16 28 October, 2019, 17:17:34

the lines that trail my eye movements

papr 28 October, 2019, 17:18:14

@user-123d16 Check out this tutorial on visualising the scan path on a defined surface: https://nbviewer.jupyter.org/github/pupil-labs/pupil-tutorials/blob/master/03_visualize_scan_path_on_surface.ipynb
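
The tutorial covers this in detail; the core idea is a connected line plot of the surface-mapped gaze. A rough sketch (the file and column names follow this Pupil version's surface export and may differ in other releases):

```python
import pandas as pd
import matplotlib.pyplot as plt

# assumed export path; the file name contains your surface's name
df = pd.read_csv("exports/000/surfaces/gaze_positions_on_surface_Surface1.csv")

# keep confident gaze that actually falls on the surface
df = df[(df["confidence"] > 0.8) & (df["on_surf"].astype(str) == "True")]

# a connected line through the gaze samples = a simple scan path
plt.plot(df["x_norm"], df["y_norm"], "-o", markersize=2, linewidth=0.5)
plt.xlim(0, 1)
plt.ylim(0, 1)
plt.xlabel("x (normalized surface coordinates)")
plt.ylabel("y (normalized surface coordinates)")
plt.show()
```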

papr 28 October, 2019, 17:19:04

Here you can find more about surface tracking in general: https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking

user-123d16 28 October, 2019, 17:19:18

Okay, thank you

user-123d16 28 October, 2019, 17:19:48

I have gotten the heatmaps working, so this solution using Python is what I will need to learn to get what I'm looking for

papr 28 October, 2019, 17:20:15

If you have the heatmaps working this means that you have a working surface already 👍

papr 28 October, 2019, 17:20:45

Yes, unfortunately, the scan path on surfaces is currently not directly exported by Pupil Player.

user-123d16 28 October, 2019, 17:21:16

I don't have any experience with programming, but I will give it a shot!

user-123d16 28 October, 2019, 17:21:35

That's no problem, thanks for your help!

user-908b50 29 October, 2019, 01:03:02

Hi All, we use version v1-11.4 of Pupil Capture. The program calibrates as one off. Sometimes the calibration is perfect; other times there is an offset of about 2 cm to the left, bottom, or top. To use it with our paradigm, we started with covering the plus and minus signs and have now moved to covering the bottom of the screen entirely, which contains black oval-shaped text boxes (the font is white). Any ideas why this is so? Would you recommend downloading the new version? I don't want to install updates part-way through data collection. Thanks!

user-09f6c7 29 October, 2019, 06:53:23

@papr Thank you for the 1.17 release; with it we could get more information about the GLFW failure.

Chat image

user-09f6c7 29 October, 2019, 06:54:09

and the LattePanda Alpha is using the Intel HD Graphics 615 chipset, so the solution is here: https://downloadcenter.intel.com/product/96554/Intel-HD-Graphics-615

user-09f6c7 29 October, 2019, 06:54:42

Thanks again!

papr 29 October, 2019, 07:22:39

@user-09f6c7 Nice! Great to see this feature finally working! 💪

papr 29 October, 2019, 07:27:44

@user-908b50 Could you provide us with a screenshot of your setup such that it becomes clearer what is meant by "calibrating as one off" and "plus and minus signs"?

papr 29 October, 2019, 07:43:31

@user-908b50 Also, starting in version v1.11, you can recalibrate your recording in Pupil Player and apply a manual offset. See our youtube tutorial for details: https://www.youtube.com/watch?v=_Jnxi1OMMTc&list=PLi20Yl1k_57rlznaEfrXyqiF0sUtZMMLh

user-6cdb90 29 October, 2019, 17:44:05

Hi All, I am using the Pupil eye tracker as a device for my research. Recently, in one of my experiments, I need participants to walk in different environments while wearing the eye tracker, so I have searched for wireless eye trackers... As they are so expensive, I tried the Pupil Mobile app. Unfortunately, it does not work: after transferring the recorded files to the computer, there are some .mjpeg files and I could not play them, even after converting them to .mp4 format. Therefore, I am looking to buy an Android device which works well with the Pupil Mobile app. I know there is a Core Mobile Bundle, but due to my university's policies it is not possible to purchase the phone, and I have to buy the smallest Android tablet. I would greatly appreciate it if anyone could help me find the best Android tablet just for using the Pupil eye tracker and Pupil Mobile app.

papr 29 October, 2019, 18:10:22

@user-6cdb90 have you tried to open the Pupil Mobile recording in Pupil Player?

papr 29 October, 2019, 18:11:03

Just drag the folder containing the videos onto Player and you should be able to playback and export your recording.

user-31df78 29 October, 2019, 18:15:22

Sorry, kind of expanding from that, but what might be the reason that the .mp4 files in the recording directory cannot be read by normal video players?

user-6cdb90 29 October, 2019, 18:16:59

@papr I did it, but there is no gaze mapping or pupil detection data. The related .csv files are empty. I think the problem is in setting up the Pupil app for recording. Is there any specific feature of the Android device that I need in order to use the Pupil Mobile app in the best way?

papr 29 October, 2019, 18:20:50

@user-6cdb90 no, this is expected. Check out this YouTube tutorial on how to get the Pupil and gaze data https://www.youtube.com/watch?v=_Jnxi1OMMTc&list=PLi20Yl1k_57rlznaEfrXyqiF0sUtZMMLh

papr 29 October, 2019, 18:21:25

@user-31df78 you should be able to open them in vlc player.

user-31df78 29 October, 2019, 18:25:35

Thanks papr, was probably too confident MPCHC could also open everything, I'll try VLC

user-31df78 29 October, 2019, 18:40:54

Also is there a built-in way to automatically generate static images with world video + gaze position (like in exported video) for each fixation identified?

papr 29 October, 2019, 18:47:26

@user-31df78 no, this is not built-in

user-31df78 29 October, 2019, 18:51:03

Alright, just wanted to make sure I wouldn't be reinventing the wheel 👍

user-6cdb90 29 October, 2019, 19:54:44

@papr Thank you for your response! Actually, I followed the video instructions, but there are errors indicating that there are no eye videos. I think the problem goes back to the Pupil app and the recording part. On the mobile there is no gaze mapping or pupil detection. How can I fix this?

papr 29 October, 2019, 19:55:51

@user-6cdb90 could you share the recording with data@pupil-labs.com? Then I can have a look at it tomorrow.

user-6cdb90 29 October, 2019, 22:50:02

@papr I have shared the recording file. Thank you again for your help!

user-9d7bc8 29 October, 2019, 23:46:58

Hello. I'm trying to use the IPC Backbone to subscribe to the binocular gaze data topic, but Capture doesn't seem to be sending any data, despite having both cameras working and calibrated. It does send data to the monocular gaze data topic. Do you know how I might fix this?

papr 30 October, 2019, 09:21:53

@user-9d7bc8 Please be aware that pupil positions with confidence less than 0.6 will always be mapped monocularly. This was introduced to avoid degradation of the gaze mapping result.
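
As a practical note (based on my understanding of the gaze topic naming in this version; please verify against your Capture), binocularly mapped 3d gaze is published under the topic gaze.3d.01., while monocular gaze uses gaze.3d.0. or gaze.3d.1., so you can filter on the subscription side:

```python
import zmq

# SUB socket connected to the IPC backbone SUB port (see the earlier subscriber sketch)
ctx = zmq.Context.instance()
subscriber = ctx.socket(zmq.SUB)
subscriber.connect("tcp://127.0.0.1:12345")  # replace 12345 with the real SUB port

# only receive binocularly mapped 3d gaze
subscriber.setsockopt_string(zmq.SUBSCRIBE, "gaze.3d.01.")
```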

user-d77d0f 30 October, 2019, 10:52:20

Hi! Is it possible to have information on how the confidence is calculated?

papr 30 October, 2019, 16:16:17

@user-d77d0f Hey 👋

  • 2d confidence: The ratio of how many of the candidate pupil edge pixels lie on the final estimated ellipse. For details see https://arxiv.org/pdf/1405.0006.pdf
  • 3d confidence: A combination of the 2d confidence and how well a new 2d ellipse fits on an existing 3d eye model

user-bda130 30 October, 2019, 17:25:12

Quick question about Pupil Player: what does the yellow line at the bottom of the screen correspond to? I am aware that the purple and green lines correspond to the gaze mapper and calibration.

Chat image

papr 30 October, 2019, 17:27:51

@user-bda130 it represents the validation range

user-bda130 30 October, 2019, 17:34:28

Great, thanks! I am having an issue with multiple cross hairs in a single frame. I thought adjusting the trim markers for validation range, gaze mappers, and calibration would get rid of this issue, so that the different calibrations/gaze mappers/validation range I've created did not overlap. However, this has not helped. Is there another thing I should try?

Chat image

papr 30 October, 2019, 17:35:17

@user-bda130 This is due to multiple gaze positions being displayed on a single world frame.

papr 30 October, 2019, 17:37:24

Since gaze is estimated at a higher sampling rate (120-200 Hz) than the scene is recorded (30-60 Hz), we group gaze points by time and display them together for each scene frame. Each gaze position is visualized by its own cross.
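
The same grouping shows up in the Raw Data Exporter output: the exported gaze CSV carries a world frame index column, so all gaze samples that belong to one scene frame can be inspected or averaged offline, for example (column names assumed for this Pupil version):

```python
import pandas as pd

df = pd.read_csv("exports/000/gaze_positions.csv")

# "index" is assumed to be the world frame each gaze sample was grouped with
per_frame = df.groupby("index")[["norm_pos_x", "norm_pos_y"]].mean()
print(per_frame.head())
```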

user-bda130 30 October, 2019, 17:40:56

Is there a way that I can get it to average the crosses, or just simplify it to one, so that it is just one cross per frame?

papr 30 October, 2019, 17:44:00

You could write your own visualization plugin:
1. Make a copy of the original https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/vis_cross.py
2. Place the copy in the pupil_player_settings/plugins folder
3. Change the class name in line 20
4. Filter the pts list to your liking before calling the drawing for-loop in line 50

user-bda130 30 October, 2019, 17:48:14

Thank you very much

user-31df78 30 October, 2019, 18:07:05

Our team had a slight issue with text not showing up properly until a restart of Capture. We were running on Mac and I haven't yet reproduced the problem on Windows. Any suggestions?

user-31df78 30 October, 2019, 18:07:09

Chat image

user-31df78 30 October, 2019, 18:08:03

The text issues are in the id1 conf graph and the apriltag label.

user-908b50 30 October, 2019, 18:51:44

@papr I will send you a few pictures of the setup. Thanks for sending the youtube video. I'll take a look and follow the steps. Will update you on that as well.

user-bda130 30 October, 2019, 19:40:51

@papr After trying to work through your steps, I am not sure I totally understand step 4 about filtering the pts list

papr 30 October, 2019, 19:47:42

@user-bda130 the pts variable contains a list of all gaze points. You can either calculate the mean or remove all but one point from the list; it is up to you. The important part is that only one point remains, such that only one cross is drawn.
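
For example, the averaging variant could look roughly like this; it is only a sketch meant to be placed just before the drawing for-loop in your copy of vis_cross.py (the variable name pts is taken from that file):

```python
import numpy as np

# pts is the list of (x, y) gaze points for the current world frame
if pts:
    mean_pt = tuple(np.asarray(pts, dtype=float).mean(axis=0))
    pts = [mean_pt]  # keep a single averaged point so only one cross is drawn
```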

user-bda130 30 October, 2019, 19:54:32

@papr Are you referring to this section?

Chat image

papr 30 October, 2019, 19:55:53

@user-bda130 no, this is already the drawing code

papr 30 October, 2019, 19:56:48

You should remove entries from the pts variable before the for loop in which the cross is drawn

user-bda130 30 October, 2019, 20:04:47

@papr This section? I apologize. I am not very familiar with python coding, so I am not sure what I am looking for or what will need to be rewritten in order to get the mean of the crosses collected.

Chat image

papr 30 October, 2019, 20:08:38

@user-bda130 one sec, I am testing an example

user-bda130 30 October, 2019, 20:11:58

Thank you so much!!

papr 30 October, 2019, 20:12:56

Example of the mean cross. In situations like this, it is not accurate. I think the best way would be to visualize the gaze that is closest to the frame in time.

Chat image

user-bda130 30 October, 2019, 20:15:10

Okay that is a good point. So how would I visualize the gaze that is closest to the frame in time?

papr 30 October, 2019, 20:23:58

@user-bda130 https://gist.github.com/papr/659f88acc5addbd0c9a1252ebc8d4db7

To install as a plugin click the Raw button at the top right of the document and save it as single_cross.py in the Pupil Player plugins folder.

See these lines for the selection of the correct gaze point: https://gist.github.com/papr/659f88acc5addbd0c9a1252ebc8d4db7#file-single_cross-py-L44-L56

user-bda130 30 October, 2019, 20:29:37

You are an absolute rockstar, @papr !! This is greatly appreciated. Thank you, thank you.

user-eb1bb0 31 October, 2019, 08:51:47

hi

user-eb1bb0 31 October, 2019, 08:53:02

I need help designing a meeting room with VR, AR, spatial, etc.

papr 31 October, 2019, 08:55:07

Hi @user-eb1bb0 Are you referring to this? https://spatial.is/

papr 31 October, 2019, 08:56:08

And do you need help with designing the room itself, or do you need help with integrating Pupil eye tracking into it?

user-f3a0e4 31 October, 2019, 10:48:03

Hi @papr I am currently undertaking research with young children who (obviously) have much smaller heads. To optimise the camera position I have to utilise the arm extenders so I can get the pupil more square on. The issue is that doing so causes more shadows and ultimately poorer pupil capture. Do you have any recommendations on how to optimise pupil capture when using the extenders? Thanks.

papr 31 October, 2019, 11:44:21

@user-f3a0e4 Do you use the normal sized or child-size Pupil Core headsets? You can attempt to decrease the area of the pupil detection by changing the ROI in the eye window. Eye window -> General Settings -> Mode: ROI -> drag corners s.t. the pupil is included and the shadows excluded.

Additionally, you can change the intensity range if the shadows are not as dark as the pupil.

user-eb1bb0 31 October, 2019, 11:45:29

Hi @papr, my requirements for the meeting room are shown below.

• The room is supposed to leverage cutting-edge technology including, but not limited to, potential use of: Virtual Reality (VR), Human-Centric Artificial Intelligence, Augmented Reality (AR), spatial data, etc.
• User-friendly data manipulation with real-time output updates
• The room should be able to loop in, via video/phone/screen sharing/data sharing, any relevant stakeholders to the decision-making process
• Touch, voice, text, and video interfacing are to be considered to enable decision-making
• AI/Machine-learning-enabled algorithms to facilitate decision-making by specific users given specific meeting dynamics
• The room should be easily upgradable (hardware and software) to account for technological advancements
• Components of the room should be easily movable; thus use of projectors instead of screens is recommended
• The room should be integrated with the existing data systems

papr 31 October, 2019, 11:51:15

@user-eb1bb0 I am afraid that this is not the right place for this kind of question/problem. This server is specifically used by the community for questions around the Pupil platform https://github.com/pupil-labs/pupil and its related products https://pupil-labs.com/products/. In a broader sense we also discuss mobile eye tracking. Your problem does not seem to be related to any of the above.

papr 31 October, 2019, 12:51:36

@user-31df78 Unfortunately, I do not know exactly what is triggering this, but it is not the first time that I have seen it either.

papr 31 October, 2019, 12:52:03

Generally, this happens if some OpenGL drawing code is not 100% correct.

user-c0daa8 31 October, 2019, 13:03:00

Hello, is Pupil Core good for both screen tracking (ads, visuals, etc.) and also offline settings such as in-store tracking?

papr 31 October, 2019, 13:07:50

@user-c0daa8 Hey 👋 Yes, Pupil is able to provide real-time gaze data as well as do all calculations offline. Additionally, you can use our surface tracker functionality to get gaze in relation to defined areas of interest. https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking

End of October archive