πŸ‘ core


user-8058d7 01 December, 2017, 03:01:40

@papr Thanks for the kind reply last night

user-8058d7 01 December, 2017, 03:03:00

Could you let me know where the depth-estimation algorithm code is?

user-29e10a 01 December, 2017, 09:11:22

Hi! Any news about the VR integration of the new 200 Hz cameras? We're about to order... πŸ€‘

user-e1dd4e 01 December, 2017, 11:54:31

So I'm not exactly sure what caused this, but I went to record recently and there was this black ring around the image. Did the forward-facing camera's lens just come loose? Anyone know how to fix this?

Chat image

papr 01 December, 2017, 12:17:26

@user-e1dd4e This is normal if you record in Full HD (1920x1080). If you set the frame size to 1280x720 the ring should not be visible.

wrp 01 December, 2017, 13:48:11

@user-29e10a we have 200hz cameras and Pupil headsets for 200hz cameras now, but are still putting finishing touches and further testing into the new mounts for the 200hz VR integrations.

wrp 01 December, 2017, 13:52:06

@user-98013c Accuracy - "Accuracy is calculated as the average angular offset (distance) (in degrees of visual angle) between fixation locations and the corresponding locations of the fixation targets."

Precision - "Precision is calculated as the Root Mean Square (RMS) of the angular distance (in degrees of visual angle) between successive samples during a fixation."

From source code here: https://github.com/pupil-labs/pupil/blob/465f02aa83e2a742091f5640cd69f3ec7d87fb60/pupil_src/shared_modules/accuracy_visualizer.py#L60-L67

Also displayed in the Accuracy Test after running the test
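
These two definitions are straightforward to mirror in code. Below is a hypothetical standalone sketch (not the accuracy_visualizer implementation itself); it assumes the angular offsets and successive-sample distances, in degrees of visual angle, have already been computed:

```python
import math

# Hypothetical illustration of the two metrics quoted above.
# offsets_deg: angular offsets between fixation locations and their targets.
# successive_dists_deg: angular distances between successive gaze samples
# within fixations.

def accuracy(offsets_deg):
    # average angular offset, in degrees of visual angle
    return sum(offsets_deg) / len(offsets_deg)

def precision(successive_dists_deg):
    # root mean square of successive-sample angular distances
    return math.sqrt(sum(d * d for d in successive_dists_deg)
                     / len(successive_dists_deg))
```

With these helpers, `accuracy([1.0, 2.0, 3.0])` yields the mean offset of 2.0 degrees.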

wrp 01 December, 2017, 13:52:30

@user-98013c see also docs https://docs.pupil-labs.com/#notes-on-calibration-accuracy

user-2798d6 01 December, 2017, 19:11:28

Is there any potential issue with having long recordings (like 30 minutes) where only parts of the recording actually capture the data I want? I'm trying to avoid recalibrating for several different videos and am planning to just do 4 different experimental runs in one big take.

mpk 01 December, 2017, 19:21:54

@user-2798d6 if you use Pupil Mobile for this, I recommend doing one long recording. If you use Capture, then just calibrate first and do short recordings when needed.

user-2798d6 01 December, 2017, 19:31:40

I'm doing offline calibration, so I'd need to do a separate calibration sequence for each recording right?

user-2798d6 01 December, 2017, 19:32:27

The online calibration does get the accuracy that I'm looking for

user-2798d6 01 December, 2017, 19:33:05

Can I use mapping ranges to do the offline calibration if I do several experimental procedures within one long recording?

mpk 01 December, 2017, 19:33:20

yes. mapping ranges would make sense then!

user-2798d6 01 December, 2017, 19:33:43

Perfect! Thanks!

user-2798d6 01 December, 2017, 19:34:14

Do I include the calibration frames in the mapping ranges or does that not matter?

user-2798d6 01 December, 2017, 19:53:40

One other thing - the Eye 1 camera window sometimes does not display the menu/plugins. The sidebar is there, but when I click on the settings icon, for example, it doesn't expand out.

user-8058d7 03 December, 2017, 10:25:08

Hello, I have a question: how accurate is the binocular eye camera, even if it has a depth difference compared to a mono eye camera?

user-4e7774 03 December, 2017, 19:52:32

Hello everyone, I'm new to eye tracking. I'm doing a project where I will run some eye-tracking tests with participants. My question is: Pupil is an eye-tracking device, but is it possible to link a cognitive test to the device? For example, a fixation task with random stimuli coming in. Thanks!

user-8779ef 04 December, 2017, 15:31:23

@user-2798d6 I strongly suggest you recalibrate once or twice during the 30 minute recording. In MOST trackers, tracking quality degrades due to small movements of the glasses on the head. I wouldn't assume that the Pupil tracker has fixed this issue. In fact, the use of multiple competing eye models (visible in 3D debug mode) means that tracking quality will change quite a bit over time.

user-8779ef 04 December, 2017, 15:32:29

@wrp Will 200 Hz camera integrations be internal (behind the optics) ?

papr 04 December, 2017, 15:33:35

@user-8779ef Are you asking if the new 200Hz cameras use the same lenses as the old eye cameras?

user-8779ef 04 December, 2017, 15:38:57

@papr Nope. I'm asking if the HMD integration of the 200 Hz cameras will be behind the HMD optics. The current placement of the 120 Hz cameras is far away from the visual axis, and does not get a good eye image of many of my participants.

user-8779ef 04 December, 2017, 15:39:46

To be blunt, it seems that, if there's a bit of puff in the cheeks, the cheeks block the camera's view of the pupil.

papr 04 December, 2017, 15:41:14

Ah, I understand. I did not catch the HMD context. @mpk should be able to answer this. πŸ™‚

mpk 04 December, 2017, 15:42:45

@user-8779ef we plan to place the new, smaller cameras further in the FOV. However, we cannot move them behind the optics.

mpk 04 December, 2017, 15:43:12

I hope to get a better angle by moving further up without adding a lot more occlusion.

user-c0f86d 04 December, 2017, 17:30:32

Hello everyone, I am trying to install Pupil from source on Win10 using this guide: https://docs.pupil-labs.com/master/#windows-dependencies. Now I'm on the final step, executing main.py, but it failed. I found https://github.com/pupil-labs/pupil/issues/728 but the fixes are not working for me. Do you have any idea what I can do? Thanks

Chat image

papr 04 December, 2017, 17:41:51

@user-c0f86d Could you go to the pupil_src\shared_modules\pupil_detectors folder within the command prompt and run python setup.py build? What is the output?

user-c0f86d 04 December, 2017, 18:14:37

FirstOutput

Chat image

user-c0f86d 04 December, 2017, 18:15:09

2nd

Chat image

user-2798d6 04 December, 2017, 20:37:37

Hello! I was wondering if there is an ETA on the next iteration of the software? I use a MacBook pro with retina display and the offline calibration is not working right now. GitHub said they would fix it for the next release, so I was just wondering when that might be. Thank you!

papr 04 December, 2017, 21:54:39

@Eva#2951 It looks like the pupil detector build procedure cannot use your boost installation. I would recommend repeating the installation steps for the boost lib.

papr 04 December, 2017, 21:57:12

@user-2798d6 We do not have a release date yet. If you need the new fixes, I would recommend running Pupil from source. The macOS install instructions are fairly straightforward; there are far fewer possible hiccups than on Windows or Linux.

user-6419ec 05 December, 2017, 09:52:08

@papr thanks. I tried, but while building the boost libs again I got a new error πŸ™ˆ it seems that this is the fault. I will reset Windows 10 and try again

wrp 05 December, 2017, 09:52:46

@user-6419ec may I ask what you are doing that requires building Pupil from source on Windows?

user-6419ec 05 December, 2017, 09:59:49

Of course. I have a setup of different sensors working together in an experiment, and now I want to integrate Pupil into the existing setup. That's why I had to write some plugins for time synchronisation, because the existing setup works with its own NTP software and Visual Studio's DateTime for timestamps. I already built Pupil from source on another PC and integrated it into my setup with certain plugins in a proper way, but that PC isn't powerful enough for binocular tracking, which is why I want to install it on a new PC with more power.

wrp 05 December, 2017, 10:00:57

@user-6419ec have you considered writing a plugin and loading your plugin for the app bundle?

wrp 05 December, 2017, 10:02:05

I ask, because (as you've noticed) setting up all dependencies on Windows is unfortunately not easy

user-6419ec 05 December, 2017, 10:03:19

That's true πŸ˜ƒ Not until now, because I don't know how that works πŸ™ˆ

user-6419ec 05 December, 2017, 10:35:00

@wrp sorry for my newbie question, but how would this work? How can I load my plugin for the app bundle?

wrp 05 December, 2017, 10:36:04

@user-6419ec https://docs.pupil-labs.com/master/#plugin-guide

wrp 05 December, 2017, 10:36:46

You can put the plugin in pupil_capture_settings/plugins folder

wrp 05 December, 2017, 10:36:56

it will be loaded the next time you start Pupil Capture
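
For reference, a user plugin dropped into the pupil_capture_settings/plugins folder is just a Python file defining a Plugin subclass. A minimal, hypothetical sketch follows; in a real plugin file you would write `from plugin import Plugin`, and the base class is stubbed here only so the snippet is self-contained (the `pupil_positions` event key reflects the API of that era and is an assumption):

```python
class Plugin:  # stand-in for Pupil's plugin.Plugin base class
    def __init__(self, g_pool):
        self.g_pool = g_pool

class Pupil_Timestamp_Logger(Plugin):
    """Hypothetical example plugin: collect timestamps of incoming pupil data."""
    def __init__(self, g_pool):
        super().__init__(g_pool)
        self.timestamps = []

    def recent_events(self, events):
        # Capture calls this once per world frame with recently received data.
        for datum in events.get("pupil_positions", []):
            self.timestamps.append(datum["timestamp"])
```

Once the file is in the plugins folder, the class appears in Capture's plugin list on the next start.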

user-6419ec 05 December, 2017, 10:44:10

@wrp but for that I have to build it from source, right? That's what I did on the old PC with less power

wrp 05 December, 2017, 10:44:39

@user-6419ec the great thing about plugins is that you do not have to build Pupil from source

wrp 05 December, 2017, 10:44:55

you just write your plugin, and it is loaded by Pupil Capture (app bundle) when you start Pupil Capture

wrp 05 December, 2017, 10:45:10

(or Pupil Player if writing a plug-in for Pupil Player for example)

user-6419ec 05 December, 2017, 11:01:25

@wrp thanks a lot, this was something I didn't know until now, and it makes my life much easier πŸ‘ I had already read the plugin guide, but because of "In the following sections, we assume that you run the Pupil applications from source." I thought I had to run it from source. Thanks again

wrp 05 December, 2017, 11:02:02

I apologize for the lack of clarity in the docs

wrp 05 December, 2017, 11:02:11

We will work to improve the text in docs

wrp 05 December, 2017, 11:02:18

to help clarify

papr 05 December, 2017, 11:26:17

@user-6419ec @wrp I wrote this on purpose, since running custom plugins from the bundle can lead to problems if you use Python (system) modules that are not included in the bundle.

user-6419ec 05 December, 2017, 12:12:57

@papr i see, that makes sense

user-9d900a 05 December, 2017, 12:41:38

Hi, I have coded a basic Matlab script that allows me to communicate with Pupil Capture via zmq. I feel like there is not much more I can do yet other than start the eye process, calibrate, and start/stop the recording. Is there any way to define the acquisition settings (e.g. frames per second, luminosity, etc.) more thoroughly?

papr 05 December, 2017, 12:45:16

@user-9d900a cool! Could you share your existing code with us? I am curious how you implemented the zmq communication.

user-9d900a 05 December, 2017, 12:45:40

Sure, it's super basic! And certainly not bug-proof, but it works πŸ˜ƒ

user-9d900a 05 December, 2017, 12:45:44

Where should I send it?

papr 05 December, 2017, 12:49:32

You can upload it to http://gist.github.com/ and share the link with us if you like πŸ™‚

user-9d900a 05 December, 2017, 12:50:38

I actually relied on existing code (matlab-zmq by fagg: https://github.com/fagg/matlab-zmq). Then I just needed to tweak a function for serializing the notifications using msgpack.

papr 05 December, 2017, 12:59:09

@user-9d900a Ah, I see. I guess it would be helpful to write some abstractions around it that make it easier to connect, subscribe, and receive, and maybe even handle the serialization. Using this package you should be able to write a very flexible Matlab client (compared to the udp relay version).
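
For anyone following along, the notification format itself is simple: a dict with a "subject" key, sent on the IPC with the topic "notify." plus the subject. A hedged Python sketch of composing one (Pupil serializes with msgpack over zmq; json stands in here only so the snippet has no third-party dependency, and the subject string is an example):

```python
import json

def make_notification(subject, **payload):
    # A Pupil notification is a dict with a "subject" key; the zmq topic
    # is derived from the subject. (json is a stand-in for msgpack.)
    notification = {"subject": subject, **payload}
    topic = "notify." + subject
    return topic, json.dumps(notification)

# e.g. asking Capture to start a recording:
topic, body = make_notification("recording.should_start", session_name="demo")
```

The same shape works from Matlab once the msgpack serialization helper is in place.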

user-9d900a 05 December, 2017, 13:01:36

I did. and it works..

user-9d900a 05 December, 2017, 13:02:17

It's just that I'd like to go beyond simply calibrating, starting and stopping the acquisitions. But there doesn't seem to be a way to really change acquisition settings via the standard notification process, is there?

papr 05 December, 2017, 13:03:34

No, there are no camera setting notifications.

papr 05 December, 2017, 13:04:15

Usually you set them manually once and do not need to automate them.

user-9d900a 05 December, 2017, 13:05:04

@papr Hmmm, ok. Would you recommend switching to Python then? What would you say is the development strategy if I eventually want a stand-alone solution (an executable with a UI, perhaps Python-based) that calls Pupil as well as a number of other processes? No choice but to work on the Pupil source code, right?

papr 05 December, 2017, 13:06:31

No, in this case I would use the notification system. But why would you want to duplicate functionality that is already present in the Pupil ui? edit: Clarify question.

papr 05 December, 2017, 13:11:57

What I mean is that in most cases the Pupil api is used to access data for processing and to push trigger events that are stored during a recording.

user-9d900a 05 December, 2017, 13:12:11

@papr Because the solution we have in mind does not have eye tracking as its principal functionality: we have a UI that deals with sound inputs, so we don't want the user to have to manipulate the Pupil Capture UI on top of that. We'd like all the Pupil acquisition processes to be automated and run in the background of our main application.

user-9d900a 05 December, 2017, 13:12:14

What do you say?

user-9d900a 05 December, 2017, 13:12:59

@papr, push trigger events? Like watermarks that you can retrieve in the recording file?

papr 05 December, 2017, 13:20:34

Trigger events in the sense of annotations, e.g. the start of different experiment conditions. I understand. Then I would suggest using Pupil Service and modifying the eye process in such a way that it does not show any windows anymore. Additionally, you will have to implement notifications for the uvc source and all other ui settings that you want to automate. This is a bit of work, but the base infrastructure is there to build upon.

user-9d900a 05 December, 2017, 13:24:23

@papr Ok, let me make sure I have understood (I am quite the newbie). So you suggest I modify Pupil Capture's source code, in particular the code related to the eye processes? And you suggest I implement the mechanisms myself so that the video capture module can receive notifications? What does Pupil Service have to do with this? Thanks for your help @papr, much appreciated!

papr 05 December, 2017, 13:27:37

@user-9d900a Do you use the Pupil Headset or the hmd add-ons?

papr 05 December, 2017, 13:29:42

Pupil Service is a version of Pupil Capture that does not have a world window; it is dedicated to the hmd integrations.

Yes, you will have to modify the eye process in both cases. If you use the headset, you will have to modify Pupil Capture to remove the world window ui.

papr 05 December, 2017, 13:30:54

Pupil Capture is plugin-based. Each plugin receives all notifications. Therefore you just need to extend the set of notifications that the uvc source accepts.

papr 05 December, 2017, 13:32:16

The eye process does not have such a plugin structure. Therefore it is a bit trickier to implement all notifications for it, but I think it is doable after reading its source code.

user-9d900a 05 December, 2017, 13:37:11

@papr Ok, awesome. I currently use the headset. I'll look into how to modify the source code then, starting by trying to install all the dependencies. I actually have the choice to work either on Mac or on Windows. Is the Mac platform easier to work with?

papr 05 December, 2017, 13:39:39

@user-9d900a Installing the Windows dependencies is pure pain. I highly recommend macOS if you have the choice.

user-9d900a 05 December, 2017, 13:44:07

@papr Great! I'll get into that then! Wish me luck πŸ˜‰

papr 05 December, 2017, 13:45:56

Cool! Post your questions here if you need any pointers. πŸ™‚

user-84047a 05 December, 2017, 16:55:34

Hi, thanks for the previous help around accuracy! We are still having some problems with our calibration, as it seems to fall apart halfway through our recordings. A couple of more specific questions: Is there a way to break down the calibration point by point? That is, if there are calibration points recorded that are not suitable or not detected, is there a way to remove them? In some of the data we collected, the online calibration recording is separate from the rest of the recording (i.e. they are two separate recordings). Is there any way to combine these, or will we have to discard our calibration and use natural features on the second video?

user-a36fbf 05 December, 2017, 20:26:57

Does anyone have issues with the Pupil software tracking the pupil?

user-a36fbf 05 December, 2017, 20:27:38

We lose the pupil periodically

user-a36fbf 05 December, 2017, 20:35:04

In algorithm mode, we have an issue with the blue box not recognizing the whole eye

user-a36fbf 05 December, 2017, 20:37:44

Chat image

user-a36fbf 05 December, 2017, 20:40:41

Chat image

papr 05 December, 2017, 20:41:12

A workaround is to reduce the eye camera resolution.

papr 05 December, 2017, 20:41:56

And to increase the Pupil max value.

user-a36fbf 05 December, 2017, 20:44:26

So we reduce the resolution in the software right?

papr 05 December, 2017, 20:45:45

Exactly, in the eye window.

user-a36fbf 05 December, 2017, 20:48:08

Thanks!

user-d60420 06 December, 2017, 14:30:54

Hi! I have 2 questions about a study we are considering about tracking the eyes of someone who is meditating (with open eyes). We will focus particularly on pupil size and blinking, I think. 1) The meditator will meditate with a reduced amount of light. Do you see any issue with the reduced quantity of light? (I will try, but maybe someone has suggestions anyway)

user-d60420 06 December, 2017, 15:24:39

2) We would like to show the user a visual representation of the eye in real time to see if/how this influences the meditation status. Not the image of the eye from the camera, but a visual representation (an image created based on the current pupil size). What is the best way of doing this? I can imagine (a) a separate program accessing data from the IPC backbone https://docs.pupil-labs.com/#the-ipc-backbone or (b) a separate plugin (maybe starting from another plugin that already visualizes pupil data in real time)? Are there other ways? What are their relative advantages and disadvantages? Thanks!

user-29e10a 06 December, 2017, 15:26:31

Hi, short question: Is the pupil hardware, especially the HMD integration, CE certified? πŸ˜ƒ

user-2798d6 06 December, 2017, 15:55:17

@user-2798d6 We do not have a release date yet. If you need the new fixes, I would recommend running Pupil from source. The macOS install instructions are fairly straightforward; there are far fewer possible hiccups than on Windows or Linux.

user-2798d6 06 December, 2017, 15:55:33

I'm sorry to be out of the loop, but what do you mean by running from source?

papr 06 December, 2017, 16:18:21

@user-2798d6 Running from source means installing the dependencies, downloading the source, and executing the program via the command line, without having to create an application bundle.

papr 06 December, 2017, 16:25:34

@user-d60420 1. The eye cameras are sensitive to IR light and carry their own active IR emitters. This means that darkness is not a problem. You will have to adapt some pupil detection parameters, though, since the pupils will be very dilated. 2. I would recommend the IPC for your use case. Just subscribe to the pupil and blink topics. The pupil data includes the diameter, and the blink data tells you about blinks.
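
Subscription on the IPC backbone is plain zmq prefix matching: subscribing to "pupil." delivers per-eye topics like "pupil.0" and "pupil.1". A dependency-free sketch of that filtering behaviour (the helper is hypothetical, and exact topic strings may differ between Pupil versions):

```python
def subscribed(topic, subscriptions):
    # zmq SUB sockets deliver a message when its topic starts with any
    # of the subscribed prefixes.
    return any(topic.startswith(prefix) for prefix in subscriptions)

# e.g. for the use case above: pupil data (diameter) plus blink events
subs = ("pupil.", "blink")
```

With these prefixes, "pupil.0" and "pupil.1" pass the filter while unrelated topics such as "gaze" do not.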

user-2798d6 06 December, 2017, 17:28:32

Hello again - I am having trouble exporting video from Player. I've tried two different computers just to see if it was a processing-speed issue or a specific computer issue, but both are struggling. I am able to export raw data (at least I think I'm getting all of the files) but no video. Do you have any suggestions? It's a fairly long video - about 15 minutes.

user-5d12b0 06 December, 2017, 21:26:51

Hello, I'd like to isolate some performance issues, so I'm trying to build Pupil from source. I got as far as running python setup.py build in pupil\pupil_src\shared_modules\pupil_detectors, but I'm hitting the linker error cannot open file 'boost_python3-vc140-mt-1_65_1.lib'. I have the vc141-mt-1_65_1.lib version of the file but not the vc140 version. Do you happen to know why it is demanding the MSVC 2015 version of the lib instead of the 2017 version?

Is MSVC 2017 required? I can go back and redo all the steps with MSVC 2015. If 2015 works, then it might be possible to skip some steps.

user-5d12b0 06 December, 2017, 21:37:21

Umm... never mind? I had forgotten to copy opencv_world331.dll into pupil_external. It doesn't make sense to me, but after copying the missing OpenCV dll I can now get past that boost lib error and it seems to build successfully. By the way, with the latest boost it is no longer necessary to edit boost\python\detail\config.hpp.

user-5d12b0 06 December, 2017, 21:56:50

One more error before I go home:

D:\Tools\VR\Pupil\pupil\pupil_src>python main.py
MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "D:\Tools\VR\Pupil\pupil\pupil_src\launchables\world.py", line 113, in world
    import pupil_detectors
  File "D:\Tools\VR\Pupil\pupil\pupil_src\shared_modules\pupil_detectors\__init__.py", line 19, in <module>
    from .detector_3d import Detector_3D
ImportError: DLL load failed: The specified module could not be found.

I'll check back in tomorrow.

papr 07 December, 2017, 09:39:42

@user-2798d6 Is this by any chance a recording of a Pupil Mobile stream?

papr 07 December, 2017, 09:43:15

@user-54a6a8 Thanks for the note on the boost config edit. It looks like the Pupil Detector was not built correctly. Can you delete the build directory in pupil_detectors and run python setup.py build within this folder? Maybe this gives some clues about what is going wrong.

user-d60420 07 December, 2017, 11:01:10

@papr thanks!!!

user-5d12b0 07 December, 2017, 15:02:27

@papr, when running python setup.py build in pupil_detectors, it does seem to terminate a little abruptly, but there are no fatal error messages that I can see. The last object it tries to build is EyeModel.cpp. Here are some pieces of the output:

   Creating library build\temp.win-amd64-3.6\Release\detector_3d.cp36-win_amd64.lib and object build\temp.win-amd64-3.6\Release\detector_3d.cp36-win_amd64.exp
Generating code
d:\tools\vr\pupil\ceres-windows\glog\src\logging.cc(2025) : warning C4722: 'google::LogMessageFatal::~LogMessageFatal': destructor never returns, potential memory leak
Finished generating code
EyeModel.obj : warning LNK4049: locally defined symbol [email removed] (public: __cdecl google::LogMessageFatal::LogMessageFatal(char const *,int,struct google::CheckOpString const &)) imported
[... several more warnings like this ...]
EyeModel.obj : warning LNK4049: locally defined symbol [email removed] (public: class std::basic_ostream<char,struct std::char_traits<char> > * __cdecl google::base::CheckOpMessageBuilder::ForVar2(void)) imported

user-5d12b0 07 December, 2017, 15:03:55

And I have a pupil_src\shared_modules\pupil_detectors\build\lib.win-amd64-3.6\pupil_detectors\detector_3d.cp36-win_amd64.pyd

user-0f3eb5 07 December, 2017, 15:22:21

Hi all πŸ˜ƒ I received my pupil headset a week ago and haven't succeeded in getting correct calibration yet, so I'm posting my question here in hope of your help.

I have:

1) Made sure both eye cameras are in focus
2) The world camera is in focus
3) The green circle is about the size of my eyeballs
4) I don't move my head when calibrating
5) The minimum and maximum iris radius are correct...

Here is an example video that I exported in which you can also see both eye recordings: https://drive.google.com/file/d/1CbyoHwjFf5rMaUe-qceMrCq_qLUrqNfM/view?usp=sharing

The eyes are supposed to be on the index-fingernail.

As you can see, the midline is captured perfectly, but everything above and below the midline is not correct.

Do you know of anything that could explain this?

Chat image

user-5d12b0 07 December, 2017, 16:25:02

Are there any docs about what the pupil detection algorithm overlay or debug window is showing? I don't know if this is telling, but my 2D debug window shows up at the top-left edge of my screen without a menu bar and I can't move it around; it's difficult to see from within SteamVR's desktop viewer. From what I can see, there's a circle in the upper left corner and it jitters about, but it mostly stays in the upper left corner.

user-2798d6 09 December, 2017, 01:36:34

@papr - the recording is not from pupil mobile. It's just a regular recording from Capture on a MacBook Air

user-59f06b 09 December, 2017, 06:28:14

Hi all, I've been attempting to install Pupil from source on Ubuntu 16.04. I ran into some really complicated build issues that I could not solve, so I created a Docker container to make everything simpler. I have installed all the dependencies and Pupil builds successfully, but I am receiving this error:

[email removed] python main.py
MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
world - [INFO] launchables.world: Application Version: 1.1.63
world - [INFO] launchables.world: System Info: User: pupil, Platform: Linux, Machine: 04c3d52af60b, Release: 4.4.0-103-generic, Version: #126-Ubuntu SMP Mon Dec 4 16:23:28 UTC 2017
world - [INFO] pupil_detectors.build: Building extension modules...
world - [INFO] calibration_routines.optimization_calibration.build: Building extension modules...
world - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend
world - [INFO] launchables.world: Session setting are from a different version of this app. I will not use those.
libGL error: No matching fbConfigs or visuals found
libGL error: failed to load driver: swrast
world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "/home/pupil/pupil/pupil_src/launchables/world.py", line 299, in world
    main_window = glfw.glfwCreateWindow(width, height, "Pupil Capture - World")
  File "/home/pupil/pupil/pupil_src/shared_modules/glfw.py", line 522, in glfwCreateWindow
    raise Exception("GLFW window failed to create.")
Exception: GLFW window failed to create.
world - [INFO] launchables.world: Process shutting down.

I even tried running with nvidia-docker to see if that would resolve the GLFW window issues. Can anyone offer any insight?

user-59f06b 09 December, 2017, 06:28:18

The command to create my container: sudo nvidia-docker run -itd --runtime=nvidia --privileged -v /etc/localtime:/etc/localtime -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=unix$DISPLAY --device /dev/snd --device /dev/video0 --name pupil nvidia/cuda

user-6e1816 11 December, 2017, 06:51:38

Does anyone know how to save "frame.img" in the events as a picture?

papr 11 December, 2017, 06:54:57

@user-6e1816 https://docs.opencv.org/3.0-beta/modules/imgcodecs/doc/reading_and_writing_images.html#imwrite This should work. frame.img is already a BGR array. So there should be no need for conversion.

user-6e1816 11 December, 2017, 08:10:38

@papr I thought "frame.img" was the picture from the world camera, but the result is None. Could you tell me the meaning of "frame.img" and the variable name of the world camera image?

papr 11 December, 2017, 08:11:54

@user-6e1816 Could you tell us the result of type(frame)?

user-6e1816 11 December, 2017, 08:13:21

@papr <class 'NoneType'>

papr 11 December, 2017, 08:14:45

@user-6e1816 How do you access the frame object? Through the recent_events plugin method? Or do you use pyuvc directly?

user-6e1816 11 December, 2017, 08:15:55

@papr

def recent_events(self, events):
    frame = events.get('frame.img')
    print(type(frame))

papr 11 December, 2017, 08:17:59

In this case, if frame is None there is no new frame. Just wait for the next call of recent_events where frame is not None.
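
That guard pattern can be sketched standalone as follows (plain dicts stand in for Pupil's event and frame objects, and the event key "frame" is an assumption; in the real API the image lives on the frame object):

```python
def handle_recent_events(events, collected):
    # recent_events may fire with no new world frame; tolerate None
    # before touching the image.
    frame = events.get("frame")  # None when no new frame has arrived
    if frame is None:
        return False
    collected.append(frame["img"])
    return True
```

The plugin simply does nothing on frameless calls and processes the image on the next call where a frame is present.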

user-526669 11 December, 2017, 09:23:35

Hi, I received the pupil headset but I still haven't managed to calibrate on macOS. With my MacBook, calibration in full-screen mode doesn't work at all; it only works in the small window. How can I fix this?

user-2798d6 11 December, 2017, 16:53:10

Just in case this got lost: @papr - the recording is not from pupil mobile. It's just a regular recording from Capture on a MacBook Air

user-2798d6 11 December, 2017, 16:53:52

Also, I've transferred files over to a MacBook Pro and tried exporting from there and it doesn't work either.

papr 11 December, 2017, 16:54:55

@user-526669 Could you describe what you mean by doesn't work at all? Does the app freeze or crash? Is there no marker shown? Is the marker shown but not detected?

user-6419ec 11 December, 2017, 18:09:51

Hi everyone. I have a problem with the cameras: Pupil Capture cannot detect eye1 and starts in ghost mode, and I do not know why. Am I doing something wrong, or is there something wrong with my device?

Chat image

user-0f3eb5 11 December, 2017, 18:52:26

Hi @papr, could you please take a look at this? I am really struggling to calibrate my pupil headset. I use the screen calibration.

I have:

1) Made sure both eye cameras are in focus
2) The world camera is in focus
3) The green circle is about the size of my eyeballs
4) I don't move my head when calibrating
5) The minimum and maximum iris radius are correct...

Here is an example video that I exported in which you can also see both eye recordings: https://drive.google.com/file/d/1CbyoHwjFf5rMaUe-qceMrCq_qLUrqNfM/view?usp=sharing

The eyes are supposed to be on the index-fingernail, but it's not doing that. Do you know of anything that could explain this?

user-6e1816 12 December, 2017, 03:15:13

When I save frame.img as a picture, there is no red gaze point in it (first picture). How do I save a picture with the red gaze point (second picture) in real time?

Chat image

user-6e1816 12 December, 2017, 03:15:16

Chat image

wrp 12 December, 2017, 03:19:05

Hi @user-6e1816, frame.img is only the image frame. If you want to visualize the gaze position in the frame, then you will need to write the gaze position into the image array.

wrp 12 December, 2017, 03:19:57

The world view (second screenshot that you posted) shows a gaze position overlaid on the image. This visualization does not manipulate the pixels in the frame, it is an OpenGL point on top of an OpenGL texture.
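
To illustrate "writing the gaze position into the image array", here is a hedged stdlib sketch that burns a square marker into a flat grayscale buffer. In practice you would set BGR values in frame.img with numpy/OpenCV; the bottom-left origin for normalized gaze coordinates is assumed from Pupil's norm_pos convention, and the helper name is invented:

```python
def burn_gaze_marker(img, width, height, norm_x, norm_y, radius=2, value=255):
    # Convert normalized gaze coords (origin bottom-left, as in Pupil's
    # norm_pos) to pixel coords (origin top-left) and paint a small square.
    cx = int(norm_x * width)
    cy = int((1.0 - norm_y) * height)
    for y in range(max(0, cy - radius), min(height, cy + radius + 1)):
        for x in range(max(0, cx - radius), min(width, cx + radius + 1)):
            img[y * width + x] = value
```

Unlike the OpenGL overlay, pixels modified this way survive in the exported video, which is what the vis_* plugins in Pupil Player do.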

user-6e1816 12 December, 2017, 03:21:25

@wrp Thanks, I see.

wrp 12 December, 2017, 03:22:21

@user-6e1816 you can look at Pupil Player to see how this is done. When we export a video with Pupil Player the vis_* plugins can modify the pixels of the frame to render the gaze visualization into the video frame.

user-526669 12 December, 2017, 03:32:51

@papr No markers shown, only a white display.

wrp 12 December, 2017, 04:47:36

@user-0f3eb5 were you using 3d or 2d mode. The pupil detection looks robust (from what I can see qualitatively from the eye videos). Have you tried re-calibrating offline?

wrp 12 December, 2017, 04:47:59

@user-0f3eb5 if you'd like to share a quick sample dataset with eye videos you can email [email removed]

wrp 12 December, 2017, 04:49:09

@user-6419ec this looks like a driver issue. You will likely need to manually uninstall drivers that are under the Imaging Devices category and re-install drivers with the install-drivers .exe within Pupil Capture

wrp 12 December, 2017, 04:51:05

@user-526669 could you let us know some specs of your MacBook and OS version?

wrp 12 December, 2017, 04:53:44

@user-526669 can you share the pupil_capture_settings/capture.log file - you can also email this to [email removed]

wrp 12 December, 2017, 04:57:51

@user-59f06b this seems like a GLFW and OpenGL issue that is affecting Linux, macOS, and Windows

wrp 12 December, 2017, 05:00:04

it looks like there are quite a few fixes that are about to be released for GLFW v3.3 - (apparently v3.3 is 92% complete according to www.glfw.org)

wrp 12 December, 2017, 05:02:34

apt-get install libglfw3-dev will install v3.1.2-3

wrp 12 December, 2017, 05:03:06

@user-59f06b do you have your dockerfile somewhere online - maybe in a Gist so I can take a look?

user-526669 12 December, 2017, 08:10:17

@papr I sent the mail to [email removed] - please check it.

papr 12 December, 2017, 08:11:15

@user-526669 Ok, thank you, I will have a look at it.

user-6419ec 12 December, 2017, 11:20:52

@wrp unfortunately it is not working. The listed imaging devices are only pupil cam1 ID0 and pupil cam1 ID2

Chat image

wrp 12 December, 2017, 11:25:32

Can you check show hidden devices?

wrp 12 December, 2017, 11:27:58

And see if there are any devices installed in imaging devices category

wrp 12 December, 2017, 11:28:21

@user-6419ec ☝

user-7c4790 12 December, 2017, 12:31:03

@wrp there are only pupil cam 1 ID0 and pupil cam 1 ID2 if I enable hidden devices

Chat image

papr 12 December, 2017, 12:38:01

@user-7c4790 They are in the wrong category. Right click and remove the devices from the Bildverarbeitungsgeräte (Imaging Devices) category.

wrp 12 December, 2017, 12:38:08

Delete these please

user-7c4790 12 December, 2017, 12:44:48

I deleted both and they are now listed under libusbK Usb Devices, but not ID1

papr 12 December, 2017, 12:45:10

@user-7c4790 Is ID0 listed twice?

papr 12 December, 2017, 12:45:36

Or is it possible that you own a monocular headset?

user-7c4790 12 December, 2017, 12:47:13

No, they are only listed once, and the binocular setup worked in the past

Chat image

user-3f894d 12 December, 2017, 15:06:28

Hello, I was wondering if Pupil software allows to map the fixations recorded with Pupil directly onto the stimuli that were shown on the screen. I searched the documentation and experimented with the Pupil player but didn't find an option for this

papr 12 December, 2017, 15:17:03

@user-3f894d You can add surface markers to your monitor, define the monitor as a surface and fixations will be mapped in relation to your monitor surface.

papr 12 December, 2017, 15:17:32

Use the (Offline) Surface Tracker plugin to define/detect the surface.

user-3f894d 12 December, 2017, 15:21:40

@papr Thanks. But the fixations will still be mapped to the video from the world camera, right?

papr 12 December, 2017, 15:22:06

Yes, the mapping to the surface happens additionally.

user-3f894d 12 December, 2017, 15:22:38

@papr OK, thanks for help

user-131985 12 December, 2017, 16:19:05

Hi everyone, I am working on a university project which aims at controlling the mouse of a PC with the gaze tracked by the Pupil Labs eye tracking system. We have already implemented the solution provided by Pupil Labs on GitHub, but now want to augment it a little further. Does anyone of you know how to extract the raw gaze data from Pupil Capture? I am talking about the kind of data that is responsible for the mouse movement. We don't really understand which data set from the 'msg' we are supposed to use for our task, and would be delighted if any of you could help us. Thanks in advance and greetings!

user-7c4790 12 December, 2017, 16:29:48

it's me again 🙈 because of the missing pupil ID1: could it be anything concerning the hardware, or am I doing something wrong with the drivers? If it's a bigger thing I can also write an email if that is easier for you (in addition to the pupil hardware we also bought a support contract)

papr 12 December, 2017, 16:40:23

@user-7c4790 Please write an email to info@pupil-labs.com concerning this matter.

papr 12 December, 2017, 16:45:27

@user-131985 The raw gaze data (available by subscribing to gaze instead of surface in line 44) is relative to the world camera. But in your case you need gaze relative to your screen. You define a surface to find the relation between your screen and the world camera. Gaze is automatically mapped onto this surface (gaze_on_srf). Have a look at https://docs.pupil-labs.com/master/#development-overview for an overview about the data structures.
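To make the data-structure part concrete, a minimal sketch of turning gaze-on-surface data into mouse coordinates could look like this. The key names (norm_pos, confidence, on_srf) follow the docs linked above, but the screen size, confidence threshold, and function name are arbitrary example values:

```python
# Sketch: convert a normalized gaze-on-surface datum to screen pixel
# coordinates for mouse control. norm_pos is in [0, 1] with the origin
# at the bottom-left of the surface (per the docs); screen coordinates
# usually have their origin at the top-left, hence the flip in y.

def surf_gaze_to_pixels(gaze_on_srf, screen_w, screen_h, min_confidence=0.6):
    """Return (x, y) pixel coordinates for the best gaze datum, or None."""
    candidates = [g for g in gaze_on_srf
                  if g.get("confidence", 0.0) >= min_confidence
                  and g.get("on_srf", True)]
    if not candidates:
        return None
    best = max(candidates, key=lambda g: g["confidence"])
    nx, ny = best["norm_pos"]
    return int(nx * screen_w), int((1.0 - ny) * screen_h)

if __name__ == "__main__":
    fake = [{"confidence": 0.9, "on_srf": True, "norm_pos": (0.5, 0.5)}]
    print(surf_gaze_to_pixels(fake, 1920, 1080))  # (960, 540)
```

The filtering step matters in practice: low-confidence samples and samples that fall outside the surface would otherwise make the cursor jump.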

user-526669 13 December, 2017, 02:18:53

@papr Hi, how can I fix the calibration problem? how is it going?

user-6e1816 13 December, 2017, 03:19:38

Could I open eye camera 0 and eye camera 1 at the same time, but have the gaze point only move with eye camera 0 (gaze point only mapping with eye0)? I only need eye camera 1 to provide "confidence" info.

papr 13 December, 2017, 06:52:38

@user-526669 You can disable the full screen mode in the calibration settings. Run the calibration in window mode or the manual marker calibration procedure as a workaround. We will be releasing a new version soon that improves the compatibility with Retina displays.

user-526669 13 December, 2017, 11:19:54

@papr But it also didn't work on my connected monitor. I tested on my MacBook and on the connected monitor, but only the small window calibration worked. Is it the same problem, because of the Retina display of the MacBook?

papr 13 December, 2017, 11:24:20

@user-526669 I am not sure. We were not able to reproduce this issue with the upcoming version. Please test it when it is released and report back if the problem is fixed for you.

user-813280 14 December, 2017, 10:15:14

Hello guys, is it possible to use the Pupil SDK with an another camera such as an ordinary webcam instead of the eyewear ones sold on the website? I understand the tracking quality will suffer.

wrp 14 December, 2017, 10:16:07

Hi @user-813280 please see https://docs.pupil-labs.com/master/#diy

wrp 14 December, 2017, 10:16:38

you may also want to message/discuss with @user-41f1bf about alternate eye cameras for DIY setup

user-813280 14 December, 2017, 10:18:40

Thank you. I'm not really sure you can have code in LGPL but not let it be used for commercial purposes, might want to change the license?

wrp 14 December, 2017, 10:19:27

@user-813280 perhaps there is a mis-understanding/mis-communication somewhere. The code can be used in commercial applications according to the LGPL v3 license

user-813280 14 December, 2017, 10:20:15

Maybe there is, I got the impression from this "If you are an individual planning on using Pupil exclusively for noncommercial purposes, and are not afraid of SMD soldering and hacking – then, buy the parts, modify the cameras, and assemble a Pupil DIY headset. "

wrp 14 December, 2017, 10:20:48

@user-813280 hardware != code

wrp 14 December, 2017, 10:23:21

the DIY headset is intended for individuals and non-commercial users

user-813280 14 December, 2017, 10:23:42

I see. This is derailing from my original question, but if the DIY kit uses an ordinary webcam and USB connection, what's stopping people from making their own 3d printed holder for the camera and LEDs and just using that?

user-813280 14 December, 2017, 10:24:02

(for commercial stuff)

wrp 14 December, 2017, 10:24:42

Good point. Personal integrity/ethics.

wrp 14 December, 2017, 10:25:13

Sales from hardware are what keep Pupil Labs afloat and enable Pupil Labs to continue providing free open source software.

wrp 14 December, 2017, 10:26:14

So, yes, one could (ab)use our model. And there may be people out there doing so.

user-813280 14 December, 2017, 10:30:54

Exactly. I don't see much moral issue here to be honest and definitely no legal issue. Perhaps you should consider switching the license for newer versions? I know it sucks but I don't see an alternative solution here. You could also ask for donations and patrons but I don't know if that would help keep the project afloat. There's also some open source software conservation fund that finances such projects but I forgot the name, I'll try to find it.

user-813280 14 December, 2017, 10:32:05

(off topic: do you by any chance have a donation button already I could use?)

wrp 14 December, 2017, 10:32:43

We do not operate via donation and do not have a donation button 😄

wrp 14 December, 2017, 10:32:57

We are pleased with software under LGPL license

user-813280 14 December, 2017, 10:33:25

well I can't afford the eyewear yet but would like to help. Anyway, these are just my thoughts.

wrp 14 December, 2017, 10:34:07

@user-813280 send an email to info@pupil-labs.com and we can discuss further

user-813280 14 December, 2017, 10:34:24

OK

user-813280 14 December, 2017, 10:35:02

Anyone here tried running the SDK on a Tinker Board?

mpk 14 December, 2017, 10:44:39

@user-813280 I guess to add to this: Our hardware is highly specialized and cannot just be sourced elsewhere. Eye tracking depends in equal parts on good hardware and good software.

user-813280 14 December, 2017, 10:45:43

OK

mpk 14 December, 2017, 10:48:50

@user-813280 if you want to build your own DIY kit and you get our frame on Shapeways, a part of that cost goes towards the Pupil project. I guess this is a way to support us, even if you can't afford the specialized hardware we sell in our store.

user-813280 14 December, 2017, 10:56:25

I see. My final project is this: an animatronic robot that looks directly at a person in front of it. There will need to be a camera on its forehead for me to see what it sees, then a Pupil eyewear for me to translate my exact eyeball angles to the animatronic. If I could get Pupil running on a Pi3 or Tinkerboard I could put it inside the head of the animatronic and all I would need would be cable coming out of the animatronic for eyewear, no PC needed. Animatronic is built by me. https://www.youtube.com/watch?v=P6RYITlRgTg Warning, kinda scary.

user-41f1bf 14 December, 2017, 11:46:13

@raiori, one does violate the EULA of Pupil Labs when using Pupil DIY kits for commercial purposes. In commercial contexts, one can also run into legal issues when exposing people to non-regulated IR devices. Pupil software is not exclusive to Pupil cameras; for obvious reasons, that would not be nice. Even so, such exclusiveness would not violate the LGPL, because your freedom to modify the code, access it, share it, and so on, would not be violated.

user-41f1bf 14 December, 2017, 11:58:52

You are free to build your own hardware other than Pupil DIY and use it commercially with Pupil software. It would not violate any terms at all. However, please remember that even though we are a community larger than Pupil Labs, Pupil Labs is the core of the community, and it would not be nice to leave them with empty hands. If you are on a limited budget, you can help in many ways other than money. @wrp already said that ethics is an important issue. I would like to add that we are social beings, and as such, the word can be spread... People will talk about one's work, or lack thereof, on a social project.

user-41f1bf 14 December, 2017, 12:11:37

Pupil Labs is doing a great work by promoting accessibility of eye tracking research, and so, contributing to education in this specialized field. I am a concrete example. Without them, my work would not be possible.

wrp 14 December, 2017, 12:12:12

Thanks for your support and contributions @user-41f1bf

user-33d9bc 14 December, 2017, 15:31:51

Hey all

user-33d9bc 14 December, 2017, 15:31:56

Two quick questions

user-33d9bc 14 December, 2017, 15:32:10

Do pupil glasses fit kids' sizes

user-33d9bc 14 December, 2017, 15:32:15

Ages 7-17

user-33d9bc 14 December, 2017, 15:32:44

2) does it work with glasses

user-41f1bf 14 December, 2017, 15:33:45

@user-33d9bc , I have been testing with contact lens

user-41f1bf 14 December, 2017, 15:33:57

And it works nicely

user-41f1bf 14 December, 2017, 15:36:13

Some glasses are a little bit difficult to fit, but some should have a better fit with new 200hz tiny cameras

user-33d9bc 14 December, 2017, 15:37:07

Thank you for your help. Do you have any idea about the adjustability of the headset

user-41f1bf 14 December, 2017, 15:37:45

They have custom headsets for children

user-33d9bc 14 December, 2017, 15:38:25

So, I have to ask for

user-33d9bc 14 December, 2017, 15:38:34

Special glasses

user-33d9bc 14 December, 2017, 15:38:51

I can’t use general one

user-41f1bf 14 December, 2017, 15:40:37

Do you want a solution for your child?

user-33d9bc 14 December, 2017, 15:41:10

I am running a research that I need to use the headset

user-33d9bc 14 December, 2017, 15:41:17

With ages 7-17

user-33d9bc 14 December, 2017, 15:41:28

And was wondering if the glasses will fit

user-33d9bc 14 December, 2017, 15:41:36

The kids head size

user-41f1bf 14 December, 2017, 15:43:27

I understood you want to use correction glasses and a pupil headset at the same time, sorry about that

user-33d9bc 14 December, 2017, 15:43:40

That is true as well

user-33d9bc 14 December, 2017, 15:43:48

I had twi questions

user-33d9bc 14 December, 2017, 15:43:56

*two

user-41f1bf 14 December, 2017, 15:45:26

Ok.. So, for your first question, you could ask for a custom headset. And about using glasses, you will have better chances with the new tiny camera

user-33d9bc 14 December, 2017, 15:46:20

What is different about the custom headset

user-33d9bc 14 December, 2017, 15:46:26

The frame ?

user-41f1bf 14 December, 2017, 15:47:08

Yes, the frame

user-41f1bf 14 December, 2017, 15:48:29

Each person has a head size. For tiny heads you increase your chances with a custom headset.

user-41f1bf 14 December, 2017, 15:51:46

I would guess that 13-17 would not be a problem with the normal one. The custom one should fit 7-12

user-33d9bc 14 December, 2017, 15:52:51

With diy version of the glasses do you get the same performance/accuracy

user-33d9bc 14 December, 2017, 15:53:03

As the ready to go

user-33d9bc 14 December, 2017, 15:53:07

Glasses

user-41f1bf 14 December, 2017, 15:56:05

DIY requires time to be assembled and is less versatile. Unfortunately I have not compared the two concerning their accuracy/precision.

user-41f1bf 14 December, 2017, 15:58:06

However, I have an intuition. The binocular system is more robust, but that is another story. Chances are that monocular systems are not so different.

user-8a8051 14 December, 2017, 23:55:10

hi all, i have a few questions regarding the pupil mobile system.

user-8a8051 14 December, 2017, 23:57:17

how does the offline calibration work? Do you need to run through the manual marker calibration process at the beginning of the recording and then work through something like the natural features calibration with the video after the fact?

user-8a8051 15 December, 2017, 00:01:20

context: i am looking to use eyetracking on train drivers looking at the different screens/controls/buttons in the cab environment as well as vision through the windscreen.

user-8a8051 15 December, 2017, 00:04:15

also the recordings will probably end up being quite long, around 2 hours end to end for each run. In terms of storage on the phone, the pupil bundle includes a 64GB SD card; how many hours of recording is this likely to hold?

wrp 15 December, 2017, 02:02:52

@user-8a8051 regarding offline calibration you can use the marker or natural features. If you're using the marker, the marker needs to be present in the recording. See videos in this section of the docs within the offline calibration sub-section: https://docs.pupil-labs.com/master/#data-source-plugins

wrp 15 December, 2017, 02:03:55

Markers will be automatically detected in the video with Pupil Player. If you're using natural features, then you will have to click on points in the scene.

wrp 15 December, 2017, 02:05:07

@user-8a8051 re Pupil Mobile - our testing shows that you can achieve 4 hours of continuous recording with the 64GB SD card on Moto Z2 Play (with external battery pack). The limiting factor here is battery.

wrp 15 December, 2017, 02:07:25

@user-33d9bc To add on to @user-41f1bf response. We (Pupil Labs) also make kid sized frames. These are not listed on the website but are indeed available. The kid sized frames should accommodate 6-12 year old kids. After that the regular 'adult' sized headset should work.

wrp 15 December, 2017, 02:10:19

@user-33d9bc with a Pupil Labs monocular system, you get a 120hz or 200hz eye camera. A higher frame rate provides more observations of the eye and therefore leads to more robust pupil detection. The 200hz eye camera has a global shutter, so motion blur artifacts are reduced, which can yield more robust pupil detection especially with fast eye movements.

user-33d9bc 15 December, 2017, 02:13:17

Great, can I purchase one glasses system

user-33d9bc 15 December, 2017, 02:13:22

With the two frames

wrp 15 December, 2017, 02:13:40

@user-813280 If I understand correctly, your animatronic robot only needs to receive gaze positions. The Pupil headset will be connected to a computer and then the Pi3 or other board can connect to WiFi or ethernet and can receive gaze and other data in real-time. (BTW - the video is great - thanks for sharing 😸)

wrp 15 December, 2017, 02:14:41

@user-33d9bc The cabling is integrated into the frames. Therefore what we could offer is two frames each with its own world camera, and then you could swap the eye cameras. (World camera is not designed to be swapped)

user-33d9bc 15 December, 2017, 02:15:53

Do pupil glasses work well

user-33d9bc 15 December, 2017, 02:16:01

With people wearing glasses

wrp 15 December, 2017, 02:17:30

@user-33d9bc Pupil + prescription eye glasses - Many members in our community and developers on our team wear prescription lenses. You can put on the Pupil headset first, then eye glasses on top. You can adjust the eye cameras such that they capture the eye region from below the glasses frames. This may not be an ideal solution, but does work for many people.

user-33d9bc 15 December, 2017, 02:19:56

That is awesome news. Sadly, I have already submitted my grant proposal to the university and got funding for one pupil glasses with the shipping cost. Which one do you advise going with (kids frame or adult), and what could be the added cost for your proposed solution?

wrp 15 December, 2017, 02:22:24

@user-33d9bc I will send you a PM with details so we don't overload the group thread 😄

user-33d9bc 15 December, 2017, 02:22:35

Oh- sorry.

wrp 15 December, 2017, 02:22:43

no problem 😸

user-8a8051 15 December, 2017, 02:54:31

@wrp thanks very much,

wrp 15 December, 2017, 02:58:02

@user-8a8051 you're welcome.

wrp 15 December, 2017, 03:04:06

@user-813280 you may be interested in checking out this project that uses Pupil to communicate with rpi to control prosthetic/robotic hand - with a very nice Readme and demo videos: https://github.com/jesseweisberg/pupil

wrp 15 December, 2017, 03:04:39

One of the videos here: https://youtu.be/KYcfLEvbxSc

user-8a8051 15 December, 2017, 04:30:13

@wrp quick follow up question regarding pupil mobile and offline calibration, do you find much difference between the accuracy of offline vs screen marker or manual marker calibration?

user-a700b3 15 December, 2017, 05:30:31

Hi, I am trying to use the Unity plugin, but it doesn't seem to be recording the video. I am trying to use the demo scene. It looks like it made the folder but never manages to write out a file

wrp 15 December, 2017, 05:32:10

@user-8a8051 accuracy should not be different - in fact you may be able to achieve higher accuracy with post-hoc calibrations in Pupil Player, because you can fine tune pupil detection and set calibration sections for different parts of the dataset if desired.

wrp 15 December, 2017, 05:32:46

@user-a700b3 Please could you migrate your discussion to the 🥽 core-xr channel and ask @user-e04f56 for assistance/feedback.

user-8a8051 15 December, 2017, 05:40:47

@wrp interesting, i had not thought of that, thanks very much.

user-41f1bf 15 December, 2017, 19:48:26

@Fin#6137 I can confirm that post-hoc calibration increases overall data quality. I would recommend two things... If you are using 3d pupil detection, make sure to ask people to roll their eyes during the recording and before calibration. This way you will have the conditions to fit a good 3d eye model and avoid bad models. Sometimes 3d detection does not work, so I would always recommend 9 points for calibration, or more, so you will have some alternatives if anything goes wrong

user-d7c56b 16 December, 2017, 17:13:15

Hi, I’m trying to DIY following the instruction. I’m wondering if I can use a camera equipped with IR-LED as eye camera instead of making one? Such as this one: http://us.dlink.com/products/connect/hd-wi-fi-camera-dcs936l/ If it is possible, that would make customization much easier!

user-2798d6 16 December, 2017, 20:52:23

Hi, is there a way to have multiple calibration points in one video rather than one long section?

user-2798d6 16 December, 2017, 20:57:45

*calibration ranges I mean

user-2798d6 16 December, 2017, 20:58:16

For example from frame 200-700 & also frames 1500-1700

mpk 17 December, 2017, 09:13:48

@user-2798d6 you can make a long section and then use natural features.

user-8a8051 17 December, 2017, 21:44:09

@user-41f1bf thanks for the advice

user-0f3eb5 18 December, 2017, 13:25:37

hi guys, is the pupil docs website offline?

mpk 18 December, 2017, 13:32:13

@user-0f3eb5 not for me. Can you try again?

user-0f3eb5 18 December, 2017, 13:33:36

thanks for checking, it's still not working for me and my internet is fine (other websites work), i'll try again later, thanks

user-0f3eb5 18 December, 2017, 14:04:28

it's working again for me 😃

user-943415 18 December, 2017, 14:08:49

Hi! I have a question about blink data. I have enabled the "Blink Detection" plugin in Capture and recorded a short session. I have then enabled the "Raw data exporter" plugin in Player and exported. However I cannot find any info about blinks in the csv files. Am I correct in saying that the the "Blink Detection" plugin only triggers blink events (as written at https://docs.pupil-labs.com/#blink-detection ) but there is nothing at the moment listening for these events and writing them on a file? Or maybe there is an offline plugin for the player which computes blink onsets and offsets and can export them? If not, how would you suggest getting blink data? Thanks!

papr 18 December, 2017, 14:15:50

@user-943415 You are right that the blink events are currently not exported via the Raw Data Exporter, nor is there an offline blink detection plugin yet. There is an open issue (https://github.com/pupil-labs/pupil/issues/968) on which I am currently working: https://github.com/papr/pupil/tree/offline_blink_detection

Nonetheless, the online blink events are recorded in the pupil_data file. You can extract them manually by unpacking the file with msgpack and accessing the blinks array.

user-943415 18 December, 2017, 14:20:33

Yep, thanks! In fact I searched "blink" in this chat and got only my recent questions about blinking, but then I found that there is a "Relevant" label beside "Recent" in the search bar on the right, and there I found a comment by you saying how to do it with msgpack. Trying now, thanks! 😀

user-943415 18 December, 2017, 14:32:36

Ok, got blinks using load_object from https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/file_methods.py

user-943415 18 December, 2017, 14:32:43

pupil_data = load_object('pupil_data')

user-943415 18 December, 2017, 14:32:52

In [14]: len(pupil_data['blinks'])
Out[14]: 7

Thanks!!!
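For anyone finding this later, the manual extraction can be sketched end to end like this. This is a sketch only: the loading step assumes the third-party msgpack package, and the 'timestamp' field name on blink events is illustrative and may differ between Pupil versions:

```python
# Sketch: read the online blink events out of a recording's pupil_data file.
# load_pupil_data assumes the third-party `msgpack` package is installed;
# the 'timestamp' key on blink events is illustrative and may vary by version.

def load_pupil_data(path):
    """Unpack the msgpack-encoded pupil_data file into a dict."""
    import msgpack  # imported lazily so the helper below needs no extras
    with open(path, "rb") as f:
        return msgpack.unpack(f, raw=False)

def blink_timestamps(pupil_data):
    """Return the timestamps of all recorded blink events, sorted."""
    return sorted(b["timestamp"] for b in pupil_data.get("blinks", []))

if __name__ == "__main__":
    # Fake data standing in for a real recording:
    fake = {"blinks": [{"timestamp": 31.2}, {"timestamp": 12.5}]}
    print(blink_timestamps(fake))  # [12.5, 31.2]
```

Pupil's own load_object in file_methods.py (used above in the chat) wraps essentially the same msgpack call, so either route should yield the same dict.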

user-0f3eb5 18 December, 2017, 14:51:18

@wrp

"@user-0f3eb5 were you using 3d or 2d mode. The pupil detection looks robust (from what I can see qualitatively from the eye videos). Have you tried re-calibrating offline?"

I am using 3d mode. I have gone through all the steps several times with no luck. One thing that immediately catches my attention when I compare my world-cam images to the one in this video is that my recordings seem skewed (almost like a fish-eye lens) while the lines in this video seem perfectly straight: https://www.youtube.com/watch?v=PXo0k7WmGYs

Do you think that's related to the problem? Is there a way to achieve the non-skewed view?

wrp 19 December, 2017, 05:23:07

@user-0f3eb5 this is a very old demo video - was using a different world camera at that time. The current high speed world camera comes with a wide angle lens (100deg approx) as well as a narrow angle lens (60 deg approx).

wrp 19 December, 2017, 05:23:20

We received your email and will follow up there

user-0f3eb5 19 December, 2017, 09:32:09

thanks @wrp 😃

user-f1d926 21 December, 2017, 13:20:17

Hi! I have a question. I'm on Ubuntu 16.04 and Pupil Capture didn't recognize my headset. After some research, I found this page https://groups.google.com/forum/#!topic/pupil-discuss/5mvfSs5841M and added my user to that group. It didn't work, but with permission issues on my mind I changed my user to root and Pupil Capture worked!

user-f1d926 21 December, 2017, 13:20:40

Is there a way to use pupil capture without being root? what should I do??

user-f1d926 21 December, 2017, 13:20:44

did I miss a step?

papr 21 December, 2017, 13:21:18

@user-f1d926 Sometimes it is necessary to reboot after adding the user to the plugdev group.

mpk 21 December, 2017, 13:23:57

adding your user to plugdev has the purpose of not needing sudo.
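For reference, a quick way to check the group membership on Ubuntu might look like this (a sketch; plugdev is the group used in the Pupil setup instructions, and the change only takes effect after logging out and back in):

```shell
# Check whether the current user already belongs to plugdev; if not,
# print the command that would add the membership (needs admin rights,
# and only takes effect after logging out and back in).
if id -nG | tr ' ' '\n' | grep -qx plugdev; then
    echo "already in plugdev"
else
    echo "run: sudo usermod -a -G plugdev $(id -un)  # then re-login"
fi
```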

user-f1d926 21 December, 2017, 13:27:54

Oh! I forgot that I needed to log out for group changes. Thanks, it was that (:

user-58cb0c 21 December, 2017, 18:40:04

I got the pupil software working, now I want to develop. But I get an error. I think there is something wrong with OpenCV.

MainProcess - [INFO] os_utils: Disabled idle sleep.
In file included from detector_2d.cpp:614:
In file included from /usr/local/opt/opencv/include/opencv2/core.hpp:52:
/usr/local/opt/opencv/include/opencv2/core/cvdef.h:428:14: fatal error: 'array' file not found
#include <array>
         ^~~~~~~
1 error generated.
error: command '/usr/bin/clang' failed with exit status 1
world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "/Users/jonathan/Projecten/pupilHR/pupil/pupil_src/launchables/world.py", line 113, in world
    import pupil_detectors
  File "/Users/jonathan/Projecten/pupilHR/pupil/pupil_src/shared_modules/pupil_detectors/__init__.py", line 16, in <module>
    build_cpp_extension()
  File "/Users/jonathan/Projecten/pupilHR/pupil/pupil_src/shared_modules/pupil_detectors/build.py", line 25, in build_cpp_extension
    ret = sp.check_output(build_cmd).decode(sys.stdout.encoding)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/subprocess.py", line 336, in check_output
    **kwargs).stdout
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/subprocess.py", line 418, in run
    output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['/Library/Frameworks/Python.framework/Versions/3.6/bin/python3', 'setup.py', 'install', '--install-lib=/Users/jonathan/Projecten/pupilHR/pupil/pupil_src/shared_modules']' returned non-zero exit status 1.

MainProcess - [INFO] os_utils: Re-enabled idle sleep.

wrp 22 December, 2017, 02:28:34

@user-58cb0c what version of OpenCV are you using?

user-58cb0c 22 December, 2017, 14:57:06

I used brew install opencv3 --with-contrib --with-python3 --with-tbb --without-python

user-58cb0c 22 December, 2017, 14:57:11

to install opencv

user-813280 24 December, 2017, 16:39:37

Is it possible to change the angle of the camera relative to the eye?

user-41f1bf 24 December, 2017, 17:01:49

Yes it is

user-41f1bf 24 December, 2017, 17:03:03

I can confirm that changing the lens angle is also possible. For wide angles, use small ROIs

user-813280 24 December, 2017, 20:09:49

Thank you. Here is what I am doing: glasses that are covered with an IR reflector - invisible to the eye, but looking like a mirror to the IR camera. The glasses are slightly tilted, and the cameras are tilted as well and positioned above the eyebrows. This allows capturing the eye exactly from the center without putting a camera right in front of the eye. I don't know if this will improve accuracy but I don't see why it won't.

wrp 25 December, 2017, 08:45:59

@user-813280 you can use a "hot mirror" without negatively affecting the pupil detection algorithm; the Pupil Labs DK2 add-on does this. Just to note, hot mirror setups can be problematic in natural environments because the environmental light can overpower the IR illuminators and then you will have a degraded image (or no image). Hope this is helpful

user-813280 25 December, 2017, 08:52:39

Got it, thanks. Final question: is it okay if the mirror (glasses) has some curvature to it? Nothing extreme, similar to this. https://cdn6.bigcommerce.com/s-d8bzk61/images/stencil/1280x1280/products/2528/6970/Pyramex_Intruder_Safety_Glasses_with_Clear_Lens__16824.1474037730.jpg?c=2

mpk 25 December, 2017, 08:55:52

@user-813280 a curved mirror is like a lens, if you use a mirror like this I believe you will get a very distorted image.

user-813280 25 December, 2017, 09:19:16

sure, but maybe the software can compensate for that since eyeballs are spheres and it can undistort based on that

user-813280 25 December, 2017, 09:20:21

(like it takes into account perspective/angle of eye/camera)

user-41f1bf 25 December, 2017, 16:22:05

There is no undistortion algorithm for the eye camera.

user-59f06b 26 December, 2017, 02:52:24

Recorded a video using pupil main; having trouble loading the saved data:

player - [INFO] launchables.player: Application Version: 1.1.63
player - [INFO] launchables.player: System Info: User: willem, Platform: Linux, Machine: willem-laptop, Release: 4.10.0-42-generic, Version: #46~16.04.1-Ubuntu SMP Mon Dec 4 15:57:59 UTC 2017
player - [INFO] camera_models: Previously recorded calibration found and loaded!
player - [INFO] launchables.player: Session setting are a different version of this app. I will not use those.
player - [ERROR] launchables.player: Process Player crashed with trace:
Traceback (most recent call last):
  File "/home/willem/side/nystag/pupil/pupil_src/launchables/player.py", line 439, in player
    p.recent_events(events)
  File "/home/willem/side/nystag/pupil/pupil_src/shared_modules/vis_scan_path.py", line 57, in recent_events
    gray_img = frame.gray
  File "/home/willem/side/nystag/pupil/pupil_src/shared_modules/video_capture/file_backend.py", line 81, in gray
    self._gray = np.frombuffer(self._av_frame.planes[0], np.uint8).reshape(self.height,self.width)
ValueError: total size of new array must be unchanged

player - [INFO] launchables.player: Process shutting down.

any ideas?

user-59f06b 26 December, 2017, 02:52:54

I want to use the raw data exporter plugin. Alternatively, if there are any utility functions or ways I can unpickle this data, then I would like to inspect it that way. Edit: use load_object() in file_methods.py

mpk 26 December, 2017, 07:59:46

@user-59f06b can you share the video file? The traceback suggests the file cannot be read. Did you make changes to the code or is this running the current release/master?

user-813280 26 December, 2017, 13:21:53

"There is no undistortion algorithm for the eye camera." Then how do different camera lenses/FOVs and angles relative to the eye work? All these cause optical distortion to the frame.

mpk 26 December, 2017, 21:00:40

@raiori#5658 the distortion introduced by your mirror will be different than when using a lens, in the sense that it is non-symmetrical. This could be a problem. I suggest to just give it a shot.

user-813280 26 December, 2017, 21:12:07

right, but camera perspective (angle) distortion isn't symmetrical either. I can try, but if there's any developer here they can give a real answer. Testing only shows what apparently works, or what doesn't work, but not always the why.

user-53a623 28 December, 2017, 08:11:34

hey guys! Does anyone have experience with pupil kit and HTC VIVE?

wrp 28 December, 2017, 08:35:31

Hi @user-53a623 I responded in the 🥽 core-xr channel

user-5216fd 28 December, 2017, 14:35:39

Hi! I have a problem with the interface: the text for confidence and pupil diameter (see image, top left) shows as black spots. Do you know why? I'm on Ubuntu. Thanks!

Chat image

mpk 28 December, 2017, 14:43:12

@user-5216fd this is a known openGL issue. We are working on it. Thanks for the report!

user-5216fd 28 December, 2017, 14:53:42

ok thanks! I have another one. Just today one eye camera is very strange, very grainy (not sure about the English word...) (see the image). Thanks!

Chat image

user-53a623 28 December, 2017, 14:54:41

have you tried restarting the service?

user-5216fd 28 December, 2017, 15:11:36

I tried restarting Pupil Capture a few times

mpk 28 December, 2017, 15:13:56

@user-5216fd this looks like a hardware issue. What resolution and framerate are you using?

mpk 28 December, 2017, 15:14:14

If this happens at VGA and QVGA we should do a repair replacement.

mpk 28 December, 2017, 15:14:45

you can write to us at info[at]pupil-labs.com

user-5216fd 28 December, 2017, 15:18:24

resolution (1280, 720) / framerate 30

user-5216fd 28 December, 2017, 15:19:18

uhm, these are setting for the egocentric camera. were you referring to the eye cameras? where do I see resolution and framerate?

user-5216fd 28 December, 2017, 15:20:41

found it, of course it was in the relative eye window!

user-5216fd 28 December, 2017, 15:21:01

(320, 240) framerate 120

user-5216fd 28 December, 2017, 15:24:02

writing you an email now

user-ec208e 28 December, 2017, 15:31:13

hi! is there a problem with the Android app? I've downloaded the app on several phones and it doesn't recognize the pupil 😦

mpk 28 December, 2017, 16:09:55

@paolo#0833 in this [email removed] shoot us an email to get this resolved!

mpk 28 December, 2017, 16:12:03

@user-ec208e what phones did you try? What cable are you using to connect the headset to the phones?

user-dcd7f2 28 December, 2017, 17:58:16

@mpk I tried with an LG G5 and a Samsung S7. The cable is a USB-C to USB-C. Thanks!

wrp 29 December, 2017, 03:19:07

Hi @user-dcd7f2 Thanks for the report. Currently Pupil Mobile is confirmed to run well on Moto Z2 Play, Nexus 5x, Nexus 6P, and One Plus 3. While Pupil Mobile + Pupil may work with other devices it depends on 3 main factors (1) USB Controller hardware quality in the Android device (not all USBC controllers are created equally or up to full USBC spec), (2) Android Version (3) USBC Cable.

Please also note that the USBC-USBC cable that comes with the device usually is designed for charging and possibly data transfer. Cables like this one - https://www.amazon.com/CHOETECH-Hi-speed-Devices-Including-ChromeBook/dp/B017W2RWB8/ref=sr_1_4?ie=UTF8&qid=1511416500&sr=8-4&keywords=CHOETECH+USB+C - have been proven to work.

user-5216fd 29 December, 2017, 09:49:38

I tried the Honor 9 once and it was working. I didn't do a lot of tests, just tried it once, but everything seemed OK.

wrp 29 December, 2017, 10:54:48

Thanks for sharing this report @user-5216fd πŸ˜„

user-5216fd 29 December, 2017, 14:48:12

πŸ‘

user-ec208e 29 December, 2017, 18:06:39

Hi @wrp! The cable that I use is the same one you mentioned. I tried with a Galaxy S8 and it worked! But I have another problem: when I record and don't change the video location, it says that it records to "default". Where is "default" located? I've been testing with the cellphone and I couldn't find the recording that I made without setting the location. When I change the location I don't have any problem. Thanks in advance for your response πŸ˜€ !

user-7e5ea3 30 December, 2017, 06:23:43

Hi everyone, I am trying to use a Raspberry Pi Zero with a NoIR camera to make a Pupil headset. I couldn't find the listed IR emitter in stores. Is there anything special about them? Can I safely use a generic 5mm IR LED of the same wavelength?

wrp 31 December, 2017, 00:54:37

@user-ec208e if using local storage the files are saved in /Movies/Pupil Mobile - on a nexus 5x this looks like: /storage/emulated/0/Movies/Pupil Mobile

wrp 31 December, 2017, 00:55:36

To locate the files you can use a file browser app like Amaze (or other file browsing apps)

user-ec208e 31 December, 2017, 14:43:23

Thank you @wrp ! ☺️

End of December archive