💻 software-dev


papr 01 June, 2018, 05:33:13

@user-b91aa6 This is due to the fact that the script has not been updated for Python 3. I will update it today.

user-b91aa6 01 June, 2018, 07:34:28

Thanks @papr

user-b91aa6 01 June, 2018, 07:35:10

Why does the DLL load fail in the pupil detector when running the Pupil Labs source code on Windows?

Chat image

papr 01 June, 2018, 07:36:26

Looks like the pupil_detector was not built correctly/failed building. From the docs:

When starting run_capture.bat, it will build the pupil_detectors module. However, if you are debugging, you may want to try building explicitly. From within `pupil/pupil_src/capture/pupil_detectors`, run `python setup.py build` to build the pupil_detectors.
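
For reference, a minimal sketch (not part of the quoted docs) of one way to verify the build: try importing the compiled extension from the source tree. The path and the Detector_2D class name reflect the 2018 source layout and are assumptions; adjust them to your checkout.

```python
# Hedged check: try to import the compiled pupil_detectors extension.
# Path and class name are assumptions based on the 2018 source layout.
import sys

sys.path.append("pupil/pupil_src/shared_modules")  # adjust to your checkout

try:
    from pupil_detectors import Detector_2D  # compiled Cython extension
    print("pupil_detectors built and importable:", Detector_2D)
except ImportError as err:
    print("pupil_detectors is not built correctly:", err)
```
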
user-b91aa6 01 June, 2018, 07:41:51

Thanks. May I ask what this step means? It doesn't tell us to do anything.

Chat image

user-b91aa6 01 June, 2018, 07:41:57

@papr

papr 01 June, 2018, 07:48:46

It is just a note. The following steps refer to it.

papr 01 June, 2018, 07:49:36

But you are right that this step could use some clarification

user-b91aa6 01 June, 2018, 07:53:01

How do I know whether the pupil detector was built successfully?

user-b91aa6 01 June, 2018, 07:53:21

Chat image

user-b91aa6 01 June, 2018, 07:57:06

I hit the problem again.

Chat image

user-b91aa6 01 June, 2018, 07:58:21

It seems that something is wrong with the instruction.

Chat image

user-b91aa6 01 June, 2018, 07:58:51

We don't have a capture folder, only a shared_modules folder.

Chat image

papr 01 June, 2018, 07:59:40

This is outdated information. I have just submitted a PR to change that

user-b91aa6 01 June, 2018, 08:00:55

I am doing exactly what the instructions say, but the pupil detector still can't be imported.

mpk 01 June, 2018, 08:03:21

@user-b91aa6 please note that running from source on Windows is not easy. We cannot give support in this regard beyond trying to answer specific questions within reason. I recommend that you 1) try installing on Linux (much simpler) or 2) purchase our support package for dedicated video support.

user-b91aa6 01 June, 2018, 08:04:46

Can I run the code using Cygwin or a virtual machine?

mpk 01 June, 2018, 08:04:53

no.

mpk 01 June, 2018, 08:05:16

try a live usb stick.

user-b91aa6 01 June, 2018, 08:06:20

I don't have a second computer to run Pupil on.

mpk 01 June, 2018, 08:08:30

@user-b91aa6 you can run Pupil from the bundle on Windows. You can also install Linux alongside Windows on your machine.

user-b91aa6 04 June, 2018, 15:38:46

I am using C++ to receive data from Pupil. Why is the gaze data obtained in my C++ code always about 37 seconds behind the real-time gaze data shown in the Pupil software?

Chat image

user-b91aa6 04 June, 2018, 15:39:12

@papr

papr 04 June, 2018, 15:46:40

@user-b91aa6 looks like you need to synchronize your clocks

user-b91aa6 04 June, 2018, 15:51:53

Thanks for your reply. But the Python example doesn't do that; it just subscribes to the gaze topic and the gaze data is received correctly. How do I synchronize the clocks? @papr

papr 04 June, 2018, 15:53:16

There should be a section about that in the docs
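
For reference, a minimal sketch of one way to synchronize clocks via Pupil Remote (the REQ/REP port, 50020 by default). The host, port, and the choice of time.monotonic() as the local clock are assumptions; see the time sync section of the docs for the full protocol.

```python
# Minimal clock-sync sketch via Pupil Remote (default port 50020 assumed).
import time
import zmq

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")

# Option A: set Pupil's clock to this client's clock so timestamps match.
remote.send_string("T {}".format(time.monotonic()))
print(remote.recv_string())

# Option B: estimate the offset between the two clocks and correct client-side.
t_before = time.monotonic()
remote.send_string("t")               # ask Pupil for its current time
pupil_time = float(remote.recv_string())
t_after = time.monotonic()
offset = pupil_time - (t_before + t_after) / 2.0
print("clock offset (s):", offset)
```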

user-d9bb5a 05 June, 2018, 11:33:07

Good afternoon. Has anyone worked with saccades in Pupil Labs? How do you calculate their speed and distance?

papr 05 June, 2018, 11:35:39

@user-d9bb5a You will have to do that manually. There is no official plugin yet that detects saccades.

user-d9bb5a 05 June, 2018, 11:37:09

Thank you. And could you tell me how to do this correctly, taking into account our equipment and capabilities?

user-d9bb5a 05 June, 2018, 11:37:55

How do we calculate them correctly with our data and equipment?

papr 05 June, 2018, 11:39:30

Look up a paper on saccade detection, implement the algorithm, and run it on the csv data exported by Player. The details depend on your chosen algorithm.
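
For reference, a minimal sketch of a simple velocity-threshold (I-VT style) detector over the gaze_positions.csv exported by Player. The file path, column names, and threshold value are assumptions to adapt to your export and setup.

```python
# Sketch: velocity-threshold saccade detection on Player's gaze_positions.csv.
# Path, column names, and THRESHOLD are assumptions - adapt to your data.
import numpy as np
import pandas as pd

df = pd.read_csv("exports/000/gaze_positions.csv")  # hypothetical export path
t = df["gaze_timestamp"].to_numpy()
x = df["norm_pos_x"].to_numpy()
y = df["norm_pos_y"].to_numpy()

# Gaze velocity in normalized scene coordinates per second
# (convert to deg/s if you know the scene camera's field of view).
vel = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)

THRESHOLD = 1.5  # assumed velocity threshold in normalized units/s; tune it
is_saccade = vel > THRESHOLD

# Group consecutive above-threshold samples into saccade events.
events, start = [], None
for i, flag in enumerate(is_saccade):
    if flag and start is None:
        start = i
    elif not flag and start is not None:
        events.append((start, i))
        start = None
if start is not None:
    events.append((start, len(is_saccade)))

for a, b in events:
    duration_ms = (t[b] - t[a]) * 1000
    amplitude = np.hypot(x[b] - x[a], y[b] - y[a])  # straight-line distance
    print(f"saccade: {duration_ms:.1f} ms, amplitude {amplitude:.3f} (norm. units)")
```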

user-d9bb5a 05 June, 2018, 11:44:02

Can it be done on Windows?

user-d9bb5a 05 June, 2018, 12:00:50

Oh... but could you show an example? I do not understand how to do it :(

papr 05 June, 2018, 13:34:46

I don't have an example

user-d9bb5a 05 June, 2018, 13:35:40

Thank you very much. I will look for solutions :)

wrp 06 June, 2018, 03:02:19

@user-fcc645 I just installed pyglui 1.22 with pip install pyglui-1.22-cp36-cp36m-win_amd64.whl

wrp 06 June, 2018, 03:02:55

I rebuilt the wheel just in case it makes any difference - and uploaded the rebuilt wheel to https://github.com/pupil-labs/pyglui/releases/tag/v1.22

wrp 06 June, 2018, 03:03:16

please could you try to install again with pip install --upgrade pyglui-1.22-cp36-cp36m-win_amd64.whl

wrp 06 June, 2018, 03:04:02

is it possible that you have multiple versions of Python installed on your system?

wrp 06 June, 2018, 03:04:14

Also, are you using the most recent version of pip? python -m pip install --upgrade pip

wrp 06 June, 2018, 03:04:30

Please check the above points and let me know if you are able to install the wheel

user-fcc645 06 June, 2018, 03:29:17

I did have Python installed previously; I have now deleted all of its folders and will try again.

wrp 06 June, 2018, 03:30:05

How many versions and what versions of Python were installed @user-fcc645 ? Additionally, it may be beneficial to use uninstallers on Windows

user-fcc645 06 June, 2018, 03:35:55

Can you please send me a link to the Python installer that you used for testing, if the executable is different from python-3.6.5-amd64.exe?

user-fcc645 06 June, 2018, 03:38:33

I had Python 2.7 installed.

user-fcc645 06 June, 2018, 03:42:47

Could the difference between amd64 and x64 cause any issue?

user-fcc645 06 June, 2018, 03:44:01

Mine is an x64-based processor.

wrp 06 June, 2018, 03:44:46

Tested on a Windows 10 machine with Python v3.6.1

wrp 06 June, 2018, 03:45:12

the wheel is for a 64bit machine

user-fcc645 06 June, 2018, 03:45:12

ok

wrp 06 June, 2018, 03:45:37

I think the issue here is that pip or python2.7 is causing a problem on your machine

wrp 06 June, 2018, 03:45:40

therefore the mismatch/error

user-fcc645 06 June, 2018, 04:00:45

I have uninstalled everything and will start again

user-fcc645 06 June, 2018, 04:35:57

Is it possible to have Pupil Player analysis (graphs etc.) added to Pupil Capture through a plugin or something?

user-fcc645 06 June, 2018, 04:45:35

The installation issue is solved, thanks for your help.

wrp 06 June, 2018, 04:48:34

@user-fcc645 That's good to hear, thanks for the update. To confirm, this was related to multiple Python versions installed on your machine, correct?

In response to graphs - you could create a plugin that displays graphs in another window.

user-fcc645 06 June, 2018, 05:17:19

Yes, correct, I had multiple Python versions installed.

user-b91aa6 06 June, 2018, 07:23:12

Question about the gaze position: why does the gaze position jitter and seem discontinuous?

user-b91aa6 06 June, 2018, 07:23:17

@papr

papr 06 June, 2018, 07:25:04

Two reasons: 1) pupil detection/gaze estimation error. 2) Eye movement itself is not necessarily continuous.

user-c7a20e 06 June, 2018, 08:02:41

Is it possible to control the MJPEG compression quality? From what I understand, the Pupil cams can only stream MJPEG, as uncompressed is too much for USB 2, and they don't seem to support USB 3. But I think reducing the MJPEG compression ratio would mean the cameras need less processing, and so maybe squeeze one or two more fps out of the latency.

papr 06 June, 2018, 08:07:42

@user-c7a20e At which resolution are you running the cameras?

mpk 06 June, 2018, 08:08:11

@user-c7a20e jpeg compression can not be changed.

user-c7a20e 06 June, 2018, 08:08:46

@mpk you mean the compression ratio cannot be changed?

mpk 06 June, 2018, 08:08:52

correct.

mpk 06 June, 2018, 08:09:44

The limiting factor is the sensor, not the USB bandwidth.

user-c7a20e 06 June, 2018, 08:10:45

What do you mean? The limiting factor for what?

papr 06 June, 2018, 08:11:26

For the amount of incoming fps

papr 06 June, 2018, 08:12:08

You said "pupil cams can only stream mjpeg as uncompressed is too much for USB2". mpk says that this is not true.

user-c7a20e 06 June, 2018, 09:17:42

I see

user-b91aa6 06 June, 2018, 09:40:11

About 3D calibration: it seems that I also need to send two parameters called translation_eye0 and translation_eye1. What do they mean?

Chat image

user-b91aa6 06 June, 2018, 09:40:22

@papr

user-b91aa6 06 June, 2018, 09:40:47

Do we have a Python example for 3D calibration?

papr 06 June, 2018, 09:42:51

I am not sure what value needs to be set there. My guess is that it is the distance between the two VR-headset screens. But I would recommend looking that up in the hmd-eyes project.

papr 06 June, 2018, 09:43:10

I do not have an example for 3d HMD calibration, sorry

user-b91aa6 06 June, 2018, 09:43:50

But I have checked many times and can't find where the 3D calibration is done in the hmd-eyes project code. Can you help me check which file does the 3D calibration? Thank you very much. @papr

user-c7a20e 06 June, 2018, 09:45:07

Sorry for interrupting but why is 3d calibration needed for HMDs?

papr 06 June, 2018, 09:45:21

No, I am sorry, but I do not have the time to dive into the hmd-eyes project.

papr 06 June, 2018, 09:45:39

@user-c7a20e We highly recommend using the 2d HMD calibration.

user-c7a20e 06 June, 2018, 09:45:57

Please explain the difference.

papr 06 June, 2018, 09:47:35

3d calibration requires estimating the geometry between the eye cameras and the scene. This is difficult to do. 2d calibration uses a simple 2d regression and is therefore much simpler to use.

user-c7a20e 06 June, 2018, 11:10:46

So 2d calibration determines eyeball rotation angle relative to its center?

user-c7a20e 06 June, 2018, 11:12:19

And according to your explanation, 3d calibration tries to determine exactly which pixel on the screen in front of it the eye is looking at? I don't understand how you could do that, because any misalignment of the HMD would offset the result.

papr 06 June, 2018, 14:13:27

No. The calibration procedure itself just learns the mapping function between pupil positions (eye camera coordinate system) and gaze positions (scene coordinate system).

papr 06 June, 2018, 14:14:32

But yes, slippage is a common problem in eye tracking.

papr 06 June, 2018, 14:18:24

2d pupil detection tries to detect the pupil and fit a 2d ellipse to it. The 2d mapping is a polynomial function that maps (x-eye0, y-eye0, x-eye1, y-eye1)->(x-scene, y-scene)

papr 06 June, 2018, 14:19:59

3d pupil detection uses a time-series of 2d ellipses to fit a 3d model. The result of this procedure is a 3d pupil vector (relative to the eye cameras). The 3d mapping uses a linear mapping/matrix multiplication to map into the 3d scene space
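
For illustration, a toy sketch of the kind of 2d polynomial regression described above, fitted with least squares. This is not Pupil's actual calibration code; the feature set and polynomial degree are assumptions.

```python
# Toy 2d gaze mapping: polynomial regression from binocular pupil positions
# (x_eye0, y_eye0, x_eye1, y_eye1) to scene positions (x_scene, y_scene).
# Illustration only - not Pupil's actual calibration implementation.
import numpy as np

def poly_features(pupil_xy):
    """pupil_xy: (N, 4) array of (x_eye0, y_eye0, x_eye1, y_eye1)."""
    x0, y0, x1, y1 = pupil_xy.T
    # Assumed feature set: constant, linear, and a few second-order terms.
    return np.column_stack([
        np.ones_like(x0), x0, y0, x1, y1,
        x0 * y0, x1 * y1, x0**2, y0**2, x1**2, y1**2,
    ])

def fit_mapping(pupil_xy, scene_xy):
    """Least-squares fit of polynomial features -> (x_scene, y_scene)."""
    coeffs, *_ = np.linalg.lstsq(poly_features(pupil_xy), scene_xy, rcond=None)
    return coeffs

def map_gaze(coeffs, pupil_xy):
    return poly_features(pupil_xy) @ coeffs

# Usage with made-up calibration samples:
rng = np.random.default_rng(0)
pupil, scene = rng.random((50, 4)), rng.random((50, 2))
coeffs = fit_mapping(pupil, scene)
print(map_gaze(coeffs, pupil[:3]))
```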

user-c7a20e 06 June, 2018, 17:04:48

I don't understand the need for 3d pupil detection, unless it is dynamic and solves slippage.

papr 06 June, 2018, 17:32:09

@user-c7a20e you are exactly right. That is what it does. New models are fitted on the fly as soon as new observations do not fit the old model

user-8779ef 06 June, 2018, 17:32:57

@papr Is it correct to say that the 3D mapping estimates the vector at the center of the pupil that is orthogonal to the pupil's surface? This alone would be a noisy estimate. So, over time, these vectors are used to estimate the center of the globe on which the pupil lies (the eye center).

user-8779ef 06 June, 2018, 17:33:27

...and the benefit is that the additional constraint that the orthogonal vector must also cut through the eye center reduces error.

user-e062ba 07 June, 2018, 03:46:43

Hi, I am looking for blink detection using Matlab LSL

user-e062ba 07 June, 2018, 04:31:27

Found the parameters

user-b91aa6 07 June, 2018, 12:32:13

In a mobile eye tracking headset, how do we perform 3D gaze calibration, since we don't know the 3D position of the reference points? @papr

papr 07 June, 2018, 12:32:35

We assume a specific distance

papr 07 June, 2018, 12:33:31

But during bundle adjustment we allow the optimization to move the points on the z-axis as required

user-b91aa6 07 June, 2018, 12:37:19

I haven't totally understood the role of bundle adjustment. What is the bundle adjustment optimizing? @papr

papr 07 June, 2018, 12:47:18

It tries to estimate the geometrical relation of the eye and world cameras. The result is two mapping functions/matrices that can map 3d vectors from the eye camera space into the world space.
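
For illustration, a tiny sketch of what such an eye-to-world mapping is used for: a rigid transform (rotation plus translation) per eye camera. The numbers below are made up purely for illustration and are not Pupil's calibration output.

```python
# Illustration: applying an eye-camera-to-world rigid transform, the kind of
# result the bundle adjustment produces. All values below are made up.
import numpy as np

R_eye0_to_world = np.array([   # assumed rotation matrix for eye0
    [0.0, 0.0, -1.0],
    [0.0, 1.0,  0.0],
    [1.0, 0.0,  0.0],
])
t_eye0_to_world = np.array([30.0, 10.0, -20.0])  # assumed translation (mm)

def eye_dir_to_world(direction):
    # Directions (e.g. the 3d pupil normal) are only rotated.
    return R_eye0_to_world @ direction

def eye_point_to_world(point):
    # Points (e.g. the eyeball center) are rotated and translated.
    return R_eye0_to_world @ point + t_eye0_to_world

print(eye_dir_to_world(np.array([0.0, 0.0, 1.0])))
print(eye_point_to_world(np.array([0.0, 0.0, 25.0])))
```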

user-b91aa6 07 June, 2018, 13:21:11

Thank you very much for your reply. "During bundle adjustment we allow the optimization to move the points on the z-axis as required": what do the points mean? What do they represent?

user-b91aa6 07 June, 2018, 13:21:20

@papr

papr 07 June, 2018, 13:23:57

The reference points/calibration targets

papr 07 June, 2018, 13:24:35

It's a technical detail that the optimization is allowed to adjust these. Nothing much to be concerned about.

user-b91aa6 07 June, 2018, 13:26:46

What's the objective function for the bundle adjustment optimization?

user-b91aa6 07 June, 2018, 13:26:51

@papr

papr 07 June, 2018, 13:27:22

I don't know this detail

user-b91aa6 08 June, 2018, 08:02:12

Question about 3D calibration: what does the last gaze distance mean in estimating the 3D gaze position?

Chat image

user-b91aa6 08 June, 2018, 10:12:31

@papr

mpk 08 June, 2018, 10:55:21

@user-b91aa6 this is the magnitude of the 3d gaze point.

user-b91aa6 08 June, 2018, 11:57:33

Thank you for your reply. Here is what I don't understand: the 3D gaze point is a point, so why should we care about the last gaze distance? @mpk

papr 08 June, 2018, 12:14:49

Because we cannot estimate depth during monocular mapping. Therefore we reuse the most recent gaze distance

user-b91aa6 08 June, 2018, 12:22:31

So you are assuming that the gaze depth is the same as the last gaze when the mapping has to be monocular, and the depth of the last gaze becomes the depth of the current gaze? @papr

user-b91aa6 08 June, 2018, 12:22:41

Thank you very much.

papr 08 June, 2018, 12:23:25

The last gaze distance is updated as soon as we can do a binocular mapping, yes.
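
For illustration, a simplified sketch of the idea discussed above: scale the monocular gaze direction by the most recently known (binocular) gaze distance to get a 3d gaze point. This is not Pupil's actual implementation; units and values are assumptions.

```python
# Simplified sketch: reuse the last binocular gaze distance for monocular
# mapping. Not Pupil's actual implementation; values are illustrative.
import numpy as np

last_gaze_distance = 500.0  # mm; updated whenever binocular mapping succeeds

def on_binocular_gaze(point_3d):
    """Remember the depth (magnitude) of the latest binocular gaze point."""
    global last_gaze_distance
    last_gaze_distance = np.linalg.norm(point_3d)
    return point_3d

def on_monocular_gaze(direction_world):
    """direction_world: unit gaze direction in scene camera coordinates."""
    return direction_world * last_gaze_distance

on_binocular_gaze(np.array([100.0, 50.0, 480.0]))
print(on_monocular_gaze(np.array([0.0, 0.0, 1.0])))
```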

user-b91aa6 08 June, 2018, 12:25:39

Thanks.

user-b91aa6 08 June, 2018, 13:21:43

Currently, can I get 2D gaze using the 3D eye tracking model? @papr

user-8779ef 08 June, 2018, 14:43:31

@user-b91aa6 , I'm guessing not, but very curious to hear @papr 's response. The 2D marketplace demo seems to switch the 3D mapper to the 2D gaze mapper upon initialization. Switching it back into a 3D gaze mapper produced horribly inaccurate results the last time I tried.

user-8779ef 08 June, 2018, 14:44:08

@user-b91aa6 I don't quite understand why switching the gaze mapper would lead to inaccuracies in the 2D gaze representation - I would have imagined the gaze mappers to be interchangeable.

user-8779ef 08 June, 2018, 14:46:31

However, this is all related to the horrible issue I posted about 14 days ago here: https://github.com/pupil-labs/hmd-eyes/issues/47 : that data representation and gaze mapper methods are conflated. The 3D scene demo pairs the 3D gaze mapper with an unusably noisy representation of gaze in depth, rather than a ray (as in the 2D demo). It seems we lack the ability to pair the 3D gaze mapper with the more useful 2D gaze representation.

user-8779ef 08 June, 2018, 14:47:20

I've been waiting for someone to address or acknowledge this limitation, but it seems HMD Eyes is not receiving much attention right now.

mpk 08 June, 2018, 16:11:19

@user-8779ef the reason that 2d detection and mapping are interlinked is indeed somewhat arbitrary. For mobile eye tracking this combination does make sense, but we have not implemented a very good 3d pipeline for HMDs yet. I have some ideas I'd love to try, but we don't have the time for this right now.

mpk 08 June, 2018, 16:12:28

A small improvement would be using 3d pupil detection and regressing to gaze data via the normals and not the norm positions. This should give some degree of slippage compensation for HMDs while also yielding stable 2d gaze data.

user-b91aa6 08 June, 2018, 17:00:35

@mpk, what are the differences between normals and norm positions?

mpk 08 June, 2018, 17:00:58

One is the 3d direction from the eye model.

mpk 08 June, 2018, 17:01:10

The other is the position of the pupil center in the eye image.

user-b91aa6 08 June, 2018, 17:02:43

Must we provide markers with depth in HMD 3D calibration?

user-b91aa6 08 June, 2018, 17:02:47

@mpk

mpk 08 June, 2018, 17:06:27

Yes.

user-b91aa6 08 June, 2018, 17:08:55

May I ask what the reason is?

user-b91aa6 08 June, 2018, 17:08:59

@mpk

user-e5aab7 08 June, 2018, 17:52:18

I apologize for this naive or broad question, but if I wanted to modify the Pupil source code, what IDE would allow me to modify it most freely (any recommendations? VS?)? Or is the console used? I am trying to use Pupil's eye tracking to get gaze vector data that can be used in Unity or Unreal. I wanted to use a Pi for the eye tracking, but using the entire Pupil GUI is a no-go.

user-8779ef 08 June, 2018, 19:54:27

@mpk Thanks for the response! Keep in mind that the 3D integration market is currently wide open. Tobii is the only competition, and... meh. MEH MEH MEH.

user-8779ef 08 June, 2018, 19:56:11

I have been teaching workshops on eye tracking in 3D (last year at ECEM, this year at VSS). I have been advocating your mobile trackers and telling people that I can't yet advocate the HMD integration. I would truly love to give it my full endorsement.

user-41f1bf 09 June, 2018, 18:03:42

@user-e5aab7 I have been using Sublime Text for simple plugins; PyCharm, VS, and Geany should do the job too.

user-049a7f 11 June, 2018, 14:47:40

What's the purpose of __init__.py in launchables?

mpk 11 June, 2018, 14:57:43

@user-049a7f this makes the files inside importable. It's a Python-specific detail.

user-049a7f 11 June, 2018, 17:00:24

Does all the actual eye tracking occur in eye.py?

mpk 11 June, 2018, 17:51:42

Yes.

user-049a7f 11 June, 2018, 18:23:57

@mpk is there any way to just run eye.py on footage without opening the Pupil GUI? Or would the eye tracking found in eye.py have to be extracted and coded as a stand-alone?

papr 11 June, 2018, 20:18:46

@user-049a7f you will have to change the source code if you want to run the detection stand-alone.

user-24270f 14 June, 2018, 05:05:10

Hi all, new user here. What's the simplest way to start pulling gaze data through C++ or C#? I don't want to use Unity or Python if I can avoid it. I just wish to record x and y positions over the course of a session in the HTC Vive, which I will then use to create my own heatmaps and such.

user-24270f 14 June, 2018, 05:06:05

All of the docs talk about Unity and Python, so I was hoping to be pointed in the right direction for a pure C++ or C# plugin that I can use with Visual Studio.

user-ea779f 14 June, 2018, 08:23:23

Hello, I'm planning to buy a Pupil Labs eye tracking add-on for my HTC Vive. I'm an experienced Unreal Engine developer and it's necessary for me to get it running in Unreal Engine. For my first project I just need to get the pupil dilation. Is there any SDK or plugin that I can use?

papr 14 June, 2018, 08:24:52

@user-ea779f @user-f1eba3 did work on that

user-29e10a 14 June, 2018, 08:47:29

@user-24270f look at the HMDeyes project on the Pupil GitHub... it is a Unity project, but you can see how to connect to the network communication with C#. Just isolate this stuff and you're good to go... you always need to run Pupil Capture (or Service) as an executable, just to be clear on this
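
For reference, a minimal sketch of that network pattern in Python (the same ZMQ calls are available to C++ via cppzmq and to C# via NetMQ): ask Pupil Remote for the subscription port, then subscribe to gaze data. Host and port are assumptions (a local Pupil Capture/Service with default settings).

```python
# Sketch: request the SUB port from Pupil Remote, then subscribe to gaze data.
# Host/port are assumptions (local Capture/Service, default port 50020).
import msgpack
import zmq

ctx = zmq.Context()

remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")
remote.send_string("SUB_PORT")
sub_port = remote.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze")

while True:
    topic, payload = sub.recv_multipart()
    gaze = msgpack.unpackb(payload, raw=False)
    print(topic, gaze["norm_pos"])  # normalized x, y gaze position
```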

user-24270f 14 June, 2018, 08:49:18

👍

user-ea779f 14 June, 2018, 08:53:57

Thanks for your quick reply! So it's basically possible with low to medium effort to get it running in Unreal Engine. Thanks, that helps a lot.

user-f1eba3 14 June, 2018, 13:22:32

@user-ea779f I made a way to receive data synchronously from Pupil. If this data is found in the pupil or gaze topic, then you just have to use some of my code to get that dilation.

user-f1eba3 14 June, 2018, 13:23:27

The plugin also has a calibration mechanism that you would basically want to delete.

user-24270f 15 June, 2018, 03:33:26

So I tried a USB 3.0 extension cable so that I could actually reach the computer, and it wouldn't work. Take it out, use the supplied cable only, and it works. So I bought this thing and the supplied cable is not actually long enough to reach the PC. What gives?

user-24270f 15 June, 2018, 03:34:11

I'm using an HTC Vive, so the Vive HMD connection is USB-C, but the provided cable has the standard USB port end, so it must go back to the PC. Has anyone got it working with a specific brand of USB extension cable?

mpk 15 June, 2018, 04:12:15

@user-24270f I sent an answer in the other chat.

user-24270f 15 June, 2018, 06:45:05

Pupil Capture records and exports a CSV with gaze_positions, which seems to be a good way to get the data. Since I'm using VR, I have no world view to overlay the data on in Player.

Has anyone got a recording of the PC view (of the game you're playing, for example) into Player to replace the world view?

user-24270f 15 June, 2018, 06:45:26

That was my plan: to record gameplay and then overlay the gaze data for post-play analysis. But if I can use the tools provided, all the better.

user-29e10a 15 June, 2018, 07:25:31

@user-24270f I'm interested in replacing the world video with the game video, too – I haven't had time yet, but I have two ideas: creating a fake UVC device which streams from FFMPEG (recording the game view is already implemented in hmd-eyes), or making use of the "frame_publisher" plugin from Pupil; you can send custom data over the network and display it as an overlay in the Pupil Capture window. I'm not sure if this will be recorded as world video, but at least it would be a start.

user-8779ef 15 June, 2018, 12:23:30

@user-29e10a This would be a great addition. I worry a bit about adding real-time image compression to your pipeline, which is already quite busy. However, I think some kind of compression would be necessary, or the real-time recording feature will not be widely used.

user-41b9be 15 June, 2018, 17:19:26

Hello, I am unable to get the pupil service app to run on Mac OS High Sierra. I get the following error:

2018-06-15 20:06:25,707 - service - [ERROR] launchables.service: Process Service crashed with trace:
Traceback (most recent call last):
  File "launchables/service.py", line 149, in service
  File "shared_modules/plugin.py", line 294, in init
  File "shared_modules/plugin.py", line 321, in add
  File "shared_modules/service_ui.py", line 103, in init
socket.gaierror: [Errno 8] nodename nor servname provided, or not known

Has anyone run into this error? Thanks!

papr 15 June, 2018, 17:35:50

@Jorge#3168 hey, which Mac do you use?

user-11dbde 15 June, 2018, 17:39:39

Hi, sorry for the question... but when is it expected that the calibration/tracking will work properly on HoloLens?

user-41b9be 15 June, 2018, 17:41:46

@papr Mac Air from 2015

papr 15 June, 2018, 17:55:51

@user-41b9be which version of Service do you use?

user-41b9be 15 June, 2018, 17:59:57

@papr 1.7.42

user-7e60fc 15 June, 2018, 23:06:19

Hi everyone,

I have a question regarding the IPC backbone of Pupil. I am trying to intercept the raw frame data (sent as a compressed JPEG by the camera over USB, if I understand it correctly) by communicating with the IPC. However, I tried listening to the SUB_PORT and only the notifications, logs, and gaze+eye position data are sent through there. Do you know from which port the raw image data of the world frame is transmitted? Is there anything I can do to grab it manually?

Thank you!

papr 16 June, 2018, 07:18:55

@user-7e60fc you need to turn on the frame publisher plugin

user-7e60fc 16 June, 2018, 18:47:41

@papr and then the frame will be sent to the IPC and I can just subscribe to catch it?

papr 17 June, 2018, 07:38:24

@user-7e60fc correct. The topics start with frame if I remember correctly

user-7e60fc 18 June, 2018, 22:14:28

Hi @papr,

The topic is "frame.world". I was able to retrieve the raw data from the IPC backbone, but I couldn't deserialize it with msgpack. I keep getting an error: "msgpack.exceptions.ExtraData: unpack(b) received extra data". Do you know how to fix this? I use msgpack 0.5.6

mpk 19 June, 2018, 06:23:35

@user-7e60fc check out this example: https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py
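
For reference, a condensed sketch along the lines of that helper script. Frame messages are multipart: the topic, a msgpack dict with metadata, and the raw JPEG buffer as a separate part; unpacking only the second part avoids the ExtraData error mentioned above. Host/port and the metadata field names are assumptions to check against the helper.

```python
# Sketch: receive scene frames published by the Frame Publisher plugin.
# Frames arrive as multipart messages: topic, msgpack metadata, raw JPEG bytes.
import msgpack
import zmq

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")   # assumed local Pupil Remote
remote.send_string("SUB_PORT")
sub_port = remote.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "frame.world")

while True:
    parts = sub.recv_multipart()
    meta = msgpack.unpackb(parts[1], raw=False)   # metadata dict only
    jpeg_buffer = parts[2]                        # compressed image bytes
    print(parts[0].decode(), meta.get("width"), meta.get("height"), len(jpeg_buffer))
```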

user-7e60fc 19 June, 2018, 17:08:36

@mpk Thank you so much!

user-ecbbea 19 June, 2018, 17:47:57

Hey there, I found that in some of our data, our normalized X and Y pupil positions are outside the bounds of [0, 1]. What does this mean? For example, we had a datapoint where one of the dimensions was reading a value of 12

user-8779ef 19 June, 2018, 20:56:01

@user-ecbbea Likely means that the estimated gaze location was beyond the limits of the screen.

user-8779ef 19 June, 2018, 20:56:06

ehr, of the scene camera.

user-8779ef 19 June, 2018, 20:56:19

Anyone here developing / running from source on a mac?

papr 19 June, 2018, 21:01:21

@user-8779ef I do. Why do you try installing stuff with anaconda anyway?

user-3f0708 20 June, 2018, 01:54:49

Good evening, I need help. Has anyone used the mouse_control.py code available on GitHub with the latest version of Pupil?

user-3f0708 20 June, 2018, 02:02:04

Well, I wanted to move the mouse with the movement of the eyes through the mouse_control.py code with the help of the markers, but I can't. I do the calibration process and soon after I run mouse_control.py, but the mouse does not move.

wrp 20 June, 2018, 08:21:36

Hi @user-3f0708 I responded in the 👁 core channel. We can continue the discussion in this channel if desired. Please post only in one channel 😄

user-8779ef 20 June, 2018, 10:19:56

@papr Don't worry - gave up on anaconda very quickly. However, I'm still unable to run....

papr 20 June, 2018, 10:20:21

So you are still having problems with pyav?

user-8779ef 20 June, 2018, 10:20:39

No, now it's this ...

user-8779ef 20 June, 2018, 10:20:41

"ImportError: dlopen(/Users/gjdiaz/PycharmProjects/pupil_pl_2/pupil_src/shared_modules/calibration_routines/optimization_calibration/calibration_methods.cpython-36m-darwin.so, 2): Library not loaded: /usr/local/opt/boost-python/lib/libboost_python3.dylib Referenced from: /Users/gjdiaz/PycharmProjects/pupil_pl_2/pupil_src/shared_modules/calibration_routines/optimization_calibration/calibration_methods.cpython-36m-darwin.so Reason: image not found" I can confirm that boost_python3 and boost have been "untapped."

user-8779ef 20 June, 2018, 10:21:29

( installed via brew )

papr 20 June, 2018, 10:22:04

Are you sure that you want that to be untapped?

user-8779ef 20 June, 2018, 10:22:33

Eh? Maybe I'm using the wrong language. I've installed boost / boost_python3 according to the instructions.

user-8779ef 20 June, 2018, 10:23:13

"brew install boost brew install boost-python3"

user-8779ef 20 June, 2018, 10:23:46

Brew installs are system-wide, right?

user-8779ef 20 June, 2018, 10:23:59

well, python wide

papr 20 June, 2018, 10:24:07

But what do you mean by this?

"I can confirm that boost_python3 and boost have been "untapped.""

As far as I know untapping in brew means not to link the libs in the places where the linker looks for them

user-8779ef 20 June, 2018, 10:24:43

Huh. I said that because, when I install, I believe it says something about untapping. Perhaps it's pouring. I don't know... they try and be too cute 😛

user-8779ef 20 June, 2018, 10:25:35

In any case, the packages are already installed and up to date.

user-8779ef 20 June, 2018, 10:25:53

Is there some possibility of path issues related to brew?

papr 20 June, 2018, 10:27:05

Yes. Because the linker was able to find the lib during compilation but now it does not find it anymore.

papr 20 June, 2018, 10:27:54

Try to delete all .so files in /Users/gjdiaz/PycharmProjects/pupil_pl_2/pupil_src/shared_modules/calibration_routines/optimization_calibration/ and delete the build folder as well. Afterwards start Capture. It should rebuild the module.

user-8779ef 20 June, 2018, 10:34:03

Strange path issue now...

user-8779ef 20 June, 2018, 10:34:24

Console copy/paste:

user-8779ef 20 June, 2018, 10:34:25

gbook:~ gjdiaz$ CFLAGS=-stdlib=libc++ /Users/gjdiaz/anaconda3/envs/Pupil3b/bin/pip install git+https://github.com/pupil-labs/pyndsi
Your PYTHONPATH points to a site-packages dir for Python 3.x but you are running Python 2.x!
PYTHONPATH is currently: "/usr/local/lib/python3.6/site-packages:"
You should unset PYTHONPATH to fix this.
gbook:~ gjdiaz$

papr 20 June, 2018, 10:35:06

You are using the anaconda pip to install ndsi. This will install the package into an isolated anaconda directory.

user-8779ef 20 June, 2018, 10:36:59

Yeesh, didn't notice that. Let me see if that's the issue (I thought I installed pip for Python 3, not conda).

user-8779ef 20 June, 2018, 10:38:43

Ok, compiled with: "MACOSX_DEPLOYMENT_TARGET=10.13 pip3 install git+https://github.com/pupil-labs/pyndsi"

user-8779ef 20 June, 2018, 10:39:40

capture started. Great! Now I'll try player...

papr 20 June, 2018, 10:40:13

If you are not sure which binary you are running, you can always run `which pip3` in the terminal and it will tell you the exact path to the binary that you will use. In this case, replace pip3 with the binary that you are unsure of.

user-8779ef 20 June, 2018, 10:40:30

Thanks.

user-8779ef 20 June, 2018, 10:40:46

So, deleting that "build" folder did work.

user-8779ef 20 June, 2018, 10:40:54

I'm back in the game!

user-8779ef 20 June, 2018, 10:41:03

😃

papr 20 June, 2018, 10:41:09

nice!

user-8779ef 20 June, 2018, 10:41:53

Thanks, papr! That was killing me.

user-b91aa6 27 June, 2018, 07:54:59

How do I fetch eye images from Pupil Service?

mpk 27 June, 2018, 07:55:52

@user-b91aa6 I think this should work if you change it to receive eye images instead: https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py

user-b91aa6 27 June, 2018, 07:59:25

Thanks. I adjusted the camera focus of the left eye, but I can't get an eye image as clear as the right eye's. Any idea how to solve this?

Chat image

mpk 27 June, 2018, 08:07:08

@user-b91aa6 the left eye is not focused. I would recommend rotating the lens until it is further out and then slowly working it inward until it is in focus.

papr 27 June, 2018, 08:07:37

Given that you have a 120Hz headset

mpk 27 June, 2018, 08:16:41

@papr looks like a vive addon.

papr 27 June, 2018, 08:17:25

Correct!

user-b91aa6 27 June, 2018, 08:25:12

When you measure the accuracy in the Vive, how do you measure it, because the accuracy is different in the center area and in the periphery? Is only the accuracy in the center measured?

user-b91aa6 27 June, 2018, 09:44:32

Why is the periphery always blurred for the left eye camera while the center area of the image is sharp?

Chat image

papr 27 June, 2018, 09:50:41

I could imagine that the lens is simply dirty...

user-b91aa6 27 June, 2018, 09:52:59

I cleaned the lens surface using a brush. What's the best way to clean it?

papr 27 June, 2018, 09:56:07

I strongly recommend using a microfiber cloth.

user-b91aa6 29 June, 2018, 09:40:08

Can Pupil Service provide fixations in real time?

wrp 29 June, 2018, 10:51:00

@user-b91aa6 yes, you can subscribe to the fixation classification plugin for realtime fixation data.

papr 29 June, 2018, 10:51:35

You need to explicitly start it though via a notification

wrp 29 June, 2018, 10:52:10

Yes, thanks for that clarification @papr
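
For reference, a hedged sketch of that flow: start the online fixation detector via a notification sent through Pupil Remote, then subscribe to fixation data. The plugin class name passed to start_plugin is an assumption; check the class name in your version's fixation_detector.py.

```python
# Sketch: start the fixation detector via a notification, then subscribe to
# "fixations". The plugin name below is an assumption for your Pupil version.
import msgpack
import zmq

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")   # assumed local Pupil Remote

def notify(payload):
    """Send a notification dict through Pupil Remote."""
    remote.send_string("notify." + payload["subject"], flags=zmq.SNDMORE)
    remote.send(msgpack.packb(payload, use_bin_type=True))
    return remote.recv_string()

notify({"subject": "start_plugin", "name": "Fixation_Detector", "args": {}})

remote.send_string("SUB_PORT")
sub_port = remote.recv_string()
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "fixations")

while True:
    topic, payload = sub.recv_multipart()
    print(topic, msgpack.unpackb(payload, raw=False))
```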

user-b91aa6 29 June, 2018, 11:32:56

Thank you very much

End of June archive