💻 software-dev


user-c5fb8b 02 December, 2019, 08:29:06

Hi @user-65eab1 If you want to receive live data while using Pupil Capture, you will have to connect to the IPC backbone and process the network messages. I recommend reading through the Network API section of the docs to get a better understanding of the involved terminology: https://docs.pupil-labs.com/developer/core/network-api/ Also here's a small example script for how to subscribe to specific messages from Capture: https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_messages.py

user-a7d017 03 December, 2019, 05:06:05

hi all, I'm currently using self.g_pool.timestamps to feed into a function that calculates some statistics based on gaze data. I noticed that the values of the attributes in self.g_pool do not change based on the trim/seek bar selections in Pupil Player. How can I take the trim/seek bar into account?

user-c5fb8b 03 December, 2019, 08:20:00

@user-a7d017 The trim marks are part of the Seek_Control system plugin. You can access them via:

g_pool.seek_control.trim_left
g_pool.seek_control.trim_right
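
For context, a minimal sketch of how those trim marks could be combined with g_pool.timestamps. It assumes trim_left and trim_right are inclusive frame indices into the world timestamp array (verify against the Seek_Control plugin in your Pupil Player version):

```python
import numpy as np

def trimmed_timestamps(timestamps, trim_left, trim_right):
    """Return only the timestamps inside the trim marks.

    Assumes trim_left/trim_right are inclusive frame indices
    into the world timestamp array (an assumption to verify).
    """
    return timestamps[trim_left:trim_right + 1]

# Stand-in for self.g_pool.timestamps: 20 fake frame timestamps
ts = np.arange(0.0, 10.0, 0.5)
print(trimmed_timestamps(ts, 4, 9))  # timestamps of frames 4..9
```

Statistics can then be computed on the trimmed slice instead of the full array.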

user-a7d017 03 December, 2019, 19:03:04

@user-c5fb8b thank you!

user-dae891 11 December, 2019, 01:07:35

Hi all, I got this error when trying to build optimization_calibration with setup.py: python setup.py build

user-dae891 11 December, 2019, 01:08:19

fatal error C1007: unrecognized flag '-Ot' in 'p2' LINK : fatal error LNK1257: code generation failed error: command 'C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\VC\Tools\MSVC\14.13.26128\bin\HostX86\x64\link.exe' failed with exit status 1257

user-dae891 11 December, 2019, 01:09:23

Chat image

user-dae891 11 December, 2019, 01:10:07

I followed all the instructions in "dependencies-windows.md". Does anyone have any idea?

wrp 11 December, 2019, 03:32:25

@user-dae891 quick question: is it an absolute requirement for you to build Pupil from source on Windows? Is there something you need that cannot be done by running the app and using the network API or plugins, for example?

user-c5fb8b 11 December, 2019, 08:41:01

@user-dae891 As @wrp also suggested, I would reconsider whether you really need to run Pupil from source for your use case. If not, please consider just using the provided bundles, as the setup on Windows can be quite frustrating at times. If you need to build from source though: we have experienced this error in the past: https://github.com/pupil-labs/pupil/issues/1208 It seems it was related to an incorrectly chosen boost setup. This should not happen in newer versions anymore, since boost is no longer a dependency. Can you check that you are building the newest version of Pupil and that you are not using boost?

user-b8789e 11 December, 2019, 18:24:37

Hello there, I really need some help or support regarding software that is compatible with Pupil Core for making heatmaps without the use of AprilTags / QR codes. Could you recommend some related AOI software?

user-dae891 11 December, 2019, 18:58:33

@user-c5fb8b I want to extract the fixation point for every 7 seconds from a recorded video. It is time-consuming to do this manually in Pupil Player, which is why I am thinking of modifying the source code. Do you think it's necessary?

papr 11 December, 2019, 19:26:50

@user-dae891 there will be likely more than one fixation for each fixation window. How would you decide which fixation to extract?

user-dae891 11 December, 2019, 19:33:42

@papr I am extracting the fixations within the defined surface

user-dae891 11 December, 2019, 19:34:00

only one surface defined

papr 11 December, 2019, 19:36:20

@user-dae891 ok, but you said that you want to extract one fixation every 7 seconds. Could you elaborate on the details? I think you could simply use the export output of the fixation detector.

user-dae891 11 December, 2019, 19:48:16

@papr Yes, I just noticed I can export all the fixations and then separate them in the output Excel file. However, I cannot do that for the heatmap, right?

user-dae891 11 December, 2019, 19:48:43

Maybe I can generate the heatmap myself based on fixation data.

user-dae891 11 December, 2019, 19:49:34

Additionally, may I ask what is the difference between "fixations_on_surface_Surface 1" and "gaze_positions_on_surface_Surface 1" in the output data?

papr 11 December, 2019, 20:14:18

@user-dae891 a fixation is a group of spatially close gaze data in a given time interval.

papr 11 December, 2019, 20:19:53

"Maybe I can generate heatmap by myself based on fixation data." Yes, you can do that.
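A do-it-yourself surface heatmap can be as simple as a duration-weighted 2D histogram. This is a minimal sketch, assuming the fixations were exported with normalized surface coordinates and durations (the exact column names in fixations_on_surface exports, e.g. norm_pos_x/norm_pos_y/duration, are assumptions to verify against your export file):

```python
import numpy as np

def fixation_heatmap(norm_x, norm_y, durations, bins=(32, 32)):
    """Duration-weighted 2D histogram over the surface.

    norm_x/norm_y are normalized surface coordinates in [0, 1].
    Row 0 of the result corresponds to norm_y == 0.
    """
    heatmap, _, _ = np.histogram2d(
        norm_y, norm_x, bins=bins, range=[[0, 1], [0, 1]],
        weights=durations,
    )
    return heatmap

# Synthetic example: 100 fixations clustered around the surface center
rng = np.random.default_rng(0)
x = np.clip(rng.normal(0.5, 0.1, 100), 0, 1)
y = np.clip(rng.normal(0.5, 0.1, 100), 0, 1)
dur = rng.uniform(0.1, 0.4, 100)  # fixation durations in seconds
hm = fixation_heatmap(x, y, dur)
print(hm.shape)  # (32, 32)
```

The histogram can then be blurred and color-mapped (e.g. with scipy/matplotlib) to get the familiar heatmap look.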

user-dae891 11 December, 2019, 20:20:37

@papr thank you so much !!!

user-dae891 11 December, 2019, 20:20:50

Problem solved!

user-06ae45 11 December, 2019, 21:06:38

Hi! I'm using the realsense D435 with pupil. We are hoping to calibrate to the realsense's IR stream rather than RGB stream so we will have calibration directly on the depth map which has a wider FOV. Ideally, the IR stream would be used as the scene camera for realtime calibration, but if the IR stream could even be saved as an .mp4 along with timestamps as the depth map is, calibration could be done offline. Any chance something like this is in the pipeline? If not, we'll probably take a crack at it ourselves, and any recommendations on how to go about this would be most welcomed. Thanks!

papr 11 December, 2019, 21:26:38

@user-06ae45 if you enable the rectified option, you will get rgb and depth streams such that they correspond to each other. Then you can calibrate in color with the default pipeline, and transfer the gaze position into the depth stream.
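The "transfer" step could look like this minimal sketch, assuming Pupil's norm_pos convention (normalized [0, 1] coordinates with origin at the bottom left) and rectified color/depth frames of equal resolution; the depth lookup line is hypothetical:

```python
def norm_to_pixel(norm_pos, width, height):
    """Map a Pupil norm_pos (range [0, 1], origin bottom-left)
    to integer pixel coordinates (origin top-left)."""
    x = min(int(norm_pos[0] * width), width - 1)
    y = min(int((1.0 - norm_pos[1]) * height), height - 1)
    return x, y

# With rectified 640x480 color and depth streams, the same pixel
# coordinate indexes both images:
x, y = norm_to_pixel((0.5, 0.5), 640, 480)
print(x, y)  # 320 240
# depth_m = depth_frame[y, x] * depth_scale  # hypothetical lookup
```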

user-045b36 12 December, 2019, 09:20:05

Hello! Can anyone show a code example of unpacking the msgpack message gaze.2d.0? I am using C++

user-c5fb8b 12 December, 2019, 09:31:41

@user-045b36 Are you using the msgpack C++ library?

user-045b36 12 December, 2019, 09:37:07

@user-c5fb8b Yes

user-c5fb8b 12 December, 2019, 09:41:51

@user-045b36 Well, I can't exactly tell you how to use this, as we don't use msgpack in C++. We use the Python bindings, but maybe that helps you figure out how to use the C++ version. This is the Python binding that we use: https://github.com/msgpack/msgpack-python We basically just call

msgpack.unpackb(data, raw=False)

where data is the binary data.
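For reference, here is a Python round trip showing the payload layout such a gaze.2d.0 message might have. The field names are illustrative, taken from a typical gaze datum; check the Network API docs for the authoritative schema. The same byte layout is what the msgpack C++ library would have to decode:

```python
import msgpack

# A payload shaped like a typical gaze.2d.0 datum (field names are
# illustrative; see the Network API docs for the full schema):
sent = {
    "topic": "gaze.2d.0.",
    "norm_pos": [0.45, 0.61],
    "confidence": 0.98,
    "timestamp": 1234.5678,
}
packed = msgpack.packb(sent, use_bin_type=True)

# Receiving side, mirroring msgpack.unpackb(data, raw=False):
received = msgpack.unpackb(packed, raw=False)
print(received["confidence"])  # 0.98
```

In C++ the equivalent is unpacking the payload into a map of string keys to msgpack objects.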

user-045b36 12 December, 2019, 09:43:25

@user-c5fb8b thanks!

user-06ae45 12 December, 2019, 19:41:12

@papr Thanks for the quick reply! That makes sense, but in this case the IR stream won't be saved at all, will it? Only the depth stream? If we calibrate to the color video but don't have the IR stream, when we convert the calibration from the RGB to depth in the corresponding coordinates, the eye tracking in the wider field of view area from the IR/depth stream will be poor, since it's outside the area calibrated to in the RGB stream, correct?

user-067553 13 December, 2019, 09:07:17

Hi, I want to detect prolonged eye closures more than 1s. I'm trying it by modifying the blink detector plugin, do you have any suggestion/plugin/instrument to do that?
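The idea described above (label samples with low pupil confidence as "eye closed" and keep only runs longer than 1 s) could be sketched like this; the 0.5 confidence threshold is an arbitrary assumption, not the blink detector's actual parameter:

```python
def closed_intervals(timestamps, confidences, threshold=0.5, min_duration=1.0):
    """Return (start, end) times where confidence stays below
    threshold for at least min_duration seconds."""
    intervals = []
    start = None
    for t, c in zip(timestamps, confidences):
        if c < threshold and start is None:
            start = t  # closure onset
        elif c >= threshold and start is not None:
            if t - start >= min_duration:
                intervals.append((start, t))
            start = None  # closure offset
    if start is not None and timestamps[-1] - start >= min_duration:
        intervals.append((start, timestamps[-1]))
    return intervals

# Synthetic 0.1 s samples: "eyes closed" from t=1.0 until t=2.5
ts = [i / 10 for i in range(40)]
conf = [0.9 if t < 1.0 or t >= 2.5 else 0.1 for t in ts]
print(closed_intervals(ts, conf))  # [(1.0, 2.5)]
```

The real blink detector additionally filters the confidence signal, so the plugin's onset/offset logic is more involved than this outline.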

wrp 13 December, 2019, 09:46:05

@user-067553 your question was answered in the 👁 core channel. In the future, please try to post only in a single channel.

user-067553 13 December, 2019, 10:32:06

Ok @wrp thanks

user-045b36 16 December, 2019, 15:48:21

Hello! How can I find out the resolution of the world camera? Maybe an example in C++?

user-0eae33 17 December, 2019, 10:43:55

I posted this in the 🥽 core-xr channel already and thought maybe this channel might be more fitting for this thread:

Hey guys, currently I am working with the HTC Vive plug-in in the Vizard environment. I already got some data output, which is awesome. Now I'm trying to measure vergence via the data output 'gaze_normal'. In the current data output there is no data for that, even though it is supposed to be there. I guess that I somehow have to collect in 3d mode instead of 2d mode. I find an option for that in the Pupil Capture app, but every time I run the code from Vizard the preferences are reset to 2d mode (same for another calibration mode, btw).

Now, is there a possibility to set the data recording mode to 3d in the vizard code (and change the calibration mode)? Or can I define that somewhere else so I can get the data I want?

Thank you guys for your work btw!

papr 17 December, 2019, 10:47:20

@user-0eae33 It might be that their integration does not support the 3d detection and mapping mode, but I cannot tell for sure. I would recommend contacting their support in this regard.

user-0eae33 17 December, 2019, 10:52:19

I'll try that. Thanks 🙂

user-0eae33 17 December, 2019, 14:56:12

Hey, when I try to run the Pupil Remote example (https://github.com/pupil-labs/pupil-helpers/blob/master/python/pupil_remote_control.py), my program crashes upon connecting to the port, without any error messages.

Does anybody have an idea what this could be about?

Is the example I talk about not supposed to be run by itself?

user-c5fb8b 17 December, 2019, 14:59:56

Hi @user-0eae33 you will have to start Capture somewhere. This example script assumes Capture is running on the same computer; if not, you will have to specify the IP address in line 33 of the example. Note that this example script just runs a couple of example commands and then shuts down anyway; it's not a long-running application.

user-0eae33 17 December, 2019, 15:03:10

Hey @user-c5fb8b thank you for your quick response! Capture is actually running in the background. I tried restarting it and starting Service instead. Nothing has worked so far.

Thank you for the extra hint. I'm aware of that, but at the beginning I just want to run a simple example to see if it works 😅

user-c5fb8b 17 December, 2019, 15:10:45

@user-0eae33 Can you check that the Pupil Remote plugin is enabled? In Pupil Capture

user-0eae33 17 December, 2019, 15:12:45

@user-c5fb8b Yeah it is.

user-c5fb8b 17 December, 2019, 15:13:58

@user-0eae33 Is the port 50020 in the Pupil Remote plugin? If that port is already in use, it will choose a different one.

user-0eae33 17 December, 2019, 15:15:29

Yes, the Pupil Remote setting in the Capture app says 50020, same as in the code.

user-c5fb8b 17 December, 2019, 15:16:48

Ok. So I know there are certain scenarios where connections to Pupil Remote can fail. Maybe @papr can provide more information on how to investigate this?

user-0eae33 17 December, 2019, 15:17:41

Okay, Thank you guys for dealing with this 🙃

user-c5fb8b 17 December, 2019, 15:19:18

I remember there was a user with some very restrictive antivirus setup which blocked network connection from Pupil Remote.

user-0eae33 17 December, 2019, 15:22:57

I'll check up on that, but I think there is no special software installed.

papr 17 December, 2019, 15:23:29

"a user with some very restrictive antivirus setup which blocked network connection from Pupil Remote." That was Pupil Invisible, and specifically the zyre protocol. This is different from a connection to Pupil Remote. Nonetheless, a restrictive network configuration might be at fault here.

user-0eae33 17 December, 2019, 15:24:39

I already tried turning off the firewall. But it didn't change anything.

papr 17 December, 2019, 15:26:58

@user-0eae33 I just read your initial comment. A connection issue would result in the script waiting indefinitely for the connection. So the python script would still be running. If you say crash, I assume that the python script is terminated. Can you confirm this?

user-0eae33 17 December, 2019, 15:27:40

Yes, I can confirm that.

papr 17 December, 2019, 15:29:04

Then this is not a network issue but a problem with your Python setup.

papr 17 December, 2019, 15:29:39

Can you let us know what operating system and Python version you are using? How did you install pyzmq?

user-0eae33 17 December, 2019, 15:31:30

Operating system: Windows 10 Python version: 2.7

I'm working in the Vizard environment; there is an option to install packages ('pip install pyzmq')

user-0eae33 17 December, 2019, 15:31:41

Vizard 6

papr 17 December, 2019, 15:32:20

@user-0eae33 Does Vizard require Python 2.7? We usually recommend using Python 3.5 or higher.

user-0eae33 17 December, 2019, 15:35:14

Okay, I'll have a look if I can upgrade to Python 3.5

user-c5fb8b 17 December, 2019, 15:36:05

(as a sidenote: Python 2 will be officially deprecated on 1.1.2020: https://pythonclock.org/ )

user-0eae33 17 December, 2019, 15:37:51

I see 😅 I need that stuff for research. And there we always take what has proven reliable on the worn out devices we have access to.

user-c5fb8b 17 December, 2019, 15:39:31

(the official (!) recommendation to switch to python 3 and not use python 2 anymore was announced in 2008)

papr 17 December, 2019, 15:41:25

Most software is much more stable in Python 3 by now because everybody is too lazy to test against Python 2.7 😄

papr 17 December, 2019, 15:43:33

That you do not get an output at all is very weird though.

user-0eae33 17 December, 2019, 15:44:27

Alright. It looks like it should be no problem to use Python 3. I'll try again as soon as I've installed it. If it still refuses to work then, I'll come back here. For now, thank you so much for your time, guys!

papr 17 December, 2019, 15:46:32

@user-0eae33 I would recommend uninstalling Python 2.7 first, btw, because the executable names might conflict and result in a setup where you accidentally install modules for Python 2.7 and cannot find them when running Python 3.

user-0eae33 17 December, 2019, 15:48:55

I'm not sure if that will work, as there are still a few current projects running on 2.7 that I don't attend to. But anyway, thank you for your advice, I will keep that in mind.

user-045b36 17 December, 2019, 16:15:16

Hello! I turn on the recording and start to receive data. How can I find out the resolution of the world camera while recording?

user-c5fb8b 17 December, 2019, 16:18:39

Hi @user-045b36 you can see the resolution in the UVC Source tab on the right (the icon with the photo camera).

user-c5fb8b 17 December, 2019, 16:19:26

Or do you want to receive the resolution via network?

user-c5fb8b 17 December, 2019, 16:22:16

In that case you would have to enable the Frame Publisher plugin. It publishes the entire world video stream via the network under the topic frame.world. Each message contains the keys width and height for the frame resolution.

user-045b36 18 December, 2019, 08:00:43

@user-c5fb8b I am building a custom application in C++. All interaction with pupil_capture goes through a zmq socket. What should I send, or where should I connect, in order to receive frame.world?

papr 18 December, 2019, 08:31:18

@user-045b36 Hey, unfortunately, we only have an example in Python for this: https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py

But the msgpack+zmq apis are very similar across the different programming languages. I am sure that you will be able to reconstruct this example in c++ 🙂
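To illustrate the wire format without a running Capture instance, here is a self-contained Python simulation of the Frame Publisher's multipart message (topic frame, msgpack payload frame, raw image bytes). Against a real Capture you would first request the SUB_PORT from Pupil Remote on port 50020 and connect your SUB socket there; the payload fields below follow frame_publisher.py but should be verified against your version:

```python
import time

import msgpack
import zmq

ctx = zmq.Context.instance()

# Stand-in for Capture's Frame Publisher:
pub = ctx.socket(zmq.PUB)
port = pub.bind_to_random_port("tcp://127.0.0.1")

# Your application's side: subscribe to the frame.world prefix,
# exactly as you would on Capture's SUB port.
sub = ctx.socket(zmq.SUB)
sub.RCVTIMEO = 5000  # fail instead of blocking forever
sub.connect(f"tcp://127.0.0.1:{port}")
sub.subscribe("frame.world")

time.sleep(0.2)  # give the subscription time to propagate

# A message shaped like the Frame Publisher's output; the raw
# frame here is fake data (a 2x2 all-black BGR image):
payload = {"topic": "frame.world", "width": 2, "height": 2,
           "index": 0, "timestamp": 0.0, "format": "bgr"}
raw = bytes(2 * 2 * 3)
pub.send_multipart(
    [b"frame.world", msgpack.packb(payload, use_bin_type=True), raw]
)

topic, packed, image_bytes = sub.recv_multipart()
frame = msgpack.unpackb(packed, raw=False)
print(topic.decode(), frame["width"], frame["height"], len(image_bytes))
```

The C++ side is structurally identical: recv the three zmq frames, msgpack-decode the second, and treat the third as the raw image buffer.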

user-045b36 18 December, 2019, 09:22:21

@papr Thanks!

user-0eae33 18 December, 2019, 10:44:49

@papr Just a short update on my case: as Vizard is apparently supposed to run on Python 2.7 and my supervisor was generally not happy with installing Python 3, we tried installing an older version of the pyzmq package. That worked. I guess the newer Python version would have done it as well. Again, thanks for your help!

user-045b36 20 December, 2019, 13:55:18

@papr Hello! I looked at the Python example and have a question: what kind of data is [raw_data], and in what format? (recv_world_video_frames.py, line 58: payload['raw_data'] = extra_frames)

user-045b36 26 December, 2019, 08:50:16

Hello! In what data format are frame.world and frame.eye sent?

user-c5fb8b 30 December, 2019, 08:02:22

@user-045b36 regarding raw_data: you can find information on this in the comments in the example file recv_world_video_frames.py. Please make sure to have a look at the whole file:

def recv_from_sub():
    '''Recv a message with topic, payload.
    Topic is a utf-8 encoded string. Returned as unicode object.
    Payload is a msgpack serialized dict. Returned as a python dict.
    Any additional message frames will be added as a list
    in the payload dict with key: '__raw_data__' .
    '''
# ...

Regarding the structure of the frame.world topic. Here you can see the code from Pupil that is responsible for creating the message: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/frame_publisher.py#L80-L90

In the example the blob data is just a numpy buffer. You can see how to unpack this into an actual image at the bottom of the example: https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py#L69
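That unpacking step amounts to a single numpy reshape. A minimal sketch, assuming a bgr-format frame whose raw bytes arrived as the first extra message frame:

```python
import numpy as np

def bgr_buffer_to_image(raw_bytes, width, height):
    """Reinterpret raw frame bytes as a height x width x 3 BGR image,
    mirroring the approach at the bottom of recv_world_video_frames.py."""
    return np.frombuffer(raw_bytes, dtype=np.uint8).reshape(height, width, 3)

# Fake 4x3 frame's worth of bytes in place of the raw-data message frame:
raw = bytes(range(4 * 3 * 3))
img = bgr_buffer_to_image(raw, width=4, height=3)
print(img.shape)  # (3, 4, 3)
```

The width/height values come from the msgpack payload of the same message.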

user-f8c051 30 December, 2019, 09:51:46

Hey awesome pupil team! We chatted a few weeks ago with @papr about pyuvc.

user-f8c051 30 December, 2019, 09:53:37

I had trouble making it work with a microscope I'm using for my research at UTokyo. But a few days ago we got a firmware update from the scope manufacturer and now it plays quite well with your pyuvc library 🙂

user-f8c051 30 December, 2019, 09:55:27

However, there is still one mystery: some UVC properties seem to reset after each

user-f8c051 30 December, 2019, 09:56:42

cap.get_frame_robust()

user-f8c051 30 December, 2019, 09:57:17

so I need to re-apply them every frame

user-f8c051 30 December, 2019, 09:57:49

to be more specific this is the case for the White Balance temperature

user-f8c051 30 December, 2019, 09:57:53

property

user-f8c051 30 December, 2019, 09:58:47

and also the Absolute Focus takes effect only after I capture at least one frame

user-c5fb8b 30 December, 2019, 10:10:15

@user-f8c051 Hi, unfortunately @papr is currently on vacation. I'll see if someone else on the team can answer your question. We'll get back to you as soon as we have an answer!

user-f8c051 30 December, 2019, 10:10:42

There is no rush! happy Holidays!!

user-f8c051 30 December, 2019, 10:11:22

we are also off here, was just curious to check if the library works after the firmware update.. will ping you guys in a week or so!

user-f8c051 30 December, 2019, 10:11:25

Thanks!!!

user-c5fb8b 30 December, 2019, 10:11:58

Ok perfect! Happy Holidays for you as well!

End of December archive