Hi @user-65eab1 If you want to receive live data while using Pupil Capture, you will have to connect to the IPC backbone and process the network messages. I recommend reading through the Network API section of the docs to get a better understanding of the involved terminology: https://docs.pupil-labs.com/developer/core/network-api/ Also here's a small example script for how to subscribe to specific messages from Capture: https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_messages.py
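A minimal Python sketch of that subscription flow (assuming Capture runs locally with Pupil Remote on its default port 50020; the topic and payload keys follow the docs linked above):

```python
# Sketch: subscribe to gaze messages from Pupil Capture via the IPC backbone.
# Assumption: Capture runs locally with the Pupil Remote plugin on port 50020.
import zmq
import msgpack


def connect_sub(host="127.0.0.1", req_port=50020, topic="gaze"):
    """Ask Pupil Remote for the SUB port, then subscribe to a topic."""
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect(f"tcp://{host}:{req_port}")
    req.send_string("SUB_PORT")  # Pupil Remote replies with the SUB port
    sub_port = req.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{host}:{sub_port}")
    sub.subscribe(topic)
    return sub


def recv_payload(sub):
    """Receive one (topic, payload) pair; the payload is a msgpack-encoded dict."""
    topic, raw = sub.recv_multipart()
    return topic.decode(), msgpack.unpackb(raw, raw=False)


# Usage (requires a running Capture instance):
# sub = connect_sub()
# topic, payload = recv_payload(sub)
# print(topic, payload.get("norm_pos"), payload.get("confidence"))
```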
Hi all, I'm currently using self.g_pool.timestamps to feed a function that calculates some statistics based on gaze data. I noticed that the values of the attributes in self.g_pool do not change based on the trim/seek bar selections in Pupil Player. How can I make use of the trim/seek bar selections?
@user-a7d017 The trim marks are part of the Seek_Control system plugin. You can access them via g_pool.seek_control.trim_left and g_pool.seek_control.trim_right.
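If you need the trim marks as timestamps, here is a small sketch (assuming trim_left/trim_right are inclusive frame indices into g_pool.timestamps; please verify against your Player version):

```python
# Sketch: restrict a timestamp sequence to the user's trim selection in Pupil Player.
# Assumption: trim_left/trim_right are frame indices into g_pool.timestamps,
# inclusive on both ends. Verify against your Pupil Player version.


def trimmed_timestamps(timestamps, trim_left, trim_right):
    """Return only the timestamps inside the trim-mark selection."""
    return timestamps[trim_left:trim_right + 1]


# Inside a plugin you might call (hypothetical usage):
# ts = trimmed_timestamps(self.g_pool.timestamps,
#                         self.g_pool.seek_control.trim_left,
#                         self.g_pool.seek_control.trim_right)
```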
@user-c5fb8b thank you!
Hi all, I got this error when I try to build optimization_calibration with python setup.py build:
fatal error C1007: unrecognized flag '-Ot' in 'p2' LINK : fatal error LNK1257: code generation failed error: command 'C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\VC\Tools\MSVC\14.13.26128\bin\HostX86\x64\link.exe' failed with exit status 1257
I followed all the instructions in "dependencies-windows.md"; does anyone have any idea?
@user-dae891 quick question, is it an absolute requirement for you to build Pupil from source on Windows? Is there something you need that cannot be done by running the app and using the network API or plugins, for example?
@user-dae891 As @wrp also suggested I would reconsider if you really need to run Pupil from source for your use case. If not, please consider just using the provided bundles, as the setup on Windows can be quite frustrating at times. If you need to build from source though: we have experienced this error in the past: https://github.com/pupil-labs/pupil/issues/1208 It seems it was related to some incorrectly chosen boost setup. This should not happen in newer versions anymore, since we don't have boost as a dependency anymore... can you check that you are building the newest version of Pupil and that you are not using boost?
Hello there, I really need some help finding software that is compatible with Pupil Core for making heatmaps without using AprilTags / QR codes. Could you recommend any AOI-related software?
@user-c5fb8b I want to extract the fixation point for every 7 seconds from a recorded video. It is time consuming to do this manually in Pupil Player, which is why I am thinking of modifying the source code. Do you think that's necessary?
@user-dae891 there will likely be more than one fixation in each 7-second window. How would you decide which fixation to extract?
@papr I want to extract the fixations within the defined surface
only one surface is defined
@user-dae891 ok, but you said that you want to extract one fixation every 7 seconds. Could you elaborate on the details? I think you could simply use the export output of the fixation detector.
@papr Yes, I just noticed I can export all the fixations and then separate them in the output file. However, I cannot do that for the heatmap, right?
Maybe I can generate heatmap by myself based on fixation data.
Additionally, may I ask what the difference is between "fixations_on_surface_Surface 1" and "gaze_positions_on_surface_Surface 1" in the output data?
@user-dae891 a fixation is a group of spatially close gaze data in a given time interval.
> Maybe I can generate heatmap by myself based on fixation data.
Yes, you can do that.
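A rough sketch of how one could bin fixation positions into a heatmap (assuming normalized surface coordinates in [0, 1], e.g. the norm_pos columns of the surface export; column names may differ between Pupil versions):

```python
# Sketch: build a simple fixation heatmap from normalized surface coordinates.
# Assumption: fixations are given as (x, y) pairs in [0, 1], e.g. taken from
# the norm_pos columns of the fixations_on_surface export.
import numpy as np


def fixation_heatmap(xs, ys, bins=32):
    """2D histogram of fixation positions over the unit square."""
    hist, _, _ = np.histogram2d(xs, ys, bins=bins, range=[[0, 1], [0, 1]])
    return hist


# Illustrative data: two clusters and one outlier.
xs = [0.1, 0.12, 0.5, 0.5, 0.9]
ys = [0.2, 0.18, 0.5, 0.52, 0.9]
hm = fixation_heatmap(xs, ys, bins=4)
```

You could then render hm with any plotting library, optionally smoothing it with a Gaussian filter first.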
@papr thank you so much !!!
Problem solved!
Hi! I'm using the realsense D435 with pupil. We are hoping to calibrate to the realsense's IR stream rather than RGB stream so we will have calibration directly on the depth map which has a wider FOV. Ideally, the IR stream would be used as the scene camera for realtime calibration, but if the IR stream could even be saved as an .mp4 along with timestamps as the depth map is, calibration could be done offline. Any chance something like this is in the pipeline? If not, we'll probably take a crack at it ourselves, and any recommendations on how to go about this would be most welcomed. Thanks!
@user-06ae45 if you enable the rectified option, you will get rgb and depth streams such that they correspond to each other. Then you can calibrate in color with the default pipeline, and transfer the gaze position into the depth stream.
Hello! Can anyone show a code example of unpacking the msgpack message gaze.2d.0? I am using C++.
@user-045b36 Are you using the msgpack C++ library?
@user-c5fb8b Yes
@user-045b36 Well, I can't exactly tell you how to use this, as we don't use msgpack in C++. We are using the Python bindings, but maybe that helps you figure out how to use the C++ version. This is the Python binding that we use: https://github.com/msgpack/msgpack-python We basically just call msgpack.unpackb(data, raw=False), where data is the binary data.
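For reference, a small Python round-trip showing the packb/unpackb pair (the payload keys here are illustrative, not the full gaze schema):

```python
# Sketch: pack/unpack a gaze-like message with msgpack-python, mirroring what
# Pupil does internally. The payload keys are examples, not the full schema.
import msgpack

# A gaze datum roughly like what arrives under the topic "gaze.2d.0"
payload = {"topic": "gaze.2d.0", "norm_pos": [0.45, 0.61], "confidence": 0.98}

data = msgpack.packb(payload, use_bin_type=True)  # serialize (what Capture does)
decoded = msgpack.unpackb(data, raw=False)        # deserialize (what you do)
```

The C++ msgpack library exposes the same pack/unpack pair; the wire format is identical across languages.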
@user-c5fb8b thanks!
@papr Thanks for the quick reply! That makes sense, but in this case the IR stream won't be saved at all, will it? Only the depth stream? If we calibrate to the color video but don't have the IR stream, when we convert the calibration from the RGB to depth in the corresponding coordinates, the eye tracking in the wider field of view area from the IR/depth stream will be poor, since it's outside the area calibrated to in the RGB stream, correct?
Hi, I want to detect prolonged eye closures of more than 1 s. I'm trying to do this by modifying the blink detector plugin. Do you have any suggestions/plugins/instruments for that?
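One possible starting point, as a sketch: treat prolonged closures as runs of low 2D pupil confidence (this mirrors the blink detector's underlying idea; the thresholds below are illustrative assumptions, not Pupil defaults):

```python
# Sketch: find prolonged eye closures as runs of low pupil confidence.
# Assumption (mirroring the blink detector's idea): a closed eye shows up as
# 2D pupil confidence dropping near zero. Thresholds are illustrative only.


def find_closures(timestamps, confidences, conf_thresh=0.1, min_duration=1.0):
    """Return (start_ts, end_ts) pairs where confidence stays below
    conf_thresh for at least min_duration seconds."""
    closures = []
    start = None
    for ts, conf in zip(timestamps, confidences):
        if conf < conf_thresh:
            if start is None:
                start = ts  # closure candidate begins
        else:
            if start is not None and ts - start >= min_duration:
                closures.append((start, ts))
            start = None
    # handle a closure that runs until the end of the data
    if start is not None and timestamps[-1] - start >= min_duration:
        closures.append((start, timestamps[-1]))
    return closures
```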
@user-067553 your question was answered in 👁 core channel. In the future please try to post only in a single channel.
Ok @wrp thanks
Hello! How do I find out the resolution of the world camera? Maybe an example in C++?
I posted this in the 🥽 core-xr channel already and thought maybe this channel might be more fitting for this thread:
Hey guys, currently I am working with the HTC Vive plugin in the Vizard environment. I already got some data output, which is awesome. Now I'm trying to measure vergence via the data output gaze_normal. In the current data output there is no data for that, even though it is supposed to be there. I guess that I somehow have to collect in the 3d mode instead of the 2d mode. I found an option for that in the Pupil Capture app, but every time I run the code from Vizard the preferences are set back to the 2d mode (same for another calibration mode, btw).
Now, is there a possibility to set the data recording mode to 3d in the vizard code (and change the calibration mode)? Or can I define that somewhere else so I can get the data I want?
Thank you guys for your work btw!
@user-0eae33 It might be that their integration does not support the 3d detection and mapping mode. But I cannot tell for sure. I would recommend contacting their support in this regard.
I'll try that. Thanks 🙂
Hey, when I try to run the example for Pupil Remote (https://github.com/pupil-labs/pupil-helpers/blob/master/python/pupil_remote_control.py), my program crashes upon connecting to the port, without any error messages.
Does anybody have an idea what this could be about?
Is the example I talk about not supposed to be run by itself?
Hi @user-0eae33 you will have to start Capture somewhere. This example script assumes Capture is running on the same computer. If not, you will have to specify the IP address in line 33 of the example. Note that this example script will just run a couple of example commands and then shut down anyways, it's not a running application.
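If you want the script to fail loudly instead of hanging when nothing answers, here is a sketch with a receive timeout (it assumes the default Pupil Remote address; the "t" command, which requests the current Pupil time, is just an example request):

```python
# Sketch: talk to Pupil Remote with a receive timeout instead of blocking
# forever, so a missing/unreachable Capture instance fails visibly.
# Assumption: Pupil Remote listens on 127.0.0.1:50020 (the default).
import zmq


def send_command(cmd, host="127.0.0.1", port=50020, timeout_ms=2000):
    """Send one Pupil Remote command; return the reply or None on timeout."""
    ctx = zmq.Context.instance()
    sock = ctx.socket(zmq.REQ)
    sock.setsockopt(zmq.RCVTIMEO, timeout_ms)  # give up after timeout_ms
    sock.setsockopt(zmq.LINGER, 0)             # don't block on close
    sock.connect(f"tcp://{host}:{port}")
    sock.send_string(cmd)
    try:
        return sock.recv_string()
    except zmq.Again:
        return None  # no reply within the timeout
    finally:
        sock.close()


# Usage with a running Capture instance:
# reply = send_command("t")  # ask Pupil Remote for the current timestamp
```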
Hey @user-c5fb8b thank you for your quick response! Capture is actually running in the background. I tried restarting it and starting Service instead. Nothing has worked so far.
Thank you for the extra hint. I'm aware of that, but at the beginning I just want to run a simple example to see if it works 😅
@user-0eae33 Can you check that the Pupil Remote plugin is enabled in Pupil Capture?
@user-c5fb8b Yeah it is.
@user-0eae33 Is the port set to 50020 in the Pupil Remote plugin? If that port is already used it will choose a different one.
Yes, the Pupil Remote Setting in the Capture app says 50020. Same like in the code.
Ok. So I know there are certain scenarios where connections to Pupil Remote can fail. Maybe @papr can provide more information on how to investigate this?
Okay, Thank you guys for dealing with this 🙃
I remember there was a user with some very restrictive antivirus setup which blocked network connection from Pupil Remote.
I'll check up on that, but I think there is no special software installed.
> a user with some very restrictive antivirus setup which blocked network connection from Pupil Remote
That was Pupil Invisible, and specifically the zyre protocol. This is different from a connection to Pupil Remote. Nonetheless, a restrictive network configuration might be at fault here.
I already tried turning off the firewall. But it didn't change anything.
@user-0eae33 I just read your initial comment. A connection issue would result in the script waiting indefinitely for the connection. So the python script would still be running. If you say crash, I assume that the python script is terminated. Can you confirm this?
Yes, I can confirm that.
Then this is not a network issue but a problem with your Python setup.
Can you let us know what operating system and Python version you are using? How did you install pyzmq?
Operating system: Windows 10 Python version: 2.7
I'm working in the Vizard environment; there is an option to install packages ('pip install pyzmq').
Vizard 6
@user-0eae33 Does Vizard require Python 2.7? We usually recommend using Python 3.5 or higher.
Okay, I'll have a look at whether I can upgrade to Python 3.5.
(As a side note: Python 2 will be officially deprecated on 1.1.2020: https://pythonclock.org/)
I see 😅 I need that stuff for research. And there we always take what has proven reliable on the worn out devices we have access to.
(the official (!) recommendation to switch to python 3 and not use python 2 anymore was announced in 2008)
Most software is much more stable in Python 3 by now because everybody is too lazy to test against Python 2.7 😄
That you do not get an output at all is very weird though.
Alright. It looks like it should be no problem to use Python 3. I'll try again as soon as I installed it. If it still should refuse to work then, I'll come here again. For now, thank you so much for your time, guys!
@user-0eae33 I would recommend uninstalling Python 2.7 first, btw, because the executable names might conflict and result in a setup where you accidentally install modules for Python 2.7 and cannot find them when running Python 3.
I'm not sure if that will work, as there are still a few current projects running on 2.7 that I don't attend to. But anyway, thank you for your advice, I will keep that in mind.
Hello! I turn on the recording and start receiving data. How do I find out the resolution of the world camera while recording?
Hi @user-045b36 you can see the resolution in the UVC Source tab on the right (the icon with the photo camera).
Or do you want to receive the resolution via the network? In that case you would have to enable the Frame Publisher plugin. It publishes the entire world video stream via the network under the topic frame.world. The payload contains the keys width and height for the frame resolution.
@user-c5fb8b I am making a custom application in C++. All interaction with Pupil Capture goes through a zmq socket. What should I send, or where should I connect, in order to receive frame.world?
@user-045b36 Hey, unfortunately, we only have an example in Python for this: https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py
But the msgpack+zmq APIs are very similar across different programming languages. I am sure that you will be able to reconstruct this example in C++ 🙂
@papr Thanks!
@papr Just a short update on my case: as Vizard is apparently supposed to run on Python 2.7 and my supervisor was generally not happy with installing Python 3, we tried installing an older version of the pyzmq package. That worked. I guess the newer Python version would have done it as well. Again, thanks for your help!
@papr Hello! I looked at the example in Python and have a question: what kind of data is in raw_data, and in what format? (recv_world_video_frames.py, line 58: payload['raw_data'] = extra_frames)
Hello! In what data format are frame.world and frame.eye sent?
@user-045b36 regarding raw_data: you can find information on this in the comments in the example file recv_world_video_frames.py. Please make sure to have a look at the whole file:
def recv_from_sub():
'''Recv a message with topic, payload.
Topic is a utf-8 encoded string. Returned as unicode object.
Payload is a msgpack serialized dict. Returned as a python dict.
Any additional message frames will be added as a list
in the payload dict with key: '__raw_data__'.
'''
# ...
Regarding the structure of the frame.world topic: here you can see the code from Pupil that is responsible for creating the message:
https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/frame_publisher.py#L80-L90
In the example the blob data is just a numpy buffer. You can see how to unpack this into an actual image at the bottom of the example: https://github.com/pupil-labs/pupil-helpers/blob/master/python/recv_world_video_frames.py#L69
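As a sketch of that last step (assuming the Frame Publisher format is set to BGR, so the first __raw_data__ entry holds height * width * 3 bytes; the tiny payload below is fabricated for illustration):

```python
# Sketch: turn the raw pixel buffer from a frame.world message into an image
# array, as the linked example does. Assumption: the Frame Publisher format is
# BGR, so the buffer holds height * width * 3 bytes.
import numpy as np


def frame_to_image(payload):
    """Reinterpret the first raw-data frame as an (h, w, 3) uint8 image."""
    h, w = payload["height"], payload["width"]
    raw = payload["__raw_data__"][0]  # first extra message frame
    return np.frombuffer(raw, dtype=np.uint8).reshape(h, w, 3)


# Hypothetical payload with a tiny fabricated 2x2 "image":
payload = {"height": 2, "width": 2, "__raw_data__": [bytes(range(12))]}
img = frame_to_image(payload)
```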
Hey awesome pupil team! We chatted a few weeks ago with @papr about pyuvc.
I had trouble making it work with a microscope I'm using for my research at UTokyo. But a few days ago we got a firmware update from the scope manufacturer and now it plays quite well with your pyuvc library 🙂
However, there is still one mystery: some UVC properties seem to reset after each cap.get_frame_robust(), so I need to re-apply them every frame. To be more specific, this is the case for the White Balance temperature property. Also, Absolute Focus takes effect only after I capture at least one frame.
@user-f8c051 Hi, unfortunately @papr is currently on vacation. I'll see if someone else on the team can answer your question. We'll get back to you as soon as we have an answer!
There is no rush! happy Holidays!!
We are also off here; I was just curious to check whether the library works after the firmware update. Will ping you guys in a week or so!
Thanks!!!
Ok perfect! Happy Holidays for you as well!