Is this related to the port number? Do I have to have particular settings in the Pupil Capture software?
@user-57c2b9 No, this is not related to the port number. If you receive data, the IPC is working.
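For anyone following along, a minimal connectivity check could look like this (a sketch assuming Pupil Remote is enabled on its default port 50020; adjust IP/port if you changed them in Capture):
```python
import zmq

ctx = zmq.Context()

# Pupil Remote (REQ/REP) on its default address
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

# Ask for the SUB port of the IPC backbone; getting a reply means the IPC is reachable
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()
print("IPC SUB port:", sub_port)
```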
@user-57c2b9 can you send me the modified script? I can quickly test it
Sure, what is the best method to send it?
Just drop the file here
Here it is:
@user-57c2b9 you need to add `cv2.waitKey(1)` after each `cv2.imshow()`. Otherwise, OpenCV will not draw the image into the window.
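For reference, the pattern looks like this (a minimal sketch with a placeholder image standing in for the frame decoded from the IPC):
```python
import cv2
import numpy as np

# Placeholder for a frame decoded from the IPC message payload
img = np.zeros((400, 400, 3), dtype=np.uint8)

cv2.imshow("eye 0", img)
cv2.waitKey(1)  # runs the GUI event loop for 1 ms; without it the window stays blank
```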
And another, less relevant hint: eye 0 is usually the subject's right eye, not the left one.
@papr Thanks for the info! I was able to get this example working.
My next question is on timestamps. Can this IPC method be used to associate timestamps with the images? Furthermore, I looked at the documentation on timestamps: https://docs.pupil-labs.com/#data-format
The documentation talks about synchronization and timestamps for Linux and MacOS, but does not mention anything specifically about Windows 10. How do these things work in a Windows 10 system compared to Linux and MacOS?
@user-57c2b9 The `msg` in line 71 should have a `timestamp` field. This is the associated timestamp of the image.
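In code, reading that field would look roughly like this (a sketch assuming the Frame Publisher plugin is enabled in Capture and Pupil Remote runs on the default port 50020):
```python
import zmq
import msgpack

ctx = zmq.Context()

# Get the SUB port from Pupil Remote
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.subscribe("frame.")  # world and eye frames from the Frame Publisher

while True:
    # Frame messages are multipart: topic, msgpack payload, raw image bytes
    topic, payload, *image_parts = sub.recv_multipart()
    msg = msgpack.unpackb(payload, raw=False)
    print(topic, msg["timestamp"])  # timestamp of the image in Pupil time
```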
@papr I asked this right before Christmas (bad timing I guess), so a small reminder: How do I get messages from the standard plugins like fixations over IPC? Is there a complete list of those somewhere (I did not find it in the documentation)?
@user-bab6ad Unfortunately, there is no complete list of that. In case of fixations, subscribe to `fixations`
@papr ok, that is already something... thx! Would be cool to either have such a list or an easy "pattern" to grep for in the source code to find them all?
@user-bab6ad You can subscribe to `` (empty string). This way you will receive everything that is on the IPC. Start by looking at the received topics and filter manually those that you are not interested in.
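For example, a sketch of that approach (topic names vary between versions, so print what you actually receive):
```python
import zmq

ctx = zmq.Context()

# Get the SUB port from Pupil Remote (default port 50020)
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.subscribe("")  # empty string: receive everything on the IPC backbone

seen = set()
while True:
    topic = sub.recv_multipart()[0]  # some topics carry extra frames beyond the payload
    if topic not in seen:
        seen.add(topic)
        print(topic)  # collect topic names, then subscribe only to the ones you need
```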
hello everyone, does anyone have a link to the pupil detection algorithm used by Pupil Labs?
@user-6fdb19 The algorithm is described in the original Pupil white paper. You can find the paper here: https://arxiv.org/pdf/1405.0006.pdf The according code is here: https://github.com/pupil-labs/pupil/tree/master/pupil_src/shared_modules/pupil_detectors
Hi all, I am a beginner in both Unity and Pupil Labs. I want to connect Pupil Labs to Unity, but my project won't be using any HMD. I just can't find any guide online for getting just the eye movement as an input. Do you have any suggestions or references?
Thanks in advance.
hi mar! thank you very much! is there equivalent code for Matlab?
@user-6fdb19 not that we know of. But you can run Pupil Capture and use Matlab to access Capture's network interface.
@user-82e954 What device are you using if not an HMD?
thanks papr!! do you think it will add lag? I will need to perform a real-time analysis of pupil dilation (for my PhD)
@user-6fdb19 There will certainly be some network lag. What data do you need access to? And how much lag would be ok for you?
I need to perform a real-time pupil size estimation... I need at least 30 fps in my analysis
Do I understand correctly that you want to do your own pupil size estimation? Do you need access to the frame data then?
Alternatively, every pupil datum already includes a pupil size measurement.
Can I access this 'pupil datum' in real time from Matlab? I need at least 30 pupil size samples per second for my analysis
I need this estimation so I can later combine it with some task constraints
`filter_messages.m` subscribes to pupil data by default. You should be receiving pupil data at the full eye camera frame rate.
As I said, there might be network lag/delay though.
Would a better network card or an SSD reduce the delay?
Not using Matlab would reduce the delay, as far as I know.
You would get the least delay by creating a plugin for Capture. This is often not necessary since the network API is very fast. The issue here is that the Matlab zmq + msgpack implementation is not well supported. I would highly recommend using Python + pyzmq (as in https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_messages.py).
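The core of such a script, as a minimal sketch (assuming Pupil Remote on the default port 50020; `diameter` is the 2d pupil size in image pixels, and `diameter_3d` in mm is present when the 3d model is active):
```python
import zmq
import msgpack

ctx = zmq.Context()

# Get the SUB port from Pupil Remote
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.subscribe("pupil.")  # pupil data only

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.unpackb(payload, raw=False)
    print(datum["timestamp"], datum.get("diameter"), datum.get("diameter_3d"))
```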
thanks papr! you're very helpful!
papr, does this code retrieve real-time pupil size? If so, do you know how many samples per second I can get with it?
The cameras record at 120/200 Hz, depending on your hardware and setup. The network interface is fast enough that you can receive all detected pupil data with the above script.
@papr I'm trying to use the eye tracker for interactive media, like moving screens or robotic arms. So I think no HMDs are necessary. I just want to use the mapped eye movement as an input in Unity.
Thanks for replying btw
@user-82e954 I would suggest subscribing to the gaze positions in Unity. Please see a description of the network-based API in the docs here: https://docs.pupil-labs.com/#interprocess-and-network-communication. Also see the reference implementations that show how to communicate with Pupil software in the pupil-helpers repo: https://github.com/pupil-labs/pupil-helpers/
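Outside of Unity, the same subscription looks roughly like this in Python (a sketch; `norm_pos` is the gaze position normalized to the world camera image, and low-confidence data is usually filtered out):
```python
import zmq
import msgpack

ctx = zmq.Context()

# Get the SUB port from Pupil Remote (default port 50020)
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.subscribe("gaze")

while True:
    topic, payload = sub.recv_multipart()
    gaze = msgpack.unpackb(payload, raw=False)
    if gaze["confidence"] > 0.8:  # ignore low-confidence estimates
        x, y = gaze["norm_pos"]  # normalized coordinates in the world camera image
        print(x, y)
```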
Hi, I am currently doing eye tracking in an airport with the standard eye tracking mobile bundle. However, we noticed that the video output is incomplete. For some eye tracking participants, sometimes the Eye01 video is missing, sometimes the Eye02 video, sometimes the world camera video. In the worst case, no video can be found on the phone (we are 100% sure the eye tracker is recording videos), but we can't find any output. Is there any solution to this? Thank you very much.
@user-52a72a what version of pupil mobile are you running? Usually this happens when the headset disconnects during the recording.
@mpk, thanks for your quick reply. App version 0.31.0, 200 Hz binocular high-speed eye tracker. Phone: Moto Z3 Play that was purchased with the mobile bundle.
Ok. Please update Pupil Mobile. I think we have a more recent version, if I'm not mistaken.
0.31.0-revert? That's the version that I am using now
I'm not sure, and I'm currently on the go, so I can't check. Have you considered using a laptop instead of the phone? It's not ideal but could be a short-term solution.
We are testing on real airport users; getting people to carry a laptop is really challenging. That's why we bought the mobile bundle.
The strange part is that the user will still have some video, but it's not the complete set: some would have the left eye video, some would have the right eye video.
There's a guy who only has the world video today
Yeah, I fully understand. I think if you opt into the beta program you can use the most recent version of Pupil Mobile. That option is available in the Play Store.
@user-52a72a Use this link to join/leave the beta program: https://play.google.com/apps/testing/com.pupillabs.pupilmobile
Alright, I need to leave now. @user-52a72a reach out to info[at]pupil-labs.com via email if the problem persists and we can help you via video ASAP during the week.
sure, thanks! Will do, have a great day!
@papr So, is the new gaze mapper for Pupil Invisible going to be a trained network (i.e. a black box!)?
Hi! Today I was trying to play Pupil's recorded video in a standard player. But unfortunately, I didn't manage to get it to start. Can you help me please?
@user-5a3c68 You need to drop the folder containing the videos onto the gray Player window, not the videos themselves.
@papr Thank you for the answer, but I meant a standard Windows player or VLC, for example. The world video doesn't play properly. In VLC it shows 3 different frames from different timestamps, and the standard Windows player returns error 0xc1010103. Maybe the problem is on my side, I don't know.
@user-5a3c68 Please use the world and eye video exporter plugins in Player to transcode the intermediate video format into something that standard media players can play back. We make some timing-related changes to the videos during recording. This allows us to seek frame-accurately in Player but breaks compatibility with standard media players. Exporting the videos reverses the timing-related changes.
@papr Ok, I'll try it, thank you. Additionally, I have one more issue. I want to calibrate the world camera using a chessboard. But it looks like Pupil crops its image, because I can grab 1920x1080 in Python from the camera, but the video itself has a resolution of 1280x720. What is the proper way to crop and resize the image to get the video's resolution? Or maybe I can find the camera's intrinsic parameters somewhere else?
@user-5a3c68 Capture comes with prerecorded camera intrinsics estimations. Additionally, you can recalibrate using the Camera Intrinsics Estimation plugin in Capture. These are the prerecorded intrinsics: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/camera_models.py#L24-L128
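If you do want to estimate intrinsics yourself, a minimal OpenCV chessboard sketch could look like this (the folder name and board size are assumptions; the key point is to calibrate with frames captured at the same resolution as your recording, e.g. 1280x720, rather than cropping/resizing a 1920x1080 image, since intrinsics are only valid for the resolution they were estimated at):
```python
import glob
import cv2
import numpy as np

# Chessboard with 9x6 inner corners (adjust to your board)
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("chessboard_frames/*.png"):  # hypothetical folder of 1280x720 frames
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001),
        )
        obj_points.append(objp)
        img_points.append(corners)

# Estimate the camera matrix and distortion coefficients at the recording resolution
rms, K, dist, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None
)
print("RMS reprojection error:", rms)
print("Camera matrix:\n", K)
```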
Hi, just started using Pupil and going through setup.
Does anyone here have experience using it with Vizard? https://docs.worldviz.com/vizard/latest/index.htm#Native_support.htm#Eye_tracking
I'm looking through their docs and can't find native support
Hello, I am trying to use Pupil on Windows and this error is occurring: "error: Error executing cmd /u /c "C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\VC\Auxiliary\Build\vcvarsall.bat" x86_amd64 && set " with the message "C:\Program was unexpected at this time." From what I have looked up, this is due to some scripting error, but I don't know where/how to fix it.
can anyone please help?
@user-c6ccfa Did you modify the source code? Or are you simply following the dependency install instructions?
@papr I followed the instructions and did not modify any source code
@user-c6ccfa Did you look through our closed GitHub issues? If you are lucky, there are related issues that can help you fix this.
@papr The closest advice I could find was "Make sure to include vcvarsall.bat to your windows path." @ issues/896
it is on my path, and I'm still not sure how to fix this error
I've found that this error does not occur when using the plain command prompt, but does occur when using the Visual Studio command prompt.