I have installed pyglui v1.18, but now I want to install pyglui v1.9. How do I do that?
@user-6e1816 the most recent release is v1.18
- https://github.com/pupil-labs/pyglui/releases/tag/v1.18
@user-6e1816 what OS are you using
@wrp ubuntu
I know that the v1.18 is the most recent release.
@user-e7102b surface stability looks much better. Thanks for sharing progress :+1:
@user-6e1816 sudo pip3 install --upgrade git+https://github.com/pupil-labs/pyglui
Did you already try this?
I am trying https://pupil-labs.com/blog/2017-12/real-time-object-recognition-using-fixations-to-control-prosthetic-hand/. But the author's version may be v1.9, and a pyglui-related error occurred with v1.18
so I want to know if I could reinstall an earlier version.
In terms of version formats v1.18 > v1.9 (also there was never a v1.9 release as far as I can see)
You shouldn't need an earlier version
Hi @user-8b1388 I would be happy to help you out. You note that you need help with setup. Have you looked through https://docs.pupil-labs.com yet?
@user-6e1816 @wrp Some earlier versions of Pupil require older pyglui versions, that is correct. You need to clone the repository, check out the version that you want to install, and replace the git+URL part of the pip command above with the path to the local repository.
Hi there, I'm trying Pupil Mobile with my tracker, which has the high resolution world camera. When I connect it to my OnePlus 5 phone I can only see the feed coming from the eye camera; the world camera feed is not showing anything. Could you please help me out with this?
@wrp Hi! Thanks for your answer. I have a problem with accurate gaze tracking. @papr is helping me now.
I shared a link to my video folder.
Sorry, I don't follow, do you mean that you shared a link already with @papr?
please clarify
Yes, I shared the link with @papr. I should also get an order ID; I can do that this afternoon.
Yes, I asked @user-8b1388 to share a recording that includes the calibration procedure. I did not have the time to look at it yet though. But I did not ask for an order ID.
I have also asked for order id via separate email thread
Following @user-ed537d's question: is there a way to pull or calculate where the user is looking in terms of visual angle in relation to the world camera? Ideally, if the user is gazing at a point straight ahead, I would like to get (0.5, 0.5)
@user-cf2773 @user-ed537d Yes, this is possible if you know the world camera's intrinsics. The accuracy visualizer uses this method to calculate the accuracy in terms of visual angles.
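As a sketch of that idea: with a pinhole camera model, a normalized image position can be back-projected to a 3d ray and compared against the optical axis. The camera matrix K below is a made-up example; the real intrinsics come from Capture's camera intrinsics estimation.

```python
import math

def norm_pos_to_angle_deg(norm_pos, frame_size, K):
    # norm_pos uses Pupil's convention: (0, 0) bottom-left, (1, 1) top-right
    w, h = frame_size
    px = norm_pos[0] * w
    py = (1.0 - norm_pos[1]) * h          # image y axis points down
    fx, fy = K[0][0], K[1][1]             # focal lengths in pixels
    cx, cy = K[0][2], K[1][2]             # principal point
    # back-project the pixel through the pinhole model to the ray (x, y, 1)
    x = (px - cx) / fx
    y = (py - cy) / fy
    # angle between that ray and the optical axis (0, 0, 1)
    return math.degrees(math.acos(1.0 / math.sqrt(x * x + y * y + 1.0)))

# example intrinsics (made up) for a 1280x720 world frame;
# a gaze point straight ahead maps to ~0 degrees of visual angle
K = [[1000.0, 0.0, 640.0], [0.0, 1000.0, 360.0], [0.0, 0.0, 1.0]]
norm_pos_to_angle_deg((0.5, 0.5), (1280, 720), K)
```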
Aren't the world camera's intrinsics known by Pupil Capture already?
I am trying to get these directly from Pupil Remote
If I print result['norm_pos'] of the "gaze" topic from Pupil Remote, the coordinate varies only a tiny bit between one sample and another, regardless where I am gazing
@user-cf2773 @papr the intrinsics do not need to be known if you use 3d mode.
in this case just use gaze_point_3d
and compare to a vector going straight out: (0,0,1)
OK I will try, thanks @mpk !
The norm_pos is normalized within the world camera image, where (0,0) is the bottom left corner and (1,1) the top right corner. Therefore it is expected to change very little between samples.
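A small sketch of that convention: going from norm_pos to pixel coordinates is just a scale plus a flip of the y axis, since image pixel coordinates usually put (0, 0) at the top-left (the frame size below is an example value).

```python
def norm_to_pixel(norm_pos, frame_size):
    # norm_pos: (x, y) with (0, 0) = bottom-left and (1, 1) = top-right
    w, h = frame_size
    x, y = norm_pos
    # flip y because pixel coordinates grow downwards from the top-left
    return (x * w, (1.0 - y) * h)

norm_to_pixel((0.5, 0.5), (1280, 720))  # -> (640.0, 360.0), the frame center
```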
Is there a way to set default settings so when pupil capture starts up the settings are the same?
It doesn't work. I have used 3d calibration and computed scipy.spatial.distance.cosine(data["gaze_point_3d"], (0, 0, 1)). It always returns values around 0.02 or 0.03, regardless of where I am looking
Any idea about what is going on? Do I need to follow a different calibration procedure/computation?
0.02 is the output of scipy.spatial.distance.cosine, which returns 1 - cos(angle to center), not the cosine itself.
To calculate the actual angle, you need to call numpy.arccos() on 1 minus the result of scipy.spatial.distance.cosine(data["gaze_point_3d"], (0, 0, 1))
yes I did that too, but the variation is obviously similarly small
also make sure that the gaze_point_3d point has the same sign as your "center" vector (0, 0, 1)
What x,y values do you get in gaze_point_3d if you look at the center? Are they close to zero?
Yes, but they are also close to zero when I look to the top-left or bottom-right corners
They are close to zero all the time, that's what I don't understand
In Capture, the gaze seems to be tracked properly, so I think it's just an issue of how the coordinates are expressed
What's the z value for them?
@user-cf2773 what are the values of data["gaze_point_3d"]? Do they make sense?
One thing to remember is that arccos returns radians. Did you convert them to degrees with np.rad2deg?
small x,y changes are ok if z is also small. the cosine distance function normalizes both vectors, so that is taken care of.
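Putting the thread above together in a small sketch. Note that scipy.spatial.distance.cosine returns 1 - cos(theta), so the arccos has to be taken of one minus that value; the gaze point below is just an example.

```python
import math

def angle_to_center_deg(gaze_point_3d):
    # angle between gaze_point_3d and the camera's optical axis (0, 0, 1);
    # the dot product with (0, 0, 1) is simply the z component
    x, y, z = gaze_point_3d
    cos_theta = z / math.sqrt(x * x + y * y + z * z)
    return math.degrees(math.acos(cos_theta))

# scipy.spatial.distance.cosine returns 1 - cos_theta, so a reading of 0.02
# corresponds to acos(0.98) ~= 11.5 degrees, not acos(0.02) ~= 88.9 degrees
angle_to_center_deg((-60, 26, 484))  # example gaze point, ~7.7 degrees off-center
```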
data["gaze_point_3d"] = (-60, 26, 484)
This looks like a sensible value.
For some reason, I have closed Pupil Capture, reopened it, recalibrated, and now it seems to work: I get cos around 0 when I look at the center and different values when I look far away. Thanks a lot for your help!
Great! 🙂
Hey everyone! I just got a 120Hz binocular setup with the normal world camera, but we've been having trouble setting up the calibration. Nothing we do seems to get us a reliable calibration (the gaze dot on the screen does not match where they are looking). When we do get a calibration, it lasts for only a minute, and it is like any movement of the stimuli or head wrecks it. We're going to be running a study where people look at some paper advertisements, which is why we are trying to use these glasses.
@user-3565f9 can you share an example recording with eye videos? This will allow us to help. You can send this via PM or email to info[at]pupil-labs.com
Hi guys
has anyone had problems calibrating the Pupil
with an HTC Vive?
Namely, the problem is that when I click on calibrate it just shows the white dot on the screen and it does not do anything
even though I am looking straight at it 😄
@user-f1eba3 this is a question for HMD eyes. @user-e04f56 can help you there.
@user-3565f9 It would help a lot if the recording includes the calibration, i.e. if you could start the recording before starting the calibration
May I ask about the expected time of launching the Saccade detector plugin?
Unfortunately, we do not have a time frame for its release yet.
Understood, thanks
Hi guys, I am using 1.4.1 and I have an issue with the exported Excel file including the blink detection data. The data is shown correctly only for some of the blinks. I am sure there were many more blinks recorded, but these are not shown correctly in the file. As you can see with the pink shadow, there is a lot of data not correctly laid out in columns, which I guess should be the start timestamps, confidences etc. However, it is difficult to tell what refers to what considering the amount of data. Has anyone else had this issue?
@user-b23813 This looks like a software bug. Please raise an issue on github including this screenshot
@papr thanks a lot, I will
@here Pupil Labs Team has been busy! We just pushed an update of Pupil Capture, Player, and Service: https://github.com/pupil-labs/pupil/releases/tag/v1.5 !
where's the CV1 eye tracker
@user-24270f coming soon 😃
I'm sure I believe you
@user-24270f we are very close but we had to revise the camera because there is SO little space in the Rift. Waiting for new parts now. Hope we can show something in the next weeks.
these things take time
It's challenging when you make an add-on and thus cannot modify the Rift design.
Capture allows creation of recordings without an active world video source
Thank you guys so much for this @mpk
I want to prevent the 3d model from updating after it's set (I have pointed out this issue before), can I achieve this by decreasing or increasing the "model sensitivity" value? Otherwise I don't know exactly how to achieve that by changing the code.
By the way @papr , @user-e7102b has updated our middleman repo with some nicer matlab examples over at https://github.com/mtaung/pupil_middleman if it might end up being useful to you guys
@user-0d187e yes decrease the sensitivity.
@user-dfeeb9 Cool!
I still need to implement ntp timesync at some point
@mpk Thanks. Does setting the sensitivity to 0 stop the model change completely?
@user-0d187e no, we have not implemented a freeze like that. We can do that though...
Ahh. How easy is it to modify the code for this if I just download the source code?
super easy:
just change the min value to something like 0.
min value of what?
the model sensitivity you mean
got it
I have no idea how to compile the code after the changes and make the executable file, but I will try
@user-0d187e all you need to do is follow these steps : https://docs.pupil-labs.com/#developer-setup
then make the changes and run python3 main.py
it will auto compile the pupil detector for you.
cool
thx
Hi guys
Noob question: I was reading the docs and it states that once you plug the USB cables into your computer,
the right eye camera will show up with the name "Pupil Cam 1 ID0" and the left eye camera will show up with the name "Pupil Cam 1 ID1".
Where should it show up?
@user-f1eba3 Click on the UVC Manager icon on the right. It should show you a selector with all available cameras
@user-f1eba3 but you don't really have to worry about that. Just start Pupil Capture and the right cameras will be selected for world, right, and left eye.
ger
@mpk @user-0d187e setting the min value to zero does not completely freeze the model; you have also implemented the check for the fit performance gradient, which is included in the eye model... I'm also looking to freeze the center of the fitted sphere (to a reasonable value; I work with the Vive add-on), since the radius is fixed.
So basically what I am trying to do is game development and robotics with Unity, but for that I will use Unreal Engine. Today I want to make a Python script or something similar to get the x and y position and show it in a console, for starters.
Where should I look for code that interacts with the Pupil?
Have a look at this script: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py
It uses Pupil Remote and the IPC backend to subscribe to the pupil data (uncomment line 24 to subscribe to gaze data) and prints the received data
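For reference, the core of that script looks roughly like this. This is a sketch, not the verbatim helper; it assumes pyzmq and msgpack are installed and that Capture's Pupil Remote listens on its default port 50020.

```python
def sub_url(host, port):
    # the SUB socket connects to the port that Pupil Remote reports back
    return "tcp://{}:{}".format(host, port)

def main(host="127.0.0.1", topic="gaze."):  # use "pupil." for raw pupil data
    import zmq
    import msgpack

    ctx = zmq.Context()
    req = ctx.socket(zmq.REQ)         # Pupil Remote is a REQ/REP interface
    req.connect(sub_url(host, 50020))
    req.send_string("SUB_PORT")       # ask for the IPC backbone's SUB port
    sub_port = req.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect(sub_url(host, sub_port))
    sub.setsockopt_string(zmq.SUBSCRIBE, topic)

    while True:
        # each message is two frames: topic string + msgpack-encoded dict
        topic_frame, payload = sub.recv_multipart()
        datum = msgpack.unpackb(payload, raw=False)
        print(topic_frame.decode(), datum.get("norm_pos"))
```

Call main() while Capture is running; Ctrl-C stops it.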
how could I have missed that
I must have been super blind
should I use the newest Python version if I am tinkering with some parts of Pupil?
We support >= Python 3.5, but I highly recommend Python 3.6
Hi. Is the pupil detection algorithm the same for the general Pupil Labs headset and for the HMD version? I wonder because the point of view and lighting are not the same.
Hey just curious if anyone in here knows exactly how confidence is calculated for each sample? I can't find any information in the documentation
@user-a49a87 it is the same. The point of view is not relevant as long as you can see the pupil clearly. Different lighting is handled by the algorithm as well. Or to be specific: the operator is responsible for changing the UVC controls such that the pupil is detected well
@user-ecbbea to cite @marc here: support pixels / ellipse circumference, where support pixels are the number of pixels that were used to fit the ellipse.
This means that the confidence is high if the support pixels, i.e. the detected pupil edge pixels, are similar in shape to an ellipse.
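As a toy illustration of that definition. The ellipse-circumference approximation here is my own choice (Ramanujan's formula), not necessarily what the detector uses, and the numbers are made up.

```python
import math

def ellipse_circumference(a, b):
    # Ramanujan's approximation for an ellipse with semi-axes a and b
    h = ((a - b) ** 2) / ((a + b) ** 2)
    return math.pi * (a + b) * (1.0 + 3.0 * h / (10.0 + math.sqrt(4.0 - 3.0 * h)))

def pupil_confidence(support_pixels, a, b):
    # confidence = supporting edge pixels / fitted ellipse circumference,
    # capped at 1.0; high when most of the outline is backed by edge pixels
    return min(1.0, support_pixels / ellipse_circumference(a, b))

pupil_confidence(60, 10.0, 10.0)  # a circle of radius 10 has circumference ~62.8
```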
Great thanks
essentially the less circular the pupil looks, the less confidence?
Hey is it possible to adjust the focus on the newer low-profile 200hz cameras? We can't seem to find an obvious way to do so
(sorry for all the questions)
ALSO: is it possible to get the CAD file for this? https://www.shapeways.com/product/KS8ME6JNV/pupil-labs-camera-arm-extender-for-120hz-and-200hz?optionId=64865617
@user-ecbbea the 200hz cameras do not have the possibility to adjust the focus
@charles bukowski#5234 you can get the cad file here: https://github.com/pupil-labs/pupil-geometry/blob/master/Pupil%20Headset%20triangle%20mount%20extender.stl
Hello everyone, I am trying to get the hardware set up on Mac. I am able to get the video feed of the two pupil cameras, but the world feed is not working. In the UVC manager, there are only two cam IDs listed. Selecting the UVC source button shows Local USB Source: Ghost capture. Capture initialization failed.
Additionally, I have just attempted to use the Pupil device on a Windows machine. Windows recognized and added only two cameras, the pupil cameras. Should it have also detected a third camera for the world?
Hello! Is there a way to check accuracy of calibration if doing offline natural marker calibration?
@user-2798d6 not yet, but this is quite high on my todo list
Thank you!
@user-a7d017 what world camera does your setup have? Is it a 3d world camera (realsense r200 sensor?)
Zadig messed up my notebook's built-in webcam. No matter what driver I choose, it isn't recognized anymore by any program. In the device manager it is now under Generic USB devices, not Imaging Devices, and is labeled as "UVC 2.0 Webcam". I tried clicking on it and choosing update driver; it says the drivers are up to date. I tried uninstalling the driver and rebooting Windows; nothing changes. Help, I need my webcam.
I can't switch the resolution and frame rate in the "Pupil Capture" window; if I do, the window disappears and can't be opened again. My OS is Ubuntu.
Hi @user-10e1e3, in order to restore drivers you either need backups of the system drivers so you can roll back, or you can try a Windows system update to get the most recent drivers.
For those reading the above message: please do not use Zadig to install drivers. You should be able to use the Pupil driver installers that ship with Pupil Capture. If you are using Zadig, please back up your drivers and/or your system before fiddling with driver installation.
@user-6e1816 are you running the latest version of Pupil Capture?
@user-6e1816 I was not able to recreate the behavior you noted. Please could you try restarting Pupil Capture with default settings?
@wrp I stuck the surface trackers on my iPhone and I was trying it on myself. I am wondering, for the heat map, why is it always showing as red in the recording? I know that some of the time I wasn't looking at my mobile, but why is the heat map still red? Thanks!
@user-7bc627 did you set the size of the surface?
Hmm, I am not sure. Now it is like this
@user-cbb918 please specify x size and y size and then press the (Re)-calculate button
I see... hmm... sorry, but how can I know the size? Like, should I specify 30? 50? 100?
You can specify the dimensions of the surface in pixels (e.g. the resolution of the screen size of your phone). You can supply other dimensions with relative proportions if you like as well. Basically the width and height are required in order to determine the output size of the png that is generated when you export the surface. The width and height also determine the number of bins for the heatmap
example - iPhone 7 is (375, 667)
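A rough sketch of how the surface size sets the heatmap resolution: a surface-mapped gaze point falls into a bin like this (illustrative only; Pupil's actual surface tracker may bin differently).

```python
def gaze_to_heatmap_bin(norm_pos, surface_size):
    # norm_pos: gaze in surface coordinates, (0, 0) bottom-left, (1, 1) top-right
    # surface_size: the x/y size entered in the plugin, e.g. (375, 667)
    w, h = surface_size
    # clamp so a gaze exactly on the top/right edge still lands in the last bin
    x = min(int(norm_pos[0] * w), w - 1)
    y = min(int((1.0 - norm_pos[1]) * h), h - 1)
    return x, y

gaze_to_heatmap_bin((0.5, 0.5), (375, 667))  # -> (187, 333)
```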
Great, I see! Another question... the visual attention is always showing in the middle of the left side. Meaning that... it did not track my eyes very well?
Please visualize the gaze position and you can see where you were looking and compare to the generated heatmap. I would suggest re-calibrating and ensuring that you have high confidence pupil detection and accurate gaze mapping.
ok I got it! Thanks!
@user-7bc627 welcome 😄
hello,
Is there a good tutorial somewhere on how to adapt Pupil Capture and its algorithms for eye tracking in the HTC Vive?
I am using the HMD-eyes plugin at the moment, but I am confused, for example, about which detection method to use
@wrp Hello, I am installing the Windows dependencies and I encountered an error running python setup.py build
in pupil/pupil_src/capture/pupil_detectors
Here is part of the build log from the Visual Studio cmd; it said "cannot open file 'boost_python3-vc140-mt-1_64.lib'"
The boost version it tries to link is incorrect; my computer has boost installed for vc141. Do you know how to link to the correct lib? (i.e. boost_numpy3-vc141-mt-1_64.lib)
@here The Pupil Mobile 200hz issue has been fixed. Please update to latest version (v0.21.0) !
👏 👏
Can you somehow run the helper scripts
without the use of Pupil Capture, through an API for example?
@user-f1eba3 you mean access the data with having the Pupil Capture windows?
*without 🙂
without having the app, yes
This is not possible at the moment.
do you have an api for example to do something like pupil.init(port to connect)
and then address that port with other programs
@user-f1eba3 well, if you don't need the world camera, then you can use Pupil Service
@user-f1eba3 There is no background service that would be able to start the applications via an API, no. The app's API entry point is represented by Pupil Remote.
But Pupil Remote is only available if Capture/Service is already running.
Hello, can someone recommend a reliable freeware software for stimuli presentation control and synchronisation with Pupil?
@user-9575cd PsychoPy is one popular free stimulus presentation package. It has both GUI and command line experiment building options.
There is no PsychoPy GUI integration for Pupil that I know of though. But integrating the Pupil Helper scripts into the generated Python code of PsychoPy should be fairly easy.
thanks, PsychoPy is actually the software I currently use, but I got hopelessly lost trying to connect it with Pupil
the idea about the helpers is interesting though. Do you by any chance know a source where I can get more information?
Are you modifying a gui-generated experiment or do you build your experiment from scratch using the PsychoPy Python module?
Currently trying to start from scratch by using the eye tracker demo from the workshop, but I constantly get an adlib error
These are the Pupil helper example scripts: https://github.com/pupil-labs/pupil-helpers
Most of them are implemented using a loop that calls the socket's recv() method directly. This is problematic in your case since that is a blocking call that makes your UI unresponsive. You will need to check for available data within your UI loop before calling recv(). This way the UI loop stays responsive.
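One way to sketch that pattern with pyzmq's Poller: register the SUB socket with a Poller, and once per UI frame take only what is already queued. The helper below is a generic sketch, not PsychoPy-specific.

```python
def drain_pending(sub, poller, timeout_ms=0):
    """Return all multipart messages already queued on `sub` without blocking.

    `poller` is a zmq.Poller with `sub` registered. With timeout_ms=0,
    poll() returns immediately when nothing is waiting, so calling this
    once per UI frame keeps the UI loop responsive.
    """
    pending = []
    while poller.poll(timeout_ms):
        pending.append(sub.recv_multipart())
    return pending
```

With pyzmq this would be set up as poller = zmq.Poller(); poller.register(sub, zmq.POLLIN), and then drain_pending(sub, poller) is called inside the frame loop.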
Thanks for the direction, Papr! Didn't try the helpers yet, but it seems like a good idea
By the way, did any of you succeed in making a working experiment with Pupil and PsychoPy?
In one of the recent versions
My bachelor thesis was actually about integrating Pupil into PsychoPy. But the code is over 1.5 years old and is by no means up-to-date... By now I highly recommend the way mentioned above.
@user-9575cd You should also have a look at the Surface Tracker plugin and define a surface around your screen if you want to have gaze in relation to your on-screen stimuli. https://docs.pupil-labs.com/#surface-tracking
So grateful for the additional ideas and that spark of optimism! 😃 I was losing hope of making PsychoPy work.
Did the latest release solve the issues between the mobile app and pupil player?
The latest Pupil Mobile release, correct
excellent, I'll continue my research then
@user-9575cd Alternatively, if you're able to get hold of a copy of MATLAB (not free, but cheap for students and sometimes available for free via University libraries), you can download Psychtoolbox3 (PTB) for free and use that for stimulus control. I have created a working example for controlling pupil_capture using PTB and sending event codes. This method seems to work well, but it is more clunky than just doing everything within python.
Thanks, Tom. Would it perhaps be possible to have a look at your example somewhere online?
Thanks @papr .
Once again, big thanks. You guys are so helpful!
👍
where can I better understand the concept of gaze or gazing?
Do you mean gaze as the term that we usually refer to in the Pupil software?
not just as the literal meaning
like a definition
I am searching for a debate or something similar to fully grasp the idea of gazing
So now I have managed to understand most of the concepts used in Pupil and how the helpers work.
How do you think I should proceed in order to build a plugin for Unreal Engine?
You should look for a zmq implementation that you can use in the Unreal Engine.
I am also going to take the hmd eyes project as an example
I would recommend that as well!
cool
@Clover#9724 I do use Psychopy and Pupil together for my experiments, where are you stuck at in the integration?
I am currently having issues with Pupil Capture 1.4.1. When I calibrate the tracker, the gaze is tracked properly for the first 4 or 5 seconds, and then it gets lost and I need to recalibrate. Any idea on why this is happening? I stay steady in front of the screen, so I am not moving at all.
@user-cf2773 is this with 2d or 3d mapping?
3d
How do the confidence graphs look in the world window?
usually 100% steady but, sometimes, one eye goes to 0 for no reason
in the eye camera I can see the pupil tracked properly and the green circle surrounding the eye
And the green circle does not change in position?
with some calibrations yes, but then I recalibrate and it is generally steady
And is it opaque? The more translucent the green line is, the lower the model confidence.
Can you open the 3d debug window? You can find it in the eye windows' pupil detector settings
Yes it is quite opaque
And I have the 3d debug windows open, the blue ball seems to be steady
But for one of the eyes, there are three models competing with each other
What do you mean when you say "it gets lost" (quoting your first message)? Do you mean that the gaze visualization does not show up anymore, or that the tracking is very inaccurate?
It becomes inaccurate and then it doesn't show anymore, and I get a lot of nan from pupil remote gaze
Also, often when I do the calibration there are reflections detected in the world camera
nan values? Which gaze datum fields are nan?
gaze_point_3d
also, the two magenta markers in pupil capture tracking the gaze are rarely overlapping: they diverge more and more
This is a recent change. Low confidence data is mapped monocularly to avoid worsening a binocular result. The previous version discarded such data. The competing models are an indication of low model confidence. This means that new 2d pupil data does not fit the current 3d model, and therefore new models are built up.
Try to find a different position for the eye camera with the competing models. Sometimes a different point of view yields more stable models.
I have been trying for quite a while and with different people. Do you have any recommendations on how the cameras should be positioned (e.g., distance from the eyes, in front of or below the eyes, etc.)?
@user-cf2773 if you are having a hard time we can do a video support session to debug this issue. Best way forward is to write us an email to info[at]pupil-labs.com
Will do, thanks a lot!
Good afternoon. Tell me what I'm doing wrong: why does my marker jerk and not stop twitching?
Which marker are you talking about?
@user-d9bb5a please provide a sample video for us to look at. You can share a google drive link in this chat.
oh... one second))) It's about the markers that are needed for the surface
I can send the video tomorrow (I'm at a meeting now)...
Could I have used the wrong size?
How many markers do you use to track your surface?
4
or sometimes 8
Is it possible to make a heat map with video without a marker?
You mean a heatmap that is relative to a gaze target? No, this is only possible by defining the gaze target as surface.
That is a pity. Thanks... then I need to figure out why they are not detected or why they twitch.
As @mpk mentioned, the most effective way to help you is to share a recording of the surfaces with us. Until then we would have to guess possible reasons and this is not feasible. 🙂
Yes, tomorrow I will share it with you, since your help is very necessary. Apparently it just got confused. On the old computer everything worked, but on the new one I still cannot get it to work (((( Thanks.
has anyone used C++ to do a ZeroMQ request to Pupil?
I think the hmd-eyes project uses the c++ zmq library.
maybe I am mistaken, but I guess not: https://github.com/pupil-labs/hmd-eyes/blob/master/python_reference_client/zmq_tools.py
I will try to make a request using zeromq with c++ somehow
I think the hmd-eyes project uses the c# zmq version. You should probably use this http://zeromq.org/bindings:cpp if you want c++ bindings
Specifically https://github.com/jship/CpperoMQ looks like a good high level lib to use.
I would recommend to reimplement the Pupil Helpers in c++ to get started.
You will require https://github.com/msgpack/msgpack-c for communication with Pupil as well
@papr I was also thinking about reimplementing the filter_messages helper as well
I would start with https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/pupil_remote_control.py
because you do not need any msgpack up to line 55 but send simple byte strings instead
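A minimal sketch of that plain-string protocol: every command is a simple string sent over the REQ socket, and Pupil Remote answers with a string reply. The main() part assumes Capture is running locally on the default port 50020.

```python
def send_command(socket, cmd):
    # Pupil Remote is REQ/REP: each command string gets a string reply
    socket.send_string(cmd)
    return socket.recv_string()

def main():
    import zmq

    ctx = zmq.Context()
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://127.0.0.1:50020")
    print(send_command(req, "R"))  # "R" starts a recording...
    print(send_command(req, "r"))  # ...and "r" stops it again
```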
great idea
what is with these 2 lines in pupil_remote_control: if __name__ == '__main__': and from time import sleep, time ?
The first line is a Python convention that you can ignore. The second line just imports the system sleep and time functions such that we can use them in the script.
I thought it was something like #ifndef in C
If you import a Python file, the interpreter runs all of its code. This line avoids running the code within the if statement during such an import. It is only executed if the file is called directly. In that case the file's magic __name__ variable is set to '__main__' instead of the file's actual name.
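A tiny demonstration of that behavior (the file name is hypothetical):

```python
# Save as demo.py: running `python3 demo.py` prints the greeting, while
# `import demo` from another file only defines greet() and prints nothing.
def greet():
    return "hello from pupil helper"

if __name__ == "__main__":
    print(greet())
```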
Would you recommend this https://github.com/jship/CpperoMQ over ZeroMQ's own bindings?
Can somebody help me run the Pupil Capture service properly? For me it is not working properly and I am not able to fix the problem
I mean it has not been maintained for 2 years now
@user-f1eba3 Oh, I did not see that it has been stale for that long. I would go for the most active one
@user-e7f18e What exactly does not work?
eye1 is not catching the red circle on the pupil. I am not able to find out whether it could be a hardware problem with the HTC Vive
@user-e7f18e But you are able to see the eye1 video?
yes
So your problem is bad pupil detection?
Can you share a screenshot of your eye1 window with us?
exactly, that is why the hmd-eyes demo for the 3D scene is not as accurate as per the hmd-eyes people
sure
so would you recommend I rather use this instead: https://github.com/zeromq/cppzmq ?
@papr
@user-f1eba3 I did not use it personally, but this looks like the official C++ version by the zmq people. I would give it a try
@user-e7f18e I see two issues: 1) Try to move the cameras such that the eye is more centered. You can do so by adjusting the lens-to-eye distance of the HMD. 2) You need more contrast between pupil and iris. Solving step 1) might solve 2) already. In case it does not: try playing around with the gamma and gain controls in the UVC Source menu.
I changed the distance of the lenses just now, but for step 2 I have no idea how to do that
could you please help @papr
Sure, no problem. Would you mind sharing an other screenshot with the new eye positions?
Sure
You need to open the UVC source settings to adjust the uvc controls. You can do so by clicking on the camera icon on the right of the eye windows.
Then expand the Image Post Processing sub menu. You should see a lot of sliders for different options. I would recommend increasing the gain and gamma values slightly. The exact values depend on your exact lighting conditions. Adjust them such that the pupil has a high contrast and the pupil detection works well.
current scenario.
looks proper now, let me change the settings you said.
@papr is it OK with gain 5 and gamma 110?
Do you use the 120Hz or the 200Hz eye cameras?
120
eye1 looks out of focus. You can adjust the focus by rotating the lens if you use the 120Hz system.
Sorry, I read the docs on the calibration and am still a little confused on the workflow. I plan on using the pupil mobile app to study cyclists. Do I need to calibrate every time I start recording or can I save a calibration file for the start of the ride and record a series of short recordings throughout the ride and use the first recording's calibration file for all of them? (I'm assuming here that the glasses don't get moved between recordings)
@user-072005 It is currently not possible to export/import calibrations from one recording into another. You could do that if it were implemented and if your assumption that the glasses do not move were right. But I doubt that the glasses do not move during cycling. I would recommend doing a calibration at the start of each recording.
Hello there! I just have a small question, it's probably a bit stupid but I gotta ask.
My supervisor has made some modifications to Pupil and sent me his source code. Now I need to build the program from the code, and if I'm understanding this correctly, I need to install all the dependencies listed here: https://docs.pupil-labs.com/#windows-dependencies ? I've been installing stuff for 2 hours now... is this the correct way to do it?
@user-381730 Unfortunately, this sounds like the way to go. Although it is much easier to install the dependencies on Mac or Linux...
Ah yes, I usually dual boot with Linux, but Windows decided to "repair" my disk, making my Linux partition unusable... Don't have the time to do a full reinstall now, unfortunately.
Alternatively, you could try to extract your supervisor's changes into a plugin and use the bundle. I honestly think that you would save the most time by reinstalling Ubuntu and the Linux dependencies 😕
Probably, though I use Manjaro (Arch), which would make the installation more difficult anyway. :/ I guess doing a plugin is possible... but I just needed to try a few things with his version. Didn't realize it would be this difficult to install.
Hello, would it be possible to have an individual's calibration settings persist if pupil_capture gets shut down or pupil headset gets temporarily unplugged/replugged? Currently it seems like they get erased.
@user-381730 yeah, sorry about that, but Windows is very tedious in that regard. Are you allowed to share his changes with me? I could tell you if it is easily convertible to a plugin.
And probably also give hints on how to do that
@user-e7102b the calibration is persisted in the gaze mapping plugin. It should be session- and disconnect-persistent
@user-e7102b is there any indication that Pupil Capture does not terminate correctly?
@papr Hmm, OK I haven't tested this extensively, but I'm midway through a pilot study at the moment and we disconnected/reconnected the tracker and the calibration went bad.
Ah, there is a difference! I am very sure that the calibration itself (gaze mapping function parameters) stay the same. Do you use 2d or 3d detection? Why do you disconnect the tracker explicitly? Does the subject do anything during the disconnected time?
We use 2D detection. We run some very long studies so it can be useful to disconnect the participant to allow them to move around, go to the bathroom etc. I understand that there is the chance that the participant could knock the eye-tracker and this could disrupt the calibration, but on this occasion I made sure this didn't happen.
@user-e7102b 2d calibration is very prone to slippage. If you do long recordings it is highly recommended to do multiple re-calibrations
There is always some kind of slight slippage that builds up over time.
@user-381730 What happens if you execute the opencv exe?
Sure, we do plan to build multiple calibrations into our protocol.
@papr, it extracted. :) I think I have managed to install everything now, but some files have newer versions than those specified in the guide; I hope it works anyway. Though I'm having trouble starting the application now: when I run main I get error messages that I'm missing () on calls to print. Shouldn't I use Python 3? I'm kind of a beginner at Python so I'm not sure.
Would you recommend we use 3D tracking? We're just having participants view stimuli on a monitor, so 2D seemed like the better option. I tried 3D tracking but this seemed less accurate.
@user-381730 Yes, you need Python 3! Python 2 is not compatible with our code
@user-e7102b Do you require the data for online interaction?
Yeah, I'm using 3.7 at the moment, doesn't seem to work though.
I'm starting to wonder if my supervisor sent me the correct code, it didn't have any main.py in the pupil.src folder...
No online interaction at the moment - just passive recording.
@user-381730 3.7 is very new 😄 Did not try that yet but should work in theory. Maybe it is a very old version of Pupil?
@user-e7102b Then I would recommend 2d tracking and calibration, but only for monitor/control usage. Record the eye videos and include the calibration procedure in the recording. Then you will be able to use offline calibration and also benefit from future pipeline improvements.
Maybe, not sure actually, probably a year old at least.
But I just cloned the official repo and tried running it, definitely something wrong with the versions:
Did you call git clone
or did you click the Download zip
button on github?
I just took the zip 😛
@papr Thanks for the advice - I'll make sure I record the calibrations.
Yeah, thought so 😛 The zips do not include the git information for some reason. You will have to explicitly clone the repository. You should have a git shell
(comes with the git installation) on your computer. Use it to clone the repository
Ah yeah, I know how git works, but then this could be why the modded version doesn't work as well... hnn
Is the modded version on github?
nope
Hello! I'm having difficulty getting the pupil glasses to function as a mouse. I have the mouse_control.py code from the Helpers repo, and I have seen the demonstration on YouTube. Any basic steps to replicate this?
@user-6952ca Did you setup the required surface?
I did! I used 8 of the markers to outline a monitor.
Ok nice! Did you rename the surface as well?
Yep!
Did you do a calibration?
Yep
Are you receiving any events in the script?
there is only one line of readout: x_dim: 2560, y_dim: 1440
Can you print out the value of gaze_position
in line 56? Is there any output?
Still trying to get the official repo working. When I run the main file, it keeps saying that the GLFW library can't be found, but I have triple-checked the file paths and all; still can't get it to work.
What is the exact error message?
I had the same issue with glew but I noticed that the filepaths were wrong so I fixed it, but I still get this one :S
@user-381730 Did you setup a virtual env? Else use python3 main.py
please
@papr getting a constant stream of outputs while looking at the tracked surface
Nope, but python3 doesn't work, the command is not recognized even though it is on the path :S
@user-6952ca Nice! Could you post a single line in here?
@user-381730 You will definitely need to fix your python 3 installation first...
Did you reboot?
Nope, maybe I should try.
🤞
Cause it is added;
@user-381730 I believe you. With "fix your installation" I meant to get it working. I do not have enough insight into the magic world of pain that is called Windows to help you there. =/
@papr Also, another question. I'm sending commands to pupil remote (running on a machine with Mac OS X High Sierra) from a stimulus display machine (running Mac OS X Sierra) via an ethernet cable directly linking the two machines (I've set the IP address to be static for this connection). I've noticed that on a couple of occasions the IP address for pupil remote has changed, which of course means that the pupil middleman server fails to connect. The eye-tracker machine is also connected to the local network via a second ethernet connection, so perhaps this is confusing pupil remote? It seems that restarting the machine resets the pupil remote address back to the original (correct) address. Would it be possible to set the pupil remote address to a static address?
@papr This is probably not exactly one line, but here is some of the output: {'topic': 'pupil', 'confidence': 1.0, 'ellipse': {'center': [82.98399353027344, 99.7783203125], 'axes': [33.62591552734375, 37.53098678588867], 'angle': 22.96988868713379}, 'diameter': 37.53098678588867, 'norm_pos': [0.2593249797821045, 0.5842569986979167], 'timestamp': 28123.65402, 'method': '2d c++', 'id': 1}]}, 'timestamp': 28123.652015}], 'timestamp': 28123.644999, 'camera_pose_3d': None}
@user-6952ca For some reason you are receiving pupil data instead of surface gaze data. Can you check line 44? What does it say?
Yeah, I just tried rebooting, didn't help I'm afraid.
God just why did it have to mess up my Linux installation this day of all days. :/
sub.setsockopt_string(zmq.SUBSCRIBE, 'surface')
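For reference, ZMQ SUB sockets filter messages by byte-prefix on the topic frame, so subscribing to 'surface' should only pass topics that start with that string. A quick sketch of the prefix rule (plain Python; the topic names are just examples, not necessarily Pupil's exact topic strings):

```python
def passes_subscription(topic: str, subscription: str) -> bool:
    # ZMQ SUB filtering is a simple prefix match on the topic frame.
    return topic.startswith(subscription)

# Illustrative topic names:
print(passes_subscription("surfaces.screen", "surface"))  # -> True
print(passes_subscription("pupil.0", "surface"))          # -> False
```

If plain pupil data still arrives, the subscription string in the running script is probably not 'surface'.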
@user-e7102b Yes, that should be possible. Disable the Use primary network interface option in the Pupil Remote menu and set the IP address + port
But I can start the Python shell separately though, still get errors about missing GLFW
@user-b5e0c9 Oh, alright. python
executes python 3. That is good! One issue less 😃
yeah 😄
I don't have any other Python version installed so it shouldn't be able to run anything else
Yeah, you showed it with the first command in the screenshot 👍
Maybe I can try and add GLFW to my path instead?
Did you repeat the glfw-lib copy part?
Yep, also tried redownloading in case it got corrupted or something.
Can you show me the contents of your pupil_external folder?
Sure
Looks good. And that is the one in the repository that you manually cloned, correct? Sorry for the dumb questions, but I want to make sure that we do not overlook anything by accident
No, I cloned it from git (with the Windows Bash tool) so it should be correct.
@papr here is a better pasting of the output, so I think gaze position is there. sorry for the confusion. {'topic': 'gaze.2d.01.', 'norm_pos': [-0.8056872850215058, -1.9805976641233527], 'confidence': 0.8257696120612803, 'timestamp': 28773.085415, 'base_data': [{'topic': 'pupil', 'confidence': 0.6615392241225607, 'ellipse': {'center': [119.89435577392578, 88.1365737915039], 'axes': [39.73045349121094, 96.71500396728516], 'angle': 94.72855377197266}, 'diameter': 96.71500396728516, 'norm_pos': [0.3746698617935181, 0.6327642758687337], 'timestamp': 28773.08761, 'method': '2d c++', 'id': 0}, {'topic': 'pupil', 'confidence': 0.99, 'ellipse': {'center': [230.96566772460938, 190.4365997314453], 'axes': [14.059481620788574, 17.84821128845215], 'angle': 118.0363540649414}, 'diameter': 17.84821128845215, 'norm_pos': [0.7217677116394043, 0.20651416778564458], 'timestamp': 28773.08322, 'method': '2d c++', 'id': 1}]
Oh that is what I meant by manually cloned. Sorry for the ambiguity.
No problem. I now tried adding glfw-3.2.1.bin.WIN64/lib-vc2015 to the path but it didn't seem to help
Are these paths correct?
"pupilScripts" is my "work" folder
@user-6952ca That is a normal gaze datum. You should not be receiving it...
@user-b5e0c9 Whats the exact path to your pupil externals folder?
C:\pupilScripts\pupil\pupil_external
@papr here is the error I am getting:
Traceback (most recent call last):
  File "mouse_control.py", line 83, in <module>
    move_mouse(x, y)
  File "mouse_control.py", line 22, in move_mouse
    m.move(x, y)
  File "/path/python3.6/site-packages/pymouse/x11.py", line 128, in move
    fake_input(d, X.MotionNotify, x=x, y=y)
  File "/path/python3.6/site-packages/Xlib/ext/xtest.py", line 103, in fake_input
    y = y)
  File "/path/python3.6/site-packages/Xlib/protocol/rq.py", line 1347, in __init__
    self._binary = self._request.to_binary(*args, **keys)
  File "/path/python3.6/site-packages/Xlib/protocol/rq.py", line 1069, in to_binary
    static_part = struct.pack(self.static_codes, *pack_items)
struct.error: required argument is not an integer
@user-6952ca do you get that error consistently? can you print x and y in line 81?
@user-6952ca I think I know the issue. Please change line 82 to move_mouse(int(x), int(y))
@papr Yep, that did the trick! thanks for your help! 😁
No problem
Thank you for finding that issue! I committed a fix to the repository
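For anyone hitting the same struct.error: PyMouse's move() needs integer pixel coordinates, while surface-mapped gaze comes as normalized floats. A minimal sketch of the conversion (hypothetical helper, not the exact code in mouse_control.py; the coordinate-origin convention here is an assumption, so check the script you are using):

```python
def norm_to_pixels(norm_pos, x_dim, y_dim):
    # Surface gaze positions are normalized floats in [0, 1] with the
    # origin at the bottom-left; screen APIs want integer pixels with
    # the origin at the top-left, hence the y-flip and the int() casts
    # (the missing casts were the cause of the struct.error above).
    x, y = norm_pos
    return int(x * x_dim), int((1.0 - y) * y_dim)

print(norm_to_pixels((0.5, 0.5), 2560, 1440))  # -> (1280, 720)
```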
Ok, I will downgrade to Python 3.6 and do all the steps again using the official repo, I probably missed something somewhere.
Hi all. New to Pupil and Discord. I just collected a big data set using the binocular Pupil headset, but it looks like a lot of gaze_positions were dropped (e.g. only 1892 out of 1984 frames were accounted for). The pupil_positions look fine (e.g. on average ~7 recordings per frame). World video is being recorded at 30 Hz and pupil is being recorded at 120 Hz. Is there a simple way to remap the gaze positions offline (like a player plugin or something)? Additionally, if there is a more appropriate place to ask this question, please let me know.
Hello,
I have just received my pupil labs eyetracking set. I need to implement it with my HTC vive headset. Where do I start?
Hi @user-723401 please start with docs here: https://docs.pupil-labs.com/#htc-vive-add-on
@user-e00b32 Welcome to the chat/community 😸 Question - did you record the calibration procedure as well? If so, then you can run pupil detection with different parameters in Pupil Player and re-calibrate as well.
It's a new day and a new attempt at building from source
I'm having issues when running the bootstrap.bat. That should generate b2.exe. When I run the bat file, it fails. If I look at the generated logs it says "c1: fatal error C1083: Cannot open source file: 'yyacc.c': No such file or directory"
Shouldn't that file be included with Boost?
@wrp I'm sorry I didn't make myself clear: I am running the latest version of Pupil Capture; the problem is as shown in the video
Hmm, the solution seems to be to download the ZIP and not the EXE to extract things
@user-6e1816 thanks for the video. I just tried the same steps as you (with the bundle - not from source on Ubuntu 16.04) but was not able to replicate the crash
@user-6e1816 did you try starting again with default settings - delete capture settings
@user-3a0646 if you have notes or suggestions for dev instructions please make an issue in pupil-docs repo: https://github.com/pupil-labs/pupil-docs
@wrp just click General Settings - Restart with default settings?
Yes, you can click the Restart with default settings button. Since you're running from source you can also delete the capture_settings dir in your pupil repo.
@user-6e1816 did you also try recreating this issue when running from app bundle?
@wrp I will do that once I have figured out on how to get it working. 😄
There is some stuff that is kinda unclear 😛
@wrp starting again with default settings doesn't work, but I also can't replicate this crash when running from the app bundle
I managed to do every step of the process now, but running the main.py file causes the error
I have triple checked that I got the correct files in the correct locations. :/
@user-3a0646 use the .bat
file to run capture
Aha, hmm tried it, now get some other errors :S
@user-3a0646 I would suggest actually trying to build the optimization_calibration - python pupil_src/shared_modules/calibration_routines/optimization_calibration/setup.py build
and python pupil_src/shared_modules/pupil_detectors/setup.py build
so you can isolate errors prior to running the .bat
alright, will try it
Note - that these can take quite some time to build
Alright, well they didn't as they failed instantly 😄
ok
hmn... maybe cd into this dir first
and then just run python setup.py build
Which directory?
oh wait
I see
ok
Can't find eigen...hmm
Running python pupil_src/shared_modules/calibration_routines/optimization_calibration/setup.py build now works, but I'm getting errors when running python pupil_src/shared_modules/pupil_detectors/setup.py build. I have double-checked that cython is installed
cython version?
also try running from within the dir
I have installed Cython‑0.27.3‑cp36‑cp36m‑win_amd64.whl
Hmm I tried changing folder as you said, does this mean that it worked?
Hi all, is there a way to use the new pupil cameras outside of pupil (e.g. with directshow)?
But if I try to run run_capture, I still get errors that it can't find DLLs
@user-3a0646 It looks like the detectors did not build in this case
@user-3a0646 try clearing the build
dir within pupil_detectors
and run python setup.py build
again
I did as you said, now it is building, taking a good amount of time
ok, that is progress
@user-9325d9 what are you trying to achieve exactly?
@user-3a0646 you may need to hit enter a few times in the cmd prompt if ceres/glog is hanging
Alright, well the build process crashed, saying it ran out of space but I have 50gb free :S
that's an error I have not seen before @user-3a0646
I tried doing it again and I think it worked this time, will try the bat again
Nope
Still missing the detector
@user-3a0646 the cmd prompt shows that build failed
@wrp we are trying to read the video stream captured with the high-speed cameras, outside of Pupil (we use a different framework)
Yeah I know, but I tried rerunning it and this time I didn't get the space error. Then I tried the bat again; it still says I'm missing the detector DLL 😦
Here is the CMD output for first building the detector and then running the bat : https://gist.github.com/anonymous/01707959e811f16e730fd96c5028b44e
Tried rebuilding multiple times now, still the same error :S
@papr after some struggles: the library https://github.com/jship/CpperoMQ is too deprecated to work with (I would have to change to C++11 if I wanted to use it). Thought it might be useful information for future development
can anybody help me with the settings of pupil capture service
So I tried installing it on another machine now; still got problems, but not the same. It says I'm missing the pyav module, but it is installed, I have double-checked.
(It's me, Kalle)
It would be a great help if anybody from pupil labs could help me with the settings of pupil capture service before calibration by taking remote control of my machine
@user-e7f18e please write an email to info[at]pupil-labs.com to schedule video support.
Seems like more people than me have issues with av: https://stackoverflow.com/questions/49153792/python-missing-dll-from-installed-module
After messing around with mouse_control.py, I am trying to improve the accuracy of the mouse cursor movements. I have a few questions:
1) What calibration/camera settings should I use for optimal performance (e.g, 2d++ or 3d++)
2) Correcting for eyeblinks: whenever an eyeblink occurs, the position of the cursor moves drastically
3) Ideal surface tracking settings: currently, I am using a 32" computer monitor at eye level and standing a few steps away. I created a Google Slides presentation in which I placed the fiducial markers around the perimeter of the slides and am moving a fixation cross around the screen.
Thanks for all the outstanding assistance you all have been providing!
@mpk I sent the mail to info[at]pupil-labs.com and am waiting for the reply; the mail id is info@pupil-labs.com, right?
Yes. We got the mail. I'll reply in a bit!
@user-6952ca I would try 2d and 3d and use the better one. Eyeblinks can be filtered via confidence: just discard all data with confidence lower than 0.9 in the pupil helpers script. Your surface tracking setup seems OK. Sometimes printed markers work better.
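To make the confidence filtering concrete, a minimal sketch (hand-rolled, not the pupil-helpers code itself; the 0.9 threshold is the one suggested above):

```python
def filter_low_confidence(gaze_data, threshold=0.9):
    # Blinks and half-closed eyes yield low-confidence pupil detections;
    # dropping them keeps the mapped cursor from jumping around.
    return [g for g in gaze_data if g.get("confidence", 0.0) >= threshold]

samples = [
    {"confidence": 0.98, "norm_pos": [0.51, 0.49]},
    {"confidence": 0.12, "norm_pos": [0.05, 0.93]},  # likely a blink
]
print(len(filter_low_confidence(samples)))  # -> 1
```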
@wrp Thanks for the welcome and the response! Unfortunately, we did not start recording the calibration procedure as a video until later in the data set collection... I've done some exploring, but want to confirm - the calibration coefficients are not saved anywhere?
@mpk I am still in office and waiting for your reply, could you please give me a reply for the timings?
I'm not in the office anymore; I'll reply tomorrow. Sorry
Could you please tell me your office hours tomorrow? I need to set up my plan accordingly
Still nobody here who knows why python can't find the av module?
did someone from this chat develop the hmd-eyes app for Unity? and if so, can we chat a little bit 😄?
@user-f1eba3 that would be @user-e04f56 in hmd eyes chat
Hi, just some feedback that might be helpful for you guys: I just upgraded to the latest version of pupil_capture (v1.5) and my pupil_remote setup, which was working perfectly, stopped working. I made sure all the settings were exactly the same, so this was quite puzzling. After much troubleshooting, the solution was to delete the pupil_capture_settings folder in the user directory (I'm running Mac OS X High Sierra) before restarting Pupil_Capture.
Does the pupil mobile app only record for about 5 minutes? Is it possible to record for longer?
@user-072005 Pupil Mobile is able to record for much longer. Please make sure that you have enough storage available.
@user-e7102b upgrading usually removes/ignores the old settings. Therefore you automatically start with the default settings after upgrading.
@papr yes, all the settings were reset to default when I upgraded, however pupil-remote just would not work despite me ensuring the IP address/port were set up exactly as before. When I manually deleted the pupil_capture_settings folder and restarted pupil_capture, it magically started working.
Can I replace the cord that comes with the eye trackers with another usb-C to USB cord?
@user-072005 yes, any usb-c cable of decent build quality will work.
great, thanks
I added a 256 GB micro SD to my phone and set it to save to the SD card. But it seems to shut off randomly still. It should have sufficient memory for a short (~5-10 min) bike ride (I'm studying cyclists). I thought maybe it's because the phone was in the rider's pocket, but the file didn't save properly when this happened so I think it isn't a press of the record button stopping it. When the problem occurs, the info file is missing from the folder. It doesn't always cut the recording early either. What could be the cause of this?
Oh and I'm using pupil mobile
Odd, trying to build from source here. I now get an assertion error that my pyuvc version is too old. The latest one on git is 0.11, but in the python code there is an assertion that the version must be 0.13 or higher. If I remove the assertion, I get another (real) exception saying that my pyndsi version is too old, and that too is the latest. :S
Removing that assertion too, though, makes it compile, and I can run the application (finally, after 1.5 days of trying hehe). Now let's just see what errors I will encounter...
@mpk Thanks for the info on the eyeblinks! I am noticing that the accuracy for the cursor movements is drastically less than that of the gaze position in relation to the world view (which makes sense) . I am curious as to how the calibration process affects the accuracy of surface tracking. Calibrating at different distances from the monitor has different effects on the cursor movement accuracy. My colleagues and I would like to eventually write a program that obfuscates an image on a monitor except for a small spotlight around the point of fixation in real time, so accuracy of fixation on the screen is paramount.
Hey, is it possible to somehow record the Unity Scene with the Recorder when using Vive mount, similar to the front camera capture option? Would really like to do this, but couldn't find info about it.
(Need to analyze focus patterns in the scene)
After I insert "from object_detector_app import Object_Detection" into world.py, main.py can't run successfully.
Hey! I have a quick question. Does anyone know if it is possible to measure pupil size?
@user-3a0646 The assertions are there for a reason. There are some features that depend on the newest versions. You might need to build uvc and ndsi from source to get the newest versions.
Hey! Has anyone tried to incorporate Leap Motion data into capture? The plan is to record hand movement and save this along with the data which is saved by Pupil Capture. I'm having some trouble accessing the Leap Motion data per Pupil frame from the backbone.
@user-6952ca Surface detection is independent of gaze calibration. Mapping gaze to a surface is a simple linear transformation: https://en.wikipedia.org/wiki/Homography
The prediction error is measured in degrees of visual angle. This unit is distance independent. Absolute errors (in screen pixels) will be higher if the subject is further away than if it is closer to the screen.
I would recommend a chin rest + 2d calibration in your case. @user-41f1bf has a lot of experience with using Pupil for screen based experiments. He might be able to comment on that.
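To illustrate the homography point: mapping a point through a 3x3 homography H is just a matrix multiply in homogeneous coordinates followed by a perspective divide. A hand-rolled sketch (not Pupil's actual implementation):

```python
def map_through_homography(H, point):
    # H is a 3x3 matrix (nested lists); point is (x, y).
    # Lift to homogeneous coordinates, multiply, then divide by w.
    x, y = point
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xs / w, ys / w

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(map_through_homography(identity, (0.25, 0.75)))  # -> (0.25, 0.75)
```

Because the transform is estimated from the detected markers alone, any error in the gaze calibration passes straight through it, which matches what is being observed.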
@user-163330 Yes, the pupil datum contains the pupil size in pixels. If you use the 3d detection mode you can read out the pupil size in millimeters
@user-06a050 Could you elaborate on your setup and which data you are trying to access from where?
@user-1ad73e Yes, this is definitely possible. I saw @user-e04f56 working on that. Please ask in the vr-ar channel on how to do that
@papr thx!
@user-6e1816 Where does this object detector come from? I advise against adding module imports to main.py! The launcher needs to be as clean as possible in order to launch processes correctly. I would recommend adding such code to a python file which is then placed into your plugin folder. Capture/Player will automatically try to load it. If you run python3 main.py debug it will even tell you details of what is not working.
On the other hand, I am wondering why there is no Python exception trace. This must be an underlying C library that crashes the program...
@papr Of course: I'm publishing hand motion sensor data to the backbone, using zmq_tools.Msg_Streamer in a separate thread. It publishes on the topic "hand". Inside recent_events I want to access this data, but only data which fits the current Pupil frame time-wise.
My current way of doing this gives me all data which was ever published on the "hand" topic since starting capture, every time recent_events is called.
Currently I use this:
def recent_events(self):
    while self.hand_sub.new_data:
        t, p = self.hand_sub.recv()
        ...
self.hand_sub being a Msg_Receiver instance which is subscribed to the hand topic. The data I'm publishing on the hand topic has the Pupil timestamp.
Is that comprehensible? 😅
@papr Alright, well I will test it out, if I get any problems, I know where to look.
@user-3a0646 In the uvc case this might cause very subtle timing problems that result in unsynchronized data. This would render any recording useless. Just as a warning.
@user-06a050 Yes, that is helpful. I am happy to see people making use of our backbone infrastructure! Your setup looks good to me. What is the exact issue that you have with it?
@papr The next step I want to make is to write this backbone data into the events dic, so I can use the data inside other plugins. For that to work, I need only the hand data from the backbone which fits to the current pupil frame. My current method gives me all the data which was ever published in the hand topic every time recent_events is called.
Alright, good to know.
@papr A naive way of doing this would be to iterate over all that data and compare timestamps, then I could only save the relevant data in events. But this is infeasible because the list of entries in that topic grows fast, very fast.
@user-06a050 The received data will never be ahead in time. Therefore the last recv() in your while loop yields the most recent datum. Do you record the data within the leap motions application? Or do you want to store everything with Pupil? In the second case I would not discard any of the data and put everything that you received in a single recent_events call into the events dictionary.
def recent_events(self, events):
    recent = []
    while self.hand_sub.new_data:
        t, p = self.hand_sub.recv()
        recent.append(p)
    events['hands'] = recent
This data will automatically be saved to the pupil_data file. Therefore you need to make sure that it is serializable with msgpack.
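To show why the drain loop only yields new data on each call, here is a self-contained mock of the pattern (a deque stands in for the ZMQ receiver; the class and the publish helper are made-up names, only new_data and recv() mirror the real Msg_Receiver interface):

```python
from collections import deque

class MockReceiver:
    # Stand-in for zmq_tools.Msg_Receiver: exposes new_data and recv().
    def __init__(self):
        self._queue = deque()
    def publish(self, payload):           # test helper, not part of the API
        self._queue.append(payload)
    @property
    def new_data(self):
        return bool(self._queue)
    def recv(self):
        payload = self._queue.popleft()   # each datum is consumed exactly once
        return payload["topic"], payload

def recent_events(hand_sub, events):
    recent = []
    while hand_sub.new_data:
        t, p = hand_sub.recv()
        recent.append(p)
    events["hands"] = recent

sub = MockReceiver()
sub.publish({"topic": "hand", "timestamp": 1.0})
sub.publish({"topic": "hand", "timestamp": 2.0})

events = {}
recent_events(sub, events)
print(len(events["hands"]))  # -> 2 (everything since the last call)
recent_events(sub, events)
print(len(events["hands"]))  # -> 0 (already consumed; nothing repeats)
```

If the same data keeps reappearing, it is being re-published, not re-received.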
@papr That is exactly what I'm doing... But as I've said: the while loop iterates over all data which was ever published since starting capture. Is there something I can do wrong with publishing to the topic?
@papr Any idea how I can build these modules myself from source? Sorry, I'm really a python noob. Maybe there is a tutorial about it?
@papr Oh, and the data from the Leap sensor is read inside a different python thread and sent to the backbone. This thread is started outside of the Pupil plugin.
@user-06a050 Do I understand it correctly that calling recv() twice yields the same data? Everything you publish is received exactly once. Therefore, if you receive the same data over and over again, you must be sending it over and over again.
@user-3a0646
git clone <repository> <repo_target_folder>
cd <repo_target_folder>
pip3 install .
@papr Yes, the same data plus something more. Haven't checked that, will do! Thanks!
@user-06a050 I recommend to only publish the most recent data. If you need previous data, cache it in the plugin
Thanks, will test it!
happy International Womens day to yall cool devs
@wrp could you please help me for setup the pupil capture software properly?
Hello guys, I'm new to the Pupil Labs eye-tracker and I have one question - can I use the 120Hz model with a Xiaomi Mi 5 through the mobile app?
@papr so I managed to install pyndsi by downloading some dependencies for it. Now I'm having trouble installing uvc. It needs DLLs from both libusb (which I managed to find) and libuvc, which I didn't find. So now I have to run cmake to generate the build script for libuvc. The problem is that I don't know the command for this. The command I mean is the one to include the other files, Step 3 on this link https://github.com/pupil-labs/libuvc/blob/master/INSTALL_WINDOWS.md
I have a problem with opening the recordings. When we try to drag the folder into the player it sends an error message that the video is not valid. Also, there is a hotkey in capture, T, and I don't see anywhere where I should use it
@user-feb6c2 Could you post a screenshot of your recordings folder content?
@user-feb6c2 Be aware that--if you did not provide a custom session name--the recordings folder should be something like 000
and not 2018_03_08
@user-3a0646 Do you mean that you do not know <pthread_dir>
?
yes, or the include statement. Basically, how do I write the arguments for cmake?
@papr I can't, it is not on my computer, but I will ask. I found the recordings in the folder but I couldn't play them
@user-3a0646 If I understand it correctly, you are not supposed to change the cmake files manually, but via Visual Studio.
The pthread_dir seems to refer to the extracted folder from Step 1)
Ah yes, but I read it as I need to use cmake first and then visual studio
I'll take a screenshot
wait a minute
Looks like you need to adapt lines 37 and 39 of the CMakeLists.txt file https://github.com/pupil-labs/libuvc/blob/master/CMakeLists.txt#L37
aaaa that could be it
let me try
Nope
Click on finish. I do not think that you are supposed to add the libs there
But in the table of the main view
Click on Add Entry
and see if it opens a window that allows you to set include and lib paths.
I clicked finished and it failed
Microsoft.Cpp.Default.props was not found
Unfortunately I cannot help you with that issue. =/
A shame the author couldn't have documented it more clearly 😦 Three hours wasted on this problem
;P
@user-e7f18e didn't we talk about your setup last week?
yes @papr
Why aren't there any pre-compiled DLLs to find anywhere? Isn't that half the point of DLLs, to easily share code?
Do you have any experience with Pupil Mobile android app guys?
@user-e38712 We have not tested this model. But it should work in theory if it has a usb-c connector
@user-3a0646 I have an idea. Download the newest windows bundle. It includes pre-compiled dlls. Maybe you can copy them from there.
@user-e7f18e What do you need help with specifically then?
to connect Pupil to my phone I would need USB-C to USB-C, right? I already have an OTG micro-USB cable and a micro-USB to USB-C adapter, but I'm not sure if it would work 😛 Probably not, because there will be lower speed
@user-e38712 Correct, you would need a usb-c to usb-c cable
@papr pupil capture software says my pupil is out of focus and I can not set it properly
it would be a great help if anybody could help me with the setup using a remote session
@papr ok, when I'll have opportunity I'll try it with mi5 and let u know if it works
@user-e7f18e Please write an email to info@pupil-labs.com
@papr That's a smart idea
@user-e38712 That would be great! 🙂
I already sent a mail to them as instructed by @mpk, but no response yet
@user-e7f18e Please be patient. We are a small company and we do our best to keep up with the support requests.
Ok well maybe I
Maybe I'm being stupid here now.
@papr sure no problem at all
I did as you said: cloned the pyuvc repo and then ran pip install . on it. Before that I configured some file paths inside the setup.py file. Now when I run the install I get an error saying "Cannot open include file: 'libuvc/libuvc_config.h': No such file or directory". And yes, if I navigate to this folder with the file explorer I have a file called libuvc_config.h.in, not sure what this is about :/ The libuvc repo is also cloned from github
Like here: https://github.com/pupil-labs/libuvc/tree/master/include/libuvc
@papr
@papr I dropped the mp4 into the player but it doesn't play it.
@user-feb6c2 Ah, no, you need to drop the entire folder. The one named 000
ah okey thank you !
Hey guys, today I have to make the decision whether we are going to keep the pupil or not, but I am still not sure, to be honest. My boss thinks that it can be helpful, but we are still struggling to produce accurate results.
Based on the feedback I have gotten here so far, my capture procedure is as follows: the test subject sits in a chair with a mobile device (5-6 inch) held in a grip on a stand in front of him, at a height that allows him to operate the device (we need him to be able to play the game). I am using the 2D mapping mode with the calibration being done directly on the mobile device (screen marker calibration recorded on a PC and played back on the mobile). So far so good, but when the subject starts playing he naturally cannot hold his head still, and any change in the angle of his view completely messes up the tracking (I tried to fix this manually in post-processing, but that's just way too tedious). We cannot really use any sort of headrest, since that would mess with the "natural gaming environment" even more than the grip already does.
Now, I really think that the technology is neat and would love to keep it, but due to my incompetence (we are launching our new game soon and there really was not that much time to play around with this, despite having a month) or due to pupil not really being that much of a fit for our specific purpose (choose one), I have not been able to produce results based on which I could confidently recommend the purchase to my boss.
To sum this up, any last minute advice before I have to make a decision? Whatever the result I appreciate what you are doing here and wish you a lot of success!
@papr I managed to play the video and export the data, but now I got .csv files, and even after converting for excel it still doesn't open them; it says that there is a loading error. Do you know something about that? Thank you!
@user-feb6c2 Excel says that there is a loading error? Would you mind sharing the csv file with me?
With the other computer I could open it; it just doesn't work with the one I made the recordings on.
Can I ask all the questions that come up while I'm studying the instructions?
@user-feb6c2 The csv files seem to be correct. Maybe the excel conversion messes something up, but I can not say for sure.
Sorry, it is a lot, but for me the instructions on the web page are not really clear 😦
Ok, let's tackle them one by one 🙂
The T hotkey starts the accuracy test procedure. Just perform the same steps as in the calibration. The difference is that it starts the accuracy visualizer plugin instead of creating a new calibration.
3.1. The accuracy visualizer has two phases: 1) collection of pupil data and reference points, and 2) mapping of pupil data using the current calibration and comparison of the mapping result to the reference points.
3.2. The hololens relay is a specialized plugin for our Hololens integration. https://github.com/pupil-labs/hmd-eyes/tree/master/unity_pupil_plugin_hololens
3.3 The log history just lists all past log messages in the side menu. This is just a convenience plugin if you missed a log message.
3.4 Surface heatmaps can be enabled in the Surface Tracker plugin. Open it via the Plugin Manager, mentioned in 2.
3.5. There is a Capture Annotation plugin to which you can send annotations remotely via our network api.
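To make 3.5 concrete, here is a minimal sketch of building an annotation payload. The key names (`topic`, `label`, `timestamp`, `duration`) follow the shape used by the pupil-helpers annotation examples; the network-send part is shown only as comments, since it needs a running Pupil Capture, and the function name `make_annotation` is just illustrative.

```python
import time

def make_annotation(label, timestamp, duration=0.0, **extra):
    """Build an annotation payload in the shape the Capture Annotation
    plugin expects. Extra keyword arguments are carried along unchanged."""
    payload = {
        "topic": "annotation",
        "label": label,
        "timestamp": timestamp,
        "duration": duration,
    }
    payload.update(extra)
    return payload

# Sending (sketch only): serialize with msgpack and publish on Capture's
# PUB port obtained from Pupil Remote, e.g.
#   pub_socket.send_string("annotation", flags=zmq.SNDMORE)
#   pub_socket.send(msgpack.dumps(payload, use_bin_type=True))

note = make_annotation("trial_start", time.time(), trial=1)
print(note["label"], note["duration"])
```

The timestamp should be in Pupil time if you want the annotation to line up with gaze data, which is where the clock synchronization discussed elsewhere in this channel comes in.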
can I ask something more c++ related then pupil but that still has to do with the device
?
@user-f1eba3 Yes, but I might not be able to answer it.
@user-feb6c2 4. We will do this separately 5. Plugins can access pupil and gaze data. The pupil detection algorithm itself cannot be influenced via plugins.
So I am sort of stuck. I'm trying to do a simple app to subscribe to Pupil, because if I have that I can very quickly build a plugin for Unreal Engine. I have tried cpperoMQ and it simply breaks at every step. Now, should I try to use this and cmake it on Windows (yes, I'm only using Windows for my current setup), or do you see some other way I can go forward with this?
By this I meant pressing the A hotkey and showing the surface to the camera.
7.5. Marker Tracker plugin = Surface Tracker plugin. See question 2
7.6. Timestamps are created by a monotonic clock and are only useful when comparing timestamps to each other. If you want them to be meaningful (e.g. representing the current time) you will have to synchronize Pupil Capture's clock with your reference clock.
7.7. The fixation detectors are based on the same method to detect fixations: max dispersion in a minimum duration. They differ only in the event format
7.8. https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py#L23
@user-f1eba3 I would recommend cppzmq. But I would look for precompiled dlls...
That's what I'm also looking for.
@papr thank you so much ! 7.3 In the introduction of frame publisher, they say that it broadcast videoframes but I don't understand what kind of frames and how can they be useful ?
It allows external applications to access the video frames of the Pupil headset. This is needed because it is not possible for multiple applications to access the same camera at the same time. This is a developer feature only.
Okey thank you so much ! I think I will get back to you again because we have difficulties with the program but so far thank you again !
@papr I'm still struggling. The problem was that I receive the same data when calling recv twice. I made sure that my code itself doesn't send the data twice and now I made sure that there is only one socket connection per thread (one pub for the thread which sends the Leap Sensor data and one sub for the Pupil plugin) just to be sure. But the problem persists, I get the same data twice when calling recv! Do you have any suggestions where to look for solutions?
First I would make sure that this is not due to Leap Motion sending duplicated data. Start by sending a simple counter value from one thread to the other. But another question: do you use Python's multiprocessing or threading module? If you are using Python threads, you can use a shared queue instead of zmq.
And if you use multiple processes, I would suggest using the Push-Pull pattern and not the Pub-Sub one. Pub-Sub is for one-to-many relations; Push-Pull is for one-to-one.
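The thread-based alternative suggested above (a shared queue instead of zmq) can be sketched with nothing but the standard library. The producer/consumer names here are illustrative stand-ins for the Leap listener thread and the plugin's `recent_events`:

```python
import queue
import threading

def producer(q, n):
    # Stand-in for the Leap listener thread: push a simple counter.
    for i in range(n):
        q.put(i)
    q.put(None)  # sentinel: no more data

def consumer(q, received):
    # Stand-in for the consuming side: drain items until the sentinel.
    while True:
        item = q.get()
        if item is None:
            break
        received.append(item)

q = queue.Queue()
received = []
t1 = threading.Thread(target=producer, args=(q, 5))
t2 = threading.Thread(target=consumer, args=(q, received))
t1.start(); t2.start()
t1.join(); t2.join()
print(received)  # each item arrives exactly once, in order
```

Because `queue.Queue` hands each item to exactly one `get()` call, the duplicated-delivery symptom discussed here cannot occur with this pattern.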
Using a simple print from the function which executes ipc_pub.send() I made sure that this gets called only once per Leap datum. I don't create any threads myself, this is done by the Leap SDK (and Pupil of course). Using threading.get_ident() I checked that there are only the two described threads at play and to make sure the socket connections aren't used across thread borders. The final solution should send the Leap data to the backbone, because we potentially want to use this in different applications too.
So data also arrives duplicated if you send a simple incrementing counter value?
Yes
@papr Could you please check my post above (3 hours ago)? I have to make the decision now and would really like to know whether I can achieve the desired results (if it is technically possible given enough time and understanding of how everything works) or not.
@papr Here is some code, maybe this helps. It's quite simple and short. https://pastebin.com/SzNRwais
@user-4d2126 I am afraid that Pupil might not be the correct tool to achieve your goals. I would suggest trying a remote eye tracking system that uses the phone's front camera (or any camera that is fixed to the phone itself). But I do not know what the current state of the art is for such systems' accuracies. I would love to hear back from you in case you try some. I wonder how close Pupil gets---in terms of accuracy---to such specialised tools...
@papr Thanks for your answer, appreciate it. Hope that in time there will be a solution for this. Definitely gonna check this channel from time to time to stay informed on the topic. Also thanks for all the help along the way! Good luck with your endeavours.
Good luck to you, too!
@user-06a050 ~~I do not see the problem. That is exactly the output I would expect.. 🤔~~ Never mind, I overlooked something
btw this is awesome, is there a way to buy you a coffee? Or a beer?
@papr Well that's funny 😄
on_connect and on_frame are running in one thread and recent_events in another, if that's important
Where do you initialize LeapListener.counter
?
Oh sorry, that'd be inside the LeapListener class, as a class property
With 0
ok, was just a detail
Your code looks correct to me.
And the output?
The output is unexpected
Is it normal to receive the same data twice? Because you said before that it is not.
Ah okay
Could you try to run the LeapListener + leap api stuff in a separate python script?
Will do, gimme a minute
No change
Well except for the fact that there is obviously no FRAME X output in the Pupil output
Oh wait, there is something amiss
has anyone here succesfully used zmq for c++ on windows?
Okay, I did everything correct. The output is still wrong.
Not everything, then the output wouldn't be wrong. You know what I mean.
Ok, let's try to simplify the test further. Please ~~create a~~ use this stand-alone sending example, which does not rely on the Leap Motion API: https://gist.github.com/papr/94982539daa1cca1e02957523958f9c1
@papr is someone around the comunity more c++ capable than me :))) ?
I've really been struggling for the past 2 days to subscribe to Pupil using C++
@user-f1eba3 But are you struggling with C++ itself as a language, with the cppzmq lib compilation, with the cppzmq API, or with the general concept of how to use zmq?
cppzmq lib compilation
I did not find a tutorial or something to work with 0mq and cpp
there are a bunch with python though
Did you manage to get step 1 of the cppzmq build instructions running?
Had to change time.monotonic_ns() to time.monotonic(). The output is here: https://pastebin.com/qpuSwGMp
I removed some of the output because I was slow and the log was really long...
Eh, yeah, monotonic_ns() is a Python 3.7 feature. My bad. But the output is extremely unexpected.
Especially since there is no Leap involved now
It probably does not play a role at all, but please execute /usr/bin/python3.5 main.py
within /home/patrick/Code/pupil/pupil_src
im running on windows not on mac
@user-f1eba3 Are you using https://cmake.org/runningcmake/?
@user-f1eba3 You need to start by building libzmq: https://github.com/zeromq/libzmq/blob/master/INSTALL https://github.com/zeromq/libzmq/tree/master/builds/msvc
@papr Tried that, still no change. Is it of any importance that I regularly have to change the base class of my plugin to make Capture actually load it? When I use Producer_Plugin_Base it sometimes just isn't loaded until I switch it to Plugin and activate it in the settings menu. After that I can switch the base class back to Producer_Plugin_Base.
Producer_Plugin_Base
is not a base class you should be using for a custom plugin.
Oh, I don't think that's said anywhere 😄
Will change it
Even though it makes semantic sense, it was used in Pupil <v1.0 to display producer plugins in a different drop-down menu. In the current version, the Plugin_Manager excludes all plugins from its UI that inherit from any of these: System_Plugin_Base, Base_Manager, Base_Source, Calibration_Plugin, Gaze_Mapping_Plugin, Producer_Plugin_Base.
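A user plugin that should appear in the Plugin Manager therefore inherits from `Plugin` directly. The sketch below is self-contained only because a stub stands in for Pupil's real base class (which lives in Pupil's `plugin` module); the class and attribute names besides `recent_events` are illustrative.

```python
class Plugin:
    """Stand-in for Pupil's plugin.Plugin base class, so this sketch
    runs on its own. A real user plugin would subclass the actual
    base class provided by Pupil instead of this stub."""
    def __init__(self, g_pool):
        self.g_pool = g_pool

class My_Custom_Plugin(Plugin):
    # Inherit from Plugin -- not Producer_Plugin_Base or any of the
    # other excluded base classes -- so the Plugin Manager lists it.
    def __init__(self, g_pool):
        super().__init__(g_pool)
        self.gaze_seen = 0

    def recent_events(self, events):
        # Called once per world frame; gaze/pupil data arrive in `events`.
        self.gaze_seen += len(events.get("gaze", []))

p = My_Custom_Plugin(g_pool=None)
p.recent_events({"gaze": [{"confidence": 0.9}]})
p.recent_events({})
print(p.gaze_seen)  # → 1
```

Dropping the file into `~/pupil_capture_settings/plugins/` is what makes Capture pick it up, as described elsewhere in this discussion.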
Sorry to jump in in the middle of these other discussions. I'm still having some trouble with the pupil mobile app. I added a 256 GB micro SD to my phone and set it to save to the SD card. But it seems to stop recording randomly still. It should have sufficient memory for a short (~5-10 min) bike ride (I'm studying cyclists). I thought maybe it's because the phone was in the rider's pocket, but the file didn't save properly when this happened so I think it isn't a press of the record button stopping it. When the problem occurs, the info file is missing from the folder. It doesn't always cut the recording early either. What could be the cause of this?
@papr But sadly it doesn't help with my current problem 😄 Well it's been 4 straight hours now with no advancements, I think this'll have to wait until Monday. Thank you so much for your help! Is there a way to donate? It wouldn't be much, but this/you deserves something.
@user-072005 This might be related to https://github.com/pupil-labs/pupil-mobile-app/issues/10 Please post a comment with your experience there
Ok I'll do that
@user-06a050 Referring us to your colleagues and mentioning us positively in your research publications would be more than enough 😃
@papr Will do, thanks again! Have a nice day!
@user-06a050 You too!
Hello! I have a Capture file that when I drop it into Player, sometimes Player shuts down, sometimes it will work until I want to add a plugin (specifically the eye overlay). Do you have any suggestions for what I can do? I've gotten as far as offline pupil detection and then it usually shuts down.
PS-I just downloaded the latest Player version, and it still shuts down when I try to add eye overlay
One more question - I've found I have to delete my settings when I download the latest update, but then all of my manual calibration points disappear and I have to redo. Is there a way around that?
Hey guys, I keep running into an issue and I'm deciding if I need to post it on Github. I am having trouble with Pupilcapture only detecting my World cam unless I delete the pupil_capture_settings and even then, the problem appears upon my next opening of the application. When I do get the eye cam, I get a crazy amount of error messages that loop continually. I've added a screenshot of this. I am thinking it could possibly be a cable error, however I haven't had an issue with this cable before so it seems unlikely. I just lack any reliability in getting the program to work which is very frustrating.
@papr This object detector comes from world.py. Now I have moved it to the plugin folder, but there is still a crash...
So when I go to record, the marker that shows where I am looking always seems to be off by a few inches. I will look at the center of my screen and the marker will show up a few inches below that, even though it seems to be calibrating properly.
Hi @user-02ae76 You are using a 200hz eye camera system, correct? There might be a hardware issue here, just to check are you using the cables that shipped with your Pupil headset?
@user-e1dd4e please try the accuracy test after calibrating and let us know results
Hi @wrp and everyone. we try to get pupil capture working on a mac os 10.9 (Mavericks), but it crashes at starting. We have tried the last version of pupil software and the previous one. It works fine on another computer running El capitan. I attach the crash report. Any hint about what it might be happening would be great. Thanks
Thanks @user-cbb918 I will take a look today
@user-cbb918 the issue is not OS version related but comes from the fact that your 10.9 machine runs an Intel CPU that is not an i3/i5/i7; the bundle only works with this type. If you need to run on this machine you will need to install all dependencies and run from source. I hope this is helpful!
@user-2798d6 we need a bit more info. Are you using Pupil Mobile? If so and you are using 200hz cameras note that there was a bug that broke recordings. Have since fixed this.
@mpk I am not using pupil mobile
Hi, I'm having trouble with pupil mobile changing the camera settings during the recording. I'm using it outside. I will set it to have the frame rate be dynamically altered, but during a short recording (under 10 min), it will change back to the default and then my recording is useless as it is all white washed. Is there something I can do to address this?
Hey guys, what module do you use to read and display frames from the world.mp4 without having to convert to ffmpg?
...and what module should I look at as an example?
@user-8779ef we use pyav. Check out their examples on github: https://github.com/mikeboers/PyAV/blob/master/examples/save_frames.py
@AprilG#6958 I recommend raising an issue with our Android devs here: https://github.com/pupil-labs/pupil-mobile-app we can address this quickly and push a release.
@user-2798d6 in this case I will need the logfile or terminal output after the crash. This is an unknown bug.
@mpk You use it for loading and displaying the images as well?
It's part of our source, yes. Have a look at their other examples or our source code.
Has anyone else had trouble with pupil mobile changing the camera settings mid-recording?
Thanks mpk
Many thanks @mpk, we will try to get it compiled from source
@mpk - I'm not sure how to send you the log file or terminal output. Do you want me to send the Capture file?
I'm studying cyclists and using Pupil Mobile, and I'm having a problem with the recording shutting off randomly. I've tested a bunch of ideas for the cause while not on a bike (such as it bouncing around in the pocket or having connection issues with the cord) but it has never shut off prematurely. I am now wondering if the IMU in the app is having trouble with the cyclists' acceleration. Is this possible? Is it possible to disable the IMU recording?
Nevermind, it wasn't that. It's Wi-Fi cutting in and out, does the app actually need wifi?
@user-072005 I will try to replicate this. Thank you for debugging the issue. The wifi is only used for streaming but should not terminate a running recording.
This is actually easy to reproduce.
Glad I was helpful this time and not just asking a million questions.
I accidentally clicked on "Clear Natural Features" in the offline calibration in Player...is there anyway to undo this?
@user-2798d6 No, unfortunately not. =/ Maybe this button should be positioned at the very bottom of the offline calibration menu...
Hey! I'm still stuck with the problem from last week: I've got one Python script which sends data into the backbone and another one that's supposed to receive it. The receiving end (recent_events in my Pupil plugin) somehow gets every message that was ever sent for that topic on every call. I expect it to only receive messages which it hasn't received before. @papr we talked about this a few days ago. I have no freaking idea where to look for solutions or what to try next. Here's some code: https://pastebin.com/ASAHpEW9 Don't be bothered by the Leap naming of things; this is because I want to include Leap Motion sensor data in Pupil, but as you can see in the code I'm not using actual Leap data, because I'm currently trying to fix this problem. Any help would be greatly appreciated!
Hello, I have an issue with Pupil Player (latest version). When I load a recording only eye1 seems to work; eye0 confidence is at 0 for the entire recording, and if I use the "eye video overlay" plugin the image is frozen.
If I try to do an offline pupil detection, both eye videos are running, but after that only eye1 is working.
During the recording with Capture (also latest version) both eyes seemed to work (at least the confidence threshold was varying over time for both eyes).
I'm using Windows 10.
Hey @user-c14158 would you mind sharing a screenshot of your recording folder that includes file sizes? How did you record the videos? Pupil Capture or Pupil Mobile?
The videos were recorded with Pupil Capture; I can play both eye videos with VLC.
Ok, that would have been my next question. 🙂 Could you upload the recording and share it with pp@pupil-labs.com ? I would like to replicate the issue locally.
Sure, does a zip of the folder work for you?
Yes, it does
Done, thanks for your help.
@user-06a050 Could you try a different topic name than hand?
@user-06a050 Funny enough, we can reproduce your issue if we publish/subscribe with hand
but not with something different, e.g. han1
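One detail worth knowing when picking topic names: zmq SUB subscriptions are plain byte-prefix filters, so subscribing to `hand` also matches any longer topic that starts with `hand`. Whether that is what caused this particular collision is speculation, but the matching rule itself is easy to demonstrate without a running broker:

```python
def sub_filter_matches(subscription, topic):
    """zmq SUB subscriptions are byte-prefix filters: a message passes
    the filter if its topic starts with the subscribed string."""
    return topic.startswith(subscription)

topics = ["hand", "hand.left", "handshake", "han1", "gaze.3d"]
matched = [t for t in topics if sub_filter_matches("hand", t)]
print(matched)  # → ['hand', 'hand.left', 'handshake']
```

This is also why subscribing to `han1` behaves differently: it is a prefix of fewer (here, exactly one) topic strings.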
...
@papr That's interesting... Will try it later and let you know if it works! Thanks!
Hi everyone! Got a problem with Intel RealSense + MacBook Pro. After starting Pupil Capture, everything goes OK for about 30 sec, after that video from Intel crashes and only re-plug and restarting app can return video, again for around 30 sec. It's ok on Windows, but on Mac - crash. What can I do to fix that?
How old is your Macbook @user-1d894f ?
A1706, pretty new, 2017 I guess (got it only this month)
Alright, does that mean that you use a usb-c to usb-c cable to connect the headset to the macbook?
I tried both options - a USB-C to USB-C cable and USB-C to USB 3.0 over a hub.
Could you share the capture/log file with us after such a crash happened?
I guess here it is
2018-03-12 17:25:58,880 - MainProcess - [INFO] os_utils: Disabled idle sleep.
2018-03-12 17:26:02,561 - world - [ERROR] video_capture.realsense_backend: Camera failed to initialize. No cameras connected.
2018-03-12 17:26:02,895 - world - [WARNING] launchables.world: Process started.
2018-03-12 17:26:04,654 - eye1 - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
2018-03-12 17:26:04,654 - eye1 - [INFO] camera_models: No user calibration found for camera Ghost capture at resolution [320, 240]
2018-03-12 17:26:04,654 - eye1 - [INFO] camera_models: No pre-recorded calibration available
2018-03-12 17:26:04,654 - eye1 - [WARNING] camera_models: Loading dummy calibration
2018-03-12 17:26:04,813 - eye1 - [WARNING] launchables.eye: Process started.
2018-03-12 17:26:06,866 - eye1 - [INFO] launchables.eye: Process shutting down.
2018-03-12 17:26:07,871 - MainProcess - [INFO] os_utils: Re-enabled idle sleep.
Which Capture version do you use?
1.5.12
Ok. We will try to reproduce this issue.
Thx
hey
I wanted to ask something but while thinking of the question I answered myself 😄
Love those moments
Could somebody help me write a zmq request from a third party library
?
@wrp sorry for the delay, I am using the 200Hz eye cam system. We actually were not using the cables because I operate on a MacBook Pro which only has USB-C ports. We used an AmazonBasics USB-C to USB-C cable. I was worried it might be the video transfer capacity of the cable. We have had more luck using the iMac charging cable (also C to C), so I may check out what specs it has to determine what could be going wrong.
@papr You sent me a reference to a plugin that uses the Python serial library before. Is it possible to modify the plugin without installing the Pupil dependencies? Sorry for the noob question, but I am new to Python and also not used to developing large programs which include a bunch of (20+) libraries.
@papr Actually, I made external electronics hardware for studying a subject's eye movement (when focusing on an arriving object). It is a ball screw rail with a visual stimulus marker that gets closer and closer to the subject's eyes. My goal is to use serial communication to synchronize Pupil with my device, so that I will have the timestamp and travel distance of the visual stimulus. I have been reading about ZMQ these days and having a hard time installing the dependencies on Win10 and Mac, not quite successfully. So I am looking for another solution that does not require building from source.
@user-c14158 I found the issue with your recording. For some reason there is a time offset in your eye1 data. The video is not broken. Can you give more details on how you made the recording? Capture version, bundle/source, OS, etc?
Never mind, the info.csv contains all I need. Do all your recordings have this issue or only this one in particular?
@user-e02f58 You can use the serial module when running from source. But you will have to install the module folder within the plugins directory. Your custom plugin should be there as well. E.g. on mac
~/pupil_capture_settings/plugins/pyserial/
~/pupil_capture_settings/plugins/my_custom_plugin.py
We had this issue with successive recordings, but the issue disappeared after restarting Pupil Capture.
If you say successive recordings, do you mean that the issue did not appear in the first few recordings but then in the later ones? Can you estimate after how many recordings that happend?
We had this issue in one session involving multiple recordings (this issue was present from the first recording of the session), however when restarting pupil capture and in different recording sessions (for example on a different day), the problem did not appear.
Mmh. Please tell us in case this happens again.
Sure. Can you think of a way for me to retrieve the data from the undetected eye?
Yes. You fit a linear function to both eye0 and eye1 timestamps, and subtract the difference in the functions' intercepts. This will align the eye timestamps but is not a 100% correct restoration.
import numpy as np
from scipy.stats import linregress

wts = np.load('world_timestamps.npy')
ets0 = np.load('eye0_timestamps.npy')
ets1 = np.load('eye1_timestamps.npy')

def fit(ts):
    return linregress(np.arange(ts.shape[0]), ts)

result = list(map(fit, [wts, ets0, ets1]))
intercepts = [r.intercept for r in result]
slopes = [r.slope for r in result]

print('timestamp offsets')
print(f'\teye0 - wts: {intercepts[1] - intercepts[0]}')
print(f'\teye1 - wts: {intercepts[2] - intercepts[0]}')
print('timestamp divergence')
print(f'\teye1 - eye0: {slopes[1] - slopes[2]:f}')
ok i will try that, thx a lot
The above script yields these results for your recording:
timestamp offsets
eye0 - wts: 286.6918953607383
eye1 - wts: 0.298578210244159
timestamp divergence
eye1 - eye0: 0.000064
Question: Are there any recommended transfer rate specs for usb-c cables (for attaching headset to laptop)? I believe I may not have one that is powerful enough. (Amazon basics USB-C to USB-C). Recommendations for tried and true cables are also appreciated. Trying to avoid using an A to C adapter for my laptop port if possible.
Trying to decide if I would be better off using the original USB-A to C cable with an adapter/extender or whether it would be overall better to just use a long (10ft) USB-C to C cable.
I've noticed that when I download a new version of the software, I have to delete the user settings, but then all of my calibration points are gone. Is there a way to keep the work I've done on a file in a previous version of the software when I download and use the new version?
Has anyone used the glasses outside with the pupil mobile app? My calibration has been turning out terribly and I was wondering if anyone had some tips.
@user-02ae76 https://www.amazon.de/Certified-CHOETECH-marker-Delivery-MacBook/dp/B06XDB343M/ref=sr_1_14?ie=UTF8&qid=1519751041&sr=8-14&keywords=choetech+usb+c+3.0 this should do
Hi guys, just a short question: I'm about to choose between buying a new MacBook Pro 2017 and a MacBook Air. Did you find any particular problems with these laptops? For example, any problems with the USB-C ports of the new MacBook Pro? Any suggestions are very welcome 😃
@user-1bcd3e usb-c on the macbook is fine. You will just need a usb-c-c cable.
@mpk ok thanks a lot!
@user-1bcd3e See my link above if you need a recommendation for such a cable
@papr Yes I got it ... very helpful... thanks! 😃
@user-2798d6 when you update, we have to reset the settings. It's very hard to test that old sets of settings remain valid as the code changes.
Hello guys, remember when I was talking about testing Pupil Mobile with the Xiaomi Mi5?
All cameras seem to be working fine, but the right eye camera is rotated 180 degrees
left is fine
This is due to the physical orientation of the right eye camera
Is it possible to calibrate using the Pupil Mobile app?
I also tried to record something, but I can't find where it was saved
ok, found it in Videos
@user-e38712 You are not able to calibrate on the phone. You could stream the video over wifi to a computer running Pupil Capture and calibrate there. Or you can do offline calibration in Player.
When I use the glasses outside, the camera gets washed out because of overexposure. Is there a setting other than the autoexposure priority I can adjust to help this?
@user-072005 what camera? Eye or world?
World, both a bit but I think the world is the bigger problem
But when I disconnect from wifi, it won't let me change the autoexposure priority
Are you sure that is due to the loss of wifi and not due to the recording running?
Yes, I checked it while I wasn't recording as well
Actually, I can't seem to change anything after auto-exposure mode
Were you able to replicate it? Perhaps it is my phone? I'm using samsung galaxy s8 because the school already owned it.
Would an absolutely primitive method like wearing sunglasses over the cams work?
@user-072005 Could you list the exact steps to reproduce the issue? I will have a try on my One Plus 3
Ok, start not connected to the wifi. I have the glasses facing outside so it's bright and I can see if the setting changes immediately. I opened the Pupil Cam1 ID2 and tapped the upper left button for settings. Then I tried to change the auto-exposure priority to "frame rate may be dynamically varied by the device". When I click it, the setting looks changed, but the camera doesn't actually change, and if I reopen the settings tab it's back at "frame rate must remain constant".
And I was never recording during this
@user-cdabd8 I'll try that out if I need to
Ah I see it was assigned on GitHub. Thanks for the help guys @mpk and @papr. I'm leaving to conduct the study on Friday, so hopefully you won't hear from me quite as often soon
Hi, I am using the 1.3.9 version of Pupil Player, and I'm trying to do offline calibration. In the earlier version I used to be able to see the frame number in the bottom left of my screen. However, in this version I can't find it, which makes it hard for me to enter the calibration range.
Also, I have been recording the videos using 3D detection and mapping mode, however when I run these videos on pupil player when i select calibration mode 3d, the calibration status shows "failed".
@user-2da779 Please upgrade to the newest version. You can set calibration/mapping ranges easily from the trim marks.
thank you
@user-2da779 In regard to your second issue: Did you record the calibration procedure? If yes, did you run the circle marker detection? If no, did you define natural features and change the mode for the section?
I haven't clicked on the circle marker detection. I will try that, thank you!
Just to elaborate on what the issue was: without the circle marker detection there are no reference points that the calibration can use. Therefore it fails. I guess the user feedback could be improved in this regard.
Ok, so I clicked on circle marker detection and selected recalibrate. It shows a failed calibration. However, when I change the calibration mode to 2D, the mapping completes.
Did you use offline pupil detection as well?
yep, and the detection mode for that was 3d too
Mmh, then it should have worked.
How do you run calibration with the Pupil HTC Vive add-on?
When I press C in Pupil Capture, it says "Calibration requires world capture video input"
MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
Running PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
Running PupilDrvInst.exe --vid 1443 --pid 37425
OPT: VID number 1443
OPT: PID number 37425
Running PupilDrvInst.exe --vid 1443 --pid 37426
OPT: VID number 1443
OPT: PID number 37426
Running PupilDrvInst.exe --vid 1133 --pid 2115
OPT: VID number 1133
OPT: PID number 2115
Running PupilDrvInst.exe --vid 6127 --pid 18447
OPT: VID number 6127
OPT: PID number 18447
Running PupilDrvInst.exe --vid 3141 --pid 25771
OPT: VID number 3141
OPT: PID number 25771
world - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied.
world - [WARNING] camera_models: Loading dummy calibration
world - [WARNING] launchables.world: Process started.
Running PupilDrvInst.exe --vid 1443 --pid 37424
OPT: VID number 1443
OPT: PID number 37424
Running PupilDrvInst.exe --vid 1443 --pid 37425
OPT: VID number 1443
OPT: PID number 37425
Running PupilDrvInst.exe --vid 1443 --pid 37426
OPT: VID number 1443
OPT: PID number 37426
Running PupilDrvInst.exe --vid 1133 --pid 2115
OPT: VID number 1133
OPT: PID number 2115
Running PupilDrvInst.exe --vid 6127 --pid 18447
OPT: VID number 6127
OPT: PID number 18447
Running PupilDrvInst.exe --vid 3141 --pid 25771
OPT: VID number 3141
OPT: PID number 25771
@user-d9bb5a you have a realsense world camera. Make sure that this backend is enabled.
Yes)) I deleted the settings and everything started working
@user-b23813 please put the size of the screen in pixels for the surface size
@wrp so it should be, for example, 1920x1080 for both surfaces?
1920, 1080 for the full screen surface and if the other surface is smaller make it proportionally smaller
the size that you specify for the surface is used for exporting a png of the surface - so it can be proportional (does not need to be exact)
the size is also used to determine the number of bins used in creating the heatmap histogram
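The proportional-sizing advice above can be made concrete with a tiny helper. The numbers and the function name are just an example of the idea: pick the full-screen surface size in pixels and scale the smaller surface by the fraction of the screen it covers.

```python
def surface_size(full_px, fraction):
    """Scale a surface size proportionally to a reference surface.

    full_px:  (width, height) of the full-screen surface in pixels
    fraction: the smaller surface's extent relative to the full screen
    """
    w, h = full_px
    return (round(w * fraction), round(h * fraction))

full = (1920, 1080)             # full-screen surface, as recommended above
half = surface_size(full, 0.5)  # a surface covering half the screen extent
print(half)  # → (960, 540)
```

Since the size only drives the exported PNG resolution and the heatmap bin count, being proportional rather than exact is sufficient, as noted above.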
@wrp thanks I ll try that
You're welcome @user-b23813
I'm looking at the offline calibration section of the user docs. If I want to use natural feature selection to calibrate, would I still also use the marker detection first, then add natural features? So I would use both, not just natural feature selection?
@user-072005 you do not need to use markers at all if you want to use natural features
You can also define multiple calibration sections if you like
but ranges can not overlap (e.g. you can't apply two calibration methods onto the same range)
ok thanks. So, if I want to define a section in player, I would click the "set from trim marks" for the calibration? Nothing seems to happen when I do this
@user-072005 That's because the default trim mark positions are aligned with the default calibration section ranges. Move the trim marks around and click the Set from trim marks button again
so, a little embarrassed to ask this, but what do the trim marks look like? I can't find something that will move
@user-072005 at the end of the seek bar in the bottom of the Pupil Player window
Oh I see it, thanks
For others who might wonder about "trim marks"
the green box around the seekbar determines the trim marks. By default the trim section includes the entire duration of the recording.
The trim marks also define which section of the recording will be exported.
This is a bit of a theoretical question, but do you know if the size of the calibration marker has much impact on the calibration? I printed it out on a regular letter-sized sheet of paper and it wasn't picked up well at the distance I was calibrating at. I was wondering, if I print a bigger one, is it as if I am calibrating closer than I really am? And I'm generally trying to decide how big I'll print it.
The recommended diameter of the printed marker is 94mm
ok I'll stick to that
The recommended distance is 1-2.5 m
Would that be true even if the person wearing the eye tracker is expected to be looking at objects further away than that?
then not. Try something like 3m then. See if the marker is still tracked.
ok, and then maybe scale the marker according to the additional distance if it isn't?
only if the marker is not detected.
ok I'll experiment with that today
Thanks
@mpk 94 mm is the diameter of the outermost white ring?
@papr thanks for the cord recommendations! Unfortunately I believe I have isolated the issue as a software problem. I tested both of my headsets with their in-box cables on my iMac and got strange results: both showed the World view no problem, but the eye camera shows up as just a black screen with nothing getting picked up (almost as if the light isn't on). I then retested on my MacBook Pro using an Apple USB-C to C cable and got both images for both headsets. I did try several fixes on the iMac (deleting settings, selecting input sources, etc.) but cannot get an eye image on it. I used to have no issues using this machine. I think this confirms at least that my headsets are functional, but I'm considering posting on GitHub since I've had such issues with consistency.
Any ideas why my eye image might show up as a black screen on one machine and show up fine on another?
Hi @mpk - I sent you the file of the recording that keeps shutting down when I try to work with it in Player. You also mentioned a log file and terminal output - will you get that from me sending the recording file or is that something separate?
@arispawgld#8014 make sure to use the latest version of Pupil Capture. The issue you are outlining sounds like you are using an older version of capture.
@mpk I'll check this out, I believe I do have the latest version but I'll update if installing the new version doesn't help.
Hey, where do I download the little QR codes so I can bind my pupil positions to a monitor?
Has anyone built the zmq library for Windows?
thank you @papr
@user-f1eba3 Are you still struggling with that? 😕
yup
I found the way to run my code in Unreal (I guess)
I just need to compile the god damn library
are you locked into using unreal? I've had success using Unity
Hi! Could I ask what the progress of the saccade detection plugin is, and is there a rough release date?
@user-dc2842 I am sorry to report that there has not been any progress with the saccade detection plugin yet. My colleague and I have been busy implementing audio playback for Player.
@papr Ok, understood. I found a paper with a robust algorithm for saccade detection (doi:10.1109/EMBC.2014.6944931); hope it may help you in the future : )
@user-ecbbea Yes. I want to integrate Pupil into a bigger system that uses multiple robots, and everything is developed under C++/Unreal.
Hi, has anyone here been successful in getting a normal webcam working with this software? A friend of mine managed to do it on an older version on Mac, and I'm trying to port his changes to the up-to-date Windows version, which has changed a lot. His "hack" was based on a modified backend called non_uvc_backend.py. I've managed to add a new category "Non-UVC-Source" to the GUI, and I start a new thread where I want the stream to run. However, it doesn't seem to work. https://gist.github.com/Baxtex/f2aacef643c50157e6c1fa167f9cd2cd Does anyone have experience with doing this? I need it to try a few things. The modified backend is based on fake_backend.py
@user-d72566 Ideally you would not have to open a new thread. You would grab the image in the main thread. This avoids code complexity.
For the backend you need two plugin classes: a Source and a Manager. The Manager lists all available cameras that your Source class can use to grab video from. After selecting a camera, a single Source object is instantiated and is responsible for grabbing the frame from the camera during the recent_events() call and adding it to the events dict.
Also, this is not the correct way anymore to display a frame: https://gist.github.com/Baxtex/f2aacef643c50157e6c1fa167f9cd2cd#file-non_uvc_backend-py-L46
As mentioned above, you need to provide a Frame object that is placed into the events dict (the argument of recent_events) under the key 'frame'.
Let me know if you have any further questions. I am happy to see that people use the possibility to write their own backends 😃
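For readers following along, the Source/Manager split described above could be sketched roughly like this. The Base_Source/Base_Manager stubs, the camera's read() interface, and all names here are illustrative stand-ins, not Pupil's actual plugin API:

```python
class Frame:
    """Container for one video frame plus its timestamp."""
    def __init__(self, img, timestamp):
        self._img = img
        self.timestamp = timestamp

    @property
    def img(self):
        return self._img


class Base_Source:
    """Stub standing in for Pupil's real Base_Source plugin class."""
    pass


class Non_UVC_Source(Base_Source):
    def __init__(self, camera):
        # camera: anything with a read() method returning (image, timestamp)
        self.camera = camera

    def recent_events(self, events):
        # Called once per world-process loop iteration: grab exactly one
        # frame and publish it under the 'frame' key of the events dict.
        img, ts = self.camera.read()
        events['frame'] = Frame(img, ts)


class Base_Manager:
    """Stub standing in for Pupil's real Base_Manager plugin class."""
    pass


class Non_UVC_Manager(Base_Manager):
    def list_sources(self):
        # The real manager would enumerate connected cameras; a fixed
        # list keeps the sketch self-contained.
        return ['builtin-webcam']
```

The key point is the division of labour: the Manager only enumerates and selects cameras, while a single Source instance does the per-frame work inside recent_events.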
Hmm, interesting, thanks. I will try a few things, will report back here later today! 😃
I believe I have done this, more or less. Let me post my full backend file and an image of how it looks.
Here is the backend: https://gist.github.com/Baxtex/f6e553de2d56dcf61dd45809e9ac1670 This is how it looks: https://imgur.com/a/TJUSg When I press any of the buttons, it starts a new thread but then nothing happens. Going to experiment with it a bit. PS: I have included None_UVC_Source and None_UVC_Manager in the __init__.py file
@user-d72566 None_UVC_Source does not have a recent_events() method
Aha I see
is this what is used for an IR pass filter? https://i.imgur.com/EzhA01w.jpg
@papr I've now tried to implement this method with the help of uvc and fake_backend, but I'm probably doing it wrong because nothing happens. 😄 However, I tried just copying over the activate method from fake_backend, and it actually started. I'm talking about these lines: https://gist.github.com/Baxtex/8c0fdec6d894bfbb4b319fc6226ae08d Not sure if this is something I can use? Sorry for these questions, I don't know Python very well, but I've learned a lot just trying to get this working. 😄
@user-d72566 Yes, this is something you should use, but you need to adapt the notification {'subject':'start_plugin',"name":"Non_UVC_Source",'args':settings} and settings such that the keys match the keyword arguments of Non_UVC_Source
Alright, I did what you said and have been able to deal with the errors one at a time until I got to recent_events. I have no idea what source I should use there. I'm guessing I need the frame? So I could do like in the fake_backend: frame = self.get_frame()?
Sounds correct. Whatever function generates your Frame object.
@user-41bb50 for the DIY headset we recommend Agfa Precisa 100
@user-d72566 e.g.: events['frame'] = self.get_frame()
Hmm, alright, seems like my img isn't set properly now, gonna investigate...
The software expects the img property to return a height x width x depth BGR uint8 array
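As a concrete illustration, an array of the expected shape and dtype can be produced with numpy (the dimensions and the solid-blue fill here are arbitrary; a real backend would fill it with camera data):

```python
import numpy as np

height, width = 480, 640

# A height x width x depth (depth = 3, channels in B, G, R order) uint8
# array, matching what the img property is expected to return.
img = np.zeros((height, width, 3), dtype=np.uint8)
img[:, :, 0] = 255  # channel 0 is blue in BGR order -> a solid blue image
```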
Isn't that what I do with the make_img function?
Or at least trying to do that 😄
Could you provide a current version of your code?
Sure, I just cleaned it up a little, had a lot of junk.
Ah, ok, the reason why line 30 fails is the following:
You defined a read-only property called img in line 40. In line 30 you try to set it, which fails. Change line 30 to self._img = img
Oh, I understand! Then that is fixed, thanks! Now I don't have any errors anymore, which is unfortunate, as it is still not working. However the background now turned black instead of grey, so something is happening at least. 😄
@user-d72566 Ah, a subtle but fatal issue: you overwrote gl_display(). The base class contains the code to draw the image. You have to call super().gl_display() in line 84 instead of pass
Hmm, I did what you said. It fixed some of the rendering issues, but it is still not functioning, just a white background I'm afraid.
Ah, yes, an other detail: https://gist.github.com/papr/22ac1d4658166dd8559125ee9b72a1e0/revisions#diff-681a090c22157108bc63b860044816beR116
_recent_frame is used to draw the frame
is this what is used for an IR pass filter? https://i.imgur.com/EzhA01w.jpg
@user-41bb50 please see @mpk response above 👆
sorry for being blind, and also for double posting
Don't films exposed to ambient light/sunlight get damaged over time?
@user-41bb50 You need to get the film developed unexposed. Afterwards it should work for many years.
so developed, or unexposed?
Sorry for being unclear. You need to get it developed without exposing it to sunlight beforehand.
thanks
I'm back! I just tried your modified version (many thanks btw!); however, the import on line 12 doesn't work, I don't have anything called manager_classes
@user-d72566 Ah, yes, I had to add this import because I used your backend as a runtime plugin instead of modifying the source. Just delete lines 12 and 277
Ah, I will try!
Hmm, the results seem to be equal to those of the fake_backend, minus some of the counters. So I get a green/blue image.
Yes, this is expected, isn't it?
Ah, not really, my goal is to try and use my regular webcam as a source, sorry if I was unclear 😛
Ok, but that is the next step. Your uploaded version is expected to do exactly that: show a static image with a gradient.
Ah
To be exact, get_frame now needs to call capture.read() and use its output instead of self._img. You won't need another thread for that. Important though: only call capture.read() once per recent_events/get_frame call. You need to lose the while loop that you used in your background thread.
Hmm ok, so it's a little like what I had from the original mod then.
Alright, I will try it!
Hmm, maybe this is working: I now have a black screen with artifacts at the top. I think it could be because of the different parameters inside get_frame. Will experiment a little 😃 If I cover the camera, the artifacts disappear. If I uncover it, the artifacts come back, so something is definitely working. 😄
But are you sure it shouldn't be on a separate thread? Because the UI and the whole application turn really slow when I choose the camera
That depends on the fps the camera is running at. If you need to start a thread, then it should be started and stopped from within Non_UVC_Source
It's only 30 fps, and I think I have set that. Yeah, I don't know really, everything is just really slow, as if the main thread were blocked
Yeah, that is probably the case.
You should add a small sleep(1/60) to your recent_events method if you move the capture.read() loop into a background thread. Else your main loop will run at 100% CPU usage
I could try. Btw, this is how it looks: https://imgur.com/a/xdvkO
I've now tried starting a separate thread to save the image that the recent function then fetches. GUI is much more responsive now, but background is just grey, no artifacts at all: https://gist.github.com/Baxtex/4b726038cfae389609ca206bfaaa6c20 so it is sadly not working as expected.
Which Webcam are you using?
Just my laptop's built-in camera, a really generic cheap thing. However, I'm going to try a better external one tomorrow (USB 2).
Are there any ways to improve accuracy after calibration? The three visualized gaze positions after calibration often appear on completely different positions. Also, even though calibration seems to be successful, the gaze positions are slightly different from where exactly a user is watching at. Any suggestions?
Turns out I had confused some function names 😃 Now, however, when the thread saves the image to self.g_pool.capture.img, I get an error: AttributeError: can't set attribute. I then tried using just self.img, but that didn't work either, though it didn't throw any error. Hmm. This is the current state of my code: https://gist.github.com/Baxtex/153fc8f42ca6ce1bd44d14a6d798e174
Hey guys, when trying to detect eye 1 from Capture I get the error "Eye 1: init failed. Capture is started in ghost mode. No images will be supplied." and the detect eye 1 window is grey, no image shown. Detect eye 0 works fine and the world camera also works fine. I'm using macOS 10.13.3 and Pupil Capture v1.5-12. Can anyone help?
@user-b116a6 Has eye1 been working before?
@papr Yes
@user-d72566 This is the problem of the read-only property in line 100 again. Have a look at this small tutorial that explains how properties work: https://www.programiz.com/python-programming/property
To solve your problem you either have to assign the read image to capture._img or remove the property definition.
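The read-only-property pitfall discussed here, reduced to a minimal self-contained example (the class and attribute names are illustrative, not the actual Pupil source):

```python
class Capture:
    """Minimal stand-in demonstrating the read-only property pitfall."""
    def __init__(self):
        self._img = None

    @property
    def img(self):
        # Only a getter is defined, so `cap.img = ...` raises
        # AttributeError ("can't set attribute").
        return self._img


cap = Capture()
try:
    cap.img = 'frame'      # fails: the property has no setter
except AttributeError:
    pass
cap._img = 'frame'         # assign the backing attribute instead
```

Defining a matching `@img.setter` would be the alternative fix to writing to the underscored backing attribute.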
@user-b116a6 Please write an email to info@pupil-labs.com including your order number, Discord username, a short description of the problem, and a list of things that you tried to get it working again.
Ah, I have some experience with getters and setters (I'm a C# and Java dev); I guess I'm just too used to getting informative compiler errors. :D I did what you said, put in the underscore, and now the camera finally works! Many thanks for helping me out with this!
Great to hear that it is working! Just a thing I stumbled upon: https://gist.github.com/Baxtex/153fc8f42ca6ce1bd44d14a6d798e174#file-none_uvc_backend-py-L112
@user-d72566 max(1, 0/60) will always return 1. Sleeping that long will make your UI unresponsive.
Ah, yes, I noticed, so I use the original delay instead: max(0,1./self.fps - spent)
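The pacing idea behind that expression, as a small self-contained sketch (the helper name is made up):

```python
import time


def frame_sleep_duration(fps, spent):
    # Time still to wait to hold the target frame rate, clamped at zero
    # so a slow frame never produces a negative sleep.
    return max(0.0, 1.0 / fps - spent)


# Usage inside a capture loop (sketch):
t_last = time.monotonic()
# ... grab and process one frame here ...
spent = time.monotonic() - t_last
time.sleep(frame_sleep_duration(30, spent))
```

When frame processing already took longer than the frame interval, the helper returns 0.0 and the loop continues immediately instead of stalling.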
@papr I don't seem to have the order number, I am a student and this is part of my Thesis Project, is it necessary to include the order number or can I include it at a later email correspondence? It is time-sensitive that I find a way to continue and hopefully a replacement is not needed.
@user-b116a6 I understand. Let's try to do some debugging then. Disconnecting/reconnecting the camera does not work?
What headset do you use? 120/200Hz?
@papr 120Hz, binocular, I tried disconnecting/reconnecting, reverting to a previous version of the Pupil software, switching to Windows 10, rebooting several times, changing USB ports, tried killing zombie processes but not found any
@user-b116a6 can you check if this connector is correctly connected?
Or if there are any visible defects with the cables?
@papr the cables seem to be connected properly for both cameras
@user-b116a6 Is Pupil Cam1 ID1 listed in the selector of the UVC Manager menu?
@papr No it is not listed, only Pupil Cam1 ID2 and Pupil Cam1 ID0 are listed
@user-b116a6 Please remove the connectors of both eye cameras carefully, switch the cameras and try again.
Which cameras are listed now?
@papr Thank you very much, this was the only thing that I didn't try. The connector of eye 1 was a bit loose. Everything works fine now.
Ok, happy to help. Good luck with your thesis!
@papr Sorry to bother you again but I haven't managed to find anything about this. When I try to Calibrate (I've tried all the different methods) when the calibration is finished and full screen mode closes, the whole application crashes: File "C:\work\pupil\pupil_src\shared_modules\calibration_routines\calibrate.py", line 351, in preprocess_3d_data ref_vector = g_pool.capture.intrinsics.undistortPoints(ref_vector) AttributeError: 'NoneType' object has no attribute 'undistortPoints'
I haven't touched these files at all, is there something I have to change? 😃
@user-d72566 Your Non_UVC_Source class needs an intrinsics attribute.
See the camera_models.py file for the different intrinsics classes. I guess you should start with a Dummy_Camera instance and run the Camera Intrinsics Estimation procedure afterwards in order to calculate the correct intrinsics for your camera. Are you using your webcam as world or eye camera?
Hmm ok, right now I'm using my homebuilt headset, both world and eye are webcams.
Ok, that is fine. The eye cameras always use a Dummy_Camera intrinsics instance
Ok! I looked at the fake_backend.py file and tried writing self._intrinsics = Dummy_Camera(size, self.name) in my make_img function; it just crashes though, saying it is not implemented.
Sorry to bother again, the calibration process does not start: instead of the usual markers on a white screen, only the white screen is shown. I tried deleting the ~/capture_settings folder and restarting the application but no luck.
@user-d72566 Could you provide the traceback please
sure, just a sec
@user-b116a6 Please use the Restart with defaults button in the general settings
@papr Yes I tried that also. Does it matter that I leave the detect eye windows open?
If you close the world window, the eye windows should close automatically as well
@papr it's the name property that is causing trouble, trying to fix it
@papr Yes i tried restarting the application, restarting with default settings, deleting the ~/capture_settings folder but no luck in all cases.
@user-b116a6 Which Mac and which version of macOS do you use? Could you disable the full screen option and try again?
Yes I read somewhere that it may be caused by the retina display. I have the Macbook Pro 15" Retina running High Sierra 10.13.3. Let me try that and get back to you.
Ok, so I managed to fix the name property. So now when I select the camera, the application crashes, no error message or anything. If I comment out the "self._intrinsics = Dummy_Camera(size, self.name)" it works again, just like before. (but it crashes at calibration of course)
@user-d72566 Did you check the log file? Maybe there is more information about the crash in it
I will
you mean the capture.log ?
What the... "AttributeError: 'None_UVC_Source' object has no attribute '_name'"
@papr Yes, disabling the fullscreen did it.
@user-b116a6 You are using v1.5 correct?
Yes, 1.5-12
@papr isn't this correct? https://gist.github.com/Baxtex/fb43065787c1ee4e11bec8f9f59f942f
Or am I just being stupid about the properties again? 😄
My guess is that you need to define self._name before calling make_img(), since make_img accesses self.name
What would be some good specs for a computer collecting the eyetracking data?
@user-3565f9 Intel Core i5, 8GB RAM, and lots of storage for the videos
You could do with less but these are the recommended specs 🙂
Yup, that was it! Again, thanks!
Has anyone recorded sound using Pupil Capture?
@user-88dd92 Yes 🙂 I did
How did you input the audio to Pupil? A simple microphone or something a bit fancier?
On Mac I used the built-in microphone; on Linux, a USB webcam mic.
Cool, thank you very much! I'm working on Linux and have had some trouble with playing the audio files back. I'll go get a proper mic (i.e. not one made from a pair of earphones). Thanks @papr
@user-88dd92 Do you mean playback in e.g VLC?
Or in Pupil Player?
Either one... basically, even when recording with the audio plugin and exporting using the video exporter (Pupil Player), the MP4 files have no associated audio... as such I assume no audio was recorded.
I will give it a go and if I'm still stumped I will come back and once again ask the experts!
Cheers
@papr What version of Pupil Capture are you using? Cheers
I am running the current master from source
1.5 seen in message on top. Please disregard!
thanks
Hello, what is the meaning of the green circle in the eye image?
Also, I have some issues with the camera settings: I cannot set the exposure for the eye camera to auto (for example) or change the exposure time.
I also have the same problem with the R200 camera, where I cannot change a lot of variables.
Hi @user-c14158 the green circle is a visualization of the eyeball from the 3d model
@user-c14158 what OS and version of Pupil are you using (running from src or from app bundle)?
Win 10, version 1.5.12 of Pupil (running from the app bundle)
Regarding the visualization of the eyeball, I assume it is not supposed to vary a lot during the recording (it does in my case). Does that mean that the model sensitivity value is wrong?
Hello again, I'm trying to convert from PUPIL EPOCH to UNIX EPOCH and I found here a possible solution: https://groups.google.com/forum/#!msg/pupil-discuss/rnarge_J7Go/pOtATxQSAgAJ However when executing the python files I get these errors
Firstly uvc could not be found so I changed to pyuvc. I don't know if that can work so if anyone succeeded in changing the timestamp with any other way I'd appreciate the help.
@user-b116a6 You will need to run pip3 install git+https://github.com/pupil-labs/pyuvc and change the source code back to from uvc import ...
@papr Now this error is shown when executing this command
Would you mind sharing the complete output, please?
@papr
@user-b116a6 You will need our version of libuvc as well https://docs.pupil-labs.com/#install-libuvc
Generally, try to use the dependencies in our docs
@papr Thank you, I followed those instructions, however I don't remember if it was before or after I formatted my Mac. Let me try that again and I will let you know. Thanks again.
@papr I followed those steps and I got an error here: pip3 install git+https://github.com/pupil-labs/pyndsi Everything else worked, however when I tried to run pupil_time_sync_master.py I got a "ModuleNotFoundError : No module named uvc"
Are there any ways to improve accuracy after calibration? The three visualized gaze positions after calibration often appear on completely different positions. Also, even though calibration seems to be successful, the gaze positions are slightly different from where exactly a user is watching at. Any suggestions?
Hey guys, I want to use Pupil eye tracking for the real time analysis of the data recorded. Starting I want to e.g. receive and manipulate the fixations while the eye tracking recording session is in progress. Is it possible and can anyone guide me in the proper direction? Thank you.
@user-d84237 A slight estimation error of ~1 degree (depending on mapping method) is expected.
@user-b116a6 Do you know about the Pupil Helpers repository?
I suggest to have a look at the example scripts there.
There are gaze_point_3d_x, gaze_point_3d_y, and gaze_point_3d_z in gaze_positions.csv; I would like to know which one represents the gaze distance.
The z value. The unit is millimeters.
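For example, pulling that depth component out of an exported gaze_positions.csv could look like this (the rows below are made-up sample data showing only the three 3d gaze columns):

```python
import csv
import io

# Stand-in for an exported gaze_positions.csv; only the 3d gaze columns
# are shown and the values are invented. gaze_point_3d_z is the depth
# component, in millimeters.
sample = """gaze_point_3d_x,gaze_point_3d_y,gaze_point_3d_z
12.0,-3.5,510.2
10.8,-2.9,498.7
"""

depths_mm = [float(row['gaze_point_3d_z'])
             for row in csv.DictReader(io.StringIO(sample))]
```

With a real export, `io.StringIO(sample)` would simply be replaced by an open file handle.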
https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/calibration_routines/gaze_mappers.py#L247 and the 500 is the default value? If I use the screen_marker_calibration method, does this mean the distance between my eye and the screen is 500mm?
Correct. This distance is needed to initialize the bundle adjustment optimization.
So if the distance between my eye and screen is 1m, just change the 500 to 1000?
That is correct 🙂
OK, thanks!
Hi folks. We are following up on an issue posted here back in January. We have a calibration phase, then we stopped the video, and then a recording phase. We are trying to integrate analysis across these phases--to basically apply the calibration to the recording phase. Step 1 was to merge the videos. We finally figured out how to do that. But to analyze the data in pupil player, we need to merge the 'other' files in the pupil capture recording folder, yes? Can someone provide some guidance on exactly what pupil player needs?
@user-78dc8f Do I remember correctly that you recorded on Pupil Mobile?
@papr yes
Ok, then you will need to concatenate the timestamp files only. These are numpy arrays. Use numpy.load(), numpy.concatenate(), and numpy.save() to do so.
Make sure that you do not lose any frames when concatenating the videos though!
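A minimal sketch of the load/concatenate/save merge described above, using made-up file names and timestamp values:

```python
import os
import tempfile

import numpy as np

# Merge the timestamp files of two consecutive recordings (e.g. a
# calibration phase and a test phase). File names are illustrative.
# Only timestamps of the same camera should ever be merged.
tmp = tempfile.mkdtemp()
part1 = os.path.join(tmp, 'world_timestamps_part1.npy')
part2 = os.path.join(tmp, 'world_timestamps_part2.npy')
np.save(part1, np.array([0.00, 0.04, 0.08]))
np.save(part2, np.array([0.12, 0.16]))

merged = np.concatenate([np.load(part1), np.load(part2)])
np.save(os.path.join(tmp, 'world_timestamps.npy'), merged)
```

The merged array must end up with exactly one entry per frame of the merged video, which is why dropped frames during the video merge would break playback.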
@papr ok...so these are the .time files?
@papr or the .npy files?
the npy files. If they do not exist, open the recording in Player once and Player will generate them for you
Only merge timestamps of the same cameras. Do not merge eye0 and eye1 timestamps for example.
@papr ok (and, yes, merging across cameras would be daft 😉)
@papr then we should put the merged .mjpeg files and merged .npy files into a folder and, in theory, pupil player should be happy?
Correct!
@papr (at least, it should open...)
@papr thanks. We'll explore (back in a month....ha ha)
@papr Following on the message I sent on Saturday, I managed to write a script to communicate with the IPC backbone and I subscribed to receive metrics with topic 'fixation' and I am writing them in a JSON file. Is it possible now to use the methods called from Player that generate the fixation_report.csv and fixations.csv so I can generate them myself without using the Player app? I am asking because I need to calculate the fixations real-time not offline through the Player app. Thank you in advance.
@user-b116a6 Do I understand it correctly that you want to write online fixations to a csv file?
This is the Pupil Player code that exports the fixations to csv: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/fixation_detector.py#L436-L471
@papr Yes, I have seen this file. Currently, I set to record 70 frames/sec and I write in a file only the metrics with topic fixation. Can I e.g. call a method from fixation_detector.py that will calculate based on the recorded metrics the fixations and create similar files to those created from Player or do I have to implement my own e.g. I-VT algorithm to calculate them in real-time?
@user-b116a6 Aah, I think I understand your issue now. You want to aggregate the fixations as Player does, correct?
Hi @wrp I don't know if you saw my reply last friday (I forgot to quote you) and I was wondering if you thought of something regarding my issue
@user-b116a6 Capture sends out fixation events as soon as they meet the minimum fixation duration criterion.
@user-b116a6 The online fixations have an id field. Fixation events that belong to the same fixation share the same id. You can accumulate fixation events until their id changes and then calculate metrics, e.g. total duration, based on the collected fixation events.
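That accumulate-until-the-id-changes idea could be sketched like this; apart from the id field, the event fields and summary metrics below are illustrative, not Capture's exact message format:

```python
def summarize(group):
    # Summary metrics for one finished fixation; field names here are
    # illustrative, loosely modeled on the Player export.
    start = group[0]['timestamp']
    end = group[-1]['timestamp'] + group[-1]['duration']
    return {'id': group[0]['id'],
            'start_timestamp': start,
            'duration': end - start,
            'n_events': len(group)}


def aggregate_fixations(events):
    # Collect consecutive events with the same id; when the id changes,
    # the previous fixation is complete and gets summarized.
    fixations, current = [], []
    for ev in events:
        if current and ev['id'] != current[-1]['id']:
            fixations.append(summarize(current))
            current = []
        current.append(ev)
    if current:
        fixations.append(summarize(current))
    return fixations
```

In a live setting the same logic would run incrementally on each incoming event rather than over a finished list.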
@papr Yes that is exactly what I need to do, thank you, I will try that and let you know.
Hey guys. has anyone experienced a very good pupil track but a very bad, post calibration Gaze point? I'm detecting pupils in offline mode and the detection works beautifully. However, post Natural Features calibration, the Gaze point appears all messed up. Gaze from Recording (I performed the same calibration live while recording) is much better than offline calibration.
@user-c828f5 I can have a look if you share the recording with me. Which Pupil version do you use?
@papr Good morning. Working on what we discussed yesterday regarding merging calibration and test phase videos. I got ffmpeg and python installed and working. Managed to merge the videos and the numpy timestamps file, but I'm getting some errors when I try to playback in pupil player. As a test, I copied 4 files to a folder: eye0_timestamps.npy, eye0.mjpeg, world_timestamps.npy, world.mp4. These are the original calibration files. Pupil player won't open these. I must be missing some other file?
Ah, you will need the info.csv file as well. But you can take it from any of the original recordings. No need to merge them.
@papr copied that file over. Now when I drop that into pupil player, player crashes...
@papr just re-verified that the original opens with no problems...
Can you share the player.log file with us?
@papr sure. how do i do that?
Just drag and drop the file into discord
@papr
2018-03-20 09:20:51,961 - MainProcess - [INFO] os_utils: Disabled idle sleep.
2018-03-20 09:20:53,229 - player - [ERROR] player_methods: No valid dir supplied
2018-03-20 09:20:57,445 - player - [INFO] launchables.player: Starting new session with '/Users/nfb15zpu/Documents/J-Files/Dyadic/TestCalibration'
2018-03-20 09:20:57,446 - player - [INFO] player_methods: Updating meta info
2018-03-20 09:20:57,447 - player - [INFO] player_methods: Checking for world-less recording
2018-03-20 09:20:58,440 - player - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend
2018-03-20 09:20:59,022 - player - [INFO] launchables.player: Application Version: 1.5.12
2018-03-20 09:20:59,022 - player - [INFO] launchables.player: System Info: User: nfb15zpu, Platform: Darwin, Machine: C02PX3J2FVH8.local, Release: 17.4.0, Version: Darwin Kernel Version 17.4.0: Sun Dec 17 09:19:54 PST 2017; root:xnu-4570.41.2~1/RELEASE_X86_64
2018-03-20 09:20:59,045 - player - [INFO] camera_models: No user calibration found for camera world at resolution (1280, 720)
2018-03-20 09:20:59,046 - player - [INFO] camera_models: No pre-recorded calibration available
2018-03-20 09:20:59,046 - player - [WARNING] camera_models: Loading dummy calibration
2018-03-20 09:20:59,173 - player - [ERROR] launchables.player: Process Player crashed with trace: Traceback (most recent call last):
  File "launchables/player.py", line 245, in player
  File "shared_modules/file_methods.py", line 55, in load_object
FileNotFoundError: [Errno 2] No such file or directory: '/Users/nfb15zpu/Documents/J-Files/Dyadic/TestCalibration/pupil_data'
2018-03-20 09:20:59,173 - player - [INFO] launchables.player: Process shutting down.
2018-03-20 09:21:01,176 - MainProcess - [INFO] os_utils: Re-enabled idle sleep.
Ah ok, you will need a pupil_data file as well. Again, take it from any of the original recordings. Player generates empty ones when converting Pupil Mobile recordings.
@papr Bingo! Now original plays and my merged files play. I'll hand this off to my grad student from here so she can test if we can apply the calibration to the test phase now...will report back soon.
@papr one question: you mentioned the possibility of dropping video frames. Should I check that, or can I just assume that if the video plays in pupil player that my timestamps and video frames match up?
Mmh, I think it would be best to ensure that, especially if you merge more than two videos. See this link on how to count the video frames: https://stackoverflow.com/questions/2017843/fetch-frame-count-with-ffmpeg
I suggest using the "Decode and count the number of frames" method. Timestamps can be counted by loading the merged numpy file and looking at the shape of the loaded array.
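Putting both counts together as a sketch (paths and values are illustrative; the ffmpeg invocation follows the "decode and count" approach from the linked answer):

```python
import os
import tempfile

import numpy as np

# Sanity check after merging: the decoded frame count of the merged video
# should equal the number of entries in the merged timestamps file.
# Video frames can be counted by decoding, e.g. (illustrative path):
#     ffmpeg -i world.mp4 -map 0:v:0 -f null -
# (the final frame= value ffmpeg prints is the decoded frame count).
# The timestamp count comes from the numpy array's shape:


def count_timestamps(npy_path):
    return np.load(npy_path).shape[0]


# Demo with a made-up four-entry timestamp file:
tmp = tempfile.mkdtemp()
ts_path = os.path.join(tmp, 'world_timestamps.npy')
np.save(ts_path, np.array([0.00, 0.04, 0.08, 0.12]))
```

If the two counts differ, frames were dropped or duplicated during the merge and the timestamps no longer line up with the video.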
@papr thanks
hello guys
Is there a data format reference somewhere for the Offline Surface Tracker, like this one for the other csv files? https://github.com/pupil-labs/pupil-docs/blob/master/user-docs/data-format.md
I mean these files
Hi, I am using the Pupil Labs with 2 eye-cameras (2-D) and I noticed that the pupil diameter in the export file differs between the eyes. Is this normal? Which value of the two should I use as the pupil diameter? Maybe the means of the two eyes? Thank you.
@user-b23813 The 2d pupil diameter is measured in pixels, so it is expected that the values differ between the eyes, since it depends on the eye-to-camera distance.
Thanks @papr.
@papr Following on yesterday's topic of replicating the fixations.csv file in real time: I am trying to test my code with the pupil_data file before moving to a real-time analysis test, but I seem to be off in the duration of the fixations. Also, searching for dict['method'] == 'gaze' doesn't seem to return anything; it only returns the pupil method, yet the method field in the csv file states gaze. This data exists in the pupil_data file, right?
@user-b116a6 The method is selected based on the pupil detection method. pupil uses pupil 3d model vectors to calculate fixations. If no 3d model data is available, gaze unprojects 2d gaze using the camera's intrinsics and uses the resulting 3d gaze vectors for fixations. In general, I recommend using the pupil method.
@papr Alright, I will use pupil then; however, I don't seem to understand how the start_timestamp and duration are calculated. Initially I took the timestamp of the first frame as the start_timestamp, and to calculate the duration I subtracted that from the timestamp of the last frame with the same id, but that seems to be wrong now that I am checking my output against the actual csv file.
The offline fixation detector does the aggregation a bit differently. It defines a maximum duration and uses binary search in order to find the actual end of a fixation quickly.
@papr I ran some tests with the code I have written, offline and online, and the results do not deviate very much; the fixations file I create is almost identical to the one created by Player. Thanks for the help. 😀
Another question I have: what metrics should I use to calculate saccades? Should I find an algorithm online and implement it, since saccade detection is still under development at Pupil Labs?
@user-b116a6 Yes, that would be great. It would be very appreciated if you could make a pull request when you are done and contribute your saccade detector to the Pupil project.
Also, thank you very much for verifying the functionality of the Player fixation detector. 🙂
@user-dc2842 recommended http://ieeexplore.ieee.org/document/6944931/ but I do not think that this algorithm is suitable for online application.
@papr Thanks, Should I email it to you or send it over using Discord?
As you prefer.
Hi Everyone, our lab has just received the Pupil Labs eye tracker. It is great to have a forum with previous questions! I will go over them, but have several questions to get me started... What calibration method would you recommend for eye tracking during fast movement within ranges of 1.5-3 meters (the gaze distance will change within this range during the trials) - the manual, single, or natural features calibration? Also, does anyone know of information regarding synchronization of eye tracking with motion capture?
Hi @user-8944cb I would suggest trying the manual marker calibration. I would also suggest recording eye videos and to start recording prior to calibrating so that you can re-calibrate and even re-run pupil detection algorithms with Pupil Player post-hoc (if you want to have the option it is best to record calibration and eye videos).
@user-8944cb regarding mocap + Pupil - please check out community contributed scripts in pupil-community repo: https://github.com/pupil-labs/pupil-community#scripts
@here We just pushed a Pupil software release: https://github.com/pupil-labs/pupil/releases/tag/v1.6
Hi there,
I was trying to calibrate pupil on an htc vive and I changed the video settings as instructed on the website
Now what should I do such that pupil recognises my world camera
There is no world camera on the htc vive add on.
well, can't I link it to the htc camera?
since I have it
You need to provide the reference markers yourself for the calibration. The idea is that you visualize markers in your virtual scene and calibrate gaze to that scene.
You will have to use the HMD Calibration procedure for that.
Ok
@wrp Thanks for your reply! When you say start recording before the calibration, do you mean do the calibration as part of the recording? What do I need to press then, the 'C' or the 'R', to record during calibration? Or do you mean first record, calibrate, and then record again?
@user-8944cb Example workflow:
1. Ensure the pupil is well detected and eye cameras are properly adjusted
2. In `Recorder`, click `record eye` - this will record eye videos when you start recording
3. Start recording by pressing the `R` button in the GUI or `r` on the keyboard when the world window is in focus
4. Start calibrating by pressing the `C` button in the GUI or `c` on your keyboard when the world window is in focus
5. Stop recording when done by pressing `r` or clicking `R` in the GUI
Hello all,
I am having problems getting audio in my recordings. I am using v1.6 of Capture and Player (not from source), however I cannot see an audio.mp4 being exported anywhere. Do I need to install portaudio and its pyaudio wrapper as if I were running from source?
Thank you!
@user-88dd92 Your recording needs an audio.mp4 in order to play it back correctly. There is no audio only export.
Hi papr, when recording videos I can see the audio plugin module say 'recording audio.mp4', but this file is not generated in the recordings folder or in the exports folder. Am I looking for it in the wrong place? Furthermore, when I open the .mp4 file (world.mp4 or world_viz.mp4) in an external player (VLC), the details show no audio in the file.
@user-88dd92 What microphone do you use and which operating system?
I am using headphone + mic that pupil capture recognizes on Ubuntu OS (also I have checked that sounds can be reproduced and there are no driver errors)
the headphones+mic is Microsoft LiveChat LX-3000
they seem to work fine on Ubuntu for everything else (videochat and audio recording)
Let me test a similar setup.
great! thank you very much!
@papr, did you face similar issues or did it all run smoothly for you? Once again, many many thanks for all your help.
Unfortunately, I am having trouble with listing audio input devices in the Audio Capture plugin if I run the bundle. We will try to resolve this issue as soon as possible.
OK, I will stay posted to see any developments. Thank you so very much for the terrific support and continuous development of this awesome tool! Cheers
is there some documentation regarding the 2d vs 3d mode and each mode's upsides?
Short version: 2d mode uses polynomial regression for mapping based on the pupil's normed 2d position. Can be very accurate but is prone to slippage and does not extrapolate well outside of the calibration area.
3d mode uses bundle adjustment to find the physical relation of the cameras to each other, and uses it to linearly map pupil vectors that are yielded by the 3d eye model. The 3d model is constructed from a sequence of 2d ellipses.
3d mode is on average a bit less accurate than 2d mapping but is less prone to slippage due to the refitting of the eye model.
So regarding the Calibration procedure with hmd eyes
For hmd eyes use 2d
should I import the Unity packages? (considering I do not intend to develop anything further in Unity and just want to calibrate my pupil)
@user-f1eba3 So did you switch to Unity? I thought you were developing for a different platform?
so i developed my zmq plugin for unreal
and I am able to receive data from pupil capture
its just that i want now to calibrate my device
because at the moment i don't see any gaze data (since I never calibrated Pupil)
and from what i understand the best way to do it is using Unity
Ehm, there should be some dummy gaze. Make sure to subscribe to `gaze` and that at least one eye process is running.
ok but either way i'm going to need to calibrate the pupil on the way
These are the relevant notifications for hmd calibration: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/calibration_routines/hmd_calibration.py#L51-L60
so i decided to do it now
But if you are able to import Unity packages, why bother to write your own Unreal code? I thought both platforms were incompatible with each other (I am no expert in this though!)
well
The idea is that we are going to use Pupil on a bigger research platform, openEASE, and most stuff there is developed in Unreal
I would recommend to reuse code as much as possible. So if you are able to import the hmd eyes Unity packages, then do so. No need to rewrite everything then.
i don't want to
My question is as follows : To perform just the calibration for my pupil do I need the packages ?
No. Calibration is done in Pupil Capture. You will need to provide reference positions via notifications (see my link above). Capture will collect pupil data on its own and calibrate when the procedure is done.
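A sketch of what such a reference-position notification could look like as a plain dict, before msgpack encoding and sending over the IPC backbone. The exact field names should be verified against the `hmd_calibration.py` source linked above; the ones used here are illustrative.

```python
# Illustrative sketch: build a 2d HMD calibration notification.
# Field names should be checked against Pupil's hmd_calibration.py;
# they are assumptions here.

def make_ref_datum(norm_x, norm_y, eye_id, timestamp):
    """One reference position shown to the wearer in the HMD."""
    return {
        "norm_pos": (norm_x, norm_y),  # position in the HMD image, 0..1
        "id": eye_id,                  # 0 or 1: which eye this refers to
        "timestamp": timestamp,        # Pupil time when it was shown
    }

def add_ref_data_notification(ref_data):
    """Wrap collected reference data in a notification dict; the dict
    would be msgpack-encoded and published to Capture."""
    return {
        "subject": "calibration.add_ref_data",
        "ref_data": ref_data,
    }

note = add_ref_data_notification(
    [make_ref_datum(0.5, 0.5, eye_id=0, timestamp=1000.0)]
)
```

Capture collects its own pupil data in parallel and performs the actual calibration once the stop notification arrives.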
Thx for the info
Hey papr
Hi Everyone, I am exploring possibilities of using Pupil Mobile (we have recently purchased the headset). Can I use a phone that is not directly provided by the company (assuming it is one of those that is supported), purchase the cable, and download the app, or does it have to be ordered directly from the company? Thanks!
@user-8944cb - Our current tests show that the Moto Z2 Play, OnePlus 3/3T/5/5T, Nexus 5x, and Nexus 6p are the most reliable and robust devices for Pupil Mobile. While other Android devices with USBC ports can theoretically support Pupil headsets, we have found that not all USBC controllers are created equal and not all vendors provide full support of the USBC spec on a driver/firmware level.
I think I might have posted in the wrong channel earlier:
Hey guys, working on the diy pupil hardware.
Are there alternatives to the exposed black film (can't find any locally)? I purchased the $25 recommended alternative in the Google "list of materials" - unfortunately it was a glass NIR filter that was too large.
Tried using polarizing filters (salvaged from RealD 3D glasses) and I'm having trouble getting the lens to focus and display a clear image.
I'm assuming if I try the exposed black film, it should fix problem with focusing a clear eye image; but I'm having trouble finding the material
The ir leds work for certain: double checked with my cell phone camera and you can see the violet light glowing
Also tried manually adjusting the physical distance of the eye camera mount, the software focus, and the lens finger adjustment
Oh, also tried floppy disk magnetic film as a filter as well... No luck
I tried using the undeveloped film inside the canisters for point-and-shoots from Wal-Mart, but that had a silver foil that I would need to develop with chemicals and expose to the sun or something.
Just responded in the vr-ar eyes channel. This is the correct channel for this post. Thanks for reposting here
as noted earlier in the thread (but somewhat hard to find) - for the DIY headset we recommend Agfa Precisa 100
Thank you ! - just to double check (is this it?): https://www.freestylephoto.biz/1175268-AgfaPhoto-CT-Precisa-E6-Slide-100-ISO-35mm-x-36-exp.?gclid=EAIaIQobChMIycjsrvKB2gIVA57ACh2uJwy0EAQYASABEgLxz_D_BwE
Also, is there any other alternatives. I'm building the head set for research and am running short on time (trying to find materials in driving distance).
Does anyone know if previous versions that support 200Hz srate of Pupil Capture can record sounds/noises? Cheers
Capture 1.5 should work for audio recordings on Linux
thanks! I will give it a try!
Ok, reading through old posts: basically buy the "Agfa CT Precisa 100", develop it unexposed to sunlight, and it should work as an IR pass filter
Correct
I'm really grateful that this discord chat exists - it's really helpful and if I found it earlier, would have saved me a lot of time and frustration, haha
This was on the Google docs material sheets for DIY pupil - https://docs.google.com/spreadsheets/d/1NRv2WixyXNINiq1WQQVs5upn20jakKyEl1R8NObrTgU/pub?single=true&gid=0&output=html
under depreciated tabs: https://imgur.com/a/fnp11 - ir-bandpass filter for eye cam - NIR Optical Filter, 850DF20, 11.5mm painted edge
For future reference, if anyone is building according to the list and cannot find "exposed (black) film negatives" (which means film developed without exposure to sunlight): the listed glass filter does not fit the DIY pupil as a NIR pass filter
Left side is the DIY lens for eye camera, right side is the nir pass filter that is too large and is on the Google docs shopping list link
Also, depending on your area: most photo labs have been taken over by Wal-Mart, CVS, and Walgreens (USA). Trying to consult them for film negatives is not possible because those centers outsource the development of their film and no longer do it in house.
Hey guys, I don't know if this is quite relevant, however, I am developing a web application locally which will be used as a GUI for the commands I was using so far. The user will login, press a button and on the click of the button I want to execute firstly the pupil_time_sync_master.py which is provided to synchronize the Pupil Clock with the Unix_Epoch. The rest of the process is not relevant. I tried everything so far but the script does not execute. Can anyone help? I am using MacOS High Sierra 10.13.3, XAMPP 7.2.3-0 to host the web server, html, and python.
Try local photo labs if available in the area, different friends/family old photo box (becoming obsolete due to digital Photography), or the "Agfa Precisa 100" as recommended in previous chat.
And I see why that's the wrong size... That subheading says "deprecated" as in not recommended rather than "depreciated" as in lower price.
Can we change that material list sub heading to "not recommended" or simply remove them - the links are either sold out or dead and add a lot more confusion
Rather than placing the "Agfa Precisa 100" with a description to develop the film unexposed to sunlight to use as a NIR pass filter for the DIY pupil eye camera
And a link to the discord chat at the top of that Google document so users who are DIY can read through the chat and find up to date information on materials and previously asked questions
@wrp Thanks for your answer! I recorded and calibrated. Now I am trying to record again. However, when I move the recorded file to Pupil Player it writes "No user calibration found for camera eye1 at resolution (400,400)... Loading Dummy calibration... No pre-recorded calibration available", and the same for the second eye. What am I doing wrong? Thanks!
Hi guys, I am trying to export blinks with confidence of 0.2 and less. I changed both the onset and offset thresholds to 0.2 but the export file includes blinks with confidences higher than 0.2 too. I still get blinks with confidences higher than 0.2 even if I change the offset threshold to 0.25 and leave the onset to 0.2. Should I have done something differently? I also wanted to ask what the filter length (in seconds) means at the pupil player? Thanks!
Hi All,
Our lab just received the Pupil Labs hardware (2x200Hz + world camera). Following the instructions, I am getting pupil detection, but when it comes to the calibration process, I am getting this error message: "Not enough ref point or pupil data". What could be the issue? Thanks for the help
Hi, has anyone had some experience with msgpack?
I want to use the msgpack C version to receive the data from Pupil Capture
@user-8944cb did you calibrate in Pupil Capture? If not, then you will need to conduct calibration offline. Please see the offline calibration section here: https://docs.pupil-labs.com/#data-source-plugins
@user-42b39f what calibration procedure are you using? If you are seeing this message it means that either reference points (e.g. markers shown on screen or manual markers) were not well detected or that the pupil was not well detected. Based on your note I would assume that the references were not detected or specified. Please follow up with more information about your calibration process.
@wrp I tried 3 methods (Capture 1.16.14 on MacBook Pro OS 10.13.3): screen calibration (on 2 screens, the MacBook laptop screen and a Thunderbolt display, in full screen mode), manual marker, and finger calibration. With the screen calibration process, I did not notice how I could add more markers. I increased the sample duration as well but it did not change the error message. For the manual marker calibration, I guess the marker is recognized because it is filled with blue at each new position and the stop one is filled with red. Note that I went through the eye processes to be sure that the pupil was well detected. With the 200 Hz camera, it looks like the image is out of focus but I think that the positioning was rather correct (pupils were recognized nearly continuously). I changed the resolution from the default (192,192) to (400,400) but it did not change anything. The exposure is not exactly the same for the 2 cameras but I do not have access to automatic exposure mode (perhaps that is normal?).
Hi guys, I want to build Pupil on Windows and I'm on last stage trying to run run_capture.bat. I'm getting ModuleNotFoundError: No module named 'msgpack'
however msgpack is installed
```
C:\work\pupil\pupil_src>pip install msgpack
Requirement already satisfied: msgpack in c:\python36\lib\site-packages
```
any ideas why it's not seeing module?
ok, I found errors when installing msgpack
```
pip install msgpack
Collecting msgpack
  Using cached msgpack-0.5.6.tar.gz
Installing collected packages: msgpack
  Running setup.py install for msgpack ... error
Exception:
Traceback (most recent call last):
  File "c:\python36\lib\site-packages\pip\compat\__init__.py", line 73, in console_to_str
    return s.decode(sys.stdout.encoding)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xbe in position 155: invalid start byte

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\python36\lib\site-packages\pip\basecommand.py", line 215, in main
    status = self.run(options, args)
  File "c:\python36\lib\site-packages\pip\commands\install.py", line 342, in run
    prefix=options.prefix_path,
  File "c:\python36\lib\site-packages\pip\req\req_set.py", line 784, in install
    **kwargs
  File "c:\python36\lib\site-packages\pip\req\req_install.py", line 878, in install
    spinner=spinner,
  File "c:\python36\lib\site-packages\pip\utils\__init__.py", line 676, in call_subprocess
    line = console_to_str(proc.stdout.readline())
  File "c:\python36\lib\site-packages\pip\compat\__init__.py", line 75, in console_to_str
    return s.decode('utf_8')
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xbe in position 155: invalid start byte
```
do you know how to solve it?
Hello, does anybody have good documentation about the data received from Pupil Capture through ZeroMQ?
I have subscribed to gaze and I get the topics gaze.3d.0, then gaze.3d.1 and gaze.3d.01?
@user-c9c7ba The different topic endings indicate which pupil data the gaze datum is based on.
- `3d` indicates the gaze mapping type.
- A trailing `.0.` indicates that the gaze datum was monocularly mapped based on eye0 data
- A trailing `.1.` indicates that the gaze datum was monocularly mapped based on eye1 data
- A trailing `.01.` indicates that the gaze datum was binocularly mapped based on eye0 and eye1 data
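These suffix rules can be captured in a small helper for subscribers (illustrative code, not part of Pupil itself):

```python
def gaze_topic_info(topic):
    """Split a gaze topic like 'gaze.3d.01.' into the mapping type,
    the contributing eye ids, and whether it was binocularly mapped."""
    parts = topic.strip(".").split(".")
    mapping, eyes = parts[1], parts[2]
    return mapping, eyes, eyes == "01"
```

For example, `gaze_topic_info("gaze.3d.0.")` reports a monocular 3d mapping from eye0, while the `01` suffix marks a binocular datum.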
@user-e38712 Try running `pip3 install msgpack`
The `3` is important (as far as I know)
with pip3 I'm getting same Exception
this is my pip version if it can help solve problem: pip 9.0.3 from c:\python36\lib\site-packages (python 3.6)
@user-e38712 This looks correct. Mmh
@papr thanks for the reply... then when I need to construct a graph where x is time and y is gaze position, should I use the .01 topic with pupil_point?
and when I need to construct another graph where x is time and y is eye position, can I use .0. for it?
@user-e38712 Could you try running pip3 install -U msgpack_python
@papr I got the same error. I'm not the only one facing that problem, there are few issues on github but msgpack dev keeps saying "go to stackoverflow" or "it's pip problem not msgpack"
@user-c9c7ba Not exactly. Let me clarify the terms we use:
`pupil` datum: the result of the pupil detector. The `norm_pos` field is the pupil position relative to the eye image.
`gaze` datum: the result of mapping one or two `pupil` datums into the world camera coordinate system. The `norm_pos` field is the gaze position relative to the world camera image.
If you want to visualize eye positions, you will have to visualize `pupil` data. If you want to visualize gaze, you will need to use the `gaze` data.
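For the graphs asked about above, a minimal way to turn a list of already-decoded datums into plottable series, assuming each datum carries the `timestamp` and `norm_pos` fields just described (the toy datums stand in for decoded msgpack payloads):

```python
def to_series(datums):
    """Split pupil or gaze datums into (timestamps, xs, ys) for plotting."""
    ts = [d["timestamp"] for d in datums]
    xs = [d["norm_pos"][0] for d in datums]
    ys = [d["norm_pos"][1] for d in datums]
    return ts, xs, ys

# toy datums standing in for decoded msgpack payloads
sample = [
    {"timestamp": 10.0, "norm_pos": (0.4, 0.5)},
    {"timestamp": 10.1, "norm_pos": (0.6, 0.5)},
]
ts, xs, ys = to_series(sample)
```

The same function works for both `pupil` and `gaze` data; only the coordinate system of `norm_pos` differs (eye image vs world image).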
@user-e38712 Mmh, that is annoying =/ Unfortunately I am not able to reproduce this issue.
@papr yes it is :/ No problem, thanks anyway for your time
I hope it will not take me too long to find solution
@user-e38712 What is the full folder path from which you try to install?
@papr actually from C:\Users\zezima\Downloads, does it even matter?
all other libs I've installed from same path and everything went smooth
@user-e38712 It does matter if it had included non-unicode characters: https://stackoverflow.com/questions/25036897/pip-install-unicodedecodeerror
@user-e38712 what versions of python are installed on your Windows OS and what version of Windows OS?
@papr you're right, but there are no non-unicode characters @wrp Python 3.6.1, Windows 10 Pro 64bit
@wrp Python is 64bit as well ofc
@user-e38712 in this case you should not need to specify pip3
since you only have python3 installed on your system
an alternative is to use a wheel from Gohlke https://www.lfd.uci.edu/~gohlke/pythonlibs/#msgpack
and then pip install <name of .whl file>
@wrp yes, I have no other 2.x python versions installed. pip or pip3 are doing the same. I have also newest version of pip. Cool I'll try this out in a moment
@wrp looks good
```
C:\Users\zezima\Downloads>pip install msgpack-0.5.6-cp36-cp36m-win_amd64.whl
Processing c:\users\zezima\downloads\msgpack-0.5.6-cp36-cp36m-win_amd64.whl
Installing collected packages: msgpack
Successfully installed msgpack-0.5.6
```
If you want to install msgpack via `pip install msgpack`, then you need Visual Studio or MSVC++ installed
so, perhaps it did not build due to missing compilers/dependencies
@wrp much thanks man, looks like with msgpack lib everything is fine now
the wheel file on the other hand ships with libs compiled for you, so you don't need to have MSVC++ installed
so, if you want to use `pip install msgpack`, please follow the instructions in the README of the github repo for msgpack-python
@wrp I'm doing everything step by step like here: https://github.com/pupil-labs/pupil-docs/blob/master/developer-docs/windows.md
did you install MSVC?
were you using the `x64 Native Tools Command Prompt`?
both questions: Yes
well, then in your case I don't know what else to say. Setting up a dev system for Pupil on Windows is quite the process. If you did not start from a clean slate (freshly installed OS), then there could be some environment conflicts - but hard to say without looking. Pleased that there was a workaround for you though
@wrp yes, there are many steps to do and my OS is not freshly installed. For now I'm taking a break, thanks for the help and if I have more problems I'll let u know 😃 BTW today I'll test Pupil 1.6. Some cool features!
@user-e38712 are you running from source or did you test the v1.6 bundle?
@user-42b39f please could you try calibration again on the screen using screen marker calibration method. Please ensure that you are using full screen and that there are no other markers in the scene (e.g. a printed marker visible on the table)
@wrp I'll test it from bundle, got another problem while running run_capture.bat. I'll go back to it tomorrow
@user-e38712 is there an absolute necessity to run from src for your work/research?
I ask because another option could be to develop a plugin or subscribe to IPC over the network
this way you could just run the bundle and focus on developing the plugin/application
@wrp not sure, I want to develop a plugin. But it's not necessary to run from source?
@user-e38712 you do not need to run from source if all you want to do is develop a plugin
You can add your plugin to the `plugins` directory within the `pupil_capture_settings` dir or the `pupil_player_settings` dir
these dirs get created in your home dir after you run the bundle the first time
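A minimal sketch of what such a user plugin file could look like. The `Plugin` base class and the `fixations` event key are provided by Pupil at runtime; the stub in the `except` branch is only there so this sketch runs on its own, and the class name is made up.

```python
# Minimal sketch of a user plugin for pupil_capture_settings/plugins/.
try:
    from plugin import Plugin  # available when loaded by Pupil Capture/Player
except ImportError:
    class Plugin:  # stand-in so the sketch is self-contained outside Pupil
        def __init__(self, g_pool):
            self.g_pool = g_pool

class Fixation_Counter(Plugin):
    """Counts fixation events that Pupil passes to recent_events."""

    def __init__(self, g_pool):
        super().__init__(g_pool)
        self.fixation_count = 0

    def recent_events(self, events):
        # detected fixations arrive under the 'fixations' key of the
        # events dict (assuming a fixation detector plugin is active)
        self.fixation_count += len(events.get("fixations", []))
```

Drop a file like this into the plugins directory mentioned above and it becomes selectable in the bundle, no source build required.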
@wrp good to know 😉 I should have checked it on my own in the docs, my friend told me it's necessary
@user-e38712 you can do a lot with plugins 😄 but it really depends on what you are trying to achieve
@wrp actually I want to create a plugin that will count the number of fixations, their durations, and the average time on surfaces, and be able to generate several different heatmaps based on these data. I don't know if I'm able to write a GUI in a plugin. I also want to develop features like generating charts, e.g. time/fixations on a specified surface
This should all be possible via plugin - @papr do you see any blocking cases based on @user-e38712's description above?
@wrp Thanks guys, you are very helpful. I have to go now, I'll come back later and read @papr papr response. Have a good day!
likewise 👋
In case you want to do this in an offline fashion: I would recommend working with the exported csv data instead of writing a plugin. The exported data should contain everything you need.
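For the offline route, a sketch of working with the exported data. The column names below are made up for illustration (check the header of your own `fixations.csv` export, which differs between Pupil versions); the toy string stands in for the real file.

```python
import csv
import io

# Toy stand-in for an exported fixations csv; real exports have more
# columns and possibly different names -- inspect your own header.
EXPORT = """id,start_timestamp,duration,on_surf
1,100.0,250.0,True
2,100.4,180.0,True
3,100.8,300.0,False
"""

def fixation_stats(fh):
    """Count fixations on a surface and their average duration (ms)."""
    rows = list(csv.DictReader(fh))
    on_surf = [r for r in rows if r["on_surf"] == "True"]
    durations = [float(r["duration"]) for r in on_surf]
    return len(on_surf), sum(durations) / len(on_surf)

count, avg_duration = fixation_stats(io.StringIO(EXPORT))
```

In a real analysis you would pass `open("exports/000/fixations.csv")` instead of the toy buffer, and group by surface name for per-surface heatmap statistics.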
@wrp I performed a screen calibration again with the default settings and with no other markers nearby and got the same message. As I don't know how to add more ref points in this calibration process, I guess that it has something to do with the pupil detection. However, when I go into the eye0 and eye1 processes and set the parameters with the algorithm view on, it looks as if the pupils are well recognized: I have a red ellipse contouring and a central dot. Is there any way of checking the quality of the pupil detection during the calibration procedure?
Hi, I have a new issue with the latest version of Pupil Capture (1.6.11, 200hz eye camera, and I'm on Windows 10): the eye 1 video regularly freezes for a short moment and the gaze mapping on the world image freezes with it. The problem occurs when the resolution is set to 192x192 for any framerate. If I lower the absolute exposure time the video seems to work fine, but then the value is automatically reset to 32 after a few seconds and the video starts freezing again. I do not have the same issue with the eye 0 video. And the same lines appear continuously in the prompt.
@user-c14158 can you share the logfile?
sure, however all those lines are not in the logfile
@user-c14158 can you contact us via info[at]pupil-labs.com? I think this might be a HW issue.
Hi @wrp, thanks for replying. Yes, I did calibrate in Pupil Capture, and I tried the screen calibration method (not sure how to add additional calibration points), the manual marker method (tried adding more than 9 points for calibration), and the single marker calibration. I tried on two computers, and in the 1.16.14 and the previous capture software. In all of them the accuracy and precision values in the 'accuracy visualizer' after calibration look good, but I got this error for every single video I recorded after the calibration and tried to open in Pupil Player. I can't find any information about this in the Pupil Docs. Thank you!
@mpk I tried again with Pupil Capture 1.5.12 and I can't replicate this issue, that's why I thought it was SW-related
@user-c14158 We think that the v1.6 issue is due to the stripe issue detector that we implemented. It resets the camera's frame rate automatically when stripes are detected. Your camera seems to trigger a lot of false positive detections.
Please share an example recording with us that was recorded using v1.5 and includes the eye videos. You need to enable the `Record eyes` option in the recorder menu.
We can use it to improve the false positive rate of our detector.
@papr I shared the recording with info@pupil-labs.com
Thank you
Hi Pupil guys, is there anywhere a specification of the message that is sent on every topic?
I'm trying to receive data from Pupil Service in a static way with the core zmq C API, which means I need to know exactly how the message is encoded when being sent 😄
A zmq message is composed of one or more zmq frames. Notifications are composed of at least two zmq frames. The first simply contains the topic, and the second frame includes msgpack encoded data.
@user-f1eba3 I recommend this: https://docs.pupil-labs.com/#the-ipc-backbone
and use this: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py
to print the messages you are interested in and learn about the specific payload.
@user-f1eba3 just to clarify, I'm not saying to use Python instead of C; just use the script to introspect and learn about the format on the wire.
Hi just testing out pupil mobile, and can't seem to calibrate or lock on to pupil? Is there a way to do this in standalone mode (i.e., just running from the app, no connection to pupil service/recorder on a pc)?
Hi guys, I want to use Pupil on Windows in real time but don't know how to start. I want to create a program for pattern recognition based on YOLO darknet. My question is: is it possible to stream the data (Pupil csv files with the metadata & mp4 video) into my own module without compiling Pupil? And if it's possible, how should I start?
@user-921ec1 Pupil Mobile app will not perform pupil detection. It is designed to be used for local recording of sensor data (eye videos, world videos, audio, imu data) and/or for broadcasting video data over a wifi network to another computer running Pupil Capture.
If you record data locally on your android device with Pupil Mobile, you will use Pupil Player to perform offline (post-hoc) pupil detection and calibration for gaze estimation. This means that you should include the calibration procedure(s) in your recording.
Alternatively you could stream over wifi to another computer running Pupil Capture and perform pupil detection and calibration on the subscriber/client desktop/laptop
@krzysztof.kozubek#6593 Maybe you want to take a look at https://pupil-labs.com/blog/2017-12/real-time-object-recognition-using-fixations-to-control-prosthetic-hand/ - this community member has a very well documented project that demonstrates how you can create a plugin and extend Pupil to do object classification
@wrp yes, I saw it. If I understand correctly, to create object classification I don't need to change Pupil's code or run from source? [email removed] @zezima)
With regards to my calibration issues, is it normal that the cameras are fitted in a way that on the left eye the upper lid appears upward and on the right eye it is downward?
@user-42b39f yes, it's just a display issue; you can set "rotate image" in the eye window general settings to remedy this.
@mpk thanks for your reply. Unfortunately it does not give me any indication how to solve the calibration issue. I still can't get any calibration functioning. How can I check the quality of the pupil detection during the calibration process? I don't really know how to qualify good pupil detection or how to optimize it.
I'm very close to getting my C++ communication with Pupil working, but for some reason I got stuck
I even made a Google doc to better explain where I got stuck
If anyone is kind enough to take a look, please do: https://docs.google.com/document/d/1AYT1HqmhT2tnS5kr6gi5J34aIaZnjc7GSSqrYYtiL8w/edit
@user-42b39f I think the best way to debug is to set up a quick video meeting. Can you reach out to us via email (info[at]pupil-labs.com ) ?
@mpk yes I will, thanks
Hey, I've been running into a persistent issue where I cannot get Pupil Groups to successfully run on my MacBook Pro (2017). I usually can get both cameras and instances working on the first go, then the second instance "loses" the eye tracker and will no longer recognize it. I am wondering if this is a processing power issue but I am unsure how to isolate it.
If anyone has tips or experience with Pupil Groups, I'd greatly appreciate any advice.
So I managed to receive data in Unreal with c++. Now the question is how can I get the device calibrated as easy as possible
I might attempt to reproduce something that is in hmd eyes but is there any other way ?
hey! Today I made a pretty long recording with my MacBook Pro 2017 and Pupil Capture (around 5.12 GB it seems) but I'm not able to open it in Pupil Player 😦 I suppose the file is too big to be loaded by Player... is there any possibility to solve this issue? xx
@mpuccioni#0374 I can open similar size files on my MacBook Pro 2017. They take a while to load but it works ok. How much ram do you have on your machine?
@papr hey papr, the ct precisa 100 came in today - I dunno if you're in america, but Wal-Mart, cvs, Walgreens and a bunch of other places develop but do not return the film negatives
Do you have any advice. I'm thinking an online company, possibly
@user-e7102b i have 8 GB ram !
I'll try again thnks
Ah ok, I have 16GB on my machine. Maybe just give it a few minutes to load...
oh yes !
I'll let u know 😃
Does anyone know where the schema of the pupil messages on the different topics is stored in code?
Does anyone know how I can export only the blinks with a confidence lower than 0.2? I can't understand what to put as onset and offset confidence thresholds to get this result...
So I'm running into problems with the "CT Precisa 100": nobody in the area develops this film (I even checked Houston, since I'm in Texas). The only route is to hope someone does it online
If anyone at pupil labs has extra film negatives that they've used and is willing to sell it to me; I'll buy it and pay for postage considering that to develop my film takes about half a month
I would simply order a color slide film developing kit from Amazon, but that's going to run over $120
Hey Everyone, just looking for advice on which method of calibration I should use. We are using Pupil Mobile and saving directly to the phone (no wifi; we will put the data into Pupil Player after collection), and walking around outside. There is no set depth of view that our subjects will be looking at, as the task is simply to explore the environment. With that in mind and after reading the various calibration methods, I believe that the natural features method is probably the best way to go, but I'm looking for any and all suggestions. Thanks!
@papr apologies for bringing up old topics, and apologies to anyone that has tried contacting me regarding the matlab script. My question is a follow-up on the conversation you had with @user-cf2773 in early March. I calibrated using 2D; how would I go back and determine the visual angle X/Y of where the user is looking... it's publishing time 🙏
@user-e7102b strange, are you porting the information over the same computer? I remember having issues with this a while back; the issue was an internal matlab buffer that I was able to work around based on the placement of fclose or fopen. Another workaround I used was downsampling the data on the Python side using a circular buffer that averaged whatever pupil info it received and sent it at the "new" sampling rate
Hello everyone! Does anybody know what image sensor model is used in the Pupil eye camera?
Do you mean the mathematical formula?
I mean the image sensor chip model and its manufacturer... Omnivision, Sony? And what kind of shutter is it using - global or rolling?
rolling
the only global shutter cameras I can find are TTL (analog)
at least the tiny ones
should I be using a stack of 3 or 4 of these? Then the visible wavelength still passes but barely
Hey folks, I've found some concerning timestamps coming back from the fixation detector.
The fixation durations suggest that some fixations begin before the previous fixation has ended.