Hi guys! I wonder, is there a feature to "save" the calibration between Pupil Capture/Service starts? I use an HMD, so I believe it should be enough to calibrate once and then reuse the same "calibration model" every time I use my VR helmet.
@user-f7028f , in our experience with the HMD add-on, you have to recalibrate every time you take the HMD off even if just for a second. Even if the HMD is sitting on your cheek just a few mm higher, the calibration will be very wrong.
@user-5d12b0 this is true for now. We will add 3d calibration that uses the 3d eye model. It will be robust against this kind of movement.
sounds great!
I really hope we can release this next week!
It's been quite a while! Also, this will get better with the release of the improved eye model. I hope for that to happen in about 4 weeks.
Can you point me to any references detailing how the eye model and 3d calibration work?
I don't plan on modifying it, I'm just curious as to how it works.
None of that is well documented. The new eye model is not documented at all. But we will publish a detailed report once we have it finalized!
ok, I look forward to it.
We will certainly make an announcement!
Just reposting in case this got lost! I am just wondering how to improve the accuracy of our calibrations. For offline calibration with the manual marker, it looks like the gaze maps well onto the marker location, but then sometimes when we reposition the marker the gaze is way off. How can we improve this online and is there a way to manually correct it offline?
@user-45d36e Could you post a screenshot that visualizes what you mean?
Our data is confidential so I don't think that I can! But if we visualize the fixation during offline calibration, I can see that the pink circle sits far away from the marker, and the person was looking at the marker. It seems the most accurate in the center but when we move the marker to the right or left, even though they are allowed to move their head to look at it, it seems to get very inaccurate. If you need to see it I'll see if I can shoot a demo video.
A demo video would be very helpful!
Ok. I'll aim to get one to you today or tomorrow.
My coworker Chris is going to get you a demo video tomorrow!
Hello! I have just started using offline calibration. If I go through the offline calibration process and it doesn't "work" but the eye videos look good, what other things could I check or adjust to salvage that recording?
@user-2798d6 There is a difference between offline pupil detection and offline calibration.
I know - the detection process seems to be fine, but when I go through the calibration with natural features, I mark everything, yet the visualizer shows the person looking in other places, so it's not accurate
Is there any documentation available about how the device is calculating depth in the 3D gaze vector? I assume it's using vergence, but I can't find anything. Thanks
@user-ecbbea I do not think that we have documentation on that other than the source code. https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/calibration_routines/gaze_mappers.py#L439-L447
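The gist of a vergence-based estimate (just the geometric idea, not necessarily the exact code in the link) is to intersect the two 3d gaze rays; since they rarely cross exactly, you take the midpoint of the shortest segment between them:

```python
import numpy as np

def vergence_depth_point(o0, d0, o1, d1):
    """Midpoint of the shortest segment between two gaze rays.

    o0, o1: eye ball center positions; d0, d1: unit gaze directions.
    A sketch of the vergence idea, not the exact gaze_mappers.py code.
    """
    w = o0 - o1
    a, b, c = d0 @ d0, d0 @ d1, d1 @ d1
    d, e = d0 @ w, d1 @ w
    denom = a * c - b * b  # approaches 0 for parallel rays (no vergence)
    s = (b * e - c * d) / denom  # parameter along ray 0
    t = (a * e - b * d) / denom  # parameter along ray 1
    return ((o0 + s * d0) + (o1 + t * d1)) / 2
```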
@papr thanks!
@user-2798d6 How many natural features do you use? The calibration tries to find a general solution for the mapping function. Therefore it does not always match the given reference points. @user-45d36e this might apply to your problem as well.
Probably 8-10
Should I be using more than 8-10 points for the offline calibration?
No, 8 are more than enough if they are well spread.
Great! So are there any other things I should be looking at or adjusting if the calibration still isn't accurate?
Pupil data confidence should be high and the markers should be set as precise as possible. The new version will improve marker placement and user interaction. This is currently a bit unintuitive. Especially concerning how long a marker is displayed vs when it is used for calibration.
ok - also, when I'm actually placing the markers, I'm noticing a faint marker that is flickering around the screen near where I'm marking, but not actually in the same spot. Is that possibly affecting my markers?
Can you post a screenshot of that?
It's the faint blue dot on the right computer keyboard. During my calibration that blue dot flickered all over the keyboard
Oh ok. These are false positives of the circle marker detection. They are ignored if the calibration section is set to natural features.
got it - thank you!
Here is a link to a quick demonstration video. The viewer was following the toy the whole time. https://we.tl/aTZ1hJ92JG Just a reminder, I am just wondering how to improve the accuracy of our calibrations. For offline calibration with the manual marker, it looks like the gaze maps well onto the marker location, but then sometimes when we reposition the marker the gaze is way off, especially when the object isn't in the centre. How can we improve this online and is there a way to manually correct it offline?
@papr Hey! So there is something funky going on with Pupil Capture on Ubuntu. Every time I run the application from the icon (application icon created in the search path), it opens up, blinks for a while (the way an application blinks when it's loading up) and then just closes on its own. The last time it happened I reinstalled the Pupil SDK and it worked well for a while. It's crashing again now.
How could I run the application directly from the terminal?
I think you can run it with /opt/pupil_capture
@user-c828f5
also if you install the bundle on linux
just type pupil_capture in terminal and it will start
Hello! When working with offline calibration, is there anything I can do about the pupil detection in this screenshot? When I watch during the pupil detection process, almost everything looks fine, but when I move on to actually calibrate, what you see in the screenshot happens in a few places. Is there some way to adjust what it thinks is the pupil?
@LKH#3236 can you run detection in 2d mode? I think it will work better for your case.
@user-b79cc8 hey, thank you for the recording. I will have a look at it on Monday 🙂
Hey guys, a bit of a beginner question here. What is the difference between pupil and gaze position? I just recorded some data to test Pupil Labs out: while I tried to keep my head still, I made several eye saccades and plotted the results in MATLAB (see image). It seems that for my x-coordinates my pupil position is just an attenuated copy of my gaze position, whereas for the y-coordinates it is the opposite. Is this correct? Or did I make an error in retrieving the data out of the csv files?
Hello! Another calibration question - I ran detection in 2D mode (see mpk's comment above) and the calibration was still off. My question is how exact can I expect calibration to be? I know the glasses can get within 1 degree of accuracy, but in terms of calibration, can I expect for the marker to be EXACTLY where the person is looking so I only have to worry about that 1 degree, or is it typical for the gaze vs. the marker to be a little off? We are having people look at a musical score, so the notes are kind of tiny and any degree that the calibration is "off" means that we could be off by a whole measure of music.
Hello, I've got a calibration question too: What assumptions are made for screen marker and manual marker calibration? (Like: assume screen markers are all in one plane.) I tried digging through the source code but I didn't quite figure out how it works / at which places they differ. Sorry if this has been asked before.
@gendelo3 They only differ in the visualization. One is shown on the screen, the other one needs to be printed.
@papr that I know, I was more interested in how the processing works.
@user-97e499 Ah, I understand. @user-894365 can tell you more about it. She has been working on improving the calibration marker detection.
Hi @user-97e499 In the screen marker calibration and the manual marker calibration, the algorithm does edge extraction and then detects clusters of ellipses. The center of the first cluster found is reported as the position of the marker. In the current algorithm, it is quite likely that it reports a position where there is no marker. So we are working on improving the calibration marker detection, to make sure that the algorithm reports the exact position of the marker.
@user-83773f Pupil positions are the positions of the detected pupil within the eye video frame. A gaze position is a pupil position that has been mapped to the world frame. The mapping function is estimated after running the calibration. The mapper uses the identity function to map pupil positions to gaze positions if no calibration was performed previously. I suspect that you did not calibrate and therefore get the strong correlation between pupil and gaze positions
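If you want to sanity-check this from a Player export, here is a quick sketch in Python (column names as in the CSV exports of this era; adjust if yours differ, and the same is easy in MATLAB):

```python
import matplotlib.pyplot as plt
import pandas as pd

# Load a Pupil Player export (column names may vary between versions).
pupil = pd.read_csv("exports/000/pupil_positions.csv")
gaze = pd.read_csv("exports/000/gaze_positions.csv")

fig, (ax_x, ax_y) = plt.subplots(2, 1, sharex=True)
ax_x.plot(pupil["timestamp"], pupil["norm_pos_x"], label="pupil x")
ax_x.plot(gaze["timestamp"], gaze["norm_pos_x"], label="gaze x")
ax_y.plot(pupil["timestamp"], pupil["norm_pos_y"], label="pupil y")
ax_y.plot(gaze["timestamp"], gaze["norm_pos_y"], label="gaze y")
ax_x.legend()
ax_y.legend()
plt.show()
```

Without a calibration, the two sets of curves should track each other almost exactly; after a calibration they should differ.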
@user-b79cc8 I just finished watching your demo video. Which calibration method did you use in the demo? Circle Markers or Natural Features? The first thing I noticed is that it is not necessary to move the marker so much. It is important that the field of view of the world camera is covered. The easiest way to do that is to keep your head fixed and move the calibration marker within the world camera's view. E.g. the marker left the camera's view between 00:38 and 00:45. Therefore I suggest having a person monitor the Capture window and check that the marker is visible. Additionally, I would suggest showing the calibration marker only while you are looking at it, i.e. turn the paper with the marker upside-down during initial setup and only turn it towards the camera during calibration. This advice matters less if you set the calibration range correctly in the Offline Calibration plugin.
@user-894365 - I'm not sure if I understand your comment about the calibration. I'm doing offline calibration with natural markers - could you tell me a little more about how your comment relates to that? Thank you!
I'm still getting my terminology down - still learning 😃
@user-2798d6 I think @user-894365 mentioned you unintentionally. 🙂
Oh! Ok thanks!
Btw @user-2798d6 the further away your target is, the more impact the 1-degree error has. At which distance is your subject looking at the music notes?
About as far away as from a computer - so less than a meter.
@user-2798d6 I think the current offline calibration using natural features suffers a lot from a bad user interface. The new version will allow you to set more markers in consecutive frames and therefore yield a more precise result.
Ok, great! Thank you!
Hi guys. I'm using unity with the pupil labs software and the unity pupil labs plugin. I would like to know where the user looks in the unity world but I don't manage to get this data
@user-c0934a please could you raise this question in the 🥽 core-xr channel?
Hi everyone, I am quite new to Pupil and would like to know if there is somebody who could answer some questions regarding Pupil Remote?
Hey @user-34aab4 Welcome to the channel. Please ask your questions here. 🙂
Thanks 😃 I'm trying to work with Pupil Remote for the first time and I am not used to network protocols and stuff like that. I have Pupil Capture running on an Ubuntu system and would like to receive data on a Windows laptop. I installed Python and downloaded an example python script from the pupil-helpers repository (filter_gaze_on_surface). Finally, I installed the required dependencies using pip. Actually, I don't know if that's all that needs to be done, but when running the python script (with Capture running on the other computer) the script stops here: sub_port = req.recv_string(). It seems like there is no response. My question is whether I forgot to do some basic stuff or whether special versions of python or zeromq need to be installed?
The filter_gaze_on_surface
subscribes to a specific data topic. In this case to gaze on a specific surface. To receive this data, the surface needs to exist. Surfaces are created using the surface tracker. Did you load this plugin and set up surfaces? This is kind of an advanced feature already. In your case, I would suggest starting with this example: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/pupil_remote_control.py
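For reference, the connection flow in these examples boils down to the following (a minimal sketch; replace the IP with that of the computer running Capture, 50020 is Pupil Remote's default port):

```python
import zmq

ctx = zmq.Context()

# REQ socket to Pupil Remote (default port 50020).
req = ctx.socket(zmq.REQ)
req.connect("tcp://192.168.1.100:50020")  # IP of the machine running Capture

# Ask Pupil Remote for the port of its data publisher.
req.send_string("SUB_PORT")
sub_port = req.recv_string()  # blocks forever if Capture is unreachable

# Subscribe to gaze data on that port.
sub = ctx.socket(zmq.SUB)
sub.connect("tcp://192.168.1.100:{}".format(sub_port))
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze")

while True:
    topic, payload = sub.recv_multipart()
    print(topic)
```

A hang at `req.recv_string()` therefore means the REQ socket never reached Pupil Remote - usually a wrong IP, a firewall, or the plugin not running.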
Thanks for your answer! Yes, I already created a surface. However, I also tried the remote control script. To me it seems as if there is a general problem with the connection. The remote control script also stops when it tries to receive something: socket.send_string('t'); print(socket.recv_string())
Ah, nice! Did you change the ip address in the script to the ip address of the computer on which Pupil Capture is running?
Yes, I did. Is it important which python version I use?
We recommend to use Python >=3.5, but this is not critical for running the script.
Did you check if there is a network connection to the ubuntu computer, by e.g. pinging it?
I have a short question regarding your previous answer ("Ah, nice! Did you change the ip address in the script to the ip address of the computer on which Pupil Capture is running?"). I have to set the same IP address in the script that is shown in the Pupil Capture remote plugin, right?
As far as I know, you do not need to change anything in Pupil Capture. You need to adapt the IP address in the example python script that runs on Windows. Specifically this line: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/pupil_remote_control.py#L33
Adapting the IP address in the python script is what I meant. However, I am not sure what to insert there. The IP address that is shown in the Capture remote plugin?
And yes, I tried to ping and it works fine
Which address did you use?
Can you run `ip addr show` in a terminal on your Ubuntu computer? Is the shown IP address the same as the one you pinged from the Windows machine?
If I get it right it's showing two ip addresses and I used one of them in the script
Ok, and it is probably not `127.0.0.1`, correct? Then I assume that the network connection is there and the issue is on the Pupil Capture side. Which Pupil Capture version do you use?
I am sorry that I do not know enough about this topic to follow in detail. Right now I used 127.0.0.1 but also tried other ones.
Maybe it's best to contact somebody personally who has knowledge about stuff like that. I guess it's hard to do this just by texting. However, it's good to know that it should not be a problem with the version.
I am using capture version 0.0.14.7
sorry 0.9.14.7
I would suggest to switch to personal messages since this is a setup issue and is not relevant for other users.
Hi! I received my pupil a few days ago and started playing with it! I'm enjoying it! 😀
I have a question. I'm trying to produce heatmaps on a surface. I have the surface found on Pupil Capture and I recorded a short session. Then I open it on Pupil Player.
I am following what is written at https://docs.pupil-labs.com/#surface-tracking
"Surface Heatmaps It is possbile to dispay gaze heatmaps for each surface by selecting Show Heatmaps mode in the Surface Tracker menu. The surface is divided into a two dimensional grid to calculate the heatmap. The grid can be refined by setting the X and Y size field of each surface. "
I don't understand what X size and Y size are...
@user-943415 welcome to Pupil !
size x and y are the real world size of your surface. This can be in the units of your choice. (think pixels of a screen or cm of a physical surface.)
I also think the docs should state: (1) how the heatmap is going to be shown (clicking play?) (2) where it is going to be shown (in the main window while playing? in the debug window while playing?)
also, clicking on "Re-calculate gaze distributions" should give some visual feedback, I think. Now nothing happens
wow, thanks mpk!
so if I leave X size and Y size set to 1 (default value), what am I saying to the software? And if I set it to 1000? How does this change what the software does?
if you set it to 1 you will not get heatmaps (If I remember correctly.)
I recommend setting it to some sensible value like 200x300
if you click recalculate and set the display to heatmap, you should see pretty overlays on the detected surfaces.
it worked! thanks!
I suggest you specify this important detail a bit better in the interface and in the documentation 😉
Would it be useful for you if I fork the project on github, do some small changes (just text both on documentation and interface) and then open a pull request? Or you prefer to manage this (important) part of your work in other ways?
@user-943415 a fork and PR would be perfect!
thank you!
perfect! going to do it!
since we are here, 2 more questions 😉
1) you say "size x and y are the real world size of your surface. This can be in the units of your choice. (think pixels of a screen or cm of a physical surface.)". but pixels don't have a dimension, right? I mean, pixels and centimeters are not exchangeable units, right?
a pixel is a unit.
if you measure in it.
in the end what we mean by scale is that we don't know the real world scale or size. So you have to supply it.
the units that you count in are arbitrary.
in fact the second question is
2) why does the software need to know the real world dimension of the surface for computing the heatmap? i.e. what if I say the X size is 100000 cm = 1 km? Or is it only the ratio between Xsize and Ysize that counts?
the relation is important.
correct ratio.
also we export the data in this scale.
I think the size also affects the heatmap binning.
yep, I tried with (xsize=1.0,ysize=2.0) and I get something almost uniform in orange
then I tried (xsize=3.0,ysize=2.0) and I get something that seems to make sense
the same results I get with (xsize=300.0,ysize=200.0)
so I still don't understand what this info is used for.
if the real world size is 350.0x200.0 and I specify 300.0x200.0, do I get unrealistic heatmaps? Or what do I get?
@user-943415 this used to be the case in the past. But we may have changed that recently.
then scale is only important for the export and the ratio
I'm asking because we plan to conduct some simple usability studies in the beginning, so I need to know what we are going to show to the stakeholders 😉
ehm, I think I don't understand 100% your sentence "then scale is only important for the export and the ratio"
sorry to bother and thanks!
if you prefer to "speak in code" you can point me to the part of code in which the related computations are made, I can read python 😉
I have seen that exporting with (xsize=3.0, ysize=2.0) produces a heatmap_surfacename.png that is very pixelated, while with (xsize=300.0, ysize=200.0) it is much more reasonable, so it is like you were saying before
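That fits the earlier point about binning: if the heatmap grid resolution follows the surface size, (xsize=3, ysize=2) gives only 6 cells, while (300, 200) gives a fine grid. A rough sketch of the idea (my reading, not the surface tracker's exact code):

```python
import numpy as np

def surface_heatmap(gaze_norm, size_x, size_y):
    """Bin normalized on-surface gaze points ([0, 1] x [0, 1]) into a grid.

    Sketch only: if the grid resolution follows the surface size, a
    (3, 2) surface yields 6 cells (pixelated export) while (300, 200)
    yields a smooth heatmap -- matching the observations above.
    """
    gaze_norm = np.asarray(gaze_norm, dtype=float)
    hist, _, _ = np.histogram2d(gaze_norm[:, 0], gaze_norm[:, 1],
                                bins=(int(size_x), int(size_y)),
                                range=[[0, 1], [0, 1]])
    return hist
```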
also, replying to myself, this link has some answers https://groups.google.com/forum/#!topic/pupil-discuss/oWS7dCYac8U
@here we just pushed v1.1! https://github.com/pupil-labs/pupil/releases/tag/v1.1
Happy Version 1.1! I love the new interface!
hey, I just wanted to stop by and say gz on the new release! It's probably time for me to dust off the tracker once more and check it out! 😉
Hi, we recently received our Pupil Eye-tracker. I've successfully installed all the pupil software and have it up and running on my Macbook Pro 2017 running OS X 10.6.2 (very impressed with everything)! We use MATLAB 2017b and the psychtoolbox for stimulus presentation, so the next step is to figure out how to gain online access to the data stream via MATLAB. I found some third party code on github that should enable this (https://github.com/matiarj/pupil-helpers/tree/matlabAddV2/pupil_remote/Matlab_Python) but I'm having some issues trying to get this to work (I posted on the google group but didn't receive a response: https://groups.google.com/forum/#!topic/pupil-discuss/Ze6GLyN7LBw). Is this the recommended way to achieve MATLAB integration? I'm also having some trouble with the libuvc installation procedure listed on the mac developers section of the website (https://groups.google.com/forum/#!topic/pupil-discuss/0LTy5DJM7qo), but I don't know if this is related to the MATLAB issues. If you have any suggestions for how to resolve these issues I'd really love to hear them! Thanks, Tom Bullock (Postdoc, UCSB Attention Lab)
@user-e7102b thanks for the post. We will have a look on monday!
Great - thanks 😃
Congrats on your v1.1 Release – looks beautiful! Are there any news about the Oculus CV1 integration? We're eager to get our hands on those, so we would like to order ASAP 😃
Hello, I've recently got my pupil. I just have a small problem running pupil_player on my Ubuntu workstation. When I try to run it (after installing it from the .deb on the site), I get this:
ImportError: ('Unable to load OpenGL library', "Failed to load dynlib/dll 'libGL.so.1'. Most probably this dynlib/dll was not found when the application was frozen.", 'libGL.so.1', 'libGL.so.1')
I installed both 64bit and 32bit versions of this library, tried LD_PRELOAD, but can't seem to figure it out.
Do you know if Pupil works with (1) Ogama or (2) PyGaze or (3) OpenSesame? I've searched on the web and the answer seems to be "no" but it thought I would ask here anyway. Thanks!
Hi @papr I'm working with Sara on this. Regarding the online calibration: it was my first time doing it, and I believe most of what you said is usually followed. For offline calibration, I believe we use the circle marker, and whatever we do, the calibration seems to be off consistently, usually in a specific direction. What else about the calibration range should I be doing? (It will not let me alter it in any way.) Sorry for the probably quite basic questions, this is all very new to me!
@user-e938ee The bundle should work out of the box. It comes with all required libs. Please try reinstalling the bundle and revert your custom lib-loading changes if the reinstall does not work .
@user-9b7f2d No, we do not have native integrations with any of these. But we have a very simple network api that can easily be integrated into existing Python projects.
@user-c47be0 Did you alter the manual gaze correction settings for your gaze section? This feature offsets the gaze by a given amount in a specific direction. Do not worry too much about the calibration range for now. Make sure the marker is clearly visible within the recorded scene video and that the pupil detection works well.
@user-c47be0 Just to clarify: What do you mean by calibration range? The distance of the marker to the subject or the selected frame range in the recording which is used to calibrate?
@user-29e10a thanks for the positive feedback! Regarding CV1 - we have been pushing back the release on this because we are waiting on new cameras. We hope to be able to have this available within the next 2 weeks... very very close to release!
@user-9b7f2d adding on to what @papr notes on the network API - please see docs here: https://docs.pupil-labs.com/master/#interprocess-and-network-communication
@papr sorry, I think I was referring to the distance of the marker rather than the frame range. I hadn't altered the gaze correction settings, so should I try this?
@user-c47be0 Ok, good. Do not worry about the marker distance. Usually it is not required to alter the manual gaze correction. Did you upgrade to v1.1 already? Do you see the thicker part of the green lines? This is the calibration range and should wrap around the found markers (short white vertical lines).
@papr Yeah, I ran it again from source and now it works, although with some stability issues (one of the eye cameras crashes often)
Is there any way to extract raw information from the live capture? Or at the very least from the recording? The data I'm after is pupil diameter and their relative movement
@user-e938ee See the post above about the network api. You can subscribe to the pupil data which includes diameter and relative movement (if you use the 3d pupil detection)
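In case it helps, a minimal subscription sketch along the lines of the pupil-helpers examples (payloads are msgpack-encoded; the diameter_3d field is only present when 3d detection is active):

```python
import msgpack
import zmq

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote; replace IP if remote
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect("tcp://127.0.0.1:{}".format(sub_port))
sub.setsockopt_string(zmq.SUBSCRIBE, "pupil")  # raw pupil data topic

while True:
    topic, payload = sub.recv_multipart()
    datum = msgpack.loads(payload, raw=False)
    # diameter is in eye-image pixels; diameter_3d (3d detection) in mm
    print(datum["norm_pos"], datum.get("diameter_3d", datum["diameter"]))
```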
Thanks @papr and @wrp !
You're welcome @user-943415
I found in this issue on github https://github.com/pupil-labs/pupil/issues/503 that cpicanco was able to import pupil data into Ogama. I sent him an email to see if I can get additional details or code about this.
@user-41f1bf 👆
@papr Thanks! I'll look into it.
After installing v1.1.2, changing the source camera for eye0 doesn't work properly anymore (the Detect eye 0 circle stays activated, but the eye0 process crashes). Does anyone have a similar problem when switching through cameras? Pupil Capture selects my notebook's integrated camera for eye0 per default, which forces me to change the configuration.
@user-97e499 Could you specify what you mean by "the Detect eye 0 circle stays activated"? Could you post a screenshot?
I'm talking about the "checkbox" thing where I should be able to toggle detecting eye0 and eye1. After trying to change the source for eye0 from the integrated webcam to "id0" or "id1", the detection window for eye0 crashes.
And clicking the green circle doesn't do anything anymore after that too.
@user-97e499 Would you mind opening a github issue for that? Please attach a note about your operating system and if you are running from source or from bundle.
I can do that, np
@here We are very excited to announce new Pupil hardware! 200 FPS eye cameras - available starting today via https://pupil-labs.com/store. Check out our blog post on the new hardware for more details: https://pupil-labs.com/blog/2017-11/200-frames-per-second/
I've been looking for a wearable device for our research project, but I need something that will record IR video of the iris, up close, plus capture pupillary measurements. Do y'all offer both features in one device?
@user-26898b What distance do you have in mind when you say "up close"?
close enough to see discrete changes in individual iris musculature
@user-26898b Our eye cameras are adjustable but have an average distance of 2 cm.
The visibility of the iris musculature changes may depend on the camera resolution used. You might need to make a trade-off between resolution and captured frame rate.
@user-26898b I'm sure our hardware can be set up to do what you need.
You will have full access to the IR eye video feeds and pupillometry data
I was looking at using the VisualEyes 525 system from Interacoustics, but it does so many other things that I don't need and the price is way up there. But it has the video I need.
How could I look at a sample of what the video feed might look like of the iris?
@user-26898b if you want we can make a recording with close up eye video to show you.
Do you have a photo that shows what kind of image you are looking for?
yes, one second I'll find it
this is from the Interacoustics wearable hardware
it's called video frenzel
@user-26898b This is a screenshot of our Capture software's eye windows. You can record the eye video feed if you want to. The red and green circles are visualizing detection data. They will not be included in the recorded video.
The focus of the cameras can be manually adjusted.
could you send me one where they are looking forward, so I can see if the res on the iris is adequate, please
@user-26898b we can send an image where the eye is looking into the camera. However the camera is not frontally located by design on the Pupil headset, therefore the eye images will likely not be central (like your example) unless looking directly into the camera.
can it be altered so that the camera is frontally located?
@user-26898b we are building a custom eye camera mount for another researcher
our participants will be looking forward the whole time, so that we can look at discrete iris changes and get pupil measurements
@user-26898b we can offer a custom eye camera arm extension so you can get frontal images of the eye
Just send us an email to info@pupil-labs.com and we can follow up from there
but would they still be close up views?
Yes
ok, I'll email
Thanks
Hello--message from Prerna who is part of our team: "So we have a new release of pupil capture but as soon as we click C for calibration the pupil capture application hangs." Does anyone know how to resolve this issue with the newest version of pupil capture? Thank you!
@user-45d36e We are not aware of such an issue. Please create an Github issue including the following information:
- Operating system and version
- Running from bundle or from source?
- Which calibration method has been used
- Please attach the capture.log file, which you can find in the capture_settings folder
Hello! I've downloaded the new version of Player and Capture. When I try to do offline calibration with natural features, I click for a natural feature mark and the dot shows up elsewhere on the screen rather than where I just clicked. It's appearing diagonally up and to the left from where I click. Is anyone else having this issue?
@user-2798d6 How far away is it? What display and OS do you use? This sounds like the hdpi-factor of your screen is not calculated correctly.
On my 13inch screen, it's about 2 inches away diagonally
I'm in OS Sierra
What type of MacBook are you using? You can look it up by clicking the Apple icon in the top left and opening the "About this Mac" window.
It's a MacBook pro with retina Early 2015
Ok, I am using the same version. I will try to reproduce your issue.
thanks!
@user-2798d6 I was able to reproduce the issue both running from source and running from bundle. Please create a Github issue.
How do Github issues work? I created one - but what happens from here?
Thanks @user-2798d6 now we can keep track of the issue - https://github.com/pupil-labs/pupil/issues/934 - the issue will be closed when it has been resolved/fixed.
Hello again, another question from my colleague Prerna: Is there a mechanism to synchronize 2 eye tracking recordings after the fact in Pupil Player? She wants to reduce technical issues by recording each person on a separate laptop and then sync the videos afterward. Can we do this by matching up timestamps, etc.?
When using 3D pupil detection, how accurate does the green circle around the eye need to be? A lot of times the circles around both eyes are two different sizes or they really don't match the eyeball size.
The green circle should be roughly (+- 3-5mm) as big as the eye ball. The important part is that it is stable (opaque green, does not jump around). Move your eyes around if the green circle does not match your eye ball at all. This will generate better samples for the model than not moving the eyes.
Thanks!
Is there any record of the different plugins people have made to go with the main software?
@user-988d86 there is currently not a directory/archive/list of user contributed plugins
This would be nice to have!
Hi! Has anyone used the MATLAB code? I see that we get one reading per timestamp, but is there any way we could get x and y coordinates and diameter for each eye rather than one?
Just out of curiosity, when did the 200 hz binocular system launch? We just purchased the 120hz binocular system, and are a little sad that we weren't notified the 200hz system was launching so soon.
I'm also curious about this. We just purchased the 120Hz system but would have been willing to hold off for a few weeks for the 200 Hz system.
@user-ecbbea @tombullock#3146 we released the 200hz system 2 days ago. We are getting our first production batch this week.
If you want to upgrade just write us an email!
Thanks @papr !
Does anyone know if there is a way in pupil capture to view the error of each fixation estimate?
Hello everyone, I downloaded the Pupil Labs app for Android (OnePlus 3) and the app works just fine. I tried data recording on it. The app doesn't give us an option to change resolutions, so I found that the eye video was capped at 320x280 (?) and the scene video at 1280x720. The videos were written out as MJPEG files, which I couldn't read into VLC or MATLAB or Python. I wanted to know the frame rates that I could get from mobile recording.
@user-c828f5 You should be able to change the resolution of the world camera `Pupil Cam1 ID2`.
You will also need to open the videos in Pupil Player to view the videos
@user-c828f5 settings are locked during recording. Once you stop that you should be able to change them.
MJPEG video can be transcoded to h264 using ffmpeg, e.g. `ffmpeg -i world.mjpeg -c:v libx264 world.mp4`
Hi there! Glad to meet this community and to use the pupil hardware and open-source software!
I just received my pupil headset with the Intel RealSense R200 RGBD world camera but am having issues installing it on macOS (latest version) - I followed the instructions on https://docs.pupil-labs.com/#librealsense but cannot get it to work with Pupil Capture - the world camera is not recognized... any update on this?
@Laurent#2505 does the realsense backend show up in the backend selection menu of Pupil Capture?
hey, I'm new to this chat thread
Does anyone have any experience with gaze interaction between the pupil glasses
and a computer program/game?
And how accurate is it?
Hi @user-33d9bc welcome to the Pupil chat 👋
Under ideal conditions gaze estimation accuracy is 0.6 deg. In the field (not ideal conditions) you may see less accuracy than this depending on the individual, environment, setup of hardware (e.g. focus of cameras), and quality of calibration
An example (contributed by a researcher and gamer) can be seen in this video: https://youtu.be/X_BalnBOcpk
@user-33d9bc could you give an example of what you are hoping to achieve (your application or research objective)?
Hello everyone, I have a question regarding calibration accuracy with varying distances to AOIs. My colleagues do driving research. In this use case the distance to AOIs varies from approx. 50 cm to around 100 m. Do you have some recommendations for getting accurate tracking at all distances?
Furthermore, these studies are carried out at night. We adjust contrast and brightness to get better world camera videos; do you have further recommendations for getting a good world video? (We would have infrared beams supplementing the headlights.)
Thank you
Hi all, my plug-in works great and I want to use a third-party python library (pyserial). I wanted to know if it's possible to load it at run-time. I'd like to avoid having to build Pupil Capture from source, so the user can just drag and drop the plug-in, but we need to interface with a device over serial, so is there any way to tell Pupil Capture where to load libraries from?
@user-23d980 Unfortunately, I did not find a way to do so yet. This is due to how `pyinstaller` bundles the applications. Runtime plugins can only use modules that are included in the bundle.
ok no worries
I found a small lib that might just work by copying & pasting
so fingers crossed won't be an issue
thanks for quick reply @papr
Good luck! Let us know if it works.
Copying and pasting a lib relative to your plugin (e.g. in a dir next to your plugin) should work
@wrp @user-23d980 AFAIK, you will have to do so with all its dependencies as well. This does not work for system modules though that are not included in the bundle.
Right
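For anyone attempting the same, the copy-and-paste approach can look something like this (hypothetical file names; this only works for pure-python dependencies like pyserial):

```python
# my_serial_plugin.py -- placed in Pupil Capture's plugins directory,
# with the pure-python dependency copied into a sibling "vendor" folder.
import os
import sys

# Make the vendored copy importable before importing it.
_VENDOR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "vendor")
if _VENDOR not in sys.path:
    sys.path.insert(0, _VENDOR)

import serial  # resolved from vendor/serial, not from the bundle
```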
Hi @wrp @papr sorry to bother, I've been using my trusty Pupil Labs glasses for quite a long while now. With the latest v1.1, I'm facing the same problem as what Sara mentioned "Sara - 11/15/2017 Hello--message from Prerna who is part of our team: "So we have a new release of pupil capture but as soon as we click C for calibration the pupil capture application hangs." Does anyone know how to resolve this issue with the newest version of pupil capture? Thank you!"
The calibration screen freezes and stutters. There are no calibration targets. If I esc the calibration, the main screen is glitched up.
Currently I'm not using the same MBP as the one I used before, so I totally haven't got a clue whether it's a problem with my current MBP setup, or an incompatibility with v1.1. Hope to get this sorted so that I can continue using my Pupil glasses! Looking forward to the 200hz cameras too!
@user-7ca285 what version of macOS and computer specs for the MBP?
@wrp Hi thank you so much for the reply!!!!!
And these are the screenshots of the problem
Blank stuttering screen on calibration; and a glitched screen upon exit from calibration
@user-7ca285 this looks like an opengl issue maybe related to retina macs. I will not be able to recreate this with my current machines unfortunately.
@papr can you take a look at this when you get online?
@user-7ca285 can you try a non full screen calibration
uncheck the full screen option in the calibration plugin
Is this gonna help:
2017-11-23 12:37:01,204 - MainProcess - [INFO] os_utils: Disabled idle sleep.
2017-11-23 12:37:04,703 - world - [WARNING] camera_models: Loading dummy calibration
2017-11-23 12:37:04,801 - world - [WARNING] launchables.world: Process started.
2017-11-23 12:37:05,898 - eye0 - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend
2017-11-23 12:37:05,912 - eye0 - [INFO] launchables.eye: Session setting are from a different version of this app. I will not use those.
2017-11-23 12:37:06,095 - eye0 - [INFO] camera_models: No user calibration found for camera Pupil Cam1 ID0 at resolution (640, 480)
2017-11-23 12:37:06,095 - eye0 - [INFO] camera_models: No pre-recorded calibration available
2017-11-23 12:37:06,095 - eye0 - [WARNING] camera_models: Loading dummy calibration
2017-11-23 12:37:06,364 - eye0 - [WARNING] launchables.eye: Process started.
2017-11-23 12:37:26,807 - world - [ERROR] calibration_routines.finish_calibration: Did not collect enough data during calibration.
2017-11-23 12:37:33,324 - eye0 - [INFO] launchables.eye: Process shutting down.
2017-11-23 12:37:34,326 - MainProcess - [INFO] os_utils: Re-enabled idle sleep.
Really really appreciate the invaluable help!
@user-7ca285 the log is not helpful, but thank you for posting
Ok great
I'll try as you suggested
Please try a non-full screen calibration as requested above
thanks
It's the same results
ok, thanks
we will try to reproduce this issue on our end later today
Thank you soooo much!!
Really appreciate the help
@wrp Hi, my bad, yes, disabling "Full Screen" from the Calibration Plugin does remove the problem
@user-7ca285 ok, that is helpful in narrowing down the issue.
@papr glfw fullscreen issue - please try to recreate on your dev machine if possible today
Thank you really awesome people!!!
welcome - we will be working on it
Thank you again!
@user-7ca285 @wrp I was not able to reproduce the issue on my MacBook Pro (Retina, 13-inch, Early 2015), neither running from source nor from bundle. @user-7ca285 Are you running from bundle or from source? Re-run your dependency installation in case you run from source. Nonetheless, I would highly recommend upgrading your OS to a newer version. macOS High Sierra runs very smoothly for me and the overall software support (specifically on homebrew) is much better.
Hi, does the Logitech C920 HD Pro work as a world cam? The C930 works nicely, but the 920 is on sale on Amazon just today...
@papr fortunately I was able to just copy in the library and it worked
Is there a way to work out the orientation of a square marker?
relative to the world view?
@papr Ok great, thanks lots, I'll give that a try! Will update at a later date
@papr I'm running from bundle, btw.
@user-23d980 Do you mean 3d orientation? See https://github.com/pupil-labs/pupil/pull/872 for that.
@papr yup, I've come across that. My understanding is it requires a minimum of 2 markers to orient, which makes sense. I was trying to work out if there is a way to do it with just 1 but can't think of how.
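For what it's worth, the four corners of a single square marker are enough for a pose estimate via a PnP solve (up to the usual planar ambiguity), given the corner pixel coordinates and the camera intrinsics. A hedged OpenCV sketch - the corner and intrinsics inputs are assumptions, not something Pupil's tracker exposes directly:

```python
import cv2
import numpy as np

# 3d corners of a square marker of known side length (here 5 cm),
# expressed in the marker's own coordinate system (z = 0 plane).
SIDE = 0.05
OBJ_PTS = np.array([[-SIDE / 2, -SIDE / 2, 0], [SIDE / 2, -SIDE / 2, 0],
                    [SIDE / 2, SIDE / 2, 0], [-SIDE / 2, SIDE / 2, 0]],
                   dtype=np.float64)

def marker_pose(img_pts, camera_matrix, dist_coeffs):
    """img_pts: the 4 detected corner pixel positions (same order as OBJ_PTS);
    camera_matrix / dist_coeffs: the world camera intrinsics."""
    ok, rvec, tvec = cv2.solvePnP(OBJ_PTS, np.asarray(img_pts, np.float64),
                                  camera_matrix, dist_coeffs)
    return rvec, tvec  # orientation (Rodrigues vector) and position
```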
Hi all, my colleague is using Pupil Capture, and she keeps receiving a 'dropping frame' message on her screen. Do you know what this means? What could be causing it, and how could we alleviate it?
Manual Marker Calibration: I am struggling to get a successful calibration using this method. It always returns "Did not collect enough data during calibration. Using: monocular 3d model", no matter how many marker points I use. Using Pupil Service v1.1.7. Any advice would be appreciated.
@user-2968b9 a bit more info about the setup would be required. My guess here is you are using Pupil Mobile together with Pupil Capture?
Yes we are
and the error is only happening when you start a recording?
What do you need to know? I can get the info from her
I'll check, she's operating in a different country, so she may take a little time to reply
@user-2968b9 no I think I only need to know the answer to the above question
Okay, perfect. I've asked her, and I'll let you know as soon as she gets back to me. Thanks for the help. 😃
because if this only happens at the beginning of the recording then, its fine. Pupil Capture is simply waiting for the first h264 keyframe to start a new h264 mp4 file.
Ah I see
we should silence the warning. It's an implementation detail.
Ah, fair enough, so it's not really a massive deal, so to speak?
yes.
it's intended and not harmful.
Ah fair enough
I'll let her know that, if she tells me that it's coming up with greater frequency, I'll let you know. Thanks for the help!
Morning folks. So, I have got an issue with the player updates and am wondering if anyone has come across this and solved it. Thanks in advance.
I updated my Pupil Capture and Player software and now my old recordings from version 9.12.2 will not open in the player; it throws the following error. The interface still loads, but when I attempt to load the recording folder anyway, it crashes. Please help!
Error Message on attempting to load a file is as follows: MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
loadLibraries()
player - [INFO] launchables.player: Session setting are from a different version of this app. I will not use those.
createNativeWindow()
_glfwInitWGL()
Loading opengl32.dll
_glfwCreateContextWGL()
loadWGLExtensions()
choosePixelFormat()
it seems pretty self-explanatory but I am not sure how to go about resolving this - sounds like some backward compatibility issue. Has this been solved and if so, can anyone point me to the thread?
Many Thanks
Worried researcher. 😦
I have had a rummage and found the following thread - which I have tried to follow. Please see the report attached for details - including system specs etc. Thanks guys. Keeping my fingers crossed
Hi everyone,
I'd like to use image data from pupil-labs for image-processing, but I couldn't find the detailed specifications of the camera. Does anyone know the focal length and the size of CCD or CMOS?
I'm using "high speed 200hz-binocular".
@user-6764f5 Is it possible that you mean the 120Hz eye cameras?
@papr Apologies, I mean the world camera.
Hello all, has anyone used pupil eye-tracker in a multi-monitor setup?
@user-36aef3 Hey 🙂 I am using the Pupil software in a multi-monitor setup for work. I think @user-41f1bf has also a lot of experience in this field.
@user-36aef3 For completion: You can specify on which monitor calibrations should be shown. You can also track multiple monitors using the surface tracker in case that you need gaze that is relative to the content shown on the monitors.
Hi everyone,
My advisor is planning to order a Pupil Labs eye tracker and I'm not sure which one to choose between high resolution, high speed, or 3D. Does anyone have any advice about this? Thanks a lot in advance.
@NahalNrz#1253 Usually we would recommend the high speed world camera. It comes with 2 lenses (100deg and 60deg) and can capture at variable spatial and temporal resolutions (and is the smallest of the options). The 3d world camera uses the Intel RealSense R200 sensor; it provides depth data of your scene (RGBD video streams). It is a newer sensor in our system and should be used only if you need RGBD data and are comfortable working with this kind of data.
Thanks a lot
Welcome @user-006924
Is there a big accuracy gap between the monocular and binocular versions? And in case it might be helpful: the eye trackers are intended to be used by older adults.
@user-006924 - You can achieve high accuracy with a monocular system. However, a binocular system provides more data (views of both eyes), and therefore pupil detection for at least one eye should be high confidence due to redundancy in eye movements (e.g. looking extreme left or right). With a monocular eye tracker, an extreme look towards the nose may yield reduced confidence in pupil detection.
@NahalNrz#1253 one more point. Only with binocular hardware is gaze accurate at depths other than the calibrated depth.
@papr thanks for response. Could you point me to some documentation that describes this type of setup?
Does anyone know of a plugin that captures data upon keystroke or clicking a button? Or anything somewhat similar to that?
@user-988d86 Check out the Annotation Capture plugin, it does something like that
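If the plugin's hotkeys aren't flexible enough, annotations can also be published over the network API from your own key/button handler. A rough sketch along the lines of the pupil-helpers remote annotation example (check the helpers repo for the exact protocol of your version):

```python
import time

import msgpack
import zmq

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote

# Sync clocks: "t" returns Capture's current timestamp.
req.send_string("t")
offset = float(req.recv_string()) - time.time()

# Get the port on which Capture listens for published messages.
req.send_string("PUB_PORT")
pub_port = req.recv_string()

pub = ctx.socket(zmq.PUB)
pub.connect("tcp://127.0.0.1:{}".format(pub_port))
time.sleep(1)  # give the connection a moment before publishing

def annotate(label):
    # Requires the Annotation Capture plugin to be loaded in Capture.
    payload = {"topic": "annotation", "label": label,
               "timestamp": time.time() + offset, "duration": 0.0}
    pub.send_multipart([b"annotation",
                        msgpack.dumps(payload, use_bin_type=True)])

annotate("button_pressed")  # call this from your key/click handler
```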
How did you solve the problem caused by the different heights of the camera and the eyes?
When I experimented, there was an error depending on the distance between the eyes and the object.
I think this is caused by the height difference between the eyes and the world camera.
Does Pupil solve this problem?
My eye camera setup is monocular. Does this cause more errors?
@user-8058d7 Please be aware that gaze is only correctly estimated in the depth in which the calibration happened.
what does "estimated in the depth" mean?
does pupil estimate depth during the calibration step?
That was ambiguous, sorry. What I meant was that if you calibrate with a distance of 2 meters, gaze will only be estimated correctly in a depth of 2 meters (especially with 2d calibration, the 3d case is less sensitive to this issue). If the subject looks at objects that are further away the gaze estimation might include inaccuracies depending on how far the object is.
This is probably exactly what you experienced.
Ok i got it
The relative camera positions to the eyes is compensated for in the calibration.
How can you mitigate this phenomenon? There is considerable distance variation in actual experimentation or use.
Binocular is better for this problem?
Yes. The 3d calibration on a binocular device can handle depth much better than the monocular headset. We even add a depth estimation to the gaze point that is based on eye vergence.
@erinome I have been using pupil in a multi-monitor setup from day one. Right now it is really robust. However my experience is restricted to Ubuntu OS.
What are you planning to do?
@user-36aef3
@user-41f1bf We would like to do some experiments with very wide field of view, hence several monitor setup. We are working with Ubuntu as well
If you are planning calibration across monitors you will need to adapt the screen based calibration plugin
Yes it does
@user-41f1bf does it tolerate some head movement?
From my experience, if you use 9 or more calibration points
@user-41f1bf To be honest, my background is computer vision. I've only used Pupil with a single monitor before, for a small study. I don't even know where to start with a multi-monitor setup, and the psychologist in our lab doesn't have experience with this either.
Do not trust in 5 calibration points
My background is behavioral science and I have managed to adapt the plugin. Pupil code is clean and very easy
@user-41f1bf I actually found some of your posts on the mailing list regarding the screen-based calibration and the plugin you wrote. I think I'll study those first before asking any more questions. Thank you for pointing me in the right direction
The screen tracker plugin I wrote is to avoid showing fiducial markers to participants
I have found easier to detect the screen monitor and used this as a "pupil surface"
@user-36aef3 , to make my life easier, my advisor and I have adapted a correction algorithm
This way we increase our chances of quality data.
The algorithm assumes equally distributed stimuli around the screen center and is useful when participants move their heads too much
@user-41f1bf is this part of the code?
No, it is not part of the plugin. The screen tracker only detects the screen and exports data
But it is also written in python
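A minimal sketch of that re-centering idea (just an illustration of the stated assumption, not the actual algorithm):

```python
import numpy as np

def recenter_gaze(gaze_xy, screen_center=(0.5, 0.5)):
    """Shift gaze so its session mean lands on the screen center.

    Assumes stimuli were equally distributed around the center, so any
    systematic offset (e.g. from head movement) shows up as a shifted
    mean. gaze_xy: shape (n, 2), normalized screen coordinates.
    """
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    offset = gaze_xy.mean(axis=0) - np.asarray(screen_center)
    return gaze_xy - offset
```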
Right now I am afk
@user-41f1bf thanks for help
Feel free to get in contact, we can talk
@user-41f1bf thanks, I'll try using your plugins first to avoid asking silly questions
Hum.. I don't mind. Feel free to ask
My code is NOT as clean as pupil's code, so I would prefer your silly questions first
Is there an easy or recommended way to get the DIY world-camera bracket on the frame?
I have been using screws
Do you mean fixing the world camera in the frame?
Both to get the pcb on the bracket and the bracket on the frame? ...I've mounted the pcb on the bracket with the tiny screws from the camera disassembly, but now I simply can't get the clamp on the back of the bracket onto the small cylinder shape on the frame... It's super inflexible.
Ahhhhh...
did you click it on before or after mounting the camera pcb?
It is strong enough. The bracket will fit in the frames easier without cables and the pcb
It also looks like it's a new design....at least it looks different from the one in the documentation
Hummmm...
I'll try unscrewing the pcb and clickking it on...
The one I am wearing has an X shape
The world bracket has X shape
With four places for screws
I have the same one...did you mount the pcb after clicking it on?
Yep
I'll try that then, thanks!
Try pushing with your thumb in the center
Without the pcb
Do you know if I can cut off the wire that goes from the usb cable directly to the microphone (the microphone I have already cut off)... I think it's a ground/shield connection... did you cut it off?
I can send you a picture when I get at the work
As far as I remember, I did not remove any cable.
I do remember that I needed to force cables to fit them inside the bracket
That did the trick... my world camera is now on and the cables are nicely tucked in.
It is an older design
It's just the extra wire on the eye camera that went from the usb cable directly to a small metal piece on the microphone... I took off the microphone as directed in the documentation, but wasn't sure what to do about this extra wire (they didn't say in the documentation) and thought it might be needed for grounding etc...
You are right, they did not say to remove the ground cable. 😀
They just forgot to mention where to put/connect it...
As far as I remember, I did not remove any cable. I cut off the auto focus and mic from the eye camera.
You cut the autofocus as well...? They didn't mention this in the documentation...or how to do it.
Also, as I did not know SMD soldering, I made my own IR connection
Using a resistor and single IR led
You can choose to leave the autofocus intact
You will be able to turn off the autofocus using the most up to date software
Cool... THANKS! I was struggling with the SMD soldering earlier today and hope I got them placed correctly, as it was impossible for me to see the little notch that indicates the cathode end...
I would recommend another camera to you, it is a little bit bigger than the HD-6000, but will do the job
I am using an ELP-xxxxxx, you can find the model number in the pyuvc repository
This camera came with a 5v output
I didn't know you could use another camera for the eye camera... I had a hard time finding it, but got it and it's mounted for now. Did it fit on the same pupil bracket or did you make your own for this as well?
Same pupil bracket
95 fps
Nice. I'll keep it in mind in case I have broken this one... I had a hard time replacing the IR filter and may have scratched the lens... we'll see.
And it is far easier to work with
wow 95 fps is impressive
what can the hd-6000 provide ?
30fps
ELP should allow up to 120 with proper illumination
However I hadn't time to keep exploring
and testing
I'm looking at the pyuvc repository on Git, but can't find a camera list...
You mentioned this on the github issues page... ELP-USBFHD01M-L21... is it the one?
Yes
It is
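If anyone wants to try such a camera directly, here is a minimal pyuvc sketch (device index and frame mode are assumptions; pick your camera from the printed list):

```python
import uvc

devices = uvc.device_list()  # list of dicts with 'name' and 'uid'
print(devices)

cap = uvc.Capture(devices[0]["uid"])  # open the first camera (pick yours)
# Note: "avaible_modes" is the attribute's actual spelling in pyuvc.
print(cap.avaible_modes)  # supported (width, height, fps) tuples
cap.frame_mode = (640, 480, 120)  # e.g. 120 fps on the ELP, if listed

for _ in range(100):
    frame = cap.get_frame_robust()
    print(frame.img.shape)  # BGR image as a numpy array
```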
Hi, just a couple of questions: With the online accuracy calibration, what do the Angular Accuracy and Precision values mean? In a test run ours were 1.18048093324 and 0.00189682860559 respectively. What should the limits be?