@user-8779ef What you are seeing is not periodic changes in confidence, but the 4-5 confidence samples for that world video frame (eye is most likely recorded at 90-200 Hz) repeating in the confidence graph when you hit pause in Player. This is a bug. The confidence graph should not be updated during pause. @papr can you implement a fix for this?
Also note that you are only seeing one video frame out of the 4-5 that match that world frame temporally.
Same goes for the recorded fps graph.
How feasible/simple would it be to detect head rotation/position based on the world cam and markers? I am interested in eye-in-head gaze analysis, as in, gaze shifts that consist of both head and eye movements.
@user-62cec9 Have a look at this work-in-progress pull request: https://github.com/pupil-labs/pupil/pull/872
ahh, thanks!
@user-8779ef @mpk I pushed the fix into master. We will replace these graphs with timelines in the long run. They make much more sense in Player. But the timelines are not precise enough without the zoom feature. Therefore we will discontinue the fps/confidence graphs in Player as soon as we introduce timeline zoom.
Hi everyone. I'm trying to extract fixations-on-surfaces from a large recording (7 GB). In Pupil Player, the orange bar showing the marker cache stops halfway through. When I export the data, the "fixations_on_surface" CSV file also ends at a timestamp around half of the video length. This happens on a Mac (Player version 1.2.7, data format version 0.9.15). I have tried doing this on an Ubuntu machine, but there the Player freezes and quits. Is the video too large to handle? If so, is there a way to downscale the video's quality after the recording?
@user-a4d924 What happens if you seek in Pupil Player to the second half of the video?
@papr the recording plays on (although it's laggy). I can see the gaze, but no fixations and no surfaces (both fixations and surfaces are visible in the first half of the video)
@user-a4d924 You should be able to reduce the video file size by transcoding it with ffmpeg. ffmpeg -i <original file> <new file>
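If you have many recordings to shrink, the same transcode could be scripted, e.g. in Python (a sketch only: assumes ffmpeg is on your PATH, and the file names are placeholders):

```python
import subprocess

def build_transcode_cmd(original, new):
    # Same as the chat command above: re-encode the frames
    # without changing the resolution, so pixel coordinates stay valid.
    return ["ffmpeg", "-i", original, new]

def transcode(original, new):
    # Writes to `new`; never overwrites `original`.
    subprocess.run(build_transcode_cmd(original, new), check=True)
```
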
@papr I was worried that changing the resolution will mess up the pixel coordinates used by Pupil's recording
The command above does not change resolution. Just the encoding of the frames.
@papr alright, thanks a lot! I will give it a go
Please do not overwrite the original file though. Just to be on the safe side.
Hi, I've run into a peculiar issue and I'm not sure if I'm being stupid or something broke. I was changing the Local USB Video Source settings for the input from RealSense R200 camera and now it just defaults loading to a dummy calibration on a still of the last input, irrespective of whatever settings I choose. I've tried deleting my pupil settings folder and running from a fresh pupil recorder directory to no avail. Has somebody dealt with this before?
WRT the previous post, I have managed to resolve the default-input issue by just reinstalling drivers, but changing resolutions in the video source settings will cause the error again
And now it works fine... You may want to just disregard everything I said, it seems something was playing up that I cannot now replicate to report back. lol
@user-e7102b Please see here the complete example: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/remote_annotations.py
@papr ffmpeg solved my problem - thanks once again.
@user-a4d924 Glad to hear it
@papr Thank you for creating the remote annotations example - this works great! Do you have any examples for passing commands from MATLAB to python? I use MATLAB and the psychtoolbox to present stimuli to participants, so I really need to be able to control the eyetracker from MATLAB. I was planning to use the "system.m" command, but I don't think this is suitable.
@user-e7102b hi, I've been quietly watching this conversation because it's of relevance to me also. I am currently interfacing triggers from LUA to python using simple sockets, presumably you can do the same with matlab? Pardon me if I misunderstood your framework/requirements
Hi @user-dfeeb9 thanks for your message. My overall goal here is to try and use the Pupil as a substitute for the Eyelink 1000 trackers that we currently use in our lab. We use MATLAB to control the trackers in one of two ways. The first way is just to send all the usual commands (e.g. start/stop recording, calibrate, event codes) via MATLAB to the eye-tracker and everything is logged in a datafile on the eyelink machine. The second (and often more ideal) way is to grab the data from the eye-tracker online and record it in MATLAB.
I'm not very familiar with python, or LUA, so I'm running into problems when trying to execute either of these approaches
No problem, I'm very new to eye-trackers so I sympathise with the feeling of lacking direction
LUA is just another programming language that I happen to be using due to the stimuli we've selected
The principles are essentially the same irrespective of programming languages, which is that you can use a socket to send whatever bytes/triggers you want
As I understand it, zmq (zeroMQ) is the networking library that pupil uses. Because there's no easy way to get zeroMQ running with my current stimuli in LUA, I have a lua -> python socket where I send bytes (something simple like 1, 2, 3, 4) from lua to python. Said python server script will read the byte and interpret these codes based on whatever label I've given them, so 1 might be "experiment start". Based on this interpretation, that server will then trigger an annotation in the pupil IPC during a recording. I'm very new to this so I may be doing things horribly wrong/inefficiently, so someone please do correct me if I am
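A minimal sketch of that middleman server in Python - the port number, trigger codes, and labels below are hypothetical placeholders, and where this sketch prints a label, the real script would forward it to Pupil's IPC as an annotation:

```python
import socket

# Hypothetical mapping from single-byte trigger codes to annotation labels.
TRIGGER_LABELS = {1: "experiment start", 2: "trial start", 3: "trial end", 4: "experiment end"}

def decode_trigger(payload):
    """Translate a received datagram's first byte into a label (None if unknown/empty)."""
    if not payload:
        return None
    return TRIGGER_LABELS.get(payload[0])

def serve(port=5005):
    """Listen for trigger bytes over UDP. In a real setup, each decoded label
    would be sent to Pupil's IPC as an annotation instead of printed."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        label = decode_trigger(data)
        if label is not None:
            print("would annotate:", label)
```
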
but based on these principles you could do the same thing from matlab using a similar setup
as for reading data though, I am not sure at all. apologies for not being any help there
Sure. So you're essentially using a python server as a middle man between LUA and Pupil Capture Software?
yep
Ok that makes sense
this may be non-ideal depending on the latency and precision you desire
using a UDP socket I have anywhere from 0~30ms latency
you may wish to look into setting up a zeroMQ socket connection between MATLAB and pupil to cut out the middleman, but I have no idea about that
We actually do something similar with Eyetribe Eyetrackers in our lab
Right, that would make sense
So, there's actually a toolbox that I've been playing around with for grabbing the data from Pupil Capture into MATLAB via a python server: https://github.com/matiarj/pupil-helpers/tree/matlabAddV2/pupil_remote/Matlab_Python
The problem I'm having is that the data it grabs from the tracker are not meaningful
If I can crack this, then the problem should be solved. Grabbing the data directly from the eye-tracker into MATLAB is ideal because a) it allows all the data to be stored in one place and b) it means you can run gaze-contingent eye-tracking experiments.
That sounds very cool. I actually know very little matlab so I am no help there, hopefully the pupil guys here will help you resolve that. Sorry for butting in without much help there lol
Of course, I'm sure they will! Thank you for butting in, I need all the help I can get
Many vision scientists use MATLAB/psychtoolbox for stimulus control, so I think that resolving these issues and making Pupil as accessible as possible could benefit a lot of people in my field
@user-e7102b I certainly think MATLAB support is important. We will put this on our todo list (with the caveat that the majority of our team reads/writes Python, C++, C, JS, etc). @user-ed537d If you have time, would you be willing to update/work with others in the community to add more features to your example?
@wrp Thank you - that would be really great. I'll be more than happy to work with you guys on testing out solutions and making working code examples available in a public repository. I feel that @user-ed537d 's code isn't far off working for me...it's just tough to figure out what the problem is with a limited knowledge of python.
Hello, what should be used as an IR-only pass filter for the tracking cameras, 850nm or 940nm?
850 is closer to what we use.
@user-e7102b You should have a look at the Hololens Relay plugin. It defines a separate network interface that uses udp. You should be able to use it with Matlab. It is not as versatile as Pupil Remote. Let us know if you need further commands that are not implemented there.
The example by @user-ed537d is a python script that opens a UDP port and relays the data after subscribing to Pupil data.
closer? what is yours?
850nm is not visible to the eyes right?
correct, not within visible spectrum
@user-e7102b Looks like you could write a matlab wrapper that calls methods in a python module: https://www.mathworks.com/help/matlab/matlab_external/call-user-defined-custom-module.html?requestedDomain=true
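For the Python side of such a wrapper, a minimal module might look like the sketch below. The module name `pupil_bridge`, the function, and the dict keys are hypothetical/illustrative, not an official API:

```python
# pupil_bridge.py - hypothetical module that MATLAB could call
# via MathWorks' Python interface, e.g. py.pupil_bridge.format_annotation(...)

def format_annotation(label, timestamp):
    """Build a dict describing an annotation event (illustrative keys only).
    A fuller module would serialize this and send it to Pupil over the network."""
    return {"topic": "annotation", "label": str(label), "timestamp": float(timestamp)}
```

From MATLAB this could then be invoked as `py.pupil_bridge.format_annotation('trial start', 12.5)`.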
@papr The confidence threshold is used during calibration, correct?
correct
@papr The nature of my issue posting is this: the same threshold should not be used for calibration and for other settings.
Imagine I want to export my calibrated data, even data below the threshold.
I would have to calibrate with a high threshold, then lower the threshold prior to export (then raise it again, since every other plugin seems to rely on it?)
It just seems a bit messy.
The keyword here is consistency. There is no reason to export data that was not used in other calculations.
I don't know that i agree with that.
Although using a very high threshold for calibration might produce a great track with limited data.
...but, I would want a much lower threshold for export so that I can plot and make decisions related to the threshold post-hoc.
So you want to have multiple confidence threshold sliders?
I think just one for calibration, and one "other."
It's hard for me to say much about the "other" threshold, because when/where the general setting is used is not transparent.
The exact reason why introducing multiple thresholds would be messy
Heheh.
I'm going to stick to my guns here. I think adding a slider to the calibration pane/menu would be a good idea.
You would need such a slider for each calibration sequence as well... This menu is already so full of elements.
Let's put it this way: recently, I had a situation where I had to use a high threshold to get a good calibration. When I exported the data, though, it was full of holes.
It's not clear to me why the export should be thresholded at all, really. Other plugins, I understand, but withholding raw data seems antithetical to a research device.
I'll likely edit my export module myself. I realize it's a simple commenting out of one or two lines of code.
...at the very least, I would make it very clear to the user that not all the data is actually exported.
Did the pupil or the gaze positions have holes? Or both?
Only looked at gaze
Took me a while to think to run a min(gaze['confidence']) and see what was happening.
Well, I guess this is a documentation issue then.
@user-8779ef pupil data stream is never filtered.
the gaze mapper thresholds based on the set level. This means the pupil datastream is fully exported.
but the gaze datastream has these holes because of the sample selection from pupil data based on the threshold.
I getcha.
we export all data that is generated.
I don't like it.
what do you not like? You would like to have all pupil data mapped to gaze?
Yes. I think one should be able to calibrate using one threshold (because that makes sense - it improves the quality of the calibration)
and then export all data using no threshold, or a different (lower) threshold for post-hoc analysis.
You are effectively throwing out data.
...by not gaze mapping it.
Gaze map it now. Let me decide what I want to keep later.
@user-8779ef ok. I get this point. We could just map all pupil data to gaze but we will need to then filter for it in other plugins later.
That makes sense.
I guess the difference is that I consider pupil player a waystation to analysis in Python.
Whereas you consider it good practice to filter the data prior to analysis in Pupil.
I think flexibility here is key.
...or, at the least, transparency, because one would not expect "export raw data" to omit data.
(I realize the issue with semantics there - to you it's not REALLY omitting data)
@papr @mpk Thanks both for listening. How should I phrase the issue request?
Is this really an issue with documentation, as you would suggest?
sorry.
Should I request what I want - the two thresholds?
wrong link
ehr, two thresholds
Ok, that works. Thanks very much @mpk !
@user-8779ef Thanks for the suggestion re writing a matlab wrapper that calls methods in a python module. I'll see if I can get this working and report back. This seems like a better way to go than using the matlab "system.m" function.
My pleasure, and good luck!
@papr I've actually set up @user-ed537d 's python/matlab UDP relay and have managed to pull live gaze/confidence data from the eye-tracker into MATLAB. Unfortunately the data that I'm pulling in do not seem to update in response to my eye-movements/blinks etc. I just see an endless stream of the same numbers, e.g.:
Eye tracking data: [-48.26, -43.69], 0.80
Eye tracking data: [-48.26, -43.69], 0.80
Eye tracking data: [-48.26, -43.69], 0.80
Eye tracking data: [-48.26, -43.69], 0.80
Eye tracking data: [-48.26, -43.69], 0.80
Eye tracking data: [-48.26, -43.69], 0.80
I think it would be worth the effort to look into the hololens relay interface. It will be officially supported by us. You can probably adapt the matlab code part of Matiar's code example.
OK, sure, I'll take a look at this too.
@papr , can you direct me towards the hololens relay interface script(s) please?
@user-e7102b https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/hololens_relay.py
Thanks!
@papr So I changed the port in hololens_relay.py to match the pupil remote port (50020) and then ran hololens_relay.py in terminal using $ python3 hololens_relay.py. I don't see any error messages, but when I look in Activity Monitor there doesn't appear to be a python server running. I'm also unable to read any data into MATLAB. Am I missing something here?
By the way, I was able to tweak @user-ed537d 's python server code to successfully read the eye-tracker data into MATLAB. However, the rate at which samples are being read in is really very low and inconsistent, so I don't think this is going to work.
@user-e7102b this is a plugin that runs within capture. You will have to write a custom Matlab script that implements the relay protocol as it is described in the plugin doc string
You need to enable it in the plugin manager
Which option is it in the plugin manager?
It should be called Hololens Relay. Which version are you using?
version 1.1-2
I think it is a 1.2 feature. Please upgrade your software to use it.
Ok got it.
pupil capture v1.3 isn't loading on my machine (Macbook Pro, Sierra 10.12.6). Both pupil player and service seem to be working.
@user-e7102b did you try deleting pupil_capture_settings dir?
hello @papr , I am interested in Pupil Mobile and have just started going through some instructions about it. I am wondering: could anyone suggest which android device we should use to connect to Pupil Mobile? Does the Google Pixel 2 XL work with Pupil Mobile? Also, I want to know: is it difficult to do the calibration outdoors? Thanks!
@wrp I tried deleting the pupil_capture_settings directory but I'm still unable to get pupil capture to start.
I've downgraded to v1.2-7 - this version appears to be working OK
there was a bug in 1.3 @user-e7102b it is fixed now. Please re-download and re-install Pupil Capture and it should work.
@mpk Thanks - it's working now.
@papr So, with regards to the hololens_relay plugin, would I still need to set up a python server to interface between MATLAB and pupil capture, or is the idea that I can just tap directly into the UDP socket from MATLAB?
@user-e7102b No, the plugin is the server. No need for further man-in-the-middle relay scripts. You are able to talk to the UDP socket directly from MATLAB.
@papr I'm able to read a stream of 23 numbers into MATLAB using the following code, so it looks like I'm doing something right. Does the following pipeline look correct to you:
% construct UDP connection
hUDP = udp('127.0.0.1',50021);
fopen(hUDP);
% initialize the relay
fwrite(hUDP, '0I')
% start gaze broadcast
fwrite(hUDP, '0S')
% read from the socket
fread(hUDP);
@papr Hi! where can I find pupil apps 1.3 for windows?
@user-826625 We will publish the windows bundles early next week.
@user-e7102b Codes starting with 0 are response codes. These are the codes that you should be receiving from the Hololens Relay. The protocol implements mostly a request-reply pattern. This means that you should call fread after fwrite to see if your request was successful.
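The same fwrite-then-fread (request-reply) pattern can be sketched in Python. To keep the example self-contained it talks to a fake local relay; the response format (echoing a '0'-prefixed ack) is purely illustrative and not the actual Hololens Relay protocol:

```python
import socket
import threading

def start_fake_relay():
    """Stand-in for the relay: answers each request with b'0' + last request byte.
    Returns the port it is listening on."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", 0))  # let the OS pick a free port
    port = sock.getsockname()[1]

    def loop():
        for _ in range(2):  # answer two requests, then stop
            data, addr = sock.recvfrom(1024)
            sock.sendto(b"0" + data[-1:], addr)
        sock.close()

    threading.Thread(target=loop, daemon=True).start()
    return port

def request(sock, addr, payload):
    """fwrite-then-fread: send a request, then wait for its reply."""
    sock.sendto(payload, addr)
    reply, _ = sock.recvfrom(1024)
    return reply

relay_port = start_fake_relay()
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(2.0)
addr = ("127.0.0.1", relay_port)
init_reply = request(client, addr, b"I")   # "initialize" request
start_reply = request(client, addr, b"S")  # "start broadcast" request
```
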
@papr OK that makes a lot more sense. I'm able to construct the UDP, initialize the relay and start gaze broadcast. One thing I'm not sure of with this approach is whether I'll be able to record data in MATLAB at the true sample rate (e.g. 120 Hz). For example, if I run the code below, the samples do not appear to be updating in real time.
% gaze capture loop
for i = 1:1000
    fread(hUDP);
end
@user-e7102b I would always recommend recording in Pupil Capture. The interface is only for live interaction and does not guarantee delivery (due to the socket being UDP).
@papr Sure. My approach with our other eye-trackers in the lab is to stream into MATLAB for live interaction and also record to tracker's data log, in case we miss samples in the live stream
If I could send commands direct from MATLAB to start/stop recording AND event triggers via the hololens_relay plugin then I could also use this approach with Pupil. However, I don't see these options in the hololens_relay.py documentation?
You are correct. These options are missing. I think both are reasonable functions to implement. Please create a Github issue for this.
OK, I've submitted an issue. Thank you for your help with this.
Anyone here experienced with pupil on the rpi3?
@papr before starting to abuse the hololens relay and building a udp-based (read: unreliable) matlab interface - can we at least discuss writing a matlab demo that uses zmq and msgpack, reading and writing a strongly typed subset of the IPC?
hey all, I have just pulled the latest pupil version and all dependencies and everything runs fine, except the screen marker calibration does not result in a calibrated gaze point. During the process there are multiple "to far." messages in the console. The screen markers respond normally (turn green when looked at), but after the calibration completes the gaze is all over the place, no correlation with the real gaze point whatsoever. The console output:
Starting Calibration
world - [INFO] calibration_routines.screen_marker_calibration: Starting Calibration
Stopping Calibration
world - [INFO] calibration_routines.screen_marker_calibration: Stopping Calibration
to far.
... 90 more "to far." lines ...
world - [INFO] calibration_routines.finish_calibration: Collected 77 monocular calibration data.
world - [INFO] calibration_routines.finish_calibration: Collected 12 binocular calibration data.
Reflection detected
Reflection detected
Ceres Solver Report: Iterations: 89, Initial cost: 2.729794e+01, Final cost: 7.783502e-03, Termination: CONVERGENCE
@user-8cf4ca what is the confidence of Pupil detection?
mostly 1.0. I have just noticed that the reported fps is very low though (2 fps in world and 8 fps in eye windows), although it looks like usual 120fps in the eye windows
@user-8cf4ca On which platform do you run?
And am I correct that you run from source?
linux, yes, running from sources, just pulled 30 minutes ago, tried, got this, then switched to the tag v1.3 - same results
@user-8cf4ca this is a very low frame rate. Could you let us know the specs of the machine you are using?
@user-8cf4ca Do I understand you correctly that the eye video is smooth, as if it were running at 120 Hz, but the reported fps is around 8?
it worked wonderfully a month or two ago on the same machine. Intel(R) Core(TM) i7-3820QM CPU @ 2.70GHz, 24gb RAM
@papr, exactly. Same with the world too.
that's a powerful machine - should be more than enough to run Pupil
yeah, it worked flawlessly before. I might try to just restart and reinstall the dependencies again
ok, everything solved by redoing (sudo make install) in turbojpeg. Apparently it wasn't linked properly or something.
@user-8cf4ca thanks for the update - pleased that this was resolved. Makes sense that you would be seeing super low frame rates without turbojpeg
Hey, working with the newest build of pupil lab (200 fps eye cam) and I can't seem to find any way to adjust the focus of the eye cam. Does anyone know if this just wasn't built in?
@user-02ae76 You are correct, the focus of the new cameras is not adjustable. You should get good pupil detection anyway, even if the image does not look as sharp as with the old cameras.
@user-02ae76 I'd also like to add on to @papr 's comment. You can also try using the eye camera arm extender - https://docs.pupil-labs.com/master/#additional-parts - if you are not able to get a good view of your eye region with the standard eye camera arm.
Dear Pupils Team, Do you have a video showing the performance of the DIY eye-tracker? I'm a college freshman and I'm going to buy the DIY eye-tracking kit but I am wondering how accurate it is. (I want to try to use it and see what kind of projects I might come up with.)
@user-b458c2 There are some videos on youtube.
Also, there is a sample dataset to play around with pupil player. I am not sure if it is up to date for offline detections though. https://pupil-labs.com/pupil
Hi, I'm running into a problem when I attempt to view the raw pupil data. When I load a recording into pupil player and hit "export", the raw data does not appear in the "exports folder" (I just get an empty "annotations.csv" file and a "world_viz.mp4" file). Pupil player also crashes a few seconds after the export. I have the "Raw Data Exporter" plugin activated before I hit the export button.
I'm running pupil player v1.3-9 on Macbook Pro (Sierra)
I'm also unable to open the pupil_data file that is saved in the recording file. It doesn't appear to have a file extension. In the documentation it states that pupil_data can be read and inspected with a couple lines of python code...what are those lines of code? I've tried reading it in as a pickle file but no luck.
@user-e7102b please see https://docs.pupil-labs.com/master/#raw-data-with-python re viewing pupil_data
@user-e7102b please could you share the player.log file so that we can gain insight into the crash? It is located in pupil_player_settings/player.log
@user-e7102b @wrp A user notified me that this crash is due to a mistake in the pupil helpers script that sends the annotations: it does not set the source key. The Player annotation plugin requires this key for legacy reasons, IIRC. We should remove this requirement. The short-term solution is to fix the helpers script.
@papr I added the source field:
@papr I also made an issue: https://github.com/pupil-labs/pupil/issues/1058
@papr I have a question regarding distance calculation between user and a point on a surface. I posted it in github with details: https://github.com/pupil-labs/pupil/issues/1059 Could you take a look at it?
for everyone who is installing the pupil source on macos high sierra: the developer docs should be updated - opencv does not install ffmpeg automatically... you have to run brew install ffmpeg ...
@user-29e10a noted, we can update the docs today
@user-29e10a I just checked brew opencv3 and it appears that ffmpeg, numpy, and other dependencies are required by opencv3 and therefore should be installed when you do brew install opencv3
the output from brew info opencv3 shows:
Required: eigen ✔, ffmpeg ✔, jpeg ✔, libpng ✔, libtiff ✔, openexr ✔, python ✔, python3 ✔, numpy ✔, tbb ✔
Hi! How can I export csv raw data with Pupil Player? Thanks!
I was using the Pupil Mobile app as well. Can we calibrate with the mobile app? Moreover, when I play the files captured by Pupil Mobile, the gaze circle does not show up as it does for files recorded with Pupil Capture. Is this due to the app itself, or am I not using it correctly?
Hi @user-826625
- Exporting csv files with Pupil Player - Load the Raw Data Exporter plugin in Pupil Player. Press e or the export button to export data.
- Pupil Mobile - You should start recording and then show the calibration marker to the participant. You will not be calibrating in Pupil Mobile, only recording video data. You can calibrate post-hoc in Pupil Player as long as you have recorded the calibration session in Pupil Mobile with offline pupil detector and offline gaze mapper. (The reason you didn't see a gaze position/gaze circle in Pupil Player is due to the fact that you did not calibrate and therefore did not have gaze data)
Thanks @wrp! I was trying to follow this video, https://pupil-labs.com/blog/2017-08/pupil-capture-player-and-service-release-v0-9-13/, but I had difficulty finding the "open plugin", "visualizer", "Analyzer", and "data source" options. I have activated all plugins from the plugin manager
Hello! I was wondering whether I could use Pupil with an Arrington Research EyeTracker? and if so, where/how should I start? Thanks!
@papr @wrp Thanks for editing the plugin - now the raw data export is working just great.
@user-826625 Please note that with v1.0 we changed the entire GUI. Therefore Pupil software no longer looks like the linked blog post. If you check the docs in this section: https://docs.pupil-labs.com/#analysis-plugins you can see how to launch plugins with screenshots of v1.0 versions of Pupil software.
@user-48d784 Pupil is designed to use UVC compatible cameras and is designed for wearable/head-mounted eye tracking (opposed to remote eye tracking). I do not know much about AR's systems (It seems that many are remote - e.g. not head mounted - unless you are referring to a VR/AR integration).
Hey guys,
I have a DIY set with the full hd world cam and I need to make it work with pupil-mobile. Since I have one regular USB cable for each camera, I am inclined to think a USB-to-USB-C hub would do the trick. Do any of you have experience with that? I am concerned about bandwidth problems, but I imagine that the commercial version of the headset - which has two cameras and a single USB-C output - must deal with the same issues. So, to sum it up: do you know if there is anything inherently different about the way the commercial headset communicates with the mobile device, compared to plugging two cameras into a hub?
@user-516564 please note that Pupil Mobile is designed and implemented to work with Pupil Labs hardware. Pupil DIY hardware could work in theory, but it has not been tested and we can not provide support for DIY.
@anlutfi#9800 there are a lot of small details that make the 'normal' Pupil headset work on a single bus and also with Pupil Mobile. We don't have this kind of control over the DIY hardware. Thus the DIY use case is only supported on desktop, with two usb cables.
hey pupil labs, for the realsense version of the eyetracker, is there any way to get the depth video recording without compression? Or even better, with lossless compression? I tried setting the bit_rate parameter in the code to high numbers, the resulting depth video is still overcompressed, even though the reported bitrate gets higher
@user-c77dda I would recommend taking the depth stream and saving it uncompressed as frame-wise numpy arrays. You will need an obscene amount of disk space.
ok, that's what I was afraid of. If only there were some video codec supporting 16-bit grayscale
you could also try saving them lossless using .npz files.
but I think that will be a factor of 2, not 10 or so.
still a lot of data.
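The .npz suggestion above could look like the following sketch (the per-frame key naming scheme is an arbitrary choice; assumes numpy is installed and that depth frames are 16-bit):

```python
import numpy as np

def save_depth_frames(path, frames):
    """Losslessly store a sequence of 16-bit depth frames in one compressed .npz.
    Keys frame_000, frame_001, ... preserve the frame order."""
    arrays = {f"frame_{i:03d}": np.asarray(f, dtype=np.uint16)
              for i, f in enumerate(frames)}
    np.savez_compressed(path, **arrays)

def load_depth_frames(path):
    """Recover the frames, in order, exactly as saved."""
    with np.load(path) as data:
        return [data[key] for key in sorted(data.files)]
```
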
@wrp and @mpk, thanks for the reply. I understand you can't provide support, but buying a hub is a much cheaper bet for me. So, is there a possibility in theory, or is there anything that makes you think beforehand that it will definitely not work?
@user-516564 I can not think of anything that would outright prevent it from working. But I would give it a <50% chance of working out of the box. Also, import in Pupil Player will need modifications to the source code.
@mpk thanks, worst case I'll have an extra hub. What sort of modifications do you mean? The output files would be different if I used the diy headset?
camera names are different.
this needs to be adjusted.
Ok, thanks a lot!
Hi pupil. I'm trying to automate marker detection and 'gaze on surface' exporting for an experiment where I have one recording per file (so loading them all into pupil player will be very time consuming). My code seems to locate the markers fine when I visualise the marker detection, but the ids are way off the normal range (I'm getting ids of 2048 and 4608, for example), and the confidences are way off (very low). This means that no gaze positions are being detected on the surface. I'm using an anaconda distribution on windows on a machine without admin rights, so I haven't set it up as a developer PC. The same code works fine on a linux laptop set up to run pupil-labs from source. Is this problem something you have come across before?
I was able to see the cameras under "Cameras" in the device manager (windows 10) before, but I cannot find them in the device manager anymore since yesterday. The pupil software still recognises them though. Any idea what could be wrong?
Hi - quick question - can you foresee any issues with replacing the standard USB A to C cable that is used to connect the pupil headset to the USB port with a longer cable (e.g. 15 ft)? Thanks
Hi everyone, can someone please explain, or provide any resource on, what exactly the difference is between 3D mode and 2D mode in the eye tracker? I have been trying to figure it out but haven't found any concrete answers
@user-e7102b Which headset do you use?
If anything, I would use an active USB extender cable, one with active signal reinforcement
@papr We're using the mobile headset. From what I've read, passive cables should be ok up to around 16ft, so I was thinking something like this: https://www.amazon.com/CableCreation-Braided-OnePlus-Macbook-Resistance/dp/B01D3095RW/ref=sr_1_1?s=electronics&ie=UTF8&qid=1518112814&sr=1-1&keywords=usb+a+to+c+cable+10+ft
If that doesn't work, we could always go down the active cable route.
@wrp thanks! You're right, my Arrington Research camera is remote, not head mounted. Sorry I had not understood that! Cheers
@papr Hey, I'm sorry if this seems like a question that has already been answered, but I'm looking for a brief description of these two parameters in Pupil Player: model sensitivity and confidence threshold. How do I get an intuition for which parameter to adjust for a better track?
@user-0d187e cameras should be listed in the libusbK category within the Device Manager under Windows 10. If drivers are not correctly installed, or if drivers are overwritten/removed by a Windows update, Pupil cameras will be listed in the Imaging Devices or, in some cases, Cameras categories.
@user-c09b2c for an overview on 3d vs 2d see: https://pupil-labs.com/blog/2016-03/pupil-v0-7-release-notes/ - This is a blog post from almost 2 years ago, so there have been lots of improvements and changes since, but this will at least provide an intro
Hello - I downloaded the new Pupil apps for mac, and when I open old recordings in Player, the calibration, fixation detection, etc. fails because there is "no gaze data available to find fixations". Is there a fix for this?
Nevermind - I deleted the old settings, and I think it's working now!
Hello - We are purchasing the hardware add-on for the htc vive. I thought it would be best if we could do the development on the windows machine, since that's where the vive is attached. I am having numerous issues trying to follow the windows dependencies section. Everything went as planned, but when I try to run main.py, pyav seems to be missing a dll. Does anyone have any comments on how to fix that? I also cannot compile pyav myself. The other alternative is trying the vive with linux, but that seems like a whole other can of worms.
btw this is the error in the traceback:
File "C:\Extras\Python\Python36\lib\site-packages\av\__init__.py", line 9, in <module>
    from av._core import time_base, pyav_version as version
ImportError: DLL load failed: The specified module could not be found.
Hi @user-d40c36 do you need to modify Pupil source code?
I ask because, you can also run the Pupil bundle for Windows and add plugins at runtime (perhaps you already know this).
@wrp We do intend to modify code, but mostly the C++ pupil code, to see if we can improve upon it - this is mainly just research. The big issue is that we can definitely do it on Linux, but I'm not sure how well supported the HTC Vive is on Linux. That is the main problem we are facing. I'm not sure the instructions for Windows are correct, or maybe something has changed in the source code that is affecting the whl packages, etc.
Hi - One of my experimental setups uses a Linux machine (Ubuntu 14.04) with a dual monitor setup that uses x-screens, so I'm unable to display the calibration to the participant using the built-in "screen markers" calibration option. To get around this, I wrote a script in Psychtoolbox that displays the manual markers at various locations on the screen. I'm displaying markers at 9 different locations, as suggested in the user guide. However, I've noticed that the gaze data are not being sampled for some of the markers (i.e. blue semicircle does not appear). This seems to happen randomly (i.e. no particular locations or orders). I've played around with different marker sizes, eccentricities, world camera resolutions etc. but haven't managed to stop this happening. Do you have any suggestions how to resolve this? I've attached a video here so that you can see what I'm dealing with: https://www.dropbox.com/s/ifcmdojeo3c04d0/IMG_4031.MOV?dl=0 ..... Thanks!
btw no participant is being tracked in this example
hi everyone, well... I just got Pupil Mobile, and I have downloaded Pupil Capture, but it shows "not supported"? May I know what I should do?
@user-d40c36 I would recommend a two computer setup in your case. One Linux machine to which the add on is connected via USB and that runs your modified version of Pupil Capture. The second machine runs windows and your vive application. The hmd eyes integration uses a zmq network connection. It does not matter much if the Pupil Capture runs on the same computer or a computer in the local network.
@user-d40c36 - as @papr notes, Linux is certainly a more stable (easier env to set up dependencies) for dev than Windows
@user-d40c36 we do also need to update wheels for windows
@user-7bc627 The warning just says that Disabling idle sleep is not supported on Windows. This is expected behavior since it is actually only implemented on macos. But this feature is not required for the application to run. Does the application start and show up?
@user-e7102b I would recommend drawing the markers as similarly as possible to the manual markers in the docs. The outer black ring looks a bit too thin to me. Usually one would subscribe to Pupil Capture and wait until the manual marker calibration broadcasts that it has detected enough samples of a marker before showing the next marker. You seem to advance at a fixed time interval.
(link to calibration marker pdf - https://docs.pupil-labs.com/images/pupil-capture/calibration-markers/v0.4_markers/v0.4_marker.v12.master.pdf - you can also use the jpg that is on the docs page - note: use the calibration marker and not the stop marker for calibration)
@user-e7102b The black and white ratios are important when drawing the markers, not so much their size. I think they could be slightly smaller in your case though.
@papr now it looks like this... what should I do?
Looks like your camera is either not connected or the drivers have not been installed. Did you run Pupil Capture with administrator rights? @user-7bc627
hmm I think so ? I just downloaded the one for Windows from here https://github.com/pupil-labs/pupil/releases/tag/v1.3
Don't worry about the zero division error. You should be able to open the eye windows. I will add a fix to the next release that catches this exception
@user-7bc627 please right click on pupil_capture.exe and select run as administrator
ok I just did that. but it seems the same? @wrp
To debug driver installation could you please do the following:
1. Unplug the Pupil headset from your computer and keep it unplugged until the last step
2. Open Device Manager
3. Click View > Show Hidden Devices
4. Expand the libUSBK devices category and the Imaging Devices category within Device Manager
5. Uninstall/delete drivers for all Pupil Cam 1 ID0, Pupil Cam 1 ID1, and Pupil Cam 1 ID2 devices within both the libUSBK and Imaging Devices categories
6. Restart the computer
7. Start Pupil Capture
8. General Menu > Restart with default settings
9. Plug in the Pupil headset after Pupil Capture relaunches - please wait, drivers should install automatically
@user-7bc627 you mentioned that you are using Pupil Mobile. Did you connect your Pupil Headset to your Phone or to the computer running Pupil Capture?
(good question @papr )
it's connected to the laptop now
Then please restart Pupil Capture with administrator rights again
@wrp I checked what you said, but I don't have libUSBK (I have clicked Show Hidden Devices)
@user-7bc627 do you see drivers in other categories as noted above?
Look in imaging devices
it's "integrated webcam"
And the pupil headset is connected? You should be seeing the devices in the device manager. What Pupil headset configuration are you using?
You can also DM me with a order id if you have it
For those reading the above notes with @user-7bc627: I wanted to note that this behavior may actually be due to the USB-C cable either not being fully connected or a defective USB-C-to-USB-A cable. I will update with concrete information after further debugging.
Issue resolved. The USB-C cable was not fully connected to the Pupil headset. Please note that the connector needs to be pushed in fully and requires a bit of firmness.
@papr @wrp Re the calibration markers, in the example video I'm actually directly presenting the v0.4 Marker that I downloaded from pupil-docs. It's unclear why there would be sporadic failures.
I understand @papr 's suggestion to subscribe to pupil_capture and wait for the confirmation that the required amount of samples have been broadcast, but during testing I did try presenting the markers for a really extended duration (e.g. 10 seconds each) and still the gaze sampling won't work for some locations, so I'm not sure if this will resolve the issue.
@user-e7102b Try to move the headset slightly during the calibration procedure. Especially when the markers are not recognized. Please let us know, if you find any consistency/regularities in the marker positions that are not recognized.
@papr Good point - due to the nature of the dual screen setup I can't wear the headset while I'm viewing the calibration procedure, so it's been resting stationary on a stand the whole time. I'll give this a shot. Thanks
Hey guys... What kind of encoding does pupil use? I subscribed using nzmqt, but apart from few tags like gaze, etc I get just rubbish
@user-e938ee please see https://docs.pupil-labs.com/#message-format
@user-e938ee we use msgpack
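For anyone else hitting this: each message is a multipart zmq frame - a plain-text topic followed by a msgpack-encoded dict - which is why the raw bytes look like rubbish until you unpack them. A quick local round trip (the datum fields here are illustrative examples, not a spec; requires the msgpack Python package):

```python
import msgpack

# An illustrative gaze-like datum (field names are examples only):
datum = {"topic": "gaze", "norm_pos": [0.48, 0.52], "confidence": 0.97}

# This is what travels over the wire after the topic frame:
payload = msgpack.packb(datum, use_bin_type=True)
# `payload` is opaque bytes - the "rubbish" you see without decoding.

# Unpacking restores the original dict:
decoded = msgpack.unpackb(payload, raw=False)
```

So after `recv_multipart()`, pass the second frame through `msgpack.unpackb` and you get a normal Python dict.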
Thanks
@papr I am just wondering why the image from eye camera 0 is reversed? thanks!
@user-7bc627 This is due to the camera being physically rotated on the headset. But do not worry, this is no issue for the pupil detection algorithm. You can flip the image in the general settings (this only flips the visualization, though).
oh I got it, thanks! @papr Meanwhile, Pupil Player seems to have crashed... it shows something like "GLFW window failed to create." May I know what I should do?
Yes, this is a known issue. It is most likely due to an invalid window size value stored in the user session settings. Please delete the pupil_player_settings folder and start Pupil Player.
thanks!
Anyone else notice the latest bundle runs VERY slow?
@user-8779ef what aspect runs slow?
player
v1.3-9 ?
Running it on a brand new machine with tons of horsepower, and it's not nearly as responsive as it should be
version 1.2 was faster?
Uh... the latest, as of a few days ago (sorry, not on campus, where the machine is)
Most def. Very noticeable.
Now, when running from source on my machine, I didn't seem to have the same unresponsiveness.
@user-8779ef ok. We will have a look into this on monday.
Ok, thanks. Let me know if I can help.
FYI, running on a mac in both cases.
If you can let us know the exact version of the bundle we can pinpoint the issue.
Ok, I'll do that on monday.
Also, I had a crash related to the new offline calibration range.
that's from master, right?
I'll replicate so I can post the issue
great!
Yeah... the range was 1 frame greater than the actual frame count. I had to add a "-1" somewhere. I know, I know, very helpful \s
in gaze mappers. I'll get the details soon.
That's papr's domain, right?
great! And thank you for keeping the feedback coming!
My pleasure.
@papr wrote this so he is the best contact.
Ok. I'll send him a message.
hey, @mpk , another question for you - I was recently trying to install homebrew dependencies on a mac, and got a new message about deprecated formulas. Is this something new?
Ring any bells? Google is, surprisingly, not very helpful.
Seems related to this: https://github.com/Homebrew/homebrew-science/issues/6365
Something I thought I'd drop by and ask - has anyone ever had problems with pupil detection in participants using heavy makeup? lol
I noticed on those wearing thick layers of eye-liner that the system would be confused very easily
I'm sure this is common but thought it was interesting enough to mention
@user-dfeeb9 I think this is a common problem with eye trackers in general (I've certainly experienced this with other trackers that we use in our lab, such as Eyelink and Eyetribe). We explicitly instruct our participants to come in without any eye makeup, and if they do come in with makeup we ask them to remove it.
I have decided to do the same since realising that was the issue, very interesting though. Sorry for stating the obvious, I'm very new to eye-tracking. Thanks @user-e7102b
Good idea! You might also encounter problems with people with thick/heavy eyelashes.
@user-dfeeb9 @user-e7102b This issue can be partially worked around by setting the eye ROI to exclude the eyelashes. It does not always work very well, but it might improve the situation a bit.
@user-8779ef @mpk We noticed a long-standing performance issue during paused playback in Player last week. I fixed it in https://github.com/pupil-labs/pupil/commit/2df34d28f06a45902aba05f56c472bea655edc75 This should have made it into the 1.3-9 bundle -- not 100% sure though.
I had the trim mark index issue as well for offline data that was generated by a previous version and was not able to reproduce the issue after deleting the cached data. Maybe we should increase the offline data version format to force the system to delete the invalid data...
Hi all, I'm finding a strange error when attempting to load recordings in Pupil Player: it reads "updating recording format" - "this may take a while", and the Player app closes shortly after.
I'm using a 2015 Mac with macOS Sierra 10.12.16,
pupil bundle version 1.2-7
any tips would be greatly appreciated,
Thanks
Hello
I'm Yoonmin Kim in Korea. I would like to order a Pupil Labs eye tracker.
Is the Pupil Labs analysis software free?
Hi @user-6e5aaf yes, software is free and open source. See https://github.com/pupil-labs/pupil/releases/latest
You may also want to check out the docs - https://docs.pupil-labs.com
@wrp @papr thanks again for all your explanations and help! Now I finally understand what the "markers" mean... so another question: if the markers need to be presented physically, does that mean we cannot use Pupil Mobile to do eye tracking for an outdoor advertising board, or anything else that is outdoors (e.g. how people view a building / architecture, etc.)?
@user-7bc627 You can use Pupil Mobile and use the offline surface detector to detect the markers in your recording. You will have to add the surface markers to your AOI though. This means that it will not work if your advertising board does not have the markers.
Make sure to monitor your eye cameras for overexposure during your outdoor recordings. This might happen if it is sunny.
hmm ok, I didn't really get the idea... is there any demo / tutorial for doing offline surface marker detection? If there is no marker in my recordings, how can it be detected offline?
Sorry if I did not express myself clearly. You will need surface markers in the world video. But you can use the offline surface tracker to define surfaces after the recording based on these markers, similar to the example dataset from our website.
What I was asking is this... imagine I am collecting data outside and I want to see how people process the beverages in a convenience store (e.g. which beverage attracts customers' attention the most). In this case, obviously I cannot stick markers on all those beverages... what should we do then? Thanks!
Pupil does not provide any object-detection based surfaces. You will need to either adapt your own object-recognition algorithm and define surfaces based on that, or manually annotate the video frame by frame. There are services that provide automated solutions, e.g https://imotions.com/
ok great! Then I think my doubts are all solved :) Last question... why does it happen so often that eye 0 doesn't work? Eye 1 is ok, but sometimes if I adjust the eye camera a bit, then eye 0 will show only a grey background (and even crash)
Do I understand you correctly that you start eye0, it works, you adjust the eye0 (right eye) camera, and the window crashes?
Yes!
This is with the headset connected to the computer running Pupil Capture, and not running Pupil Mobile and copying the recording to Player, correct?
yes, correct. the headset connected to the computer running Pupil Capture
(it was ok when I first turned on Pupil Capture... and after I adjusted both eye cameras, the left was still fine, but the right camera just didn't work)
This might be a loose connection to the eye0 camera. I will continue writing you in a personal message, since this issue is specific to your setup.
ok sure:)
@papr @wrp thanks for the info. I'll proceed with that route. Just curious, will there be any attempt to update the wheels and get the development environment working with the current GitHub source? The website lists steps to set up Windows, but those do seem to be incorrect or out of date.
@papr Performance issues go away when I hide the timelines by click-dragging to lower the height of the bottom workspace
@papr So, it seems like all those timelines are stressing out the system.
20 minute video
@papr With reference to the manual calibration issues I highlighted on Friday, you suggested that I subscribe to Pupil Capture and wait until the manual marker calibration broadcasts that it has detected enough samples of a marker before showing the next marker, rather than advancing at a fixed time interval. Can you provide or point me towards an example of the Python code I would need to do this? You mentioned in an earlier comment that ZMQ relies on send/receive command pairs, but I'm not sure what I would need to send here in order to receive the broadcast signals from the calibration routine. Thanks
@user-e7102b Have you looked here? https://docs.pupil-labs.com/#networking
Yeah I've looked through this before and I'm still not sure what I'm supposed to be doing. Is the "Reading from the backbone" example under "PUPIL REMOTE" the example that is most relevant to what I'm trying to do ?
@user-8779ef please test the current master for timeline performance. It should be much smoother than the last release. Else we will have to take further action to improve their performance.
@tombullock these are the notifications you need to subscribe to: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/calibration_routines/manual_marker_calibration.py#L85
See this script on how to subscribe to notifications: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py
Be aware that subscribing to notifications requires you to prepend notify. to the notifications' subjects to correctly subscribe to them.
Best you subscribe to notify.calibration
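In case a sketch helps, this is roughly how the subscription side could look in Python, following the pattern from pupil-helpers' filter_messages.py. The generator at the top is a hypothetical helper and is not called here (it would need a running Pupil Capture plus pyzmq/msgpack); the topic names are illustrative:

```python
# Rough sketch of waiting for manual-marker-calibration notifications.
# The zmq/msgpack wiring follows the pattern in pupil-helpers'
# filter_messages.py; the notification topics below are illustrative,
# not verified against the Pupil source.

def wait_for_calibration_notifications(host="127.0.0.1", port=50020):
    """Connect to Pupil Remote, look up the SUB port, then yield
    calibration notifications. Requires pyzmq and msgpack; defined
    but not called here so the sketch stays importable without them."""
    import zmq
    import msgpack
    ctx = zmq.Context()
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://{}:{}".format(host, port))
    req.send_string("SUB_PORT")          # ask Pupil Remote for its SUB port
    sub_port = req.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.connect("tcp://{}:{}".format(host, sub_port))
    sub.setsockopt_string(zmq.SUBSCRIBE, "notify.calibration")
    while True:
        topic, payload = sub.recv_multipart()
        yield topic.decode(), msgpack.unpackb(payload, raw=False)

# zmq SUB filtering is a plain string-prefix match, so the filter can
# be mirrored with a pure function (this part runs anywhere):
def matches(topic, prefix="notify.calibration"):
    return topic.startswith(prefix)

received = ["notify.calibration.marker_found",  # illustrative topics
            "notify.calibration.stopped",
            "gaze",
            "pupil.0"]
calibration_only = [t for t in received if matches(t)]
```

The idea is then to show the next marker only after a "sample completed"-style notification arrives, instead of advancing on a timer.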
@papr Thanks, this makes sense.
As an aside, I noticed that pupil diameter does not appear to be displaying in Pupil Capture. I've switched on "Display pupil diameter for eye" in the "system graphs" tab, but neither graph is displaying any output.
Pupil diameter is being logged in the recording files, so perhaps this is just a visualization issue?
@user-d40c36 I can update wheels today
@papr you mentioned that another alternative we could use is to manually annotate the video frame by frame. Just to confirm, can we do this in Pupil Player?
@user-7bc627 you cannot do that in Player. You will have to do this somehow externally.
@wrp How are Pupil-Lab's products different from other Eye-trackers, such as Tobii and SMI?
Hi, I am using version 1.3.9. It worked great for a couple of days but suddenly eye camera 1 does not work. I know that this could be an issue with the drivers but it worked perfectly 1 day ago so I am not sure what to do. I uninstalled the camera drivers (both visible and hidden) and re-installed but nothing. Here is the message I see. Any ideas?
@user-b23813 The log does not indicate any failure. What exactly happens? Does the eye window open? Does it show a gray screen? Is the ID1 camera listed in the eye's UVC manager plugin?
@papr Hi, yes it shows a gray screen, whereas ID0 works fine. Not sure I know how to check if the ID1 camera is listed in the eye's uvc manager plugin.
@user-b23813 Click on the Activate source selector in the eye window. Is the ID1 camera listed?
@papr No, it's not. I only see ID0 and ID2.
@user-b23813 Please make sure that the camera connector is connected properly https://cdn.discordapp.com/attachments/412224263140016128/412598203725381632/JPEG_20180212_141751.jpg
@papr it looks fine
@user-b23813 This seems to be a hardware issue then. Please write an email to info@pupil-labs.com containing your order ID and a short description of the problem.
@papr Will do, thank you very much
@user-6e5aaf Perhaps this is a question better answered by other researchers in the community who have experience with these other products to offer comparison.
Hi, so I've been trying to study and understand the pupil-helpers scripts for the time sync system, but I'm not sure I really get how it works - in terms of what is actually going on with the jitter compensation, and what the method is to connect a script to time sync.
@user-dfeeb9 Time sync is based on this idea: https://en.wikipedia.org/wiki/Network_Time_Protocol#Clock_synchronization_algorithm
@user-dfeeb9 You will need to include network_time_sync.py in your application. And then you probably need to implement the time sync master based on the example: pupil_time_sync_master.py.
I see
Also, in case you did not see it yet: The exact time sync protocol spec that we use: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/time_sync_spec.md
ah I've read this
my main confusion was exactly how to set it up to synchronize directly with the Pupil recorder, but I think this makes sense
The implementation of the clock follower is a bit more difficult than the master. Pupil Capture implements both. Therefore I recommend implementing the master in your application and letting Capture do the time following.
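For intuition, the clock-offset estimate at the heart of that NTP-style algorithm is simple arithmetic. This is just a sketch of the idea, not the actual network_time_sync.py implementation:

```python
def clock_offset(t0, t1, t2, t3):
    """NTP-style offset estimate between two clocks.
    t0: client send time, t1: server receive time,
    t2: server send time, t3: client receive time.
    Assumes the network delay is roughly symmetric."""
    return ((t1 - t0) + (t2 - t3)) / 2.0

def round_trip_delay(t0, t1, t2, t3):
    """Total time on the wire, excluding server processing time."""
    return (t3 - t0) - (t2 - t1)

# Example: client clock runs 5 s behind the server, 0.1 s delay each way.
t0 = 100.0   # client sends request
t1 = 105.1   # server receives (server clock = client clock + 5)
t2 = 105.2   # server sends reply
t3 = 100.3   # client receives reply
offset = clock_offset(t0, t1, t2, t3)        # ~5.0 s
delay = round_trip_delay(t0, t1, t2, t3)     # ~0.2 s
```

Jitter is usually handled by repeating this exchange several times and keeping the sample with the lowest round-trip delay.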
i see
this is very helpful, thank you
I will return if I struggle further, but thanks kindly
@wrp Thanks... I'll check out the updated wheels to see if that solves the windows issue. In the meanwhile I'll continue to go forward with the linux dev/windows setup
Hello! I am trying to build a research experiment that utilizes the Pupil glasses as an eye tracker. I want to record gaze fixation location relative to a computer screen center, not just relative to the World Camera. Is there a simple method for this?
@user-6952ca Yes, have a look at the surface tracker plugin
@user-6e5aaf In response to your question about how Pupil compares with other types of tracker (e.g. Tobii, SMI)... I was looking to purchase a head-mounted eye tracker last year and decided to go with Pupil over Tobii after meeting both companies and trying out their trackers at ECVP in Berlin. I can tell you why I went for Pupil over Tobii. The headset is lighter, it doesn't have a lens, and the modular nature really appeals to me (you can swap out cameras, upgrade, 3D print custom frames, etc.). The software is all open source and Python based, so you have a lot of flexibility there. I haven't tested Pupil extensively yet, but the accuracy, features, and sampling rate are all more than sufficient for our purposes. The only issue I've experienced is getting their capture software to integrate with MATLAB/Psychtoolbox, but I've just about got this figured out now with the help of the support team (thanks guys!), and they have also acknowledged that adding more MATLAB/PTB support will be beneficial, so I can see that improving.
Also, I don't know how easy it is to get hold of SMI eye-trackers anymore, since they got acquired by Apple...
And of course, one huge factor in my decision was the fact that the Pupil head-mounted system can be had for <$2k, whereas the Tobii is ~$20k
I also have a bunch of experience with Eyelink, but their systems are not head mounted so I don't know if that's relevant. But feel free to contact me if you have any questions
Hi, I see this error message when I open the player. However, the player works. It seems like a directory issue. Do you think that leaving it as it is would be fine or could it somehow affect my analysis later?
@user-e7102b Thank you for your answer.
@user-6e5aaf Also, Tobii seems to completely disregard the research market and focuses on games. Their demos at scientific conferences are designed to obfuscate the quality of the track - for example, their latest demo shows the gaze position as a >5 degree cloud of smoke. How am I supposed to evaluate tracking accuracy, precision, or latency with that?! Similarly, it's only recently that they made eye images available to their customers. It's quite worrying that the company would prefer to hide data and indications of track quality from potential customers. SMI never did this - their tracks were great - but as @user-e7102b said, their trackers are no longer available. Pupil is the most transparent of all of them (open source!), but their products and software are still in alpha/beta, so don't expect something quite as polished as a product that costs >$10k.
Not yet, anyhow. Still, the software is in good enough shape that I'm conducting funded research using Pupil Mobile trackers.
...I'll even go as far as to say that it's shaping up to be the best solution for mobile tracking, regardless of price.
@user-8779ef Agreed. I would say further that the Pupil open source software has everything it needs to grow into a fine, pervasive, and versatile ecosystem
For mobile and stationary
So, it is a damn good investment for developers
Pupil is not a threat to an iMotions-like business today. But I was asking myself if it could possibly be in the future. What do you think?
Hey there, we got a Pupil headset with two eye cameras to test out at work. Got it working for the most part, but I am kind of struggling with the eye camera setup. We are making games for mobile devices, and no matter how I tried to adjust the cameras, I was not able to set them up in a way that properly centers on both eyes when they are looking down (sitting position with cell phone in hands, looking down at the screen). Especially the right eye, whose camera is inverted (is that by design?), seems really hard to capture in this position. Would really appreciate any suggestions on how to deal with this, thanks!
@papr My colleagues and I want to be able to have access to gaze position in reference to the screen center in real time. An example of this is utilizing the Pupil glasses to function as a mouse cursor. I have seen some videos of this being done and am wondering if anyone is currently working on this.
@user-6952ca this works with current Pupil Capture and this script out-of-the-box: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/mouse_control.py
This example even controls the mouse. You can remove that part.
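The key transform in that script is just rescaling the surface-normalized gaze to pixels. A minimal sketch (assuming, as in Pupil's convention, normalized coordinates with the origin at the bottom-left, which is why the y axis gets flipped for screen coordinates):

```python
def surface_to_screen(norm_x, norm_y, screen_w, screen_h):
    """Map gaze normalized to a surface (0..1, origin bottom-left)
    to screen pixels (origin top-left)."""
    x_px = norm_x * screen_w
    y_px = (1.0 - norm_y) * screen_h  # flip y: screens count down from the top
    return x_px, y_px

# Gaze at the surface center of a 1920x1080 screen lands at its center:
x, y = surface_to_screen(0.5, 0.5, 1920, 1080)  # -> (960.0, 540.0)
```

Subscribe to the surface gaze topic, pull `norm_pos` out of each datum, and feed it through a mapping like this to get screen coordinates in real time.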
@user-4d2126 right eye camera is inverted by design because the sensor is physically rotated. You can flip the image display in the eye window's general menu if that helps you. Also if you are having difficulty adjusting the eye cameras, you may want to consider using the eye camera arm extender.
You can see more about the eye camera arm extender here: https://docs.pupil-labs.com/#additional-parts
we have been shipping this with all new 200hz systems
but if one wants to buy these parts for an older system, you can do so via our shapeways store (linked in the docs in that section)
If you'd like some concrete feedback on eye camera orientation, you can DM me a link to an eye video or send an eye video to info@pupil-labs.com for feedback
@wrp Thanks for the reply! The flip does not really help with this. I tried using the extenders, but while they did help with the angle, the picture ended up being too zoomed in for proper capture (might be wrong use on my part - I have not had much time to fiddle around with it yet). I will definitely stay in touch with you, thanks for the offer. Gotta figure out if we can use this or not ASAP.
@user-4d2126 I am quite confident that we can get you up and running after a bit of adjustment/notes/tips. Please send us an email and I or my team at Pupil Labs can provide you with some feedback
@yoonmin#2698 You're welcome
Quick question for the pupil-labs developers: do you have any plans to add a desktop mounted eye-tracking solution to your product line?
@user-e7102b we will continue to focus on head mounted/wearable systems. Currently no plans in the pipeline for remote systems
What does it take to get pupil tracking working on a raspberry pi zero w?
@here We are happy to announce the release of Pupil v1.4! Get the latest release here: https://github.com/pupil-labs/pupil/releases/tag/v1.4
Hey guys, is it possible that eye color plays a role in the quality of detection? I had great readings from a person with dark eyes, but had serious issues with people with light-colored eyes. Any settings I could change to work around this?
@user-4d2126 usually that is not an issue. Can you send a sample eye recording?
Will do.
any preferred file sharing service? The files are too big for Discord.
@user-4d2126 drive is preferred
share with info@pupil-labs.com
@wrp Shared
Thanks @user-4d2126 - someone from our team will review and get back to you with feedback
@wrp Perfect. In the meantime, my boss is kind of pressuring me to do this ASAP, while I do not really have much experience with this technology in general. Could you share any tips on the best possible calibration process for capture on mobile devices? I have read through the guide, but did not really get a good idea of the best way to calibrate when the field of view is at an angle lower than 90 degrees.
@user-4d2126 I checked both videos. Detection is fine. The only thing I had to change was reducing the min pupil diameter to around 11.
@user-4d2126 if you use the other lens, make sure to re-calibrate the camera intrinsics. For your use case you might also want to try 2d mode.
@mpk Will give it a try, thanks for your help. Will get back to you if I cannot make any break through.
ok sounds good!
Hey, I've just started developing on Windows... how is it possible to produce such "binaries" of the Pupil source code as you're providing on the GitHub release page? We have some Windows clients "on the road" where I don't want to run from source directly...
@user-29e10a any chance your changes can be put into plugins?
good idea, I'm not sure, but I'll think about it - at least for the functional parts. I made some UI customizations which are not interesting for others
@user-29e10a you can overwrite existing plugins with custom plugins. You can also make changes to the UI via a plugin.
If you can use the plugin path, the path to field deployment is super smooth. Otherwise, get ready for a rough ride
ok, then I'll definitely look into it - on the other hand, what does not kill us makes us stronger
hi, I have a beginner's question. I'd like to collect <x,y> positions of eye fixations and I can't figure out how to collect those data points. Preferably I'd like to do that through Python, though using ready-made software would also work. Can you guys help me sort this out? The documentation is tremendously long and I get lost every time I dare to search for something like this... thanks in advance.
Hi, I'm trying to run pupil_capture from src from within PyCharm. Does anyone (@performlabrit ?) know how to modify a run configuration to expand the PATH to include ..\pupil_external ?
Hi, I've created a very basic "toolbox" containing functions and example scripts that enable Pupil_Capture to be controlled using MATLAB/Psychtoolbox. I'm on a tight deadline to get Pupil up and running, so my solutions may not be the most elegant...but they do the job for now. I figured I'd share here in case anyone else finds this useful. Any feedback from other MATLAB users would be welcome. https://github.com/attlab/pupil_labs_matlab
hey @user-e7102b and others here, I've taken a look and a lot of your python stuff looks very similar to a middleman system I've been working on. It seems we both essentially worked off the pupil-helper scripts. I've made mine public. You can find it here: https://github.com/mtaung/pupil_middleman Functionally they're basically the same and you have the addition of matlab scripts. For the time being, I think our stuff will work until Pupil brings over native support to matlab. Also, I don't know if this is actually all that useful. Some input from the pupil guys would be nice too. I also don't like how I defined certain strings to communicate with pupil functionality. Is there documentation for all available commands?
Hey @user-dfeeb9 Thanks for sharing this. I'll take a look at your scripts over the weekend.
@user-e7102b @user-dfeeb9 this is great. Please consider adding a link and short description to your work in https://github.com/pupil-labs/pupil-community so that other members in the community can find and use your tools/scripts
Hello everyone... I'm working on a project related to eye motion tracking, but my pupil tracking has some noise issues. Anyone able to help me out, please?
any ideas if floppy discs make a good IR pass filter?
@user-138d0a, I would bet it would be waaaaaaay too dark
As a rule of thumb, since the built-in IR filters block just IR light (mostly), you can test your intended filter by placing it in front of the camera and checking the captured footage
You mean that by placing an IR filter I should see black, since everything is filtered? But I think no filter is perfect and will pass at least 1%, right?
If you can't see anything, as I would bet with floppies, then it's no good as a filter. The other way around isn't true, though
No, I mean if you put a filter in front of the original camera, before removing the ir filter, you should be able to see some image
Sounds like the same thing, but with the IR filter and the IR-pass filter in a different order
unless both are 100% efficient something will be visible
Yeah, my point is that more than something should be visible
Not just something, but a lot of things. If it's too dark with both filters, chances are it will be too dark with just the second one (the floppy in your case)
But, of course, you could check it in the middle of the disassembly just by holding the floppy in front of the lens, without yet committing to the "surgery"
the visibility could be a result of imperfection of either of the filters; that won't tell me if the floppy disc actually acts as a real IR-pass filter or just a very strong dimmer.
This test won't be able to tell you for sure if it's a good choice, but could tell you it's a bad one
Of course, the only way of knowing it works is by testing or knowing the properties of the material beforehand
unless someone else has done that already (likely)
I sincerely doubt a floppy would let enough light pass through. But I would like to know this too. Floppies are way easier to get than film negatives
@papr Is there a built-in way to prevent the world camera video from being recorded along with all the other pupil metrics? I plan to do many hours of recordings with my participants, and these files are huge, so space will become an issue. I don't think I'll ever look at the raw world video footage, just pupil position relative to my defined surface.
I understand if this is not possible, and I can always figure out a workaround if not.
A question regarding the setup for pupil-labs/libuvc on Windows 10, VS17, for x64. I've downloaded and built both pupil-labs/libuvc and libuvc. I'm attempting to use it with a device that definitely works with UVC and has had the libusbK (v3.0.7.0) driver installed on it via Zadig. The device does enumerate properly (albeit with some minor heap corruption), however when I try to start the stream, "uvc_stream_open_ctrl" gives me back the error Not Supported. What could be going wrong there?
@user-e7102b how long will each of your recording sessions last? Do you have a precise number in mind? There was a memory issue that prevented people from opening their long recordings in Pupil Player.
Anyway, a workaround is to split recordings... I would recommend 20-30 min
You can save a lot of space using the compression option (either less space-more speed or more space-less speed)
Can anyone tell me if the binocular addons for Vive/Rift provide enough precision for performing eye tracking for foveated rendering? The addon isn't cheap so would like to know if it is worth investing in. If anyone has done it before any links and/or videos would be appreciated.
@user-e7102b You will need to modify the source code to not record the world video. But you will be able to use Pupil Player to analyse the recording, starting with v1.4
@user-41f1bf Hi, the project we plan to use the tracker for first will mainly involve multiple 5-30 minute recordings, so hopefully that means we won't run into issues. I was hesitant to try the "smaller file, more cpu" compression option as I was planning on running pupil capture on the same machine that I'm using to display stimuli (and I was worried this increased CPU usage might have a negative impact on timing), but I'll give this a try. Thank you for the suggestion and information.
Btw has this memory issue that prevented people from opening long recordings in pupil player been fixed now?
@user-e7102b This has mostly been a Python 3.5 multi-processing issue. All bundles are now frozen using Python 3.6. 30 minutes of pupil_data
should be no problem at all. Nonetheless, we are planning to introduce changes that deal with very long recordings. But these take time to implement and test thoroughly.
The real-time annotation feedback can be implemented easily using log messages. I will make a quick PR to include this change
@user-adb91e I would recommend asking HMD questions in the HMD room. 🙂
For reference, that is vr-ar 🙂
@user-adb91e I can say for sure that Pupil has enough precision for a lot of purposes. And I can also say that monocular accuracy is good enough for stationary stuff.
stationary stuff as in?
As in my doctoral thesis
:)
I do not know much about this topic but the question for me is if the delay for running the pupil detection is worth the time saved on GPU rendering time.
If you are planning to use binocular setups, you should have better chances than me for higher quality detection
During stationary inquiries
I am talking about accurate detection. Pupil precision is excellent in any case as far as I know.
detection of what?
Pupil detection.
In the monocular case it is difficult to detect the pupil if the subject looks into the opposite direction of the eye camera.
Are you aware of how Pupil measures detection quality?
There are two basic measures: precision and accuracy.
These two terms refer to the quality of gaze mapping though. Not to the pupil detection itself. (As far as I understood them)
Yes! I should have said data quality.
Gaze mapping means the end result. The data points that you effectively end with.
@user-adb91e I think your question is too broad, as I am understanding it. As @papr has suggested, are you concerned with machine performance (rendering itself)? Or, as I suggested, with accurate data during assumed good rendering?
the latter
Noisy mapping can be counteracted by increasing the high-resolution rendered area.
Up to the point where one would not speak of foveated rendering anymore.
I can discuss the feasibility of foveated rendering in great detail but first I'd like to know if pupil software and hardware can provide accurate enough data for foveation.
I think accuracy is not the issue here. The question is if the processing is fast enough for foveated rendering. I do not know the total delay by heart that it takes the pipeline to produce a gaze sample starting by capturing the eye video frame. @mpk knows it.
The more I think about it the more I am sure that this is doable. You will need to detect saccades though and disable foveated rendering until you detect a fixation which you can use to re-enable foveated rendering at the position of the fixation.
there are different types of saccades, which one are you referring to?
I do not know the exact terms. I do not mean micro saccades but those that change the position of the gaze location dramatically
why disable during those?
Because you will not know where the subject will fixate next, and there is a delay between the actual start of the fixation and the Pupil pipeline giving you the gaze point for the start of the fixation. If you do not disable foveated rendering during the saccade, it will result in a poorly rendered scene at the location of the fixation for a short period of time (the delay between the actual start of the fixation and the pipeline giving you the gaze point).
This delay is only a few milliseconds but may be enough for the subject to notice the poorly rendered scene.
On the other side, you render with way less frequency than the system is able to produce gaze points (2x 200Hz). Therefore the delay might be low enough for continuous foveated rendering.
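For anyone curious what the saccade gating mentioned above could look like: here is a minimal velocity-threshold sketch (I-VT style). This is not Pupil's own detector - the function name and the 30 deg/s threshold are illustrative assumptions you would tune for your data.

```python
# Hypothetical sketch: classify samples as saccade vs. non-saccade using a
# simple angular-velocity threshold (I-VT). Gaze positions are in degrees
# of visual angle, timestamps in seconds; 30 deg/s is a common threshold.

def detect_saccades(timestamps, gaze_deg, velocity_threshold=30.0):
    """Return a list of (start_index, end_index) saccade intervals."""
    saccades = []
    start = None
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]
        if dt <= 0:
            continue
        dx = gaze_deg[i][0] - gaze_deg[i - 1][0]
        dy = gaze_deg[i][1] - gaze_deg[i - 1][1]
        velocity = (dx * dx + dy * dy) ** 0.5 / dt
        if velocity > velocity_threshold:
            if start is None:
                start = i - 1  # saccade onset
        elif start is not None:
            saccades.append((start, i - 1))  # saccade offset
            start = None
    if start is not None:
        saccades.append((start, len(timestamps) - 1))
    return saccades
```

You could use such intervals to suspend foveated rendering until velocity drops back below the threshold.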
this is from 2012 so not sure about performance issues. 200Hz cameras might be unnecessary
The mapping accuracy always depends on the quality of your calibration but lies between 0.8 and 1.3 degrees, depending on the chosen detection/mapping mode
The purpose of foveated rendering is to render more complex scenes than the hardware could handle with normal rendering. If it is going to be turned off at any point, at that point there will be an fps drop, which is a no-no in VR, so that's not an option.
it is not there to merely save system resources
Ok, I understand.
Some proprietary systems guess new gaze position and provide to the user before the eye actually reaches there, does Pupil not have such a feature?
No, gaze points are evidence-based. This means that everything starts with a single video frame of the eye camera on which the pupil detection algorithm is run. It results in a 2d ellipse that is relative to the eye image. In the 2d mapping mode this 2d ellipse is used to generate a single gaze point. In the 3d mapping mode a series of 2d ellipses is used to fit our 3d eye model and to generate 3d data that is projected into the space of the world camera/hmd display
Any guess work on future eye positions would be done based on previous gaze locations and can therefore be calculated outside of the pipeline.
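To illustrate that last point, here is a toy sketch of what such guesswork outside the pipeline could look like - plain linear extrapolation from the last two gaze samples. The function is hypothetical and not part of Pupil; real prediction would need something smarter than this.

```python
def extrapolate_gaze(samples, t_future):
    """Linearly extrapolate a future gaze position from the last two
    (timestamp, x, y) samples. A crude stand-in for real gaze prediction."""
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dt = t1 - t0
    if dt <= 0:
        return x1, y1  # degenerate timing: just repeat the last sample
    vx = (x1 - x0) / dt
    vy = (y1 - y0) / dt
    return x1 + vx * (t_future - t1), y1 + vy * (t_future - t1)
```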
@user-e7102b https://github.com/pupil-labs/pupil/pull/1080
@papr Great - thank you!
pyglui v1.18
wheel (for Windows devs) has been uploaded - https://github.com/pupil-labs/pyglui/releases/latest
@user-adb91e do you mind telling me what proprietary systems will guess gaze positions? Just to make sure I will not buy them.
It was a joke 😬
@mpk Apparently the plugin system does not allow the manipulation or investigation of the eye images, which is crucial for me, so for this part I have to take the rough ride ... are you working on including the eye images into the plugin pipeline?
@user-29e10a do you need real time access to the eye images? How much of a delay would be OK for you? You can use the frame publisher plugin to broadcast eye images over the IPC. If the IPC adds too much delay, you will have to modify the eye source code for direct access.
@papr the delay is not very important (1 sec would be ok, for example) - but the processing of each frame could be longer - if I manage to do the processing off the main thread, this would be no problem?
i will look into this - pupil labs never ceases to amaze me 🙂
by the way, any progress on putting the 200 Hz cams into the VR headsets? 🙂
@user-29e10a the eye and World processes are separated from each other for performance reasons. I would suggest to do the eye frame processing in a separate process as well.
Hey guys, finished our first test over the weekend with what I would call a moderate success. One thing I have realized is that to accurately track eyes on a mobile device, one has to fix the device in one place, since if the player only holds it in their hands, it's impossible for them not to move it at least slightly. Is there anything I could do to work around this via software/calibration? If not, do you have any tips or even a solution for some kind of a holder?
So far the best solution I could think of was using a tripod with a grip for the mobile phone.
@user-4d2126 "fix the device to one place since" do you refer to the phone or to the headset?
And what happens if this device is moved?
@papr The tracking shifts accordingly, meaning that what the original calibration considered the center is now on the right side of the screen (if the device was moved to the left). This might be an issue with the way I have calibrated, but so far I haven't found any other way to do it (to be fair, I have been using the tracker for a week only).
This is a known error source called slippage. Were you using 2d detection/mapping or 3d?
@papr I have used 2D mode since I have been having issues with getting an accurate calibration in the 3D mode. 2D was a recommendation from mpk.
Yes, 2d is more accurate on average but more prone to slippage.
since we are using this on mobile devices ranging from 4 to 6 inch screens, we need it as accurate as possible.
In your case I would recommend to recalibrate periodically
I have been fiddling with calibration since the beginning and the most accurate results were produced by recording automatic screen calibration on a computer and playing it as a video on a mobile with manual calibration on but this of course leads to the necessity of keeping the device in same spot.
Not sure if head movement affects the accuracy as much but since the tracking is done in a sitting position (with device being fixed in one place) there should (hopefully?) not be much room for moving the head and getting different view angles.
I have a very limited understanding of this as I am new to this technology so maybe I am just going completely wrong around this.
Ok, I think the issue is a different one: your area of calibration is too small. Yes, smaller calibrated areas result in more accurate results within that area, but in very bad results outside of it.
The calibrated area is fixed within the world camera's field of view. If the phone moves out of that area, you will get incorrect results when the subject looks at the phone.
Therefore my suggestion is to increase the calibration area.
@papr So for example if I used a tablet to play the calibration video and then used a phone in the area where the tablet was, I would get accurate results in a wider area? I have been trying similar thing with computer display but the results were not accurate at all.
What do you mean by not accurate at all? What does the accuracy visualizer plugin tell you? It calculates the accuracy in degrees after each calibration
I was getting an offset (at least an inch) no matter what I did until I tried the aforementioned thing with the video played on phone.
Is a headrest an option for you?
@papr Depends. Do you think that a headrest would bring better data than having a phone fixed to one position?
The headrest would fix the last thing in your setup that is not fixed yet: the head. The phone is fixed. The headset does not move much if the head does not move. And the phone does not leave the calibration area if the head is fixed. You should increase your calibration area a bit, though. Playing the video back on the phone will result in a calibration area that is smaller than the display.
Fair enough, will definitely look into options for headrest. Was hoping to capture this in a completely natural environment (ie. device in lap and head free) but I guess the technology is not at the point where this would be feasible yet.
@KeenMFG#0059 I understand, and Pupil is actually designed for this natural environment. But accuracy comes at a cost.
Or to be more specific: Even if the angular error is small, the absolute error increases with distance.
Yeah, I was coming here with the idea that for the results we are looking for, will need to make some sacrifices as far as the player's comfort goes. I believe that will be enough to persuade my boss to go for this. :)
Cool, let us know if you need any more information or tips. I think @user-41f1bf also used a headrest in his work. Maybe he can elaborate on his experience with it.
Great, thanks for all the help. I can definitely use any I can get, but from the reaction to the data I got so far, the leads seem satisfied. Also, I really appreciate how active you guys are here. I believe that this could lead to a long-term partnership and hopefully to an actual integration of your technology with the mobile gaming business in the future.
Either way, what I wanted to say. Awesome work, keep it up. :)
Happy to hear that 🙂
hi @papr and @wrp, you wanted me to add our repo to the pupil-community MD right? is that just through a PR?
myself and tom have merged our projects over here: https://github.com/mtaung/pupil_middleman
@user-dfeeb9 Yes, correct, just add a link to your repository to https://github.com/pupil-labs/pupil-community/blob/master/README.md and make a PR with the changes. 🙂
It would probably fit best in the Scripts
category
thanks
on a related note @papr, if you ever get a chance I would greatly appreciate a gloss over what I wrote there - it's mostly stuff you'll be familiar with, i.e. pupil-helpers code
but my main worry is that i may have fudged up the implementation somewhere
and i still need to write in proper time sync with network time and pupil-recorder
@user-dfeeb9 of course 🙂
thank you kindly!
@user-4d2126 yep, it will improve stability, but I would recommend a chin rest if you really need to ensure that something is always visible to the world camera. For me it was necessary to detect the computer screen, so I needed the screen always in sight.
You should be aware that calibration is highly influenced by the operator (you). So you should develop a routine: how to efficiently position people, deal with glasses, notice when the pupil or the calibration markers are not being detected, find the best calibration method for your use case, and so on
- this is very much true.
Hello! I love that Player now shows the time bar, but is there somewhere to concurrently see the frame number so I know what frames to export? Thank you!
I am working on that. Expect that option in the next release. Currently this is only possible with the old version.
Ok - so when I do video export or raw data export can I only do the full video right now?
no, you can still use the trim marks. They are just time based instead of frame index based
I am working on an option that displays the trim range once based on time and once based on frame index
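Until that option lands, a rough way to convert between the two yourself: each recording ships per-frame timestamp arrays (e.g. world_timestamps.npy), so a trim time can be mapped to a frame index with a binary search. The helper name below is made up for illustration.

```python
import bisect

def frame_index_for_time(timestamps, t):
    """Return the index of the frame whose timestamp is closest to t.
    `timestamps` is a sorted sequence, e.g. loaded from world_timestamps.npy
    (with numpy: np.load("world_timestamps.npy").tolist())."""
    i = bisect.bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # pick whichever neighbour is closest to t
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1
```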
oh I see! Thanks! One last question - is there now a way to get saccade length/duration or is that still in progress?
This is still work in progress
I'm working in Player with offline calibration and want to adjust the mapping range, but don't see any way to figure out frames since the bar now shows time. Is there a way to do this or a way to make the mapping ranges done by timestamp instead of frame?
@user-2798d6 we will be releasing a fix for this soon.
@user-2798d6 The v1.4 Player release allows to set calibration and mapping ranges from the current state of the trim marks. No need to manually type frame numbers anymore.
Important [email removed] :
We have found a severe bug in Pupil Mobile when used with our new 200hz cameras. We have observed that eye videos recorded with the new 200hz cameras are not in sync with each other nor with the world video. This results in faulty calibration and gaze mapping. We are working on a fix as we speak! Please do not use Pupil Mobile with the new 200hz eye cameras/headsets until this is resolved. Headsets with 120hz eye cameras are NOT affected by this bug. We will update you all again here when the bug has been resolved.
hello all, tried to run pupil player v1.4-2 on mac os 10.13.3. After clicking "open" on mac os Gatekeeper the application immediately quits. I can't find any solution to the problem. Ideas?
@user-c83a9a does the same happen with capture?
I was not able to reproduce this issue on my Mac. 1) I downloaded the zip file 2) Opened the Pupil Player dmg file 3) Copied the Pupil Player app into my Applications folder 4) Ejected the disk image 5) Opened the Player application from the applications folder 6) Gatekeeper asked for permission 7) I granted permission 8) The "drop a recording" window appears as expected.
Hi
For pupil plugins in windows
What is the home folder for plugins in Windows?
AFAIK \Users\<user>\pupil_<app>_settings\plugins\
Thanks @papr !
And for mac?
I will make a pull request with this information; it is missing in the docs
~/pupil_<app>_settings/plugins
Great!
see my edit above
Yes, I got it
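For convenience, the two paths above can be derived with a tiny cross-platform helper (hypothetical, not part of Pupil itself):

```python
import os

def plugin_dir(app="capture"):
    """Return the per-user Pupil plugin directory, per the paths above:
    ~/pupil_<app>_settings/plugins on macOS/Linux,
    \\Users\\<user>\\pupil_<app>_settings\\plugins on Windows.
    `app` is e.g. "capture" or "player"."""
    home = os.path.expanduser("~")
    return os.path.join(home, "pupil_{}_settings".format(app), "plugins")
```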
Hey everyone! I am running into an issue with my new 200hz headset. On the new version of pupil capture (didn't happen a few weeks back) when I connect my headset, the eye camera simply won't read in. I get a log message of "[info] Found device. Pupil cam2 ID0" "eye0 warning (video_capture.uvc_backend:capture failed to provide frames. Attempting to reinit. "
Then it goes through warnings about backlight, pre-recorded calibration etc. I've tried reinstalling, replugging and different cords. This camera worked a week ago and nobody has touched it since, so it shouldn't be a hardware issue. I appreciate any insight or tips!
@arispawgld#8014 what os are you using?
Sierra MacOS
I've resolved the issue!
I do have a question about the new natural features edit mode: when I have tried to use it to increase my tracking accuracy, I notice no difference in the tracking once I mark points. Do I need to click recalibrate for them to take effect?
That, and you need to set the calibration section to natural features and make sure that the natural features lie within the calibration range
Hello. I think you should revise the Pupil docs. On the page "https://docs.pupil-labs.com/#python-libs", the "pip install msgpack_python" command is now deprecated according to "https://pypi.python.org/pypi/msgpack-python" on the official Python package index. The installation should use "pip install msgpack" instead.
@user-537e9a thanks for pointing this out
you can make a PR to https://github.com/pupil-labs/pupil-docs with a change to the install instructions.
Okay, I will make a PR. Thanks for your quick response.
You're welcome 🙂
Is there an easy way to generate a video out of the surface area?
@user-8be7cd Unfortunately not
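For anyone wanting to script this themselves: the surface tracker works with a 3x3 homography between surface and image coordinates, so the core of a DIY export is warping each frame with that matrix (e.g. via OpenCV's warpPerspective). Below is a dependency-free sketch of just the point-mapping step, with an illustrative matrix; how the homography is named/exposed varies by Pupil version.

```python
def apply_homography(H, point):
    """Map a 2D point through a 3x3 homography H (row-major nested lists).
    The divide by w handles the projective component."""
    x, y = point
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (
        (H[0][0] * x + H[0][1] * y + H[0][2]) / w,
        (H[1][0] * x + H[1][1] * y + H[1][2]) / w,
    )
```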
Hello, I realize there is a major bug in Pupil Mobile right now which could be the cause of my problems. But, is it normal for mobile to not produce a world.mp4 file? We are trying to use a phone owned by the school (Samsung Galaxy 8) and I need to check that the phone isn't the problem here. It's a new 200Hz headset.
@user-072005 Pupil Mobile creates videos according to the camera names: Pupil Cam2 ID0.mjpeg, Pupil Cam2 ID1.mjpeg, and Pupil Cam1 ID2.mp4 (if h264 encoding has been enabled for the world cam, the default). Therefore there is no world.mp4 after copying the recording from the phone. Nonetheless, Pupil Player will try to detect the world video file and rename it to world.mp4 when it opens the recording for the first time
@user-072005 I suggest you verify that the world camera works correctly by either opening the preview on the phone (click on Pupil Cam1 ID2) or by streaming the video via wifi to Pupil Capture.
@papr Yes, both cameras are working and I did stream it to capture. I suppose my overall goal is to figure out why when I play the folder in Pupil Player, there's nothing showing that it detected where the eye was looking. I could see this when I streamed it, just not when transferring the video files. Is this caused by the bug mentioned previously?
@user-072005 No, the bug mentioned earlier is a different problem. Could you please send me a list of all filenames in an example recording after you have opened it in Player? And which Player version do you use?
@papr audio_0001.mp4, audio_0001.time, imu_002.imu, imu_0002.time, info.csv, key_0004.data, key_0004.time, Pupil Cam1 ID2.mjpeg, Pupil Cam1 [email removed] Pupil Cam2 ID0.mjpeg, Pupil Cam2 ID0.time, I downloaded the Windows x64 v1.3.9 player.
Pupil Cam1 ID2.mjpeg is your world video.
h264 transcoding was disabled and therefore Pupil Mobile saved the raw mjpeg data stream.
Is Pupil Cam1 [email removed] a typo, or is it Pupil Cam1 ID2.time in reality?
It was a typo, the latter is correct
Opening this recording in Player should upgrade the recording, and you should end up with eye0.mjpeg, eye1.mjpeg, and world.mjpeg and their respective *_timestamp.npy files. Something is wrong if this does not happen.
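As a rough sketch of the renaming convention being described here - the mapping table is inferred from this conversation, and Player's real upgrade logic certainly handles more cases:

```python
import os

# Hypothetical sketch: map Pupil Mobile camera-name files onto the names
# Player expects after the recording upgrade.
NAME_MAP = {
    "Pupil Cam1 ID2": "world",  # scene camera
    "Pupil Cam2 ID0": "eye0",
    "Pupil Cam2 ID1": "eye1",
}

def player_name(filename):
    """Return the Player-style filename for a Pupil Mobile video file,
    or the unchanged name if it is not a known camera file."""
    base, ext = os.path.splitext(filename)
    if base in NAME_MAP:
        return NAME_MAP[base] + ext
    return filename
```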
Are you able to share your recording privately with me such that I can investigate the issue?
Sure, but I've never used discord. So how do I do that?
I would recommend to upload the recording to Google Drive and to share the link to the recording with me in a private message.
ok, I will do that
Alternatively, you can share it directly with [email removed]
hi everyone! does any of you have experience with running pupil on an nvidia jetson platform? thanks in advance 🙂
Can anybody tell me how to check that the red dot is coming into the pupil properly, irrespective of head movement, without using the Pupil Service exe?
Just as a follow-up to @user-072005's issue: the recording has been upgraded correctly by Pupil Player. The actual issue was that the gaze was not shown. This is because Pupil Mobile does not do any gaze mapping. In order to get gaze for Pupil Mobile recordings, one has to record the calibration sequence as well and use Offline Pupil Detection/Calibration as described in https://docs.pupil-labs.com/#data-source-plugins
@user-e7f18e What do you mean by "the red dot is coming into pupil"? Could you describe your setup?
I can give you remote if you want
That is probably not necessary. Could you explain your question in a little bit more detail?
When I run any Unity scene, in the background I can check the eyes before calibration through Pupil Capture.
I want to do this without using the Pupil Capture/Service application; through code, how can I identify whether my eyes are properly calibrated?
You need either Pupil Capture or Pupil Service. These applications capture the eye cameras and do the pupil detection, calibration, and gaze mapping. There is no way around these applications if you want to use the Pupil add-on.
Hello I am new to pupil labs
I just got one for hololens
Hi @user-11dbde welcome to the Pupil community 🙂
thank you!
i am having troubles in running the unity example for hololens
it compiles
but i cannot deploy it
i receive a code 1 error when trying to deploy
Please see the vr-ar channel for questions related to Unity 🙂
thank you
no one is answering in hmd-eyes 🙂
@user-e04f56, the hololens example author will surely answer when he is back in the office tomorrow.
Thank you!
@papr Thanks for the help with the natural features editor! I'm loving this feature
@papr A quick question. Can you point me towards resources regarding the various parameters in Pupil player?
parameters for what exactly?
You can find an overview over the plugins here: https://docs.pupil-labs.com/#pupil-player
Pupil diameter threshold is self explanatory, but what about intensity levels? and Confidence threshold?
and most importantly, model sensitivity.
Ah, are you talking about the 3d detector parameters?
In the eye window?
See this paper for the 2d detection parameter explanation https://arxiv.org/pdf/1405.0006.pdf
The model sensitivity describes how sensitive the 3d model is to change. But we are continuously working on improving the detection/mapping pipeline. The next iteration will have more in-depth documentation
Thanks @papr yes, I was. I essentially wanted to gain an intuition about which parameters to change on what type of eye image.
Hello everyone. Any particular suggestion about oTree, Willow or Psychopy to run experiments using Pupil?
What about open sesame?
I just wanted to tell you all that the trim markers for calibration and mapping are AWESOME! Super easy - thank you!
Same here. I just received the headset and tried the program - it was great
I am planning to communicate with an external hardware device using pySerial, and also to synchronize the hardware with Pupil's timestamps. Is there any existing plugin that does this?
@user-e02f58 thanks for the feedback 🙂 you might want to check out this project, which is a plugin using pyserial - https://github.com/akramhussein/pupil_plugin_marker_tracking_and_fixation_detection
you may also want to get in touch with the author of this fork: https://github.com/jesseweisberg/pupil
the second link - jesseweisberg's fork of pupil does not explicitly use pyserial but uses ROS to control a prosthetic hand. The first project demonstrates how to make a plugin for Pupil that can control external hardware using PySerial.
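For the timestamp-synchronization part, one common approach is to estimate the offset between a local clock and Pupil time with a round-trip measurement, then stamp your serial events with that offset. A sketch with the Pupil-time request injected as a callable so nothing here depends on a live Pupil instance (the function itself is hypothetical):

```python
import time

def clock_offset(request_pupil_time, local_clock=time.monotonic):
    """Estimate the offset between Pupil time and a local clock.

    `request_pupil_time` is any callable returning the current Pupil time
    (e.g. a Pupil Remote time request over the network). The local clock is
    sampled before and after, and the midpoint is used to compensate for
    round-trip delay.
    """
    t_before = local_clock()
    pupil_time = request_pupil_time()
    t_after = local_clock()
    midpoint = (t_before + t_after) / 2.0
    return pupil_time - midpoint

# A local timestamp can then be mapped to Pupil time:
#   pupil_ts = local_clock() + offset
```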
thanks!!!
Welcome 🙂 - You may also want to check out other projects in the pupil-community repo: https://github.com/pupil-labs/pupil-community
Hello everyone. We are trying to get Pupil Capture working on macOS 10.9 (Mavericks), but it crashes at startup. We have tried the latest version of the Pupil software and the previous one. It works fine on another computer running El Capitan. I attach the crash report. Any hint about what might be happening would be great.
What are the standard settings for the Pupil Capture exe, and how can I detect whether the tracking is working properly or not in Capture/Service?
When using the offline pupil detector on Windows 10 with a monocular eye tracker, it crashes before completion almost every time. What could be causes of this?
@user-072005 are you using Pupil Mobile and 200hz camera?
yes
it says invalid data found when processing input
@user-072005 please note that there is a current bug in Pupil Mobile regarding this. We are very sorry. I wrote about this 2 days ago.
For now I recommend using Pupil Capture. This should be resolved in the next couple days with a new software release.
ah ok, I'll keep an eye out for the new release. Thanks!
Hi, Is there a way to make Capture remember my settings? Every time I launch, I have to set the 'absolute exposure time'. Thank you
@user-a49a87 this setting is overwritten by us. We can change this. Why do you need a custom value here?
Hi, we're looking to maybe purchase some of the eye-tracking glasses. Anyone got any feedback?
Hi, today I am experiencing some artifacts in my eye image. I'm attaching a screenshot. Is this a compromised camera? These white lines keep showing up with no apparent cause.
For the cameras, eye1 is for the left eye and eye0 is for the right. Please tell me if I am wrong.
@mpk with default value, the image is too dark and the algorithm doesn't work well
@user-e7f18e this sometimes happens but does not affect tracking. If it happens more frequently, please write us an email so we can get you a replacement.
The artefacts can be removed by resetting frame rate or resolution.
@user-a49a87 what camera are you using?
Pupil Labs in HMD (HTC Vive)
If you want a more specific answer, please tell me where to find it 🙂
Ah ok. let's fix this in the source code. What value is good?
around 120
Ok. Would you mind making an issue on GitHub?
We can track and discuss implementation there.
here is an example of what it looks on 127
(if I put 128, it becomes a lot darker; it's kind of cyclic)
No problem. How can I do that?
Get a GitHub account. Go to GitHub.com/pupil-labs/pupil/issues and raise an issue
done
sorry, bad copy-paste. Here is the example with 127 https://we.tl/UwId1P6qtI
Awesome. We'll have a look at this on Monday!
thx
@user-a49a87 In the same window where you modify exposure time, you should have other options further down to modify gain/gamma/contrast.
(and others, like Hue, Saturation, etc)
ok, thx, I didn't see the magic arrow
here just above, there is an example of the capture of my eye, is it too bright?
The link doesn't work for me
And brightness doesn't matter, all that matters is the contrast between the pupil and the surroundings so it can detect the edges of your pupil.
Also try modifying the ROI to make sure it's only looking for your pupil in the region of the image where your eye is likely to be.
here is a new link https://we.tl/dZf3wGGifX
I find it quite complicated to configure the parameters to get proper tracking of the pupil
That link is also not working for me. It might be a problem with the corporate firewall.
When I "play" with absolute exposure time, gamma, contrast, etc., the shadows cast by the eyelashes are frequently darker than some parts of the pupil.
Do you have a lot of light leaking into the HMD? The tracker uses an array of IR emitters next to the camera on the inside of the HMD. These shouldn't be casting significant shadows in the area of your pupil. If you are talking about shadows far above your pupil, then try reducing the ROI
when I use the algorithm display mode, there are green rectangles appearing - what are they?
Could you send me a picture of what the ROI should look like?
I set the ROI as such the pupil is always inside if I look as far up as possible, as far down, and so on
It depends on each person's head shape. For me, it's a box about 1/2 the height and 1/2 the width of the full window that sits at the bottom middle of the image. I determine it the same as you.
ok, so the intersection between the eyelids and the iris is in the ROI. That's the shadows I'm talking about
Try rotating the knobs on the Vive strap to get the cameras further away from your eyes, and adjust the focus accordingly. Unfortunately this reduces FOV.
I'll try
Thank you for all the advices. I have to go. See you maybe tomorrow
good luck
Hello - does anyone know how to display the pupil video instead of the gaze video in Pupil Player?
Also - trying to figure out the best settings for blink detection; any suggestions about confidence thresholds?
How do I make sure it is recording the eye video, which is more important to me than the world video?
Is there a way to save settings so all recordings are done with the same settings?
@user-13ea21 you need two things. First, you must have enabled the record eye option in Pupil Capture. Second, if the eye video was recorded, then you can see it using the vis_eye_video_overlay plugin (something like that, I am not completely sure about its name)
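Regarding the earlier blink-detection question: as far as I know, Pupil's blink detector works on drops in pupil detection confidence, so a crude offline heuristic is to flag intervals of sustained low confidence. A hypothetical sketch (threshold and minimum duration are guesses to tune per recording, not Pupil's defaults):

```python
def detect_blinks(timestamps, confidence, threshold=0.5, min_duration=0.05):
    """Return (start_time, end_time) intervals where pupil confidence
    stays below `threshold` for at least `min_duration` seconds.
    A crude stand-in for Pupil's onset/offset-based blink detector."""
    blinks = []
    start = None
    for t, c in zip(timestamps, confidence):
        if c < threshold:
            if start is None:
                start = t  # candidate blink onset
        else:
            if start is not None and t - start >= min_duration:
                blinks.append((start, t))
            start = None
    # handle a blink that runs to the end of the data
    if start is not None and timestamps[-1] - start >= min_duration:
        blinks.append((start, timestamps[-1]))
    return blinks
```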
Hi, has anyone had issues with calibration? I am running a MacBook Pro with Sierra 10.13.2 (17C205) and a brand new headset. The only thing I see when starting screen calibration is a white, blank screen. ... OK - the issue seems to be with the 'full screen' setting. It works when it's turned off - but the floating window is quite small?!
What are the minimum Windows computer requirements to run Pupil apps?
Hi guys, how do I install drivers for win10. When I run "pupil_capture_windows_x64_v1.4.1" it hangs on updating drivers and doesn't load.
Damn, what a pain in the rear.. got it though.
Hi @user-13ea21 Minimum requirements - the key specs are CPU and RAM. We suggest at least an Intel i5 CPU (i7 preferred) with a minimum of 8GB of RAM (16GB is better if possible). OS - we support macOS (minimum v10.9.5), Linux (Ubuntu 16.04 LTS), and Windows 10 (Windows 10 only).
@user-47bcf3 were you running with admin permissions? Would you be willing to share some feedback and/or what you did so that we can improve driver installation?
I am running Pupil monocular with an i3 with 4GB DDR3.
It just works.
@wrp On the Pupil website, I could not find anything explaining how to install drivers. After searching this Discord channel, the instructions I finally followed were at this link: https://github.com/pupil-labs/pyuvc/blob/master/WINDOWS_USER.md
But these instructions tell you to plug in your camera only after installing Zadig. You cannot install libusbK or Zadig without having your camera plugged in (if you can, I couldn't figure it out), which led to more frustration. If you look at these instructions, it isn't until step 5 that they say you need to choose the "composite parent" and not the interface. Well, you can't do this unless you chose to install the composite driver back in step 1 when you installed libusbK.
It would be a lot easier if Pupil put a modified version of these instructions on their website (I spent 2 hours today poring over the website and the git repository looking for driver instructions - they may be there, but I couldn't find them), then another hour just following these instructions (I installed them wrong first, then uninstalled and started over again). Tell the user to plug the camera in at the beginning and to watch for when they have to choose their camera - choose the parent (in the software, the text box is small - you have to scroll to the right to see the part of the description that shows the difference between the two).
@user-41f1bf great to know that lower specs also work well. I have an i5 mbAir (macOS) with 4gb ram and Pupil Capture and Player also work well, but some users with very large recordings may require more RAM - hence the suggestion for higher RAM and CPU.
@user-91325a thanks for the feedback, and I apologize for this loss of time. Did you go through troubleshooting in the docs: https://docs.pupil-labs.com/#troubleshooting
One should not need to use Zadig at all anymore - but in some cases it seems that drivers are not installing properly on some Windows users systems. So more information about your setup would be appreciated so that we can smooth out the installer process.
ahhh
You know - if you've never been on that page before, finding those install directions is like finding a needle in a haystack. That part of the menu doesn't expand until you are in the Developer Docs... I went through all the stuff at the top of that document - you know, "Introduction", "Getting Started", User Docs... It told you how to set up the software, but doesn't say much about how to install drivers.
Ok - thanks for the feedback @user-47bcf3, we will make the driver installer more prominent in the getting started section for Windows users. You can also feel free to make a PR to the docs at https://github.com/pupil-labs/pupil-docs
At one point, I did find the instructions on how to 'uninstall', but I didn't have any libusbK devices to uninstall, so I couldn't figure out what to do. After I installed Zadig wrong, that was really helpful though, because I was able to uninstall everything and start over.
sorry, this is my first experience with this software/hardware. I don't know what a PR is
No worries - your feedback is very helpful. Thanks @user-47bcf3 - A PR is a pull request (it is a way to ask maintainers of a software to accept your changes to code, documentation, or other files that are in a versioned system like git).
This was how I found the instructions :
@user-47bcf3 Did you find the link to docs.pupil-labs.com
on the front of the box that Pupil was shipped in?
Yeah, I don't know how to do that. Maybe someone will read this and do a PR for me.
@user-47bcf3 We will take your feedback and update docs accordingly.
BTW @user-41f1bf I see your PR and will review it today.
I don't have the box. I am a 3D printing guy, I was asked to make a bracket for some custom glasses. I was just given the glasses and camera and asked to test the bracket with the software to make sure you can see the pupil.
lol, that might be part of the problem
@user-47bcf3 got it. BTW - we have geometry documented for the mounts here - https://github.com/pupil-labs/pupil-geometry
If I had the original documentation, that probably would have prevented this too.
@wrp Yes, I should have mentioned that my setup does not support long recordings (40min+). Anyway, I would not recommend such long recordings due to current stability constraints.
hahaha, that was the other thing - the guy that gave it to me didn't tell me anything about the camera until last Friday, and I've been working on this for 2 weeks. I go to your website to get the software and what do I find? The STLs to print the ball joint... lol, that would have helped a lot.
Do you know how to reassemble the world camera of the Pupil headset? I have the Pupil eye tracking headset consisting of the 3D world cam and 120hz eye cam, and the high resolution cam in the HoloLens binocular add-on. I want to change the world camera from the 3D world camera to the high resolution camera. Please let me know if you have a solution for that.
@user-537e9a unfortunately the cabling for the 3d world camera is not compatible with the high speed world camera. So hot-swapping these out would not be an option and could damage the system.
So is it only possible to hot-swap the eye camera in this case?
Yes @user-537e9a this is correct
Okay... Thanks, @wrp.
You're welcome @user-537e9a
Hello, anybody got experience with msgpack with QT or c++?
@user-e938ee I think @user-e04f56 does. I would suggest hopping over to vr-ar and asking him there.
Thanks!
@user-8779ef @user-41f1bf This is for you: https://github.com/pupil-labs/pupil/pull/1100
I am getting an error with pupil player- disabling idle sleep is not supported on this OS version. No valid dir supplied, works on my other computer- not sure what I need to do.
@rakupe#7607 what are you dragging onto the player window? It should be a recording dir (usually called 000,001,002...)
@papr Thanks! I'll have a look as soon as I can.
@papr Some questions (I'll start typing, but I feel I may be interrupted by a meeting I'm supposed to have now)
"Previously, all binocular mappers dismissed pupil data with confidence lower than 0.6 to prevent bad binocular mapping. Now, low confidence pupil data is mapped monocularly." Is there a flag to indicate if cyclopean gaze results from binocular, L, or R eye data?
Overall, this is a very good move on your part. Thanks for listening!
and coding and contributing.
@user-41f1bf I had the same issue with the Player when I recorded for more than 40 minutes, since my files were usually more than 30 GB. I chose to split the files, and playing 20-25 minutes of recording goes well so far.
That's what happens when I try to start it. It worked a couple of times, then started giving me this error message.
Can someone help me with getting started? When I try to move the camera to center on my eye, the video output only works for about 15 seconds then freezes for about 15-30 seconds, then works for 10 seconds, freezes for 10.. on and off. Is there a setting I can turn on so that it stays on until I get the camera centered on my eye?
@user-47bcf3 Do you have a monocular or binocular system?
mono
I have the cable with 2 jacks on it, but only 1 camera plugged in
@user-47bcf3 your cable has two female jacks on it?
I believe that you're working with quite an old system.
umm, I think the cable has a male jack and the camera is female
yup
Is the camera "freezing" when you do not touch/move it?
I am asking these questions, because I want to try to understand if this is an electrical/wiring issue
let me check
hmm, I clicked on the big T on the screen by accident and now the software just freezes
T is for testing after calibration
Any information regarding the crash in your pupil_capture_settings/capture.log file?
How do I invert the image? I found it earlier
To flip the eye image go to the Eye window: General > Flip image display
how do I get the eye window to open again? it was auto opening before, now just the capture window opens
World window: General > Detect eye 0
thanks for the help, this is making me feel dumb
No problem. I am happy to help out
ok, got it working. eye 0 is plugged in and inverted now
and it freezes even if I don't move it
@user-47bcf3 what are the specs of your computer?
AMD-FX 6300(3.5 ghz) six core 16GB w Radeon R7 200 2GB GPU
OK, this machine is certainly powerful enough. This sounds like it could be a hardware issue. Please send us an email at info@pupil-labs.com so we can follow up. If you can also get the original order_id associated with this headset (You would need to ask the individual who gave you this hardware to work with for this information), that would be appreciated.
ok, thanks
Welcome!
@user-8779ef yes, a gaze datum has a field called base_data. It is a list of the 1-2 pupil positions it is based on. These include all the information you need.
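Since the question above asks how to tell whether a cyclopean gaze point was mapped binocularly or from one eye, here is a minimal sketch of inspecting base_data. The field names (base_data, id, confidence) come from the discussion above; the sample datum values are invented for illustration:

```python
# Determine the mapping source of a gaze datum from its base_data field.
# base_data holds the 1-2 pupil positions the gaze point was computed from.
def mapping_source(gaze_datum):
    pupils = gaze_datum["base_data"]
    if len(pupils) == 2:
        return "binocular"
    return "monocular eye {}".format(pupils[0]["id"])

# Example datum (values invented for illustration):
datum = {
    "topic": "gaze.3d.01.",
    "confidence": 0.95,
    "base_data": [{"id": 0, "confidence": 0.97},
                  {"id": 1, "confidence": 0.93}],
}
```

Here mapping_source(datum) reports "binocular"; a datum whose base_data holds a single pupil position would be reported as monocular with that eye's id.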
@user-13ea21 This might also happen if the path to the recording includes unicode characters. Could you make sure that this is not the case?
Thanks for the help!! Restarting with Default settings reset everything and it's working good now.
hello pupil
Hey @user-933e7a!
@papr I have some questions about my eye-tracker. I can't configure it to work correctly.
Happy to help, what are your questions?
I use an old tracker version, from 2015-2016, and can't get a correct result. I don't know what is wrong. Calibration goes well, but after that the tracker doesn't track my gaze.
Do you mean that the gaze is not accurate or that there is no gaze point visible?
Mostly the gaze is not accurate. Confidence is >0.9.
Does it have a consistent offset or is it jumping a lot? Could you check if your pupil confidence is high? You can do so by checking the confidence graphs in the world window.
Ok, thanks.
Do you use 2d or 3d detection/mapping?
It's jumping or mirroring. I use 3d.
Could you test 2d detection/mapping and see if the issue remains?
Yes, I tested 2d too, but got the same result.
Which operating system, Capture version and what type of eye tracker (monocular/binocular) are you using?
I also used different software versions.
Mainly Linux, and I also tried macOS and Windows.
Capture versions 1.3, 1.1, 0.9, with a monocular tracker.
Alright, thank you for this information. So, just to see if I understood you correctly: You have been able to use the system as expected for a while but now you are having trouble getting accurate gaze tracking?
yes, correct.
And you cannot think of anything that changed in your setup?
I'm new to this; I made a new installation and configuration. I tried to reproduce the setup from the older systems, but with no results.
Ah ok, so a colleague of yours was able to get accurate results but not yourself?
Could you make an example recording that includes the calibration procedure and a recording of your eyes, and upload and share it with me? I might be able to give tips on the setup/procedure.
My colleague doesn't get a correct result either. Yes, I can.
@user-8779ef You can also see it in the gaze topic field. These are the gaze topics in <= v1.4: gaze.2d.0, gaze.2d.1, gaze.2d.2, gaze.3d.0, gaze.3d.1, gaze.3d.2. In >= v1.5 they will be changed to gaze.2d.0., gaze.2d.1., gaze.2d.01., gaze.3d.0., gaze.3d.1., gaze.3d.01.
.2d/.3d denote the mapping method. .2 and .01. indicate binocularly mapped gaze; .0 and .0. indicate monocular gaze based on pupil.0, and .1 and .1. based on pupil.1, respectively.
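The topic scheme above can be parsed mechanically. A small sketch (the topic strings come from the message above; the function itself is just an illustration, not part of Pupil's API):

```python
# Parse a gaze topic such as 'gaze.3d.01.' into its mapping method and
# source eye(s). Covers both the old (<= v1.4) and new (>= v1.5) schemes:
# '01' (new) and '2' (old) both denote binocularly mapped gaze.
def parse_gaze_topic(topic):
    parts = topic.strip(".").split(".")  # e.g. ['gaze', '3d', '01']
    method = parts[1]                    # '2d' or '3d'
    eyes = parts[2]                      # '0', '1', '01', or '2'
    source = "binocular" if eyes in ("01", "2") else "eye " + eyes
    return method, source
```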
I took the glasses outside and the world video is totally washed out. How do I correct this?
@user-072005 what do you mean by washed out?
Everything is white. I can sort of make out the horizon but I can't really see anything
This means that your world video is overexposed. Did you disable auto exposure priority in the UVC Source settings menu? It is enabled by default and prevents overexposure.
I was using the mobile app. Would that setting be in the app or the player?
Ah, you will have to set this in the mobile app.
Ok so I will do that in the pupil cam1 ID2 settings?
Yes, in the Pupil Cam1 ID2 settings. The camera frame rate may be adjusted dynamically.
ah ok it's working, thank you
Hi there, I am sure this has been already discussed, but I have followed all the steps on your website to install Pupil capture on an Ubuntu 16.04 machine, and I get this error: ImportError: /opt/pupil_capture/cv2.so: undefined symbol: ZN2cv2ml6RTrees4loadERKNS_6StringES4 I did uninstall the python-opencv package with pip, but with no success. When trying to install pyuvc, I also get an error /usr/lib/gcc/x86_64-linux-gnu/5/../../../x86_64-linux-gnu/libturbojpeg.a: error adding symbols: Bad value
@user-cf2773 Please see todays pyuvc issue on how to solve the latter problem: https://github.com/pupil-labs/pyuvc/issues/44
In short: sudo rm /usr/lib/x86_64-linux-gnu/libturbojpeg.a
And reinstall pyuvc
Thanks, pyuvc now works!
opt/pupil_capture/cv2.so is the bundled cv2 lib. This should not be loaded during runtime if you are running from source...
This indicates that your opencv installation is not correct
π€ I followed every step, maybe it got installed in another way?
Please uninstall the bundle to make sure that the bundle cv2 version does not interfere with your manual installation
How can I check if my opencv installation is correct? I think I've followed all the steps
Run python3 in the terminal and execute import cv2. This should work without errors.
It does, and the version is 3.3.0-dev
perfect. That looks like a successful setup
But I still get that issue when launching pupil_capture
Could not find platform independent libraries <prefix>
Could not find platform dependent libraries <exec_prefix>
Consider setting $PYTHONHOME to <prefix>[:<exec_prefix>]
MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "launchables/world.py", line 105, in world
  File "/home/pupil-labs/.pyenv/versions/3.6.0/envs/general/lib/python3.6/site-packages/PyInstaller-3.3.dev0+g8998a9d-py3.6.egg/PyInstaller/loader/pyimod03_importers.py", line 631, in exec_module
  File "shared_modules/methods.py", line 19, in <module>
  File "/home/pupil-labs/.pyenv/versions/3.6.0/envs/general/lib/python3.6/site-packages/PyInstaller-3.3.dev0+g8998a9d-py3.6.egg/PyInstaller/loader/pyimod03_importers.py", line 714, in load_module
ImportError: /opt/pupil_capture/cv2.so: undefined symbol: ZN2cv2ml6RTrees4loadERKNS_6StringES4
world - [INFO] launchables.world: Process shutting down.
Wait, I think you are mixing things up. There are two ways to run Pupil. One is by running from bundle. You can download the releases from github and install the contained .deb files. You do not need to install the dependencies for that. They come with the bundle.
Oh OK. I did install the .deb
To run from source you need to install the dependencies, download the source, and run python main.py in your terminal from the <repository path>/pupil_src folder.
But then how should that generate the issue?
You execute the bundle from terminal by running pupil_capture
That's what I am doing
Running the bundle should not fail. If it fails it is not installed or executed correctly
I have installed the bundle with sudo dpkg -i pupil_capture_linux_os_x64_v1.4-1.deb and then run pupil_capture and I get that error
As a follow up, @user-cf2773 is using an Intel Xeon CPU, which is a different architecture than the one the bundle is compiled for. The bundles do not run on Intel Xeons. The solution is to run from source as described in https://docs.pupil-labs.com/#developer-setup
@papr could you elaborate a bit? Does this only apply to the Linux bundles, or only a particular generation of Xeons? Thanks!
This applies to macos bundles as well
@papr, i was already asking that in vr-ar, but maybe here it's better: i'm writing some calibration code for the trackers atm, and was wondering whether there is some more documentation available what the reference points should be, etc?
(talking about HMD calibration)
@user-bd0840 in the hmd case, you provide the ref positions. The calibration will be within the coordinate system of these reference points
@papr thanks! as far as i understand, 2D calibration would give me x/y positions on the hmd screen, and 3D an actual world position (in case i calibrated with them), is this correct?
correct
do you have any recommendations on the distribution of the reference points?
They should cover the field of view of the subject
Great, that was the basic assumption. Thank you!
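The "cover the field of view" advice could be realized, for example, as an evenly spaced grid of reference points in normalized screen coordinates. This is only a sketch - the grid size and margin are arbitrary choices, and the actual HMD calibration API is not shown here:

```python
# Generate rows x cols reference points in normalized [0, 1] screen
# coordinates, inset by a margin so targets are not flush with the edges.
def reference_grid(rows=3, cols=3, margin=0.1):
    pts = []
    for r in range(rows):
        for c in range(cols):
            x = margin + c * (1 - 2 * margin) / (cols - 1)
            y = margin + r * (1 - 2 * margin) / (rows - 1)
            pts.append((x, y))
    return pts
```

Each point would then be shown to the subject in turn and submitted as a reference position during calibration.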
Hi there. When using Capture remote recorder with the glasses connected to the mobile app. Can you do online calibration. Or is it only offline calibration?
Hi @user-ca2a2b If you select Pupil Mobile as a capture source in the UVC manager in Pupil Capture, you can calibrate "online" and save calibration data and video data on the computer running Pupil Capture. The Remote Recorder plugin is designed only to start/stop recordings on each connected Pupil Mobile source.
Great. Thanks!
Hi - I'm setting up my pupil mobile headset to track a screen, using four surface markers to define the screen as a surface (one in each corner). The markers seem to be momentarily lost and found, causing the dimensions of the surface to be disrupted (for an example, see here: https://www.dropbox.com/s/pip8qzdxf9nsxim/IMG_4062.MOV?dl=0). Does anyone have tips for how I can improve consistency? The markers in this example are printed and physically stuck on the monitor, but I've also done this with markers drawn to the screen via psychtoolbox, and I see similar effects. The background needs to be white for this study, so no option to lower the screen brightness. Any tips would be appreciated. Thanks!
@tombullock#3146 can you try smaller markers? Those are quite big. Otherwise I think it should be fine.
@mpk The screen is about 1.2 m from the participant...I could probably make the markers a bit smaller, but if I go too much smaller they are not detected. If it's not a big deal then I won't worry about it.
Thanks
@user-e7102b check the min marker size slider. I think you can make the markers a bit smaller.
@mpk Yes, I'll give that a try and see if it helps.
hi there!
@papr i'm ready
Can anyone help me configure the tracker to work accurately?
@mpk hi. you can't?
Hi, we'd like to use the 3D gaze feature of our Pupil tracker (binocular + fisheye world cam). However, the Z values we are currently receiving from gazepoint_3d seem to not be in millimeters (sometimes they are even negative). Do we need to calibrate on a 2D plane when using the 3D gaze point, and if yes, does the distance to the plane make any difference? Is it even possible to do a separate 3D calibration? And how can we have the raw Z values visualized or printed within the Pupil UI?
We already tried to dig through the source code to understand the various coordinate systems and extrinsics that are involved - at the moment we assume that metric values for gazepoint_3d should be available, with their origin being the optical center of the world camera (+z to front, +x to the right, +y up). However, we still cannot really tell where involved transformations e.g. the eye camera intrinsics, or extrinsics eye-to-world, or extrinsics eye-to-eye are coming from.
We'd be grateful for any help or hints - thanks in advance!
@user-c6b893 The 3d gaze mappers have their own side menu. It should include a button to open the debug window. It includes a 3d visualization of the calibrated 3d gaze.
@papr can we continue?
Thanks @papr - we already stumbled upon this window, however it only seems to give us a visualization of the 3d gaze mapping, but no numerical Z values or (also numerical) eye positions to check if we are actually getting "sane" values in millimeters somehow corresponding to our current gaze point (right now the visualizer looks quite good, but we are receiving values of -900 (mm) for Z - relative to the world cam this would mean we are looking behind our own head, right?).
@user-c6b893 I am not exactly sure about the coordinate system's orientation. Be aware that gaze data contains a confidence value that gives a quality estimation. How are you getting the data? Are you subscribing to it or do you use the exporter in Player?
Thanks @papr , we'll have a look at the confidence values. We are getting the values through subscribing via zmq in Unity as well as via the exporter, they are identical. According to the visualizer (given the axis colors are green = y, red = x, blue = z) the z values should be positive in front of the camera.
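A simple sanity check along the lines discussed above - filter by confidence and reject 3d gaze points that lie behind the camera - might look like this. This is a sketch: the field names mirror the exported data, the +z-in-front-of-the-world-camera orientation is exactly the open assumption discussed above, and the sample values are invented:

```python
import math

# Return the viewing distance in mm for a usable gaze datum, or None if the
# datum is low-confidence or its 3d point lies behind the world camera.
def plausible_gaze_distance(datum, min_confidence=0.6):
    if datum["confidence"] < min_confidence:
        return None  # low-confidence data often produces wild 3d points
    x, y, z = datum["gaze_point_3d"]
    if z <= 0:
        return None  # behind the camera: likely a bad 3d fit
    return math.sqrt(x * x + y * y + z * z)
```

Filtering out low-confidence data first is important, since negative or huge Z values often come from poorly constrained monocular or low-confidence samples rather than a wrong coordinate system.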
This may be more of a general question, but is there an easy way to see dispersion in real time? We are having participants look at sheet music and trying to figure out how much music "fits" into 1 degree.
Do you mean dispersion over time?
@papr - I don't think so. I want to figure out for example if I look straight ahead then I look 1 degree to the right, how much of the sheet music will I have covered with that sweep.
This depends on the distance of the sheet, if I am not mistaken.
Right! But do you happen to know a way for me to figure that out in the moment?
You could add a surface marker of a specific size to it and calculate its 3d positions. There is some initial work on that: https://github.com/pupil-labs/pupil/pull/872
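The distance dependence mentioned above is easy to make concrete: the linear extent covered by a sweep of angle θ at viewing distance d is 2·d·tan(θ/2). A quick sketch (the 600 mm viewing distance below is an assumed example, not a measured value):

```python
import math

# Linear span (same unit as the distance) covered by a gaze sweep of the
# given angle at the given viewing distance: 2 * d * tan(angle / 2).
def span_for_angle(distance_mm, angle_deg=1.0):
    return 2 * distance_mm * math.tan(math.radians(angle_deg / 2))

# At an assumed 600 mm viewing distance, a 1-degree sweep covers about
# 10.5 mm of the sheet music.
```

So once the sheet distance is known (e.g. via the surface-marker approach suggested above), the amount of music per degree follows directly.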
Hi guys noob question in 1.2.3 : After I just got the addon for htc vive and installing the hardware is there any software already available to see that it works ?
Yes, start Pupil Capture and see if the eye window is able to detect an image
Pupil Capture needs to run on the computer to which the add on is connected to.
Is it normal for it to detect something even when no one is wearing the headset?
Yes, the pupil detection algorithm works on a best-effort basis.
Also, has anyone noticed whether glasses have any influence on the eye tracking?
Depends on how you position the glasses. If the eye cameras have a free view onto the eyes, the glasses do not have any influence. If the eye cameras record through the glasses, you might encounter issues with the glasses' refraction. Especially after they have been moved slightly.
@mpk Reducing the marker size helped improve stability. I think I found an optimal size for my screen distance https://www.dropbox.com/s/mmka0msf62g4340/IMG_4063.MOV?dl=0
Thanks for the advice
Follow-up question to our questions above - is it possible to get the gaze point coordinates (x/y) in the world camera coordinate frame for each eye separately?
Yes, we have a dual-monocular gaze mapper. But there is no user-option yet to enable it explicitly. Please create a github issue and I will see to it next week!
Is there a way to pull or calculate where the user is looking in terms of visual angle?
@user-ed537d do you mean in relation to the world camera or in relation to an object that is visible in the world camera?