hello - has anyone put together a diy version? not sure which channel to ask a question on here
followed all the videos and successfully replaced the leds on an hd-6000 but when following the diy wiki instructions, im confused as to what to do to remove the ir lens filter. i was able to unscrew and desolder the lens housing but are there videos detailing the entire setup?
hello, does anyone know what the average shipping time to Austria is?
mine took just a few days
fantastic, thank you
@user-2ad874 Sorry for the late response. You will have to open Pupil Capture multiple times to record with multiple headsets. You can either open it multiple times on the same computer (please test whether your computer has enough performance!), or connect the headsets to multiple computers and start Capture once on every computer (preferred when recording at different locations or when a single computer is not powerful enough). In the latter case you will have to make sure that all computers are in the same LAN or WIFI network. This is what I meant by "on the same network".
@user-c5fb8b Thank you so much for all your help. I appreciate it. I tried to open multiple instances of pupil capture on my machine but it kept crashing so i guess i need to go with the second option of using different computers and ensuring that they are on the same network. Thanks again!
Does anyone know the phone number to contact Pupil labs? I need to include it to authorize a purchase order at my institution. I can't find this information anywhere on the site! Thanks.
@user-9f67a6 please contact - sales@pupil-labs.com (if you haven't already) and we will be happy to coordinate with you regarding POs.
hi everyone. I used pupil core to record participants manipulating objects and now I'm using pupil player's fixation detection plugin. I'm using 1.51 deg for max. dispersion, and 50ms and 500ms for minimum and maximum fixation duration. Do these values seem reasonable? I made a histogram of the fixation durations and most seem to be between 50-100ms, which I think is odd, although I'm very far from being an expert on gaze behavior. (also, is this the right channel to ask these questions or should I switch to the research-publications channel?)
@user-5ef6c0 a fixation is active as long as the gaze does not disperse more than a given threshold (1.51 deg in your case). Keep in mind that gaze estimation has a calibration-dependent accuracy. If the accuracy is 1 deg on average, then it is comparably easy to reach the 1.5 deg threshold. Therefore, it is not surprising that your fixations are mostly short. If you increase the maximum dispersion threshold you should in turn see longer fixations.
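To make the dispersion idea concrete, here is a toy Python illustration (not Pupil's actual implementation; the gaze angles below are fabricated to mimic ~1 deg estimation noise):

```python
import math

def dispersion_deg(points):
    """Max pairwise distance (deg) among gaze samples given as
    (azimuth, elevation) angle pairs in degrees -- a toy stand-in
    for the dispersion measure a dispersion-based detector thresholds on."""
    return max(
        (math.dist(a, b) for a in points for b in points),
        default=0.0,
    )

# With ~1 deg estimation noise, samples of a single steady fixation
# can already exceed a 1.5 deg dispersion threshold, splitting one
# long fixation into several short ones:
noisy_fixation = [(0.0, 0.0), (0.8, 0.5), (-0.7, -0.4), (0.5, -0.5)]
print(dispersion_deg(noisy_fixation) > 1.5)  # → True
```

This is why raising the threshold (or improving calibration accuracy) tends to merge short detected fixations into longer ones.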
@papr Thank you, papr. I understand. So is there a rule of thumb in this regard I should be following? Or an average fixation duration I should aim for while adjusting these settings? In the literature fixations are often said to be between ~50-600ms or so, which is why I was surprised to see most of the fixations recorded were on the lower end of the spectrum. But maybe that's totally fine.
I'm also trying to learn more about different fixation detection algorithms. What is the algorithm implemented in pupil player?
@user-5ef6c0 Hi, you can read up on Pupil's fixation detection in the terminology section of our docs: https://docs.pupil-labs.com/core/terminology/#fixations
thank you, sorry I missed that, I thought I had checked the documentation. Too early in the morning I guess
No worries, glad I could help
Hi @user-c5fb8b, I have two core eye-trackers connected to two different computers (one mac and the other windows). I have pupil capture open on each pc and they are both on the same WIFI; however, i can't seem to connect them. Pupil Groups says there are no other group members. Is there an additional step I'm missing? As always, i appreciate your help
@user-04d904 Please share the capture.log files (Home directory -> pupil_capture_settings -> capture.log) from both computers. In most cases, a network misconfiguration leads to connection issues.
capture log for windows pc
capture log for mac pc
@user-04d904 Are you in an university or otherwise large network?
@papr yes i am
It looks like both machines chose the correct network interface, which leads me to believe that the connection issue is caused by the network. Larger networks often prohibit peer-to-peer connections for security reasons. This might be the case with your network, too. We recommend using a dedicated wifi network which you have full control over.
@papr How do I do this?
There is one more thing that you can try. I noticed that the IP addresses differ in their IP range (10.21.106.127 vs 10.21.175.192). Maybe the issue can be solved by assigning IP addresses of the same range, i.e. 10.21.X.y, where X is the same number for both computers. But I do not know if this is possible in your network, nor am I sure that this is the cause of the problem.
Regarding a dedicated wifi network: set up a wifi router (a typical household router is fine) which creates its own network.
@papr I'll try that. Thank you so much for your help
@papr When using Pupil Core in conjunction with another computer for an experiment, there is a serious lagging issue where the experiment significantly slows down. What might be the cause of it?
@papr In other words, I am using the middle man script to connect two computers, with one running Pupil Capture and the other running the matlab experiment. However, after only a few minutes the lagging becomes pretty serious, and the whole experiment significantly slows down.
@user-c629df are you able to calibrate the Pupil for the screen attached to the MATLAB computer? I've been trying to do this
@papr Here is the error message from MatLab I got:
PTB-INFO: There are still 44926 textures, offscreen windows or proxy windows open. Screen('CloseAll') will auto-close them. PTB-INFO: This may be fine for studies where you only use a few textures or windows, but a large number of open PTB-INFO: textures or offscreen windows can be an indication that you forgot to dispose no longer needed items PTB-INFO: via a proper call to Screen('Close', [windowOrTextureIndex]); , e.g., at the end of each trial. These PTB-INFO: stale objects linger around and can consume significant memory ressources, causing degraded performance, PTB-INFO: timing trouble (if the system has to resort to disk paging) and ultimately out of memory conditions or PTB-INFO: crashes. Please check your code. (Screen('Close') is a quick way to release all textures and offscreen windows)
WARNING: This session of your experiment was run by you with the setting Screen('Preference', 'SkipSyncTests', 1). WARNING: This means that some internal self-tests and calibrations were skipped. Your stimulus presentation timing WARNING: may have been wrong. This is fine for development and debugging of your experiment, but for running the real WARNING: study, please make sure to set Screen('Preference', 'SkipSyncTests', 0) for maximum accuracy and reliability. WinTabMex shutdown complete.
So it seems that it is the pupil labs that is causing the lagging in Matlab. What might be a good solution to this?
@user-c629df Hi! I've been using the middle man matlab code, but as a workaround I have been using an older version of Pupil Capture (1.8), otherwise i can't get it to work -- are you using current versions of Pupil Capture? Re your PTB issue -- this seems like it's related to the number of textures you have open (44926). You should close the textures that you aren't using anymore.
@user-c72e0c Thanks for your reply! I tried different versions, including 1.8, and the same lagging problem repeats. I will try to solve the texture problem and try again! Thanks!
@user-c629df what version are you currently using?
Hi --- I have a diy question for de-casing the Logitech Web Cam--- How do you open the case housing the lens? In the sample videos one ends with the lens housing exposed but not opened, and for the other one it starts with the lens out of the housing.
Hi there, I am new to using Pupil Labs. Unfortunately, the confidence level of the eye cameras is not very good. I've watched the videos on how to adjust all the cameras and parts, and even though the eye is centered, detection is not very good. Does anyone have some tips? Or has anyone experienced issues with pupil detection due to mascara/eyeliner? Thank you.
Hi everyone, i'm a new user and i'd like to use the pupil core eyetracker with a data projector for an outreach event (covid-19 allowing). Can i output the screen-based markers to the projector screen? Has anyone done this before who wouldn't mind sharing their experiences and dos/don'ts?
Hello, wasn't there the possibility to detect saccades in an earlier version of Pupil Core or Pupil Player?
Hello, I am trying to run the Pupil Core headset in Pupil Capture and I can not get my computer to detect the headset. When it is local USB on start up it says COULD NOT CONNECT TO DEVICE. When I go into the video settings and activate device it says COULD NOT FIND DEFAULT DEVICE. When I use manual detection it shows 2 cameras as unknown, and when chosen they say SELECTED CAMERA IS ALREADY USED OR BLOCKED. I checked and the computer detects the Core headset but Capture does not. Can anyone help me?
@user-c429de this is just a guess, but it sounds like the drivers might not have been installed a) correctly or b) at all?
@user-26fef5 Maybe, I am working on a linux machine. This is not my personal computer so I did not personally download the software
Would you be able to let me know how to check?
Did you build from source or are you using the bundled version?
We just downloaded the standard Pupil Core Software, Capture, Player, and Service
So I believe the bundled version
@user-c429de sounds like it. Unfortunately i can not personally tell you how to fix this, since i am not one of the pupil devs - nevertheless i would simply reinstall the software (if you are on ubuntu, uninstall the package through the software center and reinstall it by opening the .deb package with the before mentioned software center)
Thanks I will try that. Unfortunately I can not install stuff without our administration staff so it will take some time lol
@user-c429de that might be a problem though since the driver installation usually needs superuser permissions
Which I do not have, thanks for your help and input. I will talk with my admins about reinstalling
@user-c429de no worries - i guess one of the pupil devs will address your problem in their office hours and can help out more. Cheers
Cheers
@&288503824266690561 I am trying to run the Pupil Core headset in Pupil Capture and I can not get my computer to detect the headset. I am running linux on this machine. When it is on local USB at start up it says COULD NOT CONNECT TO DEVICE. When I go into the video settings and activate the device it says COULD NOT FIND DEFAULT DEVICE. When I use manual detection it shows 2 cameras as unknown, and when chosen they say SELECTED CAMERA IS ALREADY USED OR BLOCKED. I checked and the computer detects the Core headset but Capture does not. I have tried the command sudo usermod -a -G plugdev $USER but it did not work either. I have reinstalled the software and that did not help, and I do not know what else to try. Any input would be appreciated.
@papr Hi, I want to use the Pupil dilation for one study. I have some recorded trials and I have "diameter_3d" data. May I use these data for pupil diameter?
@user-c429de what Pupil Core hardware are you using? I assume that you are using the most recent version of Pupil Capture, correct? Also, after running sudo usermod -a -G plugdev $USER you will need to log out and log back in for the changes to be applied, IIRC.
@user-8fd8f6 yes, diameter_3d is the diameter of the pupil in millimeters based on the 3d model of the eyeball.
@wrp The eye camera recording frequency is 200 Hz, right? I want to apply a low-pass filter.
@user-8fd8f6 you can change the temporal frequency of Pupil Core eye cameras from within the eye windows. Take a look; it might be set to capture at 120Hz - you can just select 200Hz in the user interface.
@wrp I have some recorded files. How can I know the recorded frequency?
You should be able to calculate the frequency based on the pupil_timestamps in the pupil_positions.csv file. Note that frequency/frame rate is not constant (so you will not get a constant 120Hz or 200Hz).
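As an illustration, the effective rate can be estimated from the timestamp differences; the median interval is used so that occasional dropped frames do not skew the result (this helper is a sketch of the approach, not part of Pupil's tooling):

```python
import statistics

def estimate_sampling_rate(timestamps):
    """Estimate the camera's effective sampling rate in Hz from a
    sequence of pupil_timestamp values (seconds).

    The median inter-sample interval is used because frame drops
    produce occasional large gaps that would bias the mean."""
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return 1.0 / statistics.median(deltas)

# Example with synthetic timestamps spaced 5 ms apart (~200 Hz).
# In practice you would read the pupil_timestamp column of
# pupil_positions.csv instead.
ts = [i * 0.005 for i in range(1000)]
print(round(estimate_sampling_rate(ts)))  # → 200
```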
@wrp Found it! Thank you
@wrp Ok thanks, I will check it out on Monday. Yes I downloaded the programs from Pupil Labs website today.
@user-c429de thanks for the update. Hope that we can help you get up and running on Monday. (Please note that if you don't get a reply, it is likely because members of our team are not online. We have members working both UTC+1 and UTC+7 timezones, which have minimal overlap with North/South America working hours)
@wrp Ok, ya I am in EST so that makes sense.
Dear Pupil Core friends, I received the Pupil Core + Binocular (right eye, eye 0) last week. I tried Pupil Capture / pupil detection. However, it seems that the eye camera (right eye) is configured for the left eye (eye 1, which I do not have). The captured eye image is inverted and the detected gaze is always on the left part of the screen (no matter where I look). Any hints for correcting this? I tried the "flip image display" setting, but it only changes the display, nothing essential. Thank you in advance.
@&288503824266690561 please render some help if possible
hello. so, I was hired by my university to learn and set up the pupil core. i have the software installed and im getting a feed from the cameras, but its not letting me do the calibration. i was wondering if i could get some help with this problem
i'm not sure why one of the cameras is upside down
@user-c2d2ea in order to calibrate your device you require a world video feed (message on the screen) that tracks calibration markers (circles with dots inside) in the world frame and maps the pupil positions to those world points. Without a world video feed you do not need a calibration, since you are not mapping anything anywhere.
I'm just trying to track pupil dilation. Would I still need to do the calibration?
@user-d31aae the camera for the right eye is upside down by design. As you saw, you can flip the visualization if desired. Based on your description it seems like you need to calibrate.
@user-c2d2ea see my note about the right eye cam above. If you do not need gaze data, then no calibration is required. Make sure to adjust the eye cameras (physical adjustment) so that you have a good view of the eye for high-confidence pupil detection.
okay. thanks. i don't really care about where the eyes are moving, i just need eye dilation. I'm working on an experiment that relates stress level / anxiety and eye dilation in my university's psychology department
HTC Vive Binocular Add-on: One of the wires (right eye) that connects to the USB-C component got cut and we soldered it back together. It worked for a couple of weeks but it no longer does. Any good solution to fix this without having to send it back to Pupil Labs? (We don't have surplus money at the moment to ship it there, pay for the fix and have it returned.) If anyone has a diy solution, pls advise, thanks!
@wrp Thanks a lot! I'll get to try it by Monday! Thanks again for the heads up!
@user-c2d2ea shake hands, good luck dude
@user-c2d2ea if all you require is pupillometry data then no calibration is required. However you may want to have the participants look around to extreme ranges of their view in order to build a robust 3d eye model
Hi guys, I'm a little confused about how Offline Surface Tracking works. I understand that this plugin is for when I didn't use an apriltag during the recording, right? Now I'm not able to add a surface. Does someone have a clue?
Hi @user-ce3a3d - Offline Surface Tracking is for post-hoc surface definition and tracking in Pupil Player. Your recording must have markers included in the environment that was recorded. You can only add a surface if markers are present & detected. Was I able to answer your question?
@user-c629df As a follow-up: From the error messages, I am fairly confident that the PTB issue is causing the lagging, not the connection to Pupil. Also, I do not see any Pupil-related error messages. If you want, you can share the full log and I will have a look and check if I can find anything related to Pupil. Also, please consider opening issues in the corresponding repositories if you are using third-party software.
@user-95b9b6 Hey, mascara and eyeliner can indeed lead to detection issues. In the eye window's general settings, you can switch to ROI (region of interest) tracking. Four handles to adjust the ROI will appear in the corners of the window. Adjust the ROI such that the mascara/eyeliner is excluded. This can be difficult though if you have a lot of different viewing angles. Preferably, ask your subjects to not wear mascara/eyeliner during the experiment.
@user-3717e9 Depending on projector and the projection area, the marker detection can become difficult. I would recommend printing the calibration marker (roughly 5cm x 5cm) and using the Single Marker Calibration in manual mode.
@user-632362 Please checkout this video: https://vimeo.com/59844059 It looks like the current documentation does not link them properly. I will fix that now.
Hi, I'm working on a university project concerned with pupil dilation. Everything is set up but I am unsure how to actually measure the dilation of the pupils. Not sure if a specific plugin is needed or if I've merely overlooked something. Any help would be much appreciated.
@user-f29d83 The simplest way to access the dilation data is through the Pupil Player CSV export. See these sections of our documentation:
- https://docs.pupil-labs.com/core/software/pupil-player/#export
- https://docs.pupil-labs.com/core/software/pupil-player/#pupil-positions-csv
Specifically, check the diameter and diameter_3d fields.
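To give an idea of working with that export, here is a hedged sketch of loading those fields (the column names follow the pupil_positions.csv documentation; the sample rows and the 0.6 confidence cutoff are illustrative, not official values):

```python
import csv
import io

# A few fabricated rows in the shape of Pupil Player's
# pupil_positions.csv export (column subset only; values made up).
# In practice: open("exports/000/pupil_positions.csv")
sample = io.StringIO(
    "pupil_timestamp,confidence,diameter,diameter_3d\n"
    "100.000,0.98,42.1,3.1\n"
    "100.005,0.35,40.0,2.9\n"   # low confidence -> discard
    "100.010,0.97,43.0,3.2\n"
)

# Common practice is to keep only samples above a confidence
# threshold before analysing pupil size.
rows = [r for r in csv.DictReader(sample) if float(r["confidence"]) > 0.6]
diameters_mm = [float(r["diameter_3d"]) for r in rows]
print(diameters_mm)  # → [3.1, 3.2]
```

Note that diameter is in image pixels while diameter_3d is in millimeters from the 3d eye model, so diameter_3d is usually the one you want for dilation studies.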
@papr Thanks I'll look into that!
Thanks @papr but we are actually having trouble removing the lens from the Microsoft pupil facing camera. Do we just unscrew that or do I have to desolder the entire housing to get at the IR filter?
Posted before and apologies if someone responded but was having issues with the discord app
Hello, I am trying to run the latest version of pupil player (just downloaded from the site) but it doesn't start up. I am running it on a windows 10 64-bit machine. It works on another laptop with similar specs.
Hi Pupil experts, I am trying to load my recording into Pupil Player but every time I try, the program shuts down. My recording is almost 5 GB and I think that might be the reason, because it works well with smaller recordings. Any idea how to solve this problem?
@papr Thanks! I'll try that out
Having issues with pixelated heat maps. In the pupil player visualization everything is ok, but after exporting I get these pixelated heat maps. Anyone with the same issue? Cheers.
Hello. I'm new to Pupil Labs with the mobile version and I don't know much about technology. I'm trying to connect an A70 cell phone using the Pupil Mobile app, but I can't. I'm using windows 10. I enabled the hotspot network and connected the phone to that network. The IP address of the network was recognized by Pupil Capture but images from the cameras do not appear in the software. What did I do wrong?
Hi, I experience a similar problem as Katsumi (see yesterday's post), but with pupil_capture. Every time I try to open the program it stops working (i.e. does not respond). Also using a windows 10 64-bit laptop. Does anyone know why this happens/how I can solve this problem? Any help is appreciated. Thanks!
@user-76c3e6 @user-66a3ee Could you share your respective log files?
- Capture: Home directory -> pupil_capture_settings -> capture.log
- Player: Home directory -> pupil_player_settings -> player.log
@papr Thanks for your quick response. In the meantime I ran capture again as administrator and that seems to have solved the problem. In case the problem returns, I'll send you the log file. Thanks!
@user-f3048f In Pupil Player v1.17 and later, heatmaps are exported with a minimum size of 2000px. If the heatmap is not of the target size, the exported image is upscaled with the cv2.INTER_NEAREST interpolation mode.
@user-511dcb You should have received a response to your question via email.
@user-772bb2 Insufficient RAM might cause Pupil Player to shut down. Could you share Home directory -> pupil_player_settings -> player.log with us so that I can confirm that it is indeed a RAM issue?
@user-363d4d We have just updated the DIY documentation with all missing video guides. The documentation webpage should update shortly. (see this PR for reference: https://github.com/pupil-labs/pupil-docs/pull/360)
Hi there, I'm trying to set up the software and got the pupil (blue area) and ROI mapped out just fine, at least in my opinion. However, I'm still only getting max 0.2 confidence, and the frame drop to 15 frames (from 30) always happens when I turn up the brightness (like, for example, in the photo). Does that mean my Mac Mini is too weak to handle the requests?
@user-5749a9 Are you using a DIY headset? AFAIK the DIY webcams drop frames if you increase their exposure time.
Yep, DIY! @papr
@user-5749a9 I would recommend moving the eye camera closer to the eye if possible. This way the IR LEDs are closer and the image will be brighter automatically.
Also, the eye is quite small in comparison to the recorded face area in your screenshot.
Understood @papr Gonna try to bring it closer to the eye and test again
Huge thanks so far! Seems like the difference between the pupil and the iris is too small for the program to pick it up with confidence... Back to the desk I guess
Got the confidence up to 0.9 for a minute... then I went to calibration, the program crashed, and I cannot seem to reproduce it... dang it
Could it come down to my IR filter? I used 2 pcs of black exposed film as it says in the BOM
@user-5749a9 Feel free to share an example recording with data@pupil-labs.com We can check for indicators causing the bad confidence.
Hello there. So today when I got my eye tracker out of the box where I keep it, I noticed that one of the wires that goes to one of the eye cameras got disconnected, probably from moving it around or detaching it so it fits in the box. Is this something that can be easily fixed if taken to a repair shop?
Hi! I'm running an experiment using Pupil Core in MATLAB (using Psychophysics Toolbox). We want to use a biofeedback model in which we show participants real-time changes in their pupil size. We want to show a circle/ring during the experiment that enlarges/constricts in response to changes in pupil size e.g. a change from 3mm to 5mm in pupil size would enlarge the circle/ring visually on the screen during the experiment.
To do so, we'd need data of pupil size from Pupil Labs feeding into MATLAB in real-time to draw the circle/ring at each moment in time, preferably with as little lag as possible.
Is this possible and if so what would be the approach to take? Thanks
@user-7d0b66 please contact info@pupil-labs.com in this regard.
Thanks @papr but we are actually having trouble removing the lens from the Microsoft pupil facing camera. Do we just unscrew that or do I have to desolder the entire housing to get at the IR filter?
Hi! I am using Pupil Core. I am having difficulty in finding fixations from gaze data. Is there any way that I can get the eye velocity data?
Hey, I am new to the Pupil eye tracker and this might come across as a stupid question, but why is there such a big price difference between the DIY kit and the ready-made kit from Pupil? Am I missing something?
I am having difficulty mapping the normalized coordinates to my areas of interest. For example, the bottom left corner is 0,0 as defined by the pupil core software. I have made an area of interest there and fixated in that region for about 120 seconds. The coordinate should be somewhat close to zero but it is not. Is there some other way? Please advise! Thanks
@user-e52dee The short answer is that a lot of design, engineering, and testing goes into Pupil Core headsets.
I hope that I have been able to respond to your questions.
@user-76416d are you trying to get fixation data in real-time or post-hoc? Did you define a surface as described in: https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking ? Please provide more information.
Thanks for the response. Actually, I got the fixation data from the fixation filter plugin. I was not able to understand surface tracking. I am struggling with the coordinate system, as I have to do Area of Interest based analysis. I have done screen calibration. According to the pupil docs, the normalized pos_x and pos_y have their origin at the bottom left corner of the screen. Hence, when I fixate on the bottom left corner of the screen the coordinate should be near zero, but it is not. What is the problem?
@user-76416d If you just subscribe to the fixation data, you will receive it in scene-normalised coordinates. What you are interested in are surface-normalised coordinates. Check out this script as an example of how to receive gaze in surface-normalised coordinates: https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_gaze_on_surface.py#L35 You can change gaze_on_surfaces to fixations_on_surfaces to access the surface-normalised fixation data.
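For reference, a condensed sketch of that subscription pattern (assumes pyzmq and msgpack are installed and Pupil Remote is running on its default port 50020; the `surface_gaze_points` helper name is my own):

```python
def surface_gaze_points(payload, key="gaze_on_surfaces"):
    """Extract (x, y) surface-normalised positions from one
    'surfaces.<name>' message payload. Use key='fixations_on_surfaces'
    to get fixations instead of gaze. Only on-surface samples kept."""
    return [tuple(d["norm_pos"]) for d in payload.get(key, []) if d.get("on_surf")]

def main(addr="127.0.0.1", req_port="50020"):
    # Imported here so the helper above stays dependency-free.
    import zmq       # pip install pyzmq
    import msgpack   # pip install msgpack
    ctx = zmq.Context()
    # Ask Pupil Remote for the SUB port, then subscribe to surface data.
    req = ctx.socket(zmq.REQ)
    req.connect(f"tcp://{addr}:{req_port}")
    req.send_string("SUB_PORT")
    sub_port = req.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{addr}:{sub_port}")
    sub.subscribe("surfaces.")  # all surface-mapped data
    while True:
        topic, msg = sub.recv_multipart()
        print(topic, surface_gaze_points(msgpack.unpackb(msg)))

if __name__ == "__main__":
    main()
```

As noted in the thread, this real-time route is only needed if you cannot use Pupil Player's offline surface export.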
That is not clear to me. Will you please help me with a step wise procedure to do this? I will be very thankful to you.
@user-76416d How are you currently accessing the fixation (or gaze) data?
I am doing analysis offline. I got fixation data by enabling fixation filter plugin in pupil player.
@user-76416d Ah ok, then ignore my previous comment regarding the script. It is used to access the data in real time. This is not necessary for you. Did I understand correctly that you have enabled the surface tracker plugin and setup surfaces?
If yes, please check your export folder. It should contain a subfolder named surfaces. It includes the surface-normalised data, specifically fixations_on_surface<surface name>.csv
I have enabled the surface tracker plugin but when I click on "add surfaces" it shows "no surfaces to add".
Yeah, I have checked the surfaces folder; it has csv files but no data
@user-76416d ok. Does your recording include apriltag markers as described here? https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking
No.
Ok, you will need to add these markers around your AOI, or else the software will not be able to track the AOI.
How to add this?
The simplest way is to print them on a sheet of paper and glue them. Checkout our example surface tracker recording to get an idea: https://drive.google.com/file/d/1nLbsrD0p5pEqQqa3V5J_lCmrGC1z4dsx/view?usp=sharing
Thanks a lot. I will bother you again if I run into difficulty.
hello again. so for some reason the pupil software seems to have stopped working entirely
all i get is this when i open it
@user-c2d2ea is it possible that the shortcut is broken? Could you try starting the application from the original pupil_capture.exe file?
ill try
i launched directly from the original file and still nothing
i have this
does it matter where you download the file to?
@user-c2d2ea It actually does. Could you please delete the empty space in the psych department folder name? Or create a new folder, e.g. C:\pupillabs\, where you can extract the applications to
this is box man. im just on the school's machine. like this, right?
@user-46acc9 for example yes
If this did not change anything, please consider removing the Home directory -> pupil_capture_settings -> user_settings_* files after closing Capture, and restarting the application after deleting the files.
@papr Where will I find the offline surface tracking plugin in pupil player? There is only a surface tracking option in pupil player.
@user-76416d That's it!
Online Surface Tracker -> Surface Tracker in Pupil Capture
Offline Surface Tracker -> Surface Tracker in Pupil Player
so what is pupil service?
@user-c2d2ea Were you able to run Pupil Capture? Pupil Service is similar to Pupil Capture but with lower latency gaze mapping and a limited feature set.
no. still working on it. im trying to see if i can get it to run on my laptop. the issue is that i had it running before, then windows did a big update, and now nothing works.
it runs on my laptop but the cameras wont connect
@user-c2d2ea This is a separate issue. Windows resets all drivers after each Windows update.
Have you tried running Capture with administrator rights?
its doing the same thing on the school machine. no i haven't yet. ill do that
still nothing
not sure if this helps
@user-c2d2ea please try following steps 1-7 from these instructions: https://github.com/pupil-labs/pyuvc/blob/master/WINDOWS_USER.md
thanks
Hi, my research lab has recently purchased Pupil Core. I have a few questions: I'm looking to use the nslr (Eye Movement Detector); may I have a brief walkthrough on how to get it working and how to get the results from it? I'm also looking to have it run in real time (online). https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/eye_movement/eye_movement_detector_real_time.py From the code, I guess it's a plugin. Could I clarify if it's for Pupil Core or Pupil Player? If I were to use the offline version, https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/eye_movement/eye_movement_detector_offline.py, how do I get it working? What kind of input data does it use? Could I use the pupil and gaze data collected from Pupil Labs LSL (Gaze meta data format) or do I need to use Pupil Capture (to record) and then Pupil Player (to export)? Thanks
Hi, I also have questions on the other eye movement detectors and nslr: Under shared modules https://github.com/pupil-labs/pupil/tree/master/pupil_src/shared_modules there are two files, fixation_detector.py (dispersion method) and saccade_detector.py (looks like a dispersion method too). There is also the nslr eye movement detector, which can classify fixations, saccades, pursuits and PSOs. What are the differences in terms of output? Would I still get detailed information about each fixation, saccade, pursuit and PSO (in terms of duration, number of occurrences, etc.) if I use the nslr? What kind of output will saccade_detector.py give? Thanks
@user-0ff08d Currently, only the fixation detector is officially enabled. The saccade detector is only a placeholder at the moment. The NSLR-based detector is currently disabled due to multiple reports of it not being accurate.
Hello, another question (sorry I have a lot): I did a test recording using Pupil Capture, then used Pupil Player to export csv files. However, it seems that blink.pldata is not exported as csv. Is there something I'm missing here? Would I need to write a script to export blink.pldata to csv? Thanks again
@user-0ff08d There is the offline blink detector which runs the blink detection and exports the blink csv files. The data from blink.pldata is not used.
@papr Thanks for the answers. Is the offline blink detector run using Pupil Player?
@user-0ff08d Yes. As a rule-of-thumb: All plugins with the "Offline" prefix are Player plugins. The "Online" prefix indicates Capture plugins.
@papr Thanks!
Hey I posted this last Friday but I think it might have gotten buried. It would be great if I could get some help on it!
Hi! I'm running an experiment using Pupil Core in MATLAB (using Psychophysics Toolbox). We want to use a biofeedback model in which we show participants real-time changes in their pupil size. We want to show a circle/ring during the experiment that enlarges/constricts in response to changes in pupil size e.g. a change from 3mm to 5mm in pupil size would enlarge the circle/ring visually on the screen during the experiment.
To do so, we'd need data of pupil size from Pupil Labs feeding into MATLAB in real-time to draw the circle/ring at each moment in time, preferably with as little lag as possible.
Is this possible, and if so, what would be the approach to take? Thanks 🙂
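For anyone attempting something similar, here is a minimal Python sketch of the real-time route via Pupil Capture's Network API (Pupil Remote listens on port 50020 by default). The `diameter_to_radius_px` helper and its `px_per_mm` scale are made up for illustration and would need tuning for your display; MATLAB can do the equivalent via the pupil-helpers bindings:

```python
import zmq
import msgpack


def diameter_to_radius_px(diameter_mm, px_per_mm=40.0):
    # Hypothetical mapping from pupil diameter (mm) to an on-screen
    # circle radius (px); tune px_per_mm for your biofeedback display.
    return diameter_mm * px_per_mm


ctx = zmq.Context()
# Pupil Remote (REQ/REP) runs on port 50020 by default.
remote = ctx.socket(zmq.REQ)
remote.setsockopt(zmq.RCVTIMEO, 1000)  # ms; avoid blocking forever
remote.setsockopt(zmq.LINGER, 0)
remote.connect("tcp://127.0.0.1:50020")

try:
    remote.send_string("SUB_PORT")  # ask Capture for its subscription port
    sub_port = remote.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.setsockopt(zmq.RCVTIMEO, 1000)
    sub.connect(f"tcp://127.0.0.1:{sub_port}")
    sub.subscribe("pupil.")  # pupil datums from both eyes
    for _ in range(100):  # demo: read a bounded number of datums
        topic, payload = sub.recv_multipart()
        datum = msgpack.unpackb(payload)
        d = datum.get("diameter_3d")  # mm; requires the 3d detector
        if d:
            print(f"circle radius: {diameter_to_radius_px(d):.1f} px")
except zmq.error.Again:
    print("Pupil Capture not reachable on localhost:50020")
```

With the timeout set, the script exits gracefully instead of hanging when no Capture instance is running.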
hello, I have a question regarding gaze data. I saw it is normalized, but I have y data that is bigger than 1. I want to make sure I understand correctly: that data is bad, right? It should be at most 1?
@user-d9a2e5 That means that the gaze estimation algorithm estimated the location to be outside of the field of view of the scene camera. Please be aware that this happens often for low-confidence data. In any case, I would discard this data, as you are not able to interpret it in terms of a scene image location.
thanks! I have another question: how am I supposed to use real-time analysis? Is Capture the one that analyzes the gaze/pupil size, with Player exporting it to an Excel file? Or is Player the one that does the analysis?
@user-d9a2e5 Nearly all analysis can either be done in realtime in Capture (access to results via network api) or offline in Player (access to results via csv exports).
is there a video or some other way I can see how to use it in Capture?
@user-d9a2e5 Checkout our developer documentation in this regard: https://docs.pupil-labs.com/developer/core/overview/
thanks for everything! 🙂 good day
Hi, I am trying to get Pupil Core up and running and to be remote controlled by MATLAB on Ubuntu 18.04. I already downloaded the pupil helpers for matlab (https://github.com/pupil-labs/pupil-helpers/tree/master/matlab) but they require a specific compiler version to be installed (gcc 6.3). Until now, I could not find a description of how to install specifically this gcc version that works for me. I was wondering whether anyone else ever had this problem and solved it? Thanks
hi @user-5080b5 ~~which operating system are you using?~~ Just saw that it's Ubuntu, my bad.
@user-5080b5 The default apt repository for ubuntu only offers the latest gcc, but here is a guide on how to install and setup a different version: https://linuxize.com/post/how-to-install-gcc-compiler-on-ubuntu-18-04/ I tried this once and got it working with this guide. Otherwise you can also compile gcc from source, but this might be a bit of a hassle.
Thank you so much, I will try this immediately 🙂
okay, I tried that, but all it did was install the gcc 6 version package, which defaults to the most recent version (6.5), which is why MATLAB still complains about the compiler version :/ Did you do anything beyond the instructions in the guide?
No sorry, I cannot help you with that further, except maybe you want to try building gcc from source. Maybe someone else with Matlab experience will respond though!
Okay thank you very much though!
Hi, can I ask a question about PupilMobile here?
I'm using Pupil Mobile (v1.2.3) and Headset on Android device (SHARP SH-RM12 Android9).
The app terminates frequently. Does compatibility differ by device? Is there a solution for stable operation?
Thanks.
hello
I have some questions regarding the pupil core headset. I am planning to buy the right arm with the camera attached. Do we have just a wire connection to PC from it?
@user-499cde Are you planning on printing the remaining parts of the headset yourself? Or do you have something equivalent to attach the arm to? The cameras have a specific connector. @user-755e9e Do you remember the name of the connector?
I am planning to 3D print the other parts
Could you give me some insight on how the camera is connected to PC, is it just a 2.0/3.0 USB connection or via something else?
@user-499cde It is USB 2.0 to my knowledge ( @user-755e9e please correct me if I am wrong ), but the physical connector is quite small and is not the typical usb connector.
If I order the right arm, do I get this connector with it? I mean, it's 650 euros, and it's not mentioned anywhere on the website checkout page or the hardware section, so I am curious to know about it.
@user-499cde If you order 1 Pupil Eye Camera through our website, you will also receive a USB to JST cable.
@user-499cde JST is the connector I was talking about.
@user-755e9e Thank you for the clarification 🙂
And as Pablo said, it's a USB 2.0 connection.
Ohk, thank you for that, could you also tell me what is the length of this cable?
The cable we include is 2 meters long.
Thank you for the quick reply!
a quick question, just to confirm: I do some raw data computation with data I captured via ZMQ from the Pupil Core. I am currently mainly interested in the pupil messages. In the pupil messages, diameter_3d has the diameter in real mm, and diameter is the diameter in an abstract projected standard coordinate system or something like that, right?
A quick question: is it just a standard UVC camera? And does it work with OpenCV out-of-the-box?
@user-755e9e @papr ?
@user-499cde I can check the second question when I am back in the office
@papr Thank you. I really appreciate that.
Hello @papr , could you please confirm whether the Pupil Core right eye cam is a standard UVC compliant camera or not? Also, does it work with OpenCV out-of-the-box?
@user-755e9e
@user-499cde Yes, the cameras should be UVC compliant (not sure if they follow the complete standard, though). Is there a specific functionality that I should test regarding OpenCV? Or is opening a VideoCapture sufficient?
@user-499cde I am able to open a cv2.VideoCapture and access the camera's frames on macOS without having to install any specific drivers or similar.
I will be doing the tests on Windows. I hope there won't be any issues with it?
@user-499cde No, I do not think so.
@user-c5fb8b might be able to test it on Windows to be sure
alright, thank you!!
@user-499cde Ok, as it turns out you can use the eye cam with OpenCV on Windows only if you use the default drivers. If you run Pupil Capture, it will install the libusbK drivers, which allow it to set UVC controls on the camera. I am not sure if you can do that with OpenCV.
@user-499cde Sorry, @user-c5fb8b just corrected me. The access to the scene camera works, but the access to the eye camera with the default OpenCV backend does not work.
@user-499cde These are the UVC controls that Capture sets for the 200Hz cameras: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/video_capture/uvc_backend.py#L284-L318
@user-499cde So i played around with this a bit. I am only able to get a good image from the world cam with OpenCV, with this snippet:
import cv2

cap = cv2.VideoCapture(1)
while True:
    success, frame = cap.read()
    if not success:
        break
    cv2.imshow("Frame", frame)
    ESC = 27
    if cv2.waitKey(1) == ESC:
        break
cap.release()
cv2.destroyAllWindows()
Note that I'm using index 1, as 0 is my computer's built-in camera.
For the eye cameras, however, this will throw an error. But I am able to tap into the eye camera stream when choosing a different video backend. By default, OpenCV uses MSMF on my computer, which apparently does not work at all. When using DSHOW, I can get a stream, but it's mostly black, which is probably the result of bad initial UVC settings of the cameras. You can select the backend with an index offset:
cap = cv2.VideoCapture(1 + cv2.CAP_DSHOW)
See available backends here: https://docs.opencv.org/3.4/d4/d15/group__videoio__flags__base.html#ga023786be1ee68a9105bf2e48c700294d
As @papr also said, this won't work when the custom drivers from Pupil Capture are installed. You can, however, remove them: follow steps 1-5 of this guide: https://docs.pupil-labs.com/core/software/pupil-capture/#windows When you start Pupil Capture the next time, however, the custom drivers will be re-installed.
I want to grab frames from the eye camera to use it in Libretracker.ext
@user-499cde well, you can always just use pyuvc as we do in Pupil Capture. It's a bit more complicated than OpenCV though, and not very well documented. You will have to look at the Pupil Capture source code to figure out how we do it, I guess. Or, as a different solution, you could have Pupil Capture running in parallel and subscribe to its messaging API to access the frames.
@user-499cde I do not know what Libretracker.ext is. I assume that is third-party software. Please understand that the eye camera is a special-purpose camera and that we can only support its usage with Pupil Capture. We cannot give you guarantees that it will work with other software.
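For reference, the subscribe-to-Capture route mentioned above can be sketched in Python roughly like this. It assumes Capture is running with Pupil Remote on its default port 50020 and uses the Frame Publisher plugin to stream frames over the network; the payload fields follow the pupil-helpers examples, so treat this as a sketch rather than a guaranteed recipe:

```python
import zmq
import msgpack
import numpy as np

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.setsockopt(zmq.RCVTIMEO, 1000)  # ms; exit gracefully if no Capture
remote.setsockopt(zmq.LINGER, 0)
remote.connect("tcp://127.0.0.1:50020")

try:
    # Ask Capture to start streaming video frames (Frame Publisher plugin).
    notification = {"subject": "start_plugin",
                    "name": "Frame_Publisher",
                    "args": {"format": "bgr"}}
    remote.send_string("notify.start_plugin", flags=zmq.SNDMORE)
    remote.send(msgpack.dumps(notification))
    remote.recv_string()  # acknowledgement

    remote.send_string("SUB_PORT")
    sub_port = remote.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.setsockopt(zmq.RCVTIMEO, 1000)
    sub.connect(f"tcp://127.0.0.1:{sub_port}")
    sub.subscribe("frame.eye.0")  # one eye camera's frame stream

    topic, payload, raw = sub.recv_multipart()
    meta = msgpack.unpackb(payload)
    # With format "bgr" the third message part is a raw BGR pixel buffer.
    img = np.frombuffer(raw, dtype=np.uint8).reshape(
        meta["height"], meta["width"], 3)
    print("got eye frame:", img.shape)
except zmq.error.Again:
    print("Pupil Capture not reachable on localhost:50020")
```

This avoids driver conflicts entirely, since Capture keeps exclusive access to the camera and the frames arrive over ZMQ.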
Yes, I understand. I wanted to learn if the eye cam can grab frames using OpenCV. Thank You for the information though.
I'm still trying to figure out how to get better tracking confidence (currently sitting at around .2-.3 max). Can somebody help? What is my picture setup or setting missing?
here's the algorithm view
@papr I saw that Pupil Labs shared some user's work where they integrate Tensor Flow and OpenCV into the pupil labs core. So that Pupil Labs could realize real time object recognition. I'm wondering if there is any step-by-step tutorials on how this could be done?
@user-5749a9 I think the issue is not the 2d confidence but the 3d confidence. After starting Capture, roll your eyes until the green outline fits your eyeball.
@user-5749a9 alternatively, change the detection and mapping mode in the general settings to 2d, to verify my guess that the 2d confidence is already sufficiently high.
@user-c629df I think I know the project. Let me look up the link to their repository. Do you still have the link to the article? This would help verifying that we were talking about the same project.
@papr huge thanks again! will try π
Can confirm I'm getting 0.9 confidence when in 2D mode. Also, I can get as far as getting the green circle around the eyeball, but when looking at the debugging model, the red dot is jittering quite a lot, thus no confidence at all. Maybe the angle between camera and eye isn't optimal and I need to modify the hardware side.
Hello,
I am getting following eye camera feed where it cannot detect the pupil. Greatly appreciate if someone could help.
Mine is a 200Hz Binocular PupilLabs core and it does not allow me to change the focus either.
@user-346f9b hey, your left eye camera seems to be obstructed by a smudge. Maybe due to touching it by accident. Please try to clean it carefully with lens cloth.
Also, please try to set the ROI (see eye general settings) to not include the dark areas in the outer top corners. The algorithm looks for dark area and currently the corners are too dominant.
Thank you very much @papr ! But the lens is too small to be cleaned or touched. I will try to clean it.
Thanks for the insight on ROI as well
@user-c629df I think I know the project. Let me look up the link to their repository. Do you still have the link to the article? This would help verifying that we were talking about the same project. @papr Hi Thanks for your reply! Here is the link to what I was referring to: https://pupil-labs.com/news/object_fixation/
Hi, is it possible to use the LifeCam HD-3000 instead of the HD-6000 for the eye cam? Does someone have experience with that one? Thank you in advance
Hi all, I'm trying to understand a strange behaviour when analyzing my recordings. In short, I get reasonable results for horizontal scans, but vertical scans are definitely wrong (see pictures attached). I'm wondering if this could be due to the distance from the screen (about a meter), and (if I'm right) how I could correct the possible effect of a smaller visual angle.
Thank you.
@user-aaa87b It looks to me as if your calibration just performs better horizontally than vertically. May I suggest using the single marker calibration (assuming that you are using the screen marker calibration)? With this calibration, you have full control over the calibration area and therefore the possibility to improve the accuracy in weaker areas.
@papr Hi Papr. Yes, I'm using screen marker calibration with one marker moving around the screen stopping at 9 points (4 corners of the screen, 4 in the middle, 1 at the centre). What exactly do you suggest?
@user-aaa87b The procedure is described here: https://docs.pupil-labs.com/core/software/pupil-capture/#calibration-methods
Thanks a lot! I'll check it out and I'll let you know the results.
@papr Ok, I've checked. Unfortunately we're not actually using Screen Marker Calibration but rather the Manual Marker Calibration procedure. The computer we're using for stimuli administration is not the same machine we're using for running Pupil. Suggestions?
"The computer we're using for stimuli administration is not the same machine we're using for running Pupil." Could you elaborate on that?
Pupil runs on one machine and stimuli are administered via a second computer running Psychopy
@user-aaa87b ok, no problem. Just set the single marker calibration mode to "manual". It works similarly to the Manual Marker calibration, with the difference that it is sufficient to show a single marker and perform the head movement.
Ok, I'll give it a try and let you know. Actually the calibration procedure we're using right now is based on the Manual Marker Calibration from the User Guide. All 9 markers are shown on screen in order starting from screen centre, and the last one is the stop marker in order to automatically stop calibration.
Hi Pupil-labs; we're setting up a public-private project and are interested to use Pupil Core + VR as a platform for our project. Is there someone that I could ask a couple of questions with regard to IP to get the investor on board too? Thank you
@user-2fface Please contact info@pupil-labs.com in this regard.
Ok - will do that, thank you.
@papr I'm afraid we've gone from bad to worse with Single Marker Calibration
Horizontal
Hi team, is it possible to use Pupil Core to track eyes while wearing a VR HMD? Thank you
@user-4d2f62 Pupil Core is not easily compatible with VR HMDs. We offer dedicated add-ons for some VR HMDs though. More info on that can be found here: https://pupil-labs.com/products/vr-ar/
@user-aaa87b If you want you can share recordings of both calibration choreography procedures with data@pupil-labs.com s.t. we can give you more specific feedback in regards to that recorded data.
@user-4d2f62 Pupil Core is not easily compatible with VR HMDs. We offer dedicated add-ons for some VR HMDs though. More info on that can be found here: https://pupil-labs.com/products/vr-ar/ @marc Thank you for the reply! But what's the main problem? Does the device need outside light or something? Let's say we have an Oculus device. It is super hard to use the add-ons you've suggested (as I've been told). Why not wear a pair of Pupil Core glasses? Any information would be a huge help. Thank you.
@user-4d2f62 The add-ons we offer are a variation of the Pupil Core hardware that was optimized for use within the respective headset. The main problem with using Pupil Core together with a VR headset is that it is not possible to wear them simultaneously while still getting reasonable eye images. The eye camera placement in the VR add-ons optimizes the camera positioning within the limited space inside of a VR headset, while not occluding the subject's vision.
Using our add-ons should in principle not be very difficult. They can be plugged into the headset easily and after that they can be used similarly to Pupil Core.
@papr Ok, Thank you. Tomorrow I'll prepare a complete experiment for you to evaluate.
@papr Hi Papr. You can download the experiment from here: https://www.dropbox.com/s/tqachvlb0u9y2u7/2020_03_26.tar.xz?dl=0
@user-aaa87b Thank you! We will have a look and come back to you next week. Would you mind sending an email to info@pupil-labs.com such that we can follow up via email?
@papr Ok, of course, I've already sent an email with the link at [email removed] but I'll forward a copy to info@pupil-labs.com too.
@user-aaa87b That is fine, in this case the second email is not necessary
@papr Ok. Thank you.
@here 📣 Announcement 📣 - We have just posted the v1.23 release of Pupil Software. It is available here: https://github.com/pupil-labs/pupil/releases/tag/v1.23
Most notably, it improves the driver installation process on Windows and allows Pupil Player to load multi-part recordings quicker, especially if they contain a lot of parts.
Read more about the changes in our release notes: https://github.com/pupil-labs/pupil/releases/tag/v1.23
Has anyone had an issue with the trim beginning and end markers moving unpredictably? I looked on GitHub and saw no issue raised. I am trying to set begin/end trims for several videos, and when I interact with either marker I can click and move it, but frequently the markers don't respond to clicks or can't be unclicked (i.e. I set a position, release the mouse button, and the marker is still selected and moves with my mouse).
I am using v1.23 and had same problem in v1.22
Windows 10, core-i5, 16GB ram
to clarify using Pupil player
Is there an alternate way to just type in the beginning and end times? I have the list of timecodes, and doing this via mouse with this problem makes it very difficult, as it messes up almost every other action.
@user-1b50f9 you can set the trim marks via text input in Player's general settings
@papr ok thanks, I missed seeing that option! Have you heard of this mouse issue? Should I file an issue on GitHub?
hello, I need help configuring my pupil eye-tracker. It's giving me a bunch of errors.
@user-7d4a32 what type of errors do you see?
@papr Thank you for the quick reply. A moment, I'll paste the error.
For the record, I am using Windows 10 64-bit x86.
This is the error in the eye-tracker
This is the error in the prompt.
@user-7d4a32 which version of Pupil Capture are you running?
how do I check?
It should say at the start of the logs
1.23.0
@user-7d4a32 ok, thank you. This is the most recent version of Pupil Capture 🙂 Would you mind sharing the Home directory -> pupil_capture_settings -> capture.log file with us?
@papr Here you go.
@user-7d4a32 Thank you for sharing the logs. Would you mind testing Pupil Capture v1.22 and checking if the driver installation is successful with this version. You can download it here: https://github.com/pupil-labs/pupil/releases/tag/v1.22#user-content-downloads
You do not need to delete the v1.23 folder in order to run v1.22. 🙂
@papr Thank you, a moment while I download please.
@papr You're a miracle! It's working better, however I'm a bit worried, there's a bunch of NaNs being printed. Is this OK?
@user-7d4a32 yes, this is ok. That's the output of the 3d eye model. You can ignore it. On Monday, I will look into your issue with v1.23. Would you be available via email in case there are further questions regarding your setup? If so, could you please send an email to [email removed] with the above screenshots? This would help us to keep track of the issue.
It's ok, I don't need v1.23, as long as this works. I'm a software developer however, and I'd like to view the eye tracker's output in Python, is there a link explaining how? Thanks.
@user-7d4a32 Ok. We would like to come back to you when we have a possible fix in order to check if it works as we were not able to reproduce the issue during our own development.
"view the eye tracker's output in Python" Check out our network API here: https://docs.pupil-labs.com/developer/core/network-api/
@papr That's no problem, I'll be available for you, I will send an email.
Thank you for all your help!
@user-7d4a32 Thank you very much!
Hi there, quick question. Is there a way to delete an annotation inside of Pupil Player, i.e. without opening the pldata file elsewhere?
hello, I need help with gaze. Do you have any suggestions for calibration? Also, I want to overlay gaze data on my picture; do you suggest making my own calibration too?
Hi there, I am using Pupil Core installed on a HoloLens. Unfortunately, as I try to adjust the eye cameras, it is impossible to get a better view of my eyes, and that leads to low confidence in pupil detection. I attached my eye view and am asking for any suggestions.
@user-d9a2e5 I'm not sure I understand your question. Please could you give a short example of what you are trying to do so that we can try to help out.
@user-64e880 Based on the screenshots you have provided, I see that you have already set the ROI. Have you made any changes to the pupil detection algorithm (e.g. min/max pupil diameter)? The screenshots you show - qualitatively - look like decent pupil detection. However, if possible it would be great if you could capture the eye from a bit higher up (I suppose that the HoloLens lenses are blocking camera movement/position - correct?). What confidence do you see for images like the above?
hello. if I don't see "libusbK usb device" in Device Manager, would that explain the error 'Could not connect to device! No images will be supplied'?
I see this error, and no images are shown. What should I do?
now I see drivers under 'libusbK usb device', but no drivers under 'imaging device' and 'camera'. Is this the reason that I get a gray image or a white image (showing a background compensation error)??
Hi, I have just started using Pupil Core. I have a question about "gaze_point_3d_x" and "gaze_point_3d_y" in gaze_positions.csv. When using a 100-degree or 60-degree world camera, what is the number of horizontal and vertical pixels on the entire screen? Example: upper left of screen (0,0), lower right of screen (640, 480). Please let me know if you know.
@user-20b83c We have received multiple reports of this issue when running Pupil Capture v1.23 and are currently investigating it. 🙂
@user-20b83c
"now i see drivers in 'libusbk usb device', but no drivers in 'imaging device' and 'camera'. is this a reason that i get gray image or white" This sounds as if the drivers were installed correctly 🤔
@user-f79f09 The gaze_point_3d is not in pixel coordinates but in camera coordinates. You need to know the camera intrinsics (camera matrix + lens distortion) to transform one into the other. If you are looking for pixel coordinates, use the norm_pos and transform it to pixels. See this section of our documentation for more information: https://docs.pupil-labs.com/core/terminology/#coordinate-system
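That norm_pos transform is simple enough to sketch in Python (the function name is mine; per the linked docs, norm_pos has its origin at the bottom left with values in 0..1, while image pixel coordinates have their origin at the top left):

```python
def norm_to_pixels(norm_pos, frame_size):
    """Map Pupil's normalized coordinates (origin bottom-left, 0..1)
    to image pixel coordinates (origin top-left)."""
    x_norm, y_norm = norm_pos
    width, height = frame_size
    # Scale to the frame size and flip the y axis.
    return x_norm * width, (1.0 - y_norm) * height


# e.g. gaze at the center of a 1280x720 world frame:
print(norm_to_pixels((0.5, 0.5), (1280, 720)))  # -> (640.0, 360.0)
```

Note that norm_pos can fall outside 0..1 when gaze is estimated outside the scene camera's field of view, so the resulting pixel coordinates can be off-image.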
Thank you very much. I will check the URL indicated.
this is the image when I run pupil_capture.exe as administrator :( .. anything I can do?
@user-20b83c Do the eye cameras work?
no.. it shows gray image
you want the picture?
No, thank you. Could you share the capture.log file?
im sorry, but where can i get that capture.log file?
Home directory -> pupil_capture_settings -> capture.log
here.. thanks
Yes, this looks like the reported issues that I mentioned earlier. Please use Pupil Capture v1.22 until we have fixed the issue.
okay, I'm trying v1.22. thanks a lot!
I want this to be the last one.. I got the world image, but no image in the eye tracker. What can I do?
maybe this is the reason
First step would be to "Restart with default settings" from the general menu.
thanks.. really.. 🙂
Did it work? To clarify: we need the Pupil Cams to be in the libusbK USB Devices section. Otherwise it will not work.
yes its working
@user-20b83c @user-7d4a32 We have just released Pupil v1.23-4, which includes the fix for the driver installation issue that you experienced previously.
Please give it a try and let us know if the issue was resolved for you.
is it on official pupil labs page? or github?
thanks, I will check tomorrow and tell you 🙂
@wrp Thanks for your response. The confidence is very sensitive to my eye movements. Sometimes it is around 95% and sometimes it declines to about 20%; it seems to be fluctuating very fast. And sometimes the red dot and circle on the eyes blink very fast.
it works, thanks!
@user-64e880 have you tried decreasing the model sensitivity?
@wrp No I did not do anything about sensitivity. Could you please share me documentation about that?
@user-64e880 typically the default parameters of the pupil detector are recommended. However, in some cases (maybe like yours) you might want to try reducing model sensitivity for the 3d model (see screenshot). You will need to experiment with the values.
now I have a problem in Pupil Player.. it worked an hour ago, but it's not working now. There is no response. thanks
i think this is the log file for player. thanks..
and it works with v1.22..
@user-20b83c please delete the user_settings files in the pupil_player_settings folder and try again.
hi everyone. I'm just revisiting an old question about Pupil Player's offline fixation detection parameters. I'm curious about what values people use. I've read a couple of different papers which suggest/use different settings. I'm using 2.71°, 50ms (min) and 1500ms (max). What do other people use, if they don't mind sharing?
@wrp I meant: 1) When using gaze calibration, can you tell me if there are any ways to make the calibration more effective, or is the only way to improve it by moving the eye cameras? 2) I want to use gaze in my own code (for example on a photo), so I tried to do a "calibration" while recording and subtracted the mean in my code. Can you tell me if I'm doing this right, or is there another way to use gaze data in separate code?