@user-20faa1 OnePlus devices have proven to be robust for Pupil Mobile use. We also used to recommend the Moto Z3, but that device is becoming harder to find. (Update: the Z4 is coming out soon and may be worth checking out)
Hi! I am developing an eye tracking demo with Pupil eye trackers. I need to transform the sphere_center data from the eye camera frame into my world frame. Can you specify the three axis directions of the eye pinhole camera? Thanks!
@user-16ad22 Check out the eye_center0/1_3d
fields in the gaze data. They are the sphere_centers in world coordinates
@papr Yes, but can I use circle_3d_center or sphere_3d_center? I am not using the world camera, so I need to know the definition of the eye camera coordinates.
@user-16ad22 Check out the linked image in https://github.com/pupil-labs/pupil/issues/1506
The same applies to the eye cameras.
Very basic question, but how do i run this program?
I downloaded it, but all i get is a bunch of maps
@here
@user-bc3dc3 On which operating system are you trying to run Pupil?
@user-bc3dc3 And did you download the software from here or from a different place? https://github.com/pupil-labs/pupil/releases
Cheers but i found another pc and got it working
Hello - I am getting "nan" for some dispersion values - what does that mean?
Hello everyone! ;) How should I calculate how many fixations were NOT on any defined surface?
Is it a good idea to check in fixations.csv which fixations are on defined surfaces (i.e. are in the files with the prefix fixations_on_surface_<surface_name>), and treat those which are not in the fixations_on_surface... files as "not_on_any_surface"?
Hello! I'm new to using Pupil Labs and I have a few questions.
Thank you!
@user-a6cc45 sounds correct
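For anyone wanting to script that set-difference approach, here is a minimal sketch using only the standard library. The `id` column name and file layout are assumptions and may differ between Pupil export versions:

```python
import csv

def fixation_ids(csv_path, id_column="id"):
    """Collect the fixation ids listed in one exported CSV file."""
    with open(csv_path, newline="") as f:
        return {row[id_column] for row in csv.DictReader(f)}

def not_on_any_surface(fixations_csv, surface_csvs, id_column="id"):
    """Ids present in fixations.csv but absent from every
    fixations_on_surface_<name>.csv export."""
    on_any = set()
    for path in surface_csvs:
        on_any |= fixation_ids(path, id_column)
    return fixation_ids(fixations_csv, id_column) - on_any
```

The count of fixations not on any surface is then simply `len(not_on_any_surface(...))`.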
@user-9c99b6 1. You can select a calibration range from which calibration markers will be selected. 2. Edited calibration markers should be saved automatically. Please let me know if you were not referring to calibration markers in this case.
@user-2798d6 this is a bug which I have not looked into since we are about to replace the current fixation detector with a third party implementation.
Thank you @papr
Also, if I want to do calibration on subsets of the video, so I want to trim it and do calibration on different sets, how do I do that?
@papr - will I be able to go back and re-detect fixations from previous video with the new detector? Do you have an eta on that?
Hello, I'm trying to run the pupil player on an OS X El Capitan 10.11.6 and it just does not open. What can I do to resolve this?
Hi, short question:
What is the best way to use the existing Pupil software for Linux in a headless environment? We plan to connect the Pupil to a Raspberry Pi and forward all generated data via zmq. Is there an existing solution for this?
@user-9c99b6 we have a tutorial series on YouTube on that. I am currently on mobile so it is difficult to link. I can do that later when I am back at the computer
@user-2798d6 yes, of course. Eta ~2 weeks
@user-741ae5 not even the gray window opens? Can you share the player.log file in the pupil_player_settings folder?
@user-8e47a4 there is no official completely headless version. I have heard that there are a few people who made it work, but I do not know any by name off the top of my head
@papr I can not install it
@papr I couldn't find any when looking for it. I'll try to get to work on it tomorrow. If it turns out nicely I'll submit a pull request with an added headless and mqtt flag
@user-8e47a4 sounds good!
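For anyone else attempting something similar: the usual starting point is Pupil's network API. You ask Pupil Remote (default port 50020) for the SUB port and subscribe over ZMQ; the data arrives as msgpack-encoded dicts. A minimal sketch, assuming a Capture instance is running with Pupil Remote enabled (host, port, and topic filter below are defaults/assumptions):

```python
import zmq
import msgpack

def decode_payload(payload):
    """Pupil data/notifications are msgpack-encoded dicts."""
    return msgpack.unpackb(payload, raw=False)

def endpoint(host, port):
    return f"tcp://{host}:{port}"

def open_subscriber(host="127.0.0.1", remote_port=50020, topics=("gaze.",)):
    ctx = zmq.Context.instance()
    # Ask Pupil Remote for the port of the data PUB socket
    req = ctx.socket(zmq.REQ)
    req.connect(endpoint(host, remote_port))
    req.send_string("SUB_PORT")
    sub_port = req.recv_string()
    # Subscribe to the requested topic prefixes
    sub = ctx.socket(zmq.SUB)
    sub.connect(endpoint(host, sub_port))
    for topic in topics:
        sub.setsockopt_string(zmq.SUBSCRIBE, topic)
    return sub

if __name__ == "__main__":
    sub = open_subscriber()
    while True:
        topic, payload = sub.recv_multipart()[:2]
        print(topic.decode(), decode_payload(payload))
```

From there, forwarding the decoded dicts over another transport (e.g. MQTT) is just a matter of re-serializing them in the receive loop.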
@user-741ae5 on which operating system are you?
Ah macOS, I see. You can install it by opening the dmg file and copying the application into the Applications folder
@papr still would happily accept links to previous works tho ^^
@user-8e47a4 will keep you posted if I come across them
@papr awesome, thanks!
@papr yes, I've already copied it to the Applications folder
@user-741ae5 then it is installed. The next step is opening the application. Does a gray window appear?
@papr this message appears: Pupil Player terminated unexpectedly. Click Reopen to open the application again. Click to see more information and click Apple.
@user-741ae5 I see. What CPU do you have?
If you do not know the model number by heart, you can look it up in the About This Mac menu, which you can reach by clicking on the Apple symbol in the top left.
@papr
@user-741ae5 yeah, that is what I suspected. The bundle only works on the Core i series. You will have to run from source. See the documentation for the required dependencies and their installation.
Thanks @papr! By chance, do you know if the new fixation detector will correct current issues of not detecting some fixations or not detecting full fixations? I've been having some issues (I sent an email) with not all fixations being detected and with some fixations taking several frames to be detected so the duration is not totally accurate.
@user-2798d6 the new approach is based on the NSLR algorithm, which segments the complete gaze series and then classifies each segment as fixation, smooth pursuit, saccade, or post-saccadic oscillation. Therefore you will get a classification for each gaze datum.
@papr I have this CPU too, but on it the application does not open and also does not display any error. It should work, right?
@papr the os is High Sierra 10.13.6
@user-741ae5 that looks better indeed. Could you please share the player.log file in the pupil_player_settings folder, if it is there now?
@papr Sorry for my ignorance, but where do I find this folder?
It should be in your home folder
@papr I figured I'd find it in this folder, but I could not find it.
@user-741ae5 OK. Without any log messages it is difficult to judge what is going wrong. Please try installing the source dependencies and run from source.
@user-741ae5 https://docs.pupil-labs.com/#macos-dependencies
@user-9c99b6 https://www.youtube.com/watch?v=_Jnxi1OMMTc&list=PLi20Yl1k_57rlznaEfrXyqiF0sUtZMMLh
Hello! I am doing small research using the Pupil software and eye tracker. My supervisor told me to learn about marking up a paper page for the eye tracker. She pointed me to the Pupil Labs website; however, I haven't found anything on this matter there. I would be very happy for any links and advice. In particular, I have to mark up a paper page with survey questions for my experiment.
The survey is in the Word document
@papr ok, I will try this, thanks for the attention
@papr I followed the steps in the link you mentioned and everything went well, but the .bat files still complain that no application is defined to execute them
@user-741ae5 the bat files are for running on windows. You have to start the application from the terminal.
@user-741ae5 I guess I should have given you the link that points to the actual start of the section: https://docs.pupil-labs.com/#installing-dependencies
@papr I have this:
Traceback (most recent call last):
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/launchables/world.py", line 126, in world
    from file_methods import Persistent_Dict
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/shared_modules/file_methods.py", line 27, in <module>
    ), "msgpack out of date, please upgrade to version (0, 5, 6 ) or later."
AssertionError: msgpack out of date, please upgrade to version (0, 5, 6 ) or later.
@user-741ae5 Please run pip3 install msgpack==0.5.6 -U
@papr I did and tried again. Now I have this:
MainProcess - [INFO] os_utils: Disabled idle sleep.
world - [INFO] launchables.world: Application Version: 1.12.274
world - [INFO] launchables.world: System Info: User: marcellaknipl, Platform: Darwin, Machine: iMacdeMarcella, Release: 17.7.0, Version: Darwin Kernel Version 17.7.0: Wed Apr 24 21:17:24 PDT 2019; root:xnu-4570.71.45~1/RELEASE_X86_64
world - [ERROR] launchables.world: Process Capture crashed with trace:
Traceback (most recent call last):
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/launchables/world.py", line 147, in world
    from plugin_manager import Plugin_Manager
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/shared_modules/plugin_manager.py", line 14, in <module>
    from calibration_routines import Calibration_Plugin, Gaze_Mapping_Plugin
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/shared_modules/calibration_routines/__init__.py", line 15, in <module>
    from .fingertip_calibration import Fingertip_Calibration
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/shared_modules/calibration_routines/fingertip_calibration/__init__.py", line 12, in <module>
    from calibration_routines.fingertip_calibration.fingertip_calibration import (
  File "/Users/marcellaknipl/pupil/pupil/pupil_src/shared_modules/calibration_routines/fingertip_calibration/fingertip_calibration.py", line 13, in <module>
    import torch
  File "/usr/local/lib/python3.7/site-packages/torch/__init__.py", line 79, in <module>
    from torch._C import *
ImportError: dlopen(/usr/local/lib/python3.7/site-packages/torch/_C.cpython-37m-darwin.so, 9): Library not loaded: /usr/local/opt/libomp/lib/libomp.dylib
  Referenced from: /usr/local/lib/python3.7/site-packages/torch/lib/libshm.dylib
  Reason: image not found
world - [INFO] launchables.world: Process shutting down.
MainProcess - [INFO] os_utils: Re-enabled idle sleep.
Ok, it looks like there is an issue with your pytorch installation
@user-741ae5 Please be aware that "PyTorch is supported on macOS 10.10 (Yosemite) or above." https://pytorch.org/get-started/locally/ -- Since you are running on old mac hardware, I am assuming that you might run an older macos version as well. Not sure if this is the actual reason.
@papr so I would have to downgrade my os?
@user-741ae5 Which version are you running? If at all, you would need to upgrade, since the minimum is 10.10
High Sierra 10.13.6
Ok, this version is new enough. Could you please retry installing pytorch? pip3 install torch torchvision -U
I am going offline soon. If reinstalling does not help, we will have to continue tomorrow.
@papr I have this:
Requirement already up-to-date: torch in /usr/local/lib/python3.7/site-packages (1.1.0)
Requirement already up-to-date: torchvision in /usr/local/lib/python3.7/site-packages (0.3.0)
Requirement already satisfied, skipping upgrade: numpy in /usr/local/lib/python3.7/site-packages (from torch) (1.16.4)
Requirement already satisfied, skipping upgrade: pillow>=4.1.1 in /usr/local/lib/python3.7/site-packages (from torchvision) (6.0.0)
Requirement already satisfied, skipping upgrade: six in /usr/local/lib/python3.7/site-packages (from torchvision) (1.12.0)
I will try to run the pupil again
Ok @papr , thanks for the attention.
@user-741ae5 Have a nice evening! I hope you can get it running! Let us know if you do
Hi there! I just recorded a session in the pupil mobile app (latest beta version), transferred it to my Windows PC and wanted to play it with the player app (version 1.11 and 1.12). This resulted in the following error (see screenshot) and an immediate shutdown of the player. Any recommendations?
@papr Could you send me the link when you have a chance? I can't find the video you're talking about. Thank you!
@user-9c99b6 I linked it above
@papr Oh, thanks!
Hello everybody, I'm so excited that I'm going to start my experiments. Everything is possible thanks to this amazing community, always ready to answer all my questions and solve all my problems. So I have one last thing to ask: I'm going to use surface markers in my experiment. I'm going to show the subject a picture as a prime, and after that two images that will appear together on screen, repeated 60 times. What's the fastest way to work with 120 surfaces? The experiment will last 20 minutes, more or less. Should I split my recordings? Rec - 10 stimuli - stop. Rec - 10 stimuli - stop. Something like this. Or just one big recording? The problem is that I don't know if my PC has enough power to support more than 10 surfaces at a time.
@papr hi, what are the definitions of theta and phi in the pupil position file?
Hi everyone. @papr @wrp I have the same issue as @user-dd52c0: "IndexError: tuple index out of range". So I cannot open and analyze my recording. I am thus somewhat in trouble now, since I do research with the recordings and my participant will show up in about 2 hours. It seems like I cannot work with the video recording at all
@user-78538a @user-dd52c0 I think this bug has been fixed in https://github.com/pupil-labs/pupil/pull/1510 and will be released in the upcoming v1.13 release.
@user-78538a If I remember correctly, we made an internal prerelease bundle for Windows which I can send you via PM
@papr Thank you!
@papr Thank you very much indeed. Saved my day!
Hello @papr, I could not run it on that CPU; I had to use another CPU for this (attached configurations). But now I have another problem: I have a recording made on Pupil Mobile that, when dragged into the Player window, does not convert the files.
@user-741ae5 can you check if it returns the same error as in @user-dd52c0's screenshot above?
@papr Hi, what are the definitions of theta and phi in the pupil position file?
Hello @papr, apparently @user-dd52c0 and I get the same error. Is there any release forecast for version 1.13?
@user-741ae5 Currently, we plan to release it within the next two weeks
ok @papr, I'll wait for it to try again. Thanks.
Hello, we have some problems with the heatmap and gaze map. We got a 1 KB heatmap .png file from a 3-minute video. Is this video too short to get a normal heatmap? And for the gaze map, we have the offline data and gaze mappings in the folder "offline data". How can we use those files to get the gaze map? Thank you
@papr @user-5d12b0 Hi, I've seen you have been involved in the development of the Pupil LSL Relay plugin. Thank you for the detailed instructions! We have run into an issue with adding the plugin to Capture. We've added it to the folder as specified in the Pupil docs; however, it had no effect. Is it necessary to run Capture from source for it to register the LSL Relay plugin? Thank you for your response
@user-0cf021 check the logs, it will tell you if there was an error while loading the plugin
You should find it next to the plugins folder
@papr Thanks for the fast response! I will try it first thing tomorrow and come back to you
Hi, I would like to ask... let's say on one screen there are multiple areas of interest (e.g., bell, house). How can I find fixation values for each object?
Let's say I decide to find fixation information for objects number 1, 2, and 3... How can I do it? Thanks a lot for your help in advance
Would you like to help me create a gaze plot out of 30 participants, please?
I only found the code for creating a gaze plot for one participant... thanks
Hi, @papr . I want to find the head_pose_tacker_model.csv file. Do you have any Idea?
@user-8fd8f6 it should be in the export folder after activating the plugin and hitting the export button
Have you built a model successfully? You can use the debug window to visualize the built model.
@papr Thank you for your response. Which icon should I activate in the plugin?
@user-8fd8f6 You need to enable the head pose estimation plugin from the plugin manager menu as first step. Have you done this?
BTW, on which operating system are you using Pupil Player?
Windows
@user-8fd8f6 the head pose tracker is not supported on Windows yet
I have a Mac system too, Does it work there?
@user-8fd8f6 yes, it does
Thank you.
@user-8fd8f6 have you seen our video tutorial on YouTube on this topic?
No, Could you please send me the link?
@user-8fd8f6 https://youtu.be/9x9h98tywFI
Hi, I'm looking for a connector of world camera. Does anyone know name of this connector?
I have the Pupil Mobile app on my phone and Pupil Capture on my laptop, have connected the Pupil eye tracker to the phone, and am able to view my eye videos and world video. But can someone help me use the features of Pupil Capture while my headset is connected to the phone? Features like calibration and plugins, as I don't find them in the Pupil Mobile app
@user-755e9e do you know the answer to @user-bea039's question?
@user-df4bad these features are not available there. Either you stream the videos to Capture and proceed as if the cameras were connected directly to Capture, or you make a recording with Pupil Mobile and run calibration etc. after the fact in Player. For the second case I recommend watching our YouTube tutorials.
I guess streaming would be better as
@papr Can I know how to stream them?
@user-df4bad When starting with defaults in Capture, select the Pupil Mobile Manager
in the UVC Manager
menu. Your Pupil Mobile device should show up. Select it and hit the "auto activate" button. You should see the video streams pop up in all three windows shortly after
Please be aware that streaming is mostly meant for monitoring purposes, and you might experience frame drops if the wifi connection is not good enough. If you do not need the data in real time, I recommend going the offline route
Hi @user-bea039 , the connector is called JST-SH1 4 poles. You can find it here https://de.rs-online.com/web/p/leiterplattensteckverbinder-gehause/3531096/?sra=pstk .
@user-755e9e Thank you!
@papr in the capture log, it is importing it, but then it doesn't seem to load (and it cannot be found in the list of plugins in the gui), but there's no error either. Are you seeing anything suspicious in it?
hi, I'm trying to run offline calibration through Pupil Player, but an error repeatedly occurs when I try to perform the offline pupil detection, preventing me from continuing with the procedure. After dropping the file into Pupil Player, two red lines of text appear on the screen:
EYE0/1: Process eye0/1 crashed with trace:
Traceback (most recent call last):
  File "launchables\eye.py", line 339, in eye
  File "shared modul
is there any way to solve it?
@papr
Thanks a lot
It worked
@user-c1220d could you please share the full traceback? It looks like it got cut off in your previous message
actually the error message is that one
@user-c1220d can you check the log file please. It should contain the full error message
I acquired the file using Pupil Remote. May I ask where I can find the log file? It doesn't seem like I have anything like that in my file folder
player.log
should be in the pupil_player_settings
folder in your computer's home folder
thank you
@user-0cf021 maybe it is not in the correct folder. Do you run from source or from bundle?
@user-c1220d could you please shut down Player, delete the user_settings_eye0/1
files next to the log file, and restart capture?
And try again. The full error is:
2019-06-13 11:52:16,427 - eye1 - [ERROR] launchables.eye: Process Eye1 crashed with trace:
Traceback (most recent call last):
File "launchables\eye.py", line 339, in eye
File "shared_modules\glfw.py", line 556, in glfwCreateWindow
Exception: GLFW window failed to create.
thank you very much; apparently before opening a new file in Player I have to delete them from the previous one
it works
The right eye, i.e. eye 0, is inverted. Is that a problem with the camera, or does something need to be changed in the settings?
@user-df4bad The image is upside down purposely because the sensor is flipped physically. The orientation has no effect on pupil detection/gaze estimation. You can leave it that way or flip the image in the Eye 0 window GUI.
Cool then. But during calibration there are a few red-orange lines, which depict the gaze error I assume. Can we calculate the percentage?
@user-df4bad the error is calculated in degrees and can be found in the accuracy visualizer menu
Got it.. Thanks for the prompt reply
I am doing calibration at a distance of around 3 m. Reading the docs, I have figured out that manual marker calibration is the best. But the gaze location isn't accurate, so is there any other calibration method I should choose, or some other change that needs to be made to reduce the error?
@user-50a1c1 check out the accuracy visualizer menu. What does the gaze accuracy field say?
Hi again! Wanted to ask, if there is any possibility in the player to change the eye camera video brightness and contrast of a mobile app recording?
@user-dd52c0 This is not supported within Player. You will have to do this with a video editor before using the video in Player.
@papr thanks for the response. Are there any recommendations on how to use Pupil outside in high-brightness environments (especially with the mobile app)? I have a recording from an a/c cockpit that seems to be unusable, unfortunately...
@papr From bundle
That's what made me think that perhaps it is necessary to run the Capture from source, as everything else seemed to be fine
@user-dd52c0 I do not think there is much you can do to make this video usable. The video is just overexposed.
@user-0cf021 in which folder is the plugin? I would be interested in the full path
Hi. I am using pupil mobile for the first time using the motorola phone provided by pupil labs. In the app, the left eye is appearing right side up; however, the right eye is appearing upside down. We do not see a way to invert the eye image. How do we get the right eye image to be right side up?
@user-c37dfd See Will's answer from above: "The image is upside down purposely because the sensor is flipped physically. The orientation has no effect on pupil detection/gaze estimation."
@papr thanks so much for the quick response!
Hi all, I'm looking for the least resource-intensive way to run pupil-capture on a laptop. We need to run some other programs simultaneously and found that we're low on resources. I intend to play around with settings, but I figured I may as well see if anyone has already gone through that process. Thanks in advance!
@user-6d6c44 turn off the Pupil detection and minimize the windows. That should give you a lot of resources back
@papr Thank you!
Hi there, did someone experienced difficulties with the mobile update from Android 9.0 Pie and running pupil mobile?
@user-14d189 please see the pinned comment re Pupil Mobile on Android v9
Which USB-C cam can you recommend? For better quality than the standard cam
@papr Hi, I used my Mac system for the head pose tracking. It works. But I didn't use the markers when I captured the files. Is there any way for me to track the head position offline (just with the captured files)?
@user-8fd8f6 no, the head pose tracking is strictly based on April tags. You can do the tracking in Player but your world video needs to include the markers.
Ok, thank you. Do you have any documentation about theta and phi? Are they in radians?
Yes they are in radians
But I don't know where their coordinate origin is.
@papr what do you mean by "You can do the tracking in Player"? By checking the video frames?
You can run head pose tracking in real time in Capture, or using a recording offline in Player, as it is demonstrated in the YouTube tutorial
@papr C:\Users\experiment\pupil_capture_settings\plugins\pylsl\__init__.py
@papr Hi, can I decide how many screen markers are used when I calibrate?
Hi, I want to know the unit of the fixation duration. I set the fixation duration between 300 and 1000 milliseconds, but the exported duration values have so many digits. Thanks.
Hi @papr can you help me?
Hello, after doing offline surface tracking, how do I export the data related to the surface?
@user-b13152 There is definitely something wrong with these numbers. These should be floats with a single decimal. We are releasing a new eye movement detector in our next release, v1.13, which is planned for this week. It will replace the old fixation detector and should have fewer problems than the old model
just found the folder.... with the surface info
but can we get coordinates?
I mean coordinates of the surface
@user-888056 there should be a surface positions file
perhaps I need to select something to export them?
Got it!
I hadn't added the surface properly.
You will get one positions file per surface.
So you need at least one surface
Thank you papr. I have it now
thank you very much @papr .. I'll be waiting the new one.
would there be a reason why there are no fixations_on_surface files, although I know there should be?
@user-888056 did you enable the fixation detector?
offline fixation detector?
correct
thanks
hi everyone
i don't know if it's proper to ask here, but I wonder if there is a Pupil add-on for the Oculus Rift S
i would really, really need that
@papr Hi again, we're still having this issue with loading the LSL plugin. We're running Capture from bundle and the plugin is in C:\Users\experiment\pupil_capture_settings\plugins. Do you think running it from source (and putting the plugin in the respective folder) would solve this issue, or do you think there is a different issue? I've attached the log in case you'd like to have a look at it. Thank you for your help!
Hi,
We have been using Pupil Player to process data just fine. But for one recording that we did recently, Pupil Player does not load the world video and only occasionally loads the audio recording. It gave an error that reads something like "player initialization failed: source file could not be found at ...eye1.mp4". We did not record eye1 at all. The individual mp4 files for the audio, world, and eye videos look just fine. Do you have any idea how to fix it?
Hi @user-11498a Could you please share the recording with data@pupil-labs.com ?
The error message might be unrelated to the actual problem
Thanks for the reply. Yes. The recording lasts for about 30 mins. So the files are pretty large. Is there an efficient way for me to share it other than through email?
@user-11498a ok, in this case, please share the player.log
file in the pupil_player_settings
after Player crashed.
Depending on what the logs say, you can do some local tests which might give us more insight into the problem.
Ok. Where can I locate this file? Player didn't actually crash. It just shows a grey screen with the timeline and everything.
aah, ok
Have you used Pupil Mobile to record the data?
no, on a surface laptop through a usb wire
Can you check if the recording includes a world_timestamps.npy
file?
it does
Can you playback the world video file in VLC?
yes, i can
Ok, then I need the player.log file
It is in the pupil_player_settings
folder which you should find in your user folder
Just try searching for it in the file explorer (if you are on windows)
I just checked the length of the three mp4 files. They are different. During recording, we noticed that the world recording had some lag. Could this be the reason? If I trim them to the same number of frames, would that fix it?
No, this is ok. Player should be able to handle this.
As I said, I need the log file in order to better diagnose the problem.
Sorry about the delay. I had to switch computers. Here it is:
Have you opened one of the "unsuccessful" recordings before sending the log file? If not, please do so. The log file is overwritten each time Player is started, and this log file does not show any problems.
here is after trying to load the recordings
ok, this log file does not show a problem either.
I think I will need access to the recording in order to diagnose the exact problem.
ok, i will compress it and email it to you
tex
thx
You can use ffmpeg to compress the videos
After installing ffmpeg, you should be able to run ffmpeg.exe -i <original video path> <compressed video path>
Hi, I have sent the files to [email removed] Please take a look. Thanks.
Quick question: if I use offline calibration and pupil detection, is there a way to export videos from Pupil Player that will also generate the pupil_positions and gaze_positions spreadsheets?
Yes, but these are two different plugins. The world video exporter and the raw data exporter.
Hi, so we are doing research that involves having subjects look at an easel board with stimuli (cards, magnets, etc) on the board and observing what the subjects look at based on audio cues. We are having some trouble calibrating the gaze data because the subjects often move away from or towards the surface during the trial and in between calibrations. I know you can use the surface plugin to add a surface and observe a heat map, but is there a way to use the plugin to improve gaze detection in 3d space while still observing temporal data? For example, would adding the easel board or the cards as surfaces be useful for preventing error in calibration due to the subject moving around? Thanks!
The calibration is independent of the surface tracking. It is a three step process. 1) pupil detection 2) gaze mapping 3) surface mapping
@user-a7dea8 to elaborate on that: the exporter does not care if the data was prerecorded or generated offline. At the end, it depends on the active plugins which data is being exported.
Hi, we have been using the audio recording. It introduces some delay. How do we get around that issue and make sure that the video and audio are perfectly synced?
@user-11498a on which platform did you record?
@papr Hi, is the "circle_3d_radius" the r in the picture below?
in the pupil position file
@user-8fd8f6 no. Phi and theta are related to the location of the 3d circle on the eye ball. The eye ball has a fixed size that does not change. The circle 3d radius is related to the size of the pupil
The r in this picture is the eye ball size
So the Theta and Phi are not as shown in the picture too?
@user-8fd8f6 spherical coordinates define a three-dimensional point. In the case of the pupil datum, we describe the orientation of the eye model either as a vector in the cartesian coordinate system or as (phi, theta, r) in the spherical coordinate system. Since the eye (modelled as a sphere) has a fixed radius, the pupil datum omits the r value. The pupil is modelled as a flat disk (circle 3d) that is tangential to the eye ball at the location (phi, theta, r). The radius of this disk depends on the current size of the pupil.
So the circle_3d has nothing to do with a spherical coordinate system or the picture that you posted. I hope this clears up any potential misunderstandings.
perfect answer! Thank you
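To make the coordinate relationship concrete, here is a generic spherical-to-cartesian conversion. Note that the axis convention below is the standard physics one and is an assumption on my part; the exact axes Pupil uses should be checked against the eye-model source code:

```python
import math

def sph_to_cart(phi, theta, r=1.0):
    """Generic spherical -> cartesian conversion, assuming theta is the
    polar angle and phi the azimuth (NOT necessarily Pupil's convention)."""
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return (x, y, z)
```

With the eyeball radius fixed, (phi, theta) alone pins down a point on the sphere, which is why the pupil datum can omit r.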
Hi all. I am currently trying to generate heatmaps using Pupil Player, to no avail. Whenever I click "(Re)-calculate gaze distributions" in the Offline Surface Tracker plugin, I get the following error:
player - [WARNING] offline_surface_tracker: No gaze on any surface for this section!
I do not believe the issue is with my surface marker size, because if I change the mode to "Show Markers and Surfaces" the markers are detected just fine. It appears that none of my gaze data is being generated or carried over from Capture. Adjusting the X and Y sizes under the surface tab does not appear to help. Any advice?
@user-a21e3c do you see the gaze data being visualized as green dots during playback?
This error message indicates that there is no gaze data. If you recorded gaze data in Capture, make sure that the Gaze From Recording mode is active.
@papr Yes, I can see the gaze data being visualized as green dots. Where can I find Gaze From Recording mode in Capture?
Did you mean to say Player? In your initial post you mentioned that you were using Offline Surface Tracking.
Sorry I misread your message. Thought you meant Gaze From Recording was something you selected when recording in Capture
In Player, you have an icon with concentric circles on the right. From its menu you can activate Gaze From Recording and Offline Calibration
Ah, I see, thanks! I can still see the green dot representing eye tracking throughout the video, and the surface markers are still highlighted in blue when I select Show Markers and Surfaces. But I still get the "No gaze data for any surface on this section!" warning when I try to generate a heatmap
Do you see the red line around your surface?
You might need to set up a surface first. Marker detection is the first step in surface tracking; the second is setting up a surface based on a set of detected markers
You can add surfaces on the left by hitting the A
button
Yes that did it! Unfortunately the heatmap is now entirely red. I do recall seeing this problem earlier though in the Discord chat, so if it's already been answered I can look there and stop bothering you
Figured it out. Thanks for the help!
Trying to open pupil player and keep getting the following error: player - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend
Any ideas?
@user-f3a0e4 You can ignore that message if you are not using a RealSense 3D camera
The issue is that the grey pupil player screen that usually pops up doesn't appear
I'm therefore unable to drag and drop a saved file
In fact, today I was having issues connecting remotely to the device and decided to "restart with default settings", which fixed the issue. However, since then I have had non-stop errors, especially with calibration using Pupil Mobile. Does anyone know if any of the default settings might be causing this?
The error for calibration is:
WORLD: collected 0 monocular calibration data
WORLD: Collected 0 binocular calibration data
WORLD: not enough ref point or pupil data available for calibration
This error does not happen when calibrating through local USB
@user-f3a0e4 1. In player, you need to drop the whole folder, not a single file. 2. Make sure that you are streaming all cameras to Capture
Yeah I have tried, but it literally won't allow me to drop anything in there as the grey window will not open. It was working earlier this morning but there now seems to be some sort of error
@user-f3a0e4 ~~What is the error?~~ Can you be specific which folder you are dropping onto Player?
Hi, question of clarification. Under the Calibration menu in Player, I have the option to set a range for "Calibration" and for "Mapping". I am assuming that the difference between these two is that the trimmed section one selects for "Calibration" is used to identify the region of the video with the calibration marker, and the trimmed section one selects for "Mapping" is the region of space that calibration corresponds to?
Additionally, does this mean that if the wearer significantly changes their visual perspective, a recalibration is needed each time?
Hello papr, I have a stupid question. When you watch the video in Player, how do you plot the gaze position? I mean, how do you overlay the frame with the position, since we have multiple eye recordings for the same frame?
I'm trying to open a recording. To do so, I usually just open pupil player and a grey blank page appears saying "drop a recording directory onto this page". However, now when I try to open pupil player, this grey screen does not appear?
@user-f3a0e4 ah, ok, now I understand. Can you try deleting the user_settings* files in the pupil_player_settings directory and starting Player again?
@user-888056 We create a many-to-one relationship for each world frame. Each gaze datum is associated with its closest world frame (by time).
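As a rough sketch of how such a nearest-timestamp association can be computed (my own illustration with numpy, not the actual Player code; the function name and arguments are hypothetical, and both timestamp arrays are assumed sorted):

```python
import numpy as np

def match_gaze_to_world(world_ts, gaze_ts):
    """For each gaze timestamp, return the index of the closest world
    frame timestamp (many gaze data -> one world frame)."""
    world_ts = np.asarray(world_ts)
    gaze_ts = np.asarray(gaze_ts)
    idx = np.searchsorted(world_ts, gaze_ts)   # insertion points
    idx = np.clip(idx, 1, len(world_ts) - 1)   # keep a left/right neighbor
    left = world_ts[idx - 1]
    right = world_ts[idx]
    # step back one index where the left neighbor is the nearer one
    idx -= gaze_ts - left < right - gaze_ts
    return idx
```

Each returned index is the world frame that a gaze datum would be drawn on during playback.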
@user-a21e3c Correct, the mapping range is the range to which the calibration is applied. Yes, recalibrating regularly is recommended, especially in 2d mode.
What happens if we select a range for Calibration but not one for Mapping?
@user-a21e3c Then you won't get any gaze data
But the default gaze range is the complete recording
@papr I hope i'm not missing it, but I can't find anything for user_settings or pupil_player_settings when searching the entire pupil labs directories
@user-f3a0e4 you can find pupil_player_settings in your user folder
@papr amazing that worked! thanks!
@papr are you able to help with my other issue? Today, for seemingly no reason, calibrating when using pupil mobile is always unsuccessful. This cannot be because of pupil detection or positioning because when performing the same calibration using the local USB the calibration is perfect?
The error for calibration is:
WORLD: collected 0 monocular calibration data
WORLD: Collected 0 binocular calibration data
WORLD: not enough ref point or pupil data available for calibration
What does "Surface Cache is not Build Yet" mean?
I was working on generating heatmaps for a series of surfaces in a single video in Player. Everything was fine at first until I hit the 1 minute mark, at which point I had no gaze data for any part of the video afterwards. I tried playing around with a few of the settings, and now I lost the gaze data from the first minute as well. No green circles or fixation points at any point during the video anymore. Reopening the video in a separate Player tab does not restore the data. What can I do?
@user-f3a0e4 Please make sure the Time Sync plugin is active before calibrating.
@user-a21e3c I am not sure. We will be releasing a new version of the surface tracker this week. It might solve your first problem. Not sure about the gaze data. Are you using offline calibration?
@papr we record on a surface pro 6 laptop
Yes I am. Gaze data is restored when I select Gaze Data from Recording, but every fixation point is about an inch off of where it should be. Offline Calibration was far more accurate while it was working, so I chose that instead.
I'm also seeing under the Offline Calibration tab that the Detection Progress is stuck at 100% and never says "Complete". Does that tell you anything?
*an inch of the screen, not an inch of the real world
@user-11498a Could you share an example recording with [email removed] that makes clear that the audio is offset?
@user-a21e3c Can you try recalculating?
Yes, does not appear to do anything. Here is the terminal log:
player - [INFO] gaze_producers: Calibrating section 1 (Unnamed section) in 3d mode...
Calibration Section 1 - [INFO] calibration_routines.finish_calibration: Dismissing 2.99% pupil data due to confidence < 0.80
Calibration Section 1 - [INFO] calibration_routines.finish_calibration: Collected 657 monocular calibration data.
Calibration Section 1 - [INFO] calibration_routines.finish_calibration: Collected 0 binocular calibration data.
Ceres Solver Report: Iterations: 6, Initial cost: 1.950097e+01, Final cost: 1.618617e+01, Termination: CONVERGENCE
Calibration Section 1 - [INFO] gaze_producers: Offline calibration successful. Starting mapping using monocular 3d model.
player - [INFO] gaze_producers: Cached offline calibration data to C:\Users\Coltan\recordings\TestingWithCode\000\offline_data\offline_calibration_gaze
player - [INFO] offline_surface_tracker: Gaze postions changed. Recalculating.
player - [WARNING] offline_surface_tracker: No gaze on any surface for this section!
player - [INFO] fixation_detector: Gaze postions changed. Recalculating.
Fixation detection - [INFO] fixation_detector: Starting fixation detection using 3d data...
^Last line seems to imply some process is incomplete but running, but it's been sitting like that for almost 15 minutes now with no updates.
Still no gaze data visible? What is your mapping range set to?
No, no gaze data. I have three sections in the video, each of which consists of a calibration and a surface. Currently I'm only working with the first one for simplicity, so the calibration range is the first 15 seconds or so, and the mapping range is everything after that until the next section's calibration
Gaze data randomly reappeared a few minutes ago for the first section only, but disappeared when I tried to calibrate the second section
Okay, now I have the gaze data for the first section's surface but nowhere else. The heatmap generated nicely; not sure how that could have happened though if gaze data is not available for the calibration.
@user-a21e3c Maybe it could help if you watch our youtube tutorials on offline calibration. Did you watch them already?
Are you referring to these?
Yes!
Okay haha, yes they did help! But they helped me get to where I am now, and can help no more
Can you extend the timelines at the bottom? They should show the current setup of the sections. Could you make a screenshot of that?
There is nothing below eye0 FPS
It does not look like the gaze mapping ranges are set up correctly
Actually, it looks like you have no gaze mapper set up at all
@user-a21e3c In the gaze mapper menu, click: New Gaze Mapper
Where is the gaze mapper menu?
Open the Offline Calibration menu. It has three sub-menus: references, calibrations, and gaze mappers.
Open the gaze mappers section
Hm I don't see that. Let me just send you a few screenshots of what I do see
I am now realizing this is version 1.10. Perhaps I should update to 1.12 before attempting to troubleshoot further?
Yeah, I just wanted to note that. The videos refer to v1.11 and higher
Okay I will do that and try again
Another side question, is there any advantage to inverting the eye camera image when using Capture?
Or I guess it says "Flip image display" in version 1.12 now
@user-a21e3c no
This is just for visualization purposes
@papr Thank you. Reactivating the Time Sync plugin fixed the issue.
Hi all, we use the pupil-labs eye tracker (200Hz) in complete darkness and we're struggling to get it to work properly. With the lights turned on it works perfectly, no problems with the detection of the pupil. However, when the pupil dilates due to the complete darkness, the pupil is not detected anymore or the confidence drops below 0.3. We made sure to set the pupil size of the 3D detector large enough, so that should not be the issue. For some participants we even observe a partly "bright" pupil. With our old 120Hz pupil-labs headset we did not have this issue. Is there a way to solve this? I have to note that we use version 1.8 at the moment, because we wanted to keep settings similar between different experiments.
Screenshot of the "bright" pupil effect
@user-5c16b8 please try to run it with the 320x240 resolution
@papr It's the new 200 Hz cam, so if I remember it correctly it only has 400 x 400 or 192 x 192, right?
@user-5c16b8 correct! Please use 192x192 in this case
Ok I'll try it. The only issue with that approach is that we're also working on estimating torsion based on the eye images and that won't work with the lower spatial resolution. But for now it might be a solution. I'll let you know whether it works.
@user-5c16b8 I see! That will be likely a problem indeed.
@papr It appears to track more stably with the lower spatial resolution, but we need to test it a bit more to be sure. And it's only a temporary solution, because we'd really like to include the torsion measurements
Hey, all! I'm brand spanking new to Pupil and I have no previous eye tracker experience. I'm trying to set up a Pupil Mobile headset for my research professor but I can't seem to get the cameras anywhere near my eyes. It's like they're too low. I have tried the camera arm extenders and a bunch of different angles, but the only time the cameras even get my eye in frame is when I have the headband held up about an inch or more above my ears. Any suggestions?
Has anyone used the moto Z3 play with the Pupil Mobile with success? I know that previous versions of the moto Z* play have worked.
@user-20faa1 yes, if I am not mistaken we are shipping them as part of the bundle
@user-20faa1 you might need to enable the OTG settings
@papr Thanks! I thought I saw that the OnePlus 6 was shipping as part of the bundle, but I wasn't sure about the moto Z3 play.
@user-20faa1 the Z3 works well.
A question about the surface tracker: what's the minimum resolution it can be used at? If I set up markers in the environment, how many pixels does each marker need to cover?
The reason is that we work with a very low-resolution cam, and over time I need to control for head movement.
The cam resolution is only 224x171; a 10x10 pixel patch corresponds to 2.5 x 2.5 degrees.
Can the Pupil Labs team assist us with custom-made cameras if we order enough of them, and if yes, what would the cost per camera be?
Hi @user-ede53a send us an email so we can get a better idea of what you need
@user-20faa1 Actually, I was mistaken. We started shipping the OnePlus 6 as part of the bundle. We used to ship the Moto Z3 Play though. Both should work fine with Pupil Mobile.
Hey folks - a few weeks ago you updated HMD eyes to use new 200 Hz cameras. Will the mobile tracker be redesigned to use them as well?
...and, no more educational discount on the webstore?!?!
Hey guys, I am currently using the Pupil Labs devices for a research project and I am pretty amazed and astonished so far. However, I would very much appreciate the opportunity to get to know some other users and exchange experiences concerning the setup and usability in various situations. Would anybody be willing to chat and exchange experiences? PM me please, in German or English (Germany-based research assistant).
@papr Hi, I have a question about Player 1.13. The offline eye movement detector gives us fixations, saccades, and so on, and we also have the offline fixation detector (where we can define the fixation parameters). Do they differ from each other in the fixations they produce? What is the definition of a fixation in the offline eye movement detector?
Hi guys. I am trying to install the Pupil apps on my MacBook Air, but when attempting to open the downloaded files I get an error stating "Pupil Player is damaged and can't be opened. You should eject the disk image."
Anyone familiar with this?
@papr In player 1.13 I think the Frame numbers in the offline eye movement detector are wrong
@user-8fd8f6 check out the NSLR paper for details on how the eye movement detector classifies eye movements.
@user-8fd8f6 also, could you please let @user-764f72 know what you mean by wrong frame numbers?
@papr Thanks! Has anyone run any battery tests with Pupil Mobile on the Z3 Play? How long did it last? If the battery does die, how does Pupil Mobile save the data? Is the full recording corrupted/lost if the battery dies?
@user-20faa1 I do not remember the numbers, but it was well over an hour. Generally, we recommend making multiple shorter (~20 minute) recordings.
Interesting. Our tasks are fairly long, with 1.5-2 hours of recordings per participant per visit. We might have to look at using multiple phones or the moto power pack mod.
@user-20faa1 I do not have the exact numbers in mind. I underestimated on purpose, to be sure not to make further false statements. Maybe @user-755e9e knows more exact numbers.
@user-20faa1 if you are using a Moto Z3 Play connected to a Moto Mods Power Pack, the battery should last 2-2.5 hours.
@papr can anyone help me with the Pupil apps installation on a Mac? I have entered all the code listed on the master docs into my terminal but still get a message that the apps are damaged and can't be opened. Surely someone else has come across this?
the main issue appears to be with downloading pyuvc
@user-764f72 Hi, I checked v1.13. The frame range has a problem. As you can see in the picture, the video is at frame 143 but the result shows a frame range between 814 and 816.
@papr Did you change the offline fixation detector in the new version too? The results are not the same as in the previous version.
@user-8fd8f6 The only thing we did was to adjust the default parameters
It did not change in functionality
In the previous version the method was based on pupil data; in the new version it is based on gaze. Are they the same in functionality?
@papr
@user-755e9e Thanks! Have you tested what happens to recordings if recording locally to a phone via Pupil Mobile and the battery dies? Is the full recording corrupted/lost?
@user-8fd8f6 Previously, it would choose depending on the available data. In order to be consistent with the eye movement detector, we have decided to base both calculations on gaze, since it is applicable to both 2d and 3d pupil data.
On the difference between the pupil- and gaze-based dispersion calculations: the pupil method requires 3d pupil data with a well-fit eye model. One eye is chosen and the dispersion is calculated based on the eye model posture.
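As a generic illustration of what an angular dispersion measure looks like (my own sketch, not Pupil's implementation; `dispersion_deg` is a hypothetical name), one can take the maximum pairwise angle between unit gaze direction vectors:

```python
import numpy as np

def dispersion_deg(directions):
    """Maximum pairwise angle (in degrees) between gaze direction vectors.
    `directions` is an (n, 3) array-like of 3d gaze directions."""
    d = np.array(directions, dtype=float)          # copy so we can normalize
    d /= np.linalg.norm(d, axis=1, keepdims=True)  # unit vectors
    cos = np.clip(d @ d.T, -1.0, 1.0)              # pairwise cosines
    return np.degrees(np.arccos(cos).max())
```

A fixation detector would compare such a value against a dispersion threshold over a sliding time window.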
@user-8fd8f6 short update: @user-764f72 was able to reproduce the issue. We will be working on a solution to it.
@user-20faa1 Currently, we do not have any explicit fail-safe for low-battery shutdowns. This is something we will work on. For now, it is recommended to check the battery life regularly. Apologies for the inconvenience.
@papr Thanks for letting me know and being super helpful per usual!
Hi
I am a student at a university in Canada, and we are looking into getting your eye tracker to perform visual search tasks. Do you have any papers that I can look at that have used your product for visual search tasks?
@user-6691cc Welcome to the channel. Please see our citation list: https://docs.google.com/spreadsheets/u/2/d/1ZD6HDbjzrtRNB4VB0b7GFMaXVGKZYeI0zBOBEEPwvBI/edit?usp=drive_web&ouid=115812273187120893285
Thank you
@wrp @mpk Guys, I'm not sure if this is intentional, but the academic discount for the Pupil Core / Invisible is not well advertised. It's only visible in the cart! If the goal is to incentivize more purchases from academia, well, you won't get that effect unless the discount is made more apparent.
Ok, thanks for the feedback @user-8779ef - we will make it more explicit
Congrats on the new movement detector!
Just a small question: is there some way to do batch processing? I couldn't find much in the documentation...
Is there any way to fix this damage?
I have soldering stuff but it looks like the wire frayed from the connection point
@user-adf88b please contact info@pupil-labs.com and attach the above image. Our support will come back to you.
@user-02665a There is no batch export built into Player, but there are scripts that can export recorded data from multiple recordings without starting Player. Please check the pupil-community repository for links.
As another question, I have an existing hololens project and was initially thinking of using hmd-eyes, will I need to do the getting started for pupil too?
@user-adf88b You will need to run Pupil Capture/Service, if you are referring to that.
That's the thing I don't really understand: how is the HoloLens supposed to interact with the headset?
@user-adf88b The eye cameras are connected to the computer running Pupil Capture. Pupil Capture estimates gaze and publishes it in real time over the network API. This data can be accessed for interactions running on the HoloLens.
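For reference, a minimal sketch of subscribing to that gaze stream in Python (assuming pyzmq and msgpack-python are installed; `connect_gaze_subscriber` and `parse_gaze_message` are hypothetical helper names, and 50020 is the default Pupil Remote port):

```python
import msgpack  # network API payloads are msgpack-encoded

def connect_gaze_subscriber(host="127.0.0.1", remote_port=50020):
    """Ask Pupil Remote for its SUB port, then subscribe to gaze topics."""
    import zmq  # pyzmq; imported lazily so parse_gaze_message works without it
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect(f"tcp://{host}:{remote_port}")
    req.send_string("SUB_PORT")  # Pupil Remote replies with the SUB port
    sub_port = req.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{host}:{sub_port}")
    sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")  # all gaze topics
    return sub

def parse_gaze_message(topic_frame, payload_frame):
    """Decode one multipart message from the SUB socket into (topic, dict)."""
    return topic_frame.decode(), msgpack.unpackb(payload_frame, raw=False)
```

A receive loop would then call `parse_gaze_message(*sub.recv_multipart()[:2])`; the `norm_pos` field of a gaze datum holds the gaze point in normalized world-image coordinates.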
Ah, so for Pupil Labs to function with the HoloLens an additional computer is needed. That makes sense, thanks.
What is the field of view of the 120 Hz eye camera?
@user-28fba0 I do not know that number by heart.
Hey all! I'm opening my eye 0 window in Pupil Capture and it's suuuuuper slow and laggy. It keeps freezing and trying to close on me. Any suggestions?
@user-4bf830 Hi, which version of Capture and operating system do you use? Also, can you check if the eye camera is properly connected to the rest of the headset?
I'm on a Windows computer with the latest Pupil download that was posted; is that 1.13?
@user-4bf830 ok, thank you
Camera is attached, yes.
@user-4bf830 Could you share the capture.log file in the pupil_capture_settings folder?
Not at the moment, as my entire computer has launched into an update. I'll let this update download and then try to run it; you'll hear from me here again if it's still being fussy.
@user-4bf830 Good luck with that update!
Thanks!
Hello, we noticed that in the new version (v1.13) of the Pupil Labs software the remove surface button has disappeared. Does anyone know how to remove surfaces now? Thank you
@papr Is it possible to use the current github version of hmd-eyes with unity 2018.3?
Or do I have to backdate my unity version?
I'm using it with 2019.1 and it works fine
Odd
Is it the HoloLens version?
no vive
Damn
The Vive version is up to date
Anyone from Brazil here who has assembled the headset from the DIY guide?
Congrats, Pupil Labs, on your new web outfit! Looks good!