@user-7d0b66 all you need to do is make sure that the world window from the app is not visible in the video.
Hi! Recently you told me that the function to count saccades and other eye movements was deleted in the latest version. When are you going to add it again?
Hi, I was wondering if someone could help me with this: where can I find the gaze frames that are missing from the gaze report? E.g. the npy file and the video have the same number of frames, but the gaze report doesn't.
@mpk Yes, I tried to do that. I tried recording and then switching the world window to something else, but even then the tracking was very inconsistent, so I was not able to generate a proper heatmap. I don't know what I'm doing wrong.
@user-c5fb8b changing the codec to MJPEG and reducing the size to under 400x400 did the trick, looks like it is working now with the exception that the video playback bar has a different length and frame index than the source eye video (not a huge issue - the offline analysis still works, maybe has something to do with the fake world file that is generated?) Thanks again for all the help!
Hi @user-7daa32
> I can't find the analysis plugin in Pupil Player.
Which analysis plugin are you looking for? Note that you might have to enable the plugin first from the Plugin Manager on the right.
> Please, why is the config graph covered with black color? The markers blinked with black color.
The black text on the system graph is a bug caused by the blink detector. We are working on a fix for it. If you disable the blink detector, you should not experience this issue. I don't understand "the markers blinked with black color". Are you experiencing more black artifacts in another context?
@user-c5fb8b Thank you. Once that has been fixed, I will download a new Pupil release, right? I am trying to say that I can't see the analysis plugin. Someone told me I have to do data analysis with Python or Matlab.
Hi @user-370594 currently we don't have any plans for adding another eye movement detector/classifier to pupil. The old one was performing very poorly from our understanding and we don't have a better alternative available. If you want to give the old one a try, you can download Pupil v1.16, which was the last version with the Plugin enabled.
@user-7daa32 I can only highly recommend that you read through the documentation for Pupil at https://docs.pupil-labs.com/core/ Pupil Player offers a set of common analysis plugins that are useful for standard eye tracking studies. You will find a list of all available plugins in the docs as well: https://docs.pupil-labs.com/core/software/pupil-player/#plugins If you need more elaborate data analysis, you can always export the raw data from Pupil Player and import it into the data analysis solution of your choice (Matlab, Python, Excel, ...). Please be aware that Pupil does not offer a general purpose data analysis platform.
Hi @user-b0c902, which npy and video are you talking about? The eye video? Pupil Player always maps all eye data to the world video, so data before the first world frame or after the last world frame will be excluded from exports. Would that explain the "missing gaze frame" you are experiencing?
@user-a10852 Please try deleting the world lookup. It might have been generated based on the wrong timing information of the old H.264 eye video. Other than that the frame index won't match since everything in Player will be relative to the world video, not to the eye video. Since a missing world video is not the common case, we just generate a fake world video with fixed 30FPS currently. Since your eye video has higher fps, multiple frames from the eye video will be mapped to a single frame in the world video. If you run Pupil from source, you could overwrite the default framerate here: https://github.com/pupil-labs/pupil/blob/b3dda8cba54b2eff9503faa66e126bfce3c7b952/pupil_src/shared_modules/video_capture/file_backend.py#L287
@user-c5fb8b So I have looked at the number of frames from the world.mp4 video and its corresponding .npy file. Those two seem to have the same number of frames, which makes sense. However, once I process the recording in Pupil Player and export the gaze_positions.csv report, there are some world_index frames missing. If you look at the attachment, you can see that there are world indices from 4-7 but not 8-9. And I'm curious about those missing world indices.
@user-b0c902 well it seems there is no gaze available on world frames 8-9.
@user-b0c902 can you share the pupil_positions.csv as well?
I have attached both - pupil position as well as blinks, just in case
@user-b0c902 there are a lot of world frames without any pupil data. Normally this shouldn't happen, is there any special setup that you are running for your recording? Can you share the entire recording with us? The best way would be to share it with data@pupil-labs.com via some file-sharing service like OneDrive/GoogleDrive/...
@user-c5fb8b We run our studies in the participants' home environment while they play with their children, so I can understand if the eye tracker loses the pupil at some points. That is why I was wondering if there is an exported file showing when it loses the pupil. I am not sure we can share the video recording because of the nature of the study, but I will check and send it to the above-mentioned email address if I can. Thanks @user-c5fb8b
@user-b0c902 it's not that the eye tracker loses the pupil, but that there is no data available. pupil_positions.csv contains all available data, including from when no pupil was found in the eye image (the confidence will just be close to 0). This seems to indicate that there were a lot of dropped frames from the eye cameras. If this only happens a couple of times, it might indicate a hardware disconnect, but in your recording there seems to be a systematic drop of eye video frames throughout the entire recording. My best guess would be that you are running Pupil on a computer with insufficient hardware. A couple of questions:
1. Do you know what hardware specs you have available there?
2. Are you experiencing this issue in all of your recordings consistently?
3. Can you observe the CPU graph in Pupil while recording?
Is the confidence being close to 0 the only indicator of dropped frames, or is there any other way to determine it? Or do you count frames in the world video and compare against the number of timestamps? If so, is there any way to know which video images match up with which timestamps?
The answer to points 2 and 3 is yes. I will find out the hardware specs.
@user-c5fb8b I have attached an image for the specs
@user-b0c902 Are you recording to the internal storage or to an external disk?
@user-b0c902 the confidence has nothing to do with frame drops. It's an indicator of how good the pupil detection was for an existing frame.
Normally you would expect a consistent number of pupil data for every world frame, since both eye and world run at fixed frame rates. E.g. if world runs at 30FPS and eye at 120FPS you should get 4 pupil data entries on average (per eye) per world frame. Since most of your world frames have ~4 matching frames and there are only a few parts without matching frames, I assume there were frames dropped.
Regarding matching timestamps to frame numbers: In the export file world_timestamps.csv you find a list of all timestamps. The index in this file corresponds to the frame number in the world video. If you enable the eye video exporter, you will get a corresponding file for the eye video. Please note that the exported CSV files are always relative to the exported videos, which are truncated to the trim marks that you set in Player.
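For illustration, a minimal sketch of reading that mapping in Python (the presence of a header row is an assumption, check it against your actual export):
```python
import csv

# Build a frame-index -> timestamp mapping from the exported
# world_timestamps.csv. Row i corresponds to frame i of the exported video.
with open("world_timestamps.csv") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row (assumed to be present)
    frame_timestamps = [float(row[0]) for row in reader]

# e.g. the capture time of world frame 42:
print("frame 42 timestamp:", frame_timestamps[42])
```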
@user-b0c902 also can you check the FPS of the eye video while recording?
@papr internal storage for both laptop and mobile phone
Mobile phone?
Is this a pupil mobile recording?
the gaze and pupil position reports are from the laptop
but we do record on the mobile phones as well and stream it on the laptop
Ok, but the recording in question is not such a recording that contains streamed data? Or is it?
Because that would explain the eye frame drops.
@user-c5fb8b I think the FPS for the eye was 120 and the world 30, but I will check again. If I am not mistaken, world_timestamps.csv is generated by recent versions of Pupil Player, correct?
@papr yes, it does contain streamed data.
@user-b0c902 In this case, it is very much expected that you see frame drops, as video streaming does not guarantee that all frames are transmitted, especially if the wifi bandwidth is temporarily limited by other traffic.
Fair enough. So we are basically trying to extract frame-by-frame data for each participant. It is a 10-minute recording. We are extracting frames of the world video and the gaze report to match up where they are looking. And the gaze report is not matching up to the frames in the 10-minute world video. Do you know what would be the best way to solve this for data analysis? Also, what does pts mean in world_timestamps.csv?
@user-b0c902 the more stable workflow would be to not record data that has been streamed, but record directly on the phone instead. This means your recording won't contain pupil or gaze data yet and you will have to do this with offline pupil detection and gaze mapping. For this you would need to include the calibration procedure in your recording (which we recommend in general).
You can still connect Pupil Capture over network in order to monitor e.g. the calibration procedure. You can also start the recording on the phone via Pupil Capture through the Remote Recorder plugin, which might be helpful.
If by any chance you also recorded on the phone so far, you can just pop this recording into Pupil Player and run the offline pipeline for pupil detection and gaze mapping. As I said, this will only work if you included the calibration procedure in your recording.
@user-b0c902 the "best way for data analysis" certainly depends a lot on what you are trying to accomplish. I'm afraid we cannot really help you with that. If you need gaze data for every world frame, then I'm afraid your recording does not offer this. Maybe you could think about some form of interpolation in order to compensate for missing data?
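If you do consider interpolation, here is a hedged sketch of the idea (the column names are those of Player's gaze export, verify them against your file):
```python
import numpy as np
import pandas as pd

# Average the gaze samples per world frame, then fill frames without any
# sample by linear interpolation between neighboring frames.
gaze = pd.read_csv("gaze_positions.csv")
per_frame = gaze.groupby("world_index")[["norm_pos_x", "norm_pos_y"]].mean()

# Reindex over the full range of world frames; missing frames become NaN,
# which are then interpolated linearly.
all_frames = np.arange(per_frame.index.min(), per_frame.index.max() + 1)
per_frame = per_frame.reindex(all_frames).interpolate(method="linear")
```
Keep in mind that interpolated samples are estimates, not measurements, so you may want to flag them in your analysis.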
PTS are presentation timestamps and correspond to the time in the actual exported video file. In 99% of cases this should not be relevant for you.
Can someone please point me to where I can buy replacement lenses for the eye cameras? EU or Germany would be preferred. 🙂
@user-06ed20 I would recommend writing an email to info@pupil-labs.com including your question and your eye camera type in this regard.
Hello, I'm still having trouble generating a heatmap as the tracking seems to be very inconsistent, it keeps blinking. I've tried having a different background on my screen but the problem persists. Does anyone know what's going on?
@user-7d0b66 Sorry if we have asked for it already: Were you able to share an example recording with [email removed] such that we can have a closer look?
Ideally, you could name a specific world frame in that recording in which you would expect the tracking to work but it does not.
Can we email a link to the pupil player video for you to check?
@user-7daa32 I had a look at the video. What should I look for specifically? What I noticed is the low recorded frame rate.
I'm only testing it; I have not started the research proper. I just want to understand it before doing a preliminary study. Can I send the whole recording? I don't really understand the results. The pupil diameter I got will just be for one subject, which can be compared to other subjects, right? What if I want to compare pupil dilations among different AOIs?
> @user-7daa32 I had a look at the video. What should I look for specifically? What I noticed is the low recorded frame rate.
@papr the heatmap for each surface is a bluish blank, with no color gradient or anything like that. I just want to know if that recording was okay.
@user-7daa32 At 0:03 you can see that the heatmap is not set up correctly. Click "edit surface" in the top right and adjust the handles to contain the complete display. Also, in the menu on the right, set a size (e.g. the resolution of your display) for the heatmap.
> @user-7daa32 At 0:03 you can see that the heatmap is not set up correctly. Click "edit surface" in the top right and adjust the handles to contain the complete display. Also, in the menu on the right, set a size (e.g. the resolution of your display) for the heatmap.
@papr I will try this now. Also, I am sending some data; please look at it and let me know your thoughts.
I won't have time to look at it this evening. We will follow up in the coming days. 🙂
Okay. Thanks
I recently bought a Pupil Labs headset. I am trying to install the software on a MacBook Pro, but I get an error: "Pupil Service cannot be opened because Apple cannot search for malware in it." The same error occurs for Pupil Capture and Service. How can I handle it?
@user-3ede08 Please right click the application and click Open. Afterward, you should see a similar dialogue but with the option to open it anyway
@papr, thanks for your prompt response. I am new here, so sorry if my questions are trivial.
@papr I still get the same error
@papr I got it, thanks
@papr I sent an e-mail to data@pupil-labs.com yesterday but I didn't get a response yet.
@user-7d0b66 Could you let me know in a private message which email address you sent this email from?
@user-7d0b66 It looks like we have not received your email. Could you please check if the email was sent correctly and maybe attempt to resend it?
Generally, we try to respond to all emails within 3 business days or less.
@user-7d0b66 I can confirm that we have received your email. We will follow up via email.
@papr Great, thanks!
Hello,
> @user-7daa32 I had a look at the video. What should I look for specifically? What I noticed is the low recorded frame rate.
@papr Hello
Do I need a high frame rate to get a good heatmap? The surface is the same as an AOI, right? Why is it difficult to set up more than one surface? The edit lines always scatter when I try to set up an additional surface. Please can you send me a screenshot of a stimulus with more than one surface set up? Thanks
Hi @user-7daa32 the demo recording for Pupil Player (found in the docs about Pupil Player: https://docs.pupil-labs.com/core/software/pupil-player/) shows an example of a multi-surface setup for surface tracking. Here's the direct link to the recording: https://drive.google.com/file/d/1nLbsrD0p5pEqQqa3V5J_lCmrGC1z4dsx/view?usp=sharing
here's a screenshot:
@user-c5fb8b thanks! this is helpful.
Hi. We're trying to run mouse_control.py under PyCharm and it says it needs the module "windows". When we try to install the module, PyCharm says "file not found". Do we need the module? Is this a problem you can help with?
@user-ae4005 Please try installing https://pypi.org/project/PyUserInput/ Also make sure that PyCharm is actually using the python environment in which you have installed the module.
@papr Thank you for the quick response! I'll try right now
@papr I'm getting this error now:
Could not find a version that satisfies the requirement pyHook (from PyUserInput) (from versions: )
No matching distribution found for pyHook (from PyUserInput)
Any idea why that is?
Do you get this during install or when running the script? Also, how do you install it? Using `pip` or `pip3`? What is your output of `pip -V`, if you are using pip?
@papr Do you guys have much experience with pupil under PyCharm? Should we be using a different package manager / python install?
During install. PyCharm has a module / environment manager that manages installs for each project. It seems to use pip.
I will be away from keyboard for a while. I might be able to help when I am back.
👍👍
@user-ae4005 It looks like you are using a virtual environment based on the python.exe path in your screenshot. Please make sure that you are using Python 3.6.1 or higher
My guess is that you are using Python 2.7 and that one of the requirements is not available for Python 2.7 anymore.
I just checked... I'm using Python 3.7.
I'm using PyCharm; would Conda be better, or should that not make a difference?
@user-ae4005 I have no personal experience with this setup. But it looks like pyHook has not provided any concrete uploads to PyPI which is why its installation fails. Instead they link here: https://sourceforge.net/projects/pyhook/files/pyhook/1.5.1/
You should be able to download the zip and install it via pip install <path to zip file>
I'll try. Thanks again!
So I finally managed to install pyHook and the code runs. However, nothing happens. When I go through it in debug mode, it stops at line 74 (topic, msg = sub.recv_multipart()) of mouse_control.py. Do you know what the problem could be? I'm completely new to Python and I'm not sure how to handle this...
@user-ae4005 Is pupil capture running? And have you set up a surface in the surface tracker plugin?
The surface should have the name screen
Pupil capture is running. I did add the plugin but I did not set up a surface...
That explains a lot. Will do it now!
Thanks again 🙂
Hmm Pupil capture tells me that I cannot add a surface because it can't find markers in the image. What do I need to do to set up markers?
Check out this very old video demoing the script: https://www.youtube.com/watch?v=qHmfMxGST7A
Please also read the documentation on surface tracking https://docs.pupil-labs.com/core/software/pupil-capture/#surface-tracking
Okay, I'll check it out! Thanks 🙂
> Hi @user-7daa32 the demo recording for Pupil Player (found in the docs about Pupil Player: https://docs.pupil-labs.com/core/software/pupil-player/) shows an example of a multi-surface setup for surface tracking. Here's the direct link to the recording: https://drive.google.com/file/d/1nLbsrD0p5pEqQqa3V5J_lCmrGC1z4dsx/view?usp=sharing
@user-c5fb8b thank you, I got one done like this.
> @user-7daa32 At 0:03 you can see that the heatmap is not set up correctly. Click "edit surface" in the top right and adjust the handles to contain the complete display. Also, in the menu on the right, set a size (e.g. the resolution of your display) for the heatmap.
@papr thank you for this too
Hello, I'm trying to assemble a Pupil Core device following the documentation posted on the website and was wondering if anyone could provide clarity on the soldering for the eye-camera. I understand that the IR LEDs need to be soldered onto the PCB but the exact placement of LEDs is not very clear from the posted video. I was also wondering if there is any specific reasoning for using the Microsoft HD 6000 web camera for the eye camera. Would the same process of implementing IR image recording be possible using other web cameras?
@user-8eaeb3 I see that you are assembling a Pupil DIY headset. It is possible to use other eye cameras, however the eye cam mounts for the DIY headset are designed for the HD 6000. The eye camera mount design and frame design has not been updated for other cameras. LEDs should be positioned in order to uniformly illuminate the eye region (as much as possible). Glints from IR illuminators are not leveraged by Pupil Core software, so just try to get good illumination of the eye region. You could use other cameras that are UVC compliant with Pupil Core software, but will need to manually select cameras in Pupil Capture software.
@user-8eaeb3 out of curiosity, what type of research/applications are you looking to do with Pupil DIY?
@wrp Thank you for your reply! Would choosing a different camera model and manually selecting the camera have any effect on the accuracy of the data? I am working on a project studying memory deficits after laser ablation in epileptic patients, and was hoping to compare eye movements within an HMD VR system to eye movements when viewing stimulus images on a screen, using the eye tracking data from the Pupil DIY.
@user-8eaeb3 Welcome! Regarding different eye cameras vs accuracy: the first and most important part of accurate pupil detection is getting a good image of the eye(s). From a hardware standpoint there are too many variables to give an easy "yes" or "no" answer. Getting a good image of the eye depends on: position of the eye camera, lens distortion/quality, frame rate of the eye camera, resolution, sensor quality/noise, illumination, etc...
From a software standpoint, Pupil Core software should be able to support different UVC compatible sensors out of the box (or without too much additional work - @papr or @user-c5fb8b please jump in here when online if there is more that needs to be noted).
Pupil DIY is really intended for those that are prototyping and have the time to experiment both with hardware and software. We surely want to support those who are prototyping/learning/tinkering, but if you are time limited/want to ensure robust pupil detection, then you might want to consider getting Pupil Core.
@wrp Thank you! We will definitely consider getting a Pupil Core. We were hoping to work with some cameras we had on hand to gather some initial data and begin working with the Pupil Core software, but I do believe we will eventually purchase a Pupil Core for the project.
@user-8eaeb3 Great! Looking forward to seeing what you are able to achieve with the DIY setup.
@papr I've tried to send the (raw) recordings to data@pupil-labs.com but they're too large to be sent via email. How can I share them then?
@user-7d0b66 we usually recommend using file sharing services like Google drive, Dropbox, WeTransfer etc
@papr Do I just send you guys a shareable link via email?
so you can have access to it
@user-7d0b66 correct
Hello @papr , Hope you are doing well. Is there any source where the weight of the pupil core eye camera is mentioned?
@user-499cde The Core headset weighs 22.75 g: https://pupil-labs.com/products/core/tech-specs/
@user-499cde Do you need to know what a single eye camera weighs? I don't know if our website mentions that. Instead I have measured the eye camera of my personal headset which weighs 3.8g
Oh yes, very well. Thank you very much
I was looking for the weight of single eye cam
Hello, I am using calibration markers for offline calibration. I am wondering if it will be possible to know the exact x,y coordinate (normalized or not, doesn't matter), using pupil player?
@user-7d4a32 Do you mean if Player is able to detect the coordinates? Or do you mean if it is possible to read out the detection result?
@papr Is it possible to know where the markers are with respect to x,y coordinates? So yes, if the Player is able to detect the coordinates but of the markers specifically
@user-7d4a32 Yes, in the offline calibration menu there is a button to detect the Reference Locations
@papr Thank you! You just saved us from hours of googling.
@user-7d4a32 Have you seen our video tutorial on post-hoc pupil detection and calibration already? It explains the complete workflow 🙂 https://www.youtube.com/watch?v=_Jnxi1OMMTc&list=PLi20Yl1k_57rlznaEfrXyqiF0sUtZMMLh
@papr Thanks for the reference! We just checked it out. Out of curiosity, is it possible to compute gaze prediction accuracy after an offline calibration, i.e. to simulate clicking "T" in Pupil Player?
@user-7d4a32 Capture calculates the gaze accuracy after calibration (C) and validation (T). The difference is, that the validation uses (or rather should use) other locations in the subject's field of view than the calibration to test how accurately the calibration generalizes. In Player, you have to tell the validation in which time range it should be validating.
@papr Thanks for everything!
Hi @papr, is there any way to export the video of the offline pupil detection 3D model (algorithm mode) directly from the Pupil Player software? Or an ability to play frame by frame? Currently we can play at 0.25x, but I wanted it to be even slower. Any suggestion would be great. Thanks
@user-331121 Did you know that you can pause the offline pupil detection from the menu? Unfortunately, the offline pupil detection does not have a frame-by-frame mode nor does the eye video exporter export the algorithm view. It can export the detection result visualization (red pupil ellipse + green eye model outline) though
I normally cannot pause on the frame I'm interested in, as the playback is too fast. I wanted additional information beyond the 2D ellipse and 3D model. Anyway, thank you for your reply.
Hi, I shot a country driving video with Pupil Invisible; however, I find that I need to calibrate gaze for the longer-distance view. I am a driving instructor, so accuracy of where the driver is looking is the key point of the video. Centre and side mirror gazes are fine, but for country/highway driving, looking further ahead is important to show (especially on corners), and the recorded video shows the gaze way up, as if I was looking at the sky. I tried to follow the YouTube video on post-hoc gaze mapping validation but got stuck at the reference location step. I did not shoot a circular reference check video before driving, so I tried manually editing reference locations, but the player runs forever when I click "Calculate all calibration and mapping" and I get a message in the status bar like "Not enough referencing or pupil data available". The total video is about 40 mins. What is the best way to calibrate Pupil Invisible when a reasonable degree of accuracy is needed? Thanks for the help.
@user-8f829f Hi, the offline/post-hoc calibration in Pupil Player cannot be used for Pupil Invisible recordings as it requires pupillometry data which is not available for Pupil Invisible recordings.
hmm okay I see, does sun glare affect the gaze accuracy? I had sun straight up facing me for the video. Before recording, what practice could improve overall video quality for my driving scene with Invisible if more gaze accuracy is required?
Just now I increased the Vis Circle size, so it looks better on playback now; maybe it was a bit small before. The other question I have is that I put an anti-slip silicone nose support patch on the glasses' nose bridge. Does this cause much issue with Invisible's performance? Is my over-the-horizon gaze issue coming from this?
@user-8f829f Please be aware that this channel is dedicated to Pupil Core specific questions. I think my colleagues over at the invisible channel are better suited to answer your questions. 🙂
ah sorry about that 🙂
Don't worry 🙂
hello :) I am using Core, and I am getting a problem with the frame rate. We have set the eye frame rate to 120 Hz, but we get eye video at about a 30 Hz frame rate. Could you please tell me what the problem is? If you need anything, I will upload it. Thank you.
@user-20b83c hi, this is likely due to insufficient CPU resources. Which CPU does your computer use?
@papr we are using an Intel(R) Core(TM) i7 CPU [email removed] and 8 GB RAM
@user-20b83c 1.6 GHz is comparably low. I fear that this is the cause of your low pupil detection frame rate.
@papr Then what is the minimum requirement for 120Hz pupil frame rate?
@user-20b83c It is difficult to tell what the bare minimum is as we do not have the resources to test a bunch of different CPUs. But I can tell you that I am able to run Pupil at 200Hz on my MacBook (early 2015) with a 2,7 GHz Dual-Core Intel Core i5.
Please be aware that the actual performance may vary depending on your system's current load, e.g. if you run a separate application.
@user-20b83c: also quick question: you are running with the headset connected to the computer via USB, correct? Not via some network streaming setup.
> @user-20b83c It is difficult to tell what the bare minimum is as we do not have the resources to test a bunch of different CPUs. But I can tell you that I am able to run Pupil at 200Hz on my MacBook (early 2015) with a 2,7 GHz Dual-Core Intel Core i5.
@papr Hi, sorry. I am completely lost about the meaning of frame rate as related to the unit "Hertz". I usually see the CPU graph running nonstop; is that what gives the frame rate in Hz? Is frame rate also related to the video range? I am using a Dell with an Intel Core i5, 8th Gen.
@user-7daa32 There is a second graph next to the CPU graph saying FPS. FPS means frames per second. 30 FPS is equivalent to 30 Hz (Hertz). Hertz basically also means "per second" but is used for frequencies in general.
> is frame rate also related to the video range?
Btw, I am unsure what you mean by video range.
> @user-7daa32 There is a second graph next to the CPU graph saying FPS. FPS means frames per second. 30 FPS is equivalent to 30 Hz (Hertz). Hertz basically also means "per second" but is used for frequencies in general.
@papr Ooh thanks! I am familiar with Hz being "per second" from my physical chemistry class 🙂 I just understood what FPS is and the factors that can affect it, e.g. the CPU. When I said video range, I was talking about video adjustment in Pupil Player - trimming.
Hello, I know there are resources on the Pupil Labs website for guidance. I want to ask if you can please outline a few points we should consider to have confidence in our data. I have not started the actual research; I am just looking at objects on a screen. My primary goal is to be able to record people looking at different items and develop a heatmap from their gaze pattern. My secondary goal is to be able to record and graph the pupil diameter information during the above process. I think surface tracking will be very important for pupillometry.
Why are the png files of my heatmaps being generated like this?
Hi there! Has anybody dealt with annotation delays between PsychoPy and Pupil? Thanks!
@user-7d0b66 Please enter a Width and Height for each surface. This will determine the aspect ratio for each exported surface/png. See the example screenshot of the surface tracker submenu.
@user-2be752 There is some natural delay due to the network connection. We compensate for this delay by syncing clocks between the annotation creator (PsychoPy) and the receiver (Capture).
Therefore, the sending delay is only relevant if you are processing the annotations in realtime, not if you record them with Capture.
Hi everyone, hoping somebody here might be able to help with a few questions: I'm looking to convert the 3D gaze outputs from Pupil Core into 2D pixel coordinates (for further processing), and understand I can do this with cv2.projectPoints(), but I'm getting what appear to be pretty unlikely outputs - hoping that by clarifying some things I can figure out where I'm going wrong.
Please excuse any stupid questions among the above - I can't call myself an expert in either coding or computer vision. Any help would be appreciated.
Thanks, Sam.
@user-5b46cf Actually, the Pupil software projects the 3d gaze result back into normalised image coordinates (`norm_pos`). 🙂 No need to do this yourself. The software uses the active camera intrinsics for that.
It seems like the heatmap is not responding to my gaze attention as it should, the colored spots are static and not moving according to where my visual attention is being directed. Why is that?
Also, what does that big red triangle on the screen represent?
@user-7d0b66 Heatmaps in Player are static and aggregate all gaze that is available within the trim marks. The triangle shows the direction of the surface.
@user-7d0b66 normally you want a static heatmap for your analysis, because a heatmap is essentially gaze data independent of the timestamp. It can also better be used for images in reports and similar. As @papr said you can use the trim marks of the timeline to adjust the input data used for the calculation of the heatmaps.
@papr Ah! That's great - so we get that at the pupil-level data. Am I correct in thinking that's in a coordinate system with the origin at the bottom left of the world view, with (0.5, 0.5) at the center?
@user-5b46cf please refer to the documentation in regards to the used coordinate systems :) https://docs.pupil-labs.com/core/terminology/#coordinate-system
But in short, yes this is correct in your case!
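For reference, converting those normalized coordinates to pixel coordinates is just a scale plus a y-flip; a minimal sketch:
```python
def norm_to_pixel(norm_x, norm_y, frame_width, frame_height):
    """Convert Pupil's normalized coordinates (origin bottom-left,
    (0.5, 0.5) at the image center) to pixel coordinates with the usual
    image convention (origin top-left, y growing downward)."""
    px = norm_x * frame_width
    py = (1.0 - norm_y) * frame_height  # flip the y-axis
    return px, py

# e.g. the image center of a 1280x720 world video:
print(norm_to_pixel(0.5, 0.5, 1280, 720))  # -> (640.0, 360.0)
```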
@user-c5fb8b Thanks! Thanks for the help @papr
@user-5b46cf Please be aware of the difference between pupil and gaze data. They live in different camera coordinate systems
But both have a `norm_pos` field.
@papr thanks for the reminder! So, for clarification and future use (it's not essential now): if we wanted to calculate a gaze position per eye, we would need to look at the pupil data and project this to the world camera ourselves? Sorry to be a hassle - just want to make sure I understand what I'm doing before I mess up someone else's research!
@user-5b46cf You basically need to map the per-eye 3d pupil position into 3d gaze positions and then back-project them into the scene image plane
With our upcoming 2.0 release this will be easy to realise.
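In the meantime, since cv2.projectPoints() was mentioned: a hedged sketch of the back-projection step for a point that is already given in scene camera coordinates (the intrinsics below are placeholders, substitute your own camera matrix and distortion coefficients):
```python
import cv2
import numpy as np

# Placeholder scene camera intrinsics - use your calibrated values instead.
camera_matrix = np.array([[794.0,   0.0, 640.0],
                          [  0.0, 794.0, 360.0],
                          [  0.0,   0.0,   1.0]])
dist_coefs = np.zeros(5)

# A 3d point expressed in scene camera coordinates (e.g. a gaze datum's
# gaze_point_3d field), so no additional rotation/translation is needed.
gaze_point_3d = np.array([[50.0, -20.0, 1000.0]])
rvec = tvec = np.zeros(3)

pixels, _ = cv2.projectPoints(gaze_point_3d, rvec, tvec, camera_matrix, dist_coefs)
print(pixels)  # pixel location of the gaze point in the (distorted) scene image
```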
@papr Great stuff! We were only intending to work with average gaze points on this occasion anyway, so I'll keep an eye out for that.
Again, thanks for all the help!
@papr thanks, I do not need annotations in real time. I have annotations sent through PsychoPy to know image onset and so on. You said there shouldn't be a lag because of the time sync; is this time sync something I should implement, or does it come automatically? Thanks!
No, you need to do this explicitly. The easiest way is by using the T command of Pupil Remote. Please check out our documentation on time sync for details.
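For example, a minimal sketch of setting the time base via Pupil Remote (assuming Capture runs locally on the default port 50020):
```python
import time
import zmq

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")

# Set Pupil time to this script's clock, so timestamps generated locally
# line up with the timestamps Capture records.
remote.send_string(f"T {time.monotonic()}")
print(remote.recv_string())  # Capture acknowledges the new time base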
okay, I will do this from now on. Any ideas on how to fix this on already collected data? I would like to not lose this data!
Hi there, I've been trying to add a surface in order to run the mouse_control.py script. Pupil Capture does not seem to recognize most of the markers and is very unstable, so I can't edit the surface or add markers either. I added a screenshot of my set-up for reference. Any ideas on what I'm doing wrong here?
@here 📣 Announcement 📣 - We just released Pupil Core software v2.0. Download the apps here: https://github.com/pupil-labs/pupil/releases/tag/v2.0#user-content-downloads
With v2.0 and beyond, we will be focusing on making Pupil Core more core, making it easier to use and to extend for you and easier to maintain for us. There are a ton of exciting improvements and changes we made, please visit the release notes for more details and our product vision: https://github.com/pupil-labs/pupil/releases/tag/v2.0
We look forward to your feedback!
It's me again. I downloaded the new version and I was able to create a surface, edit it, and add markers 🥳 But I keep getting the same kind of error (also in other pupil_helpers scripts) and I'm not sure how to solve it. This is the line that causes the error: gaze_position = loads(msg, encoding="utf-8") and this is the error: TypeError: unpackb() got an unexpected keyword argument 'encoding'
Does anyone know what the problem is?
@user-ae4005 Which version of msgpack do you have installed? Please use 0.5.6. You can install it via `pip install msgpack==0.5.6`. (Please be aware that you might need to use `pip3` depending on your setup.)
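(Side note, as an alternative to pinning the old version: msgpack 1.0 removed the encoding keyword entirely, so with a current msgpack the equivalent of that line would be the following; the helper scripts were written against 0.5.6, though.)
```python
import msgpack

# raw=False decodes msgpack byte strings to Python str, which is what
# encoding="utf-8" used to do in msgpack < 1.0.
gaze_position = msgpack.unpackb(msg, raw=False)
```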
@user-ae4005 also since you said you downloaded the new version: are you running Pupil from bundle or from source?
@papr Thank you! I was using version 1.0 and have changed it now. It seems to be running, but the cursor is not accurately following my gaze yet; it might be a calibration issue though.
@user-ae4005 @user-c5fb8b will follow up with tips regarding surface tracking. Could you post another picture of the current state? The first image seems to be overexposed.
@user-2be752 You will have to realign the recorded timestamps yourself. You can do that by creating a custom annotation (e.g. SYNC) in Player (it will always use the recorded Pupil time) at a point in time at which you know you should be seeing one of the recorded annotations. So for example, say you have an experiment where you show a traffic light and your program sends annotations when the traffic light changes its state, e.g. red->green. Then you add the SYNC annotation at the moment in time where the traffic light changes from red to green in the video. This way you will have two annotations for the same event, but with different timestamps. Calculate the difference between these timestamps and apply it to all other recorded annotations. This way you can correct all annotations.
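In code, the correction is just a constant shift; a hedged sketch (the timestamp values are placeholders, and the column name should be checked against Player's annotation export):
```python
import pandas as pd

# Exported by Pupil Player; the "timestamp" column name is an assumption
# to verify against your export.
annotations = pd.read_csv("annotations.csv")

sync_timestamp = 1337471.27           # placeholder: your manual SYNC annotation
recorded_event_timestamp = 1337470.9  # placeholder: the matching recorded annotation

offset = sync_timestamp - recorded_event_timestamp
annotations["timestamp"] += offset  # shift all recorded annotations
```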
@user-c5fb8b I'm not sure what you mean, sorry still very new to all of this. I'm just using pupil capture and running the mouse_tracking code from python.
@papr Great, thanks again!
@user-ae4005 Don't worry about his question. He was only concerned that your issue was due to the new release, which it is not. 🙂
@papr Okay 🙂
This is what it looks like now
@user-ae4005 it looks like your markers could be a bit small. Does the detection get more stable when you are closer to the markers?
The optimal marker size depends on the distance to the headset, you should try out different sizes and find a size that works stable in the range you are using in your experiment.
Additionally: when you have Capture open and record the screen, you will see the same marker multiple times, which can interfere with marker detection. Optimally you would minimize Capture or move it to a second monitor while recording your target monitor.
And as papr said, the image seems a bit overexposed. Please try adjusting the exposure settings in the Video Source menu, either switching to auto exposure or adjusting the exposure time.
@user-c5fb8b Yes, the detection seems more stable when I'm close to the markers, so I'll try printing them out bigger. Thank you for all the other pointers; I'll adjust all of it and hope it'll work better then 🙂
@papr @user-c5fb8b So it's all up and running now, thanks for the support! 🙂 The mouse control seems to be a bit off still, though. It seems to have a bit of an offset to the right, although it does fixate the start button perfectly when I look at it (which is on the bottom left). Does that have to do with my calibration?
@user-ae4005 Yes, there will always be some degree of error. Check out our best practices for a few tips in this regard https://docs.pupil-labs.com/core/best-practices/
@papr Will do, thanks again!
I would like to keep the old version of the Pupil Core software. Can I have it together with the new one on my system?
@user-7daa32 Yes, you can 🙂
After downloading for Windows, here is what I got
To open the RAR-archive on Windows, you will need a decompression software, e.g. WinRAR. https://www.win-rar.com/predownload.html?spV=true&subD=true&f=winrar-x64-590.exe
Thanks. Because I want to keep both old and new files, what are the best options to select here?
@user-7daa32 Are these WinRAR options?
After installing WinRAR, you just need to right-click the downloaded Pupil release (ending in `.rar`) and extract the file it includes (`.msi`). Double-click the `.msi` file and follow the installer instructions. Afterward you can delete both the `.msi` and the `.rar` file.
@user-7daa32 also from the release notes: https://github.com/pupil-labs/pupil/releases/tag/v2.0
> Improved Installation Workflow on Windows - #1853
> We have wrapped Pupil in a Windows Installer package (MSI) in order to simplify the Windows workflow.
> By default, all 3 apps (Capture, Player, and Service) will be installed in C:\Program Files (x86)\Pupil-labs\Pupil v<version>. All apps will also get a start-menu entry, making it much easier to open Pupil. Installed versions of Pupil can be removed in the Windows Uninstall Settings.
> New versions of Pupil will be installed alongside older versions and you can choose which to start. Switching versions will still overwrite your user settings as previously.
Thank you. I have all apps installed automatically onto the desktop. This means that I don't need to open the file to run the Pupil apps any more; I can just click the already installed ones.
correct! 🙂
However, I don't know the importance of Pupil Service. Why is it not opening?
Do you need to use Pupil Service for anything?
I don't think so. Thanks
Hi all, is there any documentation on notifications and what can be done with them? Also, is it possible to set the "Auto Exposure" mode of the worldcam to "Manual" with a notification? If so, what subject / name / args would be required? Thanks in advance!
Hi @user-430fc1 I'm afraid we don't have documentation for the notifications. ~~There is currently no notification for changing UVC properties (which the exposure mode belongs to).~~ EDIT: see below
The general idea is that every plugin can send notifications and respond to notifications. You can also send notifications via the Network API, see this example: https://github.com/pupil-labs/pupil-helpers/blob/master/python/pupil_remote_control.py Every notification has a topic and can contain a potential payload. The payload data has to be serializable, so not every Python object will work. Very often we don't send any data and treat a notification as a single "event" that happened. Sending notifications can be done from anywhere within a plugin with:
```python
self.notify_all({
    "subject": "your-notification-topic",
    # add more key-value pairs here for custom payload
})
```
This will broadcast the notification to all other plugins.
You can react to notifications by implementing
```python
def on_notify(self, notification):
    ...
```
in your custom plugin.
If you want to find out more, I'd recommend you open the codebase and search for `.notify_all(` and `def on_notify(` to find out which plugins send or receive notifications.
@user-430fc1 I have to correct myself, there is actually a way of setting the exposure mode via notification: You can start any plugin remotely with custom arguments via the notification:
```python
{
    "subject": "start_plugin",
    "name": "<plugin-name>",
    "args": <dict-with-arguments>,
}
```
The `UVC_Source` plugin is the plugin handling the UVC connection, and it accepts `uvc_control` as an argument in `__init__` for starting with custom UVC settings. With that you should be able to basically restart the `UVC_Source` with modified UVC settings.
You will need to do some digging in the code to figure out the necessary startup args however.
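To send such a notification from outside the app, the pattern from the pupil_remote_control.py example linked above looks roughly like this; the args payload is left empty here because, as said, the exact UVC settings need to be dug out of the code first:
```python
import msgpack
import zmq

ctx = zmq.Context()
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")  # default Pupil Remote address

notification = {
    "subject": "start_plugin",
    "name": "UVC_Source",
    "args": {},  # fill in the uvc_control settings after checking the code
}
# Pupil Remote forwards messages sent as "notify.<subject>" + msgpack payload.
topic = "notify." + notification["subject"]
payload = msgpack.dumps(notification, use_bin_type=True)
remote.send_string(topic, flags=zmq.SNDMORE)
remote.send(payload)
print(remote.recv_string())
```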
Hello, does pupil work with a computer integrated webcam?
I am looking for eye tracker software that can work with a webcam
Hi @user-151c9e I assume you are looking for a remote eye-tracker, where the eye tracker is positioned at a computer? Unfortunately, Pupil only offers head-mounted eye-tracking, i.e. the eye-tracker has to be worn on the subject's head.
@user-c5fb8b thanks, that's very helpful. I'll get digging 🙂 I have one more question... about latency. I have developed a routine for measuring the pupil's light reflex with Pupil Core. It uses the worldcam to detect the onset of a light and sends an annotation with the associated pupil timestamp to assist with calculating time-critical measures like latency to constrict, time-to-peak constriction, etc. It seems to work well, as the annotation is stamped on the first frame in Pupil Player where the light becomes visible during playback. I'm just wondering what sort of latency this might have from the actual event? Given that the Pupil Core tech specs quote 8 ms camera latency and >3 ms (depending on hardware) processing latency, and that I am using the world cam at 120 fps, am I right in thinking it would be somewhere in the region of 15 plus-or-minus 4 ms?
Hello, I'm a researcher at NTU and we are interested in using this product in a wearable robotic system. I noticed that there is a configuration option to add a USB C mount. Is there a list of compatible cameras? Also, has anyone added a depth sensor to this setup? Thanks.
hello, could you please tell me the power of the IR light source for the eye camera? I want to add an IR light source of the same power for our experiments. Thank you 🙂
Hello, my fixations.csv file is coming out blank. Can anyone help me with why that is?
Thank you in advance
Hi @user-6bd380 did you calibrate before/while recording? The fixation detector works on calibrated gaze only.
Hi @user-ab6a19 Pupil Capture supports third-party USB cameras that fulfill the following criteria:
1. UVC compatible: http://www.cajunbot.com/wiki/images/8/85/USB_Video_Class_1.1.pdf (chapters below refer to this document)
2. Support Video Interface Class Code 0x0E CC_VIDEO (see A.1)
3. Support Video Subclass Code 0x02 SC_VIDEOSTREAMING (see A.2)
4. Support for the UVC_VS_FRAME_MJPEG (0x07) video streaming interface descriptor subtype (A.6)
5. Support UVC_FRAME_FORMAT_COMPRESSED frame format
Regarding depth sensors: Previously, we supported multiple versions of depth cameras from the Intel RealSense family. Unfortunately, maintaining support for the required third-party libraries was not reasonable for us. Currently we do not officially support these cameras anymore. You might be able to integrate any depth sensor into Pupil by writing a custom plugin, but this will require some serious effort.
Hi @user-20b83c the IR light source we are using is the SFH 4050, you should be able to find the specs that you need in the official datasheet:
@user-c5fb8b Thank you so much!
@user-c5fb8b Yes, I did the calibration before starting the recording.
Hi, I am new to Core. I am looking for a way to record external mic audio via the stereo mini plug of an Android phone with Pupil Mobile. I cannot find a way to switch the mic from built-in to external. Is there any way to do it? The only description of the audio source in Pupil Mobile I found is this: https://discordapp.com/channels/285728493612957698/285728493612957698/632110119588593664
@user-c5fb8b thank you for your concern about my problem. I managed to resolve it; now it's fine.
@user-4fb664 Hi feisal - just wondering how you went with getting the Pupil Labs software running on Fedora? If you got it installed and working, what did you do? Any [email removed]
Hi @user-292135 Pupil Mobile will try to use your default microphone. Did you try configuring the default microphone in android? Which kind of external microphone are you using? Bluetooth or via cable?
Hi @user-c5fb8b, thanks for your reply. I cannot find a default microphone setting on my phone (Android 8 on a OnePlus 6, Pupil Mobile bundle). The Recorder app automatically detects the external mic, but Pupil Mobile does not. I am using a white iPhone earphone with mic for testing.
App version is 1.2.3
@user-292135 I had a look at the source code, and if I see this correctly, the app should detect and use wired headsets automatically (as they become the default microphone when connected to the phone). Please make sure to connect the headset before starting the app. I do not know if the app is able to handle mic changes while it is running/recording. You can stop the app in the settings.
I connected a headset, stopped the app from the settings, and retried, but it failed. Rebooting also failed. Any other suggestions? The Recorder app correctly captures the headset. Thanks
I am monitoring the audio input level by tapping the audio input and watching the change in the width of the circle. Is this OK?
@user-292135 and tapping the built-in mic produces bigger circles than tapping on the external mic?
Yes.
hello,
I can't find the option for 'edit surface'. I am attaching the image here. Actually, I can't see any markers or lines to adjust for defining my AOI. Is it because I didn't use the AprilTag markers on my screen? One more thing: I am doing it offline in Pupil Player.
Any help regarding this would be very helpful.
@user-6bd380 Yes, you will have to place markers in your environment. Every surface needs to have multiple markers assigned. Please have a look at the example recording we provide in the docs for Pupil Player, it shows a setup with multiple surfaces defined. Although the example does not feature a screen-based setup, the idea is the same: you would place multiple markers around the screen. Example recording: https://drive.google.com/file/d/1nLbsrD0p5pEqQqa3V5J_lCmrGC1z4dsx/view?usp=sharing
> @user-c429de what Pupil Core hardware are you using? I assume that you are using the most recent version of Pupil Capture, correct? Also, after doing `sudo usermod -a -G plugdev $USER` you will need to log out/log in for the changes to be applied, IIRC.
@wrp Thanks a lot! I had the same problem and this worked like a charm (Ubuntu 18, new v2.0 release, bundled version).
@user-c5fb8b Thank you for your reply. Since I have already made this mistake and collected data from many participants, my last hope/question is: is there still any way I can define an AOI? Otherwise all my data will be garbage and I will have to begin again from zero.
If there is any possible way, kindly suggest it. These data are very precious.
@user-6bd380 I'm afraid Pupil Player currently does not offer any surface analysis tools without the use of markers. Can you explain which metrics you are interested in? Maybe we can suggest alternative approaches for you.
@user-c5fb8b I didn't get in what context you are asking about metrics. From what I understood (in terms of analysis), I am interested in quantitative metrics. Any possible approach would be helpful. I will give my best effort to get it done.
@user-6bd380 I meant to ask what analyses you had planned with the surface tracker.
@user-c5fb8b My primary aim/interest is calculating the fixation duration on the AOI. My secondary aim is to analyze saccades to find the sequence of eye movements on the surface.
Hi all, I have a question about the accuracy visualization. After a calibration, the accuracy, precision, and number of data samples are presented on the monitor, and I would guess they are recorded in the log file of the calibration recording. Is that data somehow accessible later, after several other recordings with the same calibration? And if so, how could I easily access it?
@user-6bd380 can you give an example for the AOI you are interested in?
@user-c5fb8b I am presenting participants with images of grieving persons, either individuals or groups. I want to estimate the fixation duration only on the face. Within the face I want to distinguish two parts: 1) the eye area, 2) the lip area.
@user-6bd380 ok, one more question: are you using a Pupil Core or Pupil Invisible headset?
In the second part of my analysis I want to estimate the fixation proportions for the human body and the background: whether the number and duration of fixations is higher on the human body or on the background (considering it contextual information).
@user-c5fb8b Pupil core
@user-6bd380 I am afraid what you are trying to do is beyond Pupil's capabilities, even when using apriltag markers. If you want to do this analysis automatically, you would need some sort of face detection algorithm (or human pose detection for the second part). We do not offer any tools for this. With Pupil's surface tracker you can only track rectangular, planar surfaces. I will speak to our research team and ask if they have any recommendations for you.
hi @user-6bd380, just chiming in that I've done this kind of analysis on the data I have collected from Pupil Labs. What I've done is use one of the many TensorFlow object-detection algorithms (which can detect faces, eyes, mouth, etc.) on the world videos; then you can map the fixations output by Pupil Player and see if they fall within the coordinates output by the object-detection algorithm. I have found it works pretty nicely. I hope it helps 🙂
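The mapping step then boils down to a point-in-box test after converting Pupil's normalized fixation coordinates into image pixels; a minimal sketch (the bounding-box format is an assumption, adapt it to your detector's output):
```python
def fixation_in_box(norm_x, norm_y, box, frame_w, frame_h):
    """Return True if a fixation (Pupil normalized coords, origin bottom-left)
    falls inside a detector bounding box given in pixels (origin top-left)."""
    px = norm_x * frame_w
    py = (1.0 - norm_y) * frame_h  # Pupil's y-axis points up, image rows go down
    x_min, y_min, x_max, y_max = box
    return x_min <= px <= x_max and y_min <= py <= y_max
```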
@user-2be752 thank you so much for your suggestion. I am not good at writing code, but I will definitely try.
Hey 🙂 Does anyone have experience running the Pupil software in Ubuntu 20?
I know Ubuntu 18 is recommended, but I am trying to figure out if it is worth rolling back
@user-e70d87 The bundle should work fine on Ubuntu 20
Hey! Why do the timestamp values start with high numbers (1337471.275137, 1337471.283206, ...)? What does the integer part of the number (1337471) stand for?
@user-3ede08 Read more about timing here https://docs.pupil-labs.com/core/terminology/#timing
It's in epoch time, which counts up the number of seconds it has been since 1/1/1970 in London.
It might seem silly, but it's my favorite time format because there is no need to consider timezone and you don't need to format the string to calculate duration (as you would with a HH:MM:SS.00 format)
@user-e70d87 Actually, it is not by default synced to the unix epoch. 🙂 You are right about the number of seconds being the unit though.
Oh, right! Y'all use Boot time by default, right? i.e. time.time()?
(boot time is actually my LEAST favorite time format, because it is arbitrary and impossible to relate to other timecodes if you didn't log the boot time in another format when you made the recording)
time.time() returns seconds since unix epoch
Pupil time does not have a specified start.
The current Unix timestamp is about 1592595772 seconds since Jan 01 1970 (UTC), which is a little bit different from 1337471.275137 🤔
Please see our messages above 🙂
ok, thanks.
@papr Respectfully, that's impossible. A timestamp will always have a Zero time. That Zero might be based on the time the recording started (record time), the time since the computer rebooted (boot time), time since 1/1/1970 in the UTC timezone (epoch time), or time since 1/1/1970 in the local time zone (this one shouldn't exist, but I've seen it).
I think you mean that Pupil time is based on record time, which is the most intuitive, but also problematic, because if I am recording two things on the same computer, it is impossible to synchronize the time streams. If both systems are recording in Unix epoch time (based on the same system clock), synchronization is trivial. If one is using epoch time and the other is using record time, synchronization is impossible if you did not log the moment that the recording started in the epoch time format.
@user-e70d87 Sorry for the misunderstanding. Of course there is a zero time. I just wanted to say that the epoch (zero time) is not defined by a specific event, i.e. it is possible that it is seconds since boot but it is not guaranteed.
And no, it is definitely not recording time unless you change the time base to 0 before starting a recording. Pupil Capture will not change its time base unless you use time sync or set it via a plugin or the network API.
I think I did verify that it was Boot time a while back.
I also recall a conversation we had years back that ended in you adding a record of the time that the recording started, using the system clock (and floating point precision for the seconds). I checked a while back and couldn't find that number, but I'm hoping y'all moved it to a different log file (it's been a while since I've dug deep into raw data; I'm still working to publish some recordings I gathered in Jan 2018 🙂)
It might be the case for a specific OS, but the monotonic clocks may differ by OS. The info file saved in every recording contains the start time in unix epoch and pupil epoch. You can use it to calculate the difference between both clocks and therefore sync the recording to the unix epoch after the fact.
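A hedged sketch of that conversion (the field names below are from a v2.0 recording's info.player.json; older recordings store similar values in info.csv, so verify against your files):
```python
import json

with open("info.player.json") as f:
    info = json.load(f)

# Offset between the system clock (unix epoch) and Pupil time at recording start.
offset = info["start_time_system_s"] - info["start_time_synced_s"]

def pupil_to_unix(pupil_ts):
    """Map a Pupil timestamp onto the unix epoch."""
    return pupil_ts + offset
```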
That's perfect, thanks!
Hey, is the audio recording function in the newest version of the software (for Windows 10)? In the General Settings I use the audio mode "sound only", but there is no audio file. As I remember, at some point I saw the plugin "Audio Capture" (maybe in a previous version), but I don't see it now.
@user-370594 The feature has been discontinued. You can find more information about it in our release notes https://github.com/pupil-labs/pupil/releases/tag/v2.0
Does anyone have advice on how to get Pupil Capture to actually see the tracker when you plug it in? It always seems to say "Camera not found" to start, and then I can usually get it to work after unplugging/replugging, restarting Pupil (with or without the tracker already plugged in), etc.
I can't find any reliable way to make it connect to the tracker every time. Do you have advice here? Is it preferred to start Pupil with the tracker plugged in, or start Pupil and then plug in the tracker? Do you have any advice on how to troubleshoot this issue?
This has been an issue with:
- Multiple trackers
- Multiple computers
- Multiple USB cables
- Multiple USB ports
- Every OS (Mac/Windows/Linux)
[Edit - I have just discovered that the 'lsusb' command in Linux is helpful for figuring out when my computer has seen the tracker, but that doesn't explain why it doesn't connect every time I plug it in...]
Hi, I'm just starting out with getting a Pupil Core to run in our lab but am falling at the first hurdle: the Pupil Capture App simply won't open on my Mac. (Latest version of all three apps. MacBook Pro, macOS Mojave 10.14.6, 16 GB RAM). Pupil Player and Pupil Service both launch OK. I can see that Pupil Capture launches and that it creates three processes, but then Activity Monitor shows that one of them has become non-responsive, and no interface ever appears. Here is the console log for Pupil Capture:
2020-06-22 14:51:12,826 - MainProcess - [INFO] os_utils: Disabled idle sleep.
2020-06-22 14:51:13,366 - world - [INFO] numexpr.utils: NumExpr defaulting to 4 threads.
2020-06-22 14:51:13,391 - world - [INFO] launchables.world: Application Version: 2.0.175
2020-06-22 14:51:13,391 - world - [INFO] launchables.world: System Info: User: michael, Platform: Darwin, Machine: Persepolis-3.local, Release: 18.7.0, Version: Darwin Kernel Version 18.7.0: Tue Aug 20 16:57:14 PDT 2019; root:xnu-4903.271.2~2/RELEASE_X86_64
2020-06-22 14:51:13,391 - world - [DEBUG] launchables.world: Debug flag: False
2020-06-22 14:51:14,041 - world - [DEBUG] video_capture.ndsi_backend: Suppressing pyre debug logs (except zbeacon)
2020-06-22 14:51:14,089 - world - [DEBUG] remote_recorder: Suppressing pyre debug logs (except zbeacon)
2020-06-22 14:51:14,102 - world - [DEBUG] pupil_apriltags: Testing possible hit: /Applications/Pupil Capture.app/Contents/MacOS/pupil_apriltags/lib/libapriltag.3.dylib...
2020-06-22 14:51:14,103 - world - [DEBUG] pupil_apriltags: Found working clib at /Applications/Pupil Capture.app/Contents/MacOS/pupil_apriltags/lib/libapriltag.3.dylib
2020-06-22 14:51:14,143 - world - [DEBUG] file_methods: Session settings file '/Users/michael/pupil_capture_settings/user_settings_world' not found. Will make new one on exit.
2020-06-22 14:51:14,143 - world - [INFO] launchables.world: Session setting are from a different version of this app. I will not use those.
So I just tried running the app from the command line, and some more specific information was shown in the Terminal:
Exception in thread Thread-2:
Traceback (most recent call last):
File "PyInstaller/loader/pyiboot01_bootstrap.py", line 151, in __init__
File "ctypes/__init__.py", line 348, in __init__
OSError: dlopen(libSystem.dylib, 6): image not found
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "threading.py", line 916, in _bootstrap_inner
File "threading.py", line 864, in run
File "pyre/zactor.py", line 59, in run
File "pyre/zbeacon.py", line 66, in __init__
File "pyre/zbeacon.py", line 303, in run
File "pyre/zbeacon.py", line 218, in handle_pipe
File "pyre/zbeacon.py", line 201, in configure
File "pyre/zbeacon.py", line 73, in prepare_udp
File "pyre/zbeacon.py", line 137, in _prepare_socket
File "pyre/zhelper.py", line 232, in get_ifaddrs
File "PyInstaller/loader/pyiboot01_bootstrap.py", line 153, in __init__
PyInstallerImportError: Failed to load dynlib/dll 'libSystem.dylib'. Most probably this dynlib/dll was not found when the application was frozen.
My take on that is that the problem lies with the packaging of the application for distribution but I'd certainly appreciate any guidance or suggestions.
@user-07222c I will have a look at it today
Hey @papr I sent an email to info@pupil-labs.com on the week-end. Please have a look at it. It is about .pldata
@user-3ede08 We have received your email and will follow up via email with an example on how to read the files.
ohh, thanks a lot.
@user-07222c Let us use your Github issue for any communication in regards to this issue: https://github.com/pupil-labs/pupil/issues/1919
@user-c5fb8b thanks, that's very helpful. I'll get digging. I have one more question... about latency. I have developed a routine for measuring the pupil's light reflex with Pupil Core. It uses the worldcam to detect the onset of a light and sends an annotation with the associated pupil timestamp to assist with calculating time-critical measures like latency to constrict, time-to-peak constriction, etc. It seems to work well, as the annotation is stamped on the first frame in Pupil Player where the light becomes visible during playback. I'm just wondering what sort of latency this might have from the actual event? Given that the Pupil Core tech specs quote 8 ms camera latency and >3 ms (depending on hardware) processing latency, and that I am using the world cam at 120 fps, am I right in thinking it would be somewhere in the region of 15 plus-or-minus 4 ms? @user-430fc1 Hello again, I was just wondering if you have had any thoughts on this issue?
Hi Pupil team - thank you so much for maintaining these chats and being available to answer our questions. I am processing data captured using Pupil Core with audio recording and I have a question about the audio timestamps. For a 23 second recording, we have an audio file with 1,030,144 samples, an audio_timestamps.npy file with 1006 values, and a gaze_timestamps.npy file with 1123 values. I am trying to synchronize the audio with the tracking data, and am unsure how to correlate these three files... Could you point me in the right direction for any related documentation? My biggest question is: How do we use the audio_timestamps values to synchronize audio samples with gaze timestamps? Thank you so much for your help!
@user-2143a5 The audio file should have as many audio frames as timestamps. In your case, each frame holds 1024 samples. 1024 * 1006 = 1,030,144. The timestamp corresponds to the start of the audio frame, i.e. the first sample of the frame. Audio and gaze timestamps are generated from the same clock. Therefore, you can match audio and gaze by pairing gaze points to their closest audio frames.
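A minimal sketch of that pairing with numpy, assuming the timestamp files sit in the recording folder and each audio frame holds 1024 samples as described above:
```python
import numpy as np

# Timestamp arrays from the recording folder (file names as discussed above).
audio_ts = np.load("audio_timestamps.npy")  # one timestamp per 1024-sample audio frame
gaze_ts = np.load("gaze_timestamps.npy")    # one timestamp per gaze datum

# For each gaze timestamp, find the index of the audio frame with the
# closest start time (both arrays share the same clock and are sorted).
idx = np.searchsorted(audio_ts, gaze_ts)
idx = np.clip(idx, 1, len(audio_ts) - 1)
left, right = audio_ts[idx - 1], audio_ts[idx]
idx -= (gaze_ts - left) < (right - gaze_ts)  # step back where the left frame is closer

# idx[i] is now the audio frame closest to gaze_ts[i];
# its first sample sits at offset idx[i] * 1024 in the audio stream.
```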
Thank you!!! I also noticed a difference between the first value in the gaze and audio timestamp files (approx. 1.505) - is there any special consideration for this, or can I simply add that fixed offset to the timestamps?
@user-2143a5 You should take the timestamp as they are. If there is a difference of 1.5 seconds it means that one sensor started recording 1.5 seconds earlier than the other
Excellent - thank you so much for your help!
hey guys I just started using pupil labs recently, and I had a question about the capture software
When the pupil camera displays come up, by default the left eye is right side up while the right eye is upside down. Is that supposed to be the case, or is my hardware set up incorrectly? I am having some issues with accuracy and I wanted to figure out if that was the reason.
@user-1af2b3 https://stackoverflow.com/questions/62441144/pupil-labs-eye-tracking-camera-set-up-is-my-video-feed-inverted
^ that's actually me who posted that question lol. I was just about to say here that I think the posted solution fixes it. I didn't realize they updated it again so recently.
thank you Michael
@user-1af2b3 Nice, glad it was useful (if it was, do tick the answer as being correct on StackOverflow, which will help future readers). But I'm a brand-new user, so take anything I say with a grain of salt. PS I misread your StackOverflow question - I thought it was posted in 2018, not the 18th of this month, which is why I referred to it as of "historical interest". Have edited it now to remove that bit.
haha no worries yeah i just started working with pupil labs recently
trying to set up a unity app where i track people's vision while they watch a 360 video
Where can I find the new 2.0 Unity package? It says it should be in the latest release but I'm having trouble finding it.
oh nvm i found it
Hey, I have noticed, that each element of the array in the eye0_lookup.npy file is a tuple with 4 values. The first one is always 0, the second one starts from 0 and increases by 1. The third value is equal to the eye0_timestamps.npy. I don't know how the fourth value behaves. So, my question is: what do the values in the eye0_lookup.npy file correspond to ? Thanks
@user-3ede08 https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/video_capture/utils.py#L449-L458
These are the semantics of the file. It is a cache file that we use for efficient seeking within the video files. Keep in mind that Pupil Mobile and Invisible recordings can have multiple video parts for each camera.
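For reference, a small sketch for inspecting that file; the field names here follow the dtype in the linked utils.py and are worth verifying against your version:
```python
import numpy as np

# Load the lookup cache; it is a structured array (field names per the
# dtype in the linked utils.py - verify against your version).
lookup = np.load("eye0_lookup.npy")
print(lookup.dtype.names)

first = lookup[0]
print(first["container_idx"])        # which video part the frame belongs to (always 0 for single-part recordings)
print(first["container_frame_idx"])  # frame index within that part, counting up from 0
print(first["timestamp"])            # same values as eye0_timestamps.npy
print(first["pts"])                  # the video stream's presentation timestamp, used for seeking
```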
ok, thanks @papr
Hi, is it possible to calculate the distance (e.g., in mm or pixels) between a target object (e.g., the center of a surface/area of interest) and the gaze position at a given moment?
Hi @user-e33d45, please note that computing this in mm vs pixels will mean quite different values. With mm I would assume you are referring to the distance based on the real-world dimensions of your surface. With pixels I would assume you are referring to the distance on the camera image. Do I understand you correctly? If the surface changes its distance to the camera, the ratio between mm and pixels will change, so you need to keep this in mind.
That being said, you can compute the distance in mm easily when you define the real-world size of your surface in Capture/Player in mm as well. E.g. if your surface is a computer monitor with a size of 550x310 mm, you would put Width: `550.0` and Height: `310.0` in your surface settings. Then when you export, in `gaze_positions_on_surface_SURFACENAME.csv` you will get scaled gaze positions for your surface. These will be relative to the surface corners, so you need to make sure that the corners of the surface in Pupil line up well with your surface in the real world. The `x_scaled` and `y_scaled` values in the export are then in mm relative to the surface origin and you can easily compute the distance to the surface center.
@user-e33d45 the procedure I described above will only work for gaze that is already on the surface. We have also received your email to info@pupil-labs.com and will follow up with a more general method there.
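To make the first approach concrete, here is a minimal sketch that computes the distance from each exported gaze sample to the surface center, assuming the 550x310 mm monitor example above, a surface named "Screen" (hypothetical), and the standard `x_scaled`/`y_scaled`/`on_surf` export columns:
```python
import csv
import math

# Monitor dimensions from the example above, as entered in the surface settings.
SURFACE_W_MM, SURFACE_H_MM = 550.0, 310.0
cx, cy = SURFACE_W_MM / 2, SURFACE_H_MM / 2

# File name follows the export convention for a surface named "Screen".
with open("gaze_positions_on_surface_Screen.csv") as f:
    for row in csv.DictReader(f):
        if row["on_surf"] != "True":  # per the caveat above: only gaze on the surface
            continue
        dx = float(row["x_scaled"]) - cx
        dy = float(row["y_scaled"]) - cy
        print(row["gaze_timestamp"], math.hypot(dx, dy))  # distance to center in mm
```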
Is there any difference if I use pip or conda to install msgpack ?
@user-3ede08 pip by itself only installs python modules from https://pypi.org/ into your system's python site-packages folder. If you use virtual environments, pip will install the modules into the virtual environment's site-packages folder. Conda is more than that. Conda creates virtual environments, too, but with the difference that it also isolates system library dependencies. Pip cannot do that.
Specifically, in case of msgpack, if you are in a conda environment, you can use both.
Thanks @papr
@user-430fc1 Hey, apologies for the delayed response. I think I need to clarify how timestamps are generated in Pupil in order to respond to your question. We differentiate between hardware timestamps and software timestamps. Hardware timestamps are generated by the camera at the start of the frame exposure. The software timestamps are generated by pyuvc using the system's monotonic clock at the time when the frame has finished transferring from the camera to the computer. The difference between the software and hardware timestamps is what we call camera latency. Camera latency is dependent on frame resolution, as a higher frame resolution requires more data to be transferred from the camera to the computer.
Ideally, we would use hardware timestamps at all times. Unfortunately, we have noticed that the camera and system clocks are not necessarily synchronized at all times and on all OSes. Especially on Windows, we have seen major discrepancies. This is very problematic, as each of the three cameras is using its own clock, and if they are not synchronized, pupil data cannot be matched and mapped to gaze properly.
This is why we use corrected software timestamps instead. These are software timestamps from which we subtract a fixed amount of time to approximately compensate for the camera latency.
In summary, the recorded timestamps should correspond to the actual time at which the frame was recorded. Therefore, the relevant questions are (1) how accurately the camera latency is approximated and (2) how much it varies. (Unfortunately, I cannot give representative values for this at the time as we were not able to measure the actual camera delay on Windows due to the desynchronized clocks.) Processing latency and camera frame rate do not play a role at all in this context, as they affect neither hardware nor software timestamps.
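As a purely numeric illustration of that correction (all values hypothetical; the subtracted latency estimate depends on the configured resolution):
```python
# Hypothetical values, in seconds of the system's monotonic clock.
software_ts = 1842.0307    # frame finished transferring to the computer
latency_estimate = 0.008   # fixed, resolution-dependent estimate of camera latency

# The recorded ("corrected software") timestamp approximates exposure start:
corrected_ts = software_ts - latency_estimate  # 1842.0227
```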
@papr - Thanks so much for a highly informative response! So, it sounds like the camera latency is going to be shared by eye and world cameras. Can you clarify what you mean by "a fixed amount of time"? I.e. what is the value that gets subtracted? Also, do you have any 'ballpark' figures regarding the accuracy of approximation / variability for Unix? My testing has resulted in plausible values for PLRs (e.g. latency to 1% constriction = ~250 ms), but I'm keen to know if there are any ways of maximising my chances of getting good timing. For my application (detecting a flash of light), using the world camera at (320 x 240), 120 fps works just fine. Would you advise using the same settings for eye cameras? To what extent is the reliability of pupil size estimation improved by increasing the resolution?
"camera latency is going to be shared by eye and world cameras" - This is only the case if you use the same resolution for both. I would highly recommend doing that in your case if possible.
"Also, do you have any 'ballpark' figures regarding the accuracy of approximation / variability for Unix?" - Not at the moment. Let us gather some measurements to give you a more concrete answer. Once we have that, we can share the measurement script for you to replicate the measurement.
"To what extent is the reliability of pupil size estimation improved by increasing the resolution?" - Unfortunately, I do not have a concrete number for that either. Are you analysing the 2d or 3d pupil diameter?
@papr Fantastic, I really appreciate your help with this, and very much look forward to the measurements / example script. Currently I'm focusing on 3d to avoid having to correct for perspective (and because mm units are more clinically relevant than pixels).
Hey, in the documentation you mention that "The payload, e.g. a pupil datum, encoded as msgpack ... is encoded twice". Both the first and the second encoded data are different. Which one should I use ?
I get None and nan in the head_pose data: `{'camera_extrinsics': None, 'camera_pose_matrix': None, 'camera_poses': (nan, nan, nan, nan, nan, nan), ...}`. What could be the problem?
@user-3ede08 There are two types of files
"The payload, e.g. a pupil datum, encoded as msgpack ... is encoded twice" - This only applies to .pldata files.
The camera poses are likely nan because the camera pose could not be estimated for that world frame
Sorry, I was speaking about the .pldata files. So, which one should I work with?
"The camera poses are likely nan because the camera pose could not be estimated for that world frame" @papr What should I take into account?
@user-3ede08 You are looking at the head pose tracking data, correct? Have you set up a head pose tracking model yet? If not I would recommend looking at our youtube tutorial on how to set it up. You can also open the recording in Player, open the Head Pose tracker, hit export, and get the data as CSV. This saves you the hassle of loading the data manually
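For anyone who still wants to read a .pldata file manually, here is a minimal sketch of handling the double encoding (the outer stream is (topic, payload) pairs; the payload bytes get decoded a second time):
```python
import msgpack

# Each record in the outer stream is a (topic, payload) pair; the payload
# bytes are themselves msgpack-encoded - that is the "encoded twice" part.
with open("gaze.pldata", "rb") as f:
    for topic, payload in msgpack.Unpacker(f, use_list=False, raw=False):
        datum = msgpack.unpackb(payload, raw=False)  # second decode
        print(topic, datum["timestamp"])
```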
ok @papr I will look at it, thanks
Hi, I have a question about the pupil-labs installation from the source.
Two weeks ago there were no issues running the application from source after installing 'libuvc' from GitHub. But when I try to install from scratch today, there is an error "no module named 'uvc'" even though I have already installed 'libuvc' from GitHub.
Has anyone faced this issue who can help me figure it out?
Hi @user-3f2e42, the error message means that `uvc` cannot be found, which is the Python wrapper around libuvc. Which operating system are you on? Have you installed pyuvc in your Python environment?
Hi @user-c5fb8b, currently I'm working on Ubuntu 16.04. And I installed pyuvc by cloning the Pupil Labs GitHub repository (I followed the installation guide on GitHub).
But the thing is, about two weeks ago it installed correctly and there were none of the errors I mentioned.
@user-3f2e42 what's your Python setup? Are you working in a virtual environment? Theoretically you shouldn't even need to clone pyuvc, running this in your Python environment should be sufficient:
pip install git+https://github.com/pupil-labs/pyuvc
@user-c5fb8b Ah.. okay I will try it and let you know soon, thank you! π
@user-c5fb8b I found that there is an error while installing 'pyuvc'
It returns "Building wheel for uvc (PEP 517) ... error"
Is there any more output?
there is 'error: command 'gcc' failed with exit status 1'
Did you add the udev rules as in the docs?
echo 'SUBSYSTEM=="usb", ENV{DEVTYPE}=="usb_device", GROUP="plugdev", MODE="0664"' | sudo tee /etc/udev/rules.d/10-libuvc.rules > /dev/null
sudo udevadm trigger
Ah, I don't think so. I will try it...
Yes, there are still the errors.
Please copy all the errors
Should I close the terminal?
Okay, please wait a second..
Installation Error: pyuvc
Please check the note.
@user-3f2e42 If I understand this correctly, libuvc cannot link against your version of turbojpeg. This might be because turbojpeg was compiled with a different compiler. Did you upgrade your system between the installs? What's the reason for reinstalling, by the way? Did you run all the steps in the docs again? I would suggest you clean up all leftovers from the old installation and run everything clean again.
@user-c5fb8b Okay, I got it. I will try the installation after cleaning up all the previous files. If I face the same issue again, I will upload the notes in this chat. Thank you for your support!
As a side note I highly recommend upgrading to a newer LTS version of Ubuntu, as the setup is much easier there.
However, I know this might not always be possible.
@user-c5fb8b Okay :D Since it is not the first time I've installed the Pupil source code (it has succeeded several times before :D), I believe it will be fine even on Ubuntu 16.04.
Hey Pupil Lab community, how could I calibrate the Pupil Core to get gaze, as well as the head pose while driving ?
@user-3ede08 The head pose is independent of the gaze estimation. I would suggest using single marker calibration with a physical/printed marker in the car + apriltag markers in the car for the head pose estimation
Ok @papr thanks.
Hey guys, is there any restriction on a plugin using CUDA when running from source?
@user-94f759 no. When running from source, you can use any Python module you like. This is specifically useful if you want to use PyTorch with CUDA support for example.
@user-94f759 however, keep in mind not to block the main thread for too long, or Pupil might become unresponsive. If you have longer-running computations, you might consider using a background process.
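As a sketch of that pattern in plain Python (not a specific Pupil API; the callback name here is hypothetical): the heavy work runs in a separate process, and the per-frame callback only polls for finished results instead of blocking.
```python
from concurrent.futures import ProcessPoolExecutor

def heavy_inference(frame):
    """Placeholder for long-running work, e.g. a CUDA/PyTorch forward pass."""
    return frame  # ...replace with the actual computation...

executor = ProcessPoolExecutor(max_workers=1)
pending = None

def on_new_frame(frame):
    """Hypothetical per-frame callback on the main thread - must return quickly."""
    global pending
    if pending is not None and pending.done():
        result = pending.result()  # already finished, so this does not block
        pending = None
        # ...use result to update plugin state/visualization...
    if pending is None:
        pending = executor.submit(heavy_inference, frame)
```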
thanks !
@papr Hi! What other camera can I use with the USB-C mount now that the Realsense camera is not supported?
@user-a98526 if you want to use the built-in UVC backend, your camera needs to fulfill the following criteria:
1) UVC compatible [1] (chapters below refer to this document)
2) Support Video Interface Class Code 0x0E CC_VIDEO (see A.1)
3) Support Video Subclass Code 0x02 SC_VIDEOSTREAMING (see A.2)
4) Support for the UVC_VS_FRAME_MJPEG (0x07) video streaming interface descriptor subtype (A.6)
5) Support the UVC_FRAME_FORMAT_COMPRESSED frame format
[1] http://www.cajunbot.com/wiki/images/8/85/USB_Video_Class_1.1.pdf
For other cameras, you have to write your own video backend.
Can you recommend some cameras? Thanks!
@user-a98526 Unfortunately, I do not have any personal experience with other cameras than the cameras that are already supported by Pupil Capture.
Hey guys! I'm a developer in Korea using Pupil in a medical device. I can't find any C or C++ native code. Can I get it?
@user-bbee68 Pupil is mainly written in Python. You can, however, access our network API (https://docs.pupil-labs.com/developer/core/network-api/) using zmq from C or C++.
"@user-bbee68 Pupil is mainly written in Python. You can, however, access our network API (https://docs.pupil-labs.com/developer/core/network-api/) using zmq from C or C++." @papr thanks! I read them!
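For reference, the same request/subscribe flow in Python (ports and message formats as described in the linked docs; the pattern is identical with libzmq in C/C++):
```python
import msgpack
import zmq

ctx = zmq.Context()

# Ask Pupil Remote (default port 50020) for the port of the data stream.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to gaze data; each message is a topic frame + a msgpack payload.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.subscribe("gaze.")

topic, payload = subscriber.recv_multipart()
print(topic, msgpack.unpackb(payload, raw=False))
```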
I wanted to ask about the cameras that Pupil already supports, because the Realsense I use doesn't work very well.
@user-a98526 By "the cameras that are already supported by Pupil Capture", I was referring to the "High speed camera" which is the default scene camera for the Pupil Core headset. https://pupil-labs.com/cart/?pupil_w120_e200b=1
So please let me clarify: I do not have any experience with other USB-C connected scene cameras.
I think I need the information on this default camera, then buy and use it.
I mean the information about the "High speed camera" and where I can get it. Thanks for your help.
@user-a98526 I do not know if this camera can be purchased separately. I will forward your question to our sales team and come back to you.
@papr thank you !
Hi, I have a question: why is my recording a grey screen for the entire length of the experiment when I open it in Pupil Player? Any ideas on how to fix it?
@user-f3cfc3 it looks like your scene camera was not recorded correctly. Do you have a world.mp4 file in your recording?
Hmm, it looks like the world.mp4 files are missing. I'll look into this further. Thank you!
Hey Pupil Labs members, in the .pldata files we have `{'base_data': ({'circle_3d': {'center': (...`. Does the `center` parameter refer to the center of each pupil?
@user-3ede08 this should be the center of the pupil as a 3d circle in eye pinhole camera space units (x, y, z). @papr or @user-c5fb8b feel free to elaborate.
@user-3ede08: @wrp is correct here. The base data contains the pupil datums which were used to calculate the gaze (?) of your .pldata file. The content in base_data essentially has the same format as the pupil_positions.csv export (only hierarchical instead of flat unrolled), you can find more information in the docs: https://docs.pupil-labs.com/core/software/pupil-player/#pupil-positions-csv
I have used both `center` values of the `{'base_data': ({'circle_3d': {'center': (..., ..., ...) ...` in the gaze.pldata file, and I thought I had calculated my interpupillary distance. I got about 35 mm, but wherever I look, the value should be about 60 mm.
@user-3ede08 Pupil data is located in the eye camera coordinate system. Each eye camera has their own, unrelated coordinate system. During calibration we assume a fixed interpupillary distance (IPD) (or more specific the location of the 3d eye ball centers in relation to the scene camera) Therefore, you cannot use the 3d eye ball position to calculate the IPD.
ok, thanks to all of you.
Hi, I still have a problem with my Pupil Core. I had two cameras with the same name; I then changed the friendly name of one in the registry, but I still can't see it in Capture.
@user-bd800a ~~Where did you change the names? In Pupil Mobile?~~ I misread your question. Pupil Capture expects a specific set of camera names, i.e. `Pupil Cam1 ID2`, `Pupil Cam2 ID0`, and `Pupil Cam2 ID1`.
In the Windows registry, using the driver key.
I have 2 sets of glasses, one worked normally, the other had two Cam2 ID0
I changed one
then switch back to the other glasses, now I also have again two Cam2 ID0
actually on the one that worked I have two Cam1 ID0
@user-bd800a We have never tested such a name change in the registry. So I do not know if that actually changes something for Capture.
It does not, I think, since I only see the world and one eye camera.
Do you see the duplicated names in the device manager or in Capture?
device manager
ok, please ignore that as long as Capture correctly recognizes the camera names.
Capture does not
Could you please test that on a separate computer, ideally macOS or Linux, and verify the duplicated name appears there, too? Given that you made changes to the registry, I cannot tell if this is an issue with Capture, the drivers, or the device itself.
@user-bd800a The best way to verify it is by (1) downloading and running the most recent version of Pupil Capture, (2) going to the `Video Source` menu, (3) enabling "manual camera selection", and (4) checking which cameras are listed in the "Activate Device" selector.
Ok, with the latest version it worked and fixed the issue also with the previous versions
thanks a lot
@papr my friend! Can I use Pupil on an Android device directly?
Maybe Pupil Core on Android? Or iOS?
@user-bbee68 You can record Pupil Core video on Android using Pupil Mobile. But you will have to transfer the recordings to a computer running Pupil Player for analysis https://docs.pupil-labs.com/core/software/pupil-mobile/#pupil-mobile
"@user-bbee68 You can record Pupil Core video on Android using Pupil Mobile. But you will have to transfer the recordings to a computer running Pupil Player for analysis https://docs.pupil-labs.com/core/software/pupil-mobile/#pupil-mobile" @papr oh thanks! One more thing: does it do gaze tracking on Android, like in a Google Cardboard environment? Or does that only work with a Vive or something?
@user-bbee68 Pupil Mobile does not do any live analysis. It only records video from the connected eye tracker. Therefore, it is not usable in any VR/AR-like environment.
@papr aha! Thanks! I use it in a medical center for dizziness patients. They are mostly elderly and not good with computers, but they can use an Android phone, and we suggest it as a kind of rehabilitation for dizziness disease. Anyway, thanks!
@user-bbee68 What type of data are you looking for in regard to the dizziness disease? This sounds more like a use case for Pupil Invisible: https://pupil-labs.com/products/invisible/ Please be aware that Pupil Invisible does not provide any pupillometry data yet. If you have more questions in regard to that, please check out the invisible channel or mail your questions to info@pupil-labs.com
It comes with an Android phone and a Companion app that is very easy to use.
Does anyone know what might be causing this error when I load a recording into the new version of Pupil Player? I didn't encounter this with the older versions of the software
@user-430fc1 This is a very short recording, correct?
@papr yes, only a few seconds testing
@user-430fc1 This is an issue with very short recordings and the drawing of the timeline. It does not influence your data or anything. You should not see this error in longer recordings. We will look into the cause of the error.
@papr FYI, I still encounter this error with ~2min recordings, but it does not seem to affect the data.
@user-430fc1 I have to correct myself, there is actually a way of setting the exposure mode via notification: You can start any plugin remotely with custom arguments via the notification:
python { "subject": "start_plugin", "name": "<plugin-name>", "args": <dict-with-arguments>, }
The `UVC_Source` plugin is the plugin handling the UVC connection and it accepts `uvc_controls` as an argument in `__init__` for starting with custom UVC settings. With that you should be able to basically restart the `UVC_Source` with modified UVC settings. You will need to do some digging in the code to figure out the necessary startup args, however. @user-c5fb8b Thanks again for this. I've tried various ways of doing this with a notification but I can't seem to find the right arguments, and Pupil Capture always becomes unresponsive. Here's what I've been doing: `notify(pupil_remote, {"subject":"start_plugin", "name":"UVC_Source", "args":{"uvc_controls":{"exposure_mode":"manual"}}})`. If you can offer any insights they would be greatly appreciated!
`exposure_mode` is not part of the "uvc_controls". https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/video_capture/uvc_backend.py#L59-L68
Try
```python
notify(
    pupil_remote,
    {
        "subject": "start_plugin",
        "name": "UVC_Source",
        "args": {
            "frame_size": (1920, 1080),
            "frame_rate": 30,
            "exposure_mode": "manual",
        },
    },
)
```
frame size and frame rate are non-optional
I would also recommend setting `name` or `preferred_names`.
@papr Thanks - I sent the above notification and included `"name": "World"` - it didn't crash Capture, but it looks like it killed the world process. I'll play about with it a bit more. Thanks for your help. The reason I want to do this, by the way, is that my application requires specific camera settings, and it would be easiest to set them programmatically rather than remember to do it manually in Pupil Capture at startup.
Please try "Pupil Cam1 ID2" instead
See these default arguments as reference https://github.com/pupil-labs/pupil/blob/master/pupil_src/launchables/world.py#L291-L305
@papr Bingo. Thanks!
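Putting the pieces of this thread together, the working notification would look roughly like this (frame size/rate are assumed defaults from the linked world.py, so adjust to your setup; `notify` and `pupil_remote` are the same helpers as in the messages above):
```python
notify(
    pupil_remote,
    {
        "subject": "start_plugin",
        "name": "UVC_Source",
        "args": {
            "frame_size": (1280, 720),   # assumed defaults - see the linked world.py
            "frame_rate": 30,
            "exposure_mode": "manual",
            "preferred_names": ["Pupil Cam1 ID2"],  # scene camera name, as suggested above
        },
    },
)
```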
Hi, I have a very basic question: when running Pupil Capture, should a window pop up with the eye camera view? I'm running on a Windows 10 computer and ran Pupil Capture as an administrator. A command window popped up and a window with the World camera view, but not an eye camera view. When I tried to calibrate, no data was recorded, so I think something might be wrong with the eye camera connection. I tried deleting the camera device drivers and unplugging/replugging the USB to reinstall the drivers, but that did not fix the problem.
I believe it should show the world view and then one window for each eye camera view. Does it show any error in the terminal/command window that pops up when you open pupil capture?
I'm having an issue where Pupil Player (latest version) doesn't seem to notice when I change the "Minimum Pupil Confidence" value. When I try to calculate the calibration, it still shows the message "##%% of Pupil data was discarded because confidence >.8"
Hello, this is a basic question, but I am trying to analyze my video recording for fixations at certain AOIs, and I wanted to confirm: is the yellow circle (which indicates that a fixation exists at that frame) where the person's gaze is, or is the gaze still located at the purple-colored dot that normally represents the gaze position? These do not match up in the frame.
@papr Hi, when I use USB to connect the Pupil Core to the computer, my Pupil Capture software cannot open normally.
If I don't connect the Pupil Core to my laptop, Pupil Capture can be opened.
Hi @user-a98526, unfortunately your screenshot of the terminal window seems to be cut off on the right side, which prevents me from reading the full log. Can you try again and send a screenshot of the entire terminal window?
Hi @user-dcca8c, yes there should be windows for the eye camera views. Is this a fresh installation of Pupil? Maybe the eye processes are disabled. Can you check the general settings menu on the top right? There should be two buttons to "Detect eye 0/1", are these enabled? If they are enabled and you still don't see any eye-window, please hit the button "restart with default settings" in the general settings window. One of these should hopefully solve your problem.
Hi @user-e70d87, this is indeed a bug in the latest version of Pupil Player. We are working on a fix.
Hi @user-0f5d40, can you share a screenshot of a frame where this appears to be an issue? I'm not sure what you refer to with "purple-colored dot", the gaze indicator is by default a green circle with a red dot in the center. Please be aware that a fixation is not several gaze points at the exact same position, but several gaze points with a small distance. This is what you control with the dispersion sliders in the fixation detector menu. Because of this, some of the gaze points that belong to a fixation will not be perfectly centered at the fixation.
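To make the dispersion idea concrete, here is a toy sketch (in normalized scene coordinates; the actual detector uses angular dispersion and timing constraints): a group of gaze points is compatible with a single fixation when their maximum pairwise spread stays under the threshold.
```python
import itertools
import math

def dispersion(points):
    """Maximum pairwise distance within a group of gaze points."""
    return max(math.dist(a, b) for a, b in itertools.combinations(points, 2))

# Three nearby gaze positions in normalized scene coordinates (made-up values).
gaze_points = [(0.50, 0.50), (0.51, 0.49), (0.49, 0.51)]
print(dispersion(gaze_points) < 0.05)  # True -> compatible with a single fixation
```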
This is the whole picture.
@user-a98526 Pupil is having trouble installing the necessary drivers on your system. Were you able to run Pupil in the past, or is this your first run on this machine? I think the easiest solution is to try and uninstall/reinstall Pupil. Please uninstall Pupil via the official Windows "Add or remove programs" system settings page. Please notify us if the problem persists after reinstalling Pupil.
Hey, how long does it take to detect the markers when tracking the head? I clicked on `start detection` about 15 min ago to detect markers (running time = 25 s) and it is still running.
@user-c5fb8b I can run Pupil Capture only when the Pupil Core is not connected to the laptop. I'll try reinstalling Pupil. Thank you.
@user-a98526 this is to be expected, Pupil will only try installing the USB drivers if you have a headset connected via USB.
The marker detection bar has completed, but the Pupil Player software is still running.
"Hey, how long does it take to detect the markers when tracking the head? I clicked on `start detection` about 15 min ago to detect markers (running time = 25 s) and it is still running." @user-3ede08
Hi @user-c5fb8b! I reinstalled Pupil, but the same problem occurred.
Hey, I have a problem concerning the head pose tracker. Each time I click on `start detection` in the `online head pose tracker`, Pupil Player runs nonstop, and the only way to stop it is to force-quit it. No function is accessible, no buttons are working. How can I handle this problem?
@user-3ede08 Did you mean `offline` instead of `online` head pose tracker?
@user-3ede08 Please try again, and as soon as the application freezes (i.e. the buttons stop working), please copy the `player.log` file in the Home directory -> `pupil_player_settings` folder and share it with us.
Hi @user-a98526, are you running Capture as administrator? If not: do you see a popup in Windows requesting administrator access? Can you try right-clicking Capture and running it as administrator?
If this does not help: Pupil uses Microsoft Powershell to install the drivers in the background. Normally every Windows installation should have access to Powershell, but maybe for some reason your machine does not have it or Capture might be unable to access it. To test this, please open a command prompt window. You can do this by opening the Windows start menu and typing "Command Prompt". Please open this App. A black terminal window should open. In this window, please type:
powershell
and hit enter. Please copy and paste the text that appears afterwards, it should be something like:
Windows Powershell
Copyright (C) Microsoft Corporation. All rights reserved.
After this you can just close the Command Prompt window again.
In the Plugin Manager -> Head Pose Tracker -> (at the top) it says: Offline Head Pose Tracker. I don't see an online Head Pose Tracker.
"@user-3ede08 Did you mean `offline` instead of `online` head pose tracker?" @papr
@user-3ede08 ok, thank you. I was just asking because you wrote "online" before. But this is clarified now. If you could share that log file I would be able to check what the issue is.
@user-3ede08 I have just remembered that we made an update to Pupil Player last Thursday that fixed an issue with the Head Pose Tracker in Player. Are you using version `v2.0-182` already? If not, please download it here and try again: https://github.com/pupil-labs/pupil/releases/latest#user-content-downloads
Hi @user-c5fb8b! Translation: 'powershell' is not an internal or external command, nor a runnable program or batch file.
"@user-3ede08 I have just remembered that we made an update to Pupil Player last Thursday that fixed an issue with the Head Pose Tracker in Player. Are you using version `v2.0-182` already? If not, please download it here and try again: https://github.com/pupil-labs/pupil/releases/latest#user-content-downloads" @papr I am using the version you released about 2 weeks ago. Have you released a new one?
@user-3ede08 Yes, we updated it last Thursday.
But I can start Powershell like this: `start powershell`
@user-a98526 Ah okay! So powershell is installed, but it cannot be found. Have you made any custom modifications to your PATH environment variable on Windows?
@user-c5fb8b Because many applications have added environment variables, I'm not sure if I modified some of them.
@user-a98526 Can you open the window where you edit the environment variables?
This
I mean using the head pose, so we can detect if someone is moving their head left/right, up/down...
Will it be possible to get something like this with the Pupil Core? @user-3ede08
This is a system variable
@user-a98526 the PATH variable can be set either in the upper section (user settings) or bottom section (system settings). One of them should contain a path to the powershell directory, for me, the system PATH variable contains:
%SYSTEMROOT%\System32\WindowsPowerShell\v1.0\
You can double click on the `Path` variable to get a list of its contents.
@user-a98526 This is e.g. the content of my system's `Path` variable:
I assume the WindowsPowerShell might be missing for you?
@user-3ede08 The head pose tracker exports the head pose as a 6-components vector for each scene frame. The first three components are the rotation vector in Rodrigues format [1] and the last three components are the translation vector. I am not sure if these rotation vectors correspond to yaw/pitch/roll but if they do not, you should be able to transform them into your preferred format. My colleague who developed the plugin is currently on vacation. I will forward your question to her once she is back.
[1] https://en.wikipedia.org/wiki/Rodrigues%27_rotation_formula
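For reference, here is a sketch of that transformation using OpenCV; whether the resulting angles line up with yaw/pitch/roll in Pupil's coordinate system would still need to be verified, as noted above:
```python
import math
import numpy as np
import cv2

# First three components of an exported head pose row (hypothetical values).
rvec = np.array([0.1, -0.2, 0.05])

R, _ = cv2.Rodrigues(rvec)  # Rodrigues vector -> 3x3 rotation matrix

# Euler angles under one common (ZYX) convention - the axis naming may differ
# from Pupil's coordinate system, so verify against known head movements.
pitch = math.atan2(R[2, 1], R[2, 2])
yaw = math.atan2(-R[2, 0], math.hypot(R[2, 1], R[2, 2]))
roll = math.atan2(R[1, 0], R[0, 0])
print(math.degrees(yaw), math.degrees(pitch), math.degrees(roll))
```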
When will she be back, please?
@user-3ede08 You can expect a response around the end of the week.
ok, thanks.
@user-c5fb8b I think I am missing this environment variable.
@user-a98526 Is this your user `Path` or system `Path`?
This is the system path, and the user path is:
@user-a98526 Is this a private computer or has this computer been provided to you by your employer/university?
This is a private computer.
@user-a98526 I'm afraid you might have messed up your `Path` variable in the past. You can try just adding `%SYSTEMROOT%\System32\WindowsPowerShell\v1.0\` to your system `Path`; maybe that fixes it.
Please be aware that other functionality might also not work correctly with this setup.
I would actually recommend a clean reinstall of Windows in your case. Please never delete anything from the environment variables if you are not 100% sure that this is what you have to do. Also, try to avoid adjusting the system variables.
"@user-3ede08 I have just remembered that we made an update to Pupil Player last Thursday that fixed an issue with the Head Pose Tracker in Player. Are you using version `v2.0-182` already? If not, please download it here and try again: https://github.com/pupil-labs/pupil/releases/latest#user-content-downloads" @papr It works with the new one, thanks @papr
Hello, what is the meaning of the "Default Gaze Mapper" line in Pupil Player?
@user-39c8c4 Have you checked out our post-hoc pupil detection and gaze mapping tutorial already? https://www.youtube.com/watch?v=_Jnxi1OMMTc&list=PLi20Yl1k_57rlznaEfrXyqiF0sUtZMMLh It explains the complete workflow and how the gaze mapper relates to it.
@papr Thank you for your answer. I've already watched this video but I don't understand everything. I understand the white rectangles in the "Reference" line, but I don't understand the "Default gaze mapper" just below (the colored lines).
There are three colored lines. One refers to the calibration range, one to the mapping range, and one to the validation range. Each can be set in the corresponding sub menu.
Oh yes thank you very much for your help !
Hi everyone, I have trouble opening an old file in Pupil Player 0.7.1 and 0.4, recorded with version 0.7.3 of Pupil Capture, using Ubuntu 14.04. I am getting the error: "info.csv is not valid". The files contained in the folder are shown in the attached image. Best, Pablo
By the way, what is the main function of the head pose tracker plugin?
@user-7daa32 The idea is to set up an external coordinate system with markers in which the scene camera can be tracked. This allows you e.g. to map gaze from multiple participants into a common 3d space. Check out our tutorial on using the plugin https://www.youtube.com/watch?v=9x9h98tywFI
Hi everyone! I have a problem with the timestamps in the recording. I get a negative timestamp. What could possibly be the reason for this? How can this be rectified?
@user-2ff80a we have seen this before on Windows, but as long as the timestamps are monotonically increasing, there is nothing to worry about.
@papr thanks for the reply. But I also have another issue regarding timestamps: shouldn't this timestamp map to the real clock time? Or is it something else?
Unless you have synced it explicitly to Unix epoch or another clock, no.
I can link the corresponding documentation when I am back at my desk
@papr okay that would be great! thank you
@user-c5fb8b Hi, thank you for getting back to me. Attached are two images that represent what I'm referring to. Also, is there a default frame rate for the world & eye cameras?
@user-2ff80a https://docs.pupil-labs.com/core/terminology/#timing
@papr Thanks alot! will check this.
Hi! I'm having a problem with Pupil Player. It used to work fine, but today nothing happens; I can't seem to open it. I tried to uninstall and re-install but it still doesn't work. Also, Pupil Capture and Pupil Service work just fine. (I'm working with Windows 10.) Did anyone ever encounter the same issue?
@papr but it would also be great to know if there's any way I can get rid of the negative timestamp issue, as I have to map this to the real clock time precisely in my project.
@user-2ff80a In this case, it does not matter whether they are negative or not. The important question is how to synchronize the two clocks after the fact. For this, check out `info.player.json`. It contains the start time in Pupil and system time. Calculate the difference and apply it to the exported timestamps in order to shift the exported epoch to Unix epoch.
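A sketch of that calculation (the key names here are assumptions based on recent recording formats; double-check them against your own info.player.json):
```python
import json
import numpy as np

with open("info.player.json") as f:
    info = json.load(f)

# Difference between the two start times = offset between the clocks.
# (Key names assumed - check your file.)
offset = info["start_time_system_s"] - info["start_time_synced_s"]

gaze_ts = np.load("gaze_timestamps.npy")
gaze_unix = gaze_ts + offset  # timestamps now in Unix epoch, negative values gone
```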
@user-c87ed5 Hi, these are very old versions that you are using. Not sure if we will be able to reproduce this issue. But if you want you can share the info.csv with us and we can have a look.
@papr okay thank you!
@user-0f5d40 are these screenshots from the exported world video?
Hi @user-0fcca1, can you share the log file with us? You can find it in your home folder > pupil_player_settings > player.log
@user-0fcca1 maybe it already helps if you just clear your player user settings: in your home folder > pupil_player_settings, delete all files starting with user_settings
[email removed] it is !
@user-c5fb8b I deleted the files and now it works! Thanks a lot! Have a good day!
@papr yes, they are just for example.