Can anyone recommend a calibration method when using both computer screen and mobile during the same test?
People stay seated the whole time. We don't want to do offline calibration since we have people observing during testing. Since there's so much switching between devices there is no room for recalibrating during testing.
@user-f09f8c You can use any of the provided calibration methods. The same calibration can be used for computer screen and phone since gaze is calibrated to the world camera's coordinate system and not to a real world object.
Thanks papr. I've used screen calibration but that's not accurate when looking down at phone, possibly because the distance from person to device is different?
@user-f09f8c Please be aware that gaze estimation is usually less accurate outside of the calibration area (green polygon). The phone might be outside of this area while the subject looks down?
Hello, I'm running Pupil Capture v1.9.7 on Ubuntu 18.04.2. We are currently experiencing a bug: every time we activate the eye0 camera, the application shuts down with no warning or error message. What could it be? I skipped an update that I'm downloading right now.
@user-96755f The whole application shuts down?
yes
@user-96755f Could you please share the capture.log file in the ~/pupil_capture_settings folder?
@papr this is the log file
@user-96755f Not sure what the issue is. The log file indicates a normal shutdown. Could you try deleting the user settings files next to the log file and starting Capture again?
ok, it looks stable right now. no shutting down! Thank you
I'm trying to send remote annotations to pupil capture. This seems to work and when I open the recording in pupil player I receive a message that multiple annotations are loaded from annotation.pldata. However, when I then export the data, the exported annotations excel file is empty. Does anyone know what the problem could be? And is it possible to read .pldata files in R or with other programs?
@user-64b0d2 My suspicion is that the annotations' timestamps are not aligned with the recording. Player only exports data within the trim marks. If the annotations are outside of that time range they will not be exported. You can read the pldata files in every environment that supports msgpack. E.g. use this: https://cran.r-project.org/web/packages/msgpack/index.html
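For anyone reading pldata files in Python instead, here is a minimal sketch. It assumes the Pupil intermediate format described above: a msgpack stream of (topic, payload) tuples, where each payload is itself a msgpack-encoded datum. The function name is mine.

```python
import msgpack


def load_pldata(path):
    """Load a Pupil .pldata file into a list of (topic, datum) pairs.

    The file is a msgpack stream of (topic, payload) tuples; each
    payload is itself a msgpack-encoded dict (the actual datum).
    """
    data = []
    with open(path, "rb") as f:
        for topic, payload in msgpack.Unpacker(f, raw=False, use_list=False):
            data.append((topic, msgpack.unpackb(payload, raw=False)))
    return data
```

From there the data can be filtered by topic and by timestamp range before any further analysis.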
First time i see a github linking a discord
@user-1e8f1b it's relatively common
@user-1e8f1b Yeah, I guess that's true. But Discord is just too handy to not use it.
@wrp Is it? Other than us, I have not seen that either
Too many to cite: Vuejs has a huge discord channel for example
hey! I have a problem: suddenly the eye window of Pupil Capture doesn't open any more; only the window for the world view is visible. The eye window showed this message. Could you help me please?!
@user-07d4db I can't see any eye related messages. Is it possible that you simply turned off the eye process? Can you try starting it again from the general preferences menu?
Actually the eye process was turned on. Now I turned everything off and tried again and it worked. Thank you very much for your quick reply!
Are there any settings for changing the pupil detection intensity range of the infrared LED reflections? In Algorithm view, I have bright yellow dots in my cornea, but I also have some duller yellow dot-like-smears in my lower eyelid, probably because I'm pale and sweaty. These duller dots seem to require extra blue boxes which may or may not impact the pupil calculation.
Since which version is the 3d mode the default one?
@user-c1923d Since https://github.com/pupil-labs/pupil/commit/2fda7894d5322f5ca3434a81dcee365a62bb56e0 The next release after that commit is v1.1. So basically <v1.0 has 2d as default
@papr thanks!
Hi, I just wanted to ask how blink confidence was computed and how to interpret them
I couldn't find any details on this for the blink data format like there is for pupil and gaze
@user-dfeeb9 we convolve the Pupil confidence signal with a filter. The resulting signal is called blink confidence. It can be interpreted as how quickly did the confidence signal drop (blink onset) or spike (blink offset).
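A rough Python sketch of that idea (the filter shape and normalization here are illustrative assumptions, not the exact filter Capture uses): a step filter slid over the confidence signal responds positively to a sharp drop and negatively to a sharp rise.

```python
import numpy as np


def blink_response(confidence, history_size=10):
    """Sliding correlation of the pupil confidence signal with a step filter.

    The filter is +1 over the older half of the window and -1 over the
    newer half, so a sharp confidence drop (blink onset) yields a strong
    positive response and a sharp rise (blink offset) a negative one.
    """
    filt = np.ones(history_size)
    filt[history_size // 2:] = -1.0
    filt /= history_size / 2.0  # normalize so a full 1 -> 0 step scores ~1
    return np.correlate(np.asarray(confidence, dtype=float), filt, mode="same")
```

Thresholding the positive and negative peaks of this response then gives candidate onset/offset events.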
Hi guys - running on Ubuntu 16.04 LTS here, with Realsense D435i as world camera. Had to resolve a few compatibility issues to get pupil running from source - will post an issue later to list them for others to use. Pupil runs now with both eye cameras working fine, but the world camera failing... I get this: "world - [ERROR] video_capture.uvc_backend: Init failed. Capture is started in ghost mode. No images will be supplied."
I tried what's mentioned in https://github.com/pupil-labs/pupil/issues/1120 , did not resolve it. The realsense camera works well in realsense-viewer too - so I don't see a hardware/driver problem here...
Any thoughts? Note 1: As I'm running a D435, I'm using Realsense SDK2 and therefore pyrealsense2. Note 2: The system was already tested on Windows and works fine, so it is not a hardware issue.
Update: now working with a D435 instead of D435i ... think I better open a formal issue later with all the details. Ignore my message
@user-3f53d1 You have the wrong backend selected. Please choose the Realsense D400 manager instead of the Local USB manager.
@papr thanks for the feedback... was aware of that. Issue is after I choose Realsense, the "Activate source" list will not open. Thought that was due to the world error in the terminal. It will allow me to open the list, if I unplug-replug the camera while pupil is running. But even after that it's problematic... I get repeating input output errors after a few seconds and the frames freeze.
(Same with the D435 btw - it just took slightly longer to happen)
@user-3f53d1 I am not sure that I can follow. Could you make a screen recording of the behavior?
@papr I see two behaviours - one is what leads to repeated input/output errors - see attached.
The other is where I don't get an io_error, but then running calibration leads to pupil crashing entirely. See attached.
In both cases, the D435 is not selectable as a source when I run pupil - I need to unplug/replug (you see the warning on the screen when I unplug "EYE1: Capture failed to provide...") for it to show up in the list.
Hello all, if I have pupil capture open and I do calibration, pupil capture crashes
@user-3f53d1 Did you try disconnecting the device first and then starting Capture? Are you able to immediately select the RS device without another disconnect?
@user-888056 Hey, could you let us know which hardware, which operating system and which Capture version you are using?
Pupil version 1.11.4, MacBook, Sierra 10.12.6
@user-888056 Thank you! Which Pupil hardware and which calibration method are you using?
pupil w120 e200b
screen marker calibration
@user-888056 ok, thank you. Could you please try again and share the capture.log file in the pupil_capture_settings folder after the application has crashed.
will do
capture.log of the application crashing after calibration
@user-888056 The log file does not show any indication of a crash 🤔 Could you make a screen recording (e.g. via the built-in Quicktime Player) that shows the behavior?
sure
its 180mb cannot send here, can you give an email to send it?
@user-888056 data@pupil-labs.com
resized and send
@user-888056 Ok, thank you. This is a very low level crash, i.e. not a Python related crash. I do not know why it is correlated to the calibration end. Could you click the Report button in the crash dialogue? It should open a new window with a lot of text. Please copy this text to http://gist.github.com/, click Create secret gist, and share the resulting link with us.
@papr I am able to immediately select it in that case, though the other problems persist. I tried moving the HDD to a newer machine, and the issue with the camera was resolved! I suspect the USB3 connection on the older machine, as it is a pci-e USB 3.0 card... The issue with the calibration crashing the programme persisted though. I looked into the error: the code expected the cv2.findContours method to return 3 values when there were only 2. I fixed it by changing the code accordingly (changed `_, contours, hierarchy = cv2...` to `contours, hierarchy = cv2...`). Managed to calibrate after that!
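For others hitting this, a small version-agnostic wrapper is a common workaround. A sketch (`unpack_contours` is a hypothetical helper name; cv2 itself only appears in the docstring so the snippet stays dependency-free):

```python
def unpack_contours(result):
    """Normalize the return value of cv2.findContours across OpenCV versions.

    OpenCV 3.x returns (image, contours, hierarchy), while OpenCV 2.x and
    4.x return (contours, hierarchy). The wanted pair is always the last
    two elements of the tuple.

    Usage (assuming cv2 is installed):
        contours, hierarchy = unpack_contours(
            cv2.findContours(binary_img, cv2.RETR_EXTERNAL,
                             cv2.CHAIN_APPROX_SIMPLE))
    """
    return result[-2], result[-1]
```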
@user-3f53d1 Ah, then you have probably opencv4 installed. Great to see that you were able to solve the issue!
you want all of it?
@user-888056 Yes, please!
done....
@papr Thanks! followed the developer docs to install opencv - this was a fresh install of ubuntu. Guess that gets me opencv4. Had a number of other compatibility issues which led me to change the code too - will put a list together and perhaps post as an issue for others to find.
It is worth mentioning that if I don't have detect eye 0 and eye 1 selected, it does not crash, but calibration fails saying: not enough ref points
Hello :) Could anyone please help me and my colleague? We can't seem to make Pupil Capture recognise our Intel Realsense D400 cameras on Ubuntu or Windows.
Hi, does anyone have a script to calculate gaze velocity using the pupil lab data? Thank you
@user-5f8042 In which unit would you like to have the velocity?
hello! We wonder if, when using the Pupil Mobile app to stream data from the phone to the PC, some data can be lost due to the connection, or whether the number of samples is guaranteed either way. Thank you
@user-c1220d Yes, on heavy load of the phone or the network connection, it is possible that video frames are being dropped. Streaming is mostly meant for monitoring the headset/camera positions. We recommend using the Remote Recorder plugin to start local recordings on the phone.
ok, because I saw that when starting the recording from the phone, the data saved on it doesn't have the information Pupil Player needs to provide the .csv with pupil positions and gaze. In contrast, I have to start the recording from the PC while the glasses are plugged into the phone in order to have the complete data on my PC
does this plugin solve this?
@user-c1220d Just open the phone recording in Player and use the offline pupil detection and offline calibration plugins
@user-c1220d https://www.youtube.com/playlist?list=PLi20Yl1k_57rlznaEfrXyqiF0sUtZMMLh&disable_polymer=true
ok thank you so much
I'll give this a better try
@user-c1220d Nonetheless, you can use the remote recorder to start the recording from your computer running Capture. It is equal to pressing the recording button on the phone.
@user-522b6b Is this the exact wording?
~~What file are you dragging on top of the gray Player window?~~ Ah, you need to drag the folder including the mp4 file on top of the gray Player window.
@papr degrees/seconds. thank you
@user-5f8042 I do not have a working example yet. But I will create an example in the coming days.
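In the meantime, here is what such a script could look like (an illustrative sketch, not an official example; it assumes gaze_point_3d coordinates in mm in the world camera frame, e.g. taken from the exported gaze_positions.csv, and the function name is mine):

```python
import numpy as np


def gaze_velocity_deg_per_s(timestamps, gaze_points_3d):
    """Angular gaze velocity in degrees/second.

    timestamps:     (N,) sample timestamps in seconds
    gaze_points_3d: (N, 3) gaze points in mm, world camera coordinates
    Returns an (N-1,) array: angle between consecutive gaze directions
    divided by the time between samples.
    """
    v = np.array(gaze_points_3d, dtype=float)       # copy, do not mutate input
    v /= np.linalg.norm(v, axis=1, keepdims=True)   # unit gaze directions
    cos = np.clip(np.einsum("ij,ij->i", v[:-1], v[1:]), -1.0, 1.0)
    dtheta = np.degrees(np.arccos(cos))             # angle between samples
    dt = np.diff(np.asarray(timestamps, dtype=float))
    return dtheta / dt
```

In practice you would also filter out low-confidence samples first, since noisy pupil detections inflate the apparent velocity.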
Hi, the Intel Realsense D400 is not recognised by Pupil Capture. Can someone help me?
@user-1ddb42 Have you selected the Realsense D400 manager? The Realsense devices are not listed in the Local USB manager.
Yes, I have selected Realsense D400 in the first dropdown, but the second one stays empty
@user-1ddb42 Is it being recognised by the Realsense Viewer?
No it is not.
Then there is a general issue with your hardware. Please make sure to use fully usb-3 compliant cables and sockets on your computer.
We have 2 desktops and 3 laptops in the laboratory. Using a Linux Mint live USB we are able to get one laptop to work without crashes and one laptop to work but crash over time. One desktop seems to be working with Windows. All laptops and desktops have compatible front and back USB3 ports.
We have tried installing the Realsense driver on all of them using the same methods; on some it helps and on others it doesn't.
We were unable to install pyrealsense on all the mentioned computers, including the ones where the Realsense D400 camera works just fine.
@user-1ddb42 Pupil Capture requires pyrealsense2 to be installed. Please refer to the official librealsense repository for support in this case.
I have done that, and ended up with an issue similar to this one: https://github.com/IntelRealSense/librealsense/issues/1043
Strangely, on some computers it is not required to install pyrealsense2, and Pupil Capture works just fine.
Sorry, but this is definitively not the case. I am very sure that pyrealsense2 was installed successfully on these computers.
@papr thank you. looking forward to it
Thank you @papr for your help with offline calibration. What is the difference between selecting "natural features" and "circle marker" in the offline method? Apparently, even using "natural features", the algorithm chases the printed marker, just as the "circle marker" method does.
@user-c1220d which version are you using?
yes i had the previous one
the new 1.11 is different
thank you
Hi, @user-1ddb42 and I seem to be having driver-related issues on Win10 with the D400 Realsense cam. Both eye cams are recognized by Windows and Pupil Capture and are functioning correctly, but device manager throws a "device descriptor failed" for the Realsense cam. When we plug it into a USB2 port, device manager recognizes it properly, and it works in Realsense viewer, but it's not recognized properly in Pupil Capture. As for USB3 it isn't recognized within device manager at all.
Here's a screenshot with the device plugged into USB2
@user-222750 all realsense cameras require USB 3. USB 2 won't work.
Yes, but Win10 can't seem to recognize it properly on USB 3. We've tried installing Intel's drivers and the Realsense SDK.
@user-222750 what type of USB 3 cable are you using?
Are you using the cable that was shipped with the headset?
a Syncwire USB 3 cable, though I'm not sure if it came with the headset or the Realsense camera which was in a separate box
@user-222750 it is difficult to tell what the problem is. Please follow these steps to debug the issue: - try to get the Realsense camera working directly (not connecting it through the headset) first. - Use the Realsense Viewer to test whether the drivers work. - make sure that your USB 3 socket is USB 3 conformant
Weird, now we're consistently getting the camera to show up in Device manager but Realsense viewer always detects it as USB 2, even when it's connected into USB 3. Seems to be an issue with the general USB 3.0 drivers on the machine 😕
Hello, I would like to do a camera recording using capture. How can I select my plugin camera?
@user-888056 Do you mean a recording with a different camera than the Pupil world camera?
yes
@user-888056 Which OS do you use?
Also, your camera needs to support the UVC standard
Siera, 10.12.6
Then the only requirement is support for the UVC standard. You should be able to select the camera in the UVC Manager menu.
no options appear at uvc ...
but I am uncertain about my camera
will check. Thanks
so... this might be a silly question.
I have noticed that if I use skype or something similar and then open capture
one of my eye cameras does not work.
So there is something about which camera is active... how do I control that?
@user-888056 Only one process can access each camera at a time. It might be possible that skype is accessing your eye camera.
... my usb camera does support uvc, but nothing appears on the uvc source... any suggestions?
@user-888056 No, not off the top of my head, sorry.
thanks ok.. thanks...
Mac built-in cameras support UVC.
Shouldn't that appear?
We use https://github.com/pupil-labs/libuvc to talk to the camera. As long as this lib does not recognize your camera, Capture will not recognize it either.
@user-888056 No, it usually does not. @mpk Do you know a specific requirement that makes cameras be recognized by libuvc?
I think the camera is suspended, I have never seen the build-in mac camera show up in libuvc.
There might be an off-chance that we filter the device out, but I doubt that. Check the source code of pyuvc for that.
@user-888056 ⬆
My capture today crashes. When I open it the eye cameras are not checked. I check eye0 and everything is fine, but when I check eye1 it crashes
@user-888056 Could you please share the capture.log file in the pupil_capture_settings folder after the application has crashed?
eye1 crash
crash after eye1
@user-888056 I think this is an issue with the blink detector. Please stop the blink detector, start the eye cameras, and re-enable the blink detector
the other way around it does not crash... so if i select eye1 and then 0 it is fine?
Stopping the blink detector also resolves it
another question... i did some recording using the pupil mobile. When I open it on player, I can see the eye cameras but i cannot see the world camera
@user-888056 Please share the recording with data@pupil-labs.com It might be possible that the world video got corrupted.
ok...
I sent it... should there be a world.mp4?
@user-888056 yes!
Did you maybe not correctly copy it from the phone?
made another recording
i just drag the entire folder over
@user-888056 indeed, the world video is missing. Can you please check if the preview for id2 works on the phone?
@user-888056 it might be possible that you did not give permissions to the camera. In this case, it would show up as offline. Activating the preview should ask you for access permissions.
the preview worked
I could see what the camera was recording
@user-888056 when you start a recording, does the camera change to the capturing mode?
yes.
I will record tomorrow what it shows and send you over tomorrow
Hi. I'm very new to using Pupil Labs. I have a question: when I record a video, I can't find the timestamps and X and Y positions in the recording folder (should I turn on something before recording?)
I found my answer!
Hello, I think I have a mechanical problem with one of the cameras in one of our eye trackers. Sometimes the camera shows vertical lines. I am not sure it is mechanical, as sometimes if I open Capture a few times it stops doing it...
Hi, I have a file gaze.pldata. I think it is the gaze coordinate file, am I correct??
and how do i open it?
@user-15e6c0 This is an intermediate recording format. Open the recording in Player and export the data using the Raw Data Exporter to get the data as csv. Alternatively, check out the documentation of the format: https://docs.pupil-labs.com/#data-files or this example that extracts pupil data from the pupil.pldata file, which has the same format: https://gist.github.com/papr/743784a4510a95d6f462970bd1c23972
@user-888056 Please send an email to info@pupil-labs.com regarding the stripe issue
@papr thanks mate
Hi, what is the difference between the eye image frame and the world image frame in the export files?
@user-8fd8f6 Do you mean eye and world frame index?
@papr Actually I saw in the export data we have norm_pos_x - x position in the eye image frame in normalized coordinates in the pupil_positions.csv data and also have norm_pos_x - x position in the world image frame in normalized coordinates in the gaze_positions.csv. I want to know what is the difference between them.
Has anyone used Pupil Mobile (the Android app) with a DIY headset? Obviously you would need a USB hub, but I wonder if this is possible, either in theory or in practice? We've bought one system, but we can't afford multiple systems without going the DIY route
@user-5054b6 in theory this should work.
@user-8fd8f6 a pupil position in the eye camera coordinate system does not tell you where the person was looking in the world image. For that you need gaze data, which is in the world camera coordinate system
@papr Thank you so much for your help. Do you know the units of the gaze 3d data? (Are they mm, inches, or pixels?)
@user-8fd8f6 mm
@papr Do you know their reference frame?
Yes, it is the world camera's undistorted camera coordinate system
@papr Thank you. But my problem is that the norm diagram and the 3d diagram don't show the same behavior. What is the difference between them?
Trying to run pupil from source on MacOSX
nevermind, fixed it
@user-8fd8f6 norm_pos are in distorted image space, 3d gaze are in undistorted camera space.
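As a practical illustration of the norm_pos convention (origin at the bottom left, values in [0, 1], whereas image pixels have their origin at the top left), a small conversion helper (the function name is mine):

```python
def norm_to_pixel(norm_pos, frame_size):
    """Convert a Pupil norm_pos (origin bottom-left, range 0..1) to pixel
    coordinates (origin top-left), e.g. for drawing gaze on the world frame.

    norm_pos:   (x, y) normalized coordinates
    frame_size: (width, height) of the (distorted) image in pixels
    """
    width, height = frame_size
    x, y = norm_pos
    return x * width, (1.0 - y) * height  # flip y: bottom-left -> top-left
```

Note that this yields positions in the distorted image; comparing them directly against 3d gaze (undistorted camera space) requires the camera intrinsics.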
@papr Hi, I'm going to connect my Android phone to the eye tracker. I use a female-female USB 3 connector between the eye tracker's cable and the mobile cable, but I can't see any signal on my phone. Is it because of my connections?
@user-8fd8f6 not sure if I understand. Could you please post a picture of the setup?
@user-8fd8f6 Mmh, yes, the setup could be at fault. I would recommend to get a proper c-to-c cable
@papr Do I need to connect my phone application to the Pupil Capture software on a PC when I want to use the phone application via some remote host or IP? (I connected my phone to the eye tracker directly but the application didn't work)
@user-8fd8f6 no, the eye tracker needs to be connected to the phone. You connect with Capture to the app via wifi
Or you connect the eye tracker directly to the computer
@papr I can connect to the computer but I can't connect to the phone!
@user-8fd8f6 as I said, that setup above might be the issue
What socket does your phone have, micro USB or USB c?
micro USB
@user-8fd8f6 ah, this is the issue. A usb-c socket is required. But even this might not be enough
See the Pupil Mobile repository for a list of phones that are known to work
How can I find the list? Do you have the link?
@user-8fd8f6 https://github.com/pupil-labs/pupil-mobile-app/blob/master/README.md#supported-hardware
Hello! I am using the Pupil Labs Hololens add-on with Unity. I have run into a problem where the UDPCommunication.cs script is throwing an error with the information it receives from the Hololens relay. In the InterpreteUDPData function the "data" array throws an error when it is being processed into a float array from the byte array in the FloatArrayFromPacket function. I have seen one other user run across this problem ( @user-d3a1b6 ) but I don't see any resolution. I created a similar solution to his where, when the data array is length 17 instead of 16, I cut off the last or first piece of data, but both of these solutions have produced data that is not accurate. Any help on the problem would be wonderful, and any pointers to a solution as it was first brought to someone's attention would be great also. Thanks!
@user-e8a795 please see the vr-ar channel for questions regarding the unity project
Okie dokie, I've posted the question there also.
Hi, I've just purchased the headset. We want to use it for research purposes at our university. I'm just writing the test protocol and was trying to find if someone has already created a list of exclusion criteria for people to be involved as participants - where the gaze-tracking system doesn't work. Any assistance would be greatly appreciated! Thanks
@user-078b83 What kind of research are you doing? I exclude oculomotor dysfunctions like strabismus, phorias, and nystagmus; a pupil under ptosis is hard to measure. Amblyopia might lead to incorrect binocular gaze. Those can be diagnosed by an optometrist.
An app was developed a while ago to undertake a card sorting activity that's normally completed by OTs. The app didn't deliver the same results as a physical card sort. We are investigating the differences. The information you have given me is great!
May I ask what are OTs?
Sorry ... Occupational Therapists.
Hello everyone! I would like to talk with a member of the Pupil company if possible
Hi @user-bc4266 What is your question?
Hello papr, I wrote an email to your company on Friday
I’m contacting you from Veeso, a company specialized in face tracking for VR
We would like to test your eye tracking system and we have some questions
Probably better for a private conversation
Then writing an email was the correct step.
We will be in touch.
Thank you very much, appreciate you quick response
Hello, I just got a new camera supporting UVC, but I still only get "capture initialisation failed". MacBook, Sierra 10.12.6
Hey @user-888056 have you checked if the cameras are recognized when running Capture on Linux?
i am on mac, but I can see the camera is recognised
Logitech Webcam C930e:
Product ID: 0x0843
Vendor ID: 0x046d (Logitech Inc.)
Version: 0.13
Serial Number: F3571F8E
Speed: Up to 480 Mb/sec
Location ID: 0x14200000 / 25
Current Available (mA): 500
Current Required (mA): 500
Extra Operating Current (mA): 0
and i can select it in other applications e.g. skype...
@papr sorry, I realised that you asked something different. I don't have access to a linux machine right now. But I will test tomorrow on windows. I want to do the time synch between two machines...
Mmh, that is weird. I know this camera and it should work.
I don't get an error, it's just that nothing happens...
In the UVC Manager, having selected "Local USB" as "manager", is it listed in the "Select to activate" selection?
yes.
And what happens if you click it?
! So when I selected it, it crashed, but after re-opening it is working!
Please share the capture.log file in the pupil_capture_settings folder in case it crashes again. (Share it before restarting the application, else the log will be overwritten.)
yes, I will. Thank you!
hi everyone! i am having trouble setting up the 3d pupil detection correctly with the htc vive-based trackers. the default settings do not seem to work for me (tried with various people as well). so far i have tried: adjusting focus, adjusting the headset, adjusting ROIs, adjusting the pupil min/max/intensity parameters, and the image postprocessing parameters. i'd be happy for any kind of hints 😀 i can also provide a video of what i got out from pupil in the best case. i'm also using the latest software version. thanks in advance 😃
@papr Hi, I connected the eye tracker to a phone by USB C and it works, but I can't see the gaze data after capturing the video. Should I turn on something on the phone or on the computer?
@user-8fd8f6 checkout our offline pupil detection and offline calibration feature
@papr could you please send the link?
@user-8fd8f6 https://www.youtube.com/playlist?list=PLi20Yl1k_57rlznaEfrXyqiF0sUtZMMLh
@user-bd0840 please share this video with data@pupil-labs.com
@papr will do, thanks!
Why can't I launch the pupil player ?
I use windows 10
Thanks
@user-0a2ebc Please close Player, delete the user_settings_* files in the pupil_player_settings folder and try again
@papr such user_settings files don't exist
@user-0a2ebc please check if there are updates for your graphics driver
I'm very interested in the timing of eye movements with presented stimuli. Have you thought about the best way to timestamp presented stimuli with eye position? For example, a photodiode signal (5V TTL analog or digital)? Or is the best available option a zmq message for when you turn on a stimulus on a monitor or LED light? Thanks!
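One common approach with Pupil Capture is to send remote annotations over zmq via Pupil Remote, so each stimulus onset is stamped in Pupil time. A sketch (it assumes Pupil Remote is running on its default port 50020 and the Annotation plugin is enabled; `make_annotation` and `send_stimulus_marker` are my own helper names):

```python
import msgpack


def make_annotation(label, timestamp, duration=0.0):
    """Build an annotation datum in the shape Capture's Annotation
    plugin expects (label, timestamp, and duration are the key fields)."""
    return {"topic": "annotation", "label": label,
            "timestamp": timestamp, "duration": duration}


def send_stimulus_marker(label, host="127.0.0.1", remote_port=50020):
    """Query Pupil Remote for the PUB port and current Pupil time, then
    publish an annotation marking stimulus onset."""
    import zmq  # imported lazily so make_annotation stays pyzmq-free

    ctx = zmq.Context.instance()
    remote = ctx.socket(zmq.REQ)
    remote.connect(f"tcp://{host}:{remote_port}")
    remote.send_string("PUB_PORT")        # ask for the IPC publisher port
    pub_port = remote.recv_string()
    remote.send_string("t")               # current Pupil time
    timestamp = float(remote.recv_string())

    pub = ctx.socket(zmq.PUB)
    pub.connect(f"tcp://{host}:{pub_port}")
    annotation = make_annotation(label, timestamp)
    pub.send_string(annotation["topic"], flags=zmq.SNDMORE)
    pub.send(msgpack.packb(annotation, use_bin_type=True))
```

For tighter timing than software messages allow, a photodiode on the monitor recorded alongside is still the gold standard; the zmq route is typically good to a few milliseconds over a local connection.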
@papr: did you already have a chance to look at the video I mailed over yesterday?
@user-bd0840 not yet. Currently it is a bit busy because we are preparing the new release. This might take until next week.
@papr understood. Do you have some suggestions for what I could check in the meantime? On your YouTube channel I found only videos showing captures using the glasses-mounted Pupil tracker version. Do you have some recordings for any of the HMD versions, so I could maybe compare parameters or something?
@user-bd0840 you already have tried a lot. I will have to have closer look to check what is wrong.
Maybe @fxlange has some suggestions in the vr-ar channel
@papr i'll try that. if you find time to check what i sent over before next week, i'd appreciate that a lot, as i'd need the eye trackers working pretty soon. thanks for looking into it!
I have a pupil hardware question. While I was using Pupil Capture, suddenly the world camera just went gray. No amount of restarting Pupil Capture, or the computer (or trying the hardware on a different computer) will produce anything other than a solid gray frame. Do you have any suggestions for how to fix this?
@user-bd0840 we're having similar issues getting decent 3D tracking performance with the vive addon. I think there is a need for a calibration troubleshooting document to go through the various items to try to improve calibration performance, but as far as I know one doesn't yet exist. If you find any settings that improve your track, do let me know, and maybe we can start a document to develop as a community.
@user-6997ad it'd be really great to do that! i'll keep you and the community posted 😃
Any tip on how to synchronize screen recording(Not world camera) and the gaze recording?
@papr it seems that running the cams at 640x480 at 120fps improves things tremendously for me. there's a hint about that in the docs, but i think that hint right in the software might be even more helpful. would you accept a pr for that?
@user-bd0840 Mmh, this does not generally hold true in my experience. It would be more of a last-resort thing to try.
Higher resolution also results in higher cpu requirements for the pupil detection
@papr sure, but your docs also state "Use 640x480 or 320x240 resolution to get 120fps and a good view of the eye. Other resolutions will crop the eye images."
and sure, higher image size means higher cpu utilisation. it's just that with 720p, i could not get useful tracking at all, while with [email removed] i got okay-ish tracking in 30s of setup
At no point I would have recommended to use 720p. Did you just try it out or was this recommended somewhere explicitly?
ah, and do you have more docs available on the steps you are doing for pupil segmentation and eye fitting? the 3d detector uses https://www.cl.cam.ac.uk/research/rainbow/projects/eyemodelfit/ for the model, right?
i have to be honest, i tried it because it seemed sensible to do it as they are 720p and i hope for better precision with higher res. i only found the remark in the docs yesterday. so your recommendation would rather be going to 320?
Check this report on the 2d detection: https://arxiv.org/abs/1405.0006 You are right on the 3d model
Yes, I would rather go 320x240
okay, thank you
now back to my original question -- would you like a PR that mentions these things in the GUI?
@papr thanks for the arxiv link. are there major differences between the preprint and the ubicomp paper? and what about the 3d detection?
okay, just tried with 320. even better than 640 actually
Where would you put the documentation exactly?
3d detection is just the swirski model with the 2d detections as input
how about in the sensor settings? after the warning to not change during calibration and recording
okay, great, thanks!
Changing sensor settings (other than resolution) should not have any effect on the calibration
but i am talking exactly about resolution
I see. Yeah, a warning there might be helpful indeed
want a PR for that?
Sure!
Thank you
👍
no problem, sorry for confusing you 😉
Don't worry 🙂
just the one question again: any major differences between the preprint and the ubicomp paper?
Not that I know of
and i guess the reason 720 is not working well is the cropping, right?
plus increased load
Also, my guess is that finding connected pupil edges might be more difficult
But this is definitely just a guess
The biggest reason for going 320x240p is higher frame rates and lower load.
Our 200Hz eye cams operate on 200x200 pixels btw
are you using them for the hmd versions now as well?
@user-bd0840 yes, the new vive add ons use them
nice! do you have plans to create addons for the windows mixed reality series hmds as well?
I can't make a statement about that.
understood ^^
i do like your readme on bundling, Figure out a setup that bundles by your own. (Expect a world of pain.)
😄
@user-bd0840 it is 200% true
Hi there! I am using the Pupil headsets on my 2017 MacBook Pro (macOS 10.14.1) and I'm having difficulty getting the audio capture plugin to work with my mics. I first tried the built-in mic, which gave me an error saying it was incompatible (looking through past messages, this seems common, so no big deal). Then I tried using my H5 Zoom four track recorder (set to stereo so only one track) and it also gets an incompatible message. Are there any specifications as to which mics ARE compatible? I don't want to buy another just to find it is also incompatible. If possible, I'd love to use the Zoom, so any advice is welcome!
Any way to synchronize screen recording and the gaze recording?
@papr i will check
Hi, how can I create a gaze coordinate file with fixation and saccade information from the data?
i have the .pldata files
@papr can you help me regarding this, i am new to pupil labs
@user-15e6c0 We do not have a saccade detector yet. If you simply want to load the data and apply your own classification algorithm, check out this script: https://gist.github.com/papr/743784a4510a95d6f462970bd1c23972
It showcases how to load pldata files
@papr I was using Tobii files for my analysis until now; the Tobii gaze coordinate file has event names like Fixations, Saccades, Blinks, etc.
My Pupil headset just arrived... I am able to use it with a phone, but we have failed to also monitor what is being displayed on the phone from our laptop. I can't find any guidelines for live streaming to the laptop. Can anyone help, please?
Thanks
@papr loading the data is fine, but I won't be able to use it for any analysis because I need event data such as fixations or saccades
@user-15e6c0 you can get fixations using the offline fixation detector in pupil player. I haven't done it myself so I can't help you any more than that.
hello, just opened a new box of glasses. Confidence is really high (>90%) but I cannot pass calibration. It starts but I cannot get any markers to turn green. I get stuck on the first...
@user-888056 what are your settings? camera resolution, pupil min/max and intensity?
camera: 1280x720, min = 0, max = 8, where is intensity?
same mistake that i made, run the cam at 320x240 or 640x480 at 120fps
the edge detector for the pupil doesn't work ... that well at 720p
Interesting, because I was using a different pair of glasses with the same settings and no issues....
It did not resolve my problem....
Anyone have any information about what audio recording devices are compatible with the audio plugin? @papr
Capture keeps crashing
ok, so I unselected everything and now capture does not crash and calibration is successful
Should I indeed change the resolution? What is the benefit of lowering it?
things seem to be fine with 1280x720, I would like to use this recording so high resolution is important ....
@user-888056 admittedly, my discussion with @papr about the resolution was for the hmd-based trackers. probably it's different for the regular ones. but great it's working now for you!
hi all, does anyone know if the pupil labs eye tracker can be used within a matlab script?
I guess you can provide that yourself, for example through a ROS node (or a similar thing)
@user-7ed11b have you looked at https://github.com/pupil-labs/pupil-helpers/tree/master/matlab ?
The phone is a Motorola
@user-888056 is calibration marker (circle marker) robustly detected by the world camera? What calibration method are you using?
@user-0a2ebc you need to do the following:
1. Android device and computer running Pupil Capture are connected to the same wifi network (dedicated wifi router for streaming is ideal)
2. Start Pupil Mobile with cameras connected and check that you can preview each sensor on Android
3. Start Pupil Capture on your desktop/laptop, select the Pupil Mobile backend from the capture selection, and choose Pupil Cam1 ID2 for the world camera, Pupil Cam2 ID0 for eye0, and Pupil Cam2 ID1 for eye1
4. You should now see video preview in Pupil Capture
Now it is... I was trying both screen marker and single marker. My problem is resolved now, just by unchecking everything. It is worth mentioning that it is hard to change the settings because sometimes Capture crashes very early on, which makes it difficult to uncheck options. I just sent the log in case it is of debugging interest.
@user-0a2ebc see also: https://docs.pupil-labs.com/#capture-selection
@wrp thanks a lot i will try
hi everyone, I am using a HoloLens with the Pupil Labs eye tracker add-on. I want to track the eye positions during a scene in Unity. At first, I imported Pupil.Import.Package.HoloLens.unitypackage and started the Unity scene Calibration.unity. But in the Unity console I get this multiple times: "Unknown response: R". This seems not to be normal. Besides, Unity says that the calibration was successful but Pupil Capture says that the calibration was stopped. Then my scene is loaded. Now I don't know what to do next. Can anyone help me?
@wrp thanks for advice! It does not seem to make a difference for me!
Hello, my Pupil Service is crashing again and again and does not work until I unplug the USB cable and plug it back into the computer.
version using : pupil_service_windows_x64_v1.11-4
I have also tried using pupil capture but it's also crashing the same way. Please suggest some solution.
The world cam on our headset seems to have stopped working -- suddenly it just went completely gray. No amount of restarting Pupil Capture, or the computer (or trying the hardware on a different computer) will produce anything other than a solid gray frame. Is there anything else to try, or a way to fix this? Or do we need to return it under warranty? Thanks!
We've tracked down that our calibration issue seems to be caused by tracking failure of one eye camera only (eye1), despite equivalent configs and no noticeable difference in the eye images or focus. Support suggested we try an ROI to crop out eyelashes (which we tried to do as seen in this video: https://drive.google.com/open?id=1vpg2d5SIXJDW34PAiqowHeip-to9D4NC) but the rectangular bounding box doesn't make it possible to fully crop to just the eye area, and eyelashes seem to not impede tracking for eye0. Has anyone else overcome issues like this? We're using Vive addon and pupil capture.
@user-6997ad you can set the right border of the upper ROI much more to the left. You don't have to include the whole sclera, just enough to fit all pupil ellipses. Additionally, you can raise the pupil min value, so the algorithm searches for bigger ellipses.
@user-29e10a this helped quite a bit! Increasing the min radius to 25 and cropping even more of the frame gets us a decent binocular track. Thanks for that!
@user-52b112 please email [email removed] with your order id and we can help diagnose the issue with the world camera.
@user-a5f66b what operating system are you using?
I don't have Pupil hardware, but I have the data files: eye videos, world video, and other related data like .gldata etc. I want to do the eye tracking analysis that I usually do on a Tobii data file exported from Tobii Pro. How can I perform the analysis? I need the fixation, saccade, and other information in the gaze coordinate file. Can anyone help me with this? I have read all the documentation and can't figure out anything.
@user-15e6c0 Pupil Player does not support any Tobii formats. gldata is not equal to pldata. You will need to change your recording to a Pupil Player compatible format to open it with Player.
Hello~ We tried the Pupil docs' https://docs.pupil-labs.com/#boost section but we got the messages shown in this image... How can I resolve this issue? I cannot find the 'stage' folder in work\boost
@user-09f6c7 setting up from source is quite tricky/laborious on Windows. Have you looked at @user-54a6a8 's PR https://github.com/pupil-labs/pupil/pull/1455 ?
@wrp thanks a lot. I'll checkout it.
@wrp I am using Windows 10.
@wrp I could see this error when I used Visual Studio debugger for my project..
@wrp Thanks; have emailed about the problem.
Hi everyone, I want to make a simple program in Python which will change an LED's intensity in real time. The idea is simple: the further the eye gaze is from the (0, 0) point, the higher the intensity, etc. The problem is I don't know what data I need and from which file to import it. I've found something (norm_pos and screen_pos), but I don't know if it's usable in my project. Thanks in advance.
Trying to install third party plugins in Pupil Capture, but there is no pupil_capture_settings folder. How are they installed?
@user-1e8f1b do you run Capture from source?
@user-d28f08 check out our network api and subscription to Realtime gaze data
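A minimal sketch of such a subscriber, assuming Pupil Remote is enabled on its default port 50020 (the gaze_to_intensity mapping is a hypothetical example for the LED use case, not part of the Pupil API):

```python
import zmq
import msgpack


def gaze_to_intensity(norm_pos, max_dist=2 ** 0.5):
    """Hypothetical mapping: distance of the gaze from (0, 0) -> 0..255 LED value.

    norm_pos is Pupil's normalized position, with (0, 0) and (1, 1) at
    opposite corners of the calibrated area.
    """
    x, y = norm_pos
    dist = (x * x + y * y) ** 0.5
    return min(255, int(255 * dist / max_dist))


def main():
    ctx = zmq.Context()
    # Ask Pupil Remote (default port 50020) for the SUB port
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://127.0.0.1:50020")
    req.send_string("SUB_PORT")
    sub_port = req.recv_string()

    # Subscribe to all gaze topics
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://127.0.0.1:{sub_port}")
    sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

    while True:
        topic, payload = sub.recv_multipart()
        gaze = msgpack.loads(payload, raw=False)
        intensity = gaze_to_intensity(gaze["norm_pos"])
        # ... drive the LED here, e.g. via pyserial or GPIO

# main() only works with Pupil Capture running and Pupil Remote enabled.
```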
@papr I downloaded the windows pack "pupil_v1.11-4blabla_windows_x64"
@user-1e8f1b have you run Capture already? It creates the folder during the first run
Yep, i did several times
Check your user folder or search for it
@papr Sorry to keep asking, just not getting any info. Does anyone know what audio capture devices work with Pupil capture on Macbook pros? I have now tried 3 different devices and all have given a hardware not supported error. Cannot find any documentation on hardware specifications.
@papr You are right, it is in the user folder. Sorry for the question, I didn't read that it was there (AND THANKS!)
@user-358ea2 there are no specific specifications. It is a known issue that audio recording on macos is currently not fully functional for all microphones. We are looking into potential causes
@papr thank you! Are there any devices known to work with macos?
Considering most seem to not
@user-358ea2 sorry, if this was not communicated clearly enough
@user-358ea2 I do not have a list right now. I will test tomorrow.
@papr Thank you! I look forward to hearing back
@user-358ea2 have you been able to test an analog mic?
Hello, one of my eye cameras appears dark, and with low resolution... is there anything i can do to fix that ?
Hey, just a quick check-back (I'm using the HTC Vive Add-On): when I start Pupil Capture (v1.11), I keep getting loads of UVC warnings (see below). This just keeps going as long as Capture is active. Is that something that should bother me or that I could do something against? (For now the HMD is just lying next to me, not calibrated, etc ... but that seems unrelated. Broadcast from eye cameras seems to be intact - or at least looks like it.) Any feedback or experience with similar issues highly appreciated. I did not find any conclusive info online. thanks!
@user-141bcd can you detect any issues with the cabling?
@user-888056 please contact info@pupil-labs.com regarding this issue
@papr thanks for the quick feedback. Exactly what I needed. Getting rid of the USB extension resolved the issue. Thx!
Hello! I have some problems with pupil player, the program stops responding when turning on certain plugins (actually it stops responding when doing almost anything but it usually comes back.)
When turning on offline surface tracking the program stops and does not come back even after a 10 minute wait. This has been working somewhat fine before.
I've tried rebooting, deleting the player settings folder, re-downloading the software. Suggestions?
@user-cac57e Is this something specific to the current version or the selected recording?
It doesn't seem to be working with any recording as for now.
It has been working up until today.
@user-cac57e Did you install a new version?
I did not change anything before I got this problem, it worked before I left yesterday. First time trying it today it did not work. From previous experience I know deleting the settings folder solves some of these problems but not this time. Upon which I downloaded the app files again, which just gave me exactly the same result.
If neither the program files, nor the user settings, nor the recording changed, it has to be something with your computer 🤔
Any updates downloading/installing etc in the background? Difficult to tell
Is Capture slower than usual as well?
I kinda figured the same, but nothing that I am aware of has changed (I haven't used the computer since yesterday). Maybe windows update... Doubt that would cause this though.
It works on my colleagues computer with the same files.
Capture works as expected on my machine. I will try on another device this evening to see if I can solve the issue myself.
In the fixations, how do we get a negative gaze_point_3d_z ?
@user-888056 depth is estimated via vergence. Sometimes the gaze rays intersect behind the subject, especially if he/she looks far away. @user-92dca7 what was your recommendation on how to correct this during post processing?
If I am interested in depth measurements in general, does it matter for accuracy at what distance I place my calibration markers?
and another question: how do you calculate dispersion? I have a lot of NaN values
@papr I have not yet, as we don't have one. Have you found success using one?
Can I somewhere get a lens for my Pupil world camera without an infrared filter?
Yes @user-54376c email us at info@pupil-labs.com
Already broke the filter out, thanks 😂
When loading a file into the Player, I can see that it has detected 634 fixations. However, looking through the video and the exported data, I see that only half of the fixations are taken into account (the video stops counting at 300) and only half are readable in the exported files.
What can be the issue?
Thanks.
@user-c4a4d3 what do you mean by "the video stops counting at 300"?
I can see in the fixation plugin to the right that it has detected 634 fixations, but as I go through the video it doesn't go past 300 fixations. The other 300 fixations in the latter half of the video are not readable in the exported text files and cannot be seen with the fixation detection.
@user-c4a4d3 This seems as if your world video is much shorter than your eye videos. The exporters only export data within the trim marks, which are set to the start and end of the world video by default.
I don't think that's the problem because I can see the whole recorded world video in the Player, but the fixations stop being "detected" after 300.
Imagine having an eye recording with timestamps [0, 1, 2, 3, 4] and a world video with timestamps [0, 1]. Next, imagine pupil positions [0, 1] and [3, 4] form a fixation each with timestamps 0.5 and 3.5 respectively. Now, Player will play back the world video from 0 to 1 seconds. At 0.5 seconds the first fixation will be shown. The second fixation will never be displayed or exported since it is out of the world video's time bounds.
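The same filtering can be sketched in Python (a hypothetical helper, not Player's actual exporter code):

```python
def exportable_fixations(fixations, world_timestamps):
    """Keep only fixations whose timestamp lies within the world video's
    time bounds - which is effectively what Player's exporter does when
    the trim marks cover the whole world video."""
    start, end = world_timestamps[0], world_timestamps[-1]
    return [f for f in fixations if start <= f["timestamp"] <= end]
```

With the numbers from the example above (fixations at 0.5 and 3.5, world timestamps [0, 1]), only the fixation at 0.5 would be exported.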
I'm not sure how to interpret that. But in the csv file with fixations on surface, the same frame number is shown for a lot of rows; it goes from, say, frame 4600 to frame 7427 repeated xxx number of times (probably the same as the missing fixations). Is that any clue?
The trim marks in the world video are set to the whole video. My colleague and I have tried on two different devices and only this recording has this issue.
If you look at the "start frame", it is the same frame for many rows for 321 rows, the same number of fixations (321) that we are missing.
This is indeed strange! How is this file named exactly?
Hi, is it possible to use the pupil without the world camera? My lab ordered a headset but with only one eye camera and no world camera, and now it seems that I cannot calibrate
@user-ce3ac5 What do you want to calibrate the device against, if it does not have a world camera?
The calibration is only necessary to map from eye to world coordinates. Without a world coordinate system, a calibration is neither possible nor useful.
I see, thank you!
@papr It is called "fixations_on_surface_Surface1_15556089732.309056"
@user-c4a4d3 Can you check if you see a similar issue in the fixations.csv file?
Looks like it. The frames are continuous until 4182, then make giant leaps.
@user-c4a4d3 Your second screenshot does not show the same subsection as the first one.
The "fixations.csv" actually end like that. What I printed are the last rows of that file
~~Ok, but what do you mean with "frames are continuous until 4182" then?~~ ah wait
Hello, I am trying to do online surface detection.
Although I can see that the surface is detected
Am i missing something?
Can you switch the mode
To show the surface instead of the marker ids?
@user-888056 Generally, it is very dark and the markers are comparably small.
if I play long enough there are times that all of them are detected at the same time
but I will try with bigger markers and brighter lighting
You should see all markers in blue in order to perform surface tracking well
Ok, then remove the current surface, seek to the location where all markers are visible, and add a new surface
ok
success
thank you
do you have any update on what to do about negative z coordinates on fixations?
@papr Any clues on my problem above? I have noticed that it happens on more data collected just now
Hi @user-888056, in hmd-eyes we are flipping the "gaze_point_3d" if the gaze point is behind the viewer (via vec *= -1). The gaze direction is still accurate in these cases, only the gaze depth/distance is off - like a negative magnitude.
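In Python, the same correction could look like this (a sketch assuming, as described above, that a negative gaze_point_3d z value means vergence placed the point behind the viewer):

```python
import numpy as np


def correct_gaze_point(gaze_point_3d):
    """If the 3d gaze point landed behind the viewer (negative z),
    flip it through the origin, mirroring the hmd-eyes vec *= -1 fix.
    The direction stays valid; only the depth magnitude was off."""
    p = np.asarray(gaze_point_3d, dtype=float)
    if p[2] < 0:
        p = -p
    return p
```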
so I can just invert it
for the entire recording I was looking forward... why would the point be behind the viewer?
Can the Pupil cameras be moved to adapt to ergonomic constraints?
@user-6fdb19 Yes, you can adjust the eye camera positions
My player is crashing constantly https://gist.github.com/alexZ82/9522105788bc93e1c43255734ec76d1f
@user-888056 can you run /Applications/Pupil Player.app/Contents/MacOS/pupil_player from your terminal? Please check if there is a Python crash log. Also, please add the player.log contents to your gist file
Hi Pupil team, I'm testing the Capture app for mac. I get the following error when selecting a video file source via the backend manager
@user-1164a3 Please share the capture.log file in the pupil_capture_settings folder
I added the player.log at the gist
Ok, it is not a Python level crash
It crashes after it opens any file. It just hangs ...
any clues about what to do with the player?
i can see it crashes after it populates references and the default gaze mapper
@user-888056 this is not a crash. The app is 'hanging'. Have you tried waiting? Maybe your recording is big and a format update is running?
i have tried. I also made new recordings which are very short
but it still hangs
@user-888056 can you share one of the shorter recordings with data@pupil-labs.com ?
What is contained in "fixations_timestamps.npy"? Is that readable?
@user-c4a4d3 yes, it is. Check the documentation on the details of the data format.
Hi, I was searching for Lech Swirski's "Gaze estimation on glasses-based stereoscopic displays", Chapter 7.4.2 "Cyclopean gaze estimate", which is mentioned in gaze_mappers.py. Can anyone provide me the link? Thank you
Hello! I'm working on the project related to eye tracking. Can you help me a little bit?
Is it possible to launch pupil software without the world camera?
@user-e08fba it is possible
@papr how?
@user-e08fba the software should launch without any camera connected
If you only have eye cameras connected, the eye processes should be able to run the Pupil detection
I'll try
Hi group. Our lab are visiting a local school next week to talk to the kids (age 13-14) about psychology, brains, research etc. I'd like to bring our pupil headset and a laptop with me and use it to demonstrate eye-tracking and teach them a bit about how the eye and visual system works. I'm curious if anyone here has done a demo like this before, either with kids or adults? If so, what sort of things did you do with the participants? Also, is anyone aware of any pupil labs eye-tracking games or demos that I could potentially take advantage of? Thanks!
@papr I went to try an analog mic on my 2018 MacBook Pro, however we found the machine does not have an audio line in. Do you know if others have successfully used a Bluetooth mic, or run the Pupil programs on Windows via Boot Camp?
@user-358ea2 your laptop has mic in. You need to use a compound headset or splitter.
I posted earlier about a problem with fixations not being exported correctly. I have now noticed that when that happens the audio is also out of sync.
We can see in the log that we get a lot of these messages:
2019-04-29 16:52:28,265 - world - [DEBUG] fixation_detector: Resetting history
2019-04-29 16:49:37,980 - world - [DEBUG] calibration_routines.calibrate: Binocular match rejected due to time dispersion criterion
And also this:
2019-04-29 16:47:14,672 - world - [INFO] calibration_routines.finish_calibration: Collected 559 monocular calibration data.
2019-04-29 16:47:14,672 - world - [INFO] calibration_routines.finish_calibration: Collected 0 binocular calibration data.
Any clue what is going on? Thanks.
When things are working we can see that monocular calibration data is almost the same as binocular calibration data.
Hi guys, I recently used pupil_capture and worked well, but it doesn't work anymore and I don't find the reason why... Here is a screenshot of what I get. Thanks !