hey can anyone tell me how to use pupil capture player from laptop cam?
@user-d81c81 Please be aware that Pupil Capture is designed to work with head-mounted eye trackers. Your integrated laptop camera is not suited for this kind of eye tracking. You should look for remote eye tracking software.
ok
if we give a video as an input can we get the eyeball coordinates in that particular time frame?
I am not sure if I understand what you mean by "particular time frame".
Do you mean the time frame during which the video was recorded?
if the video is 15 sec long then i need the eyeball coordinates from 2 sec to 15 sec, i.e. eyeball coordinates for each second of the video
If you have a recording that is compatible with Pupil Player (see our documentation for details), then you can run the pupil detection algorithm offline and export the data in any range that you would like.
ok
Hi, what's the maximum duration of a recording made with pupil mobile before it stops by itself? Thank you.
hey i'm not able to install pyrealsense i'm getting error status 2
Hi - I am trying to use an Intel RealSense D435 camera as the input to the world view. Capture does not recognize the camera. I am using Windows 10, installed the RealSense SDK, and am able to view the camera through that. Please advise on missing steps and whether this is possible.
I'm looking for open eyetracking VR/AR datasets, which I could use for my master thesis. The description of this repo (https://github.com/pupil-labs/web-media) mentions datasets, but I couldn't find any. Are there any datasets available?
@user-6b1b1f We will release Capture v1.10 soon. It will add support for the D400 Realsense Series.
OK. Thanks. would v1.10 work on linux and realsense or still the same issue?
Also, how soon is soon?
@user-6b1b1f You can run from source already if you want
Just install the Linux source dependencies + pip3 install pyrealsense2
and checkout this branch: https://github.com/pupil-labs/pupil/tree/realsense2
It is mostly finished and only requires a bit more testing.
Hello! My company just bought a Pupil eye tracker and I am trying to understand how it works. After calibration I get this message. What am I doing wrong?
Hi! We just bought the Pupil Labs eye tracker for our PhD thesis and aren't able to open the files we've recorded in Pupil Player. It always says "The document "world.mp4" could not be opened. Pupil Player cannot open files in the folder format." This happens with every data format in the recordings folder. Pupil Player itself gives the feedback "Oops! That was not a valid recording." Could anybody help us? Thanks
@user-145ba0 please open Player first and try to drop the folder containing the recording on top of the gray window.
@user-acb13a are the eye windows open?
@papr Thank you so much. We've tried to open the single documents before.
@papr yes. I just made eye models. Changed some eye settings
And which calibration routine did you run? Make sure that the displayed markers are in the field of view of the world camera
@papr What are the displayed markers? This pink point?
After changing the eye settings this point stopped moving. Does that mean the settings were changed incorrectly?
Hi everyone, I upgraded Capture and Player with the November releases (no more bug with surface creation, thank you so much!). I didn't update my JS script with the new annotation method before recording new sessions, and I can no longer export annotations from Player even though I can still see them in notify.pldata. Is it possible to export the notifications in any way? Or should I simply use the former Player to export those annotations? Thanks!
@user-acb13a no, that is the gaze estimate. I mean the black and white concentric circles displayed during the calibration procedure.
Your calibration is all over the place 😅 Did you move your head during the calibration?
If we want to extend the datum, for example add another computed value like "Blink Detected", is the recommended approach to create our own datum on the ZMQ bus, yes?
@user-a6a5f2 if you want that data saved during a recording, you will need a simple plugin that reads the data from the IPC and exposes it to the recorder.
Thanks, @papr! One loose idea for the future is to provide a "datum metadata" feature, perhaps a simple XML file which we could edit to add new items for a given pupil datum.
Could you give a concrete use case example?
Not quite concrete, just a general/loose concept that comes to mind from database and reporting tools where you can create database views or computed columns in a report from existing columns/elements. For my current use case, the computation is too complex, so I will need to create a plugin anyway.
One other benefit could be to allow non-pupil developers who are developing strictly against the ZeroMQ bus (e.g. they are a VB.NET shop with no Python or Matlab experience) to have a richer integration with Pupil. By editing this XML file, an optional attribute/flag would specify whether their custom datum should be registered with the recorder?
For this to happen, the recorder would have to subscribe to all ipc messages and look for that attribute. This is not very efficient. I think an alternative would be a general purpose plugin that can specify those topics in the UI that should be recorded. Afterwards the plugin can subscribe explicitly to these topics.
Yes, that sounds good.
Hi @papr. Thanks for pointing me to the source code. I installed pyrealsense2, and also librealsense2 with supported drivers (I can see my stereo cam using the realsense_viewer). However, Pupil Capture 1.9.33 still does not recognize the stereo camera. I tried on my Linux Ubuntu 18.04 install, which runs Pupil no problem. I still see "[INFO] video_capture: Install pyrealsense to use the Intel RealSense backend" in the capture log. One question: the Pupil docs say that Intel RealSense is not supported; is this still the case in 1.9.33? Thanks.
@papr in this variant the calibration used a single marker :) As I understood it, with this kind of calibration you should move your head)
@user-6b1b1f You will need to git checkout the branch linked above instead of the master branch. The docs have not been updated in this matter.
@user-acb13a That is correct! In this case, I would recommend to move the head a bit slower while fixating the center of the marker.
Thanks. I ran the right code this time and got some version errors. Is there a way to work around this?
Error calling git: "Command '['git', 'describe', '--tags']' returned non-zero exit status 128." output: "b'fatal: not a git repository (or any of the parent directories): .git\n'"
Traceback (most recent call last):
  File "main.py", line 43, in <module>
    app_version = get_version(version_file)
  File "/home/initial/pupil_rs2/pupil_src/shared_modules/version_utils.py", line 59, in get_version
    version = pupil_version()
  File "/home/initial/pupil_rs2/pupil_src/shared_modules/version_utils.py", line 43, in pupil_version
    raise ValueError("Version Error")
ValueError: Version Error
@user-6b1b1f You probably downloaded the zip. You need to git clone the repository and then git checkout realsense2
hey @papr i'm not able to build pyrealsense wheels in python, please help me out
@user-d81c81 which realsense camera do you have? R200 or D400?
neither, i'm building it for my laptop front cam
@user-d81c81 why would you need the realsense backend then? It is optional.
i'm using pupil player and it's asking to install pyrealsense
@user-d81c81 Do you mean Pupil Capture?
You only need pyrealsense if you have a RealSense camera. Pupil Capture runs just fine without pyrealsense.
Your webcam needs to support the UVC standard in order to use it in Capture. Additionally, you might need to install our drivers manually if you are on Windows. Note that you won't be able to use your webcam for other applications if you install these drivers.
ok
i'm not able to access pupil player; it's running but i'm unable to maximise it and give it the input file
Which input file?
the video recording captured by pupil capture
You need to drop the whole folder, not just the video file
yeah i know that; the issue is that pupil player is not getting maximised
I don't know what you mean by that. Can you do a screen recording that shows the issue?
i have the pupil capture output. what should i do next to get the eyeball coordinates?
i think the pupil detection algorithm has been run. how do i export the data that has been analysed?
You can export by hitting e in the Pupil Player window.
Hello! I am looking into fixation algorithms that can be used for eyetracking. Is there any info or file available? I can only find info on your website about the fact that there is an online and offline version of a fixation detector. Thank you in advance!
@user-09d181 We use a max-dispersion-min-duration-based fixation detector
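To give a rough idea of how such a detector works, here is a minimal I-DT style sketch in Python. The data layout and the threshold values are illustrative only, not Pupil's actual implementation:

```python
def detect_fixations(gaze, max_dispersion=0.05, min_duration=0.1):
    """I-DT style sketch: a fixation is a maximal run of gaze samples
    whose spatial dispersion stays below max_dispersion (normalized
    coordinates) and which lasts at least min_duration seconds.
    `gaze` is a list of (timestamp, x, y) tuples sorted by time."""
    fixations = []
    start = 0
    while start < len(gaze):
        end = start + 1
        # grow the window while the dispersion stays below the threshold
        while end < len(gaze):
            xs = [g[1] for g in gaze[start:end + 1]]
            ys = [g[2] for g in gaze[start:end + 1]]
            dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
            if dispersion > max_dispersion:
                break
            end += 1
        duration = gaze[end - 1][0] - gaze[start][0]
        if duration >= min_duration:
            # record the fixation as a (start_time, end_time) pair
            fixations.append((gaze[start][0], gaze[end - 1][0]))
            start = end
        else:
            start += 1
    return fixations
```

Pupil's own detector additionally works on gaze angles rather than raw normalized coordinates, so treat this purely as an outline of the max-dispersion/min-duration idea.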
Thank you!
this might be a dumb question, but how would you fit the world camera on top of existing prescription glasses?
@papr - Got it and working great on Linux so far. Thank you.
@user-6b1b1f nice! Are you running Ubuntu 18.04?
Yes, I am running 18.04 ubuntu.
Hello, I would be happy for your help with two things I encounter when recording/exporting the data: 1. What does the error in the attached photo mean, how does it affect the data, and how can I prevent it? 2. Why is the green dot of the gaze point absent in some frames of the recording, even though there is data for those frames in the exported gaze positions? Thanks!
hi there - i'm wondering if pupil will work with a Windows Surface? if so, any recommendations on which i should buy? i want to be more portable when using pupil and not use a mobile device
@papr, back in July, you mentioned that the 3d model is being reworked to include refraction. Is that refraction of the cornea and/or crystalline lens? Or is that refraction from prescription lenses? Also any updates on ETA?
@user-910385 MS Surface Pro would be a good device for Pupil. I don't have any experience with the non-Pro version of MS Surface.
@user-a6a5f2 the work is still underway - no concrete ETA on release, but hopefully soon. The refraction is of the eye/cornea.
Thanks, @papr
@wrp great, thank you!
Hi, I did some recordings with the binocular model in 3D mode, both eye cams running. Unfortunately the exporter does not export the right eye (eye0). Has anyone come across that problem before?
I tried offline pupil detection but it did not work either. On Win10 with Pupil Labs 1.9.
@user-a6a5f2 Hi, to your question: the crystalline lens is just behind the pupil and has no optical effect on the pupil. There is a paper from Kai Dierkes about corneal refraction and its influence on apparent pupil shape: 'A novel approach to single camera, glint-free 3D eye model fitting including corneal refraction'
Hello. I have an issue with pupil player when reading the result file. It says "Oops! That was not a valid recording"
@user-a16531 Nina please explain the procedure you are following to read the recorded data in the player
@user-e711e0 You mean the location of the player or the recorded data?
Hi @user-a16531, please make sure to drop the whole folder onto Player, not only the video file.
@papr I did so. Some say that it's because of the file name, so I renamed it to consist of only numbers, i.e. 001, but that didn't solve it. Some people fixed it by editing info.csv or something. Can you tell me more about this?
Wait, what exactly did you modify?
The file or the folder name?
Sorry the folder name.
And these are the folders that you dropped onto Player? That also directly include the info.csv file?
Yes. Please tell me what I can try next.
Please share a recording with data@pupil-labs.com and I will have a look at the recording.
Thank you for the help. Just sent it to data@pupil-labs.com with an email titled Nina...
@user-a16531 The issue is the user name in the info.csv file. Please edit that line and it will work.
Hi everyone! I am currently trying to install pupil-labs on an Odroid XU4. I know there was some discussion on GitHub; I will also ask my question there, so sorry for the double posting.
Question 1: I did not manage to get pytorch onto the XU4 easily, so I have abandoned that for now. Is it possible to run pupil-capture without pytorch installed?
Question 2: Besides pytorch, I could install everything when going through the running-from-source tutorial in the docs (I mean install all the dependencies). I have an Odroid XU4 with armv7l architecture, with Ubuntu 18 and the 4.14 kernel. I was trying to use Pupil 1.9. When I start with "python main.py capture" in pupil_src I get the error:
File ".../eye.py", line 636: make_coord_system_pixel_based((*window_size[::-1], 3), g_pool.flip)
SyntaxError: can use starred expression only as assignment target
I use python 3.5, but I had the impression that this line is not so python 3.6 or 3.7 specific?
ok, I just figured out I actually have python 3.4.3 in the miniconda environment. I will try to install python 3.6 manually on the box and see if that fixes it...
I'm sorry if someone has already asked this, but the search crashes if I type more than one word. I am using manual marker calibration and keep getting an error message that says "WORLD: Not enough ref points or pupil data for calibration." What might be causing this?
@user-14d189 thanks. I had just read that crystalline lens is fully responsible for accommodation at close distances and naively wondered if that could be useful for inferring where user is gazing in depth/z.
http://hyperphysics.phy-astr.gsu.edu/hbase/vision/rfreye.html#annotations:R3SZjPmyEeimfLflCM4Zcg
Hi @papr , a couple of months ago I was enquiring about the possibility of Batch Exporting a large number of files. To recap, I'm just wanting to extract prerecorded pupil, gaze, surface and annotation data (not doing any online processing)...ideally without having to manually load anything into the pupil player GUI. You suggested this should be possible, and directed me towards some files in the shared_modules of the github (https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/file_methods.py#L114). I just got round to looking at this code, and it's unclear how I'm supposed to use it to batch export the files? Can you clarify which code snippets I should be using to load and export the data? Thanks!
@user-2798d6 Make sure that your manual marker is detected. You should see a visualization in the world video during the calibration phase.
@user-e7102b The linked function is used to load the legacy pupil_data files. If you want to load the newer .pldata format, you need to use https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/file_methods.py#L137
The Incremental_Legacy_Pupil_Data_Loader is initialized with the recording directory, while the load_pldata_file function takes the recording directory and a topic. It will load data from the <topic>.pldata file within the given directory.
E.g. how we use the Incremental_Legacy_Pupil_Data_Loader: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/player_methods.py#L615-L616
These are just components for a batch export script. You still need to write the boilerplate code around it to pass multiple recordings and export the loaded data in the format of your choice.
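For what it's worth, the directory-walking boilerplate around those loaders could look something like the sketch below. `export_fn` is a placeholder for whatever calls load_pldata_file on a recording and writes out your chosen output format; treating any folder containing an info.csv as a recording is an assumption on my part:

```python
import os


def find_recordings(root):
    """Yield every directory below `root` that looks like a Pupil
    recording, i.e. one that directly contains an info.csv file."""
    for dirpath, _dirnames, filenames in os.walk(root):
        if "info.csv" in filenames:
            yield dirpath


def batch_export(root, export_fn, topics=("pupil", "gaze")):
    """Call export_fn(recording_dir, topic) for every recording/topic
    pair and return the list of processed pairs. Plug the pldata
    loading and the CSV/JSON writing into export_fn."""
    processed = []
    for rec in sorted(find_recordings(root)):
        for topic in topics:
            export_fn(rec, topic)
            processed.append((rec, topic))
    return processed
```

You would then run this once over your parent data directory instead of dropping each recording onto Player by hand.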
Hi, I just got a OnePlus 6T to use with Pupil Mobile. I went to the Play Store and saw that the app is incompatible with the device. Is there a major obstacle that makes the 6T incompatible, or will it be compatible with an update soon? If I install an APK of the app, should it work?
Hi @user-0858b6 Pupil Mobile is only incompatible with Android 9. As long as you run Android <9, you should be fine. No guarantees though.
thanks, I will try to downgrade. So you don't expect an app upgrade for Android 9 soon?
From what I understand, this is not easy to fix.
downgrading the OS or upgrading the app?
Both, but mainly fixing the detection issue on Android 9
@user-a6a5f2 well, there is a known interaction between accommodation, vergence and pupil dilation. Furthermore, during dilation changes the pupil center can shift its absolute position relative to the optical axis of the eye. Minor effects... they may need to be taken into account.
@PeterBHVI interesting. Makes sense.
hi everyone, an update: I could get pupil-capture to run on the Odroid XU4 in general with Pupil v1.7, since I did not get pytorch to run. Is there an option to run 1.9 without pytorch? I used python 3.6 without miniconda since there is no modern python for the armv7l architecture in miniconda. I installed all the dependencies as in the running-from-source tutorial.
I have one issue open though: if I run pupil-capture I get about 1 FPS. I have not yet tested whether this is only the display and the video is at full speed, or whether recording per se is also this slow. It seems to me that the problem is that the Odroid has OpenGL ES and not OpenGL. Is there anything I could do about that? I am mainly concerned about the capturing.
@user-8944cb These are mjpeg related error messages. You can most-likely ignore them. Let me know if you have any other issues with your eye/world videos.
a question: when I run pupil capture from source I always get the error "libGL error: unable to load driver: exynos_dri.so". I guess this is the reason why it is so slow? Any ideas how to fix that?
Hey @user-bab6ad , this is on arm, right? Unfortunately, I have no other idea than to google for that driver and check if it is available for your device
@papr ok, that was what I was doing so far. Basically it seems that this is the GPU SoC the Odroid XU4 uses (among some other boards like the BeagleBoard). I will now try to install custom drivers. Since I am doing that, I wonder if I need to rebuild any of the dependencies from the run-from-source tutorial. If yes, can you point me to which?
Potentially opengl, glfw, pyglui
ok, will try that, thx!
Hi @papr, thanks for your reply! About the green gaze point in the recordings: if it is absent, or has a faded color compared to usual, but there is still exported data in those frames, why might this happen? Thank you!
Hello, I am not sure if this is the right place to ask: the camera for the right pupil is upside down for me, how can this be fixed?
@user-0982e6 this is normal. There is a button in the software to flip the image.
Sorry I didn't send the capture log earlier. So we are using one camera with just the world camera and the other with the world camera and the eye tracker.
The main issue recently has been that the eye tracker (here "parent") goes into ghost mode and disconnects on the phone. For this reason the laptop doesn't play either.
Is there a way I can recover either the laptop or phone videos? And what's the best solution to this problem? I have changed routers but that's not helping much.
@mpk you mean in the pupil capture software? I cant seem to find it
@user-0982e6 in the eye window
Ahh, got it, thanks 😃
Hello, I am using the 200fps (192,192) resolution pupil labs to record pupil position during head movements. I am only recording from one eye (eye1). After putting the pupil labs on, I start with a world camera intrinsics calibration, then I do a manual marker calibration with the wearer's head in a fixed position. Within pupil capture I visually validate the calibration and then record. When I go to export the data from pupil player I get the following error message:
2018-12-11 11:17:50,719 - player - [INFO] camera_models: No user calibration found for camera eye1 at resolution (192, 192)
2018-12-11 11:17:50,719 - player - [INFO] camera_models: No pre-recorded calibration available
2018-12-11 11:17:50,719 - player - [WARNING] camera_models: Loading dummy calibration
However, I can still export data and a gaze_positions spreadsheet is created. Is it using my calibration when creating the gaze_positions file or the dummy calibration? Thank you for your help!
@user-88c6c9 Hi. The camera intrinsics calibration is only applied to the world camera. Seeing this message for the eye camera is expected.
@papr Is it expected even if I do the manual marker calibration as well?
Do you mean if it is expected to get that message during a manual marker calibration?
Not exactly. I mean that before recording, I did a manual marker calibration. Then, after recording I go to export the raw data from pupil player and I want to make sure that it is using my calibration and not the dummy calibration since it appears that it can't find a calibration for the eye camera I'm using.
Ah, the camera intrinsics calibration and the manual marker calibration are meant for two different things. The former estimates the distortion of the world camera (you usually only need to do this once), while the latter estimates the pupil-to-gaze mapping function.
pupil refers to data in the eye camera coordinate system, while gaze refers to data in the world camera coordinate system
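To illustrate the pupil-to-gaze mapping that the marker calibration estimates, here is a toy least-squares fit. Pupil's actual 2D mapper uses higher-order polynomials, so this affine version is only a sketch of the idea, not the real gaze mapper:

```python
import numpy as np


def fit_gaze_mapper(pupil_xy, gaze_xy):
    """Least-squares fit of an affine map from eye-camera pupil
    coordinates to world-camera gaze coordinates. The calibration
    markers provide the matched (pupil_xy, gaze_xy) sample pairs."""
    pupil_xy = np.asarray(pupil_xy, dtype=float)
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    # design matrix [x, y, 1]; solve for a 3x2 coefficient matrix
    A = np.column_stack([pupil_xy, np.ones(len(pupil_xy))])
    coeffs, *_ = np.linalg.lstsq(A, gaze_xy, rcond=None)

    def mapper(xy):
        xy = np.asarray(xy, dtype=float)
        return np.column_stack([xy, np.ones(len(xy))]) @ coeffs

    return mapper
```

Once such a mapper is fitted from the calibration samples, every later pupil position can be mapped into world-camera (gaze) coordinates.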
That makes sense and I understand the distinction. What I'm wondering is if the manual marker calibration is being used to generate the gaze raw data or if the dummy calibration is being used. Since the PupilPlayer error states that it can't find a user calibration, even though I did the manual marker calibration.
Ah, I understand. Please be aware that the message says: "camera_models: No user calibration found for camera eye1 at resolution"
In this case, the eye1 process tries to load the user-calibrated camera intrinsics for the eye1 camera. But since there is none, the message is printed.
This message is not related to the manual marker (gaze) calibration.
Thank you so much for the clarification and I apologize for the confusion.
@user-88c6c9 don't worry, that's what the channel is for
Hello - In trying to focus and adjust the world camera on my headset, the whole camera unscrewed and a bunch of pieces fell out. Is this fixable?
Please write an email to info[at]pupil-labs[dot]com
Done! Thank you!
Hi, I have been trying several recordings outdoors and I consistently find that the 2D calibration is more accurate than the 3D calibration. Any idea why this could be?
@user-2be752 Hi Teresa, as I understand it, the 2D pupil positions from the eye cameras are mapped to 2D gaze coordinates in the world camera image. The 3D model requires the 3D position of the eye rotation center and the 3D pupil position to calculate vergence and the 3D gaze focus point; all calculations involve very small angles. The binocular 3D gaze mapper visualizes this very well.
@papr Hi Papr, just want to check if you had time to look into some recording I sent to you. The pupil data from eye0 was not extracted. The videos are good. Is it possible to get the data out afterwards?
@user-14d189 I am having a look right now
@papr Legend! Thank you
Question: is the "world camera" coordinate system / reference frame outside the body-fixed reference frame or the head reference frame? Thank you
It is the coordinate system of the world/scene camera. Therefore it approximates the head reference frame, but it isn't exactly it.
Hello, we are trying to calculate Gaze Velocity as used by Di Stasi et al in "Gaze entropy reflects surgical task load." Surgical endoscopy 30, no. 11 (2016): 5034-5043. This requires an x,y position (measured in degrees of visual angle), posted at camera frequency. The closest I have been able to get is Dispersion in the offline fixation calculator, but it is not at the correct frequency. Is there a way to get at-frequency degrees of visual angle postings? The best work around I can think of is to max the sensitivity on the fixation detector, so that most timestamps will post "fixations", and then use that data output to calculate gaze velocity. THANK YOU!
A question: I could let the pupil capture run on the odroid xu4. The graphics driver seems ok, but the capture is super slow. Can I switch off certain things or so to make it faster?
How can I make Pupil Capture remember the settings I've set in it? Every time I load it it is set back to default settings and I have to re-input all my changes to the default settings. Thanks.
@user-bab6ad you can turn off the pupil detection in the general settings of the world window. This should save you a lot of CPU.
@user-d3a1b6 This should not be happening. Usually, the settings are being remembered. Are you running a custom plugin? Which version of capture do you run?
@user-a08c80 Hi campti01, do you need it live? Otherwise you need to look into the gaze data. My approach: from the center of eye rotation you calculate the difference between the normal vectors of consecutive data points. This gives you the relative movement of the pupil. For small angles you can use the sine or tan function to calculate velocity between data points. If you want to work with absolute pupil position you need to consider the arbitrary value for the distance from eye rotation center to pupil center (12 mm in Pupil Labs). Cheers
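For illustration, the normal-vector approach described above could be sketched like this. It assumes unit-length 3D gaze normals such as the circle_3d_normal fields in the export; the function name and data layout are my own:

```python
import math


def angular_velocities(samples):
    """Given (timestamp, (nx, ny, nz)) samples of unit gaze normals,
    return the angular velocity in degrees/second between each pair of
    consecutive samples."""
    velocities = []
    for (t0, n0), (t1, n1) in zip(samples, samples[1:]):
        # the dot product of two unit vectors is the cosine of the
        # angle between them; clamp to guard against float round-off
        dot = sum(a * b for a, b in zip(n0, n1))
        dot = max(-1.0, min(1.0, dot))
        angle = math.degrees(math.acos(dot))
        velocities.append(angle / (t1 - t0))
    return velocities
```

This yields an at-frequency gaze-velocity trace directly from the exported data, without going through the fixation detector.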
Hi papr, I need to frequently swap the eye tracker between people. Is there a way to delete the previous calibration and start 'fresh' with every new person? I get a bit more variance with initial measurements.
@user-14d189 do you mean online or offline? Online, simply start a new calibration hitting C. Offline, create multiple calibration sections with adjusted ranges
@papr I am running 1.9, I seemed to have fixed it by deleting the settings folder in pupil capture settings and letting capture recreate the folder, I think there was a conflict with the previous 1.8 versions' settings
@user-d3a1b6 ah, yes, we do not load settings from previous versions.
@papr thx for the suggestion. It is still too slow. I guess it has to do with the Odroid, which might just be too weak. We will try a different board for now.
@user-bab6ad I heard good things about the Microsoft Surface pro in this regard.
Hi! Can somebody explain me or send any links about how to work with pupil min and pupil max values?
Would be very grateful!
@user-6c10b2 The pupil detection algorithm discards ellipse candidates that are out of bounds of these limits (outliers). See https://arxiv.org/pdf/1405.0006 for details
@papr Thanks a lot! But what is the difference between the min and max values? Should I match the min outlier bound to the pupil image size for best confidence?
@user-bc5d02 The confidence is not dependent on these values. Usually, you do not need to adjust these values -- only if you have unusually big/small pupil sizes that are discarded as outliers
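Conceptually, the min/max settings act as a simple band-pass filter on candidate size, something like the following (illustrative only, not the actual detector code; field name and defaults are assumptions):

```python
def within_pupil_bounds(candidates, pupil_min=10.0, pupil_max=100.0):
    """Keep only ellipse candidates whose diameter (in pixels of the
    eye image) lies inside the [pupil_min, pupil_max] range; everything
    else is discarded as an outlier."""
    return [c for c in candidates
            if pupil_min <= c["diameter"] <= pupil_max]
```

So widening the range never improves confidence by itself; it only prevents unusually small or large pupils from being thrown away.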
@papr thank you for your help!
Does flipping the camera images do anything to the data collected or to the calibration, or is this only for visualization? Thanks.
Flipping eye images does not affect pupil detection or calibration.
Great. Thanks!
Hi! Can someone explain to me what the 3 coordinates in my offline_screen_marker export stand for? Concerning srf_positions_* m_to_screen column.
I want to know because i want to detect wrong/jumping surfaces
Hi, is there a broadcasted gaze positions topic available without using surface markers? If not, how could I best get the gaze position in real time? Thanks!!
@user-f441d4 Yes, you can. Run this script after you have started Capture: https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_messages.py
Hello, I changed the world camera to the 60° lens and then tried to re-calibrate it using the camera intrinsics estimation plugin as described in the documentation (https://docs.pupil-labs.com/#camera-intrinsics-estimation). I can't manage to capture a pattern when I press 'i' (I also tried 'c'). Nothing seems to happen; the count on the circular button does not change. The world camera seems to be in focus. What am I doing wrong? Update: I got it. In case anyone encounters the same problem: within the interface of the plugin, select your model, then press the circular 'i' button on the left, then show the full pattern on full screen, then press 'i' on the keyboard, close the window with 4 mouse clicks, press the 'i' button on the interface again if it is not blue anymore, show the pattern on fullscreen, press 'i', and so on.
Hello is there documentation to get finger tracking for pupil?
Object tracking for pupil?
hi everyone! I've seen messages from about 1 year ago about problems with Windows drivers... I seem to have the same problem here. I noticed that Windows had automatically installed drivers (which is bad), so I uninstalled them. But still, same problem: "eye0 - [ERROR] video_capture.uvc_backend : Init failed. Capture is started in ghost mode. No images will be supplied."
@user-eb041a in the right UVC Manager menu, are there cameras listed as unknown?
I don't see a list of cameras there... it says "Manager : local usb"; "Local UVC sources"; and "Activate source : select to activate".
(in fact, I cannot see the drivers under libusbk, in the windows driver manager)
OK, this means that the cameras are not properly connected
Usually, the USB cable is not fully connected to the hub/clip of the headset
it seems ok, I think, because if I don't prevent it, windows installs the drivers automatically (which means that it is plugged in)... (by UVC manager, you mean when I start "Capture", right?)
(I really think the drivers aren't there)
in the pupil docs (regarding the drivers), it is said : "Navigate to pupil_labs_camera_drivers_windows_x64 directory". I cannot find that directory, does anyone know where that is?
@user-eb041a in the newer version, Capture will install drivers automatically. Therefore, you need to connect the headset first, let windows do what it does, and then start Capture with administrator rights.
Yes, I meant the uvc manager in Capture.
that's exactly what I did...
(but I think if everything was fine drivers-wise, I should see them under "libusbk", which I don't)
@user-eb041a do you see them somewhere else in the device manager?
Because if the cameras are not listed in Capture, not even as unknown
, then this is a connection issue not a driver issue.
I can select "unknown" in the "activate source" thing (if that's what you mean).
Yes, that is what I meant.
ok, sorry... regarding the drivers: if I let windows install the drivers alone, I can see the 2 cameras appearing under "image device" in the device manager. I've read somewhere that this was not good, so I uninstalled these and relaunched "Capture". Then nothing happens, drivers-wise.
and when I select "unknown" in the "activate source", it says that "the selected camera is already in use or blocked"
@user-eb041a what version of Windows are you running? Do you have other instances of Pupil Capture or Pupil Service running?
windows 7, and nothing else running...
shouldn't I try to install the drivers that are supposedly in the pupil_labs_camera_drivers_windows_x64 directory?
@user-eb041a unfortunately, we do not support windows 7, only Windows 10. We highly recommend to upgrade the operating system.
?? is that said somewhere on the website?
anyways : I happen to have ordered the "mobile bundle". Would there be a way to pair the mobile with my computer (win7)?
It is mentioned as one of the first things in the documentation under Windows dependencies. But you are right, that this should be communicated better. I have created an internal support ticket to improve the situation.
Using Pupil Mobile might work on Windows 7. Connect the headset to your phone, open the Pupil Mobile app and make sure that the cameras are listed. You might need to give permission to access them on your phone.
Select Pupil Mobile instead of Local USB in the Capture menu previously mentioned. You should be able to see your phone and to activate the camera stream via wifi.
ok, it works on the mobile side (I can see the images from the cameras)
but it says "No host found" in the backend manager... (and it says "make sure the time_sync plugin is loaded")
Make sure the phone is connected to the same wifi as your computer
The time sync plugin can be activated in the plugin manager menu in Capture
Hi everyone, I am new at both Pupil and Discord so apologies for being a total newbie. I am using a Pupil-Labs device for psychology experiments run on Matlab with the Psychtoolbox. So far I can start and stop recordings using the example provided in pupil_remote_control.m from the Pupil-Helpers. But I cannot send custom annotations (eg triggers signalling the start of a new trial). I have the feeling that I missed something obvious but after hours I can't see what. Anyone with an example matlab script sending triggers that I could inspire from?
@user-cfdb2c which version of capture do you use?
@papr The version of capture is Version 1.8.26 on Mac OS X
@user-cfdb2c Unfortunately, I am not able to try to reproduce the issue right now. Please create an issue in the pupil-helpers repository.
Thanks @papr I don't think there is an issue with the available code but rather in my ability to use it. Basically, the pupil_remote_control.m shows how to start and stop a recording or start and stop calibration using the ZMQ interface. I was simply curious to know how I could use the same interface to send an annotation to the Pupil device. I can do this manually in Capture. I simply don't know how to do it via Matlab commands.
The existing Matlab code is a bit out-of-date. Additionally, the next release will simplify sending messages to Capture. I will adapt the Matlab scripts accordingly for the next release.
Great news! May I ask when the next release is planned for?
There are only some Pupil Mobile tests left to do. Therefore, hopefully this year. But I won't be able to change the Matlab scripts until the start of next year.
I see. In the meantime, if you have some material or ideas about how I could send annotations using Matlab that would be much appreciated! I would be happy to check the data recorded with the Pupil and a Matlab interface. I am collaborating with a team interested in buying more devices but they need to be certain that we can use it for our protocols.
Please try Capture v1.7 and v1.9
I will! Thanks for your help!
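For readers with the same annotation question, here is a minimal Python sketch of the pattern discussed above (a MATLAB version would follow the same ZMQ request/publish steps). The port, topic, and payload keys reflect Pupil Remote's interface around the v1.x releases, but treat them as assumptions to verify against your Capture version; the network calls are left as comments so the payload builder below stays self-contained:

```python
# The send path (requires pyzmq + msgpack-python):
#
#   import zmq, msgpack
#   ctx = zmq.Context()
#   remote = ctx.socket(zmq.REQ)
#   remote.connect("tcp://127.0.0.1:50020")   # Pupil Remote's default port
#   remote.send_string("PUB_PORT")            # ask for the publisher port
#   pub_port = remote.recv_string()
#   pub = ctx.socket(zmq.PUB)
#   pub.connect("tcp://127.0.0.1:%s" % pub_port)
#   remote.send_string("t"); pupil_time = float(remote.recv_string())
#   note = make_annotation("trial_start", pupil_time, trial=1)
#   pub.send_multipart([note["topic"].encode(), msgpack.packb(note)])

def make_annotation(label, timestamp, duration=0.0, **extra):
    """Build an annotation payload in the shape Capture expects.

    timestamp must be in Pupil time (query Pupil Remote with "t"),
    not wall-clock time. Extra keys are exported with the annotation.
    """
    note = {
        "topic": "annotation",
        "label": label,
        "timestamp": timestamp,
        "duration": duration,
    }
    note.update(extra)
    return note

note = make_annotation("trial_start", 1234.5678, trial=1)
```

The same two sockets (REQ to Pupil Remote, PUB for the annotation) are what a MATLAB implementation via a ZMQ binding would need to open.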
Hello, is it possible with the Pupil Mobile Eye Tracking Headset to calculate pupil diameter, which would be used to estimate driver fatigue level?
@user-a4a77a Hi, this is possible offline if you open a Pupil Mobile recording in Player and run the offline pupil detection.
@papr I am working on a driving simulator, so I want to use the eye tracking headset to find changes in pupil diameter (the frequency of pupil oscillation decreases).
So you mean online? In this case you have two options: 1. connect the headset to a computer running Capture and subscribe to the pupil data, or 2. stream the eye videos to a computer running Capture and subscribe to the pupil data there.
I will try , Thanks @papr
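For reference, a sketch in Python of the receiving side of either option: subscribe to the `pupil.` topic stream from Capture and read the diameter fields. The key names (`diameter` in image pixels, `diameter_3d` in mm from the 3d eye model) are what Pupil's pupil datum typically exposes, but verify them against your Capture version. The ZMQ subscription itself needs pyzmq and msgpack, so it is shown as comments; `pupil_diameter` is a hypothetical helper, not part of the Pupil API:

```python
# The subscription boilerplate (requires pyzmq + msgpack-python):
#
#   import zmq, msgpack
#   ctx = zmq.Context()
#   remote = ctx.socket(zmq.REQ)
#   remote.connect("tcp://127.0.0.1:50020")   # Pupil Remote's default port
#   remote.send_string("SUB_PORT")
#   sub_port = remote.recv_string()
#   sub = ctx.socket(zmq.SUB)
#   sub.connect("tcp://127.0.0.1:%s" % sub_port)
#   sub.subscribe("pupil.")                   # pupil data from both eyes
#   topic, payload = sub.recv_multipart()
#   datum = msgpack.unpackb(payload, raw=False)

def pupil_diameter(datum):
    """Return (timestamp, diameter) from a received pupil datum.

    Prefers 'diameter_3d' (mm, only present when the 3d detector runs);
    falls back to 'diameter' (2d estimate, in eye-image pixels).
    """
    diameter = datum.get("diameter_3d", datum.get("diameter"))
    return datum["timestamp"], diameter

# Example with a hand-made datum (a real one has many more keys):
fake_datum = {"topic": "pupil.0", "timestamp": 90.25,
              "diameter": 42.0, "diameter_3d": 3.1}
ts, dia = pupil_diameter(fake_datum)
```

For fatigue analysis you would collect `(timestamp, diameter)` pairs over time and look at the oscillation frequency offline.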
Hello I am getting the 'GLFW window failed to create.' exception. How can I amend this?
@user-c6ccfa what OS and OS version are you using?
@wrp I am using an Ubuntu 16.04.3 subsystem
Can you update graphics drivers?
@wrp unfortunately not. My Windows 10 says my driver is already updated. Does this mean I will not be able to use Pupil on my computer?
Sorry, I just saw that you said subsystem. So you're using Windows 10, but trying to run Pupil on the Windows 10 Ubuntu subsystem, correct?
Why not just run Pupil app on Windows directly?
Mainly because I prefer using the subsystem for programming and I am not familiar with windows commands.
So there is no solution in this case?
Usually this error (GLFW window failed to create) is due to an OS (usually Windows) not having updated drivers that support OpenGL. Usually updating graphics/OpenGL drivers resolves this issue. Your situation is a bit more complicated because you are using the Linux subsystem in Windows. Perhaps the subsystem does not have access to the same graphics drivers as the Windows host system (this is just a guess).
@user-c6ccfa are you running from source or from the application bundle?
I am running from source
@user-c6ccfa after reading a bit through some forums, it seems like WSL has some issues with OpenGL support (along with limited/no hardware access on the host system?). Have you seen this: https://github.com/Microsoft/WSL/issues/2855
@wrp thank you for your help!
Hi - I want to use Pupil for the HMD Vive, but am limited by other hardware that is not compatible with Unity. I would like to collect raw 2D pupil tracking data and then calibrate it on my own outside of the Pupil environment. Can you provide some information about the raw calibrated 2D data? How should the calibrated data be interpreted? What is its reference frame?
@user-6b1b1f did you have a look at the python example client in the hmd-eyes repository? I am not sure if it is up to date, but it should give you an impression of the workflow.
The cool thing about the hmd calibration is that you define your own reference system by providing the reference/target locations.
Thanks for directing me here. This is what I needed. Thank you
Confusion: for an eyeball with a 12mm radius I get a pupil radius of 0.1059 (supposedly in mm). So is the circle_3d_radius from the pupil datum in mm or not? If not, what are the units? Thanks
@user-cfdb2c Were you able to solve the problem of MATLAB communication?
Good afternoon. Please tell me, I cannot understand why the heatmap image is not saved. The data is saved, but the image is not?
Hi @user-d9bb5a did you define a surface and specify the size of the surface(s)?
yes)
Just to confirm: you are using Pupil Player with the Offline Surface Tracker, you are exporting data, and you do not see any image of the heatmap in the export/00X folder, correct?
yes
ok, what version of Pupil software are you using? What OS/OS version?
pupil_v1.9-7-gdf51687-windows_x64.7z
and Windows 10
Also to confirm: is the .png saved correctly when you export? This is a png of the heatmap.
No, all the files are there except the .png
ok, perhaps you could make a sample recording with surfaces and share it with [email removed] and we can provide you with concrete feedback.
OK
thanks!
I sent
did you look at it? Can I get feedback?
Thanks our team will get back to you with a response
OK)
For the 200Hz binocular eye cameras setup, is a USB 3 port required or will a USB 2 port do?
@user-0883b2 Not completely. Instead of sending annotations to the Pupil, I managed to receive timestamps that I store in Matlab. That way I can sync Matlab and the Pupil recordings, but there is a lag (i.e. the time to request and receive the timestamp from Pupil). I can send you the code if you are interested.
@user-cfdb2c Thank you! It would be interesting to look at your code. I modified and added some lines in the code provided for communication from Matlab to Pupil Labs using v1.9, and I am able to send annotations now. Please let me know if you are interested and I can share the code with you.
Hey. @papr I am trying to use the natural features calibration method. The computer on which I use the eye tracking system is separate from the one which runs our psychophysics task. On the other screen I show 9 red dots, one at a time, as stimuli, whose coordinates we know in our screen space. (The 9-dot pattern shown in this link is what I am trying to use: http://www.okazolab.com/faq-blog/how-to-calibrate-an-eye-tracker-within-eventide) The subject is asked to fixate on those dots and I click on them in the world camera view. How are pupil positions and gaze positions normalized in this case? Is (0, 0) now the coordinate of the top-left stimulus among the ones I show? I am a little confused about mapping the eye tracker's coordinate system onto the screen coordinate system of our psychophysics task. Could you help me out with that? And do you recommend a better calibration method, if not this one? Also, where is the readable calibration file saved for each recording? All the Pupil Capture calibration reports is the angular accuracy and precision. Where can I get the gain and offset values?
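One point worth clarifying about the coordinate question above: Pupil's normalized coordinates refer to the frame in which the gaze was mapped (the world camera image for gaze data), with (0, 0) at the bottom left and (1, 1) at the top right; they are not anchored to the stimulus dots. Relating them to your own screen requires an additional mapping (e.g. the surface tracker, or one you fit yourself). A small sketch, where `norm_to_pixels` is a hypothetical helper converting a norm_pos into top-left-origin pixel coordinates within that frame:

```python
def norm_to_pixels(norm_pos, width, height):
    """Convert Pupil normalized coordinates (origin bottom left,
    (1, 1) top right) to pixel coordinates with origin top left,
    as most image/screen toolkits expect."""
    x, y = norm_pos
    return x * width, (1.0 - y) * height

# A gaze point in the middle of the top edge of a 1280x720 world frame:
px, py = norm_to_pixels((0.5, 1.0), 1280, 720)
```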
Good day. Merry Christmas!!!! But I would like to know the answer, how to solve my problem.
I am using a Pupil eye tracking device with the mouse_control.py code available at https://github.com/pupil-labs/pupil-helpers/tree/master/python, but the result is not satisfactory because the mouse movement did not match the tracking.
This is a really stupid question, but where is the Pupil Capture executable within the win64x Pupil Capture download file?
@user-71d771 unzip the archive on Windows with 7zip. Then in the pupil_capture... directory you will find pupil_capture.exe. Right click and run as administrator to install drivers.
thanks!