Hi all. I have just received Pupil hardware. It is good to see you here in discord! Now I am having installation problem in window environment, mainly due to ceres compilation issues. I raised an issue in github, ( https://github.com/pupil-labs/pupil/issues/822 ) so If you have time, please share your kind advice. Thank you!
Hi Injoon, do you have to make changes to the code? Otherwise I would recommend trying the bundle version first: https://github.com/pupil-labs/pupil/releases/tag/v0.9.14
Hi! Yes, unfortunately I need to change the code. 😦
Hi all .. just a quick question regarding the data which is being saved in the pupil_data
file. Is there any way to extract the eyeball position in the reference frame of the camera? In a dream world, I'd want the 3D position of the eyeball surface when looking straight 🙂
Not sure whether that's possible at all, or whether you only have the eye ball position w.r.t. the eye camera(s)
Hi Injoon, have you checked plugins out? They will allow you to do great many things quickly without directly changing the source.
Hi, we are having some trouble with the pupil tracking. Sometimes it is only processing half of the pupil, especially with big pupils. I attach a picture where only half of the pupil is being processed. We have been playing with the minimum and maximum sizes, camera position, and the ROI without success. Do you have any tips to avoid this problem?
@user-2d6071 this is a very distal angle
Please try moving the camera
@user-2d6071 Please send us an example recording. This looks like a bug. The video would help us to debug the problem.
@user-2d6071 if you haven't already, please see the good and bad eye images section in docs for reference: https://docs.pupil-labs.com/#3-check-pupil-detection
we have tried moving it, but we can try again. Do you think that is the reason for the half detection?
ok, we will record a video 🙂
Simply enable the eye recording option in the recorder menu and hit R
The reason for the half detection seems to be that the secondary ROI (the smaller blueish squares) are not set correctly
@user-2d6071 are you aware that you can slide the eye camera arm: https://docs.pupil-labs.com/#pupil-headset-adjustments
@wrp The pupil is huge in the picture. This is probably a thresholding issue
@user-2d6071 Do you run from source or from bundle?
from bundle
@wrp yes, we moved the arm to center the pupil; we reached that off-axis image while trying to fix the pupil tracking
We will investigate the issue as soon as we have the recording. In the meantime, I would recommend installing the source requirements so that you can apply changes yourself. This way you will not need to wait for the next bundle
@papr we have recorded two videos, one in each of the recording modes (big file, less cpu and small file, more cpu); they are available at this link: https://1drv.ms/f/s!AiyjKz0X3Zomg-RL-WIIhnwTy0g-yw The eye which fails is eye 0. Thank you very much for your help and responsiveness.
@Juan#7767 I will have a look at your recording now
Hi everyone 🙂 I was wondering if someone has a ready C# implementation of a Pupil Remote client for getting real-time data (ZMQ request)?
@user-6419ec in case you find an implementation or build one yourself, please do not hesitate to make a pull request to pupil-helpers. This way other people will have easier access to it.
@papr I'll upload the implementation once it starts working
@Juan#7767 Please set the ROI like this:
The combination of dark eye lashes and a huge pupil makes the algorithm fail. It works if you set the ROI such that it excludes the eye lashes. Make sure to increase the pupil max setting if the pupil is as dilated as in your case.
@papr thank you very much, we will try setting a smaller ROI later, which luckily should work with our experiment, where the pupil position should be stable.
@user-2d6071 Keep in mind that you can always run the offline pupil detection for existing recordings in Pupil Player. This way you can fix old recordings.
@papr it works now, even with this particular subject!
Great!
@wrp sorry for the late response, I'm using the latest Pupil Labs, version 13, and I'm using it on Windows 10. It appears when I'm using Pupil Player. It doesn't interfere with the workings of the software, at least it seems so. It just closes the software when I want to export data, but the data is already exported and saved in the folder.
How can I use OFFLINE CALIBRATION if: 1. The calibration process has not been included in my recording 2. The calibration process is made with the screen marker
@mattiaparentela#1655 you will need to know where the subject was looking for a few locations in the video frame. This can be markers but could also be anything else.
@mpk so if I asked the subject to look at a particular point in the visual field and, because of incorrect calibration, the instrument does not locate that point, can I use that element to do an offline calibration?
That's right but you will need 5 points in the scene
Ok thanks @mpk
Let us know how that goes. It's a new feature and feedback is appreciated!
Anyone available to help with my libuvc import issues?
Is there a way to get a csv file or something similar that lists fixations and their durations? I've loaded the fixation plugin, but would like to have a file with all of that information if possible!
The libuvc and pyuvc master versions don't work; I posted my issue and a fix/workaround.
good morning. I have just started to work with the Pupil Labs eye tracker in the HTC Vive. Can somebody guide me on what software I need for the best accuracy?
@mpk in the offline calibration, must the five points be indicated in the same frame of the recording, or can I establish a series of points over the entire recording?
a series of points. They need to be looked at by the subject.
@mpk Sorry, but I don't understand. I'll try to explain better. If I see that the subject is looking at a single element but the instrument misplaced the pointer, can I use that element to recalibrate the instrument? For example, the subject is driving on a road and sees a STOP sign. From the recording they seem to be watching it, but there is an error in the instrument and the pointer does not fall on the stop sign. Can I use this sign to recalibrate the instrument?
Yes, it is possible.
ok thanks @papr
Create a new section in the offline calibration plugin. Set the calibration type to "natural features". This will allow you to click within the picture to define the location where you think the subject is looking. Set the calibration range such that it includes the frame that you chose to define the point. Repeat for multiple locations where you know that the subject looked, preferably in different parts of the field of view. Afterwards click calibrate.
Great! Thanks again @papr
No problem
Hi everyone, I am trying to receive specific parameters in real time via pupil remote and a zmq subscriber. I found the file filter_messages.py in pupil-helpers/pupil_remote. However, when running the script it always stops here https://github.com/pupil-labs/pupil-helpers/blob/b472415133a1c2c65317c332af0930588ebe8f68/pupil_remote/filter_messages.py#L33 without an exception. Do you have any clue what happened? Or a guess at what I am doing wrong?
@user-6419ec the script waits for incoming messages. If it stops indefinitely at this point, it means that you are not receiving any pupil data. Pupil data is produced and published if at least one eye window is open and its video source is active.
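(For reference, a minimal sketch of the subscription pattern filter_messages.py uses, assuming Pupil Remote is on its default port 50020. Network I/O is kept in `__main__` so the helpers stand alone; names like `open_pupil_subscriber` are illustrative, not part of the Pupil API.)

```python
import zmq


def topic_matches(topic, prefix="pupil."):
    """ZMQ SUBSCRIBE filters are plain prefix matches; same rule here."""
    return topic.startswith(prefix)


def open_pupil_subscriber(host="127.0.0.1", remote_port=50020):
    """Ask Pupil Remote for the SUB port, then subscribe to all pupil datums."""
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://{}:{}".format(host, remote_port))
    req.send_string("SUB_PORT")        # Pupil Remote replies with the IPC sub port
    sub_port = req.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.connect("tcp://{}:{}".format(host, sub_port))
    sub.setsockopt_string(zmq.SUBSCRIBE, "pupil.")
    return sub


if __name__ == "__main__":
    sub = open_pupil_subscriber()
    while True:
        # This recv blocks forever if no eye window is publishing,
        # which is exactly the symptom described above.
        topic, payload = sub.recv_multipart()
        print(topic)
```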
Use the Test Image source to generate zero-confidence test data if you do not have a headset around to test
thanks a lot :) Only the world camera was running; this was my fault
Good to hear that the problem could be resolved 🙂
Hi, I tried to follow the pupil docs to install the Windows dependencies, but an error occurred when installing msgpack_python:
  File "c:\python36\lib\site-packages\pip\compat\__init__.py", line 73, in console_to_str
    return s.decode(sys.stdout.encoding)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 116: invalid start byte
Hi @user-a49a87 - can I ask you a quick question; do you need to run from source?
What features do you require from source?
I didn't manage to make things work with capture
If you don't mind me asking, what are you trying to achieve? Perhaps there is an easy way to do so with a plugin or via the IPC backbone?
I try to get the eyetracking working to get the gaze info into Unity to be able to get eye-tracking in the HTC Vive
@user-a49a87 in this case there is no necessity to build Pupil from source
You do not need to install the source requirements for that. You can simply use the bundle from our release page.
I try Capture, but it doesn't detect the pupils
@user-a49a87 @wrp maybe it would be better to move this discussion to the 🥽 core-xr channel?
Yes, 🥽 core-xr is the appropriate channel for this discussion. @user-a49a87 please migrate.
ok
Can anyone help with getting my time sync master to send commands to time sync follower that's running my capture software?
Is there a way to get a csv file or something similar that lists fixations and their durations? I've loaded the fixation plugin, but would like to have a file with all of that information if possible!
Hi @user-2798d6 if you run the fixation detector in player and hit export (e) you should get such a csv file.
The 2D or 3D fixation detector?
I think it should work for both.
Ok, lately when I've done that, the program has shut down
Should I try re-downloading the program? Or is there another fix?
@mpk @wrp did you guys see my issue with pyuvc and libuvc?
@papr
Hi all. Does anybody know reference work for the 3d search part using Kalman filter in the code? I am looking for some papers and documents explaining algorithm of Filtercircle3 and predictPupilState in EyemodelFitter.cpp.
@wrp @mpk questions migrated from the 🥽 core-xr channel. I posted the calibration error as an issue on the github page for Pupil Capture 0.9.14. They acknowledged the bug and closed it after fixing it. Here is the reference to that issue page: https://github.com/pupil-labs/pupil/issues/825. Does anyone know how to use the version with the fixed code? I see the main download still contains the faulty version.
We can make a new Windows bundle today @user-5874a4
Hello! When I try to export after loading the 2D fixation visualizer, the Player program shuts down before it exports. Is there a fix for this? Or do I need to redownload the program?
This might be a bug. Are you on the most recent version of Player?
I believe so, I'm on 9.13-48
I think the latest is 9.14
Ok, I'll download that and try again. Thank you! Is there a way to get notified of new versions, or should I just keep checking the website?
Thank you!
We announce it in this chat but I think we'll add an in app notification soon
The new version doesn't shut down! Thank you for your help!
I'll make sure to keep a closer eye on the new releases
Glad this was so quickly fixed:-). Have a good weekend!
Hi everyone, it's me again 🙂 I'm trying to set the PUPIL EPOCH to the UNIX EPOCH because I have different measuring instruments. I found https://github.com/pupil-labs/pupil/issues/392 and thought that editing the lines (250-253) in pupil_remote.py with time.time() would fix this. However, because of the method get_timestamp() in world.py, this isn't that easy. Is there any way to implement the UNIX EPOCH? I know that there is also time_sync, but in my use case this isn't the best way. Thanks a lot
I think my response to the issue is still current and should work fine: https://github.com/pupil-labs/pupil/issues/392
basically set the offset to the difference of pupil time and your unix time.
either with pupil remote or a small plugin.
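(A minimal sketch of the Pupil Remote route — assuming the `T <time>` command resets Pupil's clock and Pupil Remote listens on the default port 50020; the helper names are illustrative.)

```python
import time

import zmq


def set_time_command(t):
    """Pupil Remote 'T <time>' command string: resets Pupil's clock to t."""
    return "T {}".format(t)


def sync_pupil_to_unix_epoch(host="127.0.0.1", port=50020):
    """After this call, subsequent Pupil timestamps are approximately Unix time."""
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://{}:{}".format(host, port))
    req.send_string(set_time_command(time.time()))
    return req.recv_string()  # Pupil Remote acknowledges the command


if __name__ == "__main__":
    print(sync_pupil_to_unix_epoch())
```

Note this ignores network round-trip latency; for tighter sync you would measure the request/reply delay and correct for half of it.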
thanks, I will try. Nice weekend to you
to you too. Once you have something to show, or questions, don't hesitate!
Howdy, I can't get Pupil Capture v0.9.14-7 to detect the HTC Vive display in the Pupil Capture - World window. I get the following error when I launch Pupil Capture: [ERROR] video capture.uvc_backend: Init failed. Capture is started in ghost mode. No image will be supplied. Capture is detecting each individual eye camera but I cannot get the virtual environment as seen through the Vive to be a source for World.
@user-13c487 I do not think that this is possible. =/
The display is an output device. Capture looks for an input device. You would need to stream a single picture into Pupil Capture by using the NDSI protocol for example.
The Vive Display is actually two displays, one for each eye. You would need to unify this picture. That is what I meant with the "single picture"
Makes sense about the Vive Display being two displays. I'd be interested in seeing how to get the Game window in Unity Editor to show up in World.
I think the usual work flow is the other way around. Get the gaze data into unity. You have everything there. Getting a single rendered image into the world process does not help you much.
So how do I go about recording the virtual environment with the eye tracking overlay? The PupilGaze plugin has a "Recording" button that only works if a recording is in-progress in Pupil Capture. It creates a video file called "Unity_MainCamera.mp4" but it only shows the skybox and not the virtual environment I built.
Sorry I cannot help you with that. I have no experience with the unity setup.
Maybe some of the guys over at 🥽 core-xr can help you out.
Thanks.
I may just be missing this on the website, but how do these glasses track the eye and measure pupil size? For example, another tracker I was using used infrared and corneal reflection. Is that what is happening here or is it something different?
Pupil captures eye video in IR
@user-2798d6 Corneal reflections are not used by the pupil detection algorithms
Thanks! So it's just the IR?
The algorithm just needs IR video of the eye region, yes
Thanks! Trying to make sure I'm explaining correctly to my advisor
I can send you a link to an old (2014) technical report that provides an overview of the 2d detection steps
One moment
sure! that would be great!
Please see the paper linked via the docs link above
Note - this paper is from 2014 and does not include information about the 3d detection and mapping algorithm
Hi, I can't manage to get the two cams working at the same time
In the terminal, it says: video_capture.uvc_backend: Capture failed to provide frames. Attempting to reinit.
one cam is working, then glitches, then the other works but the first one fails
@user-a49a87 I have a few questions for you. Which Capture version do you use? Which OS do you use? Did it work before? When did you buy the headset? Did you try to disconnect the headset for multiple seconds? Is it always the same camera that fails? 🙂
Pupil 0.9.14-7 on Win10. It was working last week, but I already had this problem with my previous computer. I bought the cams to add to an HTC Vive. No, I didn't try to disconnect the cams. It is always the same: cam 1 is working and cam 0 isn't, and I get some glitches; then cam 0 is working and cam 1 isn't (only one cam is ever working at a time)
if it might help, here is a screenshot of the terminal
Now I tried to disconnect the cams and the problem is still there
Hey @user-a49a87 please send an email to info@pupil-labs.com with the order_id associated with this hardware and we can diagnose further and determine if we need to initiate a repair/replacement.
@user-a49a87 are you connecting the headset to a usb 2 or 3 port?
USB 3
Ok, then please follow @wrp's instructions 🙂
@user-a49a87 and I are already in contact via email - thanks!
Hello - My audio is not merging with the video when I export from Player. The audio file will play, so I know it was recorded, it just isn't being included in the video export.
Also, what is the green circle around the eye that appears in addition to the red circle and dot around the pupil? Is there a way to adjust that or is it an automatic process?
One more issue I'm running into tonight - When I drop a recording into Player it says "There is no gaze data for this video. Aborting." This is happening even with videos that used to have gaze data and would show gaze visualizations with a plugin. I tried re-downloading all of the software, but it's still happening.
@user-2798d6 the green circle is a visualization of your eyeball as estimated by the 3d detector.
@user-2798d6 are you supplying the correct directory to Pupil Player?
I've tried several different three-digit folders from various recordings
@user-2798d6 re audio - what OS and version are you using?
OS Sierra 10.12.6
@user-2798d6 please reset pupil player settings. Delete the directory named pupil_player_settings
in your user directory
And try opening the files again
That worked! I'm curious - what did that do and why did that work?!
Likely you had set pupil player to use offline calibration
You can switch back to using the calibration from the recording
Oh you are so right!
@user-2798d6 I will try to reproduce audio muxing behavior you noted and get back to you a bit later
ok, thank you!
I just wanted to follow up and report that @user-a49a87's issue was a driver issue. To all those on Windows - if you are experiencing issues with Pupil cameras not being accessible in Pupil Capture (and you recently updated Windows), please open Device Manager, go to View > Show Hidden Devices
and delete all drivers for Pupil Cam - especially those listed under the Imaging Devices
group. This resolved @user-a49a87's issue.
Hey everyone, I was wondering if Pupil makes an option/init file after running for the first time that is saved somewhere other than the folders: pupil_capture_v, pupil_player_v or pupil_service_v.
@wrp I got the same problem as before, but I manage to solve the problem again by running as administrator
Is it a known issue?
@user-72b0ef _v
? It should be *_user_settings
@user-a49a87 this is not common behavior
I will try to recreate this - but not sure that I will be able to do so as it may be related to your system setup.
@papr @user-72b0ef you cannot just transfer the settings file. They are not meant to be portable. It may work between Service and Capture, but that's really a hack solution.
however you can move the eye settings files. Those should be portable between Service <-> Capture.
I have 2 users on my laptop. To avoid any problems, I used the administrator account with you. But I usually work on another user account, and when I tried again after you "left" with my non-admin user, the bug was still there. But running as administrator makes the bug disappear
Is there a way to store a config file somewhere to set the recording folder and other setup in a text file?
@mpk @papr I'm not really trying to transfer the file from Capture to Service or anything like that. I'm trying to find the settings file so that I can transfer it from a laptop to a PC. Because in between tests Pupil went goof on me (I didn't change any settings in Pupil): one eye only started returning the value infinity and the other "0.2". I got Pupil on another laptop and it worked fine there; the correct info is being sent again (coordinates where the eye is looking), which shows that it's not a hardware issue or an issue with my project itself. But after deleting Pupil from the PC and unpacking it again from the .rar file, it has still kept its settings from before (I already tried re-installing the cam drivers as well)
Just delete the settings folder to test if the issue persists
Hi, is there a way to store the path of my recordings? When I play back the captured pupil videos to see if the algos are correctly finding the pupil + gaze, I have to type the path of the recording again (and it's quite frustrating)
@user-a49a87 you can start the recording via a notification that includes the session name (which can be a path). You send notifications via pupil remote or within a plugin
If you have experiment software, it can trigger the start of the recording as well as say where to store it, using this notification
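(A sketch of such a notification, assuming the usual Pupil IPC convention: a two-frame message of topic `notify.<subject>` plus a msgpack-encoded dict, sent through an already-connected Pupil Remote REQ socket. Helper names are illustrative.)

```python
import msgpack
import zmq


def recording_notification(session_name):
    """Notification asking Capture to start recording; session_name may be a path."""
    return {"subject": "recording.should_start", "session_name": session_name}


def send_notification(pupil_remote, note):
    """Forward a notification dict via an existing Pupil Remote REQ socket.

    Notifications travel as two frames: 'notify.<subject>' then the msgpack body.
    """
    topic = "notify." + note["subject"]
    pupil_remote.send_string(topic, flags=zmq.SNDMORE)
    pupil_remote.send(msgpack.dumps(note))
    return pupil_remote.recv_string()  # Pupil Remote acknowledges
```

Usage would be `send_notification(req_socket, recording_notification("subject01/block1"))`, so the experiment script controls both the start time and the storage location.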
I don't understand your answer, so I guess that my question wasn't clear ^^ I just got the cams working today and I'm trying to see if the algorithm can track my pupils. I launch pupil_capture.exe and record my two eyes to file, to be able to replay the videos as sources for the algo. The question was: when I switch one of the eyes in pupil_capture.exe from Local USB to Video File Source, I have to type the path of the recordings; is there a way to save this path?
Hi, I don't manage to make the pupil detection to work. Here is a print screen of the "mode algorithm" view
Ah I understand now. After selecting Video File Source as manager you should be able to drag and drop the videos on top of the eye windows.
great :D Thx
Please move the ROI such that it does not include the eye lashes
and increase the maximum pupil size
these are a pair of huge pupils 🙂
the green squares seem too small, but I didn't find a way to increase their size
the pupils are big because it's inside the HTC Vive and I have no images atm (I'm trying to test the eye tracking first)
As I said, try to change the ROI to not include eye lashes. You can also simply show the Steam VR environment to reduce pupil size
It works better, but it is not stable, as soon as I look at the edge of the "screen", the pupil is lost
Could you post a screenshot of that?
The reason why the algorithm fails is because of the bad contrast
OK, what should I do?
Could you try to use the distance slider of the Vive and move the eye cameras a bit further away?
The IR light needs to be able to spread evenly
If this does not work, try to increase the aperture
here are the values for the aperture
Try to increase exposure time
be aware that very high exposure times can result in problems for the camera. In this case it will turn itself off.
Yes, when I increase the exposure time, it gets lighter and then suddenly darker
try to find a value where it is lighter
e.g. 75 should work
it seems much better
Do you have any other advice?
Not at the moment. Looks very good! It is really bright though. But as long as it works ✨
I can decrease the aperture... You suggested I get it brighter... I did so
Do you think I should decrease a bit?
Yes. But the exact values always depend on your setup. Try different values and look how robust the detection is
The goal is to get a strong contrast between iris and pupil
Ok, thanks so much... Next step, get it to work with Unity
hehe Good luck with that 🙂 Keep an eye on the unity plugin repo. There should be improvements coming soon 🙂
@Djinn#0083 one last thing to improve: turn the lenses a little bit until the focus is on the pupil and not the eye lashes.
I don't see @wrp signed in, but I wanted to check in again about audio issues. I am getting an audio file that works after using Capture, but the audio isn't merging with the video during export in Player. I'm on Mac OS Sierra 10.12.6.
@user-2798d6 I guess you already made sure to use the most recent version of Pupil Player?
I did - I re-downloaded last night, but it's still happening
ok. Then please have a bit of patience until we are able to reproduce the issue.
I had asked earlier about audio, and I am fine to be patient while you all try to recreate the issue! But may I check my understanding? Is the audio supposed to merge with the video file during export or is that still needing to be done manually?
It is supposed to merge during export
In the newest version
Ok, just wanted to make sure I wasn't trying to do something it wasn't supposed to do. I'll be looking forward to your answer and I appreciate your help!
🙂
Hello there
[email removed]
Hello everyone! I am having problems in loading the pupil_data pickle in python: it raises me the error "ValueError: unregistered extension code 28847" in both python 2 and 3. Does any of you know how to solve this issue?
@user-cf2773 please see https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/file_methods.py#L52
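(For reference, a minimal loader along the lines of the linked `file_methods` code — assuming the file is a single msgpack object; the real implementation adds its own hooks on top of this.)

```python
import msgpack


def load_pupil_data(path):
    """Load a pupil_data file: it is msgpack-serialized, not a Python pickle,
    which is why pickle raises 'unregistered extension code' on it."""
    with open(path, "rb") as fh:
        return msgpack.unpack(fh, raw=False)  # raw=False decodes keys/values to str
```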
Oh right, I have to use msgpack. Thanks a lot @wrp!
Welcome @user-cf2773
how do you get two Pupils to time sync? I try changing the name from clock master to clock follower and it just reverts back to master. I also tried changing the bias and that doesn't work either
@user-ed537d Do both instances see each other?
They should list each other
they don't
i'm unsure why they don't @papr how do i need to set it up?
Same wifi, same time sync group, same time sync protocol number. Do you use 0.9.14?
Yeah I believe so I'll check tomorrow morning. Do I need to change the node info?
Also how do I get the network time sync in pupil helpers to be the master?
Do you mean the bias? No, only if you want to control which of the Pupil capture instances is the clock master.
Yeah it wasn't working. What I'm trying to do is sync up the neural timestamps we get to the pupil. I'm having problems even getting the pupil helpers time sync script to even send a time to pupil capture. Any advice/direction?
So wait, I thought you were using two Pupil capture instances? What is your exact setup? If you are using the helpers, do the Pupil Remote ones work?
In case that you want to have a bit more insight on how the time sync works under the hood : https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/time_sync_spec.md
Hi everybody, has someone used Pupil Labs at night, especially in the car / while driving?
@user-d7b89d I don't think so... Pupil dilation may become a problem... But I think you can make it work. We have used Pupil in flight simulators and those are quite dark as well.
sounds good, do you have some recommendations for the settings?
Pupil min and max. also ROI.
thanks, I'll try this
Hi, we're using uncalibrated pupil positions, so we get the circle3d_norm_pos in pupil_data. Works great so far, but is there a possibility to calculate angles from this easily? I know there are theta and phi in the data, but I haven't found any docs about them. And the "center" of the eye should be zero angles, shouldn't it? ... Secondly, can you name the torsional rotation of the Vive cameras, i.e. the rotation around the line of sight? It is somewhere between 20 and 30 degrees, but I would like to know the exact amount. Best wishes, Matthias
Theta and Phi are the polar coordinates of the pupil on the 3d model
IIRC the convention is that the 'looking straight forward' eye position does not correspond to the zero angle. This is mostly to prevent zero crossings in the data.
From tests one of both values ranged from 0-90 degree and the other from 180-360 degrees
ah ok, sounds like a tangent convention 🙂
so phi is the "left/right" and theta the "up/down" angle?
not sure. I am not even sure if it is consistent with both eyes since one camera is rotated. You should be able to infer the correlation using two 2d norm_pos entries
@user-29e10a This is not important if you are only interested in the angle difference between two pupil positions. Simply calculate the Euclidean distance in the theta-phi space to get the angle difference. E.g., the new fixation detector makes use of this.
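(A back-of-the-envelope version of that distance, assuming theta and phi are given in radians; the function name is illustrative.)

```python
import math


def angular_distance(p1, p2):
    """Euclidean distance between two (theta, phi) pupil positions.

    For nearby positions this approximates the angular difference between
    the corresponding gaze directions, which is all a dispersion-based
    fixation detector needs — the absolute zero point of theta/phi cancels out.
    """
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])
```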
right, the absolute value is not important for us, but a true horizontal motion of the pupil should only be seen in the phi angle. If I calculate the Euclidean distance between the two directions, in my opinion I get an equipotential line around the center, and therefore I'm not sure "how" the pupil has moved, only how far. So it would be nice to rotate the coordinate system as the cameras are rotated... does that make sense? 🙂
another question ... we have a Microsoft LifeCam 3000 HD and we would like to use it as a world cam. It is definitely UVC compatible and works with libusbK drivers, but Pupil Capture tells me the device is blocked or not available. Is there a way to make this run? We are working under Windows 10, latest version.
I think the polar coordinate system is actually fixed to the camera coordinate system
yes, there are, but it would be no problem to do some coordinate system rotation. But what's the exact angle? And is the camera rotated in the theta direction, too?
I think the driver has some kind of internal list of USB ids that it assumes to be compatible. Currently this list only includes Pupil Labs cameras. @wrp should be able to answer this in more detail
@user-29e10a the rotation/translation of each camera is estimated during calibration. You can extract this data if you modify the source.
the eye pos and gaze normal for each eye (in world coordinates) is also available in the gaze datum.
@papr is correct re the windows driver. It has a list of PID and VIDs for Pupil Labs hardware (cameras)
@user-29e10a you would need to install libusbk drivers for this camera manually
ok, then I will search for some calibration matrices to "post-calibrate" my data
You can use a tool like Zadig to install libusbK drivers for your device
@wrp can I modify the list? the driver for libusbk is already installed and working in other applications
yep
Hi, I'm currently attempting to replace the test image capture with footage I captured using OBS for a project in Unity, and I'm finding it difficult to sync up the recordings in Pupil Player. Currently I'm trying to match the capture fps to that of the eye cameras. I'm wondering if there is a simpler, more elegant solution to what I am trying to do.
Hi, I am wondering if it is possible to record from more than one eye tracker on the same laptop, while having both trackers plugged into the laptop's USB drive? Thanks!
I mean USB ports. Thanks
Hi Everyone, anyone here officially from Pupil? Been trying to get in touch with the company for a couple weeks now, and haven't gotten any replies on info@pupil-labs.com
Hi @user-7fea1c apologies if we have not replied. We usually are quite quick with responses. I will follow up via PM/Email
please follow up via email. Lets set up a call if possible
@user-7323ec this is possible if your computer has more than one USB bus. You could also try reducing the resolution of the world and eye cameras. Please check out the docs for more on this topic here: https://docs.pupil-labs.com/#usb-bandwidth-and-synchronization
what's the quickest way to get my pupil-labs vive insert working in unity?
any sample unity projects?
@cmanders#0602 I would recommend asking this in hmd-eyes; the devs can point you to a Unity3D example.
Hello! Is there a way to get data about saccade lengths throughout a recording? I'm able to get a csv file of fixation durations, but would like to get saccade lengths as well if possible. Thank you!
We are working on such a feature
Oh great! Thank you!
re: the saccade feature - will the data also tell the direction of the saccades in addition to the length?
Gaze positions that belong to a saccade are marked as such. You have all information available
Hey! Thinking about picking up one of the HMD add-ons to try out. How reliable is the calibration? Does it typically work every time? Having some issues with tracking calibration with FOVE HMD I've been using
Calibration is stable if Pupil detection is good and the subject is good at focusing markers :)
@user-893cb5 to elaborate on @papr: I have not tried FOVE. With Pupil you have full access to the pipeline. If calibration should fail you can check and find out why. Generally all that's needed is good pupil detection, and you can see that in the eye video stream debug monitor.
We are actively developing methods and tools. The unity integration is getting a refactor right now and we will add a new eye tracking algorithm that extends our current 3d model.
hi there, is there any possibility to get something like fixation_on_srf, like gaze_on_srf, in real time via pupil remote (zmq)? I thought maybe of considering the position, but I think that the position from subscribing to "fixations" (pupil remote) is not relative to the surface.
Short question: Is there a time frame for when we will be able to record audio in Pupil Capture on Windows? 🙂
@papr hi, why does the 3d fixation detection plugin only give fixations for individual eyes and nothing for the cyclopean eye? Is there a way to get cyclopean fixations? Thanks a lot! 🙂
In the upcoming version it will use cyclopean fixations if no 3d data is available. It then uses the camera model to distort the 2d norm gaze positions. This assumes a valid calibration
yeah my problem right now is I need cyclopean fixations and the fixations on left and right eye are not synchronized, so I haven't figured out how to use this data
You could adapt the code such that it calculates fixations for both eyes separately instead of the one with more data.
oh interesting
do you mind showing me how/where I can do this? I haven't played around with the python code yet
In the fixation detector file in the shared modules folder.
The bottom class should be the online detector
oh, it's interesting that I can't find the shared modules folder on our lab's computer
we just downloaded the three packages (service, capture and player)
Yeah, the bundles are a different story...
On which platform do you work on?
I work on windows 10
Oh mmh. Modifying code on windows is a bit more difficult. If it is not too urgent, I will implement this before the next release.
oh, okay, thanks
when will the next release be?
Else you will have to go through the instructions on how to run the application from source on windows
I'm doing data analysis on our pilot testing batch
My guess is 2-3 weeks.
and the data is somewhat useless for me right now because we can't get cyclopean fixations
So you need it for interaction?
yes, the original plan is to get the hit data from Unity, but that seems not too reliable
so now we switch to fixations and gazes
to get the fixations and then see the 3d gaze points of gazes that lie within that fixation
and will it be possible to get the positions of the cyclopean fixation's gaze points in the next release?
Would it be enough for you to know the gaze positions that belong to a fixation?
Their norm positions are relative to the world camera. That would be your cyclopean eye
the norm positions of the fixations?
or of the gazes?
for both, there's only norm x and norm y though, so we're missing the z dimension
Both. The fixation norm pos is just the mean of all its base gaze positions
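(In code, that relationship is just an average — a sketch, assuming each gaze datum carries a 'norm_pos' (x, y) pair as in the exported data; the function name is illustrative.)

```python
def fixation_norm_pos(base_gaze):
    """Fixation norm pos as the mean of its base gaze norm positions.

    base_gaze: list of gaze datums, each with a 'norm_pos' (x, y) pair
    normalized to the world camera frame.
    """
    xs = [g["norm_pos"][0] for g in base_gaze]
    ys = [g["norm_pos"][1] for g in base_gaze]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```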
I thought I could use "gaze_point_3d_x" (and y, z) as the position of where the person looked at in world camera, no?
we don't have a recording of world camera (because the frame rate is way too low for VR rendering), that's why there will be no world camera picture for reference (and therefore having only norm_pos_x and norm_pos_y is not sufficient)
I understand. I will think about how to implement this best.
thank you very much!
can you please show me the instructions on running the application from source?
I would like to try it out if possible
Good luck :)
thanks! 🙂
I will be afk for now but will occasionally check in case that you have questions :)
thank you so much 🙂
I should go now and have dinner too so you won't need to check for my questions over the next few hours :))