hey, is there a guide or something with advice on optimal pupil detection? I see the docs mention Pupil min/max and intensity range, but there are a bunch of other settings I'm not sure about, and I'm having trouble getting good pupil detection
Hi @user-f9af20, can you share a screenshot of the eye images where you are having trouble with pupil detection? Since there are a lot of different scenarios, we don't have everything listed, but we try to give individual feedback on how to improve pupil detection.
@user-c5fb8b hey, I tried to record Pupil Capture, but the OBS recording got messed up. Here's the recording from within Pupil Capture. I will try to post a video going through the ROI and algorithm camera views
and here is in pupil capture https://streamable.com/obn0wy
hi friends! I have a problem... I'm using the Vive with Pupil, but calibration doesn't work.
What should I do?
@user-bbee68 does your calibration show success or failure?
"@user-bbee68 does your calibration show success or failure?" @user-f9af20 I don't know, there's just no response...
I have a Pupil Core and I'm using an HTC Vive at the same time. Is it not possible to use them simultaneously?
Are the VR add-ons and Core that different?
Or is there some compatibility between them?
ah! I found the problem...
thanks guys!
it was just a technical mistake on my end..
Hi @user-f9af20, I had a look at your screen recordings, here are some notes: 1) your ROIs are too small to contain the full range of eye motion, especially for eye0. If you look to the sides, the pupil will overlap the boundary of the ROI and won't be detected. Have a look at this screenshot, showing a better ROI in red. The ROI should be able to contain the pupil in all possible locations:
2) I noticed the pupil detection algorithm sometimes picks up the iris instead of the pupil. To avoid this, you should make the Pupil max value smaller. In the algorithm view you see two red circles in the middle, which represent Pupil min and Pupil max; adjust these to limit the possible pupil sizes to a reasonable range.
As a general tip: you can make the eye window a bit larger, then you can have the side-menu open while still being able to see most of the image.
3) As it says in the description, the Pupil intensity range should be set with the algorithm view enabled. You should move it as far right as necessary (but not more) to have the pupil shaded in dark blue at all (!!) viewing angles. It might be dark blue when looking straight ahead, but not fully when looking to the sides, so you need to check while looking up/down/left/right. This is easy to do yourself when using Pupil Core, but a bit cumbersome in VR. I saw you might already be using VR Virtual Desktop, which might help. Otherwise a second person might come in handy.
Please test these suggestions. If you continue to have problems, please just send us screenshots of the algorithm view, this is normally enough to investigate pupil detection issues.
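If a second person isn't around, one way to check detection quality across viewing angles is over Capture's network API. This is a minimal sketch, assuming Pupil Capture is running locally with Pupil Remote on its default port 50020; the 0.6 "good confidence" threshold is just an illustrative cutoff, not an official value:

```python
# Sketch: log pupil detection confidence while looking around, via
# Pupil Capture's network API (assumes Pupil Remote on port 50020).

def summarize_confidence(samples):
    """Return (mean confidence, fraction of samples with confidence >= 0.6)."""
    if not samples:
        return 0.0, 0.0
    mean = sum(samples) / len(samples)
    good = sum(1 for c in samples if c >= 0.6) / len(samples)
    return mean, good

def monitor(n_samples=500):
    import zmq      # pip install pyzmq
    import msgpack  # pip install msgpack

    ctx = zmq.Context()
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://127.0.0.1:50020")
    req.send_string("SUB_PORT")              # ask Pupil Remote for the SUB port
    sub_port = req.recv_string()

    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://127.0.0.1:{sub_port}")
    sub.subscribe("pupil.")                  # pupil datums from both eyes

    confidences = []
    while len(confidences) < n_samples:      # look up/down/left/right meanwhile
        topic, payload = sub.recv_multipart()
        datum = msgpack.unpackb(payload)
        confidences.append(datum["confidence"])

    mean, good = summarize_confidence(confidences)
    print(f"mean confidence: {mean:.2f}, fraction >= 0.6: {good:.2f}")
```

Run `monitor()` while slowly looking up/down/left/right; if the mean confidence drops a lot at the extremes, the ROI or intensity range likely needs adjusting.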
thank you for the very detailed response, and sorry you had to sit through that 😅
another question. in the gaze visualizer, what is the difference between raycast hit marker and raw gaze direction hit marker?
Hi @user-f9af20 - the visualizer projects the gaze direction into the scene. The black raycast hit marker shows the actual point where gaze and 3d environment intersect (= hit point). The yellow raw gaze direction hit marker stays true to the gaze direction, only using the raycast hit point for how deep to draw the marker into the scene. The size of the yellow disc is based on the angular error. https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#default-gaze-visualizer
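In other words, both markers use the raycast hit for depth, but the yellow one stays on the raw gaze direction. A rough numerical sketch (not the actual hmd-eyes code, just the geometry it describes):

```python
# Sketch of the two hit markers: both use the raycast hit distance for depth,
# but the yellow marker is placed along the raw gaze direction, which may
# differ slightly from the ray that was cast.
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def marker_positions(origin, raw_gaze_dir, hit_point):
    # Black marker: the actual gaze/geometry intersection point.
    black = hit_point
    # Depth = distance from the camera to the raycast hit.
    depth = math.dist(origin, hit_point)
    # Yellow marker: same depth, but along the raw gaze direction.
    d = normalize(raw_gaze_dir)
    yellow = tuple(o + depth * x for o, x in zip(origin, d))
    return black, yellow

black, yellow = marker_positions(
    origin=(0.0, 0.0, 0.0),
    raw_gaze_dir=(0.1, 0.0, 1.0),   # slightly off from the cast ray
    hit_point=(0.0, 0.0, 2.0),      # ray hit a wall 2 m ahead
)
```

The gap between the two markers is then a direct visual read-out of the angular error at that depth.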
@user-c5fb8b Hello again! I tried the pupil plugin screencast demo, and it fails on calibration (at the step of optimization, after targets have been presented).
Here is the log:
hi, I have a problem. The MP4 eye video output files (eye0.mp4 and eye1.mp4) cannot be opened or played in anything besides VLC media player. I would like them to be displayed in Unity automatically. Is this possible?
@user-c5fb8b Hey, any updates on this? Is mine an isolated issue, or are others experiencing the same issue?
Please let me know what I can do to help diagnose.
Hi all, I am currently trying to set up an experiment that requires recording the gaze data of quadcopter pilots. I am currently doing this in VR using the Vive, but eventually we want to move to the real world. The Vive does not have any support for FPV for real-world drone flying, but the Oculus Rift does, and so do the Epson AR glasses. I was wondering if anyone has any idea whether the discontinued eye tracking model for Oculus can still be acquired, or if the Epson glasses would be appropriate for my purposes (I don't think they would be, since the screen is translucent, which could negatively impact the data). Really, any advice would be appreciated.
@user-ac101f Fancy meeting you here! They have not integrated into the newer Oculus models yet because there simply isn't enough room in there to place the cameras.
(this is Gabe)
out of curiosity, can you send me the links to the FPV Oculus solutions you've been looking at?
hey, I had a question. I'm trying to get the eye cameras set up for an Oculus Go. If I get the separate left and right Core cameras, can I set them up with a custom mold for an Oculus Go? What would I need to do to make that happen?
Hi @user-8779ef, sorry for the late response. We have identified the issue with the frozen eye video previews in Unity and fixed it in Pupil v2.1.
I still haven't been able to reproduce your calibration issue. I have tested the new versions extensively in multiple setups, switching back and forth between old and new versions, but everything works perfectly for me. We haven't heard from anyone else having this issue.
However, we have added a lot of stability fixes in v2.1, especially for the calibration procedure. I would suggest you give it a go with v2.1, there's a chance that your issue has been fixed as well.
@user-ac101f Very interesting project! I've been experimenting with DJI Digital FPV goggles. They have plenty of space for fitting eye cameras, and can be used for VR and real-world FPV. So far, I've managed to put in the cameras from Pupil Core; I assume, however, that an infrared camera placement similar to the VR-AR Kit for Vive would be much better. If I understand correctly, you do own a Pupil VR-AR Kit for HTC Vive? Could you, or somebody from Pupil Labs, share the exact dimensions? I'd be curious to know if it would fit into the DJI FPV goggles as well.
@user-c5fb8b Great! I'll try and get to that later this week. Thanks.
@user-ac101f That's a pretty extreme view of the eye - far off axis. Can you share an eye video?
anyone have any input on my question?
"hey, I had a question. I'm trying to get the eye cameras set up for an Oculus Go. If I get the separate left and right Core cameras, can I set them up with a custom mold for an Oculus Go? What would I need to do to make that happen?"
Only 4 neodymium magnets 😄
maybe 8, to keep the cameras from pivoting
this way, you get a really good fixation, and your cameras remain removable. Plus, there won't be any wear (compared to any other kind of mechanical fixation)
I don't have any better pictures than this, but inside the red circle there are magnets glued to the "arm" of the camera, and inside the glasses there are some other magnets 🙂
@user-1af2b3
okay, so the two cameras can work together somehow and give accurate data when inside the Oculus Go?
I've not purchased the left and right cameras separately before, do they come with a connector or something? I have the regular version, which is already connected together and has a USB cable
You still need the connection cables and USB adapters
I don't really know :/
do I buy those on their own?
or do they come with it??
Unfortunately I can't tell you; I've worked with multiple Pupil Labs setups, but they were all for the HoloLens
But I'm sure you'll be able to find a solution. Pupil Labs is a great, versatile solution. You may ask their sales/support team 🙂
ok ty for the help 🙂
Maybe try to come to them with some schematics or something, so they will understand the need better. For example, you may need at least a minimum length for your cables; you should try to measure or estimate that need, because their cables are really, really thin, and reworking them is absolutely hard
"I've not purchased the left and right cameras separately before, do they come with a connector or something? I have the regular version, which is already connected together and has a USB cable" @user-1af2b3 With the HoloLens version, you can plug and unplug the cables from each camera. I'm not aware if it's the same for the regular version (are the cables soldered to the cameras?)
@user-c5fb8b Ok, everything works but the screencast.
The screencast IS recording things in my custom scene, but only SOME things.
It's quite strange. From the video, you would think the world was quite sparsely populated with game objects. The terrain is not visible.
The VR wand is not visible.
Any ideas?
@user-8779ef glad to hear that the new release fixed your issues.
(other things are not visible)
it did. Thanks!
almost all of them.
This sounds like an issue with the camera or layer setup? I'm not a Unity expert, but I don't think this can be caused by Pupil or HMD-Eyes. HMD-Eyes just sends the rendered image from the camera GameObject. If stuff is missing, I would guess Unity's rendering isn't picking it up.
Well, the rendered image from the camera GameObject shows objects that are not captured by the screencast, so there is something going on here.
I agree that it may be related to the use of layers, but I'm not doing anything fancy there.
The question is - which rendered image is HMD-Eyes sending to Pupil Capture?
I'll try another thing or two.
@user-8779ef I just realized our ScreenCastDemo uses a separate camera object that is attached to the VR camera
So I guess you will have to modify this camera to have the same rendering behavior as the VR camera
But you should have seen the same behavior with the old versions of Pupil and HMD-Eyes. Or did it work back then for you?
Can't remember.
Ok. Anyway, my guess is that you will have to modify the screencast camera prefab to include the GameObjects you are missing... potentially through layers
Ok, success.
No, the issue was that the screencast had to be a child of the VR camera.
I did not realize this until you had me look back at the demo scene. My mistake.
I was also misinterpreting the symptoms.
It was rendering what it saw, but the orientation was off.
(significantly)
Thanks for your help!
I'm trying to use the FramePublisherDemo, but it shows only the first frame. Is this a known problem, and are there any solutions?
@user-8779ef glad to hear you were able to resolve the problem! 🙂
@user-27dfe4 are you using Pupil v2.0? Then updating to Pupil v2.1 should fix the problem.
@user-c5fb8b yes, I used v2.0, so updated and it works!! Thanks so much 😆
@user-b4144f hey, sorry to follow up so late, but with the HoloLens, do you connect the cameras to the headset or to a PC? Is there a way to use these cameras with a standalone headset like an Oculus Go, or would the Pupil cameras need a PC?
Hi @user-1af2b3, I think it's best if you send an email to info@pupil-labs.com describing what you want to do in detail and the specific questions you have. I'll make sure the people from our hardware team with experience in building the VR addons will respond to your questions!
alright thanks bud
@user-c5fb8b I would like to set the recording folder programmatically, upon startup, from a script attached to a GameObject.
...having issues, because I can't import the PupilLabs namespace.
Any suggestions?
I'm guessing that this has to do with the compilation order, in that /Plugins are compiled first.
...but I would have thought that gives scripts outside of Plugins access to the plugin's namespaces!
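Until the namespace issue is sorted out, one workaround that sidesteps Unity entirely is to start a named recording over Pupil Remote. This is a sketch, assuming Pupil Capture is running locally with Pupil Remote on its default port 50020; as far as I know, the session name sets the subfolder inside Capture's configured recordings directory rather than an arbitrary path:

```python
# Sketch: start/stop a named recording via Pupil Remote (default port 50020).
# "R <name>" starts a recording with the given session name; "r" stops it.

def start_cmd(session_name=""):
    """Build the Pupil Remote start-recording command: 'R' or 'R <name>'."""
    return f"R {session_name}" if session_name else "R"

def record(session_name, duration_s=5.0):
    import time
    import zmq  # pip install pyzmq

    ctx = zmq.Context()
    req = ctx.socket(zmq.REQ)
    req.connect("tcp://127.0.0.1:50020")

    req.send_string(start_cmd(session_name))  # start recording
    print(req.recv_string())                  # acknowledgement from Capture
    time.sleep(duration_s)
    req.send_string("r")                      # stop recording
    print(req.recv_string())
```

For example, `record("pilot_03")` would write into a `pilot_03` session folder; a small launcher script like this can run alongside the Unity scene at startup.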