Urgent: I need help finishing this Pupil Labs to Unreal plugin as soon as possible ... we need it by May 5. Tudor started it, but he is STRESSED OUT writing his thesis, due end of this month... here is his code: https://github.com/SysOverdrive/UPupilLabsVR. We really, really need it Lisa
In hmd-eyes, where is the Start demo script?
@user-c9c7ba (and everyone, really) sorry for the late reply. If you are still stuck with the raycast, the problem was that you cannot use the 3D gaze position with ViewportPointToRay. You basically gave the answer in the second part of your question: the difference between the gaze position and the norm position is that only the latter gives you a viewport point. Gaze positions are 3D coordinates in camera space.
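For anyone hitting the same wall, the distinction matters because `norm_pos` lives in normalized viewport coordinates (origin bottom-left, range [0, 1]) while `gaze_point_3d` lives in 3D camera space. A minimal sketch of the two representations; the function names and the pinhole projection are my own illustration, not hmd-eyes API, and all numeric values are made up:

```python
def norm_pos_to_pixels(norm_pos, width, height):
    """Map a Pupil norm_pos (origin bottom-left, range [0, 1]) to
    top-left-origin pixel coordinates, the common image convention."""
    x, y = norm_pos
    return (x * width, (1.0 - y) * height)

def project_gaze_point(gaze_point_3d, focal_length_px, width, height):
    """Project a camera-space 3D gaze point onto the image plane of a
    simple pinhole camera, yielding pixel coordinates. This is the kind
    of extra step a 3D gaze position needs before any viewport API can
    consume it."""
    x, y, z = gaze_point_3d
    u = focal_length_px * x / z + width / 2.0
    v = focal_length_px * y / z + height / 2.0
    return (u, v)

print(norm_pos_to_pixels((0.5, 0.5), 1000, 1000))      # image center
print(project_gaze_point((0.0, 0.0, 400.0), 500.0, 1000, 1000))
```

The point of the sketch: a norm position is already a viewport point, whereas a 3D gaze position only becomes one after projecting through camera intrinsics.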
@user-aa0833 were you able to get the blink demo to work?
I'm having problems getting Holographic Emulation working. Is there a way to use Pupil Labs eye tracking without that part?
My main issue is that even though I know my HoloLens and computer are connected and the HoloLens eye tracking add-on is working, for some reason Holographic Emulation does not connect to my HoloLens. I have tried to follow the steps on the main hmd-eyes GitHub page. The only different steps I took were changing the build and player settings to target HoloLens and actually building the project, because before doing this I was getting errors in the Pupil Labs scripts, which went away after the build. Any help will be greatly appreciated
Good afternoon. Does anyone work with the data obtained from the eye tracker? Can anybody help with interpreting it (fixations, etc.)? Help me please...
We got the Pupil Labs Vive eye tracker today, but Pupil Service can't detect the cameras at all. Why is that?
@user-bd2540 do the cameras show up in the device manager? Can you also try running Pupil Capture instead?
It seems that the cameras don't show up in the device manager. I am using Win10. How do I solve this problem? @mpk
Please make sure the USB connection is well attached. The USB-C plug might need a bit of wiggling to be fully in.
Thanks, it works now. But the problem is that when I press 'c' while running the VR 2D demo, it doesn't start the calibration. @mpk
Please follow the demo guide in the hmd-eyes repository on GitHub.
Hey guys, I bought my HTC Vive integration about ...1.5-2 years ago. I'm curious if there have been any updates to the camera placement?
Hi everyone, can anyone please point me to the most recent page for HoloLens and Pupil Labs development? And in what order do the steps go? By that I mean: should I be doing the first step in the "developer docs" link on the hmd-eyes GitHub page and then continue with the next steps on the hmd-eyes GitHub page itself? I separately tried to add the Pupil Labs Unity package, but the file shown on GitHub for adjusting the calibration and port number in the developer docs is not in the Unity package I got yesterday, and it does not connect to Pupil Labs.
I do follow the VR demo guide: "Press 'c' on your keyboard to start a calibration and focus your gaze on the displayed marker as it changes position." When I press 'c', nothing happens. I can run in monocular mode, but it seems that while the Unity exe is running, it doesn't receive keyboard commands in VR mode. How should I solve this problem? @mpk
Has anyone run the VR 2D demo and VR 3D demo successfully?
It shows the following error when running the VR demo
@mpk
@user-bd2540 looks like an issue with steamvr. Is steamvr running and everything’s green?
Yes, thank you very much. Everything is green. Have you run the VR 2D and 3D demos before? @user-29e10a
Yes! Are you able to run other VR software? Otherwise I would suggest doing these steps: https://steamcommunity.com/app/250820/discussions/2/142261352655019562/
Good to hear. My problem is that when I run the release.exe from the VR demo, it seems that it can't get commands from the keyboard, so I can't press 'c' to start the calibration. Can you do the calibration? @user-29e10a
Hey Pupil folks, I'm curious to hear about your group's efforts to put the eye cameras behind the optics in the Vive. I'm strongly considering having a visiting student attempt this over the summer, but I'm sure folks have considered it in the past. Anyone?
...as it is, the off-axis cameras get pretty poor eye images. We need a hot mirror
@user-b91aa6 Yes, I can. Does it have a connection to Pupil Capture? You have to click on the Unity window to focus it; then 'c' should work.
But how could I click the Unity window when the Vive is connected? Since the window is displayed on the Vive's screen, I can't use the mouse to click it. @user-29e10a
You'll see the same window in 2D on your screen as well @user-b91aa6
Thanks, I found the reason: I need to click the VR button on the SteamVR status window. Otherwise, the computer treats the HMD as a third monitor, which is why it couldn't receive keyboard commands. Thank you very much. @user-29e10a
Who from Pupil worked on the HMD integration hardware?
Question: Why are three markers used in VR for calibration in one direction?
Does anyone know what the behavior of `public static void UpdateCalibrationConfidence(string eyeID, float confidence)` is?
Why do we need three markers in the 3D VR mode?
@user-b91aa6 collecting data at different depths. 3D mode is experimental... we recommend 2D mode for now.
Does 3D mode mean the 3D gaze position or the 3D eye position? For example, can we use the 3D eye model to get the 2D gaze position?
@mpk
@user-b91aa6 @mpk great question. The two issues seem a bit wrapped up in one another right now. Mpk?
Does 3D mode mean the 3D gaze position or the 3D eye position? For example, can we use the 3D eye model to get the 2D gaze position? @mpk
Any reason there has been no attempt to integrate the 200 Hz cameras into the Vive?
@apple @mpk 2D mode means two things: 1) The 2D pupil-to-gaze mapping algorithm is used. This algorithm seems to use only the centroid of the detected pupil, and calibration quality is quickly degraded by small shifts of the headset on the face. 2) The data is represented as points on a 2D screen.
@user-b91aa6 I'm not a fan of the conflation here.
My conclusion that the 2D marker scene demo uses the 2D pupil-to-gaze mapping is based on the observation that if you start the 2D marker demo and then look in Pupil Capture's general settings, the mapping algorithm has automatically been switched to 2D.
I do not know why the 3D gaze mapper behaves so poorly.
@mpk @user-b91aa6 I have posted an issue to HMD eyes to explain this long-standing problem, and hopefully to motivate a fix. https://github.com/pupil-labs/hmd-eyes/issues/47
Hey guys, I just noticed the new Vive addon goes as fast as 200 Hz, but the resolution is much lower: 200x200 pixels? Isn't that a bit too low? Isn't 400x400 a better idea, even if just running at 120 Hz? Has anyone tried it and can say how accurate it is?
Also any reason to go 200 Hz and not 240?
@user-c7a20e we don't have a Vive addon that does 200 Hz. Maybe you have some custom HW?
its on the site
@user-c7a20e we have the HoloLens addon that does 200 Hz. The Vive addon does not. Or there is a mistake. Can you send a screenshot?
I got confused since they're on the same page. But I still read that it's planned. If it's planned, I suppose the resolution will be 200x200 as well, right?
May I ask how to compute accuracy in degrees in the Vive HMD? @mpk
@user-b91aa6 we run the calibration and then run the same procedure again, but measure the difference between the gaze and the reference points.
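In 3D terms that difference is just the angle between the measured gaze direction and the direction to the reference point. A minimal sketch of that measurement (pure geometry, not the actual Pupil accuracy code; all vectors here are illustrative):

```python
import math

def angular_error_deg(gaze_dir, ref_dir):
    """Angle in degrees between a gaze direction and the direction to a
    reference point. Both are 3D vectors from the eye; they need not be
    normalized."""
    dot = sum(g * r for g, r in zip(gaze_dir, ref_dir))
    norm_g = math.sqrt(sum(g * g for g in gaze_dir))
    norm_r = math.sqrt(sum(r * r for r in ref_dir))
    # Clamp to [-1, 1] to guard against floating-point drift.
    cos_angle = max(-1.0, min(1.0, dot / (norm_g * norm_r)))
    return math.degrees(math.acos(cos_angle))

# Hypothetical example: gaze slightly off a straight-ahead target.
print(angular_error_deg((0.02, 0.0, 1.0), (0.0, 0.0, 1.0)))
```

Averaging this over many reference points during a validation run gives an accuracy figure in degrees directly, without needing pixel units at all.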
Yes. But how can I convert the accuracy from pixels to degrees?
@mpk
We use the intrinsics of the HMD. I don't remember how we calculated them.
The important thing is the distance from the eye to the screen. Do you know where I can find the intrinsics of the HMD?
Another thing: the 3D model means using the 3D eye position to estimate the 3D gaze position, right? The 2D eye position is not used for the 3D gaze position?
@mpk
@papr
@user-b91aa6 The 3d model is based on a time series of 2d positions.
Each 3d sample has corresponding 2d sample.
Thank you very much for your reply. 3D mode means using the 3D eye position to estimate the 3D gaze position, right? The 2D pupil position is not used for estimating the 3D gaze position?
@papr
Also, in the Vive HMD, how do I convert accuracy from pixels to degrees?
The 3d detection/mapping pipeline uses a series of 2d pupil ellipses to generate a 3d model, see the attached paper for details. After estimating the model each new 2d pupil ellipse can be projected onto the 3d model and generate a 3d pupil position. This 3d pupil position can be mapped to a 3d gaze position after the 3d calibration.
Therefore, yes, every 2d pupil position is used to generate a 3d pupil position in this pipeline.
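To make the projection step concrete, here is a stripped-down sketch of the geometry: back-project the 2D pupil center through a pinhole camera and intersect the resulting ray with the fitted eyeball sphere. The real pipeline works with full pupil ellipses (see the paper), so this is a simplification; all function names and numbers are illustrative only.

```python
import math

def back_project(pixel, focal_length_px, principal_point):
    """Back-project a 2D pupil-center pixel into a unit direction ray
    through the eye-camera origin (simple pinhole model)."""
    x = (pixel[0] - principal_point[0]) / focal_length_px
    y = (pixel[1] - principal_point[1]) / focal_length_px
    n = math.sqrt(x * x + y * y + 1.0)
    return (x / n, y / n, 1.0 / n)

def intersect_sphere(direction, sphere_center, radius):
    """Intersect a ray from the camera origin with the fitted eyeball
    sphere; the nearer hit is taken as the 3D pupil position.
    Returns None if the ray misses the model."""
    # Solve |t*d - c|^2 = r^2 for t, with d a unit vector:
    # t^2 - 2 t (d . c) + |c|^2 - r^2 = 0
    dc = sum(d * c for d, c in zip(direction, sphere_center))
    c2 = sum(c * c for c in sphere_center)
    disc = dc * dc - (c2 - radius * radius)
    if disc < 0:
        return None
    t = dc - math.sqrt(disc)  # nearer of the two intersections
    return tuple(t * d for d in direction)

# Hypothetical eye model: eyeball center 35 mm in front of the camera,
# radius 12 mm; pupil imaged exactly at the principal point.
ray = back_project((500.0, 500.0), 500.0, (500.0, 500.0))
print(intersect_sphere(ray, (0.0, 0.0, 35.0), 12.0))
```

Each new 2D detection thus yields a 3D pupil position once the sphere (the eye model) has been fitted from a series of earlier 2D observations.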
Which pixel accuracy are you referring to?
I understand this part. There are two concepts: 3D eye position and 3D gaze position. What I mean is that the 3D gaze position is estimated from the 3D eye gaze direction, not the 2D eye position. Is this correct?
For example, if I want to measure the accuracy in the Vive: I show one marker in the HMD and calculate the gaze position; the distance between the gaze and the marker is the accuracy, measured in pixels. How do I convert that pixel accuracy to degrees?
@papr
@user-b91aa6 as mpk said, you need the HMD intrinsics to calculate degrees from the pixel distance. I do not know where to get HMD intrinsics from, unfortunately.
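In case it helps, the conversion itself is simple trigonometry once you have either a focal length in pixels or the physical eye-to-screen distance; obtaining those HMD intrinsics is the hard part. A sketch under that assumption (the numbers in the usage line are made up, not real Vive values):

```python
import math

def pixel_error_to_degrees(pixel_distance, focal_length_px):
    """Visual angle in degrees subtended by an on-screen error of
    `pixel_distance` pixels, using a pinhole model in which the focal
    length in pixels plays the role of the eye-to-image-plane distance."""
    return math.degrees(math.atan(pixel_distance / focal_length_px))

def physical_error_to_degrees(error_mm, eye_to_screen_mm):
    """Same conversion when the error and viewing distance are known in
    physical units (e.g. millimeters on the HMD screen)."""
    return math.degrees(math.atan(error_mm / eye_to_screen_mm))

# Hypothetical example: 30 px error with an assumed 1000 px focal length.
print(pixel_error_to_degrees(30.0, 1000.0))
```

This small-angle formulation treats the error as measured around the optical axis; for large eccentricities you would instead compute the angle between the two full 3D rays.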
@user-b91aa6 Yes, a 3d pupil position is required to calculate a 3d gaze position. Internally, a 2d pupil position is calculated from an eye image at first and then converted to a 3d pupil position before being published.
Thank you very much for your kind help. Can we get the 2D gaze position from the 3D pupil position? Can you ask somebody who develops the HMD calibration where to find the intrinsics of the HMD? I do need to implement this. Thanks again.
Yes, you can enable 3d pupil detection and use this data for the 2d calibration/gaze mapping
Thanks. Does Pupil Service already set the mapping method to be the same as the detection method? Can we change it in our code?
@papr
Set detection & mapping to 3d but use the 2d hmd calibration method. It will automatically result in 2d gaze positions based on the 3d pupil positions
Thanks. Then, can we get the 3D gaze positions using 2D pupil position?
I haven't found where to set the 3D mapping method in pupil_service .
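For reference, at the time of writing Pupil Capture/Service could be switched between the 2d and 3d pipelines over the network by sending a `set_detection_mapping_mode` notification through the Pupil Remote interface (default port 50020). A sketch, assuming pyzmq and msgpack are installed and Capture/Service is running locally:

```python
def make_mode_notification(mode):
    """Build the Pupil notification that switches pupil detection and
    gaze mapping between the "2d" and "3d" pipelines."""
    if mode not in ("2d", "3d"):
        raise ValueError("mode must be '2d' or '3d'")
    return {"subject": "set_detection_mapping_mode", "mode": mode}

def send_notification(notification, host="127.0.0.1", port=50020):
    """Send a notification via Pupil Remote as a two-frame message:
    a 'notify.<subject>' topic string followed by the msgpack payload.
    Requires pyzmq/msgpack and a running Capture/Service instance."""
    import zmq
    import msgpack
    ctx = zmq.Context.instance()
    socket = ctx.socket(zmq.REQ)
    socket.connect(f"tcp://{host}:{port}")
    socket.send_string("notify." + notification["subject"], flags=zmq.SNDMORE)
    socket.send(msgpack.packb(notification, use_bin_type=True))
    return socket.recv_string()  # Pupil Remote replies with a confirmation

# Usage (with Capture/Service running):
# send_notification(make_mode_notification("3d"))
```

The exact notification names have changed across Pupil versions, so check the network API docs for the release you are running.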
@papr
No, only the other way around. You get 2d gaze positions from 3d pupil positions
Thank you very much. Question 1: for the 3D HMD calibration, must the reference markers be placed at different depths during calibration?
Question 2: can you help me ask the person who implemented the accuracy test where to find the intrinsics of the Vive HMD?
Unfortunately, I do not have enough insight into the hmd-eyes project to help you with these questions.
Then can I send an email to the person who develops this? I really need to know how it's computed. Or can I find the information in the source code? @papr
Hello, I am trying to implement eye tracking in my HoloLens app, but when I do seemingly the same operations as in the example given, it does the eye tracking but does not compensate for any head movement. Any idea what I could have forgotten?
Hi everyone, is it possible to use Pupil Mobile with HoloLens? Since Unity is controlling the connection, I was thinking that wouldn't be possible.
@user-006924 you can connect the cameras to the Android phone, select Pupil Mobile in Pupil Capture, and stream the data wirelessly to Capture.
the rest of the workflow would be the same!
I have a few questions. I'm trying to follow the process on the docs page. I just recently got the Pupil Mobile app from the Play Store, but it doesn't have any of the capabilities shown in the docs. I made sure my cell phone, the HoloLens, and the computer are all on the same network, but Pupil Capture isn't detecting the host. Can you help me figure out why?
This is a screenshot of when I open pupil mobile
@user-006924 what device and what cable and what version of Android?
It's a Google Pixel 2, Android 8.1.0, and I'm using the USB-C cable that came with the phone to connect to the Pupil Labs hardware
Is USB OTG enabled? Often cables that ship with the devices are not sufficient.
Also we have never tested this specific device in house.
I'm not sure if it's OTG enabled or not. I'll check, thank you. If I use one of the devices you have tested, I would still need an OTG-enabled USB-C cable, right?
Some Android OS versions/mods (like Oxygen OS - on OnePlus devices) have options to enable/disable USB OTG. It must be enabled in order to power and transfer data from connected devices
I don't know about Pixel 2
Thank you. I'll look into that
A proper USB-C to USB-C cable is also a point to check if cameras are not appearing in Pupil Mobile
We have tested USBC-USBC cables from this manufacturer
To make sure I understood everything correctly: I'd need the proper cable on an OTG-enabled device?
I don't get any video feed on my cell phone, so I guess the cables aren't right either
I don't know if the Pixel 2 would work or not, but if you are able to invest in a cable to test, it would be nice to have definitive information about this device's compatibility
I think testing a new cable is easier than getting a new phone. I'll look into it and share what I find.
Hi everyone, I've worked a bit with Pupil eye tracking but am very new to VR. I'm just wondering if anyone has any short videos of the output of the gaze position within the VR environment? I'm assuming the Pupil software will use the VR scene as the world camera and overlay the gaze position on top?
Any information you can give me is greatly appreciated, Thanks
Hello, I am trying to follow the developer instructions for hmd-eyes, but I can't find the parameters described in the instructions, like "Available Scenes" or "Current Scene Index". Is the documentation outdated?
Hi everyone, I'm having difficulty getting a good calibration with the Pupil HoloLens addon. I know the steps are pretty basic, but for some reason the gaze points are way off from where they actually should be. I've attached a link to a zip file which includes 1) a video of the gaze points after calibration, 2) a picture of how the pupils were detected throughout the calibration (I have another video where the pupils are always detected correctly, but I didn't upload it here), and 3) the accuracy results from after the calibration. In the video I'm always looking above the laptop screen, but my gaze point is never there. Does anyone have any suggestions on how I can improve the accuracy? Thanks in advance
I forgot to put the link : https://www.dropbox.com/s/u0gtltos84fm5jn/Pupil_HoloLens.zip?dl=0