Guys, I got a color film developed to use it as an IR pass filter, but the other wavelengths clearly pass through? https://i.imgur.com/xel0q3J.jpg
Guys, why aren't you advertising your HMD add-on as a foveated rendering solution? SMI uses 250 Hz tracker cameras, but Tobii uses 120 Hz with the same 0.5-degree accuracy and 10 ms latency. It doesn't appear much better than Pupil, and the SDK is not open source. https://www.tobiipro.com/siteassets/tobii-pro/product-descriptions/tobii-pro-vr-integration-product-description.pdf/?v=1.5 It appears Pupil should handle the job well.
ok then...
Is there a raytracing example in hmd-eyes?
Ok found it. Ignore that question
Can someone help me properly calibrate Pupil? I think I must change some stuff in the Pupil 2d detection settings.
Where is the communication framerate specified?
Looking through the source code of hmd_calibration.py, lines 239-241:
ref_points_3d_unscaled = np.array([d['ref']['mm_pos'] for d in matched_data])
gaze0_dir = [d['pupil']['circle_3d']['normal'] for d in matched_data if '3d' in d['pupil']['method']]
gaze1_dir = [d['pupil1']['circle_3d']['normal'] for d in matched_data if '3d' in d['pupil']['method']]
Line 241 seems to have a typo: shouldn't it be d['pupil1']['method'] at the end?
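To make the suspected typo concrete, here is a toy reproduction of that filtering logic with invented data (the dict layout mirrors the snippet above; the values and the "2d c++" / "3d c++" method strings are made up for illustration). With the corrected filter, gaze1_dir only keeps entries whose *second* pupil datum came from the 3d detector:

```python
# Toy reproduction of the matched_data filtering from hmd_calibration.py.
# Values are invented; np.array around ref_points is omitted for brevity.
matched_data = [
    {
        "ref": {"mm_pos": [0.0, 0.0, 500.0]},
        "pupil":  {"method": "3d c++", "circle_3d": {"normal": [0.0, 0.0, -1.0]}},
        "pupil1": {"method": "2d c++", "circle_3d": {"normal": [0.1, 0.0, -1.0]}},
    },
]

ref_points_3d_unscaled = [d["ref"]["mm_pos"] for d in matched_data]
gaze0_dir = [d["pupil"]["circle_3d"]["normal"]
             for d in matched_data if "3d" in d["pupil"]["method"]]
# As written upstream, the next comprehension filters on d['pupil']['method'];
# the suspected intent is d['pupil1']['method'], used here:
gaze1_dir = [d["pupil1"]["circle_3d"]["normal"]
             for d in matched_data if "3d" in d["pupil1"]["method"]]

print(len(gaze0_dir), len(gaze1_dir))  # -> 1 0
```

With the original (unfixed) condition, gaze1_dir would contain one entry here even though pupil1 was detected in 2d, which is why the filter choice matters.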
Hello, we are working with the HTC Vive Pupil Labs implementation in Unity. However, we cannot find an example of how to subscribe to PupilTools after calibration in the same scene. We have tried to do it on Start, but then Pupil Labs won't write whether the calibration was successful or not. If we do it after calibration, PupilTools.OnReceiveData doesn't seem to be called. Does someone know how it should be done?
Hi all, I am also trying to use the Pupil plugin for Unity. After I did a calibration and the game loaded my scene, I want to record the data using PupilGazeTracker. When I click 'R', the video records OK, but there is no data in 'UnityGazeExport.csv'. I tried to look into it, and for some reason I think the game doesn't go into UpdateGaze at all (eye tracking confidence is high). Can you think of what might be the problem?
And can you please tell me what format the .Time file is in? Is its scale in any way mappable to the Unity time scale?
Has anyone got any idea where the framerate is being set in the hmd-eyes unity_pupil_vr project?
Hello everyone, I wanted to know if it is possible to calibrate my Pupil Labs add-on using the HoloLens camera as the World Camera?
Hi, will the Pupil Vive add-on work on the new Vive Pro? http://de.engadget.com/2018/04/09/video-review-htc-vive-pro/
@user-29e10a we just tested it and yes, it will!
@mpk perfect! Is there a spare USB connection for the cameras on the HMD, too?
Seems there is a USB-C port, so yes, I believe this will work.
We only tested if the parts fit. (They do.)
thx
here a quick video of the assembly: https://www.youtube.com/watch?v=ZRdWlmxBH30
For the raycasting part done in Unity, which data exactly are you using from the Pupil headset? Can someone be super specific about where the values are being sent in the code?
@user-a23640 sorry for the late reply, but let us now try to work through your problem. I would suggest you start with the calibration scene, which, after you finish the calibration, loads the Empty Scene Unity scene. This is set in the Pupil Manager and can easily be adapted to load your custom scene instead. If you have a look at the Empty Scene gameobjects, you will find the CalibrationDemo script. All it does is call PupilGazeTracker.Instance.StartVisualizingGaze(); once a connection to Pupil is established. That method serves as a good starting point for how to subscribe to the "gaze" topic after a successful calibration.
@user-b4961f if UpdateGaze is not called, did you perhaps not subscribe to the "gaze" topic after the calibration finished? Otherwise, you should be receiving data and UpdateGaze() should be called.
when I drop the recording in the player for the new 1.16 version, it says the folder is not valid/ not readable. Does anyone have this problem?
Question: how can Pupil Labs in VR compensate for slippage? How is this done?
@user-fe23df can you share the contents of your recording dir (e.g. ls the contents of the dir, or a screencap of the dir)?
@user-b91aa6 - short explanation here: https://docs.pupil-labs.com/#pupil-detection
Hi guys, regarding hmd-eyes: the public static Vector2 LeftEyePosition, does anybody know where it is extracted from in the dictionary received from Pupil?
@wrp Thanks for the reply. I don't have access to the tools at the moment; I will send you the screencaps as soon as I'm back.
Another question: how come it's not possible to run the accuracy test with the HMD? When I press 't' the Capture app says this function is not available when using the HMD
I realized I had a few copies of the recordings with me, so here goes:
this is the content of the recording
It looks really different from the recording content recorded with older versions of Pupil and older versions of the HMD plugin?
I think this is why there's the "invalid data" message
@ohooo#9757 the folder you are showing is the one generated by the Unity plugin. There should actually be two new folders for each recording: one containing the recording data from Pupil, the other containing the files visible here, i.e. a video, the GazeExport, and a time reference file.
@Chifor Tudor#8163 please read through the chapter titled "Accessing Data" in the developer docs (https://github.com/pupil-labs/hmd-eyes/blob/master/Developer.md) to get a better understanding of how the gaze information is extracted from the dictionaries received from Pupil.
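As a rough illustration of what that extraction looks like: Pupil gaze datums are dictionaries, and 'norm_pos' and 'confidence' are standard keys in them. The sketch below is a hedged Python example, not the hmd-eyes C# code; the sample dict, the threshold value, and the helper name are invented:

```python
# Hedged sketch: pulling a gaze position out of a Pupil-style gaze datum.
# 'norm_pos' and 'confidence' are standard keys in Pupil gaze messages;
# this particular dict is a made-up example, not captured data.
gaze_datum = {
    "topic": "gaze",
    "norm_pos": [0.48, 0.52],  # normalized [0, 1] coordinates
    "confidence": 0.93,
}

CONFIDENCE_THRESHOLD = 0.6  # assumption: discard low-confidence samples

def extract_gaze(datum, threshold=CONFIDENCE_THRESHOLD):
    """Return (x, y) in normalized coordinates, or None if confidence is low."""
    if datum.get("confidence", 0.0) < threshold:
        return None
    x, y = datum["norm_pos"]
    return x, y

print(extract_gaze(gaze_datum))  # -> (0.48, 0.52)
```

Filtering on confidence before using a sample is the usual pattern; check the Developer.md docs linked above for the exact dictionary layout your plugin version receives.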
Hi, I'm using the HTC Vive, but I can't get the blink demo, or basically any script with 'PupilTools.OnReceiveData += CustomReceiveData;', to work. I added some prints to the BlinkDemo code (see image), but the Debug.Log("Blink detected: rec"); is never reached. What am I doing wrong?
Secondly, I think this function in PupilTools.cs should have "confidence" hard-coded in? Again, I might be wrong.
Hi guys, regarding hmd-eyes: where do you set the hit variable for the raycast?
Ray ray = sceneCamera.ViewportPointToRay (viewportPoint); RaycastHit hit;
Which parts of the gaze message are used to set the Vector3 in viewportPoint?
Hello, I want to set my Ray object from the camera in the direction of gaze... but I am using the 3D calibration method... I wanted to do something like:
viewportPoint = PupilData._3D.GazePosition;
Ray ray = Camera.main.ViewportPointToRay (viewportPoint);
then I tried this:
shootRay.origin = this.transform.position;
shootRay.direction = viewportPoint;
And I don't know what the difference is between gaze_point and norm_point in 3D calibration... Which should I use?
We are working on a project that requires getting monocular blink data. Any hints on how we could do that?
May I ask what the bundle adjustment is doing in the HMD-eyes calibration? What do the 3D reference points mean? Thank you very much.
Hi everyone, do I need to have HoloToolkit while using the eye tracking add-on for HoloLens in my Unity projects? Because when I have both, I keep getting an error about multiple plugins with the same name.
And when I start Pupil Capture, should I load and adjust the cameras manually in the first step? Mine aren't loaded automatically, and whenever I load the eye cameras to check whether they have a good view of the eye, Pupil Capture shuts down.
And do I need the Holographic Remoting Player for steps 8 through 10 of the setup? Because mine doesn't connect with the remote device :(
I can't get the Holographic Remoting app since I don't have Windows 10 Pro; is there any other way for me to use the Pupil Labs HoloLens add-on?
I completed a build with no errors, and while I know the HoloLens I'm using is paired with my computer, I can't connect through holographic emulation. I keep getting the error "Disconnected with failure reason unreachable", and I can't download the Holographic Remoting app since I don't have Windows 10 Pro.
Hello, we are currently using an older version of the Pupil software and the Unity plugin, and are running into a few issues with the project we are developing. Before I go into the issues, I would like to clarify that I understand there is a newer version of both the Pupil software and the plugin, and I will adapt the project to them after we have a stable environment to begin data collection (just to have a working iteration before updating the Pupil software).
My first question is about what is being captured in Pupil Capture: one of the camera images is much darker than the other, and I am wondering if that is affecting the gaze data. My second, a continuation of the first, is regarding flickering of the "circle" representing the pupil in the capture, and methods to fix the issue. My third is regarding the record hotkey not working in Pupil Capture while in Unity, and whether or not I can fix this, as we need to sync our data to the data we are collecting.
May I ask why the bundle adjustment is needed in the HMD-eyes calibration? What do the 3D reference points mean: the points in the world coordinate system of the 3D model? Thank you very much.
Will the HTC Vive binocular add-on work with the Vive Pro? Edit: a search reveals yes, but is it not available for sale yet?
Yes, it will work. The Vive add-on works for both the Vive and the Vive Pro.
It's available and for sale!
What is the 200Hz model? Something newer?
@user-24270f 200hz eye cameras are our newest eye cameras. Currently 200hz eye cameras are available only for Pupil headsets and AR add-ons.
@wrp, Hi, I have a question about 3D calibration accuracy method.
@papr May I ask why the bundle adjustment is needed in the HMD-eyes calibration? How are the 3D reference points computed? Thank you very much.
@user-b91aa6 The bundle adjustment is used for 3d mapping. It estimates the geometrical relation between the eye cameras and the scene camera. This relation is used to project 3d pupil vectors into the 3d scene.
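To sketch what that projection step amounts to: once bundle adjustment has estimated a rotation R and translation t from eye-camera to scene-camera coordinates, points transform with R and t while gaze directions (normals) transform with R alone. The R, t values below are placeholders, not real calibration output:

```python
# Sketch: mapping 3d pupil data from eye-camera space into scene-camera
# space using the rotation R and translation t that bundle adjustment
# estimates. R and t here are invented stand-ins for the real result.

def mat_vec(R, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

# Identity rotation and a small lateral offset (made-up numbers):
R = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
t = [30.0, 0.0, 0.0]  # e.g. eye camera sits ~30 mm to the side

def eye_to_scene(point_eye):
    """Transform a 3d point: rotate, then translate."""
    rotated = mat_vec(R, point_eye)
    return [rotated[i] + t[i] for i in range(3)]

def direction_to_scene(normal_eye):
    """A gaze direction transforms with R only; no translation."""
    return mat_vec(R, normal_eye)

print(eye_to_scene([0.0, 0.0, 1.0]))        # -> [30.0, 0.0, 1.0]
print(direction_to_scene([0.0, 0.0, 1.0]))  # -> [0.0, 0.0, 1.0]
```

The distinction between the two functions is the key point: translating a direction vector would be a bug, which is why pupil normals and 3d gaze points are handled differently.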