Eagerly awaiting the release of the HTC Vive Pro 2 form factor... will there be room for an eye tracker?! Or will space constraints mean that Pupil Labs has to go all-in on Invisible-style black-box limbus tracking?!
For what it's worth, I'm very, very impressed with the tracking from the Invisible.
Hello everyone! I ran into an issue when playing the "BlinkDemoScene" in Unity 2020.3 LTS (custom package HMD-eyes.VR.v.1.4): the Game view camera no longer follows my HTC Vive headset, so I cannot move around to look at other objects, only the duck in the middle! 🤣 However, the camera works fine in Unity 2019.4 LTS. Does anyone know what is happening here? Thanks a lot!
Has anyone managed to fit the Binocular Add-on inside the Valve Index? Does the eye tracking happen independently of the Vive, on the PC? I'm wondering how viable it is as an option for my planned VR-based motion capture suit.
Hi @user-302799. The Binocular Add-on does not fit the Valve Index out of the box. Eye tracking does occur independently of the Vive. The Add-on connects to a desktop or laptop computer, where gaze and pupil data are recorded with our desktop software: https://docs.pupil-labs.com/core/software/pupil-capture/#pupil-capture We also have a project that implements Pupil VR with Unity3D: https://github.com/pupil-labs/hmd-eyes
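If you go the hmd-eyes route, a minimal connection check in Unity looks roughly like this. This is a sketch based on the v1.x API: `RequestController` is the package's connection component and `OnConnected` is the event name from the docs; the rest is plain Unity boilerplate.
```csharp
using PupilLabs;
using UnityEngine;

// Minimal sketch: logs when hmd-eyes connects to Pupil Capture.
// Assumes the hmd-eyes v1.x API (RequestController with an OnConnected event).
public class ConnectionLogger : MonoBehaviour
{
    public RequestController requestCtrl; // assign the hmd-eyes connection object in the Inspector

    void OnEnable()
    {
        requestCtrl.OnConnected += HandleConnected;
    }

    void OnDisable()
    {
        requestCtrl.OnConnected -= HandleConnected;
    }

    void HandleConnected()
    {
        Debug.Log("Connected to Pupil Capture - gaze/pupil data can now be subscribed to.");
    }
}
```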
Thank you for the information. Would the modifications required to make the Binocular Add-on fit the Valve Index be something I could do myself? Or would that risk damaging the headset or the Add-on irreversibly?
One option would be to de-case the add-on, which would ultimately void the warranty and risk damaging the internals
No problem, that's expected, although good to hear it's potentially viable. Is there any mention of a properly supported Index solution anytime soon?
It would be great if you could contact info@pupil-labs.com regarding the Valve + Binocular Add-on.
Question about the ScreenCast camera / ScreenCast script: is it necessary to have the ScreenCast camera as a child of the main VR camera, or is it possible to add the ScreenCast script to the VR camera itself and have it work? I'm asking because rendering two cameras is causing major performance drops. I did add the ScreenCast script to the main VR camera, but Pupil Capture throws an error: "[Warning] network_api.controller.frame_publisher_controller: <class 'video_capture.hmd_streaming.RGBFrame'>s are not compatible with format "FrameFormat.JPEG"." I tried changing the FrameFormat, but it seems to be overwritten somewhere. If someone could let me know whether this is possible, that would be great.
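For reference, this is roughly what I tried, stripped down (in my real project the ScreenCast's connection references are assigned in the Inspector, as in the demo scene):
```csharp
using PupilLabs;
using UnityEngine;

// Simplified repro of my attempt: attaching the ScreenCast script directly to
// the main VR camera instead of keeping the demo's separate child camera.
// This is the setup that produces the FrameFormat.JPEG warning in Capture.
public class MainCameraScreenCast : MonoBehaviour
{
    void Start()
    {
        // NOTE: ScreenCast also needs its connection references assigned;
        // omitted here for brevity.
        Camera.main.gameObject.AddComponent<ScreenCast>();
    }
}
```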
Hi, please guide me. I have a VR headset (Vive Pro) and an eye tracker (Binocular Add-on). I used Unity3D to display my file, and I followed the whole process in https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md.
First, I tested the demo from https://github.com/pupil-labs/hmd-eyes.
My problem: I can see the demo in my VR headset, but I can't see anything in Pupil Capture, and in the movie recorded by Pupil Capture I just see a gray screen.
Please check the attached pictures.
Hi @user-2c8a6d, did you use the screencast feature to stream the VR scene to Pupil Capture? https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#screencast
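For reference, a rough sketch of the screencast setup used in the demo scenes. Creating the child camera at runtime here is just for illustration; in the demos this camera is pre-wired in the scene, and the exact component fields may differ in your hmd-eyes version:
```csharp
using PupilLabs;
using UnityEngine;

// Illustrative sketch: a secondary camera that mirrors the VR view and carries
// the hmd-eyes ScreenCast component, which publishes its frames to Pupil Capture.
// Assumption: this mirrors the demo scene setup; field wiring is simplified.
public class ScreenCastBootstrap : MonoBehaviour
{
    public Camera vrCamera; // your main VR camera

    void Start()
    {
        var go = new GameObject("ScreenCastCamera");
        go.transform.SetParent(vrCamera.transform, false); // follow the headset

        var cam = go.AddComponent<Camera>();
        cam.CopyFrom(vrCamera);
        cam.stereoTargetEye = StereoTargetEyeMask.None; // render a mono view for streaming

        go.AddComponent<ScreenCast>(); // hmd-eyes component; its connection
                                       // references still need to be assigned,
                                       // as in the demo scenes
    }
}
```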
Thanks. I will try it and check, then let you know about the results.
Has anyone here worked with AttentionVisualizer, the application that can stream eye tracking data from Invisible and Core?
Please check my video and guide me: how can I manage and control the points and lines? Before recording I performed a calibration.
It is difficult to tell what is going wrong without the eye videos and pupil detection results. Please share a raw Pupil Capture recording of the calibration with [email removed]. Please also use the screencast feature to include the VR scene as a scene video in Capture.
How can I fix and control the points?
Hello, sorry if my question is a little basic; I'm new to Pupil Labs products. I want to do research with the Pupil Labs add-on for the HTC Vive. My goal is to design some 3D scenes in Unity, record gaze and fixation points, and generate heatmaps to see which parts of the scenes are looked at most. Could you help me with how I should set up this experiment?
I suggest using hmd-eyes to access the eye tracking data. Read more about it here https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md
The gaze demos presented at the end might be of specific interest to you. hmd-eyes does not have a built-in demo for heatmaps, though.
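As a starting point for logging the gaze data you would later bin into a heatmap, here is a subscription sketch based on the GazeController/GazeData API shown in Developer.md (the 0.6 confidence threshold is just an example value):
```csharp
using PupilLabs;
using UnityEngine;

// Sketch: receive 3d gaze data via hmd-eyes and log the gaze direction.
// Based on the GazeController/GazeData API from the hmd-eyes docs.
public class GazeLogger : MonoBehaviour
{
    public GazeController gazeController; // assign in the Inspector

    void OnEnable()
    {
        gazeController.OnReceive3dGaze += ReceiveGaze;
    }

    void OnDisable()
    {
        gazeController.OnReceive3dGaze -= ReceiveGaze;
    }

    void ReceiveGaze(GazeData gazeData)
    {
        if (gazeData.Confidence < 0.6f) // example threshold; tune for your setup
        {
            return;
        }
        // GazeDirection is relative to the camera; you could raycast it into
        // your scene and accumulate the hit positions to build a heatmap.
        Debug.Log($"Gaze direction: {gazeData.GazeDirection}");
    }
}
```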
I mean something like this.