What is the recommended way to get eye tracking data from a user wearing a HoloLens 2? After reading here, it seems the Pupil HoloLens add-on is only for the HoloLens 1.
Our HoloLens 1 add-on won't fit the HoloLens 2 out of the box. Some users have developed custom mounts to fit our equipment onto HoloLens 2. If you're able to do some prototyping, that could be an option, but it's definitely not a turnkey solution!
Can we attach the binocular add-on to the Oculus Quest Pro?
I've responded to your message here: https://discord.com/channels/285728493612957698/446977689690177536/1149933057198014565
Where can I buy the eye tracking module? Does it work with Vive's software?
trying to play vrc with eye tracking
Hi @user-049d11 👋 The Pupil VR Add-on is compatible with HTC Vive, Vive Pro and Vive Cosmos. If you're interested in using our hardware with a different VR headset, please note that it's not compatible right out of the box. However, we can offer a solution by providing you with the necessary cameras and cabling for custom prototyping. In this case, you would need to develop mounts that fit the geometry and constraints of your VR headset.
Software wise, we offer an integration with Unity3D: https://github.com/pupil-labs/hmd-eyes
Regarding purchase, you can acquire the Pupil VR Add-on directly from our website (https://pupil-labs.com/products/vr-ar/), or if you'd like to discuss the details or request a quote, please don't hesitate to shoot us an email at [email removed]
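Under the hood, hmd-eyes talks to Pupil Capture/Service over the Pupil Remote network API (TCP port 50020 by default). If you want to sanity-check that connection outside Unity first, a few lines of Python are enough. This is just a rough sketch assuming default ports and a local Capture/Service instance:
```python
# Quick connectivity check for Pupil Capture/Service outside of Unity.
# Assumes the default Pupil Remote address (tcp://127.0.0.1:50020).
import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

pupil_remote.send_string("t")   # ask for the current Pupil time
print("Pupil time:", pupil_remote.recv_string())

pupil_remote.send_string("R")   # start a recording
print(pupil_remote.recv_string())

pupil_remote.send_string("r")   # stop the recording
print(pupil_remote.recv_string())
```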
Hey Eleonora, with the custom prototyping option, would we be able to request a specific curvature of the cables containing the IR LEDs?
Hello, I'm new here and haven't had the chance to check out the code yet. I want to develop an app that can run on both Android and iOS. Is that possible? I guess it would consume all the RAM.
Hi @user-fd0659! Could you share some more details of your intentions with this app?
Hi, I'm trying to save data from Pupil VR Add-on to a csv file. Does anyone have an example script to do this?
Hi @user-e16e05! Have you recorded the data with Pupil Capture/Service?
I think I solved my problem, thank you! Is it better to use Pupil Service for VR?
It really depends on what you need. Service is essentially a very lightweight version of Capture, e.g. it only streams the eye videos and has a limited UI. In most cases, Capture is fine.
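If you'd rather log the data yourself instead of using Pupil Player's raw data exporter, here's a minimal Python sketch that subscribes to gaze data over the network API and appends it to a CSV. It assumes Capture or Service is running locally with the default Pupil Remote port 50020, and uses the standard gaze datum fields:
```python
# Minimal sketch: subscribe to gaze data from Pupil Capture/Service
# and append it to a CSV file. Stop with Ctrl+C.
import csv
import msgpack
import zmq

ctx = zmq.Context()

# Ask Pupil Remote for the subscription port
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to all gaze topics
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.subscribe("gaze.")

with open("gaze.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "confidence", "norm_pos_x", "norm_pos_y"])
    while True:
        topic, payload = subscriber.recv_multipart()
        gaze = msgpack.loads(payload, raw=False)
        x, y = gaze["norm_pos"]
        writer.writerow([gaze["timestamp"], gaze["confidence"], x, y])
```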
@nmt, I wonder if you could help me address an old issue in which the reference locations presented in Unity are subject to a temporal offset. I would like to correct this temporal misalignment. The participant was a special patient with a particular form of vision loss who drove several hours to provide this data, so it is not as easy as recording it again.
The image shows the issue. If the offset were corrected, the white sphere would line up with the 2D projection (teal disc). I can verify that the issue is not a mismatch in world.intrinsics (which is used when mapping the 3D locations to screen space). It is a clear temporal offset: the reference marker timestamps (represented by the teal disc's presentation time) are shifted forward in time relative to their actual presentation time (represented by the white sphere's presentation time).
I must use those reference locations because they are 3D positions exported by Unity. Performing a standard post-hoc 3D calibration by clicking on the screen to define reference locations will not work, because those are only 2D screen-location references and the resulting gaze export file would be missing critical columns of data. I need to use the HMD3D calibration method, which takes advantage of the 3D reference data exported from Unity and stored in notify.pldata.
Previously, you helped give me the ability to update the items in the notify.pldata file, because I thought the solution was to update the timestamps of the items with topic ['add_ref_data']. I've finally returned to this issue and, unfortunately, that did not fix it: there is no noticeable effect. The core of that code is sketched below in case it's useful.
Do you know which data structures / timestamps I should modify to shift the time at which the reference locations are presented in Pupil Player (i.e., to shift their alignment with the gaze data)?
Thank you!
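For reference, this is roughly the rewrite I tried on notify.pldata. It's a sketch that assumes the file is a stream of msgpack-packed (topic, payload) pairs, with payload itself a msgpack-encoded dict and a parallel notify_timestamps.npy array; the offset value is just a placeholder:
```python
# Shift the reference timestamps stored in notify.pldata.
# Back up the original files before running this.
import msgpack
import numpy as np

OFFSET = -0.150  # placeholder offset in seconds

# Read all (topic, payload) records and shift ref_data timestamps
records = []
with open("notify.pldata", "rb") as fh:
    for topic, payload in msgpack.Unpacker(fh, raw=False, use_list=False):
        datum = msgpack.unpackb(payload, raw=False)
        if "add_ref_data" in str(datum.get("subject", "")):
            for ref in datum.get("ref_data", []):
                ref["timestamp"] += OFFSET
        records.append((topic, datum))

# Write the modified records back in the same format
with open("notify.pldata", "wb") as fh:
    packer = msgpack.Packer(use_bin_type=True)
    for topic, datum in records:
        payload = msgpack.packb(datum, use_bin_type=True)
        fh.write(packer.pack((topic, payload)))

# Shift the parallel timestamp array as well
ts = np.load("notify_timestamps.npy")
np.save("notify_timestamps.npy", ts + OFFSET)
```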
Hi! I'm new to using Pupil Labs as a whole and am working with the binocular add-on for the HTC Vive as part of my project.
I was wondering whether it's still possible to do a project in the latest version of Unity. I know the docs say Unity 2018.3+, but I just wanted to ask about it.
Hi @user-c32d43 👋🏽 ! Welcome to the community! There are some users who have used more recent versions of Unity (e.g., https://discord.com/channels/285728493612957698/285728635267186688/1084898351583068220, https://discord.com/channels/285728493612957698/285728635267186688/840500927441010698); however, we have not systematically evaluated the latest versions. We would recommend opting for the version suggested in the docs: https://github.com/pupil-labs/hmd-eyes
Hi, this may be a generic question, but is there a reason why enabling multithreading in the code that receives data from the Pupil Labs VR add-on would affect the cameras, leading to a drop in camera FPS and larger FPS fluctuations in Pupil Capture when running them side by side? The interesting thing is that this happens only in the release build of my application, not the debug build.
Hey guys, let me expand on my previous question and add that, in addition to modifying notify.pldata, I have also tried manipulating pupil_timestamps.npy. Neither had the expected effect of changing the timing of the reference points in Pupil Player. So... where does Player get that timing data from?
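For completeness, the shift I applied to the timestamp array was essentially just this (the offset value is a placeholder):
```python
# Shift a recording's timestamp array by a constant offset (seconds).
import numpy as np

OFFSET = 0.150  # placeholder offset in seconds

ts = np.load("pupil_timestamps.npy")
np.save("pupil_timestamps.npy", ts + OFFSET)
```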