Hi everybody! Do you guys have a product/solution for CAVEs?
(using stereoscopy)
And if possible with free head movement
Core looks like it would work.
Hi @user-67f0b3, how does stereoscopy work in a CAVE? From my very naive understanding, this would mean presenting different images to the left vs. right eye?
Active 3D glasses, but yes, we use a double-GPU projector.
Volfoni glasses, if I recall correctly.
Can you provide a bit more detail on your desired experimental setup? Do you want to track head movements? Do you want to track gaze with respect to the CAVE image?
Well, I don't have a very specific project in mind yet; I'm considering options.
I already have full body motion tracking
Pupil Core generally supports tracking gaze with respect to a reference image recorded by the "world camera", which sits between both eyes on the forehead.
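For context, gaze data from Pupil Capture includes a normalized position ("norm_pos") in that world image, with its origin at the bottom-left. A minimal Python sketch of mapping it to pixel coordinates, assuming a hypothetical 1280x720 world-camera resolution:

```python
# Map a Pupil Capture "norm_pos" gaze value to world-camera pixel
# coordinates. norm_pos is normalized to 0..1 with its origin at the
# bottom-left of the image; pixel coordinates use a top-left origin,
# so the y axis is flipped. The 1280x720 resolution is an assumption.

def norm_pos_to_pixels(norm_pos, frame_width=1280, frame_height=720):
    x, y = norm_pos
    return int(x * frame_width), int((1.0 - y) * frame_height)

print(norm_pos_to_pixels((0.5, 0.5)))  # center of the world image
```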
So it won't give me any data about 3D depth, I guess.
Since it's open, I guess I can couple my head position with the gaze tracking,
but I won't have any information regarding depth: is the subject looking 1 meter away or 10?
@user-67f0b3 We do report a 3D gaze point, but its magnitude is very unreliable, as depth through vergence is hard to measure.
We have users that have integrated Pupil Core in a CAVE and with motion capture systems.
What our system does is report that data in real time over a network interface; you can integrate that with head-pose tracking to get gaze and head pose in your CAVE. It will require some integration work, but it's certainly doable.
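For anyone attempting this integration, here is a minimal sketch of receiving real-time gaze over Pupil Capture's documented network API (ZeroMQ + msgpack via Pupil Remote). The localhost address and the head-pose fusion step are assumptions to adapt to your setup:

```python
# Minimal sketch: receive real-time gaze from Pupil Capture's network API
# (ZeroMQ + msgpack) for fusion with external head-pose tracking.
# Assumes Pupil Capture is running locally with Pupil Remote on port 50020.
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote for the port of the data publisher.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to all gaze topics.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.unpackb(payload, raw=False)
    # "norm_pos" locates gaze in the world image; "gaze_point_3d"
    # (when present) is the vergence-based 3D point whose depth
    # magnitude is unreliable, as noted above.
    # Combine this datum with your tracked head pose here.
    print(gaze.get("norm_pos"), gaze.get("gaze_point_3d"))
```

Transforming the received gaze ray by the tracked head pose then yields gaze in CAVE coordinates.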
Alright, that seems pretty good
Is the software that you offer the same across all your products?
If I get Core to work in my CAVE, can I also get a binocular add-on for the HTC Vive and use the same software?
(Pupil Capture / Pupil Player)
@user-67f0b3 Yes, Pupil Core and the VR/AR add-ons both use Pupil Capture and Pupil Player. Note also that this software stack is available under open source licensing.
Perfect!
thank you very much, @mpk @user-c5fb8b, have a good day
@user-67f0b3 glad we could help! Take care!
@mpk Depending on the system, real time may mean 100 ms of latency or more. That's certainly too much for anything that I would consider "real time," like gaze-contingent stimulus delivery. I'm not saying that it's not possible to lower the latency sufficiently for real-time use, just that it would take some very careful measurement and verification by each specific user. My own experience is that there's more latency than one would want in a real-time system. I have not yet tried to operate Pupil Labs on a separate machine on the local area network. What's great, though, is that Pupil Labs provides the timestamps needed to reconstruct the proper temporal sequence of things offline. So, don't take this as a complaint. Just a cautionary tale so that an eager scientist doesn't run off and collect data with assurances that there will be no temporal issues. 🙂
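To make that "careful measurement" concrete: one rough check is to compare each gaze datum's capture timestamp against the current Pupil clock at arrival, which Pupil Remote exposes via its documented "t" request. A sketch, assuming Capture runs on the same machine; the REQ round trip itself adds a little time, so this measures an upper bound:

```python
# Rough latency check: compare each gaze datum's capture timestamp
# with the current Pupil clock when the datum arrives over the network.
# Assumes Pupil Capture with Pupil Remote on 127.0.0.1:50020.
import zmq
import msgpack

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

def pupil_time():
    # Pupil Remote answers a "t" request with the current Pupil clock.
    pupil_remote.send_string("t")
    return float(pupil_remote.recv_string())

pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

for _ in range(100):
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.unpackb(payload, raw=False)
    # Delay from eye-image capture to delivery over the network.
    latency = pupil_time() - gaze["timestamp"]
    print(f"latency: {latency * 1000:.1f} ms")
```

Running Capture on a separate machine on the LAN would additionally require synchronizing clocks between the hosts before these timestamps are comparable.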
Hi there
We'd like to use eye tracking with the HoloLens AR headset.
Is there anyone who has used this combination?
Hi. I have an HTC Vive Pro headset. Would you please guide me: can I use Pupil on my headset for eye tracking?
@user-6752ca The Vive Add-on is compatible with the HTC Vive Pro; however, Pupil Core headsets are not compatible due to the physical space constraints of the Vive/Vive Pro HMDs. Please see: https://pupil-labs.com/products/vr-ar/
Hey, how do I get pupil position in Unity with VR?
I tried the demo scene; it doesn't work.
Hi @user-70d572, which versions of hmd-eyes and Unity are you using, and which demo are you trying to run? Could you please elaborate on the type of issue you are having with the demo scene?
Hi. Would you please guide me: can I take pictures or record documents showing which points are being looked at when using binocular Pupil eye tracking?
Is there any academic discount for buying binocular eye tracking?
Please give me more information about binocular eye tracking, such as gaze data output frequency, accuracy, calibration, trackable field of view, data output (eye information), and SDK/engine compatibility.
Hi @user-6752ca, this channel is reserved for questions regarding the Virtual/Augmented Reality solutions of Pupil Labs. For general questions, please refer to the core channel. I also noticed that you received an answer to the same question in the software-dev channel.
@user-6752ca You can find the tech specs of the VR add-on on our website: https://pupil-labs.com/products/vr-ar/tech-specs/
@user-6752ca: In case you are asking specifically about academic discounts on the VR add-ons: we only support a dual pricing model for Pupil eye tracking headsets. VR/AR products are priced relative to the academic discount model we have for Pupil headsets, and we therefore do not provide any additional discounts. Since these are custom-made research tools produced in low quantities, the margins are already too slim.
Thank you so much. 🙂
@fxlange Hello again, Felix! Hope all is well. I wonder if that modification that enables the option to turn off the annotation rendering was ever implemented in hmd-eyes,
or if you guys are considering some other method to facilitate the logging of data via Pupil Labs.
I'm building a new little experiment, and now would be a great time to transition.
Hi there,
I'm going through the hmd-eyes documentation. I can run the compiled executable with successful calibration and eye gaze working. In the Unity scene, I get a message saying I've successfully connected to Pupil, and it returns my current version (1.23.10); however, the status text still says not connected, and when I press C to begin calibration I get an error saying "Calibration not possible: not connected!", which leads back to subsCtrl.IsConnected being false.
Has anybody had a similar issue? I believe it is an issue with the SubscriptionController script.
Hi, can I adapt Pupil Labs to a Vuzix M300XL?