πŸ₯½ core-xr


user-67f0b3 06 April, 2020, 13:05:32

Hi everybody! Do you guys have a product/solution for CAVEs?

user-67f0b3 06 April, 2020, 13:06:05

(using stereoscopy)

user-67f0b3 06 April, 2020, 13:06:22

And if possible with free head movement

user-67f0b3 06 April, 2020, 13:09:17

Core looks like it would

user-c5fb8b 06 April, 2020, 13:13:40

Hi @user-67f0b3, how does stereoscopy work in a CAVE? From my very naive understanding I remember that this would mean presenting different images to the left vs right eye?

user-67f0b3 06 April, 2020, 13:14:55

Active 3D glasses, but yes; it uses a double-GPU projector

user-67f0b3 06 April, 2020, 13:15:56

Volfoni glasses, if I recall correctly

user-c5fb8b 06 April, 2020, 13:18:46

Can you provide a bit more detail on your desired experimental setup? Do you want to track head movements? Do you want to track gaze with respect to the CAVE image?

user-67f0b3 06 April, 2020, 13:19:37

Well, I don't have a very specific project in mind yet; I'm considering options

user-67f0b3 06 April, 2020, 13:19:46

I already have full body motion tracking

user-c5fb8b 06 April, 2020, 13:19:56

Pupil Core generally supports tracking gaze with respect to a reference image recorded by the "world camera", which sits between both eyes on the forehead.

user-67f0b3 06 April, 2020, 13:20:17

So it won't give me any data about 3D depth, I guess

user-67f0b3 06 April, 2020, 13:21:06

Since it's open, I guess I can couple my head position with the gaze tracking

user-67f0b3 06 April, 2020, 13:21:20

but I won't have any information regarding depth: is the subject looking 1 meter away or 10?

mpk 06 April, 2020, 13:22:03

@user-67f0b3 we do report a 3D gaze point, but the magnitude is very unreliable, as depth through vergence is hard to measure.
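
As a sketch of the magnitude mpk mentions: the viewing depth can be read off a gaze datum as the distance to its 3D gaze point. This assumes a dict shaped like the gaze data Pupil Capture publishes, with a `gaze_point_3d` field in millimeters relative to the world camera; the example datum below is hypothetical.

```python
import math

def gaze_depth_mm(gaze_datum):
    """Distance from the world camera to the reported 3d gaze point.

    Assumes `gaze_point_3d` is an (x, y, z) tuple in millimeters,
    as in the gaze data Pupil Capture publishes.
    """
    x, y, z = gaze_datum["gaze_point_3d"]
    return math.sqrt(x * x + y * y + z * z)

# Hypothetical datum: a gaze point roughly 1.5 m in front of the camera.
datum = {"gaze_point_3d": (30.0, -45.0, 1500.0), "confidence": 0.92}
depth = gaze_depth_mm(datum)
```

Given the vergence caveat above, the `confidence` field (and a sanity range on the depth itself) is worth checking before trusting any single sample.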

mpk 06 April, 2020, 13:22:26

We have users who have integrated Pupil Core in a CAVE together with motion capture systems.

mpk 06 April, 2020, 13:23:58

Our system reports that data in real time over a network interface; you can integrate it with head-pose tracking to get gaze and head pose in your CAVE. It will require some integration work, but it's certainly doable.
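
The integration work described here boils down to transforming the head-frame gaze direction into the CAVE's world frame using the head pose from the mocap system. A minimal sketch, assuming the mocap system provides a head-to-world rotation matrix and position, and that the head-frame gaze direction comes from something like the `gaze_normal_3d` field of a gaze datum (the numbers below are illustrative):

```python
def mat_vec(m, v):
    # 3x3 matrix times 3-vector, as plain tuples.
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def gaze_ray_in_world(head_rotation, head_position, gaze_normal):
    """Express a head-frame gaze direction as a ray in CAVE/world coordinates.

    head_rotation: 3x3 head-to-world rotation matrix from the mocap system
    head_position: head position in world coordinates (the ray origin)
    gaze_normal:   unit gaze direction in the head/scene-camera frame
    """
    direction = mat_vec(head_rotation, gaze_normal)
    return head_position, direction

# Illustrative 90-degree yaw: the head's forward axis maps to world +x.
R = ((0.0, 0.0, 1.0),
     (0.0, 1.0, 0.0),
     (-1.0, 0.0, 0.0))
origin, direction = gaze_ray_in_world(R, (1.0, 1.6, 0.0), (0.0, 0.0, 1.0))
```

Intersecting that ray with the CAVE's wall geometry then gives the gaze point on the projected image, independent of the unreliable vergence-based depth.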

user-67f0b3 06 April, 2020, 13:24:35

Alright, that seems pretty good

user-67f0b3 06 April, 2020, 13:27:00

Is the software that you offer the same on all your products?

user-67f0b3 06 April, 2020, 13:27:29

If I get Core to work in my CAVE, can I also get a binocular add-on for the HTC and use the same software?

user-67f0b3 06 April, 2020, 13:28:54

(Pupil Capture / Pupil Player)

user-c5fb8b 06 April, 2020, 13:30:53

@user-67f0b3 Yes, Pupil Core and the VR/AR addons both use Pupil Capture and Pupil Player. Note also that this software stack is available under open source licensing.

user-67f0b3 06 April, 2020, 13:31:55

Perfect!

user-67f0b3 06 April, 2020, 13:34:23

thank you very much, @mpk @user-c5fb8b, have a good day

user-c5fb8b 06 April, 2020, 13:35:54

@user-67f0b3 glad we could help! Take care!

user-8779ef 10 April, 2020, 02:07:28

@mpk depending on the system, real time may mean 100 ms of latency or more. That’s certainly too much for anything that I would consider β€œreal time,” like gaze-contingent stimulus delivery. I’m not saying that it’s not possible to lower the latency sufficiently for real-time use, just that it would take some very careful measurement and verification for each specific user. My own experience is that there’s more latency than one would want in a real-time system. I have not yet tried to operate Pupil Labs on a separate machine on the local area network. What’s great, though, is that Pupil Labs provides the timestamps needed to reconstruct the proper temporal sequence of things offline. So, don’t take this as a complaint. Just a cautionary tale so that an eager scientist doesn’t run off and collect data with assurances that there will be no temporal issues. πŸ™‚
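
The offline reconstruction mentioned above is mostly clock bookkeeping. A hedged sketch, assuming the local and Pupil clocks have been compared once (for example by noting the local time when querying the current Pupil time, and assuming the round trip of that query is short enough to ignore):

```python
def clock_offset(local_time_at_query, pupil_time_reply):
    """Offset between the local clock and the Pupil clock.

    `pupil_time_reply` is the Pupil-clock timestamp returned by the
    query; the query's own round-trip time is neglected here.
    """
    return local_time_at_query - pupil_time_reply

def sample_latency(local_receive_time, datum_timestamp, offset):
    """Seconds between a datum's capture time and its local arrival."""
    return (local_receive_time - offset) - datum_timestamp

# Illustrative numbers: a sample captured at Pupil time 20.52 s,
# received locally 0.12 s after the clock comparison was made.
offset = clock_offset(local_time_at_query=1000.50, pupil_time_reply=20.50)
latency = sample_latency(local_receive_time=1000.62, datum_timestamp=20.52, offset=offset)
```

Measuring this per sample, per machine, is exactly the kind of verification the message above recommends before trusting any "real time" claim.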

user-f89c96 21 April, 2020, 09:35:08

Hi there

user-f89c96 21 April, 2020, 09:36:10

We'd like to use eye tracking with the HoloLens AR headset

user-f89c96 21 April, 2020, 09:37:39

Is there anyone who has used this combination?

user-6752ca 23 April, 2020, 08:00:35

Hi. I have an HTC Vive Pro headset. Would you please guide me: can I use Pupil on my headset for eye tracking?

wrp 24 April, 2020, 08:51:44

@user-6752ca The Vive Add-on is compatible with the HTC Vive Pro; however, Pupil Core headsets are not compatible due to the physical space constraints of the Vive/Vive Pro HMD. Please see: https://pupil-labs.com/products/vr-ar/

user-70d572 24 April, 2020, 11:06:05

Hey, how do I get pupil position in Unity with VR?

user-70d572 24 April, 2020, 11:06:28

I tried the demo scene; it doesn't work.

fxlange 24 April, 2020, 11:21:49

Hi @user-70d572, which versions of hmd-eyes and Unity are you using, and which demo are you trying to run? Could you please elaborate on the type of issue you are having with the demo scene?

user-6752ca 27 April, 2020, 03:40:03

Hi. Would you please guide me: can I take pictures or record documentation of the points where something is being looked at when using binocular Pupil eye tracking?

user-6752ca 27 April, 2020, 05:35:06

Is there any academic discount for buying binocular eye tracking?

user-6752ca 27 April, 2020, 09:04:47

Please give me more information about binocular eye tracking, such as gaze data output frequency, accuracy, calibration, trackable field of view, data output (eye information), and SDK/engine compatibility.

user-c5fb8b 27 April, 2020, 09:14:05

Hi @user-6752ca, this channel is reserved for questions regarding Pupil Labs' Virtual/Augmented Reality solutions. For general questions please refer to the πŸ‘ core channel. I also noticed that you received an answer to the same question in the πŸ’» software-dev channel as well.

papr 27 April, 2020, 09:18:36

@user-6752ca You can find the tech specs of the VR add-on on our website: https://pupil-labs.com/products/vr-ar/tech-specs/

user-c5fb8b 27 April, 2020, 09:22:45

@user-6752ca: in case you are asking specifically about academic discounts on the VR add-ons: we only support a dual pricing model for Pupil eye tracking headsets. VR/AR products are priced relative to the academic discount model we have for Pupil headsets and therefore do not receive any additional discounts. Since these are custom-made research tools produced in low quantities, the margins are already too slim.

user-6752ca 27 April, 2020, 09:58:46

Thank you so much. πŸ‘

user-8779ef 28 April, 2020, 15:58:38

@fxlange Hello again, Felix! Hope all is well. I wonder if that modification that enables the option to turn off the annotation rendering was ever implemented in hmd-eyes,

user-8779ef 28 April, 2020, 15:59:36

or if you guys are considering some other method to facilitate the logging of data via Pupil Labs.

user-8779ef 28 April, 2020, 16:00:07

I'm building a new little experiment, and now would be a great time to transition.

user-020426 30 April, 2020, 01:43:23

Hi there,

I'm going through the hmd-eyes documentation. I can run the completed executable with successful calibration and eye gaze working. When in the Unity scene I get a message saying I've successfully connected to Pupil, and it returns my current version (1.23.10); however, the status text still says not connected, and when I press C to begin calibration I get an error saying "Calibration not possible: not connected!", leading back to subsCtrl.IsConnected being false.

Has anybody had a similar issue? I believe it's an issue with the SubscriptionController script.

user-894e55 30 April, 2020, 19:19:54

Hi, can I adapt Pupil Labs eye tracking to a Vuzix M300 XL?

End of April archive