vr-ar


user-1eb241 01 February, 2022, 04:02:22

Hi, I remember reading somewhere that we can use the Pupil Labs VR add-on cameras just like any other USB webcam on a PC. If we access the camera stream like that, is there a way to get gaze and pupil metrics via the API in a post-hoc manner?

user-1eb241 02 February, 2022, 11:57:50

Hi @papr, can you please comment on this as well?

user-1eb241 02 February, 2022, 11:53:28

Hi, we recently bought the 200Hz Vive add-on cameras, but cap.available_modes doesn't show an option for recording at 200Hz. Can someone help me out with this?

papr 02 February, 2022, 11:54:46

Hi, I am not sure about the low-level interface, but 200Hz is only available when selecting the 192x192 resolution.

user-1eb241 02 February, 2022, 11:55:57

I’m using the pyuvc library… is this what you mean by the low-level interface?

papr 02 February, 2022, 11:59:51

You won't be able to operate the cams at the full frequency with the default drivers. You will need the libusbk drivers to do so.
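To make the advice above concrete: with pyuvc, the camera's modes are typically reported as (width, height, fps) entries, and the 192x192 @ 200Hz mode only appears once the libusbk drivers are in place. A minimal sketch of selecting that mode (the helper name and the example mode list are illustrative, not pyuvc API):

```python
# Hypothetical sketch: pick the 192x192 @ 200 Hz mode from the list
# that pyuvc's cap.available_modes would return. Older pyuvc versions
# report modes as (width, height, fps) tuples; adjust for your version.

def pick_mode(available_modes, width=192, height=192, fps=200):
    """Return the first mode matching the requested resolution and
    frame rate, or None if the camera/driver does not expose it."""
    for mode in available_modes:
        w, h, rate = mode
        if (w, h, rate) == (width, height, fps):
            return mode
    return None

# Illustrative mode list, as a 200 Hz eye camera might report it
# once the libusbk drivers are installed:
modes = [(400, 400, 120), (192, 192, 200), (96, 96, 400)]
print(pick_mode(modes))  # (192, 192, 200)
```

With real hardware you would then assign the chosen tuple to the capture's frame mode (in pyuvc, roughly `cap.frame_mode = (192, 192, 200)`); if `pick_mode` returns None, the driver replacement described below likely has not taken effect.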

user-1eb241 02 February, 2022, 12:01:38

Correct. That’s why I followed the process given here and replaced the drivers : https://github.com/pupil-labs/pyuvc/blob/master/WINDOWS_USER.md

papr 02 February, 2022, 12:00:40

That said, you can convert any video into a Pupil Player compatible recording and potentially run post-hoc detection methods on it.

user-1eb241 02 February, 2022, 12:02:26

Great. Can you please provide a link on how to convert the videos and run post-hoc detection?

papr 02 February, 2022, 13:16:50

https://gist.github.com/papr/bae0910a162edfd99d8ababaf09c643a The passed video will be used as the scene video. To run pupil detection, you will need to run extract_and_save_timestamps_with_offset on your eye videos and name the files eye0. and eye1. respectively. Pass the appropriate offset so that the videos are in sync with the scene video.
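The offset step above can be sketched as follows. The gist's extract_and_save_timestamps_with_offset reads real per-frame timestamps from the video container; this sketch (function name and values chosen for illustration) only shows the offset arithmetic that brings an eye video into sync with the scene video:

```python
def timestamps_with_offset(n_frames, fps, offset):
    """Generate per-frame timestamps at a nominal frame rate, shifted
    by `offset` seconds so the eye video lines up with the scene video.
    (The linked gist extracts real timestamps from the video container;
    this sketch just illustrates the offset arithmetic.)"""
    return [offset + i / fps for i in range(n_frames)]

# Eye video starting 0.5 s after the scene video, recorded at 200 Hz:
ts = timestamps_with_offset(n_frames=5, fps=200, offset=0.5)
print(ts)
```

Pupil Player then expects these timestamps saved alongside the renamed videos (typically as eye0_timestamps.npy and eye1_timestamps.npy in the recording folder).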

user-1eb241 02 February, 2022, 13:33:55

Will try this. Thanks a ton…

nmt 07 February, 2022, 10:26:59

Hi @user-47a821 👋. The Pupil VR Add-on for Vive is compatible with Linux. Please read the developer documentation for further details: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#developer-documentation

user-b74779 09 February, 2022, 01:22:14

Hi, I am hoping to purchase the VR/AR add-on for the HoloLens 2. I saw that the Pupil Labs add-on does not fit on the HoloLens 2. I am currently working on stress detection on this device and I would need pupil size data. Do you think it could fit and work just by changing the frame?

nmt 10 February, 2022, 08:34:20

Hi @user-b74779. Please reach out to info@pupil-labs.com in this regard 🙂

user-b7599b 14 February, 2022, 16:01:04

Hi, sorry if this has already been asked multiple times, but I would like to use the raycast hit marker as an input to press a button, only under the conditions that confidence remains above 80% and that gaze stays fixated on the object for longer than 5 seconds. I can figure out how to use it as an input, but not how to add the confidence condition. I would appreciate any help.

papr 14 February, 2022, 16:03:16

Could you point me to the location where the raycast is constructed? It should be based on a gaze datum that contains the corresponding confidence information.

user-b7599b 14 February, 2022, 16:07:25

I am currently using the default GazeVisualizer script that is used in the GazeDemo

papr 14 February, 2022, 16:10:50

It should already be discarding low-confidence data: https://github.com/pupil-labs/hmd-eyes/blob/master/plugin/Scripts/GazeVisualizer.cs#L149-L152
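The remaining piece of the question, a dwell condition on top of the confidence filter, is language-agnostic logic: start a timer when a sufficiently confident gaze sample lands on the target, reset it on any break, and fire once the dwell time is reached. A Python sketch (class name, thresholds, and update signature are illustrative, not hmd-eyes API):

```python
class DwellButton:
    """Fire an action only after gaze has stayed on a target, with
    confidence above a threshold, for a minimum dwell time.
    (Names and defaults are illustrative, not hmd-eyes API.)"""

    def __init__(self, min_confidence=0.8, dwell_seconds=5.0):
        self.min_confidence = min_confidence
        self.dwell_seconds = dwell_seconds
        self._dwell_start = None  # time when a valid fixation began

    def update(self, on_target, confidence, now):
        """Call once per gaze sample; returns True when the button fires."""
        if not on_target or confidence < self.min_confidence:
            self._dwell_start = None  # any break resets the timer
            return False
        if self._dwell_start is None:
            self._dwell_start = now
        return now - self._dwell_start >= self.dwell_seconds

btn = DwellButton()
fired = [btn.update(True, 0.95, t) for t in (0.0, 2.0, 4.9, 5.0)]
print(fired)  # [False, False, False, True]
```

In Unity the same logic would live in the script's Update loop, with `now` replaced by Time.time and `confidence` taken from the gaze datum the raycast is built from.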

user-b7599b 14 February, 2022, 16:12:08

oh ok perfect, thank you!

user-9def09 21 February, 2022, 15:34:09

Hello all! I hope I'm writing in the right channel. I have an HTC Vive Pro 2 and I'd like to buy an eye tracker. Are there any products that would work for me?

user-9def09 21 February, 2022, 15:36:20

I saw on the Pupil webpage that you have an add-on for the Vive Pro, but is it compatible with the Pro 2 as well?

nmt 21 February, 2022, 15:55:31

Hi @user-9def09 👋. Please contact info@pupil-labs.com in this regard 🙂

user-d1efa8 21 February, 2022, 18:33:32

Does anyone know how to send a "start_plugin" notification to start a new 3DHMDGazer with custom parameters?

papr 21 February, 2022, 19:45:41

You can pass arguments via the args field.

user-d1efa8 21 February, 2022, 19:59:19

like

 {"world_to_camera_matrix", *insert matrix here*}

?

papr 21 February, 2022, 20:08:19

Args needs to be a dictionary. If this is the syntax for a dict in C#, then yes.
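In Python terms, the notification papr describes is just a dict whose args field is itself a dict of keyword arguments forwarded to the plugin's constructor. A sketch of the payload (the plugin class name and the accepted argument are taken from the question above, so verify both against your Pupil version):

```python
# Sketch of a start_plugin notification payload, as it would be
# msgpack-encoded and sent over the Pupil IPC on topic
# "notify.start_plugin". Plugin name and args are assumptions
# based on the question; check them against your Pupil version.
identity_matrix = [
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
]

notification = {
    "subject": "start_plugin",
    "name": "GazerHMD3D",  # assumed plugin class name
    "args": {              # forwarded as keyword arguments
        "world_to_camera_matrix": identity_matrix,
    },
}

print(notification["name"], list(notification["args"]))
```

The C# equivalent in hmd-eyes would build the same nested structure as a Dictionary<string, object> before serializing it; note that args maps names to values (a dict), not a single {"key", value} pair.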

user-d1efa8 21 February, 2022, 20:10:14

I'm pretty sure but I'll double check, thank you

user-f62f53 24 February, 2022, 21:09:38

Hello folks, my lab is having trouble with one camera cutting out during a recording. It's inconsistent which eye and when. Does anyone have suggestions on how to mitigate these glitches?

user-f62f53 24 February, 2022, 21:25:04

My colleague is suggesting that a shorter USB-C cable has helped in the past. We are currently using a cable other than the one packaged with the system. Could this be the cause of all my problems?

nmt 25 February, 2022, 08:42:29

The inconsistent disconnects could be related to the USB cable and/or connections. How long is the USB cable you are using?

user-f62f53 25 February, 2022, 13:42:46

We were using a 10' cable, and when we switched to the stock cable we had fewer disconnects, but they are still happening.

user-eaeecb 28 February, 2022, 10:17:28

Hey there, is there any way I can adapt an existing VR add-on to the Quest 2?

user-eaeecb 28 February, 2022, 10:17:45

Has anyone done so before?

nmt 28 February, 2022, 10:32:52

Would you be able to make a test recording using the stock USB cable, but on a different computer running Pupil Capture?

user-f62f53 28 February, 2022, 17:37:05

We can certainly try that this week. Thank you for the suggestion

End of February archive