vr-ar


user-f90f05 03 July, 2021, 08:10:36

Guys, I keep getting this error whenever I run the GazeDemo scene from the Unity plugin; I don't know what the problem is:

FormatterNotRegisteredException: System.Collections.Generic.Dictionary`2[[System.String, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089],[System.Object, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]] is not registered in this resolver. resolver:StandardResolver
MessagePack.FormatterResolverExtensions.GetFormatterWithVerify[T] (MessagePack.IFormatterResolver resolver) (at Assets/Plugins/Pupil/3rd-party/MessagePack/IFormatterResolver.cs:38)
MessagePack.MessagePackSerializer.Serialize[T] (T obj, MessagePack.IFormatterResolver resolver) (at Assets/Plugins/Pupil/3rd-party/MessagePack/MessagePackSerializer.cs:64)
MessagePack.MessagePackSerializer.Serialize[T] (T obj) (at Assets/Plugins/Pupil/3rd-party/MessagePack/MessagePackSerializer.cs:55)
PupilLabs.RequestController+Request.SendRequestMessage (System.Collections.Generic.Dictionary`2[TKey,TValue] data) (at Assets/Plugins/Pupil/Scripts/Request.cs:110)
PupilLabs.RequestController.Send (System.Collections.Generic.Dictionary`2[TKey,TValue] dictionary) (at Assets/Plugins/Pupil/Scripts/RequestController.cs:185)
PupilLabs.RequestController.StartEyeProcesses () (at Assets/Plugins/Pupil/Scripts/RequestController.cs:206)
PupilLabs.RequestController.Connected () (at Assets/Plugins/Pupil/Scripts/RequestController.cs:157)
PupilLabs.RequestController+<Connect>d__28.MoveNext () (at Assets/Plugins/Pupil/Scripts/RequestController.cs:143)
UnityEngine.SetupCoroutine.InvokeMoveNext (System.Collections.IEnumerator enumerator, System.IntPtr returnValueAddress) (at <10564ed154d647e194bef4aef8878649>:0)
papr 03 July, 2021, 11:19:36

Please see the dependencies section here https://github.com/pupil-labs/hmd-eyes/, especially in regard to the API compatibility level.
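
Per the follow-up below, the cause here was indeed the API compatibility level. For anyone hitting the same FormatterNotRegisteredException: hmd-eyes expects Unity's .NET 4.x API Compatibility Level. A minimal editor sketch that applies the setting follows (equivalent to Edit > Project Settings > Player > Other Settings > Api Compatibility Level; the menu item name is hypothetical):

```csharp
// Editor-only sketch, assuming Unity 2018.4+: set the .NET 4.x API
// compatibility level that hmd-eyes' bundled MessagePack expects.
using UnityEditor;

public static class HmdEyesSetup
{
    [MenuItem("Tools/Set .NET 4.x Api Compatibility")] // hypothetical menu path
    public static void SetApiCompatibilityLevel()
    {
        PlayerSettings.SetApiCompatibilityLevel(
            BuildTargetGroup.Standalone, ApiCompatibilityLevel.NET_4_6);
    }
}
```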

user-f90f05 04 July, 2021, 00:44:22

Thank you, my supervisor helped fix it, and you're totally correct, it was an API issue.

user-fde905 07 July, 2021, 07:47:28

Hi, we are trying to use HoloLens 2 with the Pupil Labs HMD add-on kit. Has anyone designed 3D prints for mounting on HoloLens 2?

user-26fef5 12 July, 2021, 20:27:32

Out of curiosity, doesn't the HoloLens 2 feature built-in eye tracking? Why would you need additional hardware and software that occupies more space? Is the built-in tracking bad?

user-f90f05 11 July, 2021, 10:18:36

Does anyone know how I can use gaze data as an input in a new script in Unity without using the gaze demo, or more specifically the gaze visualizer component?
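
For anyone with the same question: below is a minimal sketch of reading gaze in your own script, based on the GazeController/GazeData API that the GazeVisualizer in hmd-eyes v1.x builds on; treat the exact member names as assumptions and check them against your plugin version:

```csharp
using PupilLabs;
using UnityEngine;

public class MyGazeInput : MonoBehaviour
{
    public GazeController gazeController; // drag in the scene's GazeController

    void OnEnable()  { gazeController.OnReceive3dGaze += ReceiveGaze; }
    void OnDisable() { gazeController.OnReceive3dGaze -= ReceiveGaze; }

    void ReceiveGaze(GazeData gazeData)
    {
        // Ignore low-confidence samples (threshold is an arbitrary example).
        if (gazeData.Confidence < 0.6f) { return; }

        // GazeDirection is in the VR camera's local space; transform it to
        // world space before using it as an input ray.
        Vector3 origin = Camera.main.transform.position;
        Vector3 direction = Camera.main.transform.TransformDirection(gazeData.GazeDirection);

        if (Physics.Raycast(origin, direction, out RaycastHit hit))
        {
            Debug.Log($"Gazing at {hit.collider.name}");
        }
    }
}
```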

user-fde905 12 July, 2021, 20:42:41

The built-in eye tracking is only 30 Hz, the Pupil Labs add-on has 200 Hz eye tracking, and we need that for our experiment.

user-359878 12 July, 2021, 20:44:44

As I understand it, you can only get 200 Hz if you upload your data to the cloud - and we can't do that within human academic research ethics protocols for data access. Without uploading, local use allows ~60 Hz, or 30 Hz eye data, if I've understood the replies here. It's also not really clear what 200 Hz means, especially since the data actually comes in multiples of 4 ms samples - which is super confusing considering the eye camera is supposed to be 200 Hz, yet there's no 5 ms sampling. Still waiting for clarification, fingers crossed; we're due to record in August and we don't know our real sample rate or temporal precision.

user-fde905 12 July, 2021, 21:01:29

Thanks for the info. We actually need pupillometry as well, so we have to use Pupil Labs. Please let me know if you are able to get it working at 60 Hz or higher.

user-73b616 12 July, 2021, 21:03:40

I don't think that the Pupil Invisible constraints apply here.

user-359878 12 July, 2021, 21:09:59

Oh, maybe not - it's the Pupil Invisible system specifically I'm referring to, @user-fde905; I assumed that was also the case for you, as you referred to 200 Hz tracking, which was also the selling point for me.

nmt 12 July, 2021, 21:11:20

Hi @user-fde905 👋 @user-73b616 is correct. The Pupil AR add-on adds Pupil Core-based eye tracking to the HL1 and can sample at 200 Hz. However, the add-on isn't designed for the HL2.

user-fde905 12 July, 2021, 21:12:09

That's great to hear! Thanks

user-fde905 12 July, 2021, 21:14:34

@nmt is it possible to do 200 Hz pupillometry locally on the recorded data?

papr 12 July, 2021, 21:15:00

With Pupil Core software compatible hardware, yes.

user-359878 12 July, 2021, 21:17:52

That means not with Pupil Invisible unless data is uploaded to the cloud, correct? If using PI and not uploading to the cloud, is it 30 Hz gaze data max sampling? Or 66 Hz?

user-fde905 12 July, 2021, 21:15:20

Got it. Thanks

papr 12 July, 2021, 21:19:37

Currently, Pupil Invisible is not usable in a vr-ar context due to its form factor.

user-359878 12 July, 2021, 21:21:37

It fits perfectly under our Varjo headset though! But anyway - can you confirm the sampling limitations are as above? I realise my other questions regarding temporal precision may take some time to answer, but whatever information you can provide will be helpful given a very tight timeline here...

papr 12 July, 2021, 21:24:27

> It fits perfectly under our Varjo headset though

This is good to know.

Regarding sampling rates: The scene camera records at 30 Hz and the eye cameras record at 200 Hz, but the realtime gaze estimation happens at 66 Hz on average if you use the OnePlus 8 companion phone. https://pupil-labs.com/products/invisible/tech-specs/
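
(For concreteness: those nominal rates correspond to inter-sample intervals of 1/30 s ≈ 33 ms for the scene camera, 1/200 s = 5 ms for the eye cameras, and roughly 1/66 s ≈ 15 ms for the realtime gaze signal.)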

papr 12 July, 2021, 21:31:12

@user-359878 Also, these numbers are averages. The scene camera has a variable frame rate, too.

papr 12 July, 2021, 21:29:44

At the moment, I can only offer my previous response to this question https://discord.com/channels/285728493612957698/633564003846717444/862723198617649172

user-359878 12 July, 2021, 21:32:11

There is no way the data above would lead to averages at 200 Hz / 5 ms inter-sample time (ISI), though?

papr 12 July, 2021, 21:33:47

If you want to continue discussing this topic, I suggest moving the discussion to invisible where it belongs.

user-359878 12 July, 2021, 21:34:38

Sure, we can do that. I've left messages there too, but looked here to see if there was anything useful pending a response.

papr 12 July, 2021, 21:36:27

Pupil Labs' VR/AR-specific hardware is designed to work with Pupil Core software, which works fundamentally differently than Pupil Invisible. Most of your issues and questions do not apply to the Pupil Core products.

user-359878 12 July, 2021, 21:36:08

Deleting here to paste there.

user-c0e5b4 21 July, 2021, 15:45:56

Hello. I have Pupil Capture v3.4 and the hmd-eyes VR v1.4 package in Unity, and I'm trying to calibrate in one scene and collect data and a video in another. The Capture app crashes when I do that with the demo scenes "CalibrationScene" and "ScreenCastDemoScene". The ScreenCastDemoScene shows the warning text "Experimental feature -- requires Pupil Capture v1.15", so I used that version to record data and video, but then I cannot use the scene management of a calibration scene plus a recording scene, as that crashes Capture v1.15. Is there a solution for this? Sorry if this has been answered already; somehow I couldn't find the answer.

papr 21 July, 2021, 15:47:33

Both 3.4 and 1.15 crash reproducibly for you?

user-c0e5b4 21 July, 2021, 15:50:03

3.4 crashes when I want to collect video from the "ScreenCastDemoScene" via the Capture app. 1.15 does not do that and lets me record video and data from the "ScreenCastDemoScene", but then when I want to use the LoadSceneAfterCalibration function to calibrate in another scene, it crashes.

papr 21 July, 2021, 15:53:11

Please share the capture.log file from the pupil_capture_settings folder after reproducing the crash with v3.4. I will have a look tomorrow and check if I can identify the issue.
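
(For reference: Pupil Capture creates the pupil_capture_settings folder in the user's home directory; the log is typically rewritten on the next launch, so copy capture.log out right after reproducing the crash.)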

user-c0e5b4 21 July, 2021, 15:53:44

OK, thanks, I will!

user-53365f 22 July, 2021, 09:03:52

Hey papr. Regarding my question about creating an app for tracking individual pupils for someone with strabismus: it dawns on me... if I understand your reply in the other channel, one of the problems with doing it with the Core hardware is calibrating against the real world. I would imagine, therefore, there would be no such problem with VR? Or does that require a similar calibration step with both eyes in sync?

papr 22 July, 2021, 09:05:23

Yes, the same issue applies. The calibration is mostly about finding the physical relationship between the eye cameras and the target coordinate system (scene camera for Core; Unity main camera for VR applications)

user-53365f 22 July, 2021, 09:37:06

That's good to know, thanks. That makes me less inclined to look at a VR solution then, as the Core has a lot of real-world utility.

user-53365f 22 July, 2021, 12:52:51

If I understand the design correctly, Core doesn't support most glasses (if not all, depending on IR filtering?). Is that right?

papr 22 July, 2021, 12:54:30

It is kind of uncomfortable to wear both glasses and the Core headset. Also, the glasses' frames often occlude the eye cameras.

user-53365f 22 July, 2021, 12:56:52

Mine are rimless. What I could potentially do is mount the lenses directly to the frame or with a custom bridge. A shame Invisible is so costly, as I imagine it would enable me to achieve all my goals vis-a-vis independent eye tracking and prescription lenses?

papr 22 July, 2021, 12:58:10

Pupil Invisible does not provide monocular gaze estimations, though. They would not fit your requirements. A custom bridge holding the glasses could indeed work well.

user-53365f 22 July, 2021, 13:00:14

Thanks. In the end, possibly the best solution for my needs might actually be the VR solution, as it could accommodate lenses fine. A shame I can't try it out to see if it'll work. I'm sure I can find a way around most of these problems, but without devices in my hands it's hard to guess. I wonder if there are any researchers in London who have one? Or, at least, some precedent/advice about working with these devices for people with glasses?

user-5ac103 27 July, 2021, 14:03:36

Hello everyone, I am using a binocular eye-tracking add-on for an HTC Vive with a sampling rate of 200 Hz. I ran some experiments at the end of 2020. Unfortunately, my raw data analysis suggests that the sampling rate was only 120 Hz. The CPU is an Intel Core i7-9700K (3.6 GHz). The Unity plugin was hmd-eyes v1.3 and I was using Pupil Capture v2.4. To get correct timestamps for sampling, I use this desktop to render the Unity VR application. The trigger in the Unity application uses the same C# command for data recording as the Pupil Labs recording controller. I do not use screen casting, in order to keep the CPU load low, as recommended for good performance in your hmd-eyes GitHub documentation.

Is there anything I can do to “boost” the sampling rate?

Thanks in advance!
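
One way to double-check the effective rate is to compute inter-sample intervals from the recorded pupil timestamps. Below is a minimal sketch, assuming a Pupil Player pupil_positions.csv export whose first column is pupil_timestamp and whose third is eye_id (verify against your export's header, as column order may differ):

```csharp
using System;
using System.Globalization;
using System.IO;
using System.Linq;

class SamplingRateCheck
{
    static void Main(string[] args)
    {
        // Path to the export; adjust as needed.
        string path = args.Length > 0 ? args[0] : "pupil_positions.csv";

        // Timestamps (seconds) for eye0 only; the column indices are
        // assumptions, check the CSV header: pupil_timestamp, world_index, eye_id, ...
        double[] ts = File.ReadLines(path)
            .Skip(1) // header row
            .Select(line => line.Split(','))
            .Where(cols => cols[2] == "0")
            .Select(cols => double.Parse(cols[0], CultureInfo.InvariantCulture))
            .ToArray();

        // Mean inter-sample interval -> effective sampling rate.
        double meanIsi = ts.Zip(ts.Skip(1), (a, b) => b - a).Average();
        Console.WriteLine($"{1.0 / meanIsi:F1} Hz effective (mean ISI {meanIsi * 1000:F2} ms)");
    }
}
```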

papr 27 July, 2021, 14:04:45

Have you changed the target frame rate to 200 in the video source menu of the eye windows already?

user-5ac103 27 July, 2021, 14:49:42

I changed it to 200 Hz the first time I piloted the study. Unfortunately, I thought that the settings in the menu would be saved unless you said otherwise (i.e. restart with default settings). This seems not to be the case. Is there an initialisation routine I could run, or a hidden save button I could use, in order to get 200 Hz whenever I am using Pupil Capture?
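
Capture normally restores per-session settings unless they are reset, but if they keep reverting, one workaround could be to push the frame rate from Unity at connect time using RequestController.Send, which appears in the stack trace earlier in this channel. This is only a hedged sketch: the notification subject, plugin name, and argument keys below are assumptions based on Capture's UVC video source and should be verified against your Capture version:

```csharp
using System.Collections.Generic;
using PupilLabs;
using UnityEngine;

public class ForceEyeFrameRate : MonoBehaviour
{
    public RequestController requestCtrl; // assign the scene's RequestController

    // Call once connected, e.g. from RequestController's OnConnected event
    // (if your plugin version exposes it).
    public void Apply()
    {
        foreach (string eye in new[] { "eye0", "eye1" })
        {
            // "start_eye_plugin"/"UVC_Source" and the arg keys are assumptions;
            // the source may also need a camera "name" to attach to.
            requestCtrl.Send(new Dictionary<string, object>
            {
                { "subject", "start_eye_plugin" },
                { "target", eye },
                { "name", "UVC_Source" },
                { "args", new Dictionary<string, object>
                    {
                        { "frame_size", new[] { 192, 192 } }, // 200 Hz mode of the add-on cameras (assumption)
                        { "frame_rate", 200 },
                    }
                },
            });
        }
    }
}
```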

user-2c8a6d 28 July, 2021, 01:44:57

Hi. Could you please recommend some useful books about eye tracking?

user-7b683e 28 July, 2021, 11:18:50

Hey, you can find many academic studies on this topic. For example, the Pupil eye tracker started as a master's thesis. If you read it, I think you will learn a lot about the approach and background of eye trackers.

user-2c8a6d 28 July, 2021, 01:49:05

I need a book that explains the different kinds of methods for analysis.

user-5ba7db 29 July, 2021, 10:42:44

Hello everyone, my workgroup has owned a binocular eye-tracking add-on for an HTC Vive for some time now. Most recently I was tasked with setting up the tracking of experimental VR applications for teaching. We just require a simple recording of the VR screen (like the one from SteamVR), combined with the visualized tracking. I am pretty new to this whole matter and by no means qualified, but I recently worked successfully with the Pupil Capture headset. After setting up the hardware and starting Capture, the eye cameras show up properly via external USB (and the HTC Vive USB port), but I simply cannot get a world view in Capture. Is it even possible to get the basic results for VR tracking in Capture (like a headset with a world-view camera)? Or do I need to set up an external project, fusing together tracking data from Capture and the world view from Steam, etc., utilizing the hmd-eyes project, even for basic capture?

Thanks in advance
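
(Note: the VR add-on has no physical world camera. As comes up with the ScreenCastDemoScene above, hmd-eyes can instead stream Unity's main camera into Pupil Capture via screen casting, which then serves as the world view for visualization and recording.)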

user-5ba7db 29 July, 2021, 10:51:43

Thanks for the fast answer, I guess this will solve my basic problem. Going to try it ASAP 🙂

papr 29 July, 2021, 10:52:41

This feature only streams Unity's main camera, though. I do not know how you would capture Steam application content with that.

End of July archive