vr-ar


user-0eef61 01 November, 2019, 03:16:17

Hi, I am trying to run the scripts for the fixation detector, gaze velocity, and pupil diameter by fixation on surface from the Pupil Labs tutorials: 1 - https://github.com/pupil-labs/pupil-tutorials/blob/master/01_load_exported_data_and_visualize_pupillometry.ipynb In the fixation detector tutorial, In [14] says that fixation_detector has not been defined.

user-0eef61 01 November, 2019, 03:18:03

2 - https://github.com/pupil-labs/pupil-tutorials/blob/master/05_visualize_gaze_velocity.ipynb In the gaze velocity tutorial, theta, psi, and r are not defined. I assume theta can be retrieved from pupil_positions.csv. Can psi also be retrieved from pupil_positions.csv by using phi instead? However, I do not know how to retrieve r.
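(For context: in that tutorial, theta, psi, and r typically come from converting the 3d gaze point columns of the exported gaze data, e.g. gaze_point_3d_x/y/z, into spherical coordinates, rather than from pupil_positions.csv directly. Below is a minimal sketch of such a conversion and of the angular velocity between two consecutive samples; the function names and the exact axis convention are illustrative and may differ from the notebook.)

```python
import math

def cart_to_spherical(x, y, z):
    """Convert a 3d gaze point to spherical coordinates (r, theta, psi).
    Convention assumed here: r is the distance to the gaze point,
    theta the polar angle, psi the azimuthal angle."""
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.acos(y / r)   # polar angle
    psi = math.atan2(z, x)     # azimuthal angle
    return r, theta, psi

def angular_velocity(t0, p0, t1, p1, dt):
    """Approximate angular speed (rad/s) between two gaze directions
    given as (theta, psi) pairs sampled dt seconds apart."""
    # great-circle distance between the two directions on the unit sphere
    cos_d = (math.sin(t0) * math.sin(t1) * math.cos(p0 - p1)
             + math.cos(t0) * math.cos(t1))
    d = math.acos(max(-1.0, min(1.0, cos_d)))
    return d / dt
```

Note that r drops out of the velocity computation entirely: gaze velocity only depends on the direction of the gaze ray, not on the distance to the gaze point.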

user-0eef61 01 November, 2019, 03:19:13

3 - https://github.com/pupil-labs/pupil-tutorials/blob/master/06_fixation_pupil_diameter.ipynb I am not sure where I can find the fixations_on_surface.csv

user-0eef61 01 November, 2019, 03:19:31

Your advice would be very helpful. Thanks 🙂

user-c5fb8b 01 November, 2019, 07:29:21

Hi @user-0eef61 I think the error you are seeing might come from not running the entire notebook. In case you are new to Jupyter Notebooks: every code cell has access to the results of previously executed cells. The sections you are referring to reference variables or functions defined in previous code cells of the respective notebooks, so I assume you just executed a single cell, or copied the code from a single cell and executed it in isolation? If this is not the case, please let me know, then I'll take a deeper look at your problems! 🙂

user-0eef61 01 November, 2019, 11:22:53

Hi @user-c5fb8b Thanks for your reply. I have copied all the code and executed it in my own Jupyter and Spyder. I don't know why it says that fixation_detector has not been defined, because I saw it defined in previous lines. Maybe it is something else?

user-0eef61 01 November, 2019, 11:24:15

The previous code (for example, plotting pupil diameter) works very well.

user-c5fb8b 01 November, 2019, 12:03:08

@user-0eef61 Let's maybe move this conversation to the core channel, as it's not directly related to VR/AR... I'll ping you there!

user-0eef61 01 November, 2019, 13:45:02

Ok @user-c5fb8b Hopefully we can fix this

user-0eef61 01 November, 2019, 13:45:12

because I want to make some analysis of fixations

fxlange 05 November, 2019, 08:45:51

@here We are pleased to announce the latest release of hmd-eyes v1.1!

https://github.com/pupil-labs/hmd-eyes/releases/tag/v1.1

hmd-eyes v1.1 brings support for remote annotations via the AnnotationPublisher component and adds a Publisher C# class for the network layer. By utilizing the latter for sending calibration data, the calibration routine is now robust against interruptions caused by network/performance issues.

user-dae891 05 November, 2019, 21:01:07

@papr Does the Vive Pro add-on work with the newest product, the "Vive Pro Eye"?

wrp 06 November, 2019, 02:43:10

@user-dae891 we have not evaluated the fit of our Vive Pro add-on in the Vive Pro Eye. Additionally, the Vive Pro Eye is not a system we are looking to make an add-on for. That being said, if you would like to prototype something for this system, we can provide you with hardware (cameras + cabling). Please email sales@pupil-labs.com with a request.

user-94ac2a 07 November, 2019, 10:35:52

why does calibration in Pupil Capture require a world camera, but the HMD version does not?

fxlange 07 November, 2019, 10:46:07

@user-94ac2a For the HMD calibration, the VR view basically replaces the world camera (in the case of hmd-eyes, the Unity VR/stereo camera). Because of that, the HMD calibration always needs the VR/AR client to implement and run part of the calibration protocol, which hmd-eyes provides for building applications in Unity.

user-94ac2a 07 November, 2019, 11:43:02

@fxlange So basically, if it is a custom headset, it needs a VR/AR client that implements its own?

fxlange 07 November, 2019, 11:50:40

@user-94ac2a it depends on what you mean by custom headset. Unity supports many VR headsets, and you can use hmd-eyes with all of them.

user-d77d0f 07 November, 2019, 12:46:16

Hi! I'm trying to run the GazeRaysDemoScene in unity but it doesn't seem to be working (all other demos work perfectly fine). I get an error saying: "NullReferenceException: Object reference not set to an instance of an object PupilLabs.DisableDuringCalibration.Awake()". Any idea of how to solve it?

fxlange 07 November, 2019, 13:37:44

Hi @user-d77d0f - Indeed, the DisableDuringCalibration component of the HeadAnchor game object is missing a reference. The CalibrationController is not assigned.

user-d77d0f 11 November, 2019, 19:48:42

Hi! I'm trying to retrieve gaze data in Unity and am therefore using the call "gazeController.OnReceive3dGaze += ReceiveGaze;". However, I get the following error in Unity: "NullReferenceException: Object reference not set to an instance of an object BenchMarkBehaviour+<displayCubes>d__10.MoveNext () (at Assets/Scenes/BenchMark/BenchMarkBehaviour.cs:52) UnityEngine.SetupCoroutine.InvokeMoveNext (System.Collections.IEnumerator enumerator, System.IntPtr returnValueAddress) (at C:/buildslave/unity/build/Runtime/Export/Coroutines.cs:17)" (where line 52 in my script corresponds to the call indicated above). I'm not sure what the issue is, considering I've mostly re-applied what is done in the demos. Thanks in advance 🙂

user-cd7aec 12 November, 2019, 03:04:13

Hey everyone. I'm trying to improve my gaze accuracy, and the documentation online says that the 2d calibration has <1 deg accuracy whereas the 3d calibration has 1.5-2 deg accuracy. Is there a reason why support for 2d calibration was dropped in hmd-eyes 1.0, considering that 2d seems to outperform 3d? Thanks!

user-fd4a45 12 November, 2019, 08:48:25

Hi! I've just read a lot of messages from this discussion. I still have one question: I can't use Unity, as our research is based on Skyrim VR (on Steam). Is there any way I can still manage to retrieve a precise gaze position on the "world cam" view (maybe thanks to a real-time recording of the view of the headset)? Maybe I've missed something with my approach...

user-d77d0f 12 November, 2019, 15:59:37

Hi! I seem to be getting errors due to the gazeController not being instantiated, but I assign the Connection (SubscriptionsController) to it just like in the demos. I'm not sure what else I need to do to stop getting these NullReferenceExceptions.

user-29e10a 12 November, 2019, 21:34:14

@user-dae891 @wrp we checked today whether the add-on would fit inside the Vive Pro Eye. It doesn't. The Tobii tracker also uses a similar infrared array, so the Pupil Labs ring is too small to fit. 🤷🏼‍♂️

user-14d189 13 November, 2019, 04:06:54

Hi there, I used the Vive Pro add-on for pupil tracking on Win10 with Pupil 1.18. The frame rate between eye0 and eye1 was very different: eye0 100-120, eye1 40-50. Is there an easy fix for it? I restarted eye1, but no difference; different resolutions - no change. Thanks for your help.

user-14d189 13 November, 2019, 04:15:47

Chat image

user-14d189 13 November, 2019, 04:17:06

And when closing Service, it does not save the settings, does it?

wrp 13 November, 2019, 04:21:14

@papr or @user-c5fb8b please could you follow up on this question today ☝ ?

papr 13 November, 2019, 06:55:11

@user-14d189 The resolution should be saved between sessions if the program was shut down gracefully.

The frame rate can be limited by very high exposure times. Can you check the exposure settings of both cameras in the UVC source menu?
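(The reason exposure time caps the frame rate: the sensor cannot start a new frame before the previous exposure has finished, so the frame interval can never be shorter than the exposure time. A rough sketch of the implied upper bound, under the assumption that the UVC absolute-exposure value is in units of 0.1 ms, which is common but driver-dependent:)

```python
def max_fps_for_exposure(exposure_value, unit_s=1e-4):
    """Upper bound on the camera frame rate implied by the exposure time.
    Assumes the UVC absolute-exposure value is in units of 0.1 ms
    (unit_s = 1e-4 seconds per unit); check your driver's actual unit."""
    exposure_s = exposure_value * unit_s
    return 1.0 / exposure_s
```

Under that assumption, an exposure value of 75 (7.5 ms) already caps the camera at roughly 133 fps, and larger values push the achievable rate further down, which matches the behavior reported later in this thread.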

fxlange 13 November, 2019, 08:59:08

@user-fd4a45 The short answer is no, gaze estimation with third-party VR clients is not supported. But depending on the VR client, a workaround might be possible. The important part is the calibration between the VR cameras and the eye cameras. Without access to the camera intrinsics of your VR client (Skyrim VR), you won't be able to retrieve accurate 3d gaze estimations. If you knew all camera parameters, you might be able to mimic them in Unity, run the calibration via hmd-eyes, and switch to Skyrim VR afterwards. Another approach could be to place calibration targets in your VR client at known positions at known times - for example by writing a Skyrim mod - and communicate these timestamped reference positions to Pupil Capture. But again, these are just ideas. I haven't tested any of that, and I'm not sure whether it would be feasible.
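(To illustrate the second idea: each displayed target would need to be paired with a timestamp on Pupil's clock before being sent to Pupil Capture. The sketch below only shows a hypothetical shape of such a reference datum; the field names are illustrative and the actual format expected by the HMD calibration should be checked against the hmd-eyes / Pupil source.)

```python
def make_reference_datum(target_pos_mm, pupil_timestamp):
    """Hypothetical timestamped calibration reference datum.
    target_pos_mm: (x, y, z) of the displayed target in the headset frame.
    pupil_timestamp: display time converted to Pupil's clock - without
    clock synchronization between the game and Pupil, the reference
    positions cannot be matched to the right pupil data."""
    x, y, z = target_pos_mm
    return {
        "mm_pos": (x, y, z),          # illustrative field name
        "timestamp": pupil_timestamp  # must be on Pupil's clock
    }
```

The hard part in practice is not the payload shape but the clock sync and knowing the VR client's camera intrinsics, as described above.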

fxlange 13 November, 2019, 09:03:47

@user-d77d0f It looks like you didn't assign your GazeController to your benchmark behaviour. Make sure that BenchMarkBehaviour::gazeController is public or marked [SerializeField] so that you can assign it via the Inspector.

fxlange 13 November, 2019, 09:10:15

@user-cd7aec The 2d approach is very sensitive to slippage (which can happen quickly with HMDs), while the 3d approach is able to adjust to slippage over time.

user-14d189 14 November, 2019, 01:35:17

Hi @papr, Thank you, it works. Service starts with the auto exposure mode set to manual and an absolute exposure time of 63. If I set the exposure time higher than 75, the frame rate starts to drop. Cheers

user-d77d0f 16 November, 2019, 15:21:28

Hi! I'm trying to better understand how Pupil and Unity communicate, but I can't find as much in-depth explanation on the subject as I need (I've checked the Developer Documentation). For example, how are the GazeData dictionaries updated by Pupil? I see that it depends on the GazeListener, but I can't seem to find where the dictionaries are created/updated. Thanks in advance 😄

user-0ccd62 18 November, 2019, 18:54:56

hello guys, I have a question about getting pupil size data in Unity from Pupil Labs. Does anyone know how to get the pupil diameter data? Thanks in advance.

user-b09747 21 November, 2019, 03:24:42

Hi! I am developing a HoloLens app in Unity. Until now, it worked well when I built the Unity app on the HoloLens and connected it with the Pupil Capture program. But after I recently changed my internet settings, I can't connect to the newly built app. Do you think VPN or proxy settings could affect connecting remotely to Pupil?

user-f8595b 21 November, 2019, 12:16:14

are there any extra steps needed when the Pupil VR add-on and the HTC Vive are connected? I cannot see any video using Pupil Service or Capture

user-f8595b 21 November, 2019, 12:16:24

hello, I am Lydia

fxlange 22 November, 2019, 08:21:20

hi @user-b09747 - your connection issues might be related to your new internet settings, especially when you are trying to connect remotely, but that's hard to tell from here. make sure that the ports are not blocked but forwarded properly (50020 by default), and test whether everything works locally first.

fxlange 22 November, 2019, 08:26:06

hi @user-0ccd62 are you using hmd-eyes v1.0+? please check the developer docs: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md and have a look at the PupilDemo. the demo showcases how to receive PupilData from a PupilListener.
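(On the data itself, assuming the Unity side passes through the fields of Pupil's pupil datum: `diameter` is measured in eye-image pixels and depends on camera distance, while `diameter_3d` is a model-based estimate in mm that is only present when the 3d detector is running. A small sketch with a fabricated datum; the values are made up for illustration:)

```python
# Fabricated pupil datum; field names follow Pupil's pupil topic.
pupil_datum = {
    "topic": "pupil.0",
    "confidence": 0.98,
    "diameter": 42.7,     # pixels in the eye image (2d detector)
    "diameter_3d": 3.1,   # mm, model-based (3d detector only)
}

def pupil_diameter_mm(datum):
    """Prefer the scale-invariant 3d diameter in mm; return None when the
    3d detector is not running and only the pixel diameter exists."""
    return datum.get("diameter_3d")
```

For pupillometry, the mm value is usually the one you want, since the pixel value changes with eye-camera distance and headset fit.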

fxlange 22 November, 2019, 08:39:24

hi @user-d77d0f - the plugin and the GazeTracker prefab include a GazeController, which wraps and "controls" the GazeListener class as a MonoBehaviour. The GazeListener subscribes to the gaze topic provided by Pupil Capture/Service (via the network API), as described here: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#accessing-gaze-data.

In other words, Pupil Capture/Service sends gaze data via the network API; the GazeListener subscribes to it, receives and parses the messages, and provides C# events containing the pre-parsed GazeData. In your custom scripts you only need a reference to a GazeController (which lives in your scene) and to bind a custom method to GazeController.OnReceive3dGaze.
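(For reference, the gaze messages arriving over the network API are dictionaries. The sketch below shows a fabricated gaze datum with the field names used by Pupil's gaze topic - topic, timestamp, confidence, norm_pos, gaze_point_3d - and the kind of confidence filtering the listener side typically applies; the values and the helper function are illustrative, not part of hmd-eyes.)

```python
# Fabricated gaze datum; field names follow Pupil's gaze topic,
# values are made up for illustration.
sample = {
    "topic": "gaze.3d.01.",
    "timestamp": 1234.567,
    "confidence": 0.93,
    "norm_pos": [0.48, 0.52],              # gaze in normalized view coordinates
    "gaze_point_3d": [12.0, -3.5, 480.0],  # mm, in the world/VR camera frame
}

def extract_gaze(datum, min_confidence=0.6):
    """Return (norm_pos, gaze_point_3d) tuples, or None for
    low-confidence data that should be discarded."""
    if datum.get("confidence", 0.0) < min_confidence:
        return None
    return tuple(datum["norm_pos"]), tuple(datum.get("gaze_point_3d", ()))
```

On the Unity side, GazeData plays the role of this parsed dictionary, and the OnReceive3dGaze event fires once per received datum.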

user-0fc4b2 23 November, 2019, 10:18:54

@fxl The HTC Vive add-on cannot be linked with the HTC Vive to collect eye tracking data coordinated with the world view of an arbitrary VR client, right?

user-8779ef 23 November, 2019, 20:26:16

@user-0fc4b2 the vive integration does in fact take head position and orientation into account to cast a gaze ray into the virtual environment.

user-8779ef 23 November, 2019, 20:27:18

This is a function of the Unity plug-in, which polls the Pupil Labs server for eye-in-head data from within Unity.

user-716336 24 November, 2019, 09:53:32

I am trying to create an experimental setup using the HTC Vive Pro VR set with the Pupil add-on eye tracker. We have a collection of experiments already coded in PsychoPy and OpenSesame. We can display these experiments in the Vive environment (i.e. in the head-mounted display) using Steam's features: showing the desktop, or with the Virtual Desktop plugin. It is also possible to record what the participant is seeing in their glasses using OBS and its OpenVR capture plugin. I can't figure out how to interface all this with Pupil, i.e. how to record what the participant is seeing as the "world" video. I have tried the Pupil hmd-eyes plugin: if I'm not mistaken, it allows creating an environment in Unity and capturing it using Pupil Capture (eye + world). However, we would like to avoid having to rewrite all our experiments in Unity, if at all possible. Is it possible to stream from OBS into Pupil Capture? Is there another way of using Pupil Capture to record an entire VR session in our situation? Any ideas will be very much appreciated. Thank you in advance, George

user-716336 24 November, 2019, 12:50:34

Update: another idea would be to stream the Pupil eye cameras to OBS Studio and record them simultaneously with the HMD scene video. Unfortunately, when I look at "Video Sources" in OBS Studio, the two eye cameras are not available. A workaround would be to run Pupil Service or Pupil Capture and instruct OBS Studio to record the two windows (window capture) -- but this is starting to look too much like a hack to rely on for a research experiment... Note that a couple of the cognitive psychology experiments could be reprogrammed in Unity, but we'll run into the same problem if we try to use a more sophisticated application, like a flying/driving simulator that uses the HMD without giving us access to its source code.

user-bd0840 29 November, 2019, 10:21:53

hi there! i have some questions about the hmd-eyes stuff (mainly for reverse engineering purposes): what's the reasoning behind the circle radius values in the default 3d calibration targets? and behind the distance of the "circle layers"? and also, why is a gaze position behind the observer expected (https://github.com/pupil-labs/hmd-eyes/blob/c96b6b5331540185089bcc761b6e1467cf7f82cc/plugin/Scripts/GazeData.cs#L186)? maybe @fxlange has time for a quick chat? 😅

End of November archive