vr-ar


user-94ac2a 02 December, 2019, 06:35:15

Can I use 2D mode in unity?

user-bd0840 02 December, 2019, 12:45:00

@user-94ac2a i think 2d mapping was removed in the new hmd-eyes version

user-fc194a 02 December, 2019, 14:45:20

Hi, when viewing my recordings in Player, the messages "No Pre-Recorded Calibration found/Loading dummy calibration" always appear. The sequence I use in Unity is to run calibration first, wait for success status and then run my tests (with Pupil Capture in background). There are no calibration files produced in the folder either. Is this normal? (I am using a 200Hz binocular HTC add-on)

user-fc194a 02 December, 2019, 14:48:12

Another issue I have is with light-colored eyes. Compared to darker eye colors, where the algorithms work very well, I cannot seem to find the proper settings to make the detection work as well with light-colored eyes. What would be your recommendations to get good results? Thank you

user-94ac2a 03 December, 2019, 03:05:17

@user-bd0840 but it seems 2D is much more accurate?

user-bd0840 03 December, 2019, 14:34:48

@user-94ac2a on paper it seems so. however, i haven't been able to get the 2D calibration to work reliably, neither with the unity-based example nor with my own code. 3d seems to be working quite well, though.

user-bd0840 03 December, 2019, 14:35:34

so maybe i was doing stuff wrong (happy for input here 😅), but i guess there's a reason 2D is gone from the current hmd-eyes

user-9a94be 04 December, 2019, 12:48:59

Hello! I have a question related with the ScreenCast component. I have seen the demo and I noticed that even if I put the same parameters for the main camera and the ScreenCast camera, I never get exactly the same image in Pupil Capture and the Game Screen at Unity. I am not sure if this is normal and I am just getting started with Unity and Pupil, so sorry if my doubt seems silly. Thanks for your help 🙂

user-7ebecb 04 December, 2019, 18:07:39

Hi! I have a question regarding the exported gaze data when recording gaze positions as demonstrated in the DataRecordingDemo. I can’t find the gazeDirection and gazeDistance variables in the exported .csv files. Am I just looking at the wrong headers or do I need to include them somewhere else? Also, how can I add a timestamp in Unity time to the output (I know how to transform the timestamps with the TimeSync script, but I can’t figure out how to export them properly)? Thank you!

fxlange 04 December, 2019, 18:49:52

Hi @user-7ebecb - GazeDirection and GazeDistance aren't available in the export. They are calculated in hmd-eyes while the recorder only stores data available directly in Pupil Capture. That said, you can calculate both based on the 3d gaze point which is available in the recording/export (direction = 3d gaze normalized, distance = length of 3d gaze vector).
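That calculation can be sketched in a few lines of Python (the helper name is mine; `gaze_point_3d` is the 3d gaze point from the recording/export):

```python
import math

def direction_and_distance(gaze_point_3d):
    """Split a 3d gaze point into a unit direction vector and a distance.

    direction = the gaze point normalized, distance = the length of the
    gaze vector, exactly as described above.
    """
    x, y, z = gaze_point_3d
    distance = math.sqrt(x * x + y * y + z * z)
    if distance == 0.0:
        raise ValueError("a gaze point at the origin has no direction")
    direction = (x / distance, y / distance, z / distance)
    return direction, distance

# Example: a gaze point two units straight ahead.
direction, distance = direction_and_distance((0.0, 0.0, 2.0))
```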

fxlange 04 December, 2019, 19:39:57

Hi @user-9a94be, the main and the screencast camera aren't supposed to have the exact same parameters and can't really show the exact same image. As the Unity main camera actually represents two stereo cameras in VR you can check that the screencast camera always covers the gaze target area (by tweaking FOV) and that the mapping of gaze data to the screencast camera is working (https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#screencast).

fxlange 04 December, 2019, 20:19:36

Hi @user-fc194a, when recording via Pupil Capture, pupil and gaze data are stored directly and Player doesn't need the calibration data. Have you checked the pupil detection fine tuning section https://docs.pupil-labs.com/core/software/pupil-capture/#pupil-detector-2d-3d? Especially the intensity range should help to some degree.

fxlange 04 December, 2019, 20:31:04

@user-94ac2a @user-bd0840 yes, we discontinued support for 2d in hmd-eyes v1.0. while it can be more accurate on paper, it requires a fixed depth, and 3d is more robust against slippage.

fxlange 04 December, 2019, 20:39:09

@user-bd0840 the default calibration targets are arranged on circles. adjusting the radius and distance allows you to cover the full field of view at different distances, but make sure to have enough points around the center of your fov as well.
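For illustration only (this is not the hmd-eyes implementation, and the function and parameter names are mine), targets arranged on such a circle in camera/hmd space could be generated like this:

```python
import math

def circle_targets(radius, depth, count, center=(0.0, 0.0)):
    """Place `count` calibration targets evenly on a circle in camera space.

    `radius` and `center` lie in the camera's x/y plane; `depth` is the
    distance along z. Returns a list of (x, y, z) positions.
    """
    cx, cy = center
    step = 2.0 * math.pi / count
    return [
        (cx + radius * math.cos(i * step),
         cy + radius * math.sin(i * step),
         depth)
        for i in range(count)
    ]

# One ring plus an explicit center target, so the middle of the FOV is covered too.
targets = circle_targets(radius=3.0, depth=10.0, count=8) + [(0.0, 0.0, 10.0)]
```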

user-bd0840 05 December, 2019, 15:02:31

@fxlange thanks! i presume the circle radii that are used in the example were then determined by trial and error, or is there some "deeper truth" to them?

user-e91538 05 December, 2019, 17:33:31

Hi everybody, I need to ask a couple of pre-sales questions about VR. Am I in the right place?

user-e91538 05 December, 2019, 17:34:47

papr this is the right section for my question I guess

user-e91538 05 December, 2019, 17:35:56

Is it possible to integrate your VR eye tracker into the Oculus Quest? If yes, could I see the user's eye via the Unreal mirroring?

user-e91538 05 December, 2019, 17:46:40

@papr

user-9a94be 06 December, 2019, 16:24:42

Thank you very much, @fxlange !

user-f8595b 08 December, 2019, 05:24:19

afternoon, we have connected. however, we cannot see the VR video on screen, so we cannot capture the heat map using the eye tracking. we can see the eye-tracking video without any VR picture. how should we adjust or link them? Thank you.

user-f8595b 08 December, 2019, 05:51:53

and another question: there is no data when we click the R button.

user-9d2520 08 December, 2019, 12:37:21

Hi, I’m trying to use the Pupil HMD on an HTC Vive to record data from a Unity stimulus. I wanted to use the demo scene (GazeDemo) for calibration and then switch over to the scene with the stimulus using Unity’s SceneManager. I’ve noticed that the connection does not seem to be persistent between switching scenes, so I am unsure if the calibration data is used when displaying the stimulus and recording eye movements. Is the calibration data retained between scenes or is there an extra step I need to add? Thanks! 😀

user-9a94be 08 December, 2019, 13:07:51

Hi @user-9d2520 , I am trying to do the same thing and in general I would say the calibration remains and works quite well, but my technical abilities are limited and I'm just telling you my impression. I did not do anything too different from the SceneManagement demo, except for the ScreenCast part, which I attach to the camera and then I just add the DontDestroy script there. Are you doing something similar?

user-9a94be 08 December, 2019, 13:10:12

By the way, the only annoying thing I am trying to figure out is how to avoid keeping the Status Text in the Application Scene. Because I have it as a child of the Main Camera, it never goes away when the Application Scene loads.

user-94ac2a 09 December, 2019, 08:01:11

Any documentations on Default Targets Settings?

user-94ac2a 09 December, 2019, 08:01:59

Chat image

user-845a9a 09 December, 2019, 08:31:57

Hey all, we are using the Vive Add-on, but unfortunately the HTC Vive no longer fulfills our requirements in terms of resolution and field of view. We therefore want to switch to the Pimax 8K. Has anyone made attempts to customize the add-on so that it fits in the Pimax? If so, can you provide any DIY instructions (or 3D print templates) to customize the add-on? I would really appreciate it if there were an add-on or some 3D-printable parts for the Pimax.

user-141bcd 09 December, 2019, 16:16:17

@fxlange : is it problematic if I'm seeing negative pupilTimestamps (e.g., -800282.357787)?

user-d77d0f 11 December, 2019, 10:29:29

Hi! I'm trying to figure out how to detect fixations in real-time in Unity, all of the information I've found so far uses the data from pupil capture and detects fixations in post-processing.

fxlange 12 December, 2019, 09:18:27

Hi @user-141bcd, as long as the timestamps are increasing it is not problematic.
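Since only the differences between pupil timestamps carry meaning, mapping them into another clock (e.g. Unity time) is just applying an offset obtained from one synchronized sample pair. A hypothetical helper to illustrate the idea (this is not the actual TimeSync script):

```python
def make_converter(pupil_ts, unity_ts):
    """Return a function mapping pupil timestamps to Unity time.

    Pupil timestamps can start at an arbitrary (even negative) value;
    only the differences matter, so a single offset is enough.
    """
    offset = unity_ts - pupil_ts
    return lambda t: t + offset

# Suppose the pupil clock read -800282.357787 when Unity time was 12.5 s.
to_unity = make_converter(pupil_ts=-800282.357787, unity_ts=12.5)
unity_time = to_unity(-800282.257787)  # 0.1 s later on the pupil clock
```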

fxlange 12 December, 2019, 09:24:05

@user-94ac2a hi, you can find the documentation for the calibration here: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#calibration. The settings describe the circles on which we arrange the targets. The targets are in camera/hmd space.

fxlange 12 December, 2019, 09:31:04

hi @user-9d2520, the calibration is consistent between scenes, you just have to make sure to apply "DontDestroyOnLoad" to the connection and other hmd-eyes objects when switching scenes. We have a demo for that as well (SceneManagementDemo) - as @user-9a94be pointed out 👍

fxlange 12 December, 2019, 10:09:50

@user-9a94be you can remove or deactivate GameObjects by custom script, reference them via the inspector (as a public field for example) and destroy/deactivate them based on input or events.

fxlange 12 December, 2019, 10:17:44

@user-f8595b the pupil hmd add-on has no world camera and requires a connection to the VR client and access to the VR view. this won't work with third-party VR clients, but you can, for example, use hmd-eyes to build your own VR environment in Unity: https://github.com/pupil-labs/hmd-eyes. hmd-eyes offers a screencast demo to showcase how to stream the VR view to pupil capture via the ScreenCast component: https://github.com/pupil-labs/hmd-eyes/blob/master/docs/Developer.md#screencast

user-82e7ab 13 December, 2019, 07:59:12

hi @fxlange/@wrp I was looking for information regarding support for the Valve Index, and I've read that you were evaluating this a few months ago - have you made a decision yet?

fxlange 13 December, 2019, 09:45:09

hi @user-d77d0f - yes, we have a fixation plugin for pupil player but also for pupil capture. the latter can be accessed in realtime via the network API. We don't have direct support for fixations in hmd-eyes, but you can use the SubscriptionsController to subscribe and listen to the fixation topic via custom scripts.

Fixations are represented as dictionaries consisting of the following keys:
- topic: Static field set to fixation
- norm_pos: Normalized position of the fixation’s centroid
- duration: Exact fixation duration, in milliseconds
- dispersion: Dispersion, in degrees
- timestamp: Timestamp of the first related gaze datum
- confidence: Average pupil confidence
- gaze_point_3d: Mean 3d gaze point

(from the old docs, information is missing in the new ones but we are working on it)
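Once fixation dictionaries with those keys arrive via a custom subscription script, they can be handled in plain code. A sketch in Python for brevity (the helper name, thresholds, and all sample values are invented):

```python
def long_fixations(fixations, min_duration_ms=200.0, min_confidence=0.6):
    """Keep only fixation dicts above a duration and confidence threshold."""
    return [
        f for f in fixations
        if f["duration"] >= min_duration_ms and f["confidence"] >= min_confidence
    ]

# Two invented fixation messages using the keys listed above.
sample = [
    {"topic": "fixation", "norm_pos": (0.5, 0.5), "duration": 350.0,
     "dispersion": 1.2, "timestamp": 1031.25, "confidence": 0.91,
     "gaze_point_3d": (0.0, 0.1, 1.5)},
    {"topic": "fixation", "norm_pos": (0.2, 0.8), "duration": 90.0,
     "dispersion": 0.8, "timestamp": 1032.10, "confidence": 0.95,
     "gaze_point_3d": (0.2, 0.0, 1.1)},
]
kept = long_fixations(sample)  # only the 350 ms fixation passes both thresholds
```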

user-7ba0fb 16 December, 2019, 10:08:04

hi. when I play back the VR scenario in Pupil Player, I found that the visualized gaze point is incorrect (it did not show the position where I looked), while the Gaze Visualizer component almost showed the right gaze direction. What's the problem?

user-9a94be 16 December, 2019, 11:05:32

Hi @fxlange, I saw in the documentation that the Screencast component can be very expensive in terms of performance and that it is suggested to use another computer for this. How can this be done? I have no clue how to run the VR on one computer while showing it as the World view in Pupil Capture on a different computer. Any suggestions? Thank you very much.

user-e91538 16 December, 2019, 13:52:25

Hi I am working with HMD Eyes Unity Plugin and I managed to get eye0 & eye1 texture2d from the plugin. Is it possible to get world video frame by frame into texture2d the same way as the eye video? It would be very helpful. Thanks!

user-e91538 16 December, 2019, 13:53:06

[email removed] can you check please, because I think this should be possible.

user-e91538 16 December, 2019, 15:17:28

It is working for world video; the topic is called "frame.world"

user-0eae33 16 December, 2019, 16:00:52

Hey guys, currently I am working with the HTC Vive plug-in in the Vizard environment. I already got some data output, which is awesome. Now I'm trying to measure the vergence via the data output 'gaze_normal'. In the current data output there is no data for that, even though it is supposed to be there. I guess that I somehow have to collect in the 3d mode instead of the 2d mode. I found an option for that in the Pupil Capture app, but every time I run the code from Vizard the preferences are reset to the 2d mode (same for another calibration mode, btw.).

Now, is there a possibility to set the data recording mode to 3d in the vizard code (and change the calibration mode)? Or can I define that somewhere else so I can get the data I want?

Thank you guys for your work btw!

user-175561 17 December, 2019, 05:03:57

Hey guys, happy holidays! So I'm using the HTC Vive Add-On and am currently struggling with blink detection. The actual detection works fine and picks up the big blinks. However, every once in a while it will record a blink that lasts multiple seconds despite the video showing no blinks that long. Is this a common issue? If so, what can I do to fix it?

user-175561 17 December, 2019, 05:04:34

Also, is it possible to work with the post-processing effects after the video of the eyes is recorded?

user-833165 17 December, 2019, 11:14:19

@fxl If you are currently evaluating the Oculus Quest you can count another user interested in it (me)

user-9a94be 17 December, 2019, 11:14:31

Hi! If I understood correctly, in order to use Pupil Capture on a computer different from the one running Unity, I should: 1. Connect the add-on to the computer where I will run Pupil Capture. 2. Use the Unity plugin to communicate with Pupil Capture through Pupil Remote. 3. I understand this is done here, in the Request Controller, by entering the same values I see in the Pupil Remote plugin inside Pupil Capture. Is this right? I tried but it is not working.

Again, thanks for your help.

Chat image

user-a1c7ea 17 December, 2019, 13:09:32

Hi! Could someone please tell me if the actual eye positions in the head calculated by the pupil model influence the accuracy of the 3d gaze point / eye normal vector estimation? Assume I have the actual eye centers at (26.0, 0.9, -7.7) for the right eye and (-31.7, -0.8, 1.5) for the left one, probably because the hmd is not perfectly aligned on the head. So my question is whether I should pay attention to these eye center coordinates to achieve better gaze estimates, or does it not matter at all because they are not crucial for gaze estimation? Should I adjust the hmd position until the eye centers are well aligned?

marc 17 December, 2019, 13:30:48

@user-a1c7ea The eye centers are measured by the system automatically to accommodate different positioning of the HMD in relation to the eyes. It is not necessary or beneficial to optimize the positioning of the HMD in order to get certain values for the eye centers. To optimize the performance you should rather optimize the HMD position on the head for best pupil detection performance. Accurate pupil detection yields accurate measurements of the eye center positions, which in turn yield accurate gaze estimates.

user-a1c7ea 17 December, 2019, 13:45:39

@marc thanks!

user-d77d0f 18 December, 2019, 10:22:41

Hi! I've started using the SubscriptionsController to listen to fixations in unity but unfortunately there never seem to be any fixations detected (whereas if I plug the recorded data in pupil player, the plug-in finds some fixations). Does anyone know what the issue might be here?

user-d782d9 23 December, 2019, 18:33:38

Hello, warm regards to all the esteemed team of developers. I have a question: can the current SDK be used on the web? For example with WebVR or WebXR using Three.js?

user-28db6e 30 December, 2019, 13:22:29

Hello! I am working with Pupil in VR using an HTC Vive, the binocular add-on, and Unity 19.2.1. I have noticed that one of the cameras (eye 0) is showing a grey screen, and the other (eye 1), even though it has a sharp, good-quality image of the eye, cannot detect the pupil most of the time. Is there a way to help the pupil detection? Is the grey screen a connection problem? I cannot find any cable issue.

user-c5fb8b 31 December, 2019, 08:07:14

Hi @user-28db6e, regarding the grey screen, please contact info@pupil-labs.com Can you share a screenshot of what the images of the other eye look like, so we can suggest ways to improve detection?

End of December archive