🥽 core-xr


user-3902f9 02 October, 2017, 05:41:41

just had the same experience [email removed] I can't get this working either

user-3902f9 02 October, 2017, 05:42:25

definitely don't have other services in the background

user-24ee96 02 October, 2017, 13:44:05

Hi everyone, I have the exact same problem as @user-13c487 , if anyone has a way to fix this offset, or has any idea where it might come from that would be a great help ! Thanks !

user-f1d099 02 October, 2017, 16:20:44

@user-13c487 I accomplished the raycast by using the camera's forward vector and rotating the vector by the angles between X and the vector and between Y and the vector. There may be a more elegant way to do this, but that's for later.

you can fix the offset by subtracting 0.5 from the XY values

user-f1d099 02 October, 2017, 16:21:58

btw, the estimated distance between the camera and the plane the XY are drawn on is 0.75 units
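
For later readers: a minimal sketch of this approach in Unity C#, assuming the gaze XY arrives normalized to [0,1] and that the gaze plane spans roughly one unit at the estimated 0.75-unit distance. The class and field names are hypothetical, not part of the plugin:

    using UnityEngine;

    public class GazeRaycaster : MonoBehaviour
    {
        public Camera hmdCamera;               // the HMD's scene camera
        const float gazePlaneDistance = 0.75f; // estimated camera-to-plane distance (see above)

        // gazeXY: normalized gaze coordinates in [0,1]
        public Ray GazeRay(Vector2 gazeXY)
        {
            // re-center so that (0.5, 0.5) maps to straight ahead
            Vector2 centered = gazeXY - new Vector2(0.5f, 0.5f);

            // point on the gaze plane in camera-local space, then into world space
            Vector3 localPoint = new Vector3(centered.x, centered.y, gazePlaneDistance);
            Vector3 worldPoint = hmdCamera.transform.TransformPoint(localPoint);

            return new Ray(hmdCamera.transform.position,
                           (worldPoint - hmdCamera.transform.position).normalized);
        }
    }

Passing the returned ray to Physics.Raycast then tells you which object the user is fixating.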

user-5874a4 02 October, 2017, 23:59:07

@user-5ca684 The Unity plugin won't accept any key input until Pupil Capture is loaded and two eye processes are started. Was this behavior intentional, to safeguard the user from recording/calibrating before the app connects with Pupil Capture?

user-5874a4 02 October, 2017, 23:59:43

Even then, I see that the behavior is not consistent. When the eye process windows pop up in front of the screen after hitting play in Unity, key inputs work only when I select the eye process windows and minimize them. Otherwise, no input is taken. Why would it work this way?

user-e1dd4e 03 October, 2017, 16:00:50

Hey everyone, I'm trying to get my Pupil eye trackers working with the Vive and I'm stuck. I'm on Windows 10, so I don't know if that will be a problem, but does anyone have a solid step-by-step guide for getting this running? I know there is the material on GitHub, but I think I'm just missing a few steps.

user-5ca684 03 October, 2017, 18:24:45

@user-5874a4 No, the input is not intentionally blocked; there must be something else going on there. Where do you have your key input written? Or do you mean the R and C keys for calibration and recording?

user-90aa66 03 October, 2017, 18:33:21

Hi all, I am using the DK2 with one eye tracker and Unity 2017.1 for a research study. I'm able to calibrate and record from my scene, and the tracking data looks pretty good in Pupil Player; however, it seems to be sped up. I'm having trouble synchronizing the exported video files to play in Pupil Player. For example: I just made a recording and it created an eye recording of length 1:08 at 90 fps, a Unity recording of length 2:44 at 30 fps, and a fake world video of length 1:08 at 30 fps.

user-90aa66 03 October, 2017, 18:34:00

When I change my Unity movie output to length 1:08 it still does not match up - does anyone know what might be going on here?

user-f1d099 03 October, 2017, 19:13:03

@user-90aa66 I'm having a similar issue, except I am using OBS Studio to record my Unity feed and replacing the world video file with the recording from OBS. I'm having limited success at the moment by having OBS match the average fps of the eye cameras.

If there is a more elegant solution I would love to hear it, as the current setup is less than ideal.

user-90aa66 03 October, 2017, 22:32:47

@user-f1d099 Yes, I would also love to hear that. Do you also have ~90 fps eye recordings? I set ffmpeg in Unity to export its video at 90 fps. This resulted in a 39-second eye video and a 31-second video from Unity. They still play at different speeds and can't be synchronized no matter how I line them up. Does your result look somewhat decent with OBS? I may try doing a screen grab of my Unity window like that and see if it's better.

user-5874a4 03 October, 2017, 22:55:18

@user-5ca684 Yes, I noticed the problem first with the R and C keys. Later, when I tried to use key inputs for my script, the same problem appeared. I guess for some reason key inputs are disabled until the eye processes are started.

user-570235 04 October, 2017, 13:28:53

Hey all. Is there any news on building the hmd-eyes project for the HoloLens?

user-54a6a8 06 October, 2017, 14:42:10

I have a question about the camera drivers for the Vive add-on. Should I ask here or in 👁 core?

user-54a6a8 06 October, 2017, 15:02:51

I asked in 👁 core. It seemed more appropriate there.

But my question for this channel: before I do it on my own from scratch, does anyone have a 'simple' Unity project that does HMD calibration without all the extra code and inspector fiddling that the current Unity assets have? We want to be able to compile our project rather than run from the editor, so interaction via the inspector is not an option.

mpk 06 October, 2017, 15:32:30

@user-54a6a8 please have a look at the dev branch of hmd-eyes.

user-54a6a8 06 October, 2017, 16:47:57

@mpk Thank you. I installed the NuGet package manager and installed NetMQ, but I'm getting errors from MessagePack ('TinyJsonReader' could not be found). The MessagePack that comes by default with my Unity-installed MSVC2017 doesn't seem to have some of the types that the project expects. I can't install MessagePack from NuGet because "Could not install package 'MessagePack 1.6.2'. You are trying to install this package into a project that targets '.NETFramework,Version=v3.5', but the package does not contain any assembly references or content files that are compatible with that framework. For more information, contact the package author." Is there a specific version of MessagePack that I should be using for this project?

user-54a6a8 06 October, 2017, 16:55:33

Never mind, I found the .unitypackage file that needs to be imported into the Unity project directly, and not through Visual Studio.

user-24270f 09 October, 2017, 00:43:09

@wrp "[email removed] we have a CV1 add-on in the pipeline. All parts are finalized, just waiting for final revisions before public release (estimate 2-3 weeks)." Any update? Will it be 1400 euro for binocular? Also, why does it cost so much more than 2x 450 euro individual cameras?

user-e04f56 09 October, 2017, 08:29:43

@user-570235 We are currently working on issues with the MessagePack implementation, which does not work with UWP. Once this is fixed, we can continue with the HoloLens build.

wrp 09 October, 2017, 08:31:32

@user-24270f We are a little bit delayed on the final public release of the CV1 add-on. We are finalizing the camera. This should happen very soon. I can't say anything more definitive at this time.

wrp 09 October, 2017, 08:32:09

The add-on will also have a cable clip and cabling. Pricing will be the same as the Vive add-on

user-24270f 09 October, 2017, 11:43:01

I think I'll go with one of the sub-$500 USD options for the Vive.

user-df67c1 10 October, 2017, 13:14:28

Does anybody know how to build and deploy the PupilHMDCalibration sample on HoloLens?

mpk 10 October, 2017, 13:29:45

@user-df67c1 @user-e04f56 is working on this and will repost ASAP.

user-df67c1 10 October, 2017, 14:06:23

@mpk I only want the pupil tracking and marker display functions of the sample, applied to HoloLens. Is getting a gaze direction difficult right now?

user-73268e 10 October, 2017, 15:00:34

When the "Recording" button is not pushed down, the calibration process goes well but when I pushed down the "Recording" button, error log which saying "FiniteStateMachineException: Req.XSend - cannot send another request" will be generated, is anybody has ever met this?

user-73268e 10 October, 2017, 15:01:38

Also, when I release the "Recording" button, it displays "Processing ..." and gets stuck.

user-73268e 10 October, 2017, 15:06:01

Chat image

user-73268e 10 October, 2017, 15:06:11

Chat image

mpk 10 October, 2017, 15:18:24

@user-5ca684 do you have an idea about the problem @user-73268e describes?

user-73268e 10 October, 2017, 15:53:56

The error is reported at GetPupilTimestamp().

user-73268e 10 October, 2017, 15:56:12

It seems that in ZMQ we have to receive the response before sending another request.
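
For context, a minimal NetMQ sketch of that strict request/reply pairing. The "t" (get timestamp) and "R" (start recording) Pupil Remote commands and the default port 50020 come up elsewhere in this channel; everything else is an assumption:

    using NetMQ;
    using NetMQ.Sockets;

    class PupilRemoteExample
    {
        static void Main()
        {
            // A ZMQ REQ socket enforces send -> receive -> send -> ...
            // Sending twice in a row throws FiniteStateMachineException.
            using (var req = new RequestSocket(">tcp://127.0.0.1:50020"))
            {
                req.SendFrame("t");                          // ask Pupil Remote for its timestamp
                string timestamp = req.ReceiveFrameString(); // must receive before the next send

                req.SendFrame("R");                          // only now is another request legal
                string reply = req.ReceiveFrameString();
            }
            NetMQConfig.Cleanup();
        }
    }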

user-90aa66 10 October, 2017, 16:24:59

Hello, I am having some trouble calibrating with a monocular eye camera in the DK2. I have a right eye camera, and I think the targets during calibration in Unity are being displayed with respect to the left eye, not the right eye. Does anyone know how I might fix this? Also, not sure if this is related, but the targets (RGB) in Unity are not moving after calibration, although Pupil Capture is still receiving the gaze data. Thanks!

user-54a6a8 10 October, 2017, 18:10:07

Sorry in advance for commenting on the dev branch dev_Refactor-and-demos folder. Should I post issues instead?
- It needs a proper .gitignore so I can easily see what code I changed to make things work.
- The lack of ffmpeg.exe led me to download and install the latest FFmpegOut package. The FFmpegOut API seems to have changed, and the latest version is incompatible with the project. I had to comment out some code related to the Recorder.
- After calibration, the eye-target gizmos are static. I had to add pupilTracker.StartVisualizingGaze(); to StartDemo() in PupilDemoManager.cs. Now the blue and green targets move with the eye tracker; the red one is still static.

user-54a6a8 10 October, 2017, 18:11:05

I have to run now. I'll work on this again tomorrow.

user-e04f56 11 October, 2017, 07:59:55

@user-90aa66 While working on the new version, I discovered that there was a mix-up of the eye ids in the previous version: "0" represents the right eye and "1" the left one. Going through the old code, I think the only occurrence of this error is in the OnGazePacket() function. This is the old code:

    if (eyeIDStr == "0") {
        leftEye.AddGaze (x, y, 0);
        if (OnEyeGaze != null) OnEyeGaze (this);
    } else if (eyeIDStr == "1") {
        rightEye.AddGaze (x, y, 1);
        if (OnEyeGaze != null) OnEyeGaze (this);
    }

If you change it to the following, it should fix the problem:

    if (eyeIDStr == "1") {
        leftEye.AddGaze (x, y, 1);
        if (OnEyeGaze != null) OnEyeGaze (this);
    } else if (eyeIDStr == "0") {
        rightEye.AddGaze (x, y, 0);
        if (OnEyeGaze != null) OnEyeGaze (this);
    }

user-e04f56 11 October, 2017, 08:19:55

@cboulay#7975 ffmpeg is currently not tested and will have to be looked at in the future. The issues with the calibration scene will be fixed by this week, though.

user-90aa66 11 October, 2017, 14:12:39

@user-e04f56 ok thank you! I'll give that a try.

user-54a6a8 11 October, 2017, 14:46:37

@user-e04f56, OK, thank you. Just to be clear, I wasn't actually trying to use the ffmpeg part; it's just that I couldn't run the 'main' script because it was attempting to initialize a Recorder instance, which had some ffmpeg-related errors. So when you're fixing up the project this week, you might want to comment out the Recorder functionality or update to the latest ffmpeg API so the code can actually run. As for the others, I think the provided examples have enough information for me to figure out what I need to do. Thanks!

user-54a6a8 11 October, 2017, 15:01:54

@user-90aa66 , for now you can add pupilTracker.StartVisualizingGaze(); to the end of StartDemo() in PupilDemoManager.cs to get the blue and green targets moving after calibration.

user-54a6a8 11 October, 2017, 17:07:33

To get the red target to move around too, in VisualizeGaze() in PupilGazeTracker.cs, change the two _markerGazeCenter lines to the following:

                _markerGazeCenter.position.x = PupilData._2D.GazePosition.x;
                _markerGazeCenter.position.y = PupilData._2D.GazePosition.y;
user-90aa66 11 October, 2017, 20:11:39

@user-54a6a8 Ah, thanks for that! Points me in the right direction, I think. I haven't been able to get the dev branch working properly, so I don't think I can do exactly that. I had the targets moving around when I was using an older version of Unity, but not with 2017.1. It would be nice to have it working to easily see how good the calibration is, but it's not essential for what I am doing yet.

user-90aa66 12 October, 2017, 20:00:19

So actually, I do need to see the moving targets to verify that my calibration is good. Does anyone know how to get them (or just the right target) to move after monocular calibration?

user-5d12b0 12 October, 2017, 20:01:38

Earlier today they pushed changes to the dev branch. I was able to get it working without any changes to the code. After binocular calibration, I was able to get feedback immediately. I haven't tried monocular calibration.

user-90aa66 12 October, 2017, 20:03:11

Did you have any problems getting the dev branch to work? I am getting: VR: OpenVR Error! OpenVR failed initialization with error code VRInitError_Init_PathRegistryNotFound: "Installation path could not be located (110)"!

user-90aa66 12 October, 2017, 20:04:14

I'm using the up-to-date branch and have VR enabled, and it shows OpenVR and Oculus listed there in Unity.

user-97591f 12 October, 2017, 20:32:51

When I had that problem, it was because SteamVR wasn't installed or wasn't running. On my laptop, I can reproduce a similar error when the NVIDIA PhysX settings don't force use of the GPU. Are you able to use your VR setup in any other Unity project? The dev branch works fine for me.

user-90aa66 12 October, 2017, 20:45:20

Yes both the dk2 and cv1 are working fine with everything else. My unity project that is using the master branch of the pupil plugin also works ok with the dk2 and eye tracker (but can't get the targets moving there)

user-90aa66 12 October, 2017, 20:45:40

I haven't used steamVR before

user-97591f 12 October, 2017, 21:06:09

Does your Windows user name contain special characters? The default directory sometimes may not be readable; it might be trying to read .../Users/your_name/AppData/Local/OpenVR... but it's not there. So you might want to confirm your project settings. I have OpenVR enabled, but I use SteamVR from the asset store. It's not necessary, but for the HTC Vive it gives that error if SteamVR isn't installed. Edit: SteamVR is necessary for my setup (maybe give that a try), but the SteamVR assets are optional.

user-90aa66 12 October, 2017, 21:49:24

That's a good thought. I'll see if there is some way to check where it is looking and confirm that it is installed there. No special characters in the path. I'll look into SteamVR; ideally I don't want to install more things, since I already need Oculus running, but if it will help it's worth a shot!

user-90aa66 12 October, 2017, 22:03:30

@.__.#6214 installing steamVR got me past that error - thanks! Not sure why this version needed that while others didn't.

user-e04f56 13 October, 2017, 07:20:12

@thepod#1280 We will look into why the dev-branch version seems not to work with OpenVR as opposed to SteamVR. But as you have a working version right now, I hope it is OK if we push that to next week.

user-489841 13 October, 2017, 11:03:23

Hey there, I am having some issues getting the Unity calibration example project working. I set up the PupilGaze object as instructed on the hmd-eyes GitHub page, but when I run it, I get only one Pupil Capture window and this error message: "Failed to connect with the Server. Trying again in 5 seconds." I also tried switching the path to pupil_capture.exe, and the same error occurs.

Chat image

user-489841 13 October, 2017, 11:04:28

Here's a screen grab with pupil_capture.exe as the path

Chat image

user-5d12b0 13 October, 2017, 14:20:17

@user-489841, if you're able, I recommend trying the dev branch. IMO, it is much more user-friendly.

user-54a6a8 13 October, 2017, 14:41:56

@user-90aa66, was the OpenVR error fatal? I.e., did it prevent you from running the market scene? You can try going to Edit > Project Settings > Player > Settings for PC... > Other Settings, then under 'Virtual Reality SDKs' change the order to put Oculus on top. I don't recommend doing this in general, because it will make it difficult for users who have the Oculus SDK installed to use SteamVR to run the compiled program if they choose to (e.g. they want to use other SteamVR features). If this project is not going to be distributed to anyone who might use OpenVR, then you can remove the OpenVR SDK from the list.

user-90aa66 13 October, 2017, 15:01:14

@user-e04f56 OK, great! The scenes seem to be working. I am getting a lot of flickering around the edges for some reason when I turn my head, but I will look into this. @user-54a6a8 It wasn't a fatal error - it let me start the scenes, but the calibration would not work: a target would appear and then it would hang. I'm going to try your fix to get the targets moving after calibration now.

user-5d12b0 13 October, 2017, 15:02:04

With the latest version of the dev branch (do you have the MarketScene?), I didn't have to do anything to get it working.

user-5d12b0 13 October, 2017, 15:02:18

My above fixes from a couple days ago weren't necessary.

user-90aa66 13 October, 2017, 15:02:19

@Mat#9275 I also had some trouble first starting out with those instructions. I found that if I started Pupil Capture before starting my Unity scene and set it to fake image capture as the source, it would work.

user-90aa66 13 October, 2017, 15:05:59

@user-5d12b0 Right now I can get the scene to run, but the pointer is not moving. I think I need to again apply the right eye/left eye swap that @user-e04f56 mentioned. The scene is also a bit laggy, with a lot of flashing at the edges of my display; not sure why.

user-97591f 13 October, 2017, 15:31:38

@user-489841 In the one window that opened, scroll to the bottom and make sure the port is 50020. If the port is occupied, point the directory to pupil_service.exe instead. When you click on eye0/eye1, do any additional windows open? If not, you may need to reinstall the drivers.

user-90aa66 13 October, 2017, 16:40:34

I think I may be having a similar problem as @user-489841 now. Using the dev branch, the plugin says it is connected, but the green eye lights do not light up. The window in Unity also says capture rate = 0, and left eye and right eye are both (0.0, 0.0).

user-90aa66 13 October, 2017, 16:41:43

The plugin and Pupil Capture are both set to port 50020, and I have tried pointing the plugin to both pupil_capture and pupil_service, and also tried having Pupil Capture open before starting and not open. Do you know what might be happening here?

user-5d12b0 13 October, 2017, 17:40:01

@user-90aa66, our workflow is to first start pupil_service, then close the eye camera window that opens (its title is 'pupil capture camera 0' or something like that), then load the Unity project, then click play. After some time, two pupil capture windows open. We make sure the user's eyes are in the middle of the frame (if alone, we use the SteamVR desktop viewer to see the windows from inside the HMD), then minimize the pupil capture windows, then click on the Unity project to give it focus, then press C to calibrate. When calibration is complete, the eye-laser shows you the gaze direction. At that point you can press G to toggle on the bullseye markers.

user-e04f56 13 October, 2017, 19:45:03

The workflow [email removed] is describing is actually the way we want to do it in the future: start pupil_service as a separate app, then launch Unity. But in the current version (both dev and main branch) the service should be started from a Unity script. Is this not working in multiple cases? Other than the path to pupil_service not being set correctly (from the Unity inspector or in the PupilSettings file in the Resources folder), I will have to do some tests to find out what could go wrong here.

user-90aa66 13 October, 2017, 19:48:14

@user-5d12b0 thanks! That is helpful to hear your workflow but that is not working for me using the dev or main branch and pupil capture or service

user-90aa66 13 October, 2017, 19:49:02

@user-e04f56 Nothing is working for me right now, but when the main branch was working, I always had to start pupil_capture first before Unity, and the Unity script would open the eye camera windows.

user-90aa66 13 October, 2017, 19:51:10

I think @user-f1d099 is also starting pupil_capture first before the Unity script

user-e04f56 13 October, 2017, 19:57:39

@user-90aa66 Would you maybe be open to a remote session so I can have a look at your setup? It might be the fastest way to get this solved.

user-90aa66 13 October, 2017, 20:00:53

@user-e04f56 that would be great! I will direct message you.

user-df67c1 14 October, 2017, 12:33:22

Hello, I'm so frustrated because our project is being delayed by a few errors that occur when I click File > Build Settings > Build:
- error CS0234: The type or namespace name 'Design' does not exist in the namespace 'System.ComponentModel' (are you missing an assembly reference?)
- error CS0246: The type or namespace name 'Process' could not be found (are you missing a using directive or an assembly reference?)
- Error building Player because scripts had compiler errors
- UnityEditor.BuildPlayerWindow+BuildMethodException: Build failed with errors. at UnityEditor.BuildPlayerWindow+DefaultBuildMethods.BuildPlayer (BuildPlayerOptions options) [0x001b9] in C:\buildslave\unity\build\Editor\Mono\BuildPlayerWindowBuildMethods.cs:162 at UnityEditor.BuildPlayerWindow.CallBuildMethods (Boolean askForBuildLocation, BuildOptions defaultBuildOptions) [0x00050] in C:\buildslave\unity\build\Editor\Mono\BuildPlayerWindowBuildMethods.cs:83 UnityEditor.HostView:OnGUI()

Although there are plenty of solutions for compiler errors like these in the Microsoft documentation, none of them changed our situation. I want to know whether I simply can't develop a pupil-tracking application with HoloLens.

user-e04f56 14 October, 2017, 18:48:55

[email removed], we are currently testing a HoloLens implementation and there are some hurdles with UWP builds. One is the one you mention, with "Process" not being available. "Process" is used to start/stop pupil_service, which, as I mentioned [email removed] yesterday, will not be necessary in the future. This still leaves UWP problems with MessagePack and NetMQ, the former of which we are currently working on. So we are making some progress. Hope that helps.

user-df67c1 15 October, 2017, 08:12:54

@user-e04f56 Then is there no Pupil Labs sample available for HoloLens?

user-e04f56 15 October, 2017, 17:14:42

There is currently no sample for HoloLens available, but we are actively working on it.

mpk 15 October, 2017, 17:38:49

Also note that you can of course use Pupil with the HoloLens via a separate computer that sends data to the HoloLens, or use it for analysis only.

user-df67c1 15 October, 2017, 23:17:33

Thanks for your detailed reply. I'm looking forward to it already.

user-d08045 16 October, 2017, 09:49:09

Hi all! Does anyone have a link to a bare-bones Unity example to calibrate and then just read 2D/3D gaze points? No recording etc, etc as in the example project that is available via gitHub.

user-5d12b0 16 October, 2017, 14:36:42

@user-d08045 , the dev_Refactor-and-demos project in the dev branch should get you there.

user-8779ef 17 October, 2017, 17:53:35

Hello! When running in Unity, I find that my console is flooded with errors. Most relate to OperatorMonitor.cs. Is this normal for the current build?

user-8779ef 17 October, 2017, 17:54:49

NullReferenceException: Object reference not set to an instance of an object OperatorMonitor.OnGUI () (at Assets/Scripts/OperatorMonitor.cs:100)

user-8779ef 17 October, 2017, 17:55:25

...it also seems to be trying to write a debug log to a directory that doesn't exist.

mpk 17 October, 2017, 17:58:08

Please have a look at the dev branch. We are about to merge that into master

user-8779ef 17 October, 2017, 17:58:52

Thanks, will do.

user-8779ef 17 October, 2017, 18:20:41

...and 3D calibration is nonfunctional in the dev branch?

mpk 17 October, 2017, 18:34:04

@user-8779ef not yet. will be added next week.

mpk 17 October, 2017, 18:34:20

we are refactoring to make things easier to read and maintain.

user-8779ef 17 October, 2017, 18:34:52

great!

user-8779ef 17 October, 2017, 18:35:18

I also recommend making the calibration grid's angular size an easily adjustable public variable.

user-8779ef 17 October, 2017, 18:36:49

I see the dev branch has the 2D angular extent set to 0.2-0.8 in normalized screen space, and that tends to fall almost off the HMD view. That means we're sacrificing accuracy in the center for questionable gains in the periphery, where folks rarely look and distortions are greatest.

user-8779ef 17 October, 2017, 18:37:57

Those distortions would really complicate the fit of any kind of parametric model, for example making it non-linear at the edges. Best to just avoid it, no? Again, folks rarely look out there. Just my two cents... 😃

mpk 17 October, 2017, 18:39:05

@user-8779ef I think we actually moved the points inwards in our latest change today. But I agree, we could move even further inward. An angular spread variable might be a nice idea.

mpk 17 October, 2017, 18:39:29

@user-e04f56 see above. Lets discuss this tomorrow.

user-8779ef 17 October, 2017, 19:34:35

Ok, I found the new calibration plugin inside the dev refactor subfolder and, yes, the calibration grid is smaller.

user-8779ef 17 October, 2017, 19:34:55

(it turns out that I was still using the older plugin, also buried in the dev. repo)

user-8779ef 17 October, 2017, 19:35:14

Now I'm having consistent issues if I try to run the calibration routine twice in a row.

user-8779ef 17 October, 2017, 19:35:17

MissingReferenceException: The object of type 'Text' has been destroyed but you are still trying to access it. Your script should either check if it is null or you should not destroy the object.
PupilDemoManager.OnCalibtaionStarted () (at Assets/Script/PupilDemoManager.cs:51)
PupilTools.StartCalibration () (at Assets/pupil_plugin/scene/Scripts/New/PupilTools.cs:203)
PupilGazeTracker.Update () (at Assets/pupil_plugin/scene/Scripts/PupilGazeTracker.cs:254)

user-8779ef 17 October, 2017, 19:37:25

I understand this is a dev build, so I'm only trying to help.

user-8779ef 17 October, 2017, 19:49:12

...but, I did get the market demo working. Great job, guys!

user-8779ef 17 October, 2017, 19:49:37

looks wonderful. Get rid of that sharp edge between the color / gray in the color demo : )

user-c68bba 18 October, 2017, 02:47:56

@wrp Hi, could you give me a "pyglui-1.8-cp36-cp36m-win_amd64.whl"?

Chat image

user-c68bba 18 October, 2017, 02:49:37

@wrp or "pyglui-1.7-cp36-cp36m-win_amd64.whl".

wrp 18 October, 2017, 02:49:59

Hi @user-c68bba there is a v1.8 release draft. We are awaiting some changes to be merged to fontstash so that we can release v1.8

wrp 18 October, 2017, 02:50:31

I can create a v1.7 wheel today

wrp 18 October, 2017, 02:50:37

and upload for you

user-c68bba 18 October, 2017, 02:51:13

@wrp thank you very much!

user-8779ef 18 October, 2017, 14:13:41

Ok, quick question for you: I'm running lab code and it's crashing when trying to access a data package returned from Pupil that seems to be lacking an expected key (it's in Python).

user-8779ef 18 October, 2017, 14:13:54

the key is data[]

user-8779ef 18 October, 2017, 14:14:01

oops. typing...

user-8779ef 18 October, 2017, 14:14:17

the key is data['base_data']['gaze_normals_3d']

user-8779ef 18 October, 2017, 14:15:05

Now, I'm using an HMD, and so a 3D gaze direction would be expected, but I don't see one in the data stream. Am I missing something?

user-8779ef 18 October, 2017, 14:17:11

...I see only 2D gaze coordinates stored in data['norm_pos']

user-8779ef 18 October, 2017, 14:32:45

It seems like my question is: Where are the docs on the python data structure?

wrp 18 October, 2017, 14:38:24

@user-8779ef please see the Pupil Datum Format section within this section in the docs: https://docs.pupil-labs.com/#process-structure

wrp 18 October, 2017, 14:38:42

Also see: https://docs.pupil-labs.com/#message-format

user-8779ef 18 October, 2017, 14:39:52

Thank you.

user-8779ef 18 October, 2017, 14:40:04

...however, I'm not sure that the doc answers my questions.

user-8779ef 18 October, 2017, 14:40:52

(I've been there). Is there a reason why gaze_normals_3D key is missing?

wrp 18 October, 2017, 14:41:18

Are you performing a 2d calibration?

user-8779ef 18 October, 2017, 14:41:37

I did perform a 2D calibration, yes. So, this shapes the output data. That makes sense.

user-8779ef 18 October, 2017, 14:41:56

...it would be more helpful if it were explicitly stated in the docs 😃

wrp 18 October, 2017, 14:42:01

For 3D data to be available, a 3D calibration mode must be performed.

user-8779ef 18 October, 2017, 14:42:11

Ok, but that's not currently working, right?

wrp 18 October, 2017, 14:42:30

It is currently being refactored, correct.

wrp 18 October, 2017, 14:42:54

@user-8779ef if there are areas in the docs that you would like to see improved, please make an issue: https://github.com/pupil-labs/pupil-docs/issues

wrp 18 October, 2017, 14:43:04

alternatively you can also make a PR 😄

user-8779ef 18 October, 2017, 14:43:08

WIll do, thanks for that link.

user-8779ef 18 October, 2017, 14:43:38

Ok - ETA on 3D calibration? I know you guys are close.

wrp 18 October, 2017, 14:44:37

Yes, we are close - @user-e04f56 how's it going with the refactor?

user-8779ef 18 October, 2017, 14:45:14

( Sorry, not trying to pester, just trying to decide the future of a student working with this for credit ).

user-8779ef 18 October, 2017, 14:50:20

The long-term project I would like the undergrad to perform involves testing tracker latency using fake eyes and a motor. Those are high hopes, but she's quite talented. Longer-term goals are to present stimuli in the scotomas (blind spots) of patients with damage to visual cortex. I'll keep you updated, but don't hold your breath!

user-5d12b0 18 October, 2017, 14:52:54

@user-8779ef , I'm curious how critical you think it is to get a 3D gaze vector? Our plan is to get the 2D points, get the vector from each eye to its 2D point, then raycast that until it hits something. I will reserve judgment on the 3D data until the new version drops, but the previous version gave us pretty terrible results while the 2D version seemed to work well.

mpk 18 October, 2017, 14:53:53

@user-5d12b0 The current dev branch demo scene does the intersection you are talking about. It works much better.

mpk 18 October, 2017, 14:54:11

true 3d mapping will come next. We are working on a much improved version.

user-8779ef 18 October, 2017, 14:54:17

Are you implying that you would like to detect the object of fixation via collision detection?

user-8779ef 18 October, 2017, 14:54:47

This is not a robust method given biological + tracker noise, especially for smaller objects at a distance, or in the periphery.

user-8779ef 18 October, 2017, 14:55:26

Many scientific endeavors will require algorithms based upon minimum angular distance.

user-5d12b0 18 October, 2017, 14:56:28

@mpk, yes but the origin isn't in the eye itself. The 'laser' is coming from the center of the HMD to a point that is the average of the 2 eyes, so if there is, e.g., an object sitting right in front of your nose, but the eyes are clearly looking past it to something in the distance, the current ray will collide with the object in front of the nose.

mpk 18 October, 2017, 14:56:53

@user-8779ef We are adding this next. There are a few ways of getting there: one is using the dual 2D mapper and intersecting the gaze rays (binocular vergence; this would happen in Unity3D).

mpk 18 October, 2017, 14:57:13

the other is using our 3d gaze mapper.

user-8779ef 18 October, 2017, 14:57:32

I'm comfortable using the cyclopean vector.

user-5d12b0 18 October, 2017, 14:57:36

@user-8779ef , our objects are not small and they are not far. It's just a 2-target task, each target being a few degrees from the midline, so we should be fine.

user-8779ef 18 October, 2017, 14:57:52

cboulay - yes, depends on the use case.

mpk 18 October, 2017, 14:57:54

@user-5d12b0 That is true; semi-transparency is not solvable without using vergence.

user-8779ef 18 October, 2017, 14:58:24

There are also many situations where a human won't look exactly at the target of interest. For example, if a human is tracking a moving object, they will often intentionally look ahead, in the direction of movement. I'm studying these predictive eye movements.

user-8779ef 18 October, 2017, 14:58:49

mpk: happy to use the cyclopean vector. In fact, until I test the quality of the vergence signal, I can't put faith in it.

mpk 18 October, 2017, 14:58:55

@user-8779ef I understand that in your use case intersection will not work.

user-8779ef 18 October, 2017, 14:59:28

@mpk Even biological noise would complicate the estimation of the vergence point beyond a few meters.

user-8779ef 18 October, 2017, 15:00:14

The eyes are almost parallel beyond ... what, 4/5 meters? (Not a number from literature, an estimation)

mpk 18 October, 2017, 15:02:15

@user-8779ef beyond 4 meters the vergence signal is drowned in noise.

mpk 18 October, 2017, 15:02:31

the problem lies with the small baseline of IPD

user-8779ef 18 October, 2017, 15:03:10

Yes, small amounts of noise in the angular signal will give huge deviations in the depth of the point of regard.

mpk 18 October, 2017, 15:03:19

correct.
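
For a rough sense of scale (a worked example assuming a 65 mm IPD, not a figure from this thread): the vergence angle at fixation distance d is θ = 2·atan(IPD / 2d). At d = 4 m this gives θ ≈ 0.93°, and at d = 5 m it gives θ ≈ 0.74°, so a full meter of added depth changes vergence by less than 0.2° - easily swamped by typical tracker noise.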

user-5d12b0 18 October, 2017, 15:03:34

@mpk, I'm looking forward to seeing how the 2D mapper and intersecting gaze rays work out. I imagine, to accommodate noise, the gaze rays have to be given some radius and then you look at the intersecting volumes? That method sounds computationally expensive, but I'm no expert.

mpk 18 October, 2017, 15:04:01

@user-5d12b0 @user-e04f56 is working on this soon.

user-8779ef 18 October, 2017, 15:04:03

...as I've said, I'm happy with the cyclopean. I need 3D, though. Can you tell me more about the "dual 2d mapper"?

user-8779ef 18 October, 2017, 15:04:15

A pointer to the docs would be fine.

mpk 18 October, 2017, 15:05:07

@user-8779ef Monocular mapping for each eye. In our new example we map onto the frustum image plane for each eye.

mpk 18 October, 2017, 15:05:40

These points can then be projected outward as rays originating from the respective frustum origin.

user-8779ef 18 October, 2017, 15:05:44

Ah, OK. So you're suggesting I calculate my own unit vectors using the 2D info plus the near plane depth? That's fine, but I need to make sure I have the correct parameters.

mpk 18 October, 2017, 15:06:09

Since we don't have parallax error in VR, this method is actually quite nice.

user-8779ef 18 October, 2017, 15:06:22

Yeah, that makes sense, and in fact, thats how I've done it in the past with other trackers.

user-8779ef 18 October, 2017, 15:06:46

So, I'm running the calibration in Unity before switching to another program for my VR presentation (an older program - Vizard). Still moving the lab to Unity 😃

user-8779ef 18 October, 2017, 15:07:14

Currently, I'm using the newest dev branch calib routine in Unity. I have dug around and found the normalized screen coords...

user-8779ef 18 October, 2017, 15:08:09

Well, basically: to go from normalized screen coordinates to Cartesian head-space, I need to make sure that I have the accurate angular frustum size and near plane position assumed during the Unity calibration. Is that right?

user-8779ef 18 October, 2017, 15:15:06

Sorry, I just need the FOV.

mpk 18 October, 2017, 15:16:46

@user-8779ef you will need that, or the depth of the plane. @user-e04f56, can you shed some light on this?

user-8779ef 18 October, 2017, 15:16:59

I will need the FOV, or the depth+size of the plane (in meters).
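
A sketch of that mapping under the simplest assumptions - a symmetric frustum, normalized coordinates in [0,1], and a vertical FOV in degrees; the class and function names are hypothetical:

    using UnityEngine;

    public static class GazeMapping
    {
        // Convert a normalized viewport coordinate to a gaze direction in head
        // space, assuming a symmetric frustum with the given vertical FOV/aspect.
        public static Vector3 NormalizedToDirection(Vector2 norm, float verticalFovDeg, float aspect)
        {
            // half-extents of the view plane at unit distance from the camera
            float halfHeight = Mathf.Tan(0.5f * verticalFovDeg * Mathf.Deg2Rad);
            float halfWidth = halfHeight * aspect;

            // re-center [0,1] -> [-1,1], then scale by the half-extents
            float x = (norm.x * 2f - 1f) * halfWidth;
            float y = (norm.y * 2f - 1f) * halfHeight;

            return new Vector3(x, y, 1f).normalized; // z is the camera's forward axis
        }
    }

As discussed below, per-eye HMD frustums are typically asymmetric, so the real mapping needs the per-eye frustum rather than one symmetric FOV.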

user-8779ef 18 October, 2017, 15:18:35

A side point - there's an issue to be aware of that we neuroscientists will care about, and you should too in certain cases.

mpk 18 October, 2017, 15:18:53

we are all ears 😃

user-8779ef 18 October, 2017, 15:18:54

The rays you're casting are from the pupil center, or from the camera center?

mpk 18 October, 2017, 15:19:04

in this case camera center.

mpk 18 October, 2017, 15:19:16

since the calibration is against the camera center.

user-8779ef 18 October, 2017, 15:19:45

Yeah, that makes sense. So, keep in mind that there are really two geometries here that one could use to cast rays.

user-5d12b0 18 October, 2017, 15:19:55

I think you're using Camera.ViewportPointToRay.

user-8779ef 18 October, 2017, 15:20:08

One is from camera center to near plane. The other is from eye center to physical screen coordinate.

user-e04f56 18 October, 2017, 15:20:09

we are @user-5d12b0

user-5d12b0 18 October, 2017, 15:20:09

hmm.. markdown doesn't work...

mpk 18 October, 2017, 15:20:22

In the 3D calibration (as opposed to the dual monocular calibration) we cast rays from the eye center through the visual axis of the eye (not the optical axis).

user-8779ef 18 October, 2017, 15:20:36

mpk: thanks for the input.

user-8779ef 18 October, 2017, 15:21:06

In many cases, the distance from the eye to the screen will not be equal to the distance from the camera to the near clipping plane, and this will cause compression/expansion of the angular data.

mpk 18 October, 2017, 15:21:20

@user-5d12b0 just paste the link, discord will do the rest.

user-8779ef 18 October, 2017, 15:21:58

...not sure if that makes sense. Without images, it's quite difficult to explain.

user-8779ef 18 October, 2017, 15:22:22

This is especially a problem if one is interested in gaze velocities or accelerations.

user-e04f56 18 October, 2017, 15:23:02

@user-8779ef are you familiar with the Camera.ViewportPointToRay function mentioned by @user-5d12b0

user-e04f56 18 October, 2017, 15:23:05

?

user-8779ef 18 October, 2017, 15:23:14

No, I'm not. Is this a unity function?

user-e04f56 18 October, 2017, 15:23:19

correct

user-e04f56 18 October, 2017, 15:23:45

and it should actually take care of the clipping planes

user-8779ef 18 October, 2017, 15:24:06

Max, I'm actually not using the data in unity.

user-8779ef 18 October, 2017, 15:24:24

I'm only using Unity for the PupilLabs calibration plugin, but I'm then shutting it down and collecting data in a 3rd party app. I guess the lesson here is that I will have to ask unity for its FOV.

user-5d12b0 18 October, 2017, 15:25:34

I was just looking through the Unity docs trying to get the HMD info (e.g. VRNode.LeftEye) from the native VR API but unfortunately the docs website is giving me 404's, even when I click on internal links.

user-8779ef 18 October, 2017, 15:26:01

Oh, crap - is the normalized coordinate for the left eye given specifically within the left eye frustum?

user-5d12b0 18 October, 2017, 15:26:51

I don't know. I think in Unity there is a single 'camera' for both eyes, so I think not... but I'm not sure.

user-8779ef 18 October, 2017, 15:26:54

(thanks for helping @user-5d12b0)

user-8779ef 18 October, 2017, 15:27:18

Yes, that would be more convenient - if left/right eye normalized coords were mapped onto a single cyclopean/main camera frustum

user-e04f56 18 October, 2017, 15:36:07

Unity also provides a function to calculate the frustum corners for each eye

user-5d12b0 18 October, 2017, 15:36:11

I know that, a while back, early VR implementations had two cameras, but I don't think that's the case anymore. I'm not really an authoritative source on this; I'd like to know for sure either way.

TBH I'm a bit confused. I have to look into the code to see how calibration actually works. The calibration target locations are in normalized screen coordinates, so the dual monocular calibration returns the normalized gaze point for each eye. Don't we then have to project from each eye to the point on the screen? If we project from the single camera to the point on the screen, won't we get a vector that's pointing in a completely different direction?

user-e04f56 18 October, 2017, 15:36:17

from there you can calculate the fov

user-8779ef 18 October, 2017, 15:36:50

@user-e04f56 MaxSHining - The inspector for the pupil demo manager camera shows an FOV value. Is that reliable?

user-e04f56 18 October, 2017, 15:37:03

we needed to do this in our sample project when working with shaders

user-8779ef 18 October, 2017, 15:37:21

Sorry - I'm such a discord newb.

user-e04f56 18 October, 2017, 15:38:01

@user-8779ef well, that FOV is the one for the camera, but for each eye, they will be different

user-e04f56 18 October, 2017, 15:39:02

maybe have a look at this: https://docs.unity3d.com/ScriptReference/Camera.CalculateFrustumCorners.html
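
A sketch of deriving a per-eye vertical FOV from those corners; taking the min/max avoids assuming a corner ordering, and the asymmetric-frustum handling is an assumption on top of the linked API:

    using UnityEngine;

    public class EyeFov : MonoBehaviour
    {
        public Camera sceneCamera;

        void Start()
        {
            // corners of the left eye's frustum at distance 1, in camera space
            var corners = new Vector3[4];
            sceneCamera.CalculateFrustumCorners(new Rect(0f, 0f, 1f, 1f), 1f,
                Camera.MonoOrStereoscopicEye.Left, corners);

            // HMD eye frustums are usually asymmetric, so treat the two
            // vertical half-angles separately instead of assuming top == -bottom
            float top = Mathf.Max(corners[0].y, corners[1].y, corners[2].y, corners[3].y);
            float bottom = Mathf.Min(corners[0].y, corners[1].y, corners[2].y, corners[3].y);
            float fovY = (Mathf.Atan(top) + Mathf.Atan(-bottom)) * Mathf.Rad2Deg;

            Debug.Log("left-eye vertical FOV: " + fovY);
        }
    }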

user-8779ef 18 October, 2017, 15:39:25

Ok, so you're saying that the math will require gazeSampleLefEye[norm_pos] and the FOV for the left eye specifically?

user-8779ef 18 October, 2017, 15:40:25

The coordinates returned in gazeSampleLefEye[norm_pos] are normalized within the near clipping plane of the left eye's frustum?

user-8779ef 18 October, 2017, 15:40:39

...and not a main camera frustum?

user-5d12b0 18 October, 2017, 15:42:00

@user-e04f56, do you happen to know where the camera origin is in Unity in relation to the HMD? I know in Unreal, about 2-3 years ago at least, the camera rig origin was at the base of the neck, because that is the point of rotation/translation of the head; offsets were then added for each eye.

user-5d12b0 18 October, 2017, 15:43:04

I'm wondering if this is why the eye-laser looks like it's coming from my chest to the target, instead of coming from in between my eyes.

user-8779ef 18 October, 2017, 15:43:23

We jokingly refer to it as the crotch laser.

user-8779ef 18 October, 2017, 15:46:45

Ok, so the question remains: within which clipping plane are the normalized left and right eye coordinates expressed? A shared, cyclopean plane, or the left/right eye planes? The answer to this question should also tell me which camera origin to use: either the cyclopean, or the right/left camera origins.

user-e04f56 18 October, 2017, 15:58:29

The laser's origin was actually manually set below the camera by me

user-e04f56 18 October, 2017, 15:59:21

Sorry for that, but it is easier to see when coming from below, compared to originating from the camera's origin.

user-8779ef 18 October, 2017, 15:59:36

Yes, if it came out of the eye, it would be a point.

user-8779ef 18 October, 2017, 15:59:56

No need to apologize - I prefer the crotch laser.

user-e04f56 18 October, 2017, 16:00:37

    sceneCamera.transform.position - sceneCamera.transform.up

user-e04f56 18 October, 2017, 16:01:48

@user-5d12b0 OpenVR provides a way to access the actual camera and eye positions

user-e04f56 18 October, 2017, 16:04:55

The Unity reference site seems to have problems right now, but I will try to look it up later.

user-e04f56 18 October, 2017, 16:07:00

@user-8779ef from my understanding, it is the shared plane

user-e04f56 18 October, 2017, 16:07:18

Will have to pick this up again tomorrow, though. Sorry.

user-8779ef 18 October, 2017, 16:11:09

That's Ok, and I very much appreciate your looking into it.

user-8779ef 18 October, 2017, 16:11:47

I'll log in again tomorrow and will see if you're around.

user-8779ef 18 October, 2017, 16:17:59

@user-e04f56 Also, it sounds like you're contributing to hmd-eyes and the 3D calibration. Great work on all this, much appreciated. Looking forward to the refactor! Any idea when it will be merged and 3D calibration available?

user-90aa66 18 October, 2017, 20:16:17

Has anyone found a good way to import/export the pupil gaze plugin? I've been trying to get this to work for a few hours but am a bit stuck. I've tried exporting it and importing it into a different project. I've also tried exporting my old project scene into the dev branch project but things keep breaking. Just curious if anyone has found an easy way to do this.

user-90aa66 18 October, 2017, 20:17:11

I was able to export the plugin using the main branch previously but haven't figured this out with the newer version I am now using.

user-c68bba 19 October, 2017, 02:05:46

@wrp Hi, "pyglui-1.8-cp36-cp36m-win_amd64.whl" or "pyglui-1.7-cp36-cp36m-win_amd64.whl" has updated. Do you need my Email?

wrp 19 October, 2017, 02:06:14

@user-c68bba apologies, I ran out of time yesterday

wrp 19 October, 2017, 02:06:28

I can do this within the next hour or so

user-c68bba 19 October, 2017, 02:07:39

It's OK, thank you very much!😀

wrp 19 October, 2017, 02:16:53

@user-c68bba please try https://github.com/pupil-labs/pyglui/releases/tags/v1.7

wrp 19 October, 2017, 02:17:05

I just uploaded a wheel for windows

user-c68bba 19 October, 2017, 02:22:33

Thank you, I have found it.

wrp 19 October, 2017, 02:24:56

You're welcome @user-c68bba apologies for the delay. Let me know if it works for you

user-c68bba 19 October, 2017, 03:05:20

Yes, I have tried it. It ran successfully!

wrp 19 October, 2017, 03:33:05

@Max Yao#0687 great!

user-e04f56 19 October, 2017, 07:16:08

@user-5d12b0 As promised yesterday, here is the method you might be looking for: UnityEngine.VR.InputTracking.GetLocalPosition( VRNode node )

user-e04f56 19 October, 2017, 07:17:38

The Unity reference page is still down, so here are some possible VRNodes: LeftEye, RightEye, CenterEye, Head. I think CenterEye is what you are looking for.
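
For reference, a minimal sketch of querying those nodes with the Unity 5.x/2017 VR API mentioned above (the class name is hypothetical):

    using UnityEngine;
    using UnityEngine.VR;

    public class EyeAnchors : MonoBehaviour
    {
        void Update()
        {
            // local positions of the eye/head anchors, relative to the tracking origin
            Vector3 centerEye = InputTracking.GetLocalPosition(VRNode.CenterEye);
            Vector3 leftEye = InputTracking.GetLocalPosition(VRNode.LeftEye);
            Vector3 rightEye = InputTracking.GetLocalPosition(VRNode.RightEye);

            Debug.Log("center: " + centerEye + " left: " + leftEye + " right: " + rightEye);
        }
    }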

user-e04f56 19 October, 2017, 07:50:03

@user-90aa66 We just added a .unitypackage file to the dev branch with which you can import all currently needed files into a new project. We will try to update the package as often as possible.

user-e04f56 19 October, 2017, 07:51:39

In general, if anyone wants to generate a custom Unity package based on your source code changes, right-click the "pupil_plugin" folder in the Unity project tab and select "Export package..". In the new dialog, de-select "Include dependencies" and click export. Unity packages can be imported via "Assets/Import Package/Custom Package..". If you start a new project, do not forget to select the VR option in the Player settings.

user-e04f56 19 October, 2017, 08:01:33

@performlabrit#1552 now to your question: Unity uses one camera to render both the left and the right eye. Depending on which eye is currently rendered, the camera frustum changes. I think this image from the link I posted yesterday captures this quite well:

user-e04f56 19 October, 2017, 08:03:46

The left lines represent the left eye frustum, the red lines the right eye one. Blue represents the classic frustum for the mono case

Chat image

user-e04f56 19 October, 2017, 08:11:21

In the case of the currently implemented 2D calibration, we get viewport coordinates (range 0 to 1) for each eye. Much as it does for e.g. mouse pointer coordinates, Unity can transform these from 2D to 3D based on the relevant camera. We use two methods to do so: 1. Camera.ViewportPointToRay(Vector3 viewportPosition) for the laser pointer

user-e04f56 19 October, 2017, 08:12:44
  2. Camera.ViewportToWorldPoint(Vector3 viewportPosition) for the three-colored gaze visualization
user-e04f56 19 October, 2017, 08:19:59

One of the differences between the two is how the z coordinate is interpreted (for details, see the Unity documentation): while the former ignores/does not require the z component, the latter takes it as the distance from the camera in its forward direction. In our standard case z equals 2, the distance at which we also render UI elements.
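
Putting the two calls side by side, a minimal sketch; the camera field and class name are assumptions, while PupilData._2D.GazePosition is the accessor used elsewhere in this thread:

    using UnityEngine;

    public class GazeVisualizerSketch : MonoBehaviour
    {
        public Camera sceneCamera;

        void Update()
        {
            // normalized viewport coordinates (0..1) from the plugin
            Vector2 gaze = PupilData._2D.GazePosition;

            // 1. ViewportPointToRay: z is ignored; yields a ray for the laser pointer
            Ray gazeRay = sceneCamera.ViewportPointToRay(new Vector3(gaze.x, gaze.y, 0f));

            // 2. ViewportToWorldPoint: z is the distance from the camera along its
            //    forward direction; 2 units is where the UI elements are rendered
            Vector3 gazePoint = sceneCamera.ViewportToWorldPoint(new Vector3(gaze.x, gaze.y, 2f));

            Debug.DrawRay(gazeRay.origin, gazeRay.direction * 10f, Color.green);
            Debug.Log("gaze point: " + gazePoint);
        }
    }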

user-8779ef 19 October, 2017, 18:58:42

@user-e04f56 You are a king. Thanks for the info!

user-90aa66 20 October, 2017, 13:13:04

@user-e04f56 I haven't been able to export or import the plugin using those methods - not with the version you helped me with, nor yesterday's, nor the one from 5 hours ago. I'm now trying to just create a new scene in the current dev branch project with a simple 360 image skybox to test things.

user-90aa66 20 October, 2017, 13:13:12

Using the current version, is there still a way to use the PupilGazeTracker GUI to calibrate, record, etc.? Even in the calibrate or market scenes, when I press PupilGazeTracker everything freezes and I get errors like this:

user-90aa66 20 October, 2017, 13:13:17

NullReferenceException: Object reference not set to an instance of an object CustomPupilGazeTrackerInspector.DrawMainMenu () (at Assets/pupil_plugin/Editor/CustomPupilGazeTrackerInspector.cs:311)

user-90aa66 20 October, 2017, 13:13:55

When I press record, it can't find ffmpeg.exe, even though it is there under StreamingAssets.

user-e04f56 20 October, 2017, 20:28:43

[email removed] I was not able to finish the marker update I started today, which only became obvious after I had committed the changes. I recommend working with the previous version until Monday. Sorry for that.

user-e04f56 20 October, 2017, 20:33:50

As for the import/export, I tried the .unitypackage file I included on GitHub in a new project and it worked fine. Hope you got your 360 test scene to work.

user-4d7f39 21 October, 2017, 20:44:54

I'm trying to connect Pupil Service to Unity; it requires an .exe, but I'm running on a Mac. Any solution?

user-d08045 23 October, 2017, 10:02:16

@cboulay#4121 Sorry for being a noob, but where do I find the "dev_Refactor-and-demos" project in the dev branch?

user-90aa66 23 October, 2017, 13:14:23

@user-d08045 https://github.com/pupil-labs/hmd-eyes/tree/dev

user-879f69 23 October, 2017, 14:25:34

hey guys, I'm getting an error I was wondering if you could help me with.

user-879f69 23 October, 2017, 14:30:56

When I launch Unity with PupilGaze in my project, the log says "Connecting to the server: >tcp//127.0.0.1:50020. UnityEngine.MonoBehaviour:print(Object)", which runs for a while and then times out with "ContextTerminate. UnityEngine.MonoBehaviour:print(Object)".

user-90aa66 23 October, 2017, 15:31:08

@user-e04f56 Ah, ok! Do you know if that will allow the PupilGazeTracker GUI to run? I still have the problem of the clean market scene or calibration scene not working whenever I click on PupilGazeTracker to open the GUI for calibration/recording.

user-90aa66 23 October, 2017, 15:38:49

@clemsonvr#7163 Have you tried opening Pupil Capture before starting Unity and making sure the port is correct there? It's supposed to work without doing that (letting Unity launch pupil_service), but that is how I was able to get it to connect on my computers.

user-a36fbf 23 October, 2017, 21:39:06

Hey guys, a few questions. When I run the calibration, eye 0 opens up two windows. Also, I'm having a hard time with the calibration. It looks like the first calibration point shows up and gets stuck...

user-e04f56 24 October, 2017, 07:48:48

@clemsonvr#7163 I think @user-90aa66's suggestion is the best solution, as it is the way we want to do it in the future. If you still want Unity to start the service, did you check that the path to pupil_service.exe is set correctly in the settings? We recently updated the standard path, so maybe something got mixed up there. If there is still a problem, please select the error message in the Unity console and post where it originates from. There is usually a call stack displayed for each message.

user-e04f56 24 October, 2017, 07:58:39

@user-a36fbf In the spirit of @user-90aa66's question: is Unity starting pupil_service.exe, or are you starting it manually through Pupil Capture? If it is the former and you run a scene multiple times, this could lead to the behavior you describe. While Unity can close the tracking windows, the background processes unfortunately keep running. Please look for these in the Task Manager after you exit a scene. You will have to end them manually, but I would recommend starting Pupil Capture separately, so that it takes care of the eye tracking processes.

user-a36fbf 24 October, 2017, 10:40:28

Unity is starting pupil_service.exe

user-72b0ef 25 October, 2017, 12:44:24

Hey guys, I have an issue where the user cannot see the lower part of the screen very sharply, since the headset sits too high on their head. However, when we ask users to put the headset on the way they want it, their eyes end up outside the camera's reach. Can we somehow "bend" the camera positions in the headset so that it is easier to detect the eyes?

mpk 25 October, 2017, 13:25:50

@user-72b0ef Use the depth adjuster of the Vive; it will move the eyes into the cameras' FOV.

user-5874a4 26 October, 2017, 04:38:51

Does anyone know the details of how the 3D eye model used in the 3D detector is constructed? I am having difficulty understanding the parameters 'circle_3d', 'sphere', and 'projected_sphere'. Is there a paper describing this model?

user-8779ef 26 October, 2017, 16:23:17

@user-72b0ef @mpk Adjusting eye position in the HMD to get a good image in the eye camera may mean sacrificing a crisp view of the virtual world. This is not ideal. You guys have got to work on putting the camera behind the optics (yes, I know that's a lot to ask! ).

user-8779ef 26 October, 2017, 16:28:49

@wrp Any idea when 3D calibration will be working again?

user-8779ef 26 October, 2017, 16:34:21

@user-879f69 sounds like you didn't set your port correctly. The port specified in unity has to match the port at the very bottom of the config settings in the world camera view of capture.

user-e04f56 26 October, 2017, 16:55:20

@user-8779ef we have been working on it (3d calibration) all week. can you maybe describe your use case so that we can test for it?

user-8779ef 26 October, 2017, 18:30:27

@user-e04f56 I simply want the gaze vector rather than the normalized position. I'm about to take a crack at mapping the 2D normalized coords to 3D space, though.

user-8779ef 26 October, 2017, 18:31:58

@user-e04f56 Use case is a study involving catching virtual balls - I want the angular distance from the gaze vector to the ball.

user-c68bba 27 October, 2017, 02:02:55

@wrp I have a question about 3D calibration of HMD. Do you have any paper about 3D calibration?

user-e04f56 27 October, 2017, 07:40:56

@performlabrit#1552 The calibration marker, as well as the 3-colored gaze visualization markers (left & right eye plus center) in the dev branch version, are positioned in 3D space. While PupilData._2D.GetEyeGaze (GazeSource s) provides the 2D view space coordinates sent by Pupil Service, PupilMarker.UpdatePosition(Vector2 newPosition) translates them to 3D. Maybe this can help you in your mapping endeavor.

user-275258 27 October, 2017, 09:15:04

@user-c68bba 3D calibration works pretty much out of the box, right? I have been using it with Unity and the Unity package available on the Pupil hmd-eyes GitHub, and it seems to work fine for me the way it is...

user-275258 27 October, 2017, 09:16:01

@user-8779ef I think it will indeed be a lot of work, but since our product owner is doing research with the system as currently presented to us, we cannot just change the whole setup, since that would mess with the research results.

user-275258 27 October, 2017, 09:16:29

Btw, this is 2SQRS saying this 😛

user-8779ef 27 October, 2017, 14:55:58

@user-e04f56 Thanks for the suggestion. I'm trying to use the data in 3rd-party software (Vizard), so a simple command-line printout of 2D-to-3D values will be helpful in validating my assumptions. I assume that the eye bases are +/- IOD/2 from the main/cyclopean camera along a particular axis in head-space. Let me know if you have any information to add there - I can't find much documentation.

user-8779ef 27 October, 2017, 15:31:34

@BL1TZ#0007 Yes, it seems a priority to get 3D calibration working first. I do hope the team strongly considers changing the camera placement in preparation for integration into the next generation of helmets.

user-5874a4 29 October, 2017, 06:36:46

@user-5ca684 When Pupil Player loads the recording directory, it uses the world.mp4 video for overlaying data visuals, whereas I need it to use Unity_Camera (eye).mp4 instead, which contains the user's view in hmd-eyes. Here, world.mp4 is just a fake source video. How do I make Pupil Player use the appropriate video in a clean way?

End of October archive