vr-ar


user-5874a4 05 July, 2017, 21:00:54

I have a problem during calibration. Some of the 9 markers that appear during calibration are outside my field of view in the HTC Vive. Is this a problem that can be corrected in Unity, or do I need to change my Vive settings? I could see that those offset markers are attached to my head; they stay outside my FOV even if I move my head. I guess this results in a less accurate calibration. Also, I've seen that the depth is not tracked properly. Have any of you designed an experiment to check for accuracy? I want to be sure that I have the best accuracy possible with the HTC Vive + Pupil Labs combo.

mpk 06 July, 2017, 06:29:36

@muzammil#8393 this can be changed in the plugin code. @user-5ca684 can you point to the relevant lines?

user-a2dcc3 07 July, 2017, 03:15:00

Hello, I just got my Vive Pupil-labs tracker up and running. I've jumped into the Unity application and have been able to run a calibration (I had to do 2D, or else some targets were out of the FOV, as mentioned by muzammil). I plan to use the eye tracker to collect data while running a different application (a VR environment powered by Vizard, a different engine that interfaces with the Vive similarly to Unity). My use case would be to use the Unity app to perform the calibration, close the app, and then run my own Python script (through Vizard) that would start the binocular data recording. This is possible, right? I plan to take the StopProcess() call out of the PupilGazeTracker.cs script (so the eye processes don't stop and lose the calibration) and then consult the Python docs for telling the capture program to start recording (these are just network messages, right?). Is this the correct approach?
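
(For reference: the "network messages" mentioned above are plain ZeroMQ requests to Pupil Remote. Below is a minimal sketch in C#/NetMQ, assuming Pupil Capture or Service is running with Pupil Remote on its default port 50020; the single-character "R"/"r" commands are the standard Pupil Remote start/stop-recording requests, and the same exchange can be sent from any ZeroMQ binding, e.g. a Python script inside Vizard.)

using NetMQ;
using NetMQ.Sockets;

public static class PupilRemoteRecording
{
    // Start a recording by sending the Pupil Remote "R" command.
    public static void StartRecording(string host, string port)
    {
        using (var req = new RequestSocket())
        {
            req.Connect("tcp://" + host + ":" + port);
            req.SendFrame("R");              // "R" = start recording
            req.ReceiveFrameString();        // Pupil Remote always answers the request
        }
    }

    // Stop the recording with the lowercase "r" command.
    public static void StopRecording(string host, string port)
    {
        using (var req = new RequestSocket())
        {
            req.Connect("tcp://" + host + ":" + port);
            req.SendFrame("r");
            req.ReceiveFrameString();
        }
    }
}

Example usage: PupilRemoteRecording.StartRecording("127.0.0.1", "50020");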

user-5ca684 07 July, 2017, 10:35:56

@user-5874a4 the calibration points are set in a way that you should be able to see all of them! Your true field of view depends on how far your eyes are from the lenses. The markers have to be attached to your head, otherwise the calibration would not make sense. My personal experience is that with the Vive + Pupil combo the distance between your eyes and the lenses, and thus the pupil cameras, is crucial. For example, if I set it to the closest position, the pupil cameras do not pick up the pupils with good confidence. Every head and eye is somewhat different, so probably the best thing you can do is experiment with this. The best-fitting setup for my head is two clicks from the closest position. You should not move your head in relation to the headset during calibration if you want an accurate result. What do you mean by "depth is not tracked properly"? Could you please elaborate?

user-5ca684 07 July, 2017, 10:37:16

@user-5874a4 if you are still having problems with the calibration points, or you think the best lens-eye distance in your case does not allow you to see all calibration points, you can adjust the hardcoded calibration point positions

user-5ca684 07 July, 2017, 10:42:35

@user-a2dcc3 perhaps a better approach would be to send the network messages from your own program to start the eye processes

user-5ca684 07 July, 2017, 10:44:12

@user-a2dcc3 I believe once calibrated, you can just call (in the case of our Unity plugin):
_sendRequestMessage (new Dictionary<string,object> {{"subject","eye_process.should_start.0"},{"eye_id",0}});
_sendRequestMessage (new Dictionary<string,object> {{"subject","eye_process.should_start.1"},{"eye_id",1}});

user-5ca684 07 July, 2017, 10:48:06

@user-1486c3 sorry about the delay! Here is an example of how you can use the task scheduler: ScheduleTask (new Task (delegate { Debug.Log("Hello World from the Main Thread!"); }));

user-a2dcc3 07 July, 2017, 17:09:00

@user-5ca684 So that means the calibration would stick after closing the Unity app right? Then I would just kick off those processes and listen for events. If so that's great. Going to try that out later today. How would you clear calibration data then? Just run another calibration?

user-a2dcc3 07 July, 2017, 17:09:47

@user-5ca684 Was able to fix my problem with seeing the calibration points by adjusting the lens, thanks for the advice.

user-5ca684 07 July, 2017, 17:47:05

@user-a2dcc3 great to hear!

user-5874a4 09 July, 2017, 02:48:35

@user-5ca684 Thanks for the advice. I'll also try by adjusting the lens in vive. Should work just as it had worked for @user-a2dcc3

user-5874a4 09 July, 2017, 02:57:03

@user-5ca684 By depth, I meant the calibration in 3D space. For some reason, after calibration, I felt it was not accurately showing the near/far fixation points. The tracking results were very wobbly, whereas in 2D mode the results seem to be almost perfect. Has anyone else had the same issue? Or does Pupil inherently have these issues in 3D mode?

user-a2dcc3 09 July, 2017, 15:02:05

@user-5874a4 when you say wobbly do you mean the red gaze target that represents the 3D gaze point after calibration keeps moving between far and near, or actually shows a lot of error/noise in x,y location? I haven't looked into the Unity script but I assume the main difference between the modes is the visualization of gaze afterwards. I'm guessing 2D mode is drawing targets based on the screen space position, and thus isn't trying to portray the depth you are looking at. Whereas after a 3D calibration the target is drawn at the inferred depth (intersection of gaze normals). With such a visually sparse and flat environment (only feature is the current gaze point) it would be hard to produce consistent depth values without something in the 3D scene to fixate on. I would have to check but I assume that the calibration methods are independent of the current tracking model (3d/2d, set in the capture program), so doing a 2D calibration but visualizing the current gaze point in 3D would have the same behavior.
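
(To make the "intersection of gaze normals" idea above concrete: two gaze rays almost never intersect exactly, so a common choice is the midpoint of the shortest segment between them. Small noise in near-parallel gaze normals moves that midpoint a lot along depth, which is one plausible reason the depth component looks much noisier than x/y. A minimal sketch of that computation, under those assumptions, not the plugin's own code:)

using UnityEngine;

public static class GazeIntersection
{
    // Midpoint of the shortest segment between two rays.
    // p0/p1 are the eye centers, d0/d1 the gaze directions (assumed normalized).
    public static Vector3 MidpointOfClosestApproach(Vector3 p0, Vector3 d0, Vector3 p1, Vector3 d1)
    {
        Vector3 w = p0 - p1;
        float a = Vector3.Dot(d0, d0);
        float b = Vector3.Dot(d0, d1);
        float c = Vector3.Dot(d1, d1);
        float d = Vector3.Dot(d0, w);
        float e = Vector3.Dot(d1, w);
        float denom = a * c - b * b;          // ~0 when the rays are nearly parallel
        if (Mathf.Abs(denom) < 1e-6f)
            return p0 + d0;                   // fall back to a fixed depth along one ray

        float s = (b * e - c * d) / denom;    // parameter along ray 0
        float t = (a * e - b * d) / denom;    // parameter along ray 1
        return 0.5f * ((p0 + s * d0) + (p1 + t * d1));
    }
}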

user-5874a4 10 July, 2017, 01:01:45

@user-a2dcc3 the x, y coordinates are reliable and accurate; it's the depth that keeps wobbling

user-5874a4 10 July, 2017, 01:03:08

@user-a2dcc3 Although this might not be the best idea, I was trying to test the depth accuracy using the square tiles of the virtual walls (from the HTC Vive room setup) as a reference. So, when I am inside the wall, I was hoping there would be some difference in the size of the red gaze point while moving my fixation across the walls. I didn't get the desired results: the size of the gaze point changed completely obliviously to the point of focus (near/far), and was only sometimes reliable, and then only when fixating on a point for a long time. I was thinking that since the walls are 3D objects too, it should work. Maybe I am wrong. Let me know if I need to try a different approach.

user-1486c3 10 July, 2017, 10:45:09

@user-5ca684 thanks. So far I have used the data_lock and did my calculations on the main thread once the update method occurred. Would the task scheduler have a significant advantage over using the lock? Another thing, about the circle3D normal: could someone please tell me in what coordinates it is given? Does the Z axis represent the forward axis in VR? Does it represent the normal to the camera? Is the formula I have written for converting the circle3D normal to VR world coordinates the right one? I have asked this a few times but I didn't see an answer... @mpk @user-5ca684

user-5ca684 10 July, 2017, 10:49:43

@user-1486c3 I'm sorry if it wasn't clear; you can use the debug view to see the visualized data, by the way. About the task scheduler: I believe it should have the same efficiency, because they both sync up after their update frame. There is a version I'm working on, which will be published soon if everything goes well, that runs on a single thread using NetMQ's poller.

user-5ca684 10 July, 2017, 10:51:57

@user-1486c3 I believe the Circle3D normal is given in the eye camera's coordinate system

user-5ca684 10 July, 2017, 10:55:47

@user-1486c3 I believe that Unity's forward axis is the same as Pupil's

user-ef7690 10 July, 2017, 11:12:12

does hmd-eyes only work in editor-mode?

user-ef7690 10 July, 2017, 11:21:21

there are a load of UnityEditor references in the runtime code

user-ef7690 10 July, 2017, 11:28:42

If I build the application from Unity with hmd-eyes integrated, the application crashes straight away; without hmd-eyes it works fine

user-a2dcc3 10 July, 2017, 19:34:48

@user-5874a4 hmm, it seems like the walls should stabilize it somewhat. I haven't really developed with Pupil or looked into the source enough to see how the model projects depth, though it's likely from the intersection of the gaze normals. I guess you could test this intersection to see how the depth varies. One thing my lab is planning to do is create a validation/accuracy measure for the HMD post-calibration. Essentially, use a 3D grid/cube of points at varying depths (you can perform calibration on the middle depth slice) and then display each of these points while recording the reported 3D gaze point. Compare this 3D point to the known one and then you can visualize/measure the error distribution. It might start to fall off very quickly as you move away from the depth you calibrated at.
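
(One way to lay out such a grid/cube of validation targets in Unity is sketched below. The angles, depths and layout are illustrative assumptions only; how you display the targets and read back the reported gaze is up to your own setup.)

using System.Collections.Generic;
using UnityEngine;

public static class ValidationGrid
{
    // Builds a columns x rows grid of target positions, repeated at several depths,
    // all relative to the current HMD camera pose.
    public static List<Vector3> Build(Transform hmdCamera, float[] depthsMeters,
                                      float horizontalSpanDeg, float verticalSpanDeg,
                                      int columns, int rows)
    {
        var targets = new List<Vector3>();
        foreach (float depth in depthsMeters)                 // e.g. { 0.75f, 1.5f, 3f }
        {
            for (int r = 0; r < rows; r++)
            {
                for (int c = 0; c < columns; c++)
                {
                    float yaw = Mathf.Lerp(-horizontalSpanDeg, horizontalSpanDeg,
                                           columns > 1 ? (float)c / (columns - 1) : 0.5f);
                    float pitch = Mathf.Lerp(-verticalSpanDeg, verticalSpanDeg,
                                             rows > 1 ? (float)r / (rows - 1) : 0.5f);
                    // Direction relative to the HMD, placed at the requested depth.
                    Vector3 dir = Quaternion.Euler(pitch, yaw, 0f) * hmdCamera.forward;
                    targets.Add(hmdCamera.position + dir * depth);
                }
            }
        }
        return targets;
    }
}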

user-a2dcc3 12 July, 2017, 22:25:47

@user-5874a4 Also, the calibration points for 3D are in a list on line 318, in the list3D variable of unity script. Just changed mine so they would always be in view.

user-5874a4 13 July, 2017, 07:45:35

@user-a2dcc3 Correcting the lens distance mostly brought all the calibration points in the FOV for me. Some were on the border but I hope those didn't affect the calibration for the most part. Thanks, I will change the list variable if I still have problems.

user-5874a4 13 July, 2017, 07:55:25

@user-ef7690 I tried integrating the Calibration part from unity_pupil_plugin into my application. I just exported the unity_pupil_plugin project as a package and imported into the application. Then opened the calibration scene to copy the "Pupil Gaze Tracker" script component into my Scene's empty game object. It didn't seem to crash and was working fine although I had some problems with the scene output. Was this the integration that you were trying to do?

user-fe23df 14 July, 2017, 18:15:55

hello hello, my eye tracker seems to be connected, but in the calibration scene the Pupil Gaze Tracker component still says localhost is not connected. Does anyone have an idea how I can fix this? Thanks! 😃

user-5874a4 15 July, 2017, 00:46:27

@user-fe23df First verify that the path to the Pupil Capture software is set correctly, along with the port number. Then check whether you have enabled "Autorun pupil service" in the Pupil Gaze Tracker component; it should be under settings -> pupil app, I think. Otherwise you can manually start the service before starting the Unity app. Also ensure both eye processes are started in Pupil Capture; if they are, you will have two windows with the tracking stream, one for each eye.

user-641c7f 15 July, 2017, 14:04:11

Hi, my eye tracker works with the Unity plugin, but the issue I have is that the calibration is not very good in the Unity Editor, so I wanted to build the project to see if it works at full resolution. But the scripts use UnityEditor, which cannot be included in a build... Any workaround for the Unity plugin?

user-641c7f 15 July, 2017, 14:06:04

ohoo, you should give the port number from your settings to the Pupil Capture app and wait a few seconds after running the app so it connects through that port

user-fe23df 15 July, 2017, 20:23:17

@muzammil#8393 Thanks!

user-ef7690 17 July, 2017, 08:53:02

@user-5874a4 yep, exactly what I did as well. In the editor it works; my problem is that I need the eye positions at runtime in a build, not in the editor...

user-ef7690 17 July, 2017, 08:53:26

the plugin causes my build to crash

user-ef7690 17 July, 2017, 09:06:05

@user-5ca684 any chance on running hmd-eyes without Unity Editor dependency?

user-a2dcc3 17 July, 2017, 23:43:36

I have a question on hmd_video_frame_size. For the Unity script it is set to 1000 by 1000. Does this affect the normalized por values returned by the tracker, or just the calibration? I can see in the hmd_calibration.py file where the screen size is used in the error calculation (so using a square frame size weights horizontal and vertical error equally); if I use the Vive's full resolution, this will distribute the error to weigh less on the vertical component. I will try and test it, but I was wondering if using the wrong screen size there would affect the por.x and por.y values being used in another application.

user-1ada9f 18 July, 2017, 05:21:57

If I do the calibration in hmd-eyes, I have to get gaze information from hmd-eyes, not from Pupil Capture

user-1ada9f 18 July, 2017, 05:22:03

Is it correct ??

user-1ada9f 18 July, 2017, 05:50:50

To do calibration from HMD eye, do I have to deploy hmd-eyes to hmd ?

user-a2dcc3 18 July, 2017, 14:14:30

@user-1ada9f If you do the calibration in hmd-eyes you can exit the program and then use the capture/service to get gaze data.

user-075b65 19 July, 2017, 01:19:36

hey everyone, has anyone tried hmd-eyes calibration on cardboard app?

user-5874a4 19 July, 2017, 07:58:58

Guys, I am trying to use Physics.Raycast to cast a ray in the gaze point's direction. The objects hit would mean that the user was looking at them, so I can then trigger some actions. I can load the gaze points into my script using the Calibration.Marker object. It has exactly 6 markers on each update. Thinking that these markers represent gaze positions, I tried to position a simple sphere at those positions, but it didn't work properly. Why didn't it work? Does anyone know other ways to accomplish this?

user-fe23df 19 July, 2017, 13:34:11

@user-5874a4 I'm also trying to do raycasthit on objects, and am wondering if I understand the variable names correctly. I cast rays with origin points being eyecenters3d and directions being gazenormals3d

user-fe23df 19 July, 2017, 13:34:33

do you think this is correct? and is this also what you're trying to do?
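
(A minimal sketch of the raycast approach being discussed here, under two assumptions: the plugin exposes per-eye centers and gaze normals through Pupil.values accessors as named in this channel, and those values are reported in HMD-local space with positions in millimeters, as discussed later in this log. Names and units may differ in your plugin version.)

using UnityEngine;

public class GazeRaycaster : MonoBehaviour
{
    public Transform hmdCamera;          // the VR camera transform
    public float maxDistance = 20f;

    void Update()
    {
        // Assumed accessors; index 1 = one eye, as in Pupil.values.GazeNormals3D[1].
        Vector3 eyeCenterLocal = Pupil.values.EyeCenters3D[1] * 0.001f;   // mm -> m
        Vector3 gazeNormalLocal = Pupil.values.GazeNormals3D[1];

        if (gazeNormalLocal == Vector3.zero)
            return;                       // no data received yet

        Vector3 origin = hmdCamera.TransformPoint(eyeCenterLocal);
        Vector3 direction = hmdCamera.TransformDirection(gazeNormalLocal).normalized;

        RaycastHit hit;
        if (Physics.Raycast(origin, direction, out hit, maxDistance))
            Debug.Log("Looking at: " + hit.collider.name);
    }
}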

user-fe23df 19 July, 2017, 13:37:00

Is there documentation for the scripts somewhere? I am having a lot of trouble understanding them, and documentation would save tons of time for me (or anyone, I suppose)

user-fe23df 19 July, 2017, 14:54:52

Does anyone know why, after calibration is done, there is a huge red marker that is stuck almost at the same position as the camera? And how do I turn this off? Also, I tried to get access to eyecenters3D, gazepoint3D and gazenormals3D, but all of them always give me values of (0,0,0). I wonder what I'm doing wrong 🤔

user-5874a4 20 July, 2017, 06:09:18

@user-fe23df Did you try printing the values separately, like Pupil.values.GazeNormals3D[0] and Pupil.values.GazeNormals3D[1]? I also get (0,0,0) for the 0th pupil, but there are values on [1]. If you still get (0,0,0), then likely you are not receiving data from Pupil Capture over the network.

user-5874a4 20 July, 2017, 09:27:02

@user-fe23df That's not accurate; I get data from both pupils. By default the gaze normals are at (0,0,0). Once tracking data starts flowing, both pupil normals change. You have to wait until the initialization steps are complete for data to flow through.

user-fe23df 20 July, 2017, 14:21:02

@user-5874a4 I did print the values separately for each element in the array, yes, and waited for a while but it always updates only (0,0,0). About raycasting, is it correct to cast rays with eyecenters and gazenormals?

user-05da2f 21 July, 2017, 11:22:46

Hello everyone. I'm having trouble running the hmd_eyes project in Unity3D (5.6.2f1). The first suspicious thing is that when I run pupil_service manually it opens up by default a single eye window. It does not start both eyes.

user-05da2f 21 July, 2017, 11:23:46

If, however, I run pupil_capture.exe and click on "detect eye0 and eye1" then both eye windows open and work properly.

user-05da2f 21 July, 2017, 11:27:53

So.. inside unity with the hmd_eyes project loaded and the PupilGaze GameObject selected, in the inspector it's constantly in disconnected state.

user-05da2f 21 July, 2017, 11:33:32

In addition to this when i start the project I get the following error:

user-05da2f 21 July, 2017, 11:33:41

Error in Unity3D

Chat image

user-05da2f 21 July, 2017, 11:50:01

So I guess, is there anyone that got this hmd_eyes thing to work? [email removed]

mpk 21 July, 2017, 11:50:46

@user-05da2f yes :-). @user-5ca684 any ideas about this error?

user-05da2f 21 July, 2017, 11:51:27

@mpk so you got the latest verison of hmd_eyes to work with the latest drivers and pupil_v0912_windows_x64 ?

user-05da2f 21 July, 2017, 11:52:00

unity 5.6.2f1 ?

mpk 21 July, 2017, 11:52:15

@user-05da2f Pupil Service's and Pupil Capture's API has not changed. Our Unity dev might have bumped up the minimum Unity version requirement.

mpk 21 July, 2017, 11:52:47

You can use capture instead of Service if you want. This gives you some more visual feedback.

user-05da2f 21 July, 2017, 11:53:21

@mpk the instructions say to use Pupil_Service.exe :/

user-05da2f 21 July, 2017, 11:53:38

So you mean I can instead run pupil_capture.exe and achieve the same result?

user-05da2f 21 July, 2017, 11:53:50

i.e. select pupil_capture.exe in the unity interface

mpk 21 July, 2017, 11:54:03

Yes. It should work. We designed both apps to be able to talk to the unity3d plugin.

mpk 21 July, 2017, 11:54:07

correct.

user-05da2f 21 July, 2017, 11:55:28

@mpk that fixed the both eyes not opening problem but the NetMQ error persists.

wrp 21 July, 2017, 11:55:31

I believe @user-5ca684 may be AFK for the next few days - so you may not get a response from him right away

user-05da2f 21 July, 2017, 11:57:28

@mpk @wrp this is the offending method: https://www.pastiebin.com/#&togetherjs=jfRwKasSUC

user-05da2f 21 July, 2017, 11:58:05

When _sendRequestMessage tries to use NetMQ I think all hell breaks loose

user-05da2f 21 July, 2017, 11:58:50

Anyway I guess I have to wait for @user-5ca684 Thank you guys for trying to help

wrp 21 July, 2017, 11:59:37

@user-05da2f can you link to the line number via github and/or make an issue on https://github.com/pupil-labs/hmd-eyes

user-05da2f 21 July, 2017, 11:59:53

@wrp will do! thanks

user-ef7690 21 July, 2017, 16:20:29

Hi all, I've finally got gaze data (Pupil.values.GazePoint3D) in Unity, but now I'm stuck with the unit it's stored in (mm). How do I convert it to screen pixels or a normalized screen space?

user-fe23df 21 July, 2017, 16:38:01

@mpk can you help me with my issue please? I tried to get values of eyecenters and gazenormals but it always gives me zero no matter what. Is there like an extra step in between that I missed somewhere? I have the pupil script in the scene and pupil capture running, both eye windows opened and calibration finished. Thanks in advance!

user-5874a4 21 July, 2017, 19:45:40

@user-ef7690 I tried one method similar to the one the script uses to draw the gaze points after calibration. Look at "CalibrationGL.cs", line 47. I am pretty sure that is the method (Marker(_marker)) used to draw the points in the default output after calibration. It uses Pupil.values.GazePoint3D stored in Calibration.marker. I thought of using this method, but I am not familiar with the GL library they use to draw the graphics. You could try this way if it makes sense to you. Presently, I am trying to raycast using the gaze point normals.

user-fe23df 21 July, 2017, 23:27:29

@mpk never mind, it started to work somehow, I have no idea why. Anyway, if I want to cast rays from gaze, is it right to do it with eye centers and gaze normals? Because the result I get seems pretty off, so maybe I'm doing it wrong.

user-ef7690 23 July, 2017, 11:59:25

@user-5874a4 CalibrationGL uses the Unity low-level graphics API to draw vertices etc., which is indeed a viable option; I was hoping there was something more directly related to the camera
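
(A camera-based alternative to the GL drawing, as a minimal sketch: assuming Pupil.values.GazePoint3D is reported in HMD-local space and in millimeters, convert it to meters, transform it into world space with the camera transform, and then use Unity's built-in Camera.WorldToScreenPoint, or WorldToViewportPoint for normalized 0..1 coordinates. The unit and coordinate-space assumptions follow the discussion later in this log.)

using UnityEngine;

public class GazeToScreen : MonoBehaviour
{
    public Camera hmdCamera;

    void Update()
    {
        Vector3 gazeLocal = Pupil.values.GazePoint3D * 0.001f;            // mm -> m, HMD-local
        Vector3 gazeWorld = hmdCamera.transform.TransformPoint(gazeLocal);

        Vector3 screenPx = hmdCamera.WorldToScreenPoint(gazeWorld);        // pixels
        Vector3 viewport = hmdCamera.WorldToViewportPoint(gazeWorld);      // 0..1 normalized

        Debug.Log("Gaze (px): " + (Vector2)screenPx + "  (viewport): " + (Vector2)viewport);
    }
}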

user-a1febb 23 July, 2017, 12:23:06

Hi, just a quick question: is there a command to start a validation procedure in the calibration scene in order to get some accuracy measure of the calibration?

user-a1febb 23 July, 2017, 12:25:08

Also to ohooo: when I do the raycast I use the eye center. I use a little object (just a box) which is a child of the player, and its relative location gets updated using the gaze coordinates. Then for the ray I use its global position as the ray direction and the player's position as the origin. Hope that helps.

user-ef7690 23 July, 2017, 12:28:24

press c in play mode

user-a1febb 23 July, 2017, 12:35:08

But this starts the calibration, not validation. How can I get an accuracy measure out of this? Like giving the same targets a second time and comparing how many degrees the eye tracker is off from the actual position of the targets.

user-a2dcc3 23 July, 2017, 19:25:26

@user-a1febb I don't think there is any validation built into the Unity script. I perform a validation externally in a different program where targets (smaller than those calibrated with) are presented, and I calculate the angular error manually using the average of the gaze normals (the cyclopean gaze vector) compared to a ground-truth gaze vector.

user-ef7690 24 July, 2017, 07:21:37

vivi you could possibly do something with the confidence values?

user-a1febb 24 July, 2017, 10:45:26

@user-a2dcc3 that kind of script is exactly what I was looking for. By "in a different program", do you mean in a different scene or outside of Unity? And I don't know if this is too much to ask, but would it be possible for me to take a look at your script? That would be really awesome and probably save me a lot of time, but if not I understand. @user-ef7690 I don't think this would help me. I already use this, but for a different purpose. As far as I understand, it just gives me how confident the tracker is about the pupil location, which can vary with each frame. But I need a general rating which tells me how exact the calibration is.

user-4a2a65 24 July, 2017, 14:31:51

Hello! I've very recently run into a problem. Just to make sure it's not a common problem, I thought I'd ask here. I'm trying to use hmd-eyes via a remote connection. I've set up a very simple network, just a PC and a tablet connected to a Wi-Fi router. Both the PC and the tablet can ping each other just fine, so the network seems up and running. The eye tracker is connected to the tablet, and pupil_capture is running there. I checked a bunch of times to make sure the IP and port are correct. Now I'm starting the Unity pupil plugin on the PC, but all I get back is that the connection to the server failed, with no additional info. Maybe I'm missing a very basic step somewhere?

user-ef7690 24 July, 2017, 14:32:38

check Pupil Remote is set to the correct port in Pupil Capture

user-4a2a65 24 July, 2017, 14:35:09

the port specified under Pupil Remote in pupil_capture and in the service port field in the Unity plugin are the same

user-ef7690 24 July, 2017, 14:35:40

Have you tried disabling your firewall?

user-4a2a65 24 July, 2017, 14:37:41

mhm, I had to do that to begin with, although I may have to get back to you on that... I'm not 100% certain that nothing is interfering on the PC's end. I don't have full rights on that device, but I've turned off all I can at least. And Windows on both machines keeps annoyingly telling me that I'm in danger 😉

user-4a2a65 24 July, 2017, 14:45:18

okay this may just be something on my end here.. sorry for the bother. I'll make sure that everything I must have open, is indeed open.

user-ef7690 24 July, 2017, 14:45:42

takes a few seconds to connect aswell though

user-4a2a65 24 July, 2017, 14:46:01

I've upped the timeout to 5 seconds instead of the 1 it had

user-ef7690 24 July, 2017, 14:46:19

theres a 7 second delay built in

user-4a2a65 24 July, 2017, 14:47:03

oh right, just found it

user-4a2a65 24 July, 2017, 14:47:08

the servicestatupdelay

user-4a2a65 24 July, 2017, 14:50:22

So the timeout I added really didn't help anything; better change that back. It really looks like it can only be one or two things here, and it's probably me missing a security feature of Windows 10 somewhere. Thanks for the help! I'll have to look at that with fresh eyes tomorrow 😃

user-265feb 24 July, 2017, 19:07:44

Hello everyone, I am trying to build a gaze detector (finding the screen coordinates where the user is looking) for my university project. I am new to this, so if anyone can help me get started with this library and work further on it, it would be appreciated.

user-a2dcc3 24 July, 2017, 20:00:04

@user-a1febb I can't give you the exact script now, and it would be hard to parse into your own code base as it comes from a Vizard script. Vizard is the external program I'm using to render the validation. I can describe an approach though: your best bet might be displaying your own targets using the existing methods for displaying calibration points in hmd-eyes. Use the same points or others (I have seen others use points in between the calibration positions for the validation) and add a validation procedure on the 'v' key press. Are you familiar with the idea of a cyclopean gaze position/direction? To translate binocular vision to this, you imagine gaze coming out of one central eye instead of two converging eyes. The cyclopean eye position is the average of the two eye positions, so it rests between your eyes in the middle of your face. The cyclopean gaze direction is the average of your left and right eye gaze vectors. The ground-truth vector is the one from the cyclopean eye position to the target (target_pos - eye_pos), and the reported value is the cyclopean gaze direction you compute. You can normalize the vectors and plug the dot product into arccos() to get the angle between them. I'm not sure how the Unity script places the eyes (in space) relative to the HMD camera object, but I'm guessing it uses values reported by the tracker. You could hardcode a position for each eye relative to the HMD camera position as well.
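
(A compact version of that angular-error computation, written in C# for Unity rather than Vizard as a hedged sketch; all inputs are assumed to already be expressed in the same coordinate frame, e.g. world space.)

using UnityEngine;

public static class GazeValidation
{
    // Angular error (degrees) between the reported cyclopean gaze direction and
    // the ground-truth direction toward a known target.
    public static float AngularErrorDeg(Vector3 leftEyePos, Vector3 rightEyePos,
                                        Vector3 leftGazeNormal, Vector3 rightGazeNormal,
                                        Vector3 targetPos)
    {
        Vector3 cyclopeanPos = 0.5f * (leftEyePos + rightEyePos);
        Vector3 reportedDir = (leftGazeNormal + rightGazeNormal).normalized;
        Vector3 groundTruthDir = (targetPos - cyclopeanPos).normalized;

        float cos = Mathf.Clamp(Vector3.Dot(reportedDir, groundTruthDir), -1f, 1f);
        return Mathf.Acos(cos) * Mathf.Rad2Deg;
    }
}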

user-a1febb 24 July, 2017, 22:04:26

Okay, thank you! I will try to do this

user-f7028f 25 July, 2017, 07:58:49

Hi guys! I invented a new issue for myself :) I want to trick the eye detector by giving it a fake eye, which will be a mobile camera's lens! I tried printing a picture of a real eye and of a drawn eye on a sheet of paper, but so far the best detection result comes from just solid white paper with a tiny hole for the camera lens. But this result is still too inaccurate to use in my experiment. I don't need calibrated coordinates; the raw detected eye center data is enough!

I would like to ask the experts for advice about the best solution for such a task.

Chat image

user-72b0ef 25 July, 2017, 08:18:32

@user-265feb There is already a Unity plugin called HMD-eyes, which communicates with pupil and correspondingly moves markers in screenspace according to eye movements

user-05da2f 25 July, 2017, 10:21:08

@user-5ca684 can I get some help with hmd-eyes unity project? It does not work. I just spoke with @wrp and now I'm sure I have the latest drivers, properly installed.

wrp 25 July, 2017, 10:21:35

@user-05da2f - please also migrate your screenshot here

user-05da2f 25 July, 2017, 10:22:03

Chat image

wrp 25 July, 2017, 10:22:35

@user-05da2f can you try running this with Pupil Capture instead of Pupil Service

user-05da2f 25 July, 2017, 10:22:36

@user-5ca684 So there's a netmq error and.. I'm also wondering about that part to the right. Is it supposed to show "connected" and the eyes are supposed to change or something? : )

wrp 25 July, 2017, 10:22:47

this way you can also gain insight into whether the correct port is specified

user-05da2f 25 July, 2017, 10:25:03

@wrp @user-5ca684 same NetMQ problem with pupil_capture

user-05da2f 25 July, 2017, 10:32:20

and that NetMQ error only comes up when I press 'C' to calibrate.

wrp 25 July, 2017, 10:42:45

@user-05da2f ok - please allow @user-5ca684 some time to respond to these messages

user-05da2f 25 July, 2017, 12:29:51

@user-5ca684 I have made a little progress. I started pupil_capture on a macos host in the same subnet and tried talking to it from Unity3D using the PupilGazeTracker script. Again I only receive the left eye.

user-05da2f 25 July, 2017, 12:32:14

So the situation is as follows: Using some older version of PupilGazeTracker I can get one eye. Using the version in hmd_eyes I get NO eyes.

user-0ffc2d 25 July, 2017, 15:59:01

Hello everyone! I am new to the Pupil software and I have only basic knowledge of Unity (and coding in general). Is there a beginner's manual for hmd-eyes with all functions explained, script conventions and so on? I'd like to integrate the markers with my moving scene and in general automate the VR environment without the need to access the Pupil app during the tests. Thanks!

user-05da2f 25 July, 2017, 16:13:32

@user-0ffc2d I have yet to make hmd-eyes work. If you succeed please let me know.

user-0ffc2d 25 July, 2017, 16:20:16

@user-05da2f Yes, we set the ports and paths correctly and the default calibration scene is working. But our experience ends there...

user-05da2f 25 July, 2017, 16:20:38

@user-0ffc2d which O/S if I may ask?

user-05da2f 25 July, 2017, 16:21:08

Windows I suppose, if you're using Unity and VR

user-0ffc2d 25 July, 2017, 16:24:53

@user-05da2f exactly. I first tried on a Mac, but it asked for an .exe file path, so I could not find a solution for it. It's actually not real VR; it's a scene projected on a wall with some interaction with the user's body.

user-05da2f 25 July, 2017, 16:25:44

@user-0ffc2d can you tell me your unity version, Windows version and version of the drivers/utilities?

user-0ffc2d 25 July, 2017, 16:30:54

@user-05da2f Unity 2017.1.0f3 (64-bit); the working PC is Windows 10. The buggy PC is running Windows 10 in Parallels on macOS (there the camera drivers are not working properly and give "GostModeOn"). The drivers are dpinst64.

user-05da2f 25 July, 2017, 16:31:41

@user-0ffc2d you mean you're on 0.9.12 right?

user-05da2f 25 July, 2017, 16:32:15

capture_utility, service, recorder etc.

user-0ffc2d 25 July, 2017, 16:41:35

@user-05da2f yes

user-05da2f 25 July, 2017, 16:42:11

@user-0ffc2d I got the utils to run on macos natively

user-05da2f 25 July, 2017, 16:42:22

@user-0ffc2d why are you trying to run them inside a virtual machine?

user-05da2f 25 July, 2017, 16:42:56

@user-0ffc2d if you need to run your software inside a virtual machine just run the capture_service outside, in the host (MacOS) then connect to it over the network from the client (Windows 10)

user-0ffc2d 25 July, 2017, 16:44:55

@user-05da2f because my teammates' PCs are lagging all the time. I have a Mac and could not manage to connect Unity to pupil_capture, so I tried it in Parallels. Do you know how to do the network connection? Thanks a lot

user-05da2f 25 July, 2017, 16:46:27

@user-0ffc2d basically you tell PupilGazeTracker.cs that the IP you want to connect to is the IP of the host (macOS). First figure out the IP addresses of the host OS <--> virtual machine client and see if you can ping between them, THEN try to connect using hmd-eyes.

user-0ffc2d 25 July, 2017, 20:17:04

@user-05da2f Thanks! I'll try it tomorrow!

user-3e1362 25 July, 2017, 22:28:08

Hi Pupil Community! I am new to pupil software and am still getting my HTC Vive add-on fully up and running. I am actually using it in a Daydream headset for a VR project and streaming the camera content over a local network through the Pupil Mobile app on a Pixel XL (it works!). My questions are as follows: As I understand I should use Pupil Service to do the eye-tracking since it does not have a world camera. However, when I run it only 1 eye-feed opens (eye 0) and I can't seem to figure out how to get both eyes to show. Additionally, when I run capture it registers one of the eye cameras as the world camera and won't let me detect the eye cameras. Second question, I have tried running the filter_messages script to parse out the gaze data for at least the one eye that is registered but it gets hung on the "sub_port = req.recv_string()" line. The port and addr match with that shown in Pupil Capture. I am running v09.12 on Windows 10. I would like to avoid running it from source since the helper script seems to give me the data I need. Any help would be greatly appreciated!

user-3e1362 26 July, 2017, 00:14:43

edit I just ran the Pupil Service executable twice and set them to different camera inputs so that seems to have solved the single eye issue, but I'm still trying to figure out why the filter_messages script is getting stuck

user-05da2f 26 July, 2017, 07:30:54

@wrp any idea when @user-5ca684 will be back?

wrp 26 July, 2017, 10:45:04

@user-05da2f - I have pinged him but have not heard a reply.

user-97d236 26 July, 2017, 21:04:06

Hi, I've been having some success in using this to get the objects the user is looking at in Unity, but it seems to be only able to run in the Unity Editor? Trying to build seems to expose some dependency in the OperatorWindow file on having the Editor. Attempting to remove these dependencies allows me to build, but the executable just crashes instantly. Anyone have any ideas on how to get around this?

user-05da2f 26 July, 2017, 22:02:36

@user-97d236 I would also like to know this. I have also attempted and failed.

user-fe23df 26 July, 2017, 22:06:12

I think it was stated somewhere that you can only run with Unity Editor unfortunately. Hope that I'm wrong though

user-1ada9f 27 July, 2017, 04:58:21

https://github.com/pupil-labs/hmd-eyes/blob/master/hmd_calibration/hmd_calibration_client.py

From this code

What I have to do is: 1) get pupil data and append that data to refs, 2) run Pupil Capture, 3) wear the HMD with the pupil cams, 4) run hmd_calibration.

Do I understand it correctly?

user-5ca684 27 July, 2017, 08:11:56

@user-05da2f Hi, apologies for the delayed response

user-5ca684 27 July, 2017, 08:14:18

@user-05da2f have you tried pressing the calibrate button instead of the "C" key? It should not make a difference though. Also, if I understand you correctly, you are trying to run Pupil Service and it only brings up one eye, correct? In that case, could you please test it by running Pupil Capture instead of Pupil Service?

user-5ca684 27 July, 2017, 08:17:05

@user-1a0aec you are right; at the moment, unfortunately, some of the Editor-related scripts are not set to compile only for Editor mode, so building might be problematic. I am aware of this issue and it will be fixed in the next major release! Currently there is a whole refactoring under development where the Unity side would run on only one thread, so UWP platforms should not have issues with it.

user-5ca684 27 July, 2017, 08:18:59

@user-1a0aec @user-05da2f if you feel like fiddling with the code, you can wrap the editor-related code in #if UNITY_EDITOR ... #endif so it will not break building the project for standalone
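
(What those guards look like in practice, as a small sketch; the class and method names here are made up for illustration, and the point is simply that every UnityEditor reference has to sit inside the #if UNITY_EDITOR block.)

#if UNITY_EDITOR
using UnityEditor;                   // UnityEditor references must stay inside the guard
#endif
using UnityEngine;

public class OperatorViewHelper : MonoBehaviour
{
    public void ShowOperatorWindow()
    {
#if UNITY_EDITOR
        EditorWindow.GetWindow<SceneView>().Show();   // editor-only functionality
#else
        Debug.Log("Operator view is only available in the Unity Editor.");
#endif
    }
}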

user-5ca684 27 July, 2017, 08:21:33

@user-fe23df so yes, currently it is true

user-72b0ef 27 July, 2017, 08:24:05

@user-5ca684 I would like to ask whether there is a way for us to get some insights on the features / additions / improvements that will be new in upcoming releases

user-72b0ef 27 July, 2017, 08:26:42

much like a "sneak preview " or something ^^

user-5ca684 27 July, 2017, 08:28:25

@user-72b0ef well currently the scope is mainly to improve data access and compatibility, and to fix the above mentioned issues.

user-5ca684 27 July, 2017, 08:29:10

do you have anything specific in mind?

user-5ca684 27 July, 2017, 08:29:25

I mean are you looking for a specific feature?

user-5ca684 27 July, 2017, 08:31:01

@user-72b0ef one thing though, there are some demo/usecase examples we are working on

user-72b0ef 27 July, 2017, 08:31:05

@user-5ca684 I see. ehm, no not particularly haha, still I was interested on stuff that might be improved / new features or additions, since I work closely with your software

user-5ca684 27 July, 2017, 08:35:44

@user-72b0ef there is one thing that might be interesting for you. We are planning to separate the data receiver part from the GUI options, so you will be able to access incoming data through a static class that is a bit more lightweight (for calibration you will need the main component, but once you have a good calibration the data receiver will be enough).

user-72b0ef 27 July, 2017, 08:37:01

@user-5ca684 That does seem interesting! Are you able to tell me how this will affect our code?

user-72b0ef 27 July, 2017, 08:39:51

@user-5ca684 Also, to come back on the feature request part: We would like some sort of "measurement" or indication when our calibration has "failed". Currently, we use debug markers in our unity scene (which follow the user's eyes) and ask people whether the marker is on the right spot of where they are looking with their eyes. In addition, we thought of a way to detect a "bad calibration", by checking whether during calibration the confidence data by pupil was over 0.7

user-72b0ef 27 July, 2017, 08:40:38

@user-5ca684 I have absolutely no idea whether this is going to work out well though

user-5ca684 27 July, 2017, 08:41:00

well the code is already checking for the confidence level

user-5ca684 27 July, 2017, 08:41:30

so during calibration no data is fed toward Pupil Service if the confidence level is lower than the threshold

user-5ca684 27 July, 2017, 08:41:41

I believe it is currently set to 0.5

user-72b0ef 27 July, 2017, 08:41:57

right on! Good thing to know.

user-5ca684 27 July, 2017, 08:43:08

However, about calibration quality feedback, you might need to ask @mpk or @wrp; I don't think we can measure it from Unity

user-5ca684 27 July, 2017, 08:44:19

there might be a notification if it has failed if I remember correctly, but it is not a measurement of the quality

user-72b0ef 27 July, 2017, 08:45:17

True; when the calibration has been stopped or not enough points are gathered, I believe the calibration will return a notification in Pupil

user-5ca684 27 July, 2017, 08:45:58

yes

wrp 27 July, 2017, 08:55:05

@user-72b0ef calibration quality is a feature that should be implemented in Pupil

user-05da2f 27 July, 2017, 09:06:30

@user-5ca684 yes pupil service only brings up one eye when you run it, but then later when an app tries to "connect" to the pupil service both eyes open up.

user-05da2f 27 July, 2017, 09:07:03

@user-5ca684 but no eye information is fed to unity

user-05da2f 27 July, 2017, 09:11:31

@user-5ca684 that's from the hmd-eyes project. In our project only the left eye is fed.

user-05da2f 27 July, 2017, 09:11:50

@user-5ca684 any idea on how we can troubleshoot this?

user-72b0ef 27 July, 2017, 09:43:34

@user-05da2f You can try launching pupil_capture instead? I get both eye feeds this way....

user-05da2f 27 July, 2017, 09:45:52

Unity says "Failed to connect with the server" despite the fact that pupil_capture is running and is showing both eyes. Is there some way to check the ports?

user-05da2f 27 July, 2017, 09:46:12

i.e. is there some way to check in the window of pupil_capture WHICH port it's running on, and then change it on the unity side?

user-72b0ef 27 July, 2017, 09:46:21

btw, I just loaded the new hmd-eyes. This is AMAZING! It seems I have been working with a very old version of the hmd-eyes plugin... I am overjoyed right now with all the changes that have been made.

user-72b0ef 27 July, 2017, 09:46:38

@user-05da2f load the pupil remote plugin

user-72b0ef 27 July, 2017, 09:47:02

@user-05da2f there you can see the port

user-05da2f 27 July, 2017, 09:47:23

@user-72b0ef first of all thanks for the help

user-05da2f 27 July, 2017, 09:47:42

I'm not sure what you mean by pupil remote plugin.

user-72b0ef 27 July, 2017, 09:48:18

I'll PM you

user-05da2f 27 July, 2017, 09:48:19

the unity-pupil-plugin unity project has a few plugins under the Plugins folder but no "pupil remote" plugin

wrp 27 July, 2017, 09:57:00

@user-05da2f - you can set the port in Pupil Capture within the Pupil Remote plugin

wrp 27 July, 2017, 09:58:14

Chat image

user-05da2f 27 July, 2017, 09:58:15

@wrp @user-72b0ef helped me. It looks like the value is hardcoded so it was very confusing. Not only this, but it does not accept characters as input, only numbers...

user-05da2f 27 July, 2017, 09:58:39

so if you click on it and test with the keyboard if you can input something there, nothing happens

user-05da2f 27 July, 2017, 09:59:03

anyway, thank you all for the support. I'll play with the ports a little bit and try to talk to pupil_capture

wrp 27 July, 2017, 10:00:19

@user-05da2f - right you can click to input/modify text or long click to select everything in the text entry

wrp 27 July, 2017, 10:00:40

You can also untoggle Use primary network interface and set the ip manually

user-05da2f 27 July, 2017, 10:07:24

Any ideas why this might be happening?

Chat image

user-05da2f 27 July, 2017, 10:07:45

@user-5ca684

wrp 27 July, 2017, 10:08:44

@user-05da2f are there other instances of Pupil Service running in the background? Please check the Task Manager.

user-05da2f 27 July, 2017, 10:09:09

@wrp seems like there is another one

wrp 27 July, 2017, 10:09:25

When you close Pupil Capture I mean, is there anything else still running?

user-05da2f 27 July, 2017, 10:09:35

@wrp no there is not

wrp 27 July, 2017, 10:09:55

ok - my thought was that there may be an instance of Pupil Service running in the bkg

user-05da2f 27 July, 2017, 10:12:11

omg it worked.

wrp 27 July, 2017, 10:13:04

🙆

user-5ca684 27 July, 2017, 10:15:08

@user-05da2f I will take a look at this strange behaviour with Pupil Service, but I'm glad that you could finally connect! 😃

user-5ca684 27 July, 2017, 10:17:46

@user-72b0ef 😃 I guess this way the new features came earlier than expected. Let me know if you have any questions/feedback

user-05da2f 27 July, 2017, 10:53:37

Does anyone know if the calibration script in unity creates some file that I can then later re-use so that I don't need to re-calibrate every time?

user-5ca684 27 July, 2017, 10:54:26

@user-05da2f in Unity there is a file that stores camera intrinsics

user-5ca684 27 July, 2017, 10:54:39

otherwise pupil side stores the calibration data

user-5ca684 27 July, 2017, 10:55:11

but you can test it yourself, just get a good calibration and try to run again without calibrating again

user-5ca684 27 July, 2017, 10:55:28

you will see that the last calibration is going to be valid

wrp 27 July, 2017, 10:55:56

@user-5ca684 is right - however @user-05da2f for the most accurate results we recommend re-calibrating each time you put on the HMD

user-5ca684 27 July, 2017, 10:57:46

@user-05da2f yes, in my experience a small difference in your eyes' position in relation to the cameras might damage the accuracy of the data, so recalibration is best practice. By the way, for best results please adjust the eye-lens distance on your HMD.

user-5ca684 27 July, 2017, 10:59:01

in my case on the HTC VIVE the best setup is two clicks far from the closest position

user-72b0ef 27 July, 2017, 11:28:48

@user-5ca684 There seems to be something wrong with the left eye and right eye markers... Whenever I toggle them, they show in the scene, but they are not moving... When I switch them to 3D they simply disappear

user-5ca684 27 July, 2017, 11:29:36

@user-72b0ef they will reappear once you have 3D calibration

user-5ca684 27 July, 2017, 11:30:05

if you want 3D data visualized, you will have to calibrate for 3D mode

user-5ca684 27 July, 2017, 11:30:33

every time you switch between 2D/3D you will have to recalibrate

user-72b0ef 27 July, 2017, 11:30:42

Chat image

user-72b0ef 27 July, 2017, 11:30:49

This is what it looks like...

user-5ca684 27 July, 2017, 11:31:05

the toggle should be encapsulated actually

user-5ca684 27 July, 2017, 11:31:24

I mean the toggle is basically handled by the code

user-5ca684 27 July, 2017, 11:31:45

in the refactored version this will be much clearer

user-72b0ef 27 July, 2017, 11:32:06

I see! Then how would I switch to 3D calibration? I switched to 3D mode

user-5ca684 27 July, 2017, 11:32:23

that's it, now calibrate

user-5ca684 27 July, 2017, 11:32:42

you should receive the 3D data after successful calibration

user-72b0ef 27 July, 2017, 11:34:37

there seems to be something wrong... Whenever I press C, Windows makes this distinctive sound; this also happens when the calibration is done

user-72b0ef 27 July, 2017, 11:34:46

Also no markers show up 😦

user-5ca684 27 July, 2017, 11:35:36

the sound is fine

user-5ca684 27 July, 2017, 11:36:03

that is from the pupil app to let you know when the calibration ended

user-72b0ef 27 July, 2017, 11:37:22

errr, something really strange is happening haha, I checked 2D calibration, but it actually calibrated 3D now!

user-72b0ef 27 July, 2017, 11:37:30

Chat image

user-72b0ef 27 July, 2017, 11:37:42

2D and 3D are mixed up 😅

user-72b0ef 27 July, 2017, 11:38:58

Chat image

user-5ca684 27 July, 2017, 11:39:05

this is 2d

user-5ca684 27 July, 2017, 11:39:20

3D has only 1 marker after calibration

user-72b0ef 27 July, 2017, 11:39:55

hmmmm, okay, that is my bad then

user-72b0ef 27 July, 2017, 11:40:06

is it possible to have 3 markers in 3D calibration?

user-72b0ef 27 July, 2017, 11:40:18

after**

user-5ca684 27 July, 2017, 11:41:07

well not really

user-5ca684 27 July, 2017, 11:41:26

I mean there are other ways for visualization with 3D

user-5ca684 27 July, 2017, 11:41:47

the 3d marker is the point where the two gaze normals intersect

user-72b0ef 27 July, 2017, 11:41:56

I think I understand, the white marker is the intersections

user-72b0ef 27 July, 2017, 11:42:38

yes XD I just drew for myself why It would only show 1 marker, but it can of course only be 1 marker, because all points intersect there

user-5ca684 27 July, 2017, 11:42:42

I think the 3d marker is just red

user-72b0ef 27 July, 2017, 11:42:46

I feel so stupid XD

user-5ca684 27 July, 2017, 11:42:55

no worries

user-5ca684 27 July, 2017, 11:43:30

I don't think it's stupid; it just takes some time to get used to these concepts

user-5ca684 27 July, 2017, 11:46:03

there is a debug view feature where some 3d data is being visualized

user-72b0ef 27 July, 2017, 11:46:28

Another question: whenever I use 3D calibration, the points don't seem to change in depth. Is this not necessary for 3D calibration?

user-5ca684 27 July, 2017, 11:48:55

the calibration marker(white) will stay on one depth

user-5ca684 27 July, 2017, 11:49:08

the 3d gaze position (red) should change in depth

user-72b0ef 27 July, 2017, 11:49:39

yep, but why does 3D calibration not need the white marker to move across different depths?

user-5ca684 27 July, 2017, 11:50:22

I believe @mpk could explain this much more in depth

user-5ca684 27 July, 2017, 11:51:41

I'm not really familiar with the math behind the scenes of the calibration process

user-72b0ef 27 July, 2017, 11:52:33

hehe, no problem, Im sorry I ask so many questions and so much in depth too. All of this is relevant for our research and understanding how everything works

user-5ca684 27 July, 2017, 11:53:42

totally understandable

user-5ca684 27 July, 2017, 11:53:52

and I ll help where I can

user-72b0ef 27 July, 2017, 11:55:15

Thanks a lot ^^, if there's anything I can help you with also, I would be happy to do something in return

user-5ca684 27 July, 2017, 11:55:37

I'll keep it in mind 😃

user-f7028f 28 July, 2017, 02:59:58

Hi there! I bumped into a strange issue. I constructed a static stand with a fake pupil, which is detected well. I read the non-calibrated eye data, specifically the circle_3d center, and found that the x, y, z values of the circle_3d center change between detection runs (for example when clicking 'reset model'). Example data: 1st model: "center" : [ -21.8446561224384, -33.1030575399491, 110.640810764555 ] 2nd model: "center" : [ -61.7128511468933, -92.7098553737352, 310.398432162575 ] 3rd model: "center" : [ -15.7032193468464, -23.5389137821636, 78.8681217289737 ]

I looked in the debug window to be sure that the pupil circle is always detected in the same area

user-f7028f 28 July, 2017, 04:16:22

Or should I use the "circle_3d normal" data? It looks constant across all 3 of these models.

user-f7028f 28 July, 2017, 04:29:53

probably no, "normal" - is a direction of gaze, not a pupil center coordinate

user-fe23df 28 July, 2017, 07:27:41

@user-5ca684 hi, are EyeCenters3D the world coordinates of the eye centers? I'm wondering because the values that I get are very big and not at all close to where the player is, and their average is also not (0,0,0) (as it should be if my assumptions are correct). Thanks in advance! 😃

user-0ffc2d 28 July, 2017, 09:29:54

Good day guys, I was wondering why I'm not able to export the calibration scene. Second question: is there a way to have the operator buttons directly on the operator screen?

user-5ca684 28 July, 2017, 09:31:07

@user-fe23df The EyeCenters3D values are in the local space of the HMD

user-5ca684 28 July, 2017, 09:31:52

@user-fe23df the reason why you are getting very big numbers is that Pupil data is in mm and Unity is in m I believe

user-5ca684 28 July, 2017, 09:37:09

@user-0ffc2d Good day to you too, are you using Unity? If so do you mean building the scene by exporting calibration scene? Currently it is not planned to move/duplicate buttons onto the scene, but you can use onSceneGUI or UI GameObjects and refer to the same functions

user-0ffc2d 28 July, 2017, 09:57:16

@user-5ca684 Thanks for the answer. Yes, I'm working in Unity3D. I tried to build the scene but nothing happens. I am not confident with coding. Which file has the functions I should refer to?

user-5ca684 28 July, 2017, 10:04:59

@user-0ffc2d well, in most cases, if you're trying to build and nothing happens, there are actually some error messages that would make it clear why the build has failed. With the current version you are unfortunately restricted to in-Editor usage; however, a new release is coming up where some obstacles will be tackled so that you can build your project into a standalone version. Either way, for now you can test your work in Editor mode.

user-5ca684 28 July, 2017, 10:06:00

@user-0ffc2d regarding the buttons, you can write your own .cs file with the sceneGUI code, please see the official documentation: https://docs.unity3d.com/ScriptReference/Editor.OnSceneGUI.html

user-5ca684 28 July, 2017, 10:08:09

@user-0ffc2d alternatively, search on YouTube; there are many tutorials on how to do that. After you have prepared your interface you can trigger the functions you want your buttons to refer to, but in order to really be able to help you I would need to know the exact functionality you want in your scene buttons.

user-0ffc2d 28 July, 2017, 10:17:33

@user-5ca684 thanks! Since we are using different biosensors, each with their own app, our aim is to have a unique operator UI with a complete overview of all bio-parameters and to sync the start of the different recordings and calibrations.

user-fe23df 28 July, 2017, 16:05:38

@user-5ca684 Oh thanks a lot! The mm thing explains everything now :))

user-a1febb 28 July, 2017, 16:33:19

Hi, I am currently looking for a way to access the data from the 3D Fixation detector. Since I am working with Unity and the HMD Eye tracker I need to access this data with some C# code via the IPC backbone. To get the pupil data I just subscribe to the gaze socket with

subscriberSocket.Subscribe("gaze");

If I subscribe to all topics with subscriberSocket.Subscribe(""); and then let it print out the message types, only pupil, gaze and notification messages are received. This makes me think that I need to access the fixation data differently somehow. Maybe I can get the events dictionary somehow? Or is the fixation detector not running properly? I opened it in Pupil Capture and, just to make sure, also load it again in my program with _sendRequestMessage(new Dictionary<string, object> { { "subject", "start_plugin" }, { "name", "Fixation_Detector_3D" } });

Maybe someone can help me figure out how I can access the fixations dictionary?

Thank you and I hope you all have a great evening!

user-5ca684 28 July, 2017, 20:48:28

@user-a1febb Hi, have you tried using .SubscribeToAll()? I'm not sure if "" will subscribe to all topics. You can also try to run the plugin manually from Pupil Capture, but beyond this I don't know if the fixation data is distributed at all. @mpk @wrp any ideas on the subject?
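
(One way to check from your own code whether fixation messages are published at all: subscribe to everything on Pupil's SUB port and log each incoming topic for a few seconds. A minimal NetMQ sketch, assuming you already obtained the SUB port, e.g. via the "SUB_PORT" request to Pupil Remote; an empty-string prefix is standard ZeroMQ for "all topics", which is what SubscribeToAll/SubscribeToAnyTopic wrap in recent NetMQ versions.)

using System;
using NetMQ;
using NetMQ.Sockets;
using UnityEngine;

public static class TopicProbe
{
    // Logs every topic published on Pupil's SUB port for a given number of seconds.
    public static void LogTopics(string host, string subPort, int seconds)
    {
        using (var sub = new SubscriberSocket())
        {
            sub.Connect("tcp://" + host + ":" + subPort);
            sub.Subscribe("");                            // empty prefix = all topics

            DateTime until = DateTime.UtcNow.AddSeconds(seconds);
            while (DateTime.UtcNow < until)
            {
                string topic;
                bool more;
                if (sub.TryReceiveFrameString(TimeSpan.FromMilliseconds(100), out topic, out more))
                {
                    Debug.Log("topic: " + topic);
                    while (more)
                        sub.ReceiveFrameBytes(out more);  // skip the msgpack payload frames
                }
            }
        }
    }
}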

user-5874a4 29 July, 2017, 06:47:37

What is the ideal place to implement the processing logic that uses gaze data? For example, I am implementing raycast logic to trigger an action when a user looks at an object. Presently, I do this in a separate script using the Update() method, directly accessing the Pupil values via Pupil.values.xxx. I've seen in one of the Unity-HMD integration projects that it was implemented inside a lock block (I guess for running on the main thread). Can anyone tell me what's the best way to do this and the differences between the two approaches?

user-5ca684 29 July, 2017, 16:17:37

@user-5874a4 well, if you are running in Update(), that will run on the main thread. In future versions our aim is to handle data gathering on a single thread.

user-5ca684 29 July, 2017, 16:22:33

@user-5874a4 about the lock: it just keeps things in sync between threads; basically the whole code architecture is meant to expose the data so it can be used from the main thread. We based this code on a secondary thread due to the NetMQ behaviour. The point is that a secondary, dedicated thread ensures that no messages are lost and everything is fed into the data library, so I would say running your data access on the main thread pretty much anywhere should be fine.

user-72b0ef 31 July, 2017, 14:46:51

@user-5874a4 I personally still use the lock, but what I did is the following: in a script I listen to all the incoming Pupil packages and store them in a queue via events; then in the same script, in Update() with the lock in it, I call my own method, GenerateTrackingdataForBothEyes(), which dequeues one value from the queue and makes a raycast with that data...
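
(A skeleton of that pattern, as a hedged sketch: the network thread enqueues incoming samples under a lock, and Update() drains the queue on the main thread, where Unity API calls such as Physics.Raycast are safe. How you fill the queue from the plugin's callbacks is up to your own integration.)

using System.Collections.Generic;
using UnityEngine;

public class GazeQueueProcessor : MonoBehaviour
{
    public Transform hmdCamera;

    readonly object _lock = new object();
    readonly Queue<Vector3> _gazeDirections = new Queue<Vector3>();   // world-space directions

    // Called from the receiving (network) thread, e.g. from the plugin's data event.
    public void EnqueueGazeDirection(Vector3 worldDirection)
    {
        lock (_lock) { _gazeDirections.Enqueue(worldDirection); }
    }

    // Drained on the main thread, where Physics.Raycast may be called.
    void Update()
    {
        Vector3 dir;
        while (TryDequeue(out dir))
        {
            RaycastHit hit;
            if (Physics.Raycast(hmdCamera.position, dir, out hit, 20f))
                Debug.Log("Gaze hit: " + hit.collider.name);
        }
    }

    bool TryDequeue(out Vector3 dir)
    {
        lock (_lock)
        {
            if (_gazeDirections.Count > 0) { dir = _gazeDirections.Dequeue(); return true; }
            dir = Vector3.zero;
            return false;
        }
    }
}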

End of July archive