Hi all, how can I access the latest version of the code and docs? It seems that there have been lots of changes lately, but https://github.com/pupil-labs/hmd-eyes/tree/dev indicates the last commit was a month ago
Okay, things are actually working really nicely; I'd just like to switch from auto-running pupil-service to having the service constantly on (for faster development and fewer reloads of the service)
When I auto-run pupil-service everything works perfectly, but if I either A) start the service manually and then start the project in local mode, or B) start the service manually and run the project in remote mode with 127.0.0.1, it doesn't connect
I made sure that Pupil Remote is on, and that the ports match
I've also noticed I don't receive any "Attempting to reach server... trying again in 5 seconds" errors, like I sometimes do on auto-run before the server has loaded; it's as if it's not trying to communicate with it
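As an aside, a quick way to rule out the transport layer is to check whether anything is listening on the Pupil Remote port at all. A minimal sketch in Python (assuming the default 127.0.0.1:50020; the function name is just for illustration, and the port may differ on your setup):

```python
import socket

def pupil_remote_reachable(host="127.0.0.1", port=50020, timeout=2.0):
    """Return True if something is listening on the given host/port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    if pupil_remote_reachable():
        print("Something is listening on 127.0.0.1:50020")
    else:
        print("Nothing is listening on 127.0.0.1:50020; is Pupil Service running?")
```

If this reports that nothing is listening, the service either isn't running or is bound to a different port.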
@user-64e12b 127.0.0.1 doesn't really look right to me as a remote IP address, to be honest
@user-5ca684 I think @user-64e12b is using the remote setting with 127.0.0.1 to be able to run Pupil Service continuously. That should work fine. I'm not sure why it does not.
@user-5ca684 It would be nice to have an auto-run mode that keeps Pupil Service running continuously.
Let's discuss this soon.
@mpk okay!
@mpk So running locally as remote with the local IP? In that case the Service port might be different from the default
@user-5ca684 I think the port would be the same. It's the standard 50020
@mpk I have experienced that it changes when running it locally with Pupil Capture
Only on Capture, and only because the port was already bound by another app (possibly Service or a leftover instance). In this case Capture will use a random other port.
This is so that remote can still load. However, this does not help in the case of our Unity3D plugin. Here we should always use 50020
Pupil Service will not start if it cannot grab that port.
Pupil Capture will simply use a different port.
Killing all instances of Pupil first will allow you to set Pupil Capture's remote port back to 50020
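To illustrate the behavior described above: only one process can bind a given TCP port, which is why a zombie instance holding 50020 forces Capture onto a random port while Service refuses to start. A minimal sketch with plain Python sockets (not Pupil's actual code; `try_bind` is a hypothetical helper):

```python
import socket

def try_bind(port, host="127.0.0.1"):
    """Try to bind a TCP socket to `port`.

    Returns the bound socket on success, or None if the port is already
    taken -- the situation that makes Pupil Service refuse to start and
    Pupil Capture fall back to a random free port instead.
    """
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind((host, port))
        return s
    except OSError:
        s.close()
        return None

first = try_bind(0)                # bind any free port (the "zombie")
port = first.getsockname()[1]
assert try_bind(port) is None      # a second bind on the same port fails
first.close()                      # kill the zombie...
assert try_bind(port) is not None  # ...and the port can be bound again
```

This mirrors the advice above: once the zombie is gone, 50020 is free again and can be set back as the remote port.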
I see
Well, using Capture and setting the port in Unity to 50356 (in my case) has worked
Yes, but ideally we always use 50020. The only reason it's already taken is a zombie Pupil Capture or Service instance lurking around.
So on a clean restart that should never happen?
BTW.: Pupil Capture's remote plugin UI in the sidebar shows the port. ( @user-64e12b )
Yes.
That's interesting.
I mean, a zombie instance must always be generated in that case
Well, once there was a zombie and Pupil Capture used an alternative port, it will remember that port until you reset it or manually set it back to 50020.
I'm going to check the code, in case for some reason it runs more than once
I don't think that's the case. It's just that once there is a double instance it's basically 'screwed'.
But I had a clean restart
I think we should give Pupil Service a bit of a UI and then try to only use that.
A restart does not help. The settings of Pupil Capture were not deleted, or were they?
Ahh okay, that is of course stored even between OS restarts
Pupil Capture will retain the 'wrong' port once it had to use an alternative.
Well, in the case of @user-64e12b, it might be worth checking whether there was ever a zombie instance and the port got derailed from the default, I guess
yes.
@mpk thx for the clarification!
I'm not sure I understood. Was there some resolution for me, or will that be a feature later on?
@user-64e12b Well, one thing for sure: you can check which Service port you have in your Pupil Capture
Hi everyone! I am all of a sudden having trouble with one eye of my pupil tracker. I am using unity_integration_calibration with an Oculus DK2 to calibrate my tracker. I didn't make any changes to my code or mess with the hardware (as far as I know) and I am getting that the position of the right eye is all NaNs. This was not previously happening. Attached is a screenshot of the unity screen. In the upper left hand corner are position values for the left and right eye. Does anyone know what could be causing this behavior? Thank you!
@user-f13d61 the latest version of the Unity plugin is called unity_pupil_plugin in my fork and under my dev branch
try that out please
@user-5ca684 Thanks for the quick response! I can't seem to find "unity_pupil_plugin". I can only find your fork of hmd-eyes, which has the same file names as the original. Would you mind sending me the link to unity_pupil_plugin? Thanks!
Hi all, is it possible to connect normal eye trackers to Unity, or does it have to be the HMD add-ons? I followed the instructions on the GitHub page and opened the calibration scene from the unity_integration_calibration folder, but I keep getting this error.
@user-006924 - for 2d mode this is possible. But for 3d mode you will not be able to see the points displayed without an HMD (or without modifying the positions of the calibration points). Just to check, are you using the dev branch of hmd-eyes? If not, please switch to this branch for now.
@NahalNrz#1253 From a technical standpoint you can use the headset with Unity just fine. The question is what you want to achieve with that. 2d and 3d mode calibration will work, but what targets are you calibrating against? What space do you want to track in?
@user-f13d61 You have to switch to dev branch to see it : https://github.com/kornellvarga/hmd-eyes/tree/dev
@wrp @mpk thanks for your response. I wanted to use the blink data in Unity, which I think the 2D calibration mode would be fine for.
@user-5ca684 You mentioned you're working on a UWP implementation. Have you gotten any further? I noticed UWP support was added to NetMQ recently
@user-006924 If you want to use blink data you don't have to use calibration at all. You can just use the pupil datastream. So this should work just fine in Unity3D
@mpk So all I have to do is open the main scene in Unity integration and stream the blinks via ZMQ to Unity?
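For reference, Pupil publishes each datum under a topic string (e.g. "pupil.0"), and subscribers filter by topic prefix. The blink topic name used below is an assumption and may differ by version; the ZMQ/msgpack transport itself is omitted, so this sketch only shows the prefix-dispatch logic once messages are decoded:

```python
def dispatch(topic, payload, handlers):
    """Route a decoded (topic, payload) message to the first handler
    whose topic prefix matches, mirroring ZMQ's prefix subscriptions."""
    for prefix, handler in handlers.items():
        if topic.startswith(prefix):
            return handler(payload)
    return None  # no subscription matched this topic

blinks = []
handlers = {
    "blinks": blinks.append,                 # assumed blink-plugin topic
    "pupil": lambda p: p.get("confidence"),  # raw pupil datum
}

dispatch("blinks", {"onset": 1.2}, handlers)
conf = dispatch("pupil.0", {"confidence": 0.9}, handlers)
```

In Unity the same idea applies: subscribe to the topic prefix you care about on the SUB socket and route each decoded dictionary to a handler.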
@user-5b7b52 It is on our roadmap, and it is not so far in the future, but it will take some time still. It looks like UWP and HoloLens in Unity don't like multithreading, which means quite fundamental changes. I do not want to promise you an exact date for the UWP-compatible version, but if you want, drop me a PM next week; I might be able to tell you more then
I am trying to do calibration with Pupil Labs on the HTC Vive. Everything loads properly. After I press the play button, I see my pupil being detected, but I don't understand the process of calibration. I press the C key, which brings up a screen with green and red circles constantly flickering, and it just keeps going. How do I know when calibration has ended? And how do I know if it's properly calibrated?
Hi guys, I was following the steps mentioned in the Pupil docs for Windows integration:
I got the following error when I ran run_capture.bat in my python terminal:
File "E:\Pupil\pupil\pupil_src\capture\pupil_detectors\__init__.py", line 19, in <module>
    from .detector_3d import Detector_3D
ImportError: DLL load failed: The specified module could not be found.
any idea why this can occur?
@user-5874a4 - if you are not already, please use the dev branch: https://github.com/pupil-labs/hmd-eyes/tree/dev
@user-4a3b48 unfortunately, building from source on Windows is quite the challenge. This error means that the pupil detector did not build or link properly. The real issue may be related to dependencies required by the pupil detector not being built or linked properly. Without seeing more about your env setup, I would conjecture that Ceres solver is a likely culprit of the errors.
@user-5ca684 Thanks for your reply. I understand and will send you a PM next week. Good luck 😃
@wrp Thanks for the response. I got it all sorted; it was really a silly error. Once I started using the command prompt in administrator mode, everything fell into place. It's all working now.
I've been trying to use the main scene in the unity integration folder. I change lines 103 and 104 in the PupilListener.cs to the number it shows for connect remotely in the pupil remote section of pupil capture and I calibrate it in pupil capture. But I still only get (0,0,0) for pos. Does anyone know where I'm making a mistake?
Hi all, it has been a while since I wrote this, but I didn't see any response, so I will just paste it here: ....... Hi, I installed the new Pupil version 0.9.12 on a different Windows computer (because working with Unity and Pupil at the same time led to serious frame-drops on the Vive...). When I run Pupil Capture, the frame rate of eye tracking does not go above ~65 for one camera and ~55 for the other, even if only one camera is recording. It's a strong computer, but it doesn't have USB3. The CPU usage is 60% at worst, and no more than 50% memory usage. I looked at the USB bandwidth used by the cameras, and it is fixed at 5% each, even if only one is recording. Is there anything I can do, or will it just not work with USB2? Did anyone else face this problem before? ....... If anyone has seen this issue, or can tell me what I can do about it, that would be great. Thanks.
Which kind of camera does the Vive add-on use? Is this some kind of custom-built camera, and are there any alternatives for it?
@teet#0724 Yes, these are custom-made cameras.
@user-1486c3 this is unusual. We run Pupil fine on windows machines with USB2.0.
Have you tried running at qvga?
@mpk Could you please expand a little bit more on how I could use Pupil's data stream to get blink data or gaze data in Unity? I've been trying for a couple of days and I can't get any data in Unity.
Yes, this was done at the lowest resolution possible. At higher resolutions the rate drops even more.
Is there something else that might cause this drop?
Hello @mpk! I would like to ask you for an explanation of what the exact eye data means after 3D calibration for an HMD (I use the HTC Vive for my experiments).
1) I got this data: "eye_centers_3d" : { 0 : [ 28.118888408036, 2.172403077987, 6.19393017619268 ], 1 : [ -31.6994144654337, 8.8729780375715, 11.2807614019783 ] } What do these numbers mean? Note: the eye camera's sensor setting is 640x480, 120 FPS
2) I know the resolution of the scene that I am looking at on the HTC Vive screens (1512x1680)
3) My goal is to determine both eyes' coordinates (x, y, z) relative to the scene in the HMD. In other words, where is the eye positioned in front of the Vive's screen?
Hope I formulated the questions clearly. Thanks in advance!
@user-f7028f can this help you a little bit? https://github.com/pupil-labs/pupil-docs/blob/master/user-docs/data-format.md
I have a problem getting the data from Pupil in hmd_integration_calibration. If I try to access the inner objects from "Pupil.PupilData3D data", like "data.ellipse.angle", I get 0, but the print of the received objects is not 0. Is there an issue with parsing from MessagePackObject?
@user-8df5ab Inside Unity we parse the messages. I'm not sure if all objects get populated. @user-5ca684 can you give some info regarding this?
@user-8df5ab What @mpk said is pretty much true. There is a BaseData structure that I believe is being populated, and the rest is stored in a dictionary, for which there are methods that make it a bit easier to access. By the way, which topic's data are you looking at?
@user-5ca684 Currently I'm looking to access projected_sphere_center_x/y in the function OnPacket(Pupil.PupilData3D data). (Pupil Capture 3d mode)
@0101010#0669 Hmm, that actually suggests to me that it's not the latest version
@0101010#0669 so what I said about the dictionary is probably not correct in the version that you are using.
Can the hmd-eyes software use some other camera with a lower frame rate? For example the HD6000 that is used in the Pupil Labs DIY kit.
@user-5ca684 I'm working with the latest version. Is your content in projected_sphere != 0?
@user-be88c1 the latest version does not have OnPacket method
Can you send me a link to the Git repo with the latest?
@user-be88c1 https://github.com/kornellvarga/hmd-eyes/tree/dev
@user-5ca684 Looks really good. Thank you. Didn't see this.
I have a question about the circle_3d normal. I know that in Unity they use the z axis as "forward", the x axis as "side", and the y axis as up. What exactly are the x, y, z in the circle_3d normal? I understand they are in head coordinates, but in order to transfer them to world coordinates I need to know which direction is which.
@0101010#0669 if you have any question/suggestion, feel free to PM me
Can the hmd-eyes software use some other camera with a lower frame rate? For example the HD6000 that is used in the Pupil Labs DIY kit.
@peep#6789 yes this should work.
@mpk Can I ask you about the accuracy of some eye tracking data? I would like to know: if I don't walk through the calibration process, will the pupil's coordinates be accurate enough to transform them to the HMD's world camera coordinates? I'm talking about the pupil, not the gaze. And a second question. Suppose I have this data before calibration: ... "circle_3d" : { "radius" : 1.58172494643777, "center" : [ -2.1068308167175, 0.113884674658084, 7.6472956277242 ], "normal" : [ -0.200826704858128, -0.592004067165888, -0.780512536142036 ] } ... 1) Are these values in the eye tracking camera's local coordinate system? Where is the origin of that system: the center of view, or something like the bottom-left of the camera surface (for y, x and z respectively)? 2) How does this coordinate system correlate with the Vive's Main Camera coordinate system?
habolog, I am trying to do something similar to what you are doing. I tried to use Physics.Raycast to get where the eyes hit the walls of my room, using the Camera (head).position as the origin of the ray and the eye direction Vector3 as the direction. I calculate it like this: EyeDirection = EyeDirectionLocal.x * head.right * (-_pupilData.id) + EyeDirectionLocal.y * head.up * (-_pupilData.id) + -EyeDirectionLocal.z * head.forward, where EyeDirectionLocal is taken as _pupilData.circle_3d.normal. So far it gives not-so-bad results, but I have a major problem I can't work around yet, and if anyone here knows how to solve it I would appreciate it very much: I get an exception that Physics.Raycast can only be called from the main thread, which means I cannot use it for every pupil measurement, only 90 measurements per second. If anyone knows how to run such a method inside the pupil thread, it would help a lot. I could do it offline after the measurement has ended, but since Unity gives you this awesome raycast feature, it would be a shame not to use it. Plus, if any of the Pupil staff can verify that the eye direction formula is correct, that would be very helpful. Thank you.
Of course, (-_pupilData.id) should be (-1)^(_pupilData.id). For some reason, using this method I get a very strong bias toward up (the y axis)...
Is it possible that the circle_3d z axis points towards the camera, or should this be calibrated to be the center of the screen?
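For what it's worth, the formula from the messages above can be written as a small helper. This is only a sketch of the poster's own (unverified) transform: it assumes head.right/up/forward are orthonormal world-space basis vectors, and the per-eye sign flip and the z negation are taken from the chat, not from Pupil documentation:

```python
def to_world(local, right, up, forward, eye_id):
    """Map an eye-local normal (x, y, z) into world space.

    `right`, `up`, `forward` are the head's basis vectors in world space.
    x and y are mirrored per eye via (-1)**eye_id, and z is negated, as in
    the formula discussed above (assumed, not verified against Pupil).
    """
    sign = (-1) ** eye_id
    return tuple(
        sign * local[0] * r + sign * local[1] * u - local[2] * f
        for r, u, f in zip(right, up, forward)
    )
```

With an identity head basis and eye id 0, a local normal of (0.1, 0.2, -0.9) maps to (0.1, 0.2, 0.9); with eye id 1 it maps to (-0.1, -0.2, 0.9).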
Heyho! Can I ask a question regarding the Unity plugin? :) Is the pupil diameter accessible, and if so, how?
@user-1486c3 If you calibrate in 3D you can get the eye normals, and that way you would not have to do any further calculation with raycasting. But if you do want to use the eye normals together with raycasting, you can just do it from the main thread using the published pupil data. Are you trying to do this in Unity?
@user-8d5f72 I believe if you are using the latest version of the Unity plugin, you can access the exposed dictionary that should contain all the data
@user-8d5f72 As far as I know the pupil diameter is in the BaseData, which should already be exposed too
@user-1486c3 One more thing: there is a task scheduler included in case you want to run something on the main thread from the secondary thread, or vice versa. Plus, we are working full throttle on a single-threaded version, but I cannot promise any release date yet
@user-5ca684 Thanks for the answer. I am using the eye normals, but are they given in the 3d virtual space or in the eye camera coordinates? Can you please explain what you meant by "no further calculations with raycasting are needed"? Can I use the normals to get the specific location of what the eyes are looking at in the virtual world? About the task scheduler: it might be what I need. Will it be possible to send a few actions to the main thread that will occur on the next update of some class? Can you please give me more details on how it is implemented in the plugin?
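On the main-thread question: the usual pattern (and presumably what the plugin's task scheduler does; this is an assumed design, not its actual implementation) is to queue actions from the data thread and drain the queue on the next Update() on the main thread. A minimal Python sketch of the idea:

```python
import queue
import threading

class MainThreadDispatcher:
    """Assumed design sketch: queue work from any thread, run it on the
    main thread once per frame. Not the plugin's actual implementation."""

    def __init__(self):
        self._tasks = queue.Queue()

    def post(self, fn, *args):
        """Called from any thread: schedule fn to run on the main thread."""
        self._tasks.put((fn, args))

    def drain(self):
        """Called once per Update() on the main thread: run queued tasks."""
        while True:
            try:
                fn, args = self._tasks.get_nowait()
            except queue.Empty:
                break
            fn(*args)

dispatcher = MainThreadDispatcher()
hits = []

# A worker thread stands in for the pupil-data thread posting work.
t = threading.Thread(target=lambda: dispatcher.post(hits.append, "raycast"))
t.start()
t.join()

dispatcher.drain()  # on the main thread, e.g. inside Update()
```

Applied to the raycast problem above: the pupil thread posts the ray parameters, and the main thread performs Physics.Raycast for whatever was queued since the last frame.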