Hello Pupil Labs folks, if anyone has any tips/tricks for troubleshooting Pupil 1.2.7 on the HTC Vive, would you kindly assist? The Pupil app doesn't seem to be sending data to Unity; I'm not receiving anything on the Unity side. Thanks.
@user-de97ec To start, I would check to make sure that the port listed in Unity matches the port listed at the very bottom of the world scene camera settings in pupil capture.
@user-de97ec and sorry, but I'm not at a machine where I can open the programs and give you more specific advice on the location of the port numbers.
The port numbers are the same, and the unity UI says that is connected, however, I receive no data.
@user-de97ec and you're trying for a 2d or 3d calibration?
2d calibration
Are the eye images ok?
If you have reasonable eye images, and the ports are matching in capture and unity, then my last guess is that unity is not pointed to the correct pupil_capture.exe.
That's near the port listing. Also, let it start the server automatically (the related button is somewhere above the port listing). That's all I've got 😃
Thanks for your help. The eye images are pretty good, and they’re being tracked pretty well, and I have tried all of those troubleshooting tips, to no avail.
K. Come back tomorrow. It's a bit late in the day, and folks may be asleep. Many of the contributors are on German time.
Is there an easy way to go back to specifying specific 2d calibration points instead of setting a radius and number of points? I would like to manually set points but running into some trouble.
Hello, I'm having trouble with fetching confidence values in a unity script. It's possible I'm doing something quite wrong, but as of right now, I am calling PupilData.Confidence(PupilData.GazeSource.RightEye) and then the same for the left eye. Confidence values are printing to the log as 0, however in Pupil Capture id0 conf and id1 conf both have valid floats that fluctuate correctly with blinking.
I'm on the previous version of pupil. We had to roll back since we were having issues with the latest version.
@StayFroztee#6132 is the confidence level in Pupil Capture above 0.6? Otherwise, this would result in the method receiving Pupil data ignoring the current packet. You can find this in code in Connection.cs, starting at line 167. This would also be a good position for a breakpoint, to see whether you receive any data at all. Just to make sure of some bits and bobs:
* is the port in both Pupil Capture and Unity set to the same number?
* do the existing demo scenes (e.g. Calibration) run through fine?
* have you subscribed to the "gaze" topic in your custom scene/code?
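For reference, the subscribe-and-filter flow the Unity plugin implements can be sketched in Python as well. This is a hedged sketch, assuming Pupil Remote is running on its default port 50020; the helper names (`is_usable`, `gaze_stream`) are mine, not part of any Pupil API.

```python
import zmq
import msgpack

CONFIDENCE_THRESHOLD = 0.6  # packets below this are ignored, as in Connection.cs

def is_usable(datum):
    """True if a gaze datum clears the confidence threshold."""
    return datum.get("confidence", 0.0) >= CONFIDENCE_THRESHOLD

def gaze_stream(host="127.0.0.1", remote_port=50020):
    """Ask Pupil Remote for the SUB port, subscribe to "gaze",
    and yield only usable data. Requires a running Pupil Capture."""
    ctx = zmq.Context.instance()
    remote = ctx.socket(zmq.REQ)
    remote.connect(f"tcp://{host}:{remote_port}")
    remote.send_string("SUB_PORT")
    sub_port = remote.recv_string()
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{host}:{sub_port}")
    sub.subscribe("gaze")
    while True:
        topic, payload = sub.recv_multipart()
        datum = msgpack.unpackb(payload, raw=False)
        if is_usable(datum):
            yield datum

# The threshold check itself needs no live connection:
assert not is_usable({"topic": "gaze", "confidence": 0.3})
```

Putting a breakpoint (or a print) right after `recv_multipart` is the Python equivalent of the Connection.cs breakpoint suggested above.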
@user-90aa66 you are probably already modifying Calibration.UpdateCalibrationPoint(), correct? In case of 2d calibration (PupilTools.CalibrationMode == Mode._2D), currentCalibrationPointPosition needs to be set to your custom 2d viewport position, so in the simplest case float[]{ 0.5f, 0.5f }. This would result in a calibration point in the middle of the screen. But what also needs to be set is a corresponding z-depth, at which the PupilMarker gameobject is placed. UpdateCalibrationPoint calls this in line 83:

```csharp
Marker.UpdatePosition (currentCalibrationPointPosition);

public void UpdatePosition (float[] newPosition)
{
    if (PupilTools.CalibrationMode == Calibration.Mode._2D)
    {
        if (newPosition.Length == 2)
        {
            position.x = newPosition [0];
            position.y = newPosition [1];
            position.z = PupilTools.CalibrationType.vectorDepthRadiusScale [0].x;
            gameObject.transform.position = camera.ViewportToWorldPoint (position);
        }
        else
        {
            Debug.Log ("Length of new position array does not match 2D mode");
        }
    }
    [..]
}
```

PupilTools.CalibrationType.vectorDepthRadiusScale (used in line 86) is what you probably modified or deleted to instead use your custom implementation. As a first step, you can just set position.z to "2", which is the value we currently use, as well.
Hello there, we recently purchased your HTC Vive add-on to use in our research. Unfortunately, we are having some trouble using it. The eye position in the 3D method, which is indicated by a red circle, does not work properly. It tracks things other than the pupil (which is what it's supposed to track), such as reflections from glasses, and it also shakes too much. Do you think there is a problem with the hardware itself, like the USB connector or the tracker? Or is there any other way we can resolve this problem? Btw, we are using the latest version of Pupil Service (v1.2.7).
Hi @user-ebf24b Are you using the system while wearing eyeglasses within the HTC Vive?
yes, but we also tried without glasses too.
Could you send us some sample eye videos or screenshots?
Based on what you wrote, this is not hardware related but likely more about the setup (e.g. depth settings of the HTC Vive headset) and achieving a good view of the eye. We will be able to say more once we see an eye video
You can send an eye video to me via direct message or to info@pupil-labs.com
Ok. I will send you some images as soon as possible. Maybe it takes around 10 minutes. Thank you for your rapid reply.
No problem. A short eye video (you can record with Pupil Capture) would actually be most helpful if possible for you
just sent a mail with the movie clip! thanks in advance!!
thanks - received
@user-ebf24b the eye cameras look out of focus
You can adjust the focus of the eye cameras by turning the lens
umm.... how exactly am I supposed to turn the lens?
Re hardware - i can see all IR leds are active and frame rate looks good from your screencap
You can adjust the focal depth of the lens by turning the lens of the eye camera with your fingertips
thanks a bunch @wrp!! I felt so stupid for such a simple solution lol. That handled the whole problem
@user-ebf24b I'm pleased that there was a quick and simple solution!
oh, and one more thing. According to your documentation, the datum structure contains 'phi' and 'theta'. What exactly are they based on, meaning what do they represent and relative to what convention? And is the unit radians?
Guys, can you please help with how to get the 3d gaze position in real time using Unity3D? I just want to move an object depending on the gaze position
I setup the hmd-eyes plug-in, but I could not access the gaze position data by calling the given function
in the readme file
Actually I have checked MarketScene Demo and 3d calibration script with it. But still no idea how to create a new scene and move an object accordingly
We have your Vive hardware and I think everything overall is set up right. There might be some tweaks I still have to make, but one of the things I wanted to test was recording the eyes while in your Unity project, as well as the Unity world itself. I set up Unity to point to Pupil Capture; however, when I run the 2d calibration scene and then go to the global Pupil window to select record, it says that I do not have a global camera. I believe this is because the Unity window is not in focus, but for me to record the eyes I need to hit record there... correct? I was able to record my eyes while using "test image" in Capture, but can't seem to get this working with the Unity project. Any ideas?
@user-caf7b9 you actually already fixed the issue, yourself, by setting the Backend Manager to "Test Image" (and clicking the "Activate Fake Capture" button) in Pupil Capture. In contrast to the normal headsets, the Pupil VR setup does not include the "world capture" camera. So with standard settings you get the message you saw. With the change to "Fake Capture" of the world camera, the eyes should now record as intended. You can continue to start this with the "R" button in Pupil Capture, but there is also an option to do this from Unity: Either use the "Recording" button in the Inspector GUI for PupilGazeTracker or press the "R" key. The plugin will then tell Pupil Capture to start the recording and also start a screen capture of the current Unity rendering. The newest version available on GitHub also generates a CSV file containing the gaze tracking positions and their corresponding Unity world coordinates.
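As a side note, the recording trigger that the plugin sends to Pupil Capture can also be sent from any script via Pupil Remote. A hedged sketch in Python, assuming the default Pupil Remote port 50020; the helper names are mine, only the "R"/"r" command strings are part of Pupil Remote:

```python
import zmq

def recording_command(start, session_name=None):
    """Build the Pupil Remote command string: 'R' starts a recording
    ('R <name>' names the session), 'r' stops it."""
    if not start:
        return "r"
    return f"R {session_name}" if session_name else "R"

def send_remote_command(cmd, host="127.0.0.1", port=50020):
    """Send one command to Pupil Remote and return the reply.
    Requires a running Pupil Capture instance."""
    ctx = zmq.Context.instance()
    socket = ctx.socket(zmq.REQ)
    socket.connect(f"tcp://{host}:{port}")
    socket.send_string(cmd)
    return socket.recv_string()

# With Capture running, usage would look like:
# send_remote_command(recording_command(start=True, session_name="vr_test"))
# send_remote_command(recording_command(start=False))
```

This is the same mechanism the Unity plugin's "Recording" button rides on, just driven from Python instead.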
@user-e04f56 Thanks for the quick reply! This is awesome news. I was concerned that recording using that would record the test image as the world camera as well. I'm heading into the office now and going to be playing around with that more today. Thanks again!
guys maybe
it bothers you
but I just want to learn how to create a custom scene and move an object according to 3d gaze data
thats all
I will be very glad if you help
I realized I had another question. In the documentation it states: Once you plug the usb cables into your computer:
the right eye camera will show up with the name: Pupil Cam 1 ID0 the left eye camera will show up with the name: Pupil Cam 1 ID1
Mine are reversed. I just want to know if this is going to cause any problems from a functionality standpoint. My left eye is ID0 and my right eye is ID1. I took the vive pupil addon hardware out to see if there was any way of adjusting or switching these, but that didn't seem to be the case.
Also sometimes when I am recording the eyes and world in-game, in the Vive headset it will slowly fade between the market game and the blue screen that shows when the unity window is not in focus. Do you know why this is happening? Is it a resource usage issue? I am currently running this all on my Sager laptop which has 8 core [email removed] 16GB of RAM and an Nvidia Geforce GTX980M.
One other thing...I was just looking for some advice since you guys work with this all the time. I seem to have large eyelashes and couldn't figure out why the vive eye tracking wasn't working well on me but was fine on others. Last night when I was playing around I noticed that when I opened my eyes wide, the accuracy bars for each eye went green which they seem to normally run red with me. Do you have any advice? Have you guys found any tweaks that assist with this? I did find that making sure the focus was as clear as I could on my iris helped immensely, but just wanted to hear your thoughts.
-- Edit-- Finally, is there a "recommended" way of calibrating the Vive? Accuracy is really important to us and I'm trying to minimize the jitter when looking in the market. I have gotten the eyes clearly in focus and seem to be pretty perfect visually, but in the Unity market it jumps around a bit and the spheres for the 2D calibration in the market seem to stay separated more than they are close together.
Thanks!
@uguryekta#1497 we are still testing the Unity plugin with the recent Pupil release version 1.3.9, which came out last Friday. 3D calibration is one of the areas actively being worked on so there is currently no simple solution to point you towards to get you started
@DanK#9345 I will have to check with my colleagues regarding the reversed eye IDs. As for the recording issue: the "out of focus" behaviour you are describing is connected to the added resources needed for the screen recording in Unity. But it is usually only visible once (if at all) when starting the recording. Is this constantly happening in your case? If you do not need the video of the Unity scene, that could be deactivated in code to save some resources. Otherwise, a build version is more efficient than running the scene in the Unity Editor. Regarding the eyelashes: the "General Settings" menu of an eye window offers a mode called ROI (region of interest). When you (or rather a second person) select it while you are wearing the headset, you can adjust the area in which the pupil recognition is done. I hope this might help with this issue. Please tell us if this maybe also helps with the accuracy in the 2D market scene you mention.
@DanK#9345 here is a screenshot of the eye windows with an adapted ROI
Hi! I'm using the binocular Add-On for the HTC Vive combined with Unity for my thesis. I played around a little bit and now I want to "manipulate" the 3D gaze point returned by your program (I get all data via a python script). Is there a way to do this and if so what is the best way to do it?
@user-2bd6d4 just to make sure: Are you using Unity to communicate with Pupil or are you using the python reference scripts?
ATM: I'm using Unity to display the Calibration (your HMD Calibration within your software). I use a python script where i subbed to the IPC to get the gazedata and gray images in real time.
@ViperZ#7643 I wrote you a DM, so we can go into the specifics of what you want to do
Hi, I am using unity_pupil_plugin_vr with the FOVE VR headset. I installed pupil_service_windows_x64_v1.3.9 to this path: C:\Program Files\pupil_service_windows_x64_v1.3.9. I opened pupil_service.exe and the pupil_plugin/calibration scene in Unity, but I can't run the scene as there is an error saying: "Assets/pupil_plugin/Scripts/Pupil/PupilGazeTracker.cs(258,46): error CS0234: The type or namespace name 'XR' does not exist in the namespace 'UnityEngine'. Are you missing an assembly reference?" Anyone have an idea what I am missing? Thank you 😃
@Fjorda#6564 this error is most likely related to the Unity version you are using. Prior to the 2017.x iteration, the XR namespace was called VR. After Unity extended it for Mixed Reality, it got renamed. Either update to the newest version of Unity or search for all instances of XR and replace them with VR (as far as I know, there are currently no incompatibilities). If you are bound to Unity 5.x due to licences, the latter is an option, but you will have to repeat this step every time you update to the newest git version.
@Fjorda#6564 please also note that our plugin is not made to work with FOVE.
Hi, thank you for your reply. I changed them to VR and now I am having this error: Assets/pupil_plugin/Editor/SphericalVideoPlayerEditor.cs(6,22): error CS0246: The type or namespace name `SphericalVideoPlayer' could not be found. Are you missing an assembly reference?
I hope it can work with the FOVE VR headset. I want to get users' pupil sizes and be able to make some heatmaps
My lab purchased a Vive binocular tracker a while back, and the hmd-eyes plugin back then was somehow out-of-sync with the pupil service codebase. It looks like the current versions now work well enough to complete the calibration procedure, but I'm getting pretty bad tracking for both 2D and 3D calibrations. Using the Market scenes and the Calibration scene. Any advice on improving things / debugging this? I have a little time to dig into code if needed.
Hi! Is the calibration of the Pupil VR plugin accurate? I am using the latest pupil_plugin and Pupil Capture (v1.4.1). Even if the calibration is successful, the tracking is often very unstable. Is this level of instability expected? I recorded the unstable behaviour: https://drive.google.com/file/d/1Yk1PCYHDEHjaDJa5ZcjMw8ULrCjOkrRq/view?usp=sharing
Can anyone tell me if the binocular addons for Vive/Rift provide enough precision for performing eye tracking for foveated rendering? The addon isn't cheap so would like to know if it is worth investing in. If anyone has done it before any links and/or videos would be appreciated.
@user-adb91e it looks like @papr responded to your points in the core channel already.
not completely
https://varjo.com/developers/ Check the specs of the headset. Only 100Hz cameras; recommended Processor: Intel® Core™ i7-6700 or AMD FX™ 9590, equivalent or better; recommended Graphics: NVIDIA GeForce® GTX 1080 or AMD Radeon™ RX Vega, equivalent or better. This headset prototype performs eye tracking exactly for foveated rendering. I just need to know if similar tracking accuracy and latency can be achieved with Pupil.
Hi everyone, I know it's very basic but I can't get my HTC Vive with the Pupil Labs add-on to work in Pupil Capture. It shows all capture sources as unknown. Does anyone know how I can fix this issue?
@user-006924 this seems like a driver issue
Uninstall all pupil cam drivers from device manager including hidden devices. Restart the computer. Start pupil capture as admin
Drivers should install when you open both eye windows
I'll do that, thank you. What should I do if the drivers aren't detected at all, so there is nothing for me to uninstall?
I should do this in the device manager? right?
Because previously with normal eye tracker my drivers wouldn't install at all so I had to use zadig to install everything on libusbk manually
You should not need to do manual install
Or need to use zadig
Thank you I'll try it .
I checked the device manager and the only new device was the HTC Vive under cameras. Under what category should the eye cameras be if they load automatically?
libusbK
what should I do if there isn't anything under libusbK?
@user-006924 Can you follow the instructions on the following link and tell us what you get from USB Device Viewer? https://www.techrepublic.com/article/tracking-down-usb-devices-in-windows-10-with-microsofts-usb-device-viewer/
USB Device Viewer with Vive + Pupil cameras
The vive add-on is 2 cameras, so the connector has a built in "Generic USB Hub" attached on my Port5, and it has ports for the cameras. (My Port6, by the way, is the Vive). If you can't see the Pupil Cams, then try to see if you can identify whether or not the Hub is even being recognized by unplugging it, refreshing device viewer, then replugging, and refreshing device viewer, looking to see what has changed between cycling the plug.
I moved the cameras to a port that didn't recognize it and this is what I see in USB Device Viewer.
After I started pupil_capture as admin, then opened the individual eye windows, the hub + cameras appeared, mind you they were placed on a different port because they are USB 2.0, not 3.0.
Thank you for the detailed guide I'll check it out and see what I get.
I downloaded the software and started pupil capture as administrator and can't see the pupil lab cameras anywhere.
@user-006924 please uninstall the drivers from the cameras category, unplug the add-on, and restart the computer. Then start up the computer, run Pupil Capture as admin, and connect the add-on again. If you want, we can schedule a time for remote support - you can email me at info@pupil-labs.com
Hi, I'm trying to use the "unity_pupil_plugin_vr" with my HTC Vive and I think I'm doing something wrong because it's almost working, but not enough to be usable. First, is there a way to make Capture remember my config? Every time I hit play, I have to reconfigure the 'Absolute Exposure Time' for each eye, and it's a bit frustrating. Thank you for your help
I checked again and there are no Pupil Labs cameras under the camera category, only the HTC Vive and the computer's own camera, and I'm worried uninstalling those might mess things up since it's not my personal system. Is there a way for me to check whether it's a hardware issue with the cameras or not?
@user-006924 are the cables plugged in fully?
The Vive cables? Yes, since the Vive is working fine separately.
Another question: is it normal that I have to push 'Absolute Exposure Time' to 127 to get an image bright enough for the algorithm to work?
I think I have a problem with the configuration of Capture. I've made a video of the pupil detection during the calibration phase and it seems to work (the pupil seems to be correctly detected), but when calibration ends, the green and blue dots are far apart and the red dot isn't on what I'm looking at. Here is the video: https://we.tl/UwId1P6qtI Calibration starts around second 15. Thanks for any help.
@user-006924 Do you know what's on Port9 (WinUsb Device)? Can you try uninstalling anything Tpcast-related?
@user-5d12b0 I'll switch to my own system and check the whole thing and see if I see any difference.
That's probably a good idea. I have a lot of experience helping PSMoveService users try to get the PS Eye working as a libusb camera and there are so many little variations in system configuration that seem innocuous but can get in the way.
@Djinn#0083 the video looks good, now I am wondering about the placement of the calibration points in Unity. with your VR setup (mostly: the distance of the display), would you say the positioning of the calibration points is more towards the edge of the visible area or towards the center? I suspect the former. Could you try to adapt the value by selecting "pupil_plugin\Resources\PupilSettings" and then in Inspector "Calibration\Calibration Type 2D\Vector Depth Radius". The "Y" value determines the radius of the circle pattern the calibration points follow. Please experiment with this value.
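To illustrate the radius parameter, here is a rough Python sketch of how a radius and point count translate into viewport-space calibration targets. This is my own approximation of the circle pattern; whether the plugin also includes a center point, and in which order it visits the targets, are assumptions.

```python
import math

def calibration_points(radius, n_points, center=(0.5, 0.5)):
    """Viewport-space (0..1) targets on a circle around the center,
    plus the center point itself: a sketch of the radius/points scheme
    used for 2D calibration."""
    points = [center]
    for i in range(n_points):
        angle = 2 * math.pi * i / n_points
        points.append((center[0] + radius * math.cos(angle),
                       center[1] + radius * math.sin(angle)))
    return points

# A larger radius pushes the ring of targets toward the edge of view:
pts = calibration_points(radius=0.08, n_points=8)
print(pts)
```

Growing the "Y" (radius) value in Vector Depth Radius corresponds to growing `radius` here, which is why the points drift toward the edge of the visible area.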
Thanks for the answer. The calibration points seem to be positioned midway between the center and the edge of the visible area. What would the correct position be?
Hello I need some help with the pupil-labs example for hololens/unity
the example can be built, but when trying to deploy it to hololens using visual studio it fails with code 1
if I try any other unity/hololens from MS it works
just the example from pupil-labs fails
what is the problem when calibration is successful but the marker is jumping around randomly in the market scene demo?
can anybody tell me the stable versions of pupil and the hmd-eyes project for running the market scene demo successfully?
what is the market-scene-demo?
i just know the shark-demo
its under the hmd-eyes project , there is a market where u can move your pupil and a marker should be adjusted on it
which version you are using for shark demo?
pupil 1.4 seems to work...trying it now
i am new to pupillabs
i do not see that
maybe you have the wrong version of the hmd-eyes project?
could you please send me the link of git to download the project
ok
i am here: https://github.com/pupil-labs/hmd-eyes
just downloaded the zip
ah
i understand now
you are on the VR demo
yes
i am on the Hololens demo
😃
sorry
is it working fine for you? is it tracked properly after calibration?
well that i cannot tell you yet
i've built
deployed to hololens
it started
then i discovered that i needed to adjust the remote eye ip address
recompile again, deploy again
😃
let me see
have you tried the VR demo?
well it works on the hololens
just did the calibration
i wil retry again
no i didn't try the vr demos
i am really focused on the Hololens part for now
ok
ok...
my hololens is without battery now
i will try the vr demos
😃
please and let me know if it works
the problem you have is that the end calibration result is not good? or you cannot build/execute at all
?
after calibration ended, the marker is not in proper place where i am looking for
it is moving randomly
and your eye gaze can follow it accurately?
which project?
basically sometimes markers are getting invisible when i gaze to a location
which scene?
hmd-eyes 3d market scene demo
ok
i will try
ok
well mixed feelings here
1st attempt, i had the same result as you
jumping eye gaze
2nd attempt looked more stable but not good
3rd attempt calibrations keeps failing
same here but i have no idea how to fix that
do you know how to show a window for each camera
?
i've closed the cameras windows now i cannot see them
yeah, run the pupil capture exe and in that window you will get the 'detect eye' radio button options
just checked that on
in which options menu is that?
go to general settings after running the pupil capture exe
ok thank you!
the capture app just crashes when i have the unity app running + activate the 2 cameras
it seems to be a problem with the fps in the cameras
unity app opens the cameras by default
and then everything is slow in those windows
@user-11dbde @user-e04f56 should be online in about 12hrs. He can help you debug.
thank you
i've found out the problem
the path for the project was too long on windows
moved the project to a shorter path
and voila
it deploys
Good to know!
nevertheless i am a bit afraid that my pupil-labs hardware might have a defect
or i do not know yet how to use it 😃
@user-e04f56 can help you with that.
We figured it out :-)
it seems that when i move my head around the cameras stop receiving information....i mean this does not seem to be a delay only
ok
what is your time ?
here is 8PM
@user-11dbde some of the Pupil Labs team is UTC+1/2 and some is UTC+7
ok
I am having problems with eye detection in Unity examples
is @user-e04f56 already around ?
I am, how can I help
hello, thank you for the fast reply
We've bought a pupil-labs for Hololens
and I could not use the unity examples yet properly
it seems that sometimes the data speed is so slow from the hardware to the capture app
that no proper detection is made
i've changed usb ports...
it's almost like some defect in the hardware
not sure
have you tried this using Holographic Emulation, as well?
with this you could rule out that the data used over UDP is too much for your network setup
ah i see
it makes sense to me now
i completely forgot that udp data is send around
because it seems to only happen when i start the apps with unity you are right
what should i try exactly?
I also have another question
there is a good documentation available for what to do in Unity: https://docs.unity3d.com/550/Documentation/Manual/windowsholographic-emulation.html
should i use resolution: 320x240? or 1920x1080?
what is the rationale for this?
I think I am the wrong person to ask about this. As the Unity plugin developer, I usually keep the standard Pupil settings, which, in this case, would be 320x240.
but what exactly should i try
can i use your unity examples with holographic emulation?
yes
the HoloLens project works with Holographic Emulation
your ui asks to do a double click in mid air to show the menu
how do i do that in emulation mode?
tap gestures are emulated, as well
I used a game controller, but from what I read the space bar should work as well
let me try that one
nope
but the data from the eye tracker has to be sent by network anyway for the example to work right?
the network communication is all happening on the PC, still via UDP, yes, but much more reliable
ok i understand
now
just having difficulties in controlling the ui example
i do not have a game controller
i can start the calibration
but i cannot choose the option for eye tracking detection
left
option
i press m for menu
but i cannot select the left option
connect to pupil
I will send you a DM so we can go into details
thank you
@user-e04f56 Hi, I'm trying to get the pupil demo working in my test dummy environment, but I can't manage to make things work. Do you have some reading material to help me?
@user-a49a87 so you have a new project and would like to add the Pupil plugin to it?
Yes.
When I tested Unity at the beginning, I followed this tutorial: https://www.raywenderlich.com/149239/htc-vive-tutorial-unity And now, I try to import the Pupil plugin in this project
and it is throwing error messages after you imported it? I will try it on my end, as well, to see what might be going wrong
I don't have much experience with Unity. I don't know exactly what to export from the pupil demo project, or if I should import one of the already exported packages. But there are two of them:
- one in unity_pupil_plugin_vr (but I think I got many errors with this one)
- another one in unity_pupil_plugin (I think I don't get errors with this one, but I don't know if I must create an empty scene with this and load the other scene with the 'Game object to enable' field of the pupil demo manager)
okay, I just imported the current Pupil Import Package ( https://github.com/pupil-labs/hmd-eyes/raw/master/unity_pupil_plugin_vr/Pupil%20Import%20Package.unitypackage ) and was able to run the Calibration scene
did you already import one of the packages you mentioned?
if so, you might have to delete the old pupil_plugin folder from the Wenderlich project, first
Ok, Import went fine, just some warnings
I dragged and dropped the Pupil demo manager in the Hierarchy, I launched Pupil Capture and the two small eyes in the PupilGazeTracker inspector became green. So I guess that the communication between Unity and capture is working.
OK, I think that was not the right prefab to use. I dragged and dropped 'Calibration Scene' and deactivated all the others from the Wenderlich project. Then, when I press play, I get the 'connecting to pupil' screen and my two eyes appear underneath, but the screen stays on 'connecting to pupil'. Nevertheless, if I press C, I get the calibration running with the small white dots... then nothing
hello everyone! i'm playing around with implementing a client for interfacing with pupil capture, however when sending over dummy data, pupil capture replies with "Calibration failed for both eyes. No data found", same with the python reference client. is this intended?
@user-bd0840 is your timestamp in sync with Pupil Capture / Service ?
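For anyone hitting the same wall: reference data timestamps have to live on Pupil's clock. A hedged Python sketch of estimating the clock offset via Pupil Remote follows; "t" is Pupil Remote's get-timestamp request, while the helper names and the midpoint estimate are my own.

```python
import time
import zmq

def clock_offset(local_before, remote_time, local_after):
    """Estimate the offset between the local clock and Pupil time,
    assuming the request and the reply take equally long in transit."""
    return remote_time - (local_before + local_after) / 2.0

def measure_offset(host="127.0.0.1", port=50020):
    """Query Pupil time via Pupil Remote ('t') and estimate the offset.
    Requires a running Pupil Capture/Service instance."""
    ctx = zmq.Context.instance()
    remote = ctx.socket(zmq.REQ)
    remote.connect(f"tcp://{host}:{port}")
    t0 = time.monotonic()
    remote.send_string("t")
    pupil_time = float(remote.recv_string())
    t1 = time.monotonic()
    return clock_offset(t0, pupil_time, t1)

# Timestamps attached to calibration reference data would then be
# time.monotonic() + offset, so Capture can match them to pupil data.
```

If dummy reference data carries timestamps far from Pupil's clock, Capture cannot pair it with pupil data, which is consistent with the "No data found" reply above.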
@Djinn#0083 I compared the two prefabs "Pupil Demo Manager" and "Calibration Scene" and there was actually an error with the latter, as I had not updated it in some time. I pushed a new version which should fix the bug with the displayed text not being updated.
Hello, anybody got experience with msgpack with QT or C++? I'm trying to get specific information about pupils via ZeroMQ, but when I subscribe to pupil., I get all sorts of unnecessary information (for my use-case). Is there a filter function in msgpack, or further specification of the topic in ZeroMQ? I only need the pupil diameter and center, and working with the data as a string and using regular expressions seems a bit clumsy and potentially slow. Is there a better way to get specific data from the messages?
@user-e04f56
my output is this:
```
{"theta"=>2.01197,
 "model_confidence"=>0.525136,
 "topic"=>"pupil",
 "ellipse"=>{"angle"=>-35.8405, "axes"=>[18.7091, 31.5099], "center"=>[162.891, 166.349]},
 "model_id"=>1,
 "phi"=>-2.37829,
 "sphere"=>{"radius"=>12, "center"=>[8.07407, -1.14548, 60.621]},
 "method"=>"3d c++",
 "timestamp"=>15568,
 "diameter"=>31.5099,
 "model_birth_timestamp"=>12538.1,
 "projected_sphere"=>{"angle"=>90, "axes"=>[245.459, 245.459], "center"=>[242.577, 108.285]},
 "norm_pos"=>[0.509033, 0.30688],
 "diameter_3d"=>2.69375,
 "id"=>0,
 "circle_3d"=>{"radius"=>1.34688, "normal"=>[-0.653371, 0.427003, -0.625119], "center"=>[0.233614, 3.97857, 53.1196]},
 "confidence"=>0.99}
```
I was looking at this doc https://github.com/msgpack/msgpack-c/blob/master/QUICKSTART-CPP.md
at the convert function, but I'm not quite sure what's the proper class for messages from pupil.
hey @user-e938ee , the Unity plugin for Pupil uses the C# implementation of MessagePack, but I am not sure if that will help you..
Hmm, not really all that helpful.
I'll try to look into in further.
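For reference, there is no field-level filter in either ZeroMQ or msgpack: subscriptions match on topic prefix only (e.g. "pupil.0" vs. "pupil"), and field selection happens after decoding. In Python (rather than C++) that looks roughly like this; the helper name is mine, and the sample merely mirrors the datum printed above.

```python
import msgpack

def extract_pupil_fields(payload):
    """Decode one pupil datum and keep only the diameter and ellipse center.
    ZeroMQ filters by topic prefix only, so this pruning is client-side."""
    datum = msgpack.unpackb(payload, raw=False)
    return {"diameter": datum["diameter"],
            "center": datum["ellipse"]["center"]}

# Round-trip a sample shaped like the datum above:
sample = msgpack.packb({
    "topic": "pupil",
    "diameter": 31.5099,
    "ellipse": {"angle": -35.8405,
                "axes": [18.7091, 31.5099],
                "center": [162.891, 166.349]},
})
print(extract_pupil_fields(sample))
```

The C++ equivalent is the same idea: unpack into `msgpack::object`, convert to a map, and index the two keys you care about; there is no need for regular expressions on the raw string.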
@user-e04f56 thank you for the information. I have another question: is it possible to skip the calibration process during development?
If I call PupilTools.StopCalibration(), it says calibration failed and asks me to restart the calibration
Hi @user-e04f56 I just received my htc vive pupil cameras. I was trying to run the marketplace scene. I'm assuming that a red dot should appear on the marketplace where you are looking. Is that the correct assumption? I was not able to see any red dot or anything indicating where I was staring after calibration on the unity marketplace app. Are there any instructions to follow? Also maybe the calibration is not working as I don't get any indication that I am looking at the target, is the target supposed to change colors when you focus on it?
@user-d40c36 During the calibration: if your two pupils are detected, the marker is white. If one eye is missing, it turns yellow or pink (depending on which eye is lost), and the marker turns red if no pupil is found
@user-e04f56 Hi, I'm still struggling a bit to make a demo with your import. I tried to mimic the code in the market scene to display the dots (green, blue, red) and cast a ray to interact with colliders. Is it correct that you changed the way we collect the data from the eyes? In your import, you don't use the stuff in the Pupil namespace, right? So to get the eye data, I should use PupilData._2D.LeftEyePosition | PupilData._2D.RightEyePosition | PupilData._2D.GazePosition, right?
@user-a49a87 thx for the info. I did notice that there were different colors and didn't really understand that. But if that is true, I think the calibration should have succeeded, since the marker stays white for the most part. It would indeed be nicer if it turned green rather than white. In that case I'm not sure why I can't see gaze tracking in the market scene. I know it's hard to know, but I figure maybe somebody could point me in the right direction based on problems they've faced before?
I'm not an expert, cause I'm at the beginning like you and I struggle a lot, but if I can help, it's a pleasure. I think that if the calibration didn't go well, it would say so and you would have to start over, so let's assume for now that it's working. In your inspector, is the demo gameobject correctly enabled when the calibration is over? The demo gameobject has a script called MarketWith2DCalibration, and inside this script you have the demo stuff, like the color map, the laser and the dot
Btw, do you use the 2D calibration demo?
I think it's the right scene to use atm
@user-a49a87 let me take a look into it a bit more. I just got it today. Right now I just turn on the capture on windows 10, then when the unity app starts it says press 'c' for calibration. That is the calibration I use. (Maybe I should pull the latest source) After that I just proceed and it finishes calibration, but I don't see any dot or gaze on the market scene. let me delve into the code and see if there is something I'm not doing right. Thanks for your help!
@wrp it was actually my mistake, i was using it offline with the generated dummy image. it dawned on me later that this could not possibly work... with actual data, it works now 😃
@user-a49a87 if you just want to work on the scene itself without any eye tracking, you can press the "s" key to skip the calibration process. there is also Pupil information you can work with, without requiring a calibration. The Blink scene demonstrates such a case. Accessing the eye images (done via Frame Publishing) or reading the pupil diameter are other examples. But if you need gaze tracking, there is no way around the calibration
@user-a49a87 as for your second question: In the current master branch version (and as part of the Unity Import Package we downloaded for you a few days ago), eye data is now accessible through PupilData._2D.LeftEyePosition/.RightEyePosition/.GazePosition If you download the complete repository, the market scene examples have been updated as well
Great. Thanks for the infos
@user-e04f56 @papr i'm currently working on my own calibration routine. is there some additional documentation how the points should be chosen, apart from the code in the hmd-eyes repo?
@user-e04f56 In the import you sent me, VisualizeGaze() doesn't seem to work. Do you have an idea why? The terminal says "We are gazing", but the markers don't show
@user-e04f56 Hi, I managed to make a demo scene of my own. The markers are displayed and I've created a laser aimed by my eyes. The problem now is that when I make a build, it doesn't work. It launches Capture, but it doesn't seem that I receive anything. Do you have an idea? Thanks
@user-a49a87 I just tried to build a demo scene with the recent GitHub version and discovered that there was a shader missing (which is used for the new eye camera visualization). That might have caused the behaviour you experienced. After I included the shader ( "Edit\Project Settings\Graphics" -> "Always Included Shaders" -> Add "Unlit/Texture" ), a build version of the calibration scene worked as intended. Can you try this, as well, or update to the newest source?
it works. Thx