Hi, has anyone been able to get pupil diameter data from HMD? There is a method in PupilData.cs
```csharp
public static double Diameter () { return new double (); }
```
Does this mean that this function is not implemented yet?
It seems that "The HoloLens implementation currently only supports gaze data". In this case, what's the easiest way to send pupil diameter data from Pupil Capture to the program running on HoloLens?
Does anybody know the best way to use gaze data with a UI? I have already integrated the eye tracker and get gaze data in my Unity project. I was thinking of adding a collider to all UI objects and doing a linecast from the camera to the eye marker, but I don't know if that's the best solution, whether something similar is already available, or if it would even work at all.
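For what it's worth, here is a minimal sketch of the collider approach described above, assuming you can obtain a 2D gaze position in viewport coordinates from the plugin. The helper name `GetGazeViewportPosition` is a hypothetical placeholder, not part of the hmd-eyes API:

```csharp
using UnityEngine;

public class GazeRaycaster : MonoBehaviour
{
    public Camera sceneCamera;

    void Update()
    {
        // Assumed: a 2D gaze position in viewport coordinates (0..1),
        // taken from the plugin's gaze data. Placeholder helper below.
        Vector2 gazePoint2D = GetGazeViewportPosition();

        // Cast a ray from the camera through the gaze point.
        Ray ray = sceneCamera.ViewportPointToRay(gazePoint2D);
        if (Physics.Raycast(ray, out RaycastHit hit, 100f))
        {
            // UI objects need colliders for this to work.
            Debug.Log("Gaze hit: " + hit.collider.name);
        }
    }

    Vector2 GetGazeViewportPosition()
    {
        // Hypothetical: wire this up to the plugin's gaze data yourself.
        return new Vector2(0.5f, 0.5f);
    }
}
```

Note that this only works for world-space objects with colliders; Canvas-based Unity UI (uGUI) would instead need a `GraphicRaycaster` fed with a synthetic `PointerEventData` at the gaze's screen position.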
Hello, we are using the Vive and cannot seem to get anything to show inside the headset. The program runs in Unity, and we are able to go through calibration; it says calibration is complete, and Unity then shows the market scene, but nothing shows inside the headset, including the actual calibration scene or the market scene once calibration has completed. Any recommendations?
Hello, can anyone help me with my project "cursor movement using eye gaze"? I have tried different code samples, but none seem very effective. Any help would be greatly appreciated. Thank you.
I'm still having issues receiving Blink Data. I'm not sure if the Blink_Detection plugin is starting in the service. How can I check this?
Hey everybody, I am trying to connect Pupil Labs with my VR project (using the HTC Vive and its Pupil Labs add-on). So far the Pupil Manager is included in my project, but when I start the calibration process by pressing the play button, Unity always crashes. It shows me both eyes in the Game view, so I am definitely connected to Pupil Labs, and then tells me to press C on the keyboard, but nothing happens when I press it. After I stop the "game" from running, Unity is not responding anymore (which the SteamVR plugin also reports). I really don't know what to do anymore. Is this a common problem that someone has already solved? I would be happy to get some help, as I am still a beginner working with Pupil Labs.
Moreover, sometimes the calibration works, but then the scene will not show on the Vive, only in the Game view. It then says: "Display 1 - No cameras rendering".
@user-626718 this is not an issue we have heard of before. Do the prebuilt demos work for you?
@user-d754f8 please try using Pupil Capture instead of Pupil Service; it will work the same way!
@mpk yes, they work fine! Now the calibration works most of the time, but it seems the calibration process cannot be completed, as the game objects from the gameObjectsToEnable list do not appear, while it says: "Display 1 - No cameras rendering". This is when Unity crashes. Maybe it is just some silly beginner mistake, if it is not a common one. Is there any kind of tutorial or help (in addition to the Getting Started sheet) that explains how to integrate the plugin into your own project? Thanks for your response already!
@user-626718 I think it might be related to the ZeroMQ connection part of the code. I had a similar problem when I tried to connect to Pupil Remote with my own code.
Just a guess.
Is anyone seeing a significant performance drop after integrating hmd-eyes on HoloLens?
@user-626718 we're having the same issues, but we're just trying to get the calibration to work without integrating it yet, and we're not even getting anything to show up in the headset. We are also beginners, so we're probably making a simple mistake. Any advice?
@user-1e5434 did you try the precompiled demo?
Is it possible to use a "Natural Features Calibration" with the HoloLens, or any other kind of "custom" calibration routine?
@user-1e5434 I'm dealing with the same problem. I have no idea how to tell Pupil Labs to 1) use the cameras of my Vive Pro as the world camera, or 2) display anything on the screen of the headset. Should this somehow be possible with the precompiled demo, or do I need to integrate Pupil Labs into a Unity project to get this working? I did not find enough information on the website about using Pupil Labs with a Vive. I would be happy about any links to tutorials or more documentation!
ok, I actually found the answer to my questions here: https://github.com/pupil-labs/hmd-eyes
Hello, I would like to ask whether a USB camera can be used as the world camera. I have an eye camera, but I have not obtained a world camera yet. I tried using a USB camera with Pupil Capture, but it does not appear in the UVC sources list (the eye cameras do). Thank you for your help.
@user-1e5434 I added a part to the PupilManager script, in the StartDemo section, where I set the CameraEye active. Then it was working again! I don't know whether this might also help with the calibration.
Asking again, but is a performance drop a big issue after integrating hmd-eyes into your own Unity project? I wonder if I did something wrong, or if it's just intensive and I have to optimize the rest.
So, the 2D or 3D demo is throwing an error upon calibration (after successfully connecting to the tracker):
```
NullReferenceException: Object reference not set to an instance of an object
  PupilManager.OnCalibtaionStarted () (at Assets/pupil_plugin/Scripts/PupilManager.cs:93)
  PupilTools.StartCalibration () (at Assets/pupil_plugin/Scripts/Networking/PupilTools.cs:364)
  PupilGazeTracker.Update () (at Assets/pupil_plugin/Scripts/Pupil/PupilGazeTracker.cs:105)
```
If anyone has also seen this error, let me know. In the meantime, we'll try an older build.
Hey Pupil people, I am trying to record data in my own scene with the Pupil Labs plugin integrated. Everything works (calibration process, start recording, stop recording), but when I stop game mode in Unity, no recordings are saved in the folder given in the path. Sometimes Unity even crashes, and I don't know why! I tried recording in the demo scenes and it works. I compared the record settings from the demo scene with my scene and they are the same. Is anyone having the same problem, or does anyone know how to deal with this?
Hi everyone. I am currently adding the calibration process to my Unity project. The calibration process is working, but my PupilGazeTracker seems broken because I cannot find any recording results at the customPath. Has anyone had the same problem and knows how to solve it? Many thanks.
Hey y'all, I had Pupil Labs working in my scene, and as of today, when I try to start the calibration, it just sits there and never connects to Pupil Service. No idea what's going on; I haven't changed anything. Does anyone have any ideas as to why this might be happening?
@user-945479 Did you check whether your port changed? I always work with Pupil Capture, and sometimes the port changes in Pupil Capture, so you have to adjust it in Unity in the Pupil settings as well!
Has anyone been able to record a video of the HoloLens gaze tracking? I have HoloLens eye tracking working, with a gaze cursor that follows my gaze. When I press record in HoloLens Mixed Reality Capture, the tracking falls apart and the cursor starts darting around. It might be that the HoloLens and Pupil are trying to use the HoloLens camera at the same time? The problem disappears if I choose to record only holograms (and not the PV camera) in the Mixed Reality Capture settings, supporting that hypothesis.
I am currently adding calibration to my Unity project. I dragged the calibration scene into my project and use PupilManager to change between scenes. I also pointed PupilTools' camera to the main camera after calibration. The scene changing works, but I don't know why my recording file has no data. My CSV file is empty (only the attribute names), and my unity_camera video only contains the calibration process. Does anyone have the same issue? Do I need to add other objects or code in Unity beyond the Calibration scene?
@user-626718 The port in Pupil Service is the same value as that listed in the Connection section of PupilGazeTracker. And I'm also running things locally so the IP shouldn't matter, right?
It still just sits with the message: "Trying to connect to Pupil..."
Any other ideas?
@user-945479 in this case your IP must be localhost or the actual IP of that machine. The port needs to be the same as the one set in Capture, or 50020 in the case of Pupil Service.
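As a quick sanity check of the connection settings above, here is a hedged NetMQ sketch (the library hmd-eyes uses) that asks Pupil Remote for its current time; it assumes Pupil Capture or Service is running locally on port 50020:

```csharp
using NetMQ;
using NetMQ.Sockets;

// Minimal connectivity check against Pupil Remote.
// Assumes Pupil Capture/Service is running with Pupil Remote on port 50020.
using (var req = new RequestSocket())
{
    req.Connect("tcp://127.0.0.1:50020");
    req.SendFrame("t");                       // "t" requests the current Pupil time
    string reply = req.ReceiveFrameString();  // blocks until Pupil replies
    System.Console.WriteLine("Pupil time: " + reply);
}
```

If this hangs at `ReceiveFrameString`, the plugin's "Trying to connect..." message is most likely an IP/port or firewall issue rather than a plugin bug.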
@mpk Yes I've done that I think. Still nothing.
@user-945479 this should do the trick. Any firewalls or such?
@mpk No, no firewalls are being used.
Then I really don't know.
Restart Capture with default settings and try the hmd-eyes precompiled demo from the releases page.
If this does not work, I really think it's some quirk in the OS or hardware.
@user-945479 Did you see eye images before you tried to calibrate?
I'll add that I've had similar behavior, but I did see eye images before I tried to recalibrate (so it was not a failure to connect, but a software issue). I did post my error previously, but here it is again:
```
NullReferenceException: Object reference not set to an instance of an object
  PupilManager.OnCalibtaionStarted () (at Assets/pupil_plugin/Scripts/PupilManager.cs:93)
  PupilTools.StartCalibration () (at Assets/pupil_plugin/Scripts/Networking/PupilTools.cs:364)
  PupilGazeTracker.Update () (at Assets/pupil_plugin/Scripts/Pupil/PupilGazeTracker.cs:105)
```
I still have to check my versions to make sure it's not a simple error like that (this error occurred on a student's machine).
(a simple error, like an old version of Pupil Service trying to work with a new version of hmd-eyes)
Assume the HMD display is just a glowing star x feet away from the viewer. After completing calibration once, if the star's distance is increased to y feet, is another calibration required for accurate metrics? Or does the Pupil hardware auto-calibrate?
Hey everyone, can anyone tell me about any applications based on eye tracking?
@user-7f5ed2 I have responded to your question in the core channel. Please do not cross-post questions.
@wrp Sometimes I have the problem that the timestamps of the annotations sent from Unity and the timestamps of the pupil data are not synchronous. They differ by about 5 minutes (not every time). Of course, I set Pupil's clock to Unity time via NetMQ when I start my measurement. What do you think I'm doing wrong?
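For reference, the clock-set step described above can be sketched like this, assuming a NetMQ request socket already connected to Pupil Remote. This is only a sketch of the "T" time-sync command, not a diagnosis of the 5-minute offset (which could also come from syncing before reconnecting, or from a stale socket):

```csharp
using NetMQ;
using NetMQ.Sockets;
using UnityEngine;

public static class PupilTimeSync
{
    // Set Pupil's clock to Unity's clock via Pupil Remote.
    // Assumes 'req' is a RequestSocket connected to tcp://<host>:50020.
    public static void SyncPupilClock(RequestSocket req)
    {
        // "T <time>" tells Pupil Remote to reset Pupil time to <time>.
        req.SendFrame("T " + Time.realtimeSinceStartup);
        string ack = req.ReceiveFrameString();  // always read the reply on a REQ socket
        Debug.Log("Time sync reply: " + ack);
    }
}
```

One thing worth checking is that every later timestamp you attach to annotations comes from the same clock you synced here (e.g. `Time.realtimeSinceStartup`, not `Time.time` after pauses).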
@user-29e10a I don't know! This is very interesting.
@mpk OK, any chance to see something in the logs? I don't use your data files; is it sufficient to load the annotation capture plugin to save them in your files? Maybe this would give some clues...
@user-29e10a can you check the log files you can find in ~/pupil_capture_settings ?
@user-8779ef No. No eye images show up in the Unity display whatsoever. Pupil Service seems to be functioning properly: I get eye images there and they track the eyes, but there seems to be a miscommunication between the two programs. I'll check to make sure I'm running the newest versions of both. Thanks.
Hi everyone, may I know whether the Pupil Labs HTC Vive add-on can be used on other VR display platforms, or is it just for Unity?
@user-8779ef So, I figured some stuff out and got it working. A couple of weird things: in the version of the pupil-plugin I was using, instead of getting to choose which Additional Scene to go to after the Calibration scene, it was a choice of Game Objects. I had to go into the Market scene and pull out the correct portion of it to get it working in my scene.
Also, is there a way to pull out the vertices of game objects such as the head, controllers, and any virtual objects, instead of just the location and rotation of the rigid bodies?
@user-945479 I don't know what this means: " In the newest version, or the version I was using of the pupil-plugin, instead of getting to choose which Additional Scene to go to after the Calibration scene, it was a choice of Game Objects. I had to go into the Market Scene and pull out the correct portion of that to get it to work in my scene." Can you say that in a different way, or elaborate?
Hi, I ran into some problems when running the hmd-eyes source code in Unity; errors appear, such as:
Assets/HoloToolkit/Utilities/Scripts/Editor/SetIconsWindow.cs(307,50): error CS0117: `UnityEditor.PlayerSettings.WSAImageType' does not contain a definition for `StoreLargeTile'
What should I do to solve them?
Has anyone had the problem that Unity crashes a lot when using the pupil-plugin? Sometimes it does not crash and everything works perfectly, but at some point it crashes for no reason, even though I changed nothing.
@user-8779ef I'll try. Inside Unity, in the Calibration scene, in the Inspector tab where I can change some selections, in the version I'm using the "Additional Scenes" header was not there for whatever reason, and in its place was a header titled "Game Objects". I had to go into the sample Market project that Pupil Labs provides, find the appropriate code in the script, and copy it into the Calibration for my project. I'm fairly certain I was using the newest version of the script, but maybe not. Might just be something to check.
Does anyone know if there is a way to pull out the vertices of game objects like the Head, Controllers, and any virtual objects instead of just the location and rotation of the rigid bodies?
Could anyone share how they integrated the calibration scene with the Vive? I found the released plugin did not match the developer docs, so I downloaded the pupil_master folder from GitHub and imported it into my project. I dragged the calibration scene into the Hierarchy, set up the available scene names and build settings, then set the CameraRig as the main camera and attached the CalibrationDemo (trigger gaze topic) script to it. When I press the Play button in Unity, everything works. But when I stop the game, Unity crashes and no CSV data is recorded.
Hello, I've been struggling for quite some time to work out how to actually use the gaze data in Unity. I can't find a way to raycast to my UI; does anybody have a tip?
@user-58d5ae There is a script in the GitHub folder named MarketWith2DCalibration.cs. It shows how to raycast using the gaze data. Hope it helps.
Hello, every time I start a recording, Unity does not run as smoothly as before. Movements of game objects are not displayed the way they should be; everything seems to judder, so no smooth movement at all. If I am not recording, everything stays normal. Is anyone having the same problem, or does anyone have an idea how to solve it?
@user-626718 I ran into the same problem as you, with timestamps only around 0.6-1. It turns out to be the confidence value, not the timestamp. There is a function in PupilTools.cs that sets the key to 'confidence'; I changed it to the input key and the timestamp is back to normal.
Thanks @user-63941a
@user-63941a Thanks a lot! It works for me now as well with that change!
This script does a Physics raycast, not a UI (Graphics) raycast, but I'll try to adapt it and update if it works.
Hey guys, I want to use the eye tracker with the HoloLens in order to observe the visual behavior of the user while interacting with the environment and surfing in the web browser. Is this possible? If anyone can guide me through the basics, I would be grateful. Thank you in advance.
Hey people, my problem concerning the performance drop is still persisting. Every time I start recording the eye data, the scene gets distorted and freezes occasionally. I also tried the demo market scene to compare performance during recording; there, too, the scene freezes and moving game objects do not move smoothly at all. If I am not recording, everything works smoothly! Might that be related to the scripts, or is it more of a hardware problem because my computer does not meet the requirements? I am very thankful for any kind of help or ideas!
@user-626718 do I understand correctly that you are recording the Unity3D scene? Or 'just running' Pupil Capture while using Unity3D?
I run Pupil Capture first to establish a connection, but I do the actual recording of the Unity gaze data inside Unity by just pressing the R key. After doing so, the scene shown in the HTC Vive starts to freeze and does not behave as smoothly as before. So it seems the recording has an impact on the 'game'.
@mpk
So I am just recording the gaze data to a CSV file, and during this process the Unity3D scene is not behaving smoothly.
Hi, I tried to load some recordings made on Ubuntu 18 with Pupil Labs 17.4 into my Windows Pupil Player 17.4,
and got this:
```
MainProcess - [INFO] os_utils: Disabling idle sleep not supported on this OS version.
player - [ERROR] player_methods: No valid dir supplied
player - [INFO] launchables.player: Starting new session with 'C:\Users\p.wagner\Documents\phd\pupillab recordings\005'
player - [INFO] player_methods: Updating meta info
player - [INFO] player_methods: Checking for world-less recording
player - [INFO] video_capture: Install pyrealsense to use the Intel RealSense backend
player - [INFO] launchables.player: Application Version: 1.6.11
player - [INFO] launchables.player: System Info: User: p.wagner, Platform: Windows, Machine: BH-SYD-L047, Release: 10, Version: 10.0.17134
player - [ERROR] libav.mjpeg: unable to decode APP fields: Invalid data found when processing input
player - [ERROR] libav.mjpeg: unable to decode APP fields: Invalid data found when processing input
player - [ERROR] libav.mjpeg: unable to decode APP fields: Invalid data found when processing input
player - [INFO] camera_models: Previously recorded calibration found and loaded!
player - [ERROR] libav.mjpeg: unable to decode APP fields: Invalid data found when processing input
```
Has anyone had the same trouble? I could not find an answer here so far.
Thanks for help
Previously I could load it without any problems.
Sorry for the buzz! I found the problem: my auto-start application used an old version of Pupil Player.
There's an example of how to raycast with 2D gaze data in MarketWith2DCalibration.cs, but I don't see a similar implementation for 3D gaze data (MarketWith3DCalibration.cs). When I tried to adapt the 2D example for 3D, I didn't get good results. Does that mean we need to raycast 3D data differently? Does anyone know how to do it?
After adapting the 2D raycast, I made a basic project with a UI, but the result isn't right at all. When I look at the top right, it detects that I'm looking at a button at the bottom of the UI, and if I look at random places it sometimes triggers other buttons, but I can never get it to detect a button when I'm actually looking at it.
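One possible reason 2D-style raycasting fails with 3D calibration: with 3D calibration the gaze arrives as a 3D point in camera-relative coordinates, not as a viewport position, so the ray should go from the camera toward that transformed point. This is a hedged sketch under that assumption; `GetLocalGazePoint3D` is a hypothetical placeholder for wherever your plugin version exposes the 3D gaze point:

```csharp
using UnityEngine;

public class Gaze3DRaycaster : MonoBehaviour
{
    public Camera gazeCamera;

    void Update()
    {
        // Assumed: a camera-local 3D gaze point from the 3D calibration data.
        Vector3 localGazePoint = GetLocalGazePoint3D();

        // Transform into world space, then cast from the camera toward it.
        Vector3 worldGazePoint = gazeCamera.transform.TransformPoint(localGazePoint);
        Vector3 direction = (worldGazePoint - gazeCamera.transform.position).normalized;

        if (Physics.Raycast(gazeCamera.transform.position, direction, out RaycastHit hit, 100f))
        {
            Debug.Log("3D gaze hit: " + hit.collider.name);
        }
    }

    Vector3 GetLocalGazePoint3D()
    {
        // Hypothetical: wire this up to the plugin's 3D gaze data.
        return Vector3.forward;
    }
}
```

If buttons trigger in roughly mirrored positions, it's also worth checking for a flipped y axis or a coordinate-space mismatch (local vs. world) before anything else.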