Hi! I am very new to your technology. Is there a way to read data from your eye tracker in C# in real time? Let's say x, y of an eye and pupil diameter?
And welcome to the community!
We have a Unity plugin in C# that processes data from our real-time network API. See this demo for pupil (eye) data: https://github.com/pupil-labs/hmd-eyes/tree/master/plugin/Demos/PupilDataDemo
Thanks! I want to prepare my VR game for your hardware (which will be shipped to me next month), and I was wondering if there is a way to simulate output? To receive some fake data without using an actual headset?
Mmh. I think I wrote a script that simulates the network API, but I do not think it would help with the HMD calibration. It might be useful nonetheless. Give me some time to find it.
Sure, thanks
Here it is: https://gist.github.com/papr/49f58a894364dd94b23c53e6bc6929d0 You should be able to use it together with this recording: https://drive.google.com/file/d/11J3ZvpH7ujZONzjknzUXLCjUKIzDwrSp/view?usp=sharing
Please be aware that you need to start a recording using hmd-eyes in order to start receiving gaze from the recording.
Disclaimer: The simulator might not be feature complete.
@papr Thank you ! It might help me a lot 🙂
Do you have any useful tutorials on how to use your tool with Unity 3D?
Start it from the command line, and try to connect to it as if it was a Capture instance. Unfortunately, I do not have access to a unity instance to test it right now.
Do you have any experience with determining the current stress level from eye parameters measured by your eye tracker? I want to react to player stress in my game in real time. I managed to export a CSV called pupil_positions.csv from your demo recording; it contains many eye parameters, such as pupil diameter. Am I able to receive it in my C# script in real time as floats?
Yes, you can receive pupil diameter measurements in real time. But be aware of the pupillary light reflex: it affects diameter changes much more than stress or cognitive load ever could. If you have a game, it will be very difficult to avoid triggering it with the displayed environment.
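For analyses that try to relate pupil diameter to stress or load despite the light reflex, a common mitigation is subtractive baseline correction: express each event's diameter samples relative to the mean of a quiet window just before the event. A generic sketch in Python (not part of the Pupil Labs API; function and variable names are mine):

```python
def baseline_correct(diameters, baseline):
    """Subtractive baseline correction for pupil diameter samples.

    diameters: diameter samples during the event of interest
    baseline:  samples from a quiet window just before the event
    Returns each sample as a change relative to the baseline mean,
    which removes slow offsets (e.g. ambient brightness differences).
    """
    baseline_mean = sum(baseline) / len(baseline)
    return [d - baseline_mean for d in diameters]
```

This only removes a constant offset per event; it does not undo fast luminance-driven responses within the event itself.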
Yeah, you are right; I want to correlate it with other sensors attached to the player's body. Any useful tips on how to read eye parameters in my script? By using the Unity plugin, or just by somehow connecting my script to your API?
Do you use Unity in your project?
By reducing stimuli or overstimulating the player
Yes, it is a Unity autism simulation game. The player will be exposed to many stimuli in the VR headset; I want to monitor the body's reactions and react to them in real time in the game.
If you are using Unity already, I recommend using the plugin. It provides a C# API that can be controlled from your own script.
Do you have any text or video introduction/tutorial for the plugin? For an easier start with it?
Please check the documentation linked in the hmd-eyes repository.
Thank you very much and sorry for taking your time ! 😉
Hi, I have pupil timestamps for pupil_positions.csv and gaze_positions.csv. Is there any way to convert those timestamps to unix time?
Yes! 🙂 Please see https://github.com/pupil-labs/pupil-tutorials/blob/master/08_post_hoc_time_sync.ipynb
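The core idea in that tutorial is that the recording stores both clocks at recording start, so the fixed offset between them converts any Pupil timestamp to Unix time. A minimal sketch (the two start-time fields mirror what the recording's info file provides, as I recall it; verify the exact field names against your recording):

```python
def pupil_to_unix(pupil_ts, start_time_system, start_time_synced):
    """Convert a Pupil clock timestamp to Unix epoch time.

    start_time_system: Unix time at recording start
    start_time_synced: Pupil clock time at recording start
    The difference between the two is a fixed offset that maps
    any Pupil timestamp into Unix time.
    """
    offset = start_time_system - start_time_synced
    return pupil_ts + offset
```

Applied to a CSV export, you would read both start times once from the recording info and apply the function to every timestamp column.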
Hi, many thanks for the help; that worked. I also wanted to ask: is it possible, from gaze_positions.csv (gaze_point_3d_x, gaze_point_3d_y, gaze_point_3d_z) or from pupil_positions.csv, to tell whether the participant looked to the left or to the right of the VR view in front of them? They didn't move their head, only their eyes.
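With a fixed head, a rough left/right classification can come from the x component of the 3D gaze point alone. My understanding is that in Pupil's camera coordinates positive x points to the viewer's right, but treat that sign convention, and the dead-zone threshold below, as assumptions to verify against your own data:

```python
def horizontal_direction(gaze_point_3d_x, threshold=1.0):
    """Classify gaze as left/right/center from gaze_point_3d_x.

    Assumes +x points to the viewer's right (verify against your data).
    `threshold` is a small dead zone (same unit as the gaze point, mm)
    so that near-central gaze is not classified as left or right.
    """
    if gaze_point_3d_x > threshold:
        return "right"
    if gaze_point_3d_x < -threshold:
        return "left"
    return "center"
```

A quick sanity check is to have the participant fixate known left/right targets and confirm the sign of gaze_point_3d_x matches.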
@papr how would I know if the "start_plugin" notification worked properly?
Do you mean programmatically? If it is a custom plugin, you can emit a notification. Otherwise, Capture emits a log message when a new plugin is started. You can subscribe to logs and check if the corresponding message is being emitted.
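For reference, a start_plugin notification is a small dict with a "subject" field, published on the "notify.<subject>" topic; on the wire the payload is msgpack-encoded and sent over zmq. This sketch only constructs the Python-side message structure (the helper name is mine, not a Pupil Labs API):

```python
def make_start_plugin_notification(plugin_name, args=None):
    """Build the topic and payload of a Capture start_plugin notification.

    The real message would be msgpack-serialized and sent over the
    zmq IPC; this only assembles the structure beforehand.
    """
    subject = "start_plugin"
    payload = {"subject": subject, "name": plugin_name, "args": args or {}}
    topic = "notify." + subject
    return topic, payload
```

If the named plugin does not exist, Capture will log an "unknown plugin" message instead of starting it, which is exactly what subscribing to logs lets you detect.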
I'm getting the error "unknown plugin" for "3DHMDGazer"; do you know what I should be using instead?
It's 3DHMDGazer, so I should just be able to subscribe to logs and see whether or not it worked, right?
Hello everyone! I am currently using the hmd-eyes Unity plugin with my Pupil Core, which works perfectly. However, I need to migrate the same setup to an immersive VR system with 4 Unity instances running simultaneously, so whatever is presented needs to be synchronized via RPC calls. I'm really struggling to implement these RPC calls in the hmd-eyes plugin (for example, to synchronize the presentation of the calibration marker). Is anyone here a freelancer who could provide some support with API development/modification? Thank you! And apologies if this message is not appropriate.
Never mind, I got it.
Hi, I have a question regarding Pupil Player. Is there any network API or Python plugin that can be used to tell Pupil Player to load recording directories and export (gaze, pupil, ...) CSV data? Doing this procedure manually for every participant takes a lot of time.
Unfortunately, that feature is not built-in. But check out https://github.com/pupil-labs/pupil-community; maybe you can find something that suits your use case.
Hey all 👋
Props to whoever set up this Discord - it has been fun seeing some of the cool projects here.
I'm wondering if someone can point me to a digital interfacing use case for eye tracking (e.g. non-research, non-gaming)? I have been building a web browser in Unity and thought about using eye tracking for workflows, as opposed to clunky controllers or imprecise hand tracking.
Would the accuracy/precision of something like the Binocular Add-on be adequate for doing something like email in VR/AR?
Hey everyone! I am working on a project with the HoloLens 2, and I need to prototype a HoloLens 2 + Core device setup. Has anyone already done that? I thought it might be possible to print a frame that clips onto the HoloLens 2 and holds the Pupil Core cameras. If someone has already done something similar, I would appreciate any help!
Hi @user-b74779 👋. It looks like @user-819914 did some prototyping with HL2: https://discord.com/channels/285728493612957698/285728635267186688/852193611718066186
Yeah, I am waiting for a response in MP. Anyway, I made a model for the Core device yesterday and plan to print it next week; if it works well, I will probably share the model so others can use it.
Can you share how you did it?
PM*
Hi @papr, I have some issues with using your API simulator. I have your plugin in my Unity game and am running the API with your demo eye recording, but I am not able to receive any data in my Unity game. It says: Pupil Version: NOT IMPLEMENTED.
You need to start a recording in order to receive data. Are you doing that already?
Where do I start it? In Pupil Capture? Or directly from Unity?
Capture should not be running; the script replaces the desktop app. You need to request a recording start via Unity. Specifically, you need to send the "R" command using the request controller: https://github.com/pupil-labs/hmd-eyes/blob/master/plugin/Scripts/RequestController.cs#L188-L191
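For context, Pupil Remote commands are short plain-text strings sent over a zmq REQ socket: "R" starts a recording and "r" stops it. A sketch of building the command string (the optional "R <session_name>" form is how I recall the protocol, so treat it as an assumption to verify):

```python
def recording_command(start=True, session_name=None):
    """Build a Pupil Remote recording command string.

    "R"        starts a recording
    "R <name>" starts a recording with a session name
    "r"        stops the current recording
    The string would be sent over a zmq REQ socket to Pupil Remote
    (or, here, to the simulator script standing in for it).
    """
    if not start:
        return "r"
    return f"R {session_name}" if session_name else "R"
```

In hmd-eyes this is what the RequestController does for you when you trigger a recording from Unity.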
Seconds after connecting the plugin to the API, I get the following error. Have you ever seen it?
This is possibly related to:
Due to an issue with MessagePack, the default project setting for
ProjectSettings/Player/Configuration/API Compatibility Level
is not supported and needs to be set to .NET 4.x
Hi, I am still working on a prototype frame for the HoloLens 2 + Core device hardware. I ran some tests today with my prototype (using the HoloLens and only the left eye camera). During eye tracking there are some weird "white strips" in the video; I guess they are created by the light of the HoloLens 2? In those moments the sensor sometimes loses my pupil and the quality seems to decrease. What could I do to improve that? I am pretty sure I could tune something in Pupil Capture, but I have no idea what. Could you help me? (See the recording in the following YouTube video link.)
Hi! Do you have the eye cameras positioned outside of the HoloLens, i.e. looking through the visor onto the eyes? If so, then yes, it is likely that this pattern comes from the HoloLens display. The only adjustment I can think of in the software is to adjust the eye camera sampling rate. Placing the eye cameras inside might be another solution, but I do not know if there is sufficient space.
Hey y'all, for some reason subscribing to "logs." or "logging." doesn't give me any log messages from Pupil. Anyone know what the appropriate subject should be?
"logging." is the correct one
Oh, actually I just realized that I'm trying to read "logging." as if it were a "notify." message, but that won't work, will it?
Technically, it should work in the same way. With the difference, that there is no required subject field.
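The reason "logs." yields nothing while "logging." works is that ZeroMQ SUB sockets filter by byte prefix: a subscription matches exactly those messages whose topic starts with the subscribed string. A minimal illustration of that matching rule:

```python
def matches_subscription(topic, subscription):
    """ZeroMQ-style SUB filtering: a message is delivered if and only
    if its topic starts with the subscribed prefix (byte-wise)."""
    return topic.startswith(subscription)
```

So "logging." matches "logging.info", "logging.warning", etc., while "logs." matches nothing the IPC actually publishes.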
I'll try again; I might be doing something stupid. It's not too important, though; I did manually check the logs, and I was successfully able to load a new GazerHMD3D plugin.
The camera is below the HoloLens, so it is not looking through the visor. I think it might be a sampling problem, maybe due to the light from the HoloLens.
Do you see the pattern, too, if you point the camera at the hololens? If it is an external light source, you should be able to find it by pointing the camera at it.
Yeah, ok, I will try. So it is possible to tune the sampling rate on the Core device, right?
It is, via the Video Source menu in Capture.
Hello, I have a small question. I'm doing eye tracking in VR with the plugin, and my scene is very simple: just a 3D model with an animation. The thing is, if the model doesn't move, the eye tracker works perfectly, but when the animation starts and the model moves, it's as if the eye tracker got stuck, and the black point with the yellow circle that indicates where the user is looking gets stuck too. Why is this happening?
Here is what I mean, to be clear
Quick question. I'm always surprised at how accurately the HMD estimates gaze direction after the 3D model has been fit, but before a proper calibration. My understanding is that your calibration method is used to refine the rotation of the local space that contains the eye and eye camera. Is the observation that the track is fairly good prior to calibration in VR just because of the relatively stable eye/camera/screen geometry? Are you able to rely on an "average" rotation that works well across individuals?
There should be no gaze estimate before a calibration, unless you have restored session settings including a previous calibration 🙂
I could swear that, in VR, the gaze estimate is visualized prior to a calibration. I could be wrong, but I'm fairly confident. So, that's interesting! I'll double-check soon.
To check, restart Capture with default settings. If the gaze signal disappears, then it is likely that the gaze was mapped by a restored calibration. If it stays, we will have to talk details as I do not have an idea top of my head where this would be coming from. 🙂
Ok, will do.
In other news, Kevin has been working since last December on replicating the real-time HMD calibration offline. We see this as an essential step toward refining the gaze estimate post hoc, as you can with the Core in Pupil Player. We are checking his progress by comparing gaze estimates when looking at a 9x9 point assessment grid. He has made some great progress, but is still unable to produce the same gaze estimates online and offline. He is cleaning up his code a bit now and will share it with you soon. We hope you'll be able to help us diagnose the differences.
The key point here is that we may ask you to donate some of your time to help us out.
Giving you a heads up, because I know you're a busy guy
...the upshot is that, if this works, I imagine it would be of great value to the community.
I have had to caution several colleagues against degrading their data quality by attempting to reprocess the data collected in VR offline.
Saved them from quite a lot of wasted time and frustration.
Hi,
My name is Vicente Prado; I am a professor of psychology at the University of Valencia. I already have eye-tracking glasses (Pupil Core), and I recently acquired an Oculus Quest 2 virtual reality headset. Would it be possible to install the HTC Vive Add-On or something similar?
Best regards and thanks
Hi @user-1e88bc 👋. The VR Add-on isn't compatible with Oculus Quest 2. Only HTC Vive, Vive Pro and Vive Cosmos. For other HMDs, you would need to develop your own mounts that fit the geometry and constraints of the headset.
ok, thanks