Is it possible to use hmd-eyes to calibrate Pupil from another PC, one that is not connected to the Pupil hardware?
I'm asking this for use without a VR/AR application.
Short question about the Unity projects with Pupil Capture running in the background: if I start the calibration in the unity_integration_calibration project, finish it, close it, and then open the unity_integration project, does Pupil Capture keep the calibration done in the other project, or do I have to calibrate once again?
Hi,
I think it should save the calibration.
Hi, I'm having the same problem as Wopsie. In the Pupil Capture window both eyes show up and are recognized. However, when I try to record data in Unity, only one eye gives me data; the other just returns 0,0 as coordinates. I am using the most recent version of Pupil Labs software and Unity 5.6 together with an HTC Vive. Do you have any ideas on how I could fix that?
Just run calibration. After that both eyes should show.
I ran the unity_integration_calibration project and calibrated it. After that I opened the actual scene in which I want to record and where I have the script attached to the player. Is that the wrong way to do this? I'm sorry, I'm pretty new to this, but I haven't been able to figure it out over the past couple of days.
I think that should work, but let me check. We are currently completely rewriting the plugin. If you can wait a few days, you can try the new plugin then.
I hope that is ok.
Otherwise I would suggest looking at the data stream coming from pupil_service using the monitor script: https://github.com/pupil-labs/pupil-helpers/blob/master/pupil_remote/filter_messages.py
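For reference, here is a stripped-down sketch of what that monitor script does. This is not the official helper, just a minimal example assuming Pupil Capture/Service is running locally with Pupil Remote on its default port 50020:

```python
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote for the SUB port of the IPC backbone.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to pupil data; use "gaze." instead to see mapped gaze after calibration.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect("tcp://127.0.0.1:{}".format(sub_port))
subscriber.setsockopt_string(zmq.SUBSCRIBE, "pupil.")

while True:
    topic, payload = subscriber.recv_multipart()
    msg = msgpack.unpackb(payload, raw=False)
    # A confidence near 0 on one of the eye topics would explain coordinates stuck at 0,0.
    print(topic.decode(), msg.get("confidence"), msg.get("norm_pos"))
```

If one of the two eye topics (pupil.0 / pupil.1) never shows up here, the problem is on the capture side rather than in Unity.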
Okay, thanks! 😃
Hi, I have a question @mpk 😀 I do not know how to map the real-time image within the HMD (Unity) to the Pupil data.
It works fine when I run Unity_Integration_Calibration to calibrate my Pupil device. So what should I do next? I have no idea how to combine the real-time image within the HMD with the Pupil data. I know we can get messages through the IPC Backbone, but those are raw data without any mapping.
Now I am using Video File Source as the capture source for the world camera, because I plan to use a pre-recorded video as the capture source, which is exactly what I will see in the HMD. That way I hope to solve the mapping issue. Here comes the problem: gaze detection works fine in Pupil Capture, but the gaze data is not recorded when I start a recording; only the Video File Source is saved. When I change the capture selection to Test Image, everything, including the gaze data recording, works fine.
Any help would be most appreciated!
It is working on Win10 with v0.9.3
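Not an official answer to the mapping question above, just a sketch of one common approach: the gaze messages carry a `norm_pos` field in normalized coordinates (0..1, origin at the bottom left), so relating them to a frame is mostly a matter of scaling by the frame resolution and flipping the y axis. The 1280x720 size below is only a placeholder for whatever resolution your video/HMD view has:

```python
def norm_pos_to_pixels(norm_pos, frame_width=1280, frame_height=720):
    """Convert a normalized gaze position (origin bottom-left, range 0..1)
    to pixel coordinates with the origin at the top-left of the image."""
    x_norm, y_norm = norm_pos
    x_px = x_norm * frame_width
    y_px = (1.0 - y_norm) * frame_height  # flip y for image coordinates
    return x_px, y_px

# Example: a gaze point in the middle of the frame maps to the image center.
print(norm_pos_to_pixels((0.5, 0.5)))  # -> (640.0, 360.0)
```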
Hey! With the new update the left and right eye both work very well, thank you :-) I have only one more short question. I open the calibration scene, calibrate the eye tracker, and then open the new scene in which the experiment will take place (both scenes are in the same folder, I don't know if that matters). How do I know that the eye tracker is still calibrated? Is there any way to check that?