@user-570235 [email removed] : I just updated UDPCommunication to work with Unity's Holographic Emulation, as well. The "Remote to Device" mode, that is.
Thank you. Going to test it asap
likewise, thank you
Any progress on the 3D calibration routine? When I last checked in December, it was quite unusable.
We're working on integrating eye tracking with an HTC Vive used in Prepar3d, looking to get timestamps, pupil size, and gaze locations. I understand that there is some stuff built out for use in Unity, but we are not using Unity at all. Does anyone have any advice on how we can calibrate without using Unity?
@user-988d86 The problem may be harder than you're suggesting. Even in Unity, only eye and head orientation are provided. To get "gaze locations" requires a suite of additional algorithms that project a ray from the eye+head into the simulation and detect the collision location, or the angular distance of this ray from regions of interest.
@user-988d86 ...but, if I focus on the issue at hand: no, I'm not aware of any calibration routine that is built into Prepar3D. You have to first calibrate in Unity. The calibration is stored in the pupil server. Data then has to be extracted via the SDK in Prepar3D.
@user-988d86 ...but, I'm not a pupil developer or contributor, just a user. The pupil team may have additional information.
...also, note my previous message: the calibration (even in Unity) is kind of a mess right now.
@user-8779ef @user-988d86 I'll try to give a higher-level overview of how the Unity integration and other possible integrations work.
We do actually report gaze data in Unity and we provide this data from Pupil Capture/Service.
The way that Unity3D or any other app interfaces with Pupil is via the ZMQ Pupil message bus. We call this the IPC.
It's basically TCP with ZMQ-serialized messages.
When you talk to the bus, you are able to get pupil, gaze, logging, and all other data from Pupil, as well as send instructions to Pupil.
on a high level, what you do to calibrate in VR is to send a message on the bus that says "start calibration mode"
then you show an animation to the user that directs their gaze and report these positions in the VR space coordinate system to pupil.
When you send "stop calibration", Pupil Capture/Service calibrates and from then on reports gaze data on the IPC that maps into this VR space.
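A minimal sketch of that message flow in Python. Topic strings follow Pupil's "notify.&lt;subject&gt;" convention, and the calibration subjects match those seen later on the notification bus ('calibration.should_start', 'calibration.add_ref_data', 'calibration.should_stop'), but the payload fields here are assumptions to verify against the hmd-eyes Python reference client; on the wire each payload is msgpack-encoded and sent over ZMQ, which this sketch omits.

```python
def notification(subject, **extra):
    """Build a (topic, payload) pair following Pupil's notification
    convention: topic is 'notify.<subject>', payload carries 'subject'."""
    payload = {"subject": subject}
    payload.update(extra)
    return ("notify." + subject, payload)

def hmd_calibration_messages(ref_points):
    """Yield the ordered messages for one HMD calibration run.

    ref_points: list of dicts like
        {"norm_pos": (x, y), "timestamp": t, "id": eye_id}
    i.e. the target positions shown to the user, reported in the VR
    space coordinate system. The 'ref_data' field name is an assumption.
    """
    msgs = [notification("calibration.should_start")]
    for ref in ref_points:
        msgs.append(notification("calibration.add_ref_data", ref_data=[ref]))
    msgs.append(notification("calibration.should_stop"))
    return msgs
```

Each (topic, payload) pair would then be serialized and sent over the IPC; after the stop message, gaze data mapped into VR space comes back on the same bus.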
Makes sense. Is there a good example file for reference in hmd-eyes?
Even better.
this is a reference implementation in Python. It's Python 2. It should be easy to read and does all of what I said above.
You can of course do much more than this; basically everything we do in Pupil is exposed on the IPC and accessible this way.
I recently added the Python example because the Unity3D implementation is not as easy to read.
I hope this helps!
So, @user-988d86 , you'll have to figure out how to present calibration targets within Prepar3D. However... I really suggest you play with the calibration inside Unity first. The 2D calibration degrades very quickly due to helmet slippage, and the 3D calibration is a work in progress (not really ready for use yet, by my standards).
A different route of integration is to write a plugin for Pupil that talks to your app via some other means.
@user-8779ef that is very true. 2D calibration does not deal with slippage and 3D calibration is WIP. I really hope we can push some big improvements on 3D very soon!
mpk: I'm (hopefully) teaching a workshop on eye tracking in VR at the next VSS.
I would love to showcase the pupil as a low-cost open source option, so please keep me in the loop.
Right now, I can't recommend people run out and try to integrate it into their research tracks.
@user-8779ef will do!
Not the VR integration, anyhow. I've been having a great time with the mobile implementation.
Do you know if Pupil will have a presence at VSS?
There are some other ways to do slippage compensation. We are trying to use 3D to solve it for us; otherwise we'll add slippage comp through other means.
Yes - it looks like you guys are using pupil tracking only (no corneal reflection). That's some low hanging fruit.
@user-8779ef exactly, if you track glints you can compensate for slippage.
I worry, though. The offset camera angle is extreme. Only time will tell... 😃 😃 . I hope it works out, obviously
we are really close to good slippage comp using 3d model improvements, I hope this will be enough.
since we have 5 IR LEDs per eye, it should be easy to get some of them tracked.
Can I strongly recommend that you default to a vector based representation of gaze direction? The ol' crotch laser in the previous version of the market demo was reasonable.
It is a huge risk to assume that depth can be recovered with any accuracy.
(as is assumed by the current point-in-depth representation)
@user-8779ef 100% open to your input! let us know what you think works best in this case!
I have to sign off now (CET timezone for me.)
Well, I've never seen a demo that does a good job recovering depth from a binocular tracker.
So, I wouldn't START with that.
Ok, thanks again, mpk.
I'm glad I could help! And let's continue the discussion about gaze representation in VR!
sure, I'm going to try and be more of a presence here, as I'm using the system quite a bit lately.
@user-988d86 One more thing - even if 3D calibration isn't working just yet, you can get pupil diameter out for basic pupillometry. ...and tell David I said hi (Gabe here - we hung out in Ohio). edit wait - I may take that back. Pupil diameter without the underlying 3D model is going to be uninformative ...it will be pixel width/height, not mm.
With some hard constraints and an excellent calibration, degradation due to slippage can be corrected post facto. I have been using a quantile-based approach to move the points around the center. One must distribute stimuli equally around the center of the screen.
Preferences may develop in some scenarios, so you must change the position of the stimuli in a balanced manner.
Changing stimulus positions with an equal spatial distribution is very common in my field. So, not a big deal.
I'm interested and confused by what you're saying @user-41f1bf . Can you explain further? And what's preventing one from calibrating, then tracking features around the eye over time to detect how the headset is slipping? Also @user-8779ef what is the use case for a binocular tracker without a 3d model of what the user is looking at? i.e. why do you need depth estimation if you can model the surfaces they're looking at?
(then perform a raycast from 2d to 3d) oh I guess a window
@oysterboy#4578 Sometimes the gaze vector will pass near to, but not collide with, the true object being foveated by the subject. To identify this requires a measure of angular distance from the gaze vector to the edge of the object (or centroid, if that's a reasonable enough approximation).
These near-misses (without collision) occur for smaller objects, or when viewing something at a distance, when tracking a moving object, or when the subject is looking at the edge of an object.
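The near-miss check described above boils down to the angle between the gaze direction and the direction to the object (its centroid, as a reasonable approximation). A minimal stdlib sketch, assuming both directions are 3D vectors expressed in the same head/world coordinate frame:

```python
import math

def angular_distance_deg(gaze_dir, target_dir):
    """Angle in degrees between a gaze direction and the direction to a
    target (e.g. an object's centroid). Both inputs are 3D vectors in
    the same coordinate frame; they need not be unit length."""
    def normalize(v):
        m = math.sqrt(sum(c * c for c in v))
        return tuple(c / m for c in v)
    g, t = normalize(gaze_dir), normalize(target_dir)
    # clamp the dot product to guard against floating-point drift
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(g, t))))
    return math.degrees(math.acos(dot))
```

An object would then count as "foveated" when this angle falls below some threshold (e.g. a couple of degrees), even if the raycast itself never collides with the object's mesh.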
@user-d74bad See the above messages - sorry, keep on forgetting to tag you.
I need the information for playing hmd-eyes unity project downloaded from git , in htc vive
ahhhh interesting, thanks
Any help would be appreciated thanks in advance
@user-41f1bf are the stimuli the calibration object?
Hi @user-d74bad . I am talking about a workaround I have been using.
It is not available for real-time HMD use, but it should work
sure, just trying to figure out what you're saying
it's for a recording then?
Since slippage compensation does not work yet, I am correcting deviations post facto. You may ask.. How?
yes, I'm curious how
are the stimuli you're talking about, are those part of the calibration procedure?
no
Please, wait a moment, I am writing a brief description here...
For this to work, you must have some constraints:
1) You must have a known center (let's say, the surface centroid)
2) Along your recording session, the stimuli presented must be equally distributed around this center
3) Consider a cloud of timestamped xy coordinates around this center
4) Then you can correct using the non-parametric barycenter of the cloud along different moments in time
Code here: https://github.com/cpicanco/fpfn-analysis/blob/master/analysis/correction/init.py
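Separately from the linked implementation, the idea can be sketched in a few lines of stdlib Python: window the timestamped cloud, take the median (a non-parametric barycenter) per window, and subtract the drift relative to the known center. The window length and the (timestamp, x, y) record format are assumptions for illustration:

```python
from statistics import median

def correct_drift(samples, center, window=30.0):
    """Post-facto slippage correction sketch: samples are (timestamp, x, y)
    gaze points that, by design of the stimuli, should be distributed
    evenly around a known center. Per time window, the median of the
    cloud estimates the drift, which is subtracted from every point."""
    if not samples:
        return []
    t0 = samples[0][0]
    # group samples into consecutive fixed-length time windows
    windows = {}
    for t, x, y in samples:
        windows.setdefault(int((t - t0) // window), []).append((t, x, y))
    corrected = []
    for _, pts in sorted(windows.items()):
        dx = median(x for _, x, _ in pts) - center[0]
        dy = median(y for _, _, y in pts) - center[1]
        corrected.extend((t, x - dx, y - dy) for t, x, y in pts)
    return corrected
```

The balanced-stimuli constraint matters because the median only tracks slippage (and not genuine gaze preference) when looks are distributed symmetrically around the center within each window.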
You calibrate normally. All you need for this correction to work is to present stimuli this way
It would also be possible to present stimuli without any constraints at all as long as you ensure people will look at the center in between (e.g., during the inter trial interval you present something in the center, or using a starter in the center).
To make this job easier, I have been using transformed Pupil surfaces (homographic/perspective transformations) so that you have a rectangular surface instead of a trapezoidal one.
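For reference, a homographic transformation like the one mentioned above can be sketched without OpenCV: a direct linear transform (DLT) from the four corners of a trapezoidal surface onto the unit square. This is a generic sketch, not the Pupil surface-tracker implementation:

```python
def solve(A, b):
    """Solve the linear system A x = b by Gaussian elimination with
    partial pivoting (stdlib only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def homography(src, dst):
    """Compute the 3x3 perspective transform mapping four source corners
    (e.g. a trapezoidal surface outline) onto four destination corners
    (e.g. the unit square), with h33 fixed to 1."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(A, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_h(H, pt):
    """Apply homography H to a 2D point (with perspective division)."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)
```

Once computed from the surface corners, the same H rectifies every gaze point, so the drift correction can operate on a rectangular coordinate system.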
Is anyone having strange 2D calibration with a mono eye tracker? I have a fresh computer and am trying to use the new HMD-eyes with Unity 2017.2.0 and pupil 1.2.7. Everything seems to work except the calibration is completely off for all of the demos. Before this I have been using Pupil 0.9 and an older hmd-eyes which was mostly working.
@user-90aa66 Calibration Markers have changed
You will not be able to calibrate using old markers in v1.2.7
At least not out-of-the-box
detection of new markers was optimized
Hi all, I'm having a bit of trouble running Pupil from the source
The install seems to have gone fine, but when I try to run main.py, I get the following error: https://pastebin.com/B8iJxZ9K
I've tried searching for similar cases online - in fact, I found a much older Pastebin that contained much the same error message - but I haven't had any luck. From what I can tell, ZMQ is experiencing some kind of error when looking for the appropriate socket. Any advice is greatly appreciated!
I am running Ubuntu 16.04
@user-41f1bf oh I see, so what is the best way to calibrate now? Do I need to use an older version of pupil to calibrate out of the box with the current unity plugin? Or is there some settings to change somewhere?
If you want to use v1.2.7, you can calibrate offline using natural features. For real time you should use new markers (of course).
@thepod#1280 I wrote you a PM so we can go into details on how the updates affect your setup
@user-e04f56 Sorry for my late reply. I confirmed that solved the problem. Thank you for your support!!
Hi! I have been working with Pupil Labs for a few days now and Pupil Capture works, but I am having trouble getting Unity to connect with the software. On HoloLens I can deploy the program but it doesn't seem to get a connection, and in the Unity editor I keep getting log messages: "Could not connect, retrying in 5 seconds". Could someone point me in the right direction to get it working?
Hello. We're using Unity 2017.3, the latest release of pupil_service, and a freshly cloned version of hmd-eyes. In the Unity Editor, I open up the Market Scene Demo/2D Calibration Demo scene and press play.
Problem 1: If pupil_service is not already running, Unity launches it and one eye camera pops up (but not two), and Unity reports that it cannot connect. Stopping Unity at this point causes Unity to crash.
On my second attempt, with pupil_service already running, things seem to connect OK. I start calibration (key c). BTW I have all the debug boxes checked in PupilSettings. After the 8 calibration points are done, nothing happens. No market scene, no indication that it failed, nothing. This is what I get in the console:
Point: 7, Sampling at : 119. On the position : 0.5498792 | 0.4374535
Sending ref_data
Followed by many lines of
|| norm_pos | 0.5498792 , 0.4374535|| timestamp | 70.40639|| id | 1
|| norm_pos | 0.5498792 , 0.4374535|| timestamp | 70.40639|| id | 0
If I stop Unity at this point, it doesn't crash. Maybe worth noting, even after I close Unity I can't delete the hmd-eyes folder because hmd-eyes\unity_pupil_plugin_vr is being used by another process. I am only able to delete it after I close pupil_service.
I attached VS debugger and put a breakmark in Connection.cs line 129, that's the anonymous function attached to subscriptionSocketForTopic[topic].ReceiveReady
delegate within which the messages are read. The breakpoint is never triggered. I'm not familiar enough with Visual C# to know whether or not it's possible to debug anonymous functions as event handlers but I suspect it is and therefore this is a problem: the NetMQSocket object is not triggering its ReceiveReady event.
I switched to Unity 2017.2 and MSVC 2017 (was on 2015) and the problem persists. The furthest I've been able to debug this is that https://github.com/pupil-labs/hmd-eyes/blob/master/unity_pupil_plugin_vr/Assets/pupil_plugin/Scripts/Networking/Connection.cs#L200 never returns true, so Poll()
is never called. I guess this means that SubscriberSocket never has the correct message.
@user-5d12b0 @user-e04f56 will have a look at this. He should get online in about 12hrs.
@mpk thanks. I am now downloading pupil service 1.1.2 to see if it works there.
OK it works with pupil_service 1.1.2.
I used python_reference_client/notification_monitor.py to see what was happening.
With pupil_service 1.1.2, after the last 'notify.calibration.add_ref_data'
, we get 'notify.calibration.should_stop'
, 'notify.calibration.stopped'
, 'notify.calibration.successful'
and so on.
With pupil_service 1.2.7, after the last 'notify.calibration.add_ref_data'
, we get 'notify.calibration.should_stop'
, and nothing else.
I was never able to get pupil_service built from source so I can't debug further than that.
We definitely need version checking and then we need to figure out what's wrong in your setup.
Pupil service is just running capture with service as arg
'service'
@cboulay#4121 thanks for pointing out the bug. I was able to reproduce the behaviour you described with the current version of Pupil Service 1.2.7. It seems to be limited to Pupil Service, though, as Pupil Capture v 1.2.7 was working as intended. Hope you can use this or stay on v 1.1.2 while we try to find out what is going wrong.
Hi! I still have the same problem and can't figure it out. I have tried all IP adresses I can think might work and opened the port but it still doesn't work. Any suggestions?
@user-f51806 For UDP you have to use the "Hololens Relay" not "Pupil Remote"
Thank you! That got me one step further. This is what I get now with "Hololens relay" plugin set active. I assume it only works once I deploy the program to the hololens? Should I use something else to test things on Unity editor? Is there documentation somewhere so I could refer to rather than ask things here?
Hello there! I'm a newcomer to the Pupil world: I started working with the device for the HTC Vive a few days ago ^^ My issue is that I'm unable to get my calibration scene working (https://github.com/pupil-labs/hmd-eyes/tree/master/unity_pupil_plugin_vr ): after following the last point position, nothing happens (it is supposed to be that "the Calibration.unity scene will display three colored markers representing the left (green) and right (blue) eye plus the center point (red)."). Note that I already managed the Pupil Service configuration as explained (https://docs.pupil-labs.com/#pupil-capture-demo-video )... Does anyone have any advice/links for a beginner?
@user-f51806 at the beginning of the year, we implemented support for Holographic Emulation that lets you test HoloLens projects in Editor: https://blogs.unity3d.com/2016/09/29/introducing-holographic-emulation/
using that, you will not have to deploy to HoloLens
@user-297203 are you using Pupil Capture or Pupil Service? as was pointed out [email removed] , there is a bug with the current version 1.2.7 which behaves similarly to what you are describing. As you are just starting with Pupil, I would recommend using Pupil Capture, as it gives better insight into what is happening, "confidence level" being one variable to watch, plus the console output for feedback on the calibration process
@user-e04f56 My mistake, I used Pupil Capture (and not "Service" as I said). As you mention the "confidence level": I remember that when I tried it (following https://docs.pupil-labs.com/#pupil-capture-demo-video), after setting the "Pupil Detector 3D", when I go back to the Capture window, I have nothing on the id conf, as you can see on the screenshot below... It's possible that I missed something too... I guess ^.^'
@user-297203 I sent you a PM
Another question, about installing the new exposed film for the IR sensors: after installing it, I am not able to get the camera to focus. I have turned off the auto focus, and have tried redoing it a couple of times to see if I put it on incorrectly, but still the same problem. When I take it off, the camera works fine and focuses. Any fix for this? I have been using normal exposed film from a camera; is there a different material I should use?
oops wrong chat sorry
Hello everyone, my team and I just got the pupil-labs for the htc vive. We have some questions about the calibration, is there someone who could help?
the markers after calibration are perfectly on top of each other (too perfect) and in the Market demo scene the markers point in a different direction than the hmd is facing
@user-6dcde6 is their position changing/updating or are they staying at their initial position?
@user-e04f56 their position only changes if the hmd moves
they don't seem to change when the eyes are moving
@user-e04f56 do you know what would be going on?
nvm got it working now
@user-6dcde6 can you say what changed?
I managed to get it running and it works nicely. Thank you! As a side note, I found an odd behaviour which I assume is a bug. If you turn on streaming while using the HoloLens, the eye tracking fails (or at least the visualization of it). The streaming screen is larger than the viewport of the HoloLens, and the eye tracking is suddenly mapped to the larger screen, so the eye tracking points are mapped correctly proportionally to the screen but to the wrong screen size. Also, if streaming is on while calibrating, the calibration dots can be outside of the HoloLens viewport (you can't see them except from the stream). Thanks again!
@user-e04f56 readjusting the ports, we never saw "calibration ended" until we readjusted it
the udp communication from the emulator also is working for me, nice work!
*remote to device that is
Please update GitHub about the 1.2.7 bug issue in Unity. I have been trying to run the pupil gaze tracker since yesterday; if someone had notified me about the bug, I wouldn't have suffered. And thank you @user-41f1bf for the information.
Maybe a bug? If I set the default path (c:\Program Files\Pupil Service) in the Unity Pupil Gaze Tracker inspector and then change the path, Pupil Gaze Tracker doesn't change the path. The actual path is still c:\Program Files\Pupil Service, but the path seems to be changed in the inspector. OS: Win10, Unity version: 2017.2.0f3
@cboulay#4121 after being able to reproduce the problem with Pupil Service v1.2.7, we were just able to debug and fix the problem. According to my colleague, it resulted from some UI changes made in Pupil Capture, which were causing Service to crash. It will be included in the next release. I hope you will be able to stay with 1.1.2 or use Pupil Capture in the meantime.
@user-e11424 I just tried to reproduce the behaviour you described (all with Pupil Service 1.1.2) and only observed the path not updating correctly when there was still a background process of pupil_service running from the old path. In that case, Unity discovered the old process, reported " Pupil Capture/Service is already running ! " and the eye windows were started from that process. Otherwise, path changes worked as intended.
@user-e04f56 , that's good news. Thank you very much. I think we will use Pupil Capture for now. Had I encountered a different bug and been using Pupil Capture then I probably could have identified the problem faster by monitoring the console. My concern with using Pupil Capture vs Pupil Service is that we are already operating near the limits of our system and I assumed Pupil Capture has more overhead than Pupil Service. If you have knowledge that Pupil Capture does not really take any more resources than Pupil Service (once the eye camera windows are minimized) then please let me know and we may just use Pupil Capture from now on.
@user-5d12b0 your assumption is correct; the resource use is not higher when using Capture.
@mpk Do I have to worry about the world camera window? I'm fine with just minimizing this gray screen but if there's a setting I should set to make sure this window isn't wasting cycles then please let me know.
@user-5d12b0 the window itself especially when minimized does not use much CPU. The gaze mapping happens there, this is the major consumer.
I am building a calibration scene for the Hololens based on the demo scene. I reduced the PupilDemoManager into following code: https://paste.ofcode.org/bNcUa6qSyCvkCKAETGjp4e
It runs nicely, connects and calibrates (shows targets), but then I get this message in the pupil_capture command prompt:
Stopping Calibration
world - [WARNING] calibration_routines.hmd_calibration: No matched ref<->pupil data collected for id0
world - [WARNING] calibration_routines.hmd_calibration: No matched ref<->pupil data collected for id1
world - [ERROR] calibration_routines.hmd_calibration: Calibration failed for both eyes. No data found
Did I miss something?
How do I set the FPS rate for Pupil Capture in the hmd-eyes library? Although I set its rate to 30 fps in UI mode, it gives me 90 fps. I modified its code a bit and it does the following things: 1. extract gaze position info from "PupilData._2D.GetEyeGaze(GazeSource.LeftEye).ToString();" 2. write the position info to a text file. When I count the results, it prints 90 records per second. Are there other ways to set its rate?
@user-dd79ea is each result unique or do you have blocks of 3 with the same value?
Unity in VR should always run at the HMD's refresh rate, and that is 90 Hz for Rift, Vive, and a few others. So an Update() function in Unity will be called 90 times per second, even if its code is fetching data that is updated slower than that.
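Given that, one pragmatic way to log only fresh gaze samples from a 90 Hz Update() loop is to drop consecutive records whose payload repeats. A sketch, assuming records are (timestamp, x, y) tuples with the timestamp excluded from the comparison:

```python
def dedupe_consecutive(records, key=lambda r: r[1:]):
    """Keep only the first of each run of records with identical payloads.

    Useful when a 90 Hz Update() loop re-reads a value that is refreshed
    more slowly (e.g. 30 Hz): identical consecutive (x, y) payloads are
    collapsed. `key` selects the comparable part of each record; by
    default, everything after the timestamp."""
    out, last = [], object()  # sentinel that never equals a real key
    for r in records:
        k = key(r)
        if k != last:
            out.append(r)
            last = k
    return out
```

Note this only removes exact repeats; if the tracker adds noise or interpolates between sensor updates, every sample differs and this filter keeps them all.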
I don't know how pupil's gaze estimation works. Many things that have slow sensors but work in VR will use a dynamics model, and each request of the model return a new value that is predicted X msec into the future since the last sensor update, even if the model is only updated with sensor data infrequently. This is how the Vive positional tracker works: you can request the HMD or controller poses at >10 kHz and get new values every time even though the lighthouses system only updates position at about 30 Hz (slightly more complicated because the IMU sensor fusion provides some positional information too at a higher rate, but still not 10 kHz).
@Eero#0410 I have two things you could check: 1. Go to Connection.sendRequestMessage(Dictionary<string,object> dictionary) and un-comment line 112. This will log whether the reference points are sent or not. 2. Add the following section to the switch statement in UDPCommunication.InterpreteUDPData(byte[] data): case 'R': Debug.Log ("Reference points received"); break; This will show whether the points are received by the HoloLens Relay plugin. I will add the latter to the source, myself, and add a debug settings check.
@user-e04f56 Thank you for the suggestion. However, holographic remoting isn't working for me (it cannot connect to Pupil while using it, even though the deployed program can, and I'm having this issue in 2017.3.0b5, which is supposedly fixed but seems that it isn't: https://forum.unity.com/threads/holographic-emulation-not-working-in-unity-2017-2-0b8.493070/ ). Thus I have no access to the Editor log while testing, which is not nice. I also tried using the pristine (unmodified) demo you provide and it also failed in calibration. Since it worked before, the problem must be on Pupil Capture's end. Fair enough; after several restarts of Pupil Capture I managed to get it right and calibration succeeded with both scripts. I can't tell what the problem was, though.
@user-f51806 so are you deploying directly to HoloLens? if you use Visual Studio, you can debug while running on the device. That is how we used to do it before we implemented Holographic Emulation. Just use these Unity build settings when building:
Thanks @user-e04f56 ! I will have a look at that. Yes I am deploying directly at the moment.
Hi! I've had a strange problem since this morning... I'm making a pointing visual metaphor based on the VR demo (same calibration elements). I worked on it yesterday and everything was working, but today, when I launch my play mode, the connection is never made...
I tried re importing my pupil assets, deleting the pupil manager prefab and instanced used from the demo --> same problem. I tried on the demo --> the connection works. I tried on a new virgin project --> the connection works.
Is it possible that there is some elements in my project configuration which prevent me to make the connection ?
@Broopyn#2541 can you set a breakpoint in PupilTools.Connect(bool retry = false, float retryDelay = 5f) to see whether it is called at all? If you have two separate projects, what could have happened is that you, by accident, de-selected the 'Auto-connect to Pupil' option in the inspector:
This would result in "Connect" not being called when you hit "Play"
Hi @user-e04f56 ! When I tried to resolve it yesterday, I also tried de-selecting the "Auto-connect to Pupil" option and starting the connection manually, but I still had the problem. When I click on play, the canvas still displays "Connecting to Pupil" and nothing happens... I don't get the Debug.Log that says the connection failed and will be retried in 5 sec... I'm wondering if an asset installed Friday (Multi-Camera Window) made that mess when I closed my project (the last thing I see... :/). Anyway, I backtracked on my project and restarted my work ^^'
I still have the same problem on my new project after closing then opening it -_-' ... After looking at the code of PupilTools.cs, it seems that this line of the Connect function (l.205) never ends: yield return new WaitForSeconds (3f); I suppose it's because the PupilDemoManager launches the Start() function afterwards... Does someone have an idea of why my WaitForSeconds could never end?
Okay, so the answer of my question "Is it possible that there is some elements in my project configuration which prevent me to make the connection ?" is... Yes : the Unity Time preferences ! Don't ask me how, but this f****g project changes its values (TimeStep, Time Scale) sometimes on project opening... >.< '
Hello. Is there a best known methodology to do 3D-Calibration? I am using a HTC-Vive + Win10 setup.
@user-f45055 Sadly, it's still not working very well.
Unity, I assume?
Hello! I have a question. What method is unity_pupil_plugin_vr using to estimate depth? I want to make a program to find depth from gaze.
Hi, I want to know if it is possible to subscribe to multiple topics on a single socket. I see that in the source code for Connection.InitializeSubscriptionSocket(), there is a subport that is used to initialize sockets, which are then paired with a topic. I believe this is causing Unity to crash when I make multiple calls to this function, as it tries to establish new sockets for topics but using the same port. What I want to achieve is very simple; I want to be able to subscribe to multiple topics.
@user-8779ef Hello, you mean the calibration is not working very well with the mentioned rig (htc+win10)? I have the same setup and I'm interested in purchasing the kit for the Vive and developing in Unity... thanks
I’m working on analyzing gaze data from viewing skybox scenes in Unity. Given that I have the head direction in pitch, yaw, roll and gaze in normalized x,y position, is there an easy way of retrieving the (2D) gaze position if analyzing this data as inside a sphere or converting this to other projections?
Does anyone know of useful resources or tips for this type of analysis?
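As a starting point, here is a minimal sketch of mapping head yaw/pitch plus a normalized viewport gaze position onto equirectangular (u, v) coordinates. The field of view, the (0.5, 0.5)-centered gaze convention, and the small-angle addition of gaze offsets to head angles are all assumptions to adapt to your rig; roll is ignored:

```python
def gaze_to_equirect(head_yaw_deg, head_pitch_deg, gaze_norm,
                     fov_deg=(100.0, 100.0)):
    """Map head orientation plus a normalized in-viewport gaze position
    to equirectangular image coordinates (u, v in [0, 1]).

    Assumptions (hypothetical, adjust to your setup): gaze_norm is (x, y)
    in [0, 1] with (0.5, 0.5) at the viewport center; fov_deg is the
    HMD's horizontal/vertical field of view; gaze offsets simply add to
    head yaw/pitch (small-angle approximation); roll is ignored."""
    gx, gy = gaze_norm
    yaw = head_yaw_deg + (gx - 0.5) * fov_deg[0]
    pitch = head_pitch_deg + (gy - 0.5) * fov_deg[1]
    # wrap yaw into [-180, 180), clamp pitch to [-90, 90]
    yaw = (yaw + 180.0) % 360.0 - 180.0
    pitch = max(-90.0, min(90.0, pitch))
    u = (yaw + 180.0) / 360.0
    v = (90.0 - pitch) / 180.0  # v = 0 at the top of the image
    return (u, v)
```

With (u, v) in hand, multiplying by the skybox texture resolution gives pixel coordinates for heatmaps or fixation overlays; other projections (e.g. cubemaps) would need their own direction-to-pixel mapping.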
@user-9f8042 They have been in the middle of a refactor of the 3D calibration method for a few months. I haven't yet seen improvement on the output from the 3D pupil tracking + 3D calibration mode. Last time I checked, in early Jan, the track was noisy, and not very useful. Perhaps there has been some progress, but I'm not sure.
@user-9f8042 I do not work for pupil. This would be a good time for one of their team to weigh in on the state of things, though.
@thepod#1280 did you have a look at the new Heatmap demo scene? it translates the gaze information to particles (that fade after a set amount of time) and allows you to export a spherical video of the whole scene. It is currently using the market scene assets, but it would be easy to adapt it to just use a skybox background.
@user-e04f56 no, but that sounds very cool I will have to check it out when I next update!
I will still need a way to analyze the head direction and gaze points and convert the gaze positions to points on different projections outside of Unity. I'm new to processing gaze data and working with 360 degree data so just hoping someone knows of any good plugins/programs/algorithms out there before I start from scratch.
Can anyone help me troubleshoot blink detection for HTC vive with Unity?
@TheRampage#4633 are you having problems with the Unity demo or with the Pupil Blink Detection plugin?
@user-e04f56 I'm having trouble with the Blinking Demo and the Calibration Demo. It seems like the function 'CustomReceiveData' isn't being called at all, and although the Pupil app says it's connected, I don't believe I'm receiving any data from the app.
@user-8779ef thanks for your kindly answer
@wrp so mate....cv1 when??? still delayed?
@TheRampage#4633 if neither the Blink nor the Calibration demo are getting any data, it seems like there is a general problem on the receiving end (the Calibration demo does not use CustomReceiveData). just a quick check: are you using Pupil Capture 1.2.7? I am asking because there is currently a (soon to be fixed) bug with Pupil Service 1.2.7, which may lead to the problem you are describing. Pupil Capture would also give us the chance to debug the situation a little more. If you want, you can send me a DM and we can schedule a remote session.
Hello,
Can I ask? I want to do some head prediction in VR with the HTC Vive based on eye tracking data... This HMD sample is really good, and there is everything I need... But I cannot run the calibration successfully... I start the calibration process with C... it displays a white circle that doesn't look very nice, and after looking at the circle, at the end it displays a message like "calibration failed". What is the reason?
Another question, because I am new to Unity: if I want to run this pupil-plugin-vr project, should I just click on the Pupil Import Package to open the project in my Unity Editor? And if I wanted to get data from the HTC Vive, like head prediction, do you have some tips on what to use and where to put my code?
Hello everyone,
I was wondering why Pupil Capture wants to use Windows PowerShell to update drivers?
@Flyzza#4990 this is so you don't have to manually replace drivers for the headset cameras we use.
Hmmm does it take long to update those drivers? Cause it doesn't seem to work
@user-6dcde6 it should not take longer than 30 sec.
@mpk weird we have waited much longer than that
Is there a way to skip that update?
it should only run once and the cameras should work after that.
@user-c9c7ba the Pupil Import Package is not intended to be used with the version you download from Github. it is rather meant for existing projects to import everything you need to be able to communicate with Pupil. but I think as a first step, you should have a look at the example project. to open it, choose “Open Project” from Unity and select the Pupil_Plugin_VR folder. it is important that you select this folder and not one within it, as Unity loves to generate new projects within projects if it does not find the Assets and ProjectSettings subfolders
@user-c9c7ba but as you seem to have been able to run a calibration routine, you are at this point, already?
@user-e04f56 thanks for the reply :)... yes, I have done that already... what I am doing: 1. I run the scene 2DCalibrationSceneDemo in Unity 2. I follow the white circle 3. After it ends, nothing happens; no errors, no success. When I debug the code, it calls the stopCalibration method at the end. I have debugged the requestMessages too; I am sure that the version, subport, and so on are fine
does anybody know what I am doing wrong? I expect a success message at the end
@JozefD#7276 I sent you a DM so we can go into more detail about what might be going wrong