I'm working on a program to plot real-time data from the Pupil Core. I have the data coming in through the Network API, but I'm not sure if there is a way to plot it in real time as well. Any suggestions?
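(For reference, here is a minimal sketch of one way to do this, assuming Pupil Capture or Service runs locally with Pupil Remote on its default port 50020, and using matplotlib's interactive mode as just one option for a live plot; the gaze topic and norm_pos field follow the standard Pupil Core message format.)

```python
import zmq
import msgpack
import matplotlib.pyplot as plt

ctx = zmq.Context()

# Ask Pupil Remote (default port 50020) for the SUB port.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to gaze data.
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

plt.ion()
fig, ax = plt.subplots()
ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
points, = ax.plot([], [], ".")

xs, ys = [], []
while True:
    topic, payload = sub.recv_multipart()
    gaze = msgpack.unpackb(payload, raw=False)
    x, y = gaze["norm_pos"]  # gaze position, normalized to [0, 1]
    xs.append(x)
    ys.append(y)
    points.set_data(xs[-100:], ys[-100:])  # keep the last 100 samples
    plt.pause(0.001)  # let matplotlib redraw
```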
Hi, is it possible to change Pupil Service to listen on 0.0.0.0 instead of 127.0.0.1?
It actually binds to 0.0.0.0, but we replace that string with 127.0.0.1 in the UI: https://github.com/pupil-labs/pupil/blob/master/pupil_src/main.py#L224-L243
So I suppose it does listen on 127.0.0.1 as well, since 0.0.0.0 covers all interfaces.
So, will it work if I run the Unity calibration on another device and just replace 127.0.0.1 in the calibration settings (in the Unity plugin) with the IPv4 address of the PC that is hosting Pupil Service with the connected PL Vive?
yes
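(As a quick sanity check from the other device, you can query Pupil Remote directly; a minimal sketch, where 192.168.1.42 is a placeholder for the host PC's IPv4 address:)

```python
import zmq

HOST = "192.168.1.42"  # placeholder: IPv4 of the PC running Pupil Service

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect(f"tcp://{HOST}:50020")  # Pupil Remote's default port

req.send_string("t")  # ask for the current Pupil time
print("Pupil time:", req.recv_string())  # a reply means the connection works
```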
ok, thank you
Hi, does the pupil_labs_simulator.py program simulate gaze position, or does it only simulate Pupil Service to check the connection? We are able to connect to it, but the client does not receive any data. The same client works just fine when connected to Pupil Service with the PL Vive attached. Also, is there any documentation for pupil_labs_simulator.py?
It just re-broadcasts the data from the provided recording. To start receiving data, send the R command to the simulated Pupil Remote port.
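(A minimal sketch of sending that command, assuming the simulator exposes its Pupil Remote port on 50020 like the real application; adjust the address to your setup:)

```python
import zmq

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # assumed simulator address and port

req.send_string("R")      # 'R' starts the re-broadcast
print(req.recv_string())  # acknowledgement from the simulator
```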
ok, works fine
thank you
Hi all, I want to know: does Pupil Invisible Companion support OnePlus 9x or even OnePlus 10x devices?
Hi, it does not yet. But we are working on it!
Thanks
Hi, I am wondering: is the head-mounted hardware (glasses with cameras) a necessary requirement? Can we use the camera of a laptop or a phone instead?
Hey 👋 Our software is designed for head-mounted eye trackers and will not work for remote eye tracking setups, e.g. using subject-facing cameras.
By the way, is there a C++ version instead of Python?
We don't have any examples in C++, but the underlying dependencies for Pupil Core, zeromq and msgpack, are available for C++, allowing you to replicate the examples.
Thanks Papr.
thank you
So where could I get these head-mounted eye trackers?
Should I buy a specific product?
Or is any kind of head-worn camera fine?
You can find an overview of our products here https://pupil-labs.com/products/ Could you let me know a bit more about your use case? I might be able to give a more concrete recommendation. Also, am I right that you have found your way here through the Pupil Core software?
I am trying to use the software without your product... it's too pricey.
For a research device, it is fairly affordable I would argue. 🙂 But I can see that it does not fit all budgets. You might be interested in the DIY version https://docs.pupil-labs.com/core/diy/
What are you planning on doing with the eye tracking data?
I am trying to develop an app without the devices.
That would be a typical remote eye tracking use case, i.e. our software won't serve this use case. But there is various free remote eye tracking software out there! I am sure you will be able to find a good fit for you!
I'd use the fixed camera of my Mac laptop.
Really? I can't find a suitable one.
Check out this overview https://imotions.com/blog/free-eye-tracking-software/
I have a future use for this, maybe in 3-4 years; it's for eye protection.
That's a project in cooperation with a hospital.
That data may be used for an illness study.
But that's future use; right now I want software for eye tracking.
cool, thanks
Hi, has anyone encountered this problem? This error appears during the calibration procedure, but only once (in Unity).
I am sorry, but I have not encountered this issue before. It looks like a NetMQ-specific error. Are you able to check where the error is being raised?
So, for example, 5 points will go through without any problem, then this error happens, and the rest will work as well. But in the end it won't show any results; it just freezes ;/
in this class
That class is not related to the hmd-eyes project as far as I know. Kind of difficult to estimate how that would be impacted by the calibration.
Problem solved!
During the Unity calibration process, another client was subscribed to Pupil Service (.gaze), which caused connection issues. The solution was to run the client only after the calibration was completed 😄
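(One way to avoid this race in a Python client is to wait for the calibration-finished notification before subscribing to gaze; a sketch, assuming the standard notify.calibration.successful notification topic:)

```python
import zmq

ctx = zmq.Context()
req = ctx.socket(zmq.REQ)
req.connect("tcp://127.0.0.1:50020")  # Pupil Remote
req.send_string("SUB_PORT")
sub_port = req.recv_string()

sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")

# Block until the calibration has finished before touching gaze data.
sub.setsockopt_string(zmq.SUBSCRIBE, "notify.calibration.successful")
sub.recv_multipart()

# Only now switch the subscription over to gaze.
sub.setsockopt_string(zmq.UNSUBSCRIBE, "notify.calibration.successful")
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")
```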
Hi, I exported the gaze-mapped world video using the "World Video Exporter" plugin in Pupil Player. When I compared the same frame number from the world video (the one in the recordings/ folder) to the exported gaze-mapped world video (the one in the recordings/exports/ folder), the frames were not the same. Do you know why this would be? Is this a known issue, and is there a fix for it? Thanks!
Interestingly, these videos seem to have the same number of frames, but their frame rates differ, so they have different durations.
Some metadata that might be useful:
Minimum Pupil Player Version: 2.0
Player Version: 3.5.8
Recording Software: Pupil Capture
Recording Software Version: 1.6.13
I'm using OpenCV 4.5.5 to grab the frames
In the picture, the left image is the gaze-mapped world video (the one in recordings/exports/) and the right is the world video before gaze mapping (the one in the recordings/ folder).
Hi, OpenCV is known to be inaccurate in this regard. Use pyav instead. See this tutorial for details: https://github.com/pupil-labs/pupil-tutorials/blob/master/09_frame_identification.ipynb
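(For reference, a minimal sketch of reading frames with pyav, using placeholder paths; frame.pts gives the presentation timestamp that makes frames comparable across the two videos:)

```python
import av

def frame_pts(path):
    """Yield (frame_index, pts) for every video frame in the file."""
    with av.open(path) as container:
        stream = container.streams.video[0]
        for i, frame in enumerate(container.decode(stream)):
            yield i, frame.pts

# Placeholder paths: the original and the exported world video.
original = dict(frame_pts("recording/world.mp4"))
exported = dict(frame_pts("recording/exports/000/world.mp4"))
print(len(original), len(exported))
```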
Thanks! I'll look into it. 🙂
Does iMotions come with Pupil Labs software, or is it a separate purchase/add-on?
Hi @user-5f1a16 👋 You can read Pupil Core data into iMotions software using the iMotions exporter plugin in Pupil Player: https://docs.pupil-labs.com/core/software/pupil-player/#imotions-exporter. For data collection you would still need to use Pupil Capture. iMotions can directly import Pupil Invisible recordings. You would need to use the smartphone Companion Device which ships with Pupil Invisible together with the Companion App for real-time data capture. There is no extra cost associated with using iMotions software in either case. But you don't get iMotions software included with our products.