💻 software-dev


user-c01a3e 06 July, 2022, 21:24:51

I am working on building a program to plot real-time data from the Pupil Core. I have the data readout coming in through the Network API, but I'm not sure whether there is a way to plot it in real time as well. Any suggestions?
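For anyone with the same question, a common pattern is to subscribe via the Network API and feed the samples into an interactive matplotlib figure. This is only a minimal sketch, assuming Pupil Capture/Service runs locally with Pupil Remote on the default port 50020 and pupil data carrying `timestamp`/`diameter` keys; the function names (`connect_gaze_subscriber`, `extract_sample`, `live_plot`) are made up for illustration, while the REQ/SUB + msgpack handshake follows the standard Pupil Network API examples. Requires `pyzmq`, `msgpack`, and `matplotlib`.

```python
import collections

import msgpack
import zmq
import matplotlib.pyplot as plt


def connect_gaze_subscriber(host="127.0.0.1", port=50020, topic="pupil."):
    """Ask Pupil Remote for the SUB port, then subscribe to `topic`."""
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.connect(f"tcp://{host}:{port}")
    req.send_string("SUB_PORT")
    sub_port = req.recv_string()
    req.close()
    sub = ctx.socket(zmq.SUB)
    sub.connect(f"tcp://{host}:{sub_port}")
    sub.subscribe(topic)
    return sub


def extract_sample(datum):
    """Pull the (timestamp, diameter) pair out of one unpacked pupil datum."""
    return float(datum["timestamp"]), float(datum["diameter"])


def live_plot(sub, window=300):
    """Redraw the last `window` diameter samples as messages arrive."""
    ts = collections.deque(maxlen=window)
    dia = collections.deque(maxlen=window)
    plt.ion()
    fig, ax = plt.subplots()
    (line,) = ax.plot([], [])
    ax.set_xlabel("pupil timestamp [s]")
    ax.set_ylabel("diameter [px]")
    while plt.fignum_exists(fig.number):
        # Pupil messages are multipart: [topic, msgpack payload]
        _topic, payload = sub.recv_multipart()
        t, d = extract_sample(msgpack.unpackb(payload))
        ts.append(t)
        dia.append(d)
        line.set_data(ts, dia)
        ax.relim()
        ax.autoscale_view()
        fig.canvas.draw_idle()
        plt.pause(0.001)


if __name__ == "__main__":
    live_plot(connect_gaze_subscriber())
```

The rolling `deque` keeps the figure responsive by bounding how many points are redrawn each cycle; for high-rate data you may also want to batch several messages per redraw.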

user-0aca39 07 July, 2022, 08:49:27

Hi, is it possible to change Pupil Service to listen on 0.0.0.0 instead of 127.0.0.1?

papr 07 July, 2022, 08:53:44

It actually binds to 0.0.0.0, but we replace that string with 127.0.0.1 in the UI. https://github.com/pupil-labs/pupil/blob/master/pupil_src/main.py#L224-L243

user-0aca39 07 July, 2022, 08:50:31

I suppose it does listen on 127.0.0.1

user-0aca39 07 July, 2022, 09:23:12

So, will it work if I run the Unity calibration on another device, and just replace 127.0.0.1 in the calibration settings (in the Unity plugin) with the IPv4 address of the PC that is hosting Pupil Service with the PL Vive connected?

papr 07 July, 2022, 09:23:32

yes

user-0aca39 07 July, 2022, 09:23:14

Chat image

user-0aca39 07 July, 2022, 09:23:49

ok, thank you

user-0aca39 08 July, 2022, 08:53:20

Hi, does the pupil_labs_simulator.py program simulate gaze positions, or is it only a simulated Pupil Service for checking the connection? We are able to connect to it, but the client does not receive any data. The same client connected to Pupil Service with the PL Vive attached works just fine. Also, is there any documentation for pupil_labs_simulator.py?

papr 08 July, 2022, 08:54:37

It just re-broadcasts the data from the provided recording. To start receiving data, send the R command to the simulated Pupil Remote port.
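Sending the R command is a one-off REQ/REP exchange. A minimal sketch, assuming the simulator's Pupil Remote listens on the default 50020 (adjust host/port to your setup); `start_broadcast` is a hypothetical helper name:

```python
import zmq


def start_broadcast(host="127.0.0.1", port=50020, timeout_ms=2000):
    """Send the "R" command to a Pupil Remote port and return the reply.

    Raises zmq.Again if nothing answers within `timeout_ms`.
    """
    ctx = zmq.Context.instance()
    req = ctx.socket(zmq.REQ)
    req.RCVTIMEO = timeout_ms  # don't block forever if nothing is listening
    req.LINGER = 0             # drop the queued command on close
    req.connect(f"tcp://{host}:{port}")
    req.send_string("R")       # "R" starts the (re-)broadcast
    try:
        return req.recv_string()
    finally:
        req.close()


if __name__ == "__main__":
    print(start_broadcast())
```

The receive timeout matters here: a REQ socket happily queues the command even when nothing is listening, so without `RCVTIMEO` a wrong port silently blocks forever.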

user-0aca39 08 July, 2022, 09:06:20

ok, works fine

user-0aca39 08 July, 2022, 09:06:21

thank you

user-04df41 11 July, 2022, 04:48:48

Hi all, I want to know: does Pupil Invisible Companion support OP 9x or even OP 10x devices?

papr 11 July, 2022, 08:25:07

Hi, it does not yet. But we are working on it!

user-04df41 11 July, 2022, 04:48:52

Thanks

user-82bf70 11 July, 2022, 08:16:28

Hi, I am wondering whether the hardware (glasses-mounted camera) is a necessary requirement. Can we use the camera of a laptop or phone instead?

papr 11 July, 2022, 08:31:02

Hey 👋 Our software is designed for head-mounted eye trackers and will not work for remote eye tracking setups, e.g. using subject-facing cameras.

user-82bf70 11 July, 2022, 08:17:05

Btw, is there any C++ version instead of Python?

papr 11 July, 2022, 08:33:00

We don't have any examples in C++, but the underlying dependencies for Pupil Core, ZeroMQ and msgpack, are available for C++, allowing you to replicate the examples.

user-04df41 11 July, 2022, 08:25:47

Thanks Papr.

user-82bf70 11 July, 2022, 08:34:48

thank you

user-82bf70 11 July, 2022, 08:36:23

So where could I get these head-mounted eye trackers?

user-82bf70 11 July, 2022, 08:36:55

Should I buy a specific product?

user-82bf70 11 July, 2022, 08:37:19

Or is any kind of head-worn camera fine?

papr 11 July, 2022, 08:41:31

You can find an overview of our products here https://pupil-labs.com/products/ Could you let me know a bit more about your use case? I might be able to give a more concrete recommendation. Also, am I right that you have found your way here through the Pupil Core software?

user-82bf70 11 July, 2022, 09:01:09

I am trying to use the software without your product... it's too pricey.

papr 11 July, 2022, 09:06:16

For a research device, it is fairly affordable I would argue. 🙂 But I can see that it does not fit all budgets. You might be interested in the DIY version https://docs.pupil-labs.com/core/diy/

What are you planning on doing with the eye tracking data?

user-82bf70 11 July, 2022, 09:07:06

I am trying to develop an app without the devices.

papr 11 July, 2022, 09:08:22

That would be a typical remote eye tracking use case, i.e. our software won't serve this use case. But there is plenty of free remote eye tracking software out there! I am sure you will be able to find a good fit!

user-82bf70 11 July, 2022, 09:07:23

use fixed camera on laptop

user-82bf70 11 July, 2022, 09:07:26

of my mac

user-82bf70 11 July, 2022, 09:10:31

Really? I can't find a suitable one.

papr 11 July, 2022, 09:12:54

Check out this overview https://imotions.com/blog/free-eye-tracking-software/

user-82bf70 11 July, 2022, 09:11:13

I've got a future use for this, maybe in 3-4 years; it's for eye protection.

user-82bf70 11 July, 2022, 09:11:32

That's a project in cooperation with a hospital.

user-82bf70 11 July, 2022, 09:11:51

The data may be used for an illness study.

user-82bf70 11 July, 2022, 09:12:21

But that's for the future; right now I want software for eye tracking.

user-82bf70 11 July, 2022, 09:18:20

cool, thanks

user-0aca39 12 July, 2022, 07:30:55

Hi, has anyone encountered this problem? This error appears during the calibration procedure, but only once (in Unity).

Chat image

papr 12 July, 2022, 07:46:11

I am sorry, but I have not encountered this issue before. It looks like a NetMQ-specific error. Are you able to check where the error is being raised?

user-0aca39 12 July, 2022, 07:33:53

So, for example, 5 points will go through without any problem, then this error happens, and the rest will work as well. But in the end it won't show any results; it just freezes ;/

user-0aca39 12 July, 2022, 08:06:43

Chat image

user-0aca39 12 July, 2022, 08:06:47

in this class

user-0aca39 12 July, 2022, 08:06:48

Chat image

papr 12 July, 2022, 09:03:39

That class is not related to the hmd-eyes project as far as I know. Kind of difficult to estimate how that would be impacted by the calibration.

user-0aca39 12 July, 2022, 09:27:53

Problem solved!

During the Unity calibration process, another client was subscribed to the Pupil Service (.gaze) topic, which caused connection issues. The solution was to run the client only after the calibration was completed 😄

user-d07106 15 July, 2022, 17:21:02

Hi, I exported the gaze-mapped world video using the "World Video Exporter" plugin in Pupil Player. When I compared the same frame number from the world video (the one in the recordings/ folder) to the exported gaze-mapped world video (the one in the recordings/exports/ folder), they were not the same. Do you know why this would be? Is this a known issue, and is there a fix for it? Thanks!

Interestingly, these videos seem to have the same number of frames, but their frame rates differ, and thus they have different durations.

Some metadata that might be useful:
Minimum Pupil Player Version: 2.0
Player Version: 3.5.8
Recording Software: Pupil Capture
Recording Software Version: 1.6.13

I'm using OpenCV 4.5.5 to grab the frames

In the picture, the left image is the gaze-mapped world video (the one in recordings/exports/) and the right is the world video before gaze mapping (the one in the recordings/ folder).

Chat image

papr 18 July, 2022, 11:12:14

Hi, OpenCV is known to be inaccurate in this regard. Use pyav instead. See this tutorial for details: https://github.com/pupil-labs/pupil-tutorials/blob/master/09_frame_identification.ipynb

user-d07106 21 July, 2022, 13:26:31

Thanks! I'll look into it. 🙂

user-5f1a16 27 July, 2022, 15:15:39

does imotions come with pupil labs software? or is it a separate purchase add-on?

user-9429ba 27 July, 2022, 16:16:55

Hi @user-5f1a16 👋 You can read Pupil Core data into iMotions software using the iMotions exporter plugin in Pupil Player: https://docs.pupil-labs.com/core/software/pupil-player/#imotions-exporter. For data collection you would still need to use Pupil Capture. iMotions can directly import Pupil Invisible recordings. You would need to use the smartphone Companion Device which ships with Pupil Invisible together with the Companion App for real-time data capture. There is no extra cost associated with using iMotions software in either case. But you don't get iMotions software included with our products.

End of July archive