💻 software-dev


user-287b79 06 August, 2024, 02:51:02

Hello, is there a link to the instructions available somewhere? This one seems to be broken.

Chat image

user-f43a29 06 August, 2024, 03:29:45

Thanks for pointing this out, @user-287b79 ! We have updated the link. The new location of that document is here.

user-5d429a 06 August, 2024, 11:13:10

Hello! Has anyone had a problem installing nslr with Visual Studio? How do you solve it?

note: This error originates from a subprocess, and is likely not a problem with pip.
error: legacy-install-failure

× Encountered error while trying to install package.
╰─> nslr

note: This is an issue with the package mentioned above, not pip.
hint: See above for output from the failure.

user-f43a29 07 August, 2024, 08:10:44

Hi @user-5d429a , may I ask for more clarification:

  • I assume you mean Visual Studio, the C/C++ IDE, since nslr requires compiling C++ code? Just to clarify, you do not exactly install Python packages into Visual Studio, but rather into your local or global Python environment. The install process of nslr will use the C++ compiler to build a supporting library and then you can use the resulting package in Python.

  • Are you trying to install nslr-hmm or nslr from our GitHub repos? If so, what version of Windows are you on, what version of Python, and what version of Visual Studio are you using?

Note that you can also try the more up-to-date version of nslr at the original repository, but I cannot make any guarantees about whether it is fitting for your use case, nor if it will operate similarly to our fork.

user-5ab4f5 07 August, 2024, 06:12:48

@user-480f4c Thank you. One additional question, however: if I get the heatmaps, I can't really download the data behind them in a .csv format, can I? In case I want to work with this data statistically, e.g. a Principal Component Analysis.

user-07e923 07 August, 2024, 06:17:12

Hey @user-5ab4f5, if I can just quickly jump in for my colleague: the heatmaps are computed using the 2D histogram of raw gaze data across all the recordings from that project. You can see the code implementation here. This means you only need to pool the raw gaze data and aggregate it in a similar way for your PCA.
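
Just to illustrate the idea, here is a minimal sketch (not the actual Cloud implementation) of pooling raw gaze from several recordings into per-recording 2D histograms; the folder layout, column names, and scene-camera extent are assumptions you would adapt to your own export:

import glob
import numpy as np
import pandas as pd

bins = (32, 32)                      # histogram resolution, adjust as needed
extent = [[0, 1600], [0, 1200]]      # assumed scene-camera pixel range

rows = []
for csv_path in glob.glob("recordings/*/gaze.csv"):   # hypothetical folder layout
    gaze = pd.read_csv(csv_path)
    hist, _, _ = np.histogram2d(
        gaze["gaze x [px]"],         # assumed column names from the Cloud export
        gaze["gaze y [px]"],
        bins=bins,
        range=extent,
        density=True,                # normalize so recordings are comparable
    )
    rows.append(hist.ravel())        # one flattened feature vector per recording

features = np.vstack(rows)           # shape: (n_recordings, bins_x * bins_y)

Each row of features is then one observation that you could feed into, e.g., scikit-learn's PCA.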

user-bef103 19 August, 2024, 07:25:12

Hello, I want to know if there are any differences between the data sent to the cloud service and the data that is streamed into iMotions when using the IP address of the phone to stream the eye tracking data.

If you need more information, let me know.

user-f43a29 19 August, 2024, 08:58:52

Hi @user-bef103 , no need to ping. We see these messages as they come in.

If you have the gaze data rate set to 200Hz and you have real-time eyestate computation enabled in the Companion App settings, then in principle, for the fundamental data shared between the Companion App and Pupil Cloud, it is essentially the same.

However, if you use a poor network, such as a busy university or workplace WiFi that is shared by many people, then you could have dropped packets in the data collected via the real-time API, but not in the Pupil Cloud data nor in the recording saved on the Companion device. Therefore, when streaming data, we recommend using a dedicated local WiFi router that is not connected to the internet.

Note that Pupil Cloud does re-run the gaze/eyestate pipelines on the eye images, so that you always have those data sampled at 200Hz for Pupil Cloud downloads/analyses. This is for the cases where the user changed the data rate in App settings or for the off chance that there was a dropped data point in the recording.

Otherwise, you do get some extra post-hoc data extracted by Pupil Cloud, such as blinks, fixations, saccades, and Euler orientation angles for the IMU readings. We do provide a way to compute blinks in real-time.
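
For reference, receiving the streamed data with the Python real-time API's simple interface looks roughly like this (a minimal sketch; the attribute names reflect my understanding of the library and may differ between versions):

from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device()               # find a Neon Companion device on the local network
try:
    for _ in range(10):
        gaze = device.receive_gaze_datum()   # blocks until the next streamed gaze sample arrives
        print(gaze.x, gaze.y, gaze.timestamp_unix_seconds)
finally:
    device.close()

Any packet dropped by the network would simply never show up in this stream, while it would still be present in the recording on the phone and in Pupil Cloud.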

user-bef103 19 August, 2024, 08:38:12

@user-07e923 I hope it's fine that I ping. Who is able to answer this question?

user-bef103 19 August, 2024, 09:05:13

I see, thanks for the information

user-f43a29 19 August, 2024, 09:30:46

Hi @user-bef103 , just one clarification, as I made a small edit in the post above:

If you are using a poor/busy network and a packet is dropped, then this will only be missing in the data that is streamed via the real-time API. This data packet will not be missing in the Pupil Cloud data nor will it be missing in the recording saved on the Companion device.

user-bef103 19 August, 2024, 09:36:07

Okay, I see. We have a client experiencing issues with connecting the Companion App to iMotions to stream the live feed.

user-f43a29 19 August, 2024, 09:36:46

Do they have an issue establishing the connection, or with streaming the data after the connection has started?

user-bef103 19 August, 2024, 09:36:44

They do not have issues with the app natively or with the cloud.

user-bef103 19 August, 2024, 09:37:25

They have issues establishing the connection when they enter the IP address of the phone into iMotions.

user-f43a29 19 August, 2024, 09:38:37

Ok, could you send us an email at [email removed]? We can continue communication about this there.

user-bef103 19 August, 2024, 09:37:45

the address stated in the Companion App

user-bef103 19 August, 2024, 09:40:46

Sure

user-bdc05d 22 August, 2024, 12:55:29

I was wondering: I seem to have blinks detected that are in reality saccades, and actual blinks that are not detected (observed on a plot of the x/y coordinates). Also, even when a blink is detected, there are still x/y coordinates reported (even though the eye is closed?), which seems weird, no? I'm a little confused by these observations; is this normal? (For precision: I plotted x and y as two curves and overlaid fixations from fixations.csv and blinks from blinks.csv, all from the Cloud download section.)
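
A rough sketch of the kind of overlay plot described above (the CSV column names are assumptions about the Cloud export format and may need adjusting):

import pandas as pd
import matplotlib.pyplot as plt

gaze = pd.read_csv("gaze.csv")               # raw gaze from the Cloud download
blinks = pd.read_csv("blinks.csv")           # detected blink intervals

fig, ax = plt.subplots()
t = gaze["timestamp [ns]"] * 1e-9            # assumed nanosecond timestamps
ax.plot(t, gaze["gaze x [px]"], label="gaze x")
ax.plot(t, gaze["gaze y [px]"], label="gaze y")

# shade each detected blink interval over the gaze traces
for _, blink in blinks.iterrows():
    ax.axvspan(blink["start timestamp [ns]"] * 1e-9,
               blink["end timestamp [ns]"] * 1e-9,
               alpha=0.3)

ax.set_xlabel("time [s]")
ax.set_ylabel("gaze position [px]")
ax.legend()
plt.show()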

user-4b18ca 23 August, 2024, 11:50:42

Hello Pupil Labs team, in our recording setup, we are using MATLAB with pl-neon-matlab to start/stop recording and send_event to set events in the connected glasses. In a minimal working example, everything works fine. As soon as we have a recording with many events (multiple per second, sometimes only ~10 ms apart), the events are not visible in the Cloud. In the MATLAB log, no error is indicated when the events are sent to Neon.

Are there any limitations that we should be aware of? Anything that can lead to the events being dropped/purged?
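
For context, this is roughly what rapid event sending looks like at the level of the Python real-time API that the MATLAB wrapper builds on; it is not the MATLAB code in question, and the send_event signature reflects my understanding of the simple API:

import time
from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device()
try:
    for i in range(100):
        device.send_event(
            f"stimulus_{i}",
            event_timestamp_unix_ns=time.time_ns(),  # timestamp taken on the sending machine
        )
        time.sleep(0.01)                             # events ~10 ms apart, as described above
finally:
    device.close()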

user-f43a29 23 August, 2024, 11:57:13

Hi @user-4b18ca , could you open a 🛟 troubleshooting ticket about this?

user-4b18ca 23 August, 2024, 12:21:01

Sure

user-2b4808 24 August, 2024, 16:02:39

Hello 👁 core, I am trying to use the IPC backbone to get real-time data, but for some reason I don't see anything. I did see data just once in about a dozen tries. Here is the code I am using (taken directly from the dev guide):

import zmq
import time
import msgpack

ctx = zmq.Context()
# The REQ talks to Pupil remote and receives the session unique IPC SUB PORT
pupil_remote = ctx.socket(zmq.REQ)

ip = 'localhost'  # If you talk to a different machine use its IP.
port = 50020  # The port defaults to 50020. Set in Pupil Capture GUI.

pupil_remote.connect(f'tcp://{ip}:{port}')

# Request 'SUB_PORT' for reading data
pupil_remote.send_string('SUB_PORT')
sub_port = pupil_remote.recv_string()

# Request 'PUB_PORT' for writing data
pupil_remote.send_string('PUB_PORT')
pub_port = pupil_remote.recv_string()

subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f'tcp://{ip}:{sub_port}')
subscriber.subscribe('gaze')  # receive all gaze messages
subscriber.subscribe('pupil.0.2d.')

while True:
    topic, payload = subscriber.recv_multipart()
    # print(topic + "lol")
    message = msgpack.loads(payload)
    print(f"{topic}: {message}")

The code runs fine, but there is no output. (I am using a notebook to run this, and it worked once, but I haven't been able to get it up and running again.) I am not sure what I am doing wrong here; any help would be appreciated!

EDIT: Nevermind, I realized that I have not been calibrating before I start using it, sorry for the unnecessary post, and thanks!
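
Related note: calibration can also be started remotely over the same Pupil Remote socket using the 'C' command (as documented in the Pupil Core developer guide), which can be handy in a notebook workflow. A minimal sketch:

import zmq

ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect('tcp://localhost:50020')

pupil_remote.send_string('C')        # start calibration in Pupil Capture
print(pupil_remote.recv_string())    # Pupil Remote acknowledges the command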

user-c541c9 28 August, 2024, 10:38:19

Hi, I have located some dependency issues. It seems I can't have a Python environment in which both pupil-labs-realtime-api and pl-neon-recording are ~~installed~~ working. The protobuf version needed by the real-time API is pinned to v3.18.0 (which is a yanked version as per PyPI), while the neon recording library pulls the latest protobuf -- with the result that in a Python environment containing both libraries, protobuf 3.18.0 ends up installed. Normally I wouldn't bother about dependencies, but pl-neon-recording does not seem to play well with the older protobuf version. Could you kindly address this, so that I can use both libraries in a common environment?

I provide a listing to substantiate what I have described above.

listing-dependency-issues.txt

user-cdcab0 28 August, 2024, 13:42:14

Hi, @user-c541c9 - thanks for the report! We have actually already resolved this ~~- just need to package a new release of pupil-labs-realtime-api and push it up to PyPI. That'll probably happen in the next day or so, but if you don't want to wait, you can install from GitHub.~~

Edit: The fix has now been pushed to PyPI

user-224f5a 28 August, 2024, 16:35:51

Good morning, everyone! I need some help. I've been developing with Pupil Labs and HTC Vive on Unity for months, but I've never been able to fix the background during calibration. When the calibration is activated before an experiment, the background doesn't darken and turn gray as it does in the demo; instead, it stays transparent, showing what's in the experiment room. Does anyone know if I need to manually add the gray background, or have I configured something incorrectly?

Chat image

user-f43a29 29 August, 2024, 16:43:13

Hi @user-224f5a , it sounds like you might have activated VR passthrough. Is that the case, and do you need it for your experiment?

user-c541c9 30 August, 2024, 10:48:40

Hi, I am trying to use pl-neon-recording (version 0.1.6) to read back events that I issued via the real-time API while recording, after having fetched the exported recording. I clearly have events in the file:

wc tmp/Neon\ Export/2024-08-28-16-01-31/event.txt
      10      10     174 tmp/Neon Export/2024-08-28-16-01-31/event.txt

On the shell:

>>> import pupil_labs.neon_recording as nr
>>> recording = nr.load('tmp/Neon Export/2024-08-28-16-01-31')
>>> recording.streams
{'audio': None, 'events': None, 'eye': None, 'eye_state': None, 'gaze': None, 'imu': None, 'scene': None}
>>> recording.gaze.ts[:10]
array([1.72485729e+09, 1.72485729e+09, 1.72485729e+09, 1.72485729e+09,
       1.72485729e+09, 1.72485729e+09, 1.72485729e+09, 1.72485729e+09,
       1.72485729e+09, 1.72485729e+09])
>>> recording.streams
{'audio': None, 'events': None, 'eye': None, 'eye_state': None, 'gaze': <pupil_labs.neon_recording.stream.gaze_stream.GazeStream object at 0x111e53fd0>, 'imu': None, 'scene': None}

>>> recording.events.ts[:4]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/...neon_experiments/.venv/lib/python3.11/site-packages/pupil_labs/neon_recording/neon_recording.py", line 153, in events
    if self.streams["event"] is None:
       ~~~~~~~~~~~~^^^^^^^^^
KeyError: 'event'

Other data streams seem to lazy-load fine, but when I try to read events, I run into issues. I suppose the library has some internal confusion between the keys "event" and "events"... Could you please have a look and advise? Thanks.

user-cdcab0 30 August, 2024, 11:17:59

Oof, a typo - one sec... edit: Fixed in v0.1.7
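
In the meantime, or if upgrading is not immediately possible, a rough workaround is to read the raw event files directly. This sketch assumes that event.txt holds one event name per line and that a companion event.time file holds matching little-endian uint64 nanosecond timestamps, which is my understanding of the native recording layout and should be verified before relying on it:

from pathlib import Path
import numpy as np

rec_dir = Path("tmp/Neon Export/2024-08-28-16-01-31")

names = (rec_dir / "event.txt").read_text().splitlines()       # one event name per line (assumed)
times_ns = np.fromfile(rec_dir / "event.time", dtype="<u8")    # assumed uint64 ns timestamps

for name, t in zip(names, times_ns):
    print(f"{t * 1e-9:.3f} s: {name}")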

End of August archive