💻 software-dev


nmt 04 December, 2023, 09:05:55

Responding to your message (https://discord.com/channels/285728493612957698/1113359388997058570/1181144919096705046) here. @user-b96a7d. Would you be able to further elaborate on your use of the network api - are you rendering the video frames + gaze overlay or just streaming the raw gaze coordinates?

user-b96a7d 04 December, 2023, 09:28:05

Responding to your message (https://

user-7413e1 05 December, 2023, 16:46:54

Hi - I was trying to run the gaze_contingent_demo in PsychoPy and this is the error I receive:

Traceback (most recent call last):
  File "C:\Program Files\PsychoPy\lib\site-packages\psychopy\app\utils.py", line 598, in save
    with open(self.file, "w") as f:
PermissionError: [Errno 13] Permission denied: 'C:\Program Files\PsychoPy\psychopy-gaze-contingent-demo-main\README.rst'

user-cdcab0 05 December, 2023, 17:05:50

Hi, @user-7413e1 - two notes. 1. It looks like you downloaded the gaze contingent demo into your PsychoPy install folder. Instead, you'll want to download it to one of your user folders, such as somewhere within "My Documents", "Downloads", or "Desktop". 2. Once you've done that, be sure to open the gaze_contingent_demo.psyexp file in PsychoPy

user-78b456 12 December, 2023, 09:56:26

Hi, looking for advice: when subscribing to the surface tracker, my Python code seems to freeze when no surface (or no gaze on a surface) is detected in that moment. However, subscribing to "gaze." pushes a message with the gaze information every frame with no freezing. Even using a try/except when subscribed to 'surface' doesn't prevent the program from hanging.

I assume this is because of how sub.recv_string() and recv_multipart() work, but is there a way to even just get a "surface: none detected" so that the program doesn't hang?

I tried using threading instead to wait for the surface to be detected, but it seemed to run even slower in the thread. Additionally, once the surface was detected, there seemed to be a backlog of messages waiting to get printed, which were definitely not caught up, as it showed I was looking between surfaces 1, 2, 3, and 4 even nearly 20 seconds after looking away from any surface.

user-442653 12 December, 2023, 15:16:38

Is there by any chance an open-source GitHub repository that I can view for the Neon Companion Monitor from http://neon.local:8080/

user-cdcab0 12 December, 2023, 18:48:31

Not at this time. Are you interested in the client-side (receiving and rendering data streams) code? This can be accomplished using the real-time API (https://docs.pupil-labs.com/neon/real-time-api/tutorials/) for which we provide Python samples

user-cdcab0 12 December, 2023, 18:45:11

Can you share your code here?

user-78b456 13 December, 2023, 08:43:34

I'm using the method from here https://github.com/pupil-labs/pupil-helpers/blob/master/python/filter_gaze_on_surface.py

But instead of: if surfaces["name"] == "surface 1"

I'm doing: if "Surface" in surfaces["name"]

To handle 10 surfaces
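[Editor's note: the substring check described above could be factored into a small helper; the message shape here is assumed from filter_gaze_on_surface.py, where each surface payload carries a "name" field.]

```python
def is_tracked_surface(surface_msg, prefix="Surface"):
    """Return True if the surface message's name contains `prefix`.

    Matches e.g. "Surface 1" through "Surface 10" without listing
    each surface name explicitly. `surface_msg` is assumed to be the
    decoded payload dict with a "name" key.
    """
    return prefix in surface_msg.get("name", "")
```

Keeping the check in one function makes it easy to change later if the surface naming scheme changes.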

user-442653 12 December, 2023, 18:56:04

I'm looking into building a similar monitor app. I have looked at the provided API links and managed to create a working video feed, but I was unsure how to approach connecting it to a UI so that I could control some of the features, such as starting/stopping a recording with a button, manually adding events, etc.

user-cdcab0 12 December, 2023, 18:59:59

Are there any GUI toolkits that you're familiar with or have a preference for in Python?

One easy method for rendering the image to the display would be to simply use OpenCV's imshow function

user-442653 12 December, 2023, 19:06:12

I'm currently using imshow to render the image - is it possible to build the UI off of the imshow function?

user-cdcab0 12 December, 2023, 19:38:37

You can, but it's not the friendliest experience since it's not really built for that use. For a very simple UI it's fine, but for anything non-trivial, I'd definitely recommend using some type of UI toolkit (like tkinter, PySide, wxPython, etc.)
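[Editor's note: as a rough illustration of the toolkit route, a decoded RGB frame (a numpy array) can be handed to tkinter without any extra imaging library by encoding it as binary PPM, which `tkinter.PhotoImage` accepts via its `data` argument. This is a sketch, not Pupil Labs sample code.]

```python
import numpy as np

def frame_to_ppm(frame_rgb):
    """Encode an HxWx3 uint8 RGB frame as binary PPM (P6) bytes.

    tkinter.PhotoImage can display this directly, avoiding a
    PIL/Pillow dependency.
    """
    h, w = frame_rgb.shape[:2]
    return b"P6 %d %d 255 " % (w, h) + frame_rgb.tobytes()

# Wiring it into a tkinter window (sketch, not run here):
#   import tkinter as tk
#   root = tk.Tk()
#   label = tk.Label(root)
#   label.pack()
#   photo = tk.PhotoImage(data=frame_to_ppm(frame))
#   label.configure(image=photo)
#   tk.Button(root, text="Start recording", command=start_cb).pack()
#   root.mainloop()
```

Buttons for recording control or event annotation would then call the real-time API from their callbacks.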

user-cdcab0 13 December, 2023, 09:26:14

Ah, yes - recv_string blocks by default. Try this instead:

try:
    topic = sub.recv_string(zmq.NOBLOCK)
    msg = sub.recv()
    ... snip ...
except zmq.ZMQError:
    # No surfaces - should probably sleep here
    pass

See: https://pyzmq.readthedocs.io/en/latest/api/zmq.html#zmq.Socket.recv
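[Editor's note: filling in the snipped parts, a complete non-blocking receive could look roughly like this. The payload decoding follows the msgpack convention used by pupil-helpers; the connection details in the comments are assumptions (the real SUB port should be queried from Pupil Remote).]

```python
import msgpack
import zmq

def poll_message(sub):
    """Try to receive one (topic, payload) pair without blocking.

    Returns None if nothing is queued, instead of hanging the way
    the default blocking recv_string() does.
    """
    try:
        topic = sub.recv_string(zmq.NOBLOCK)
        payload = msgpack.loads(sub.recv(), raw=False)
        return topic, payload
    except zmq.Again:  # NOBLOCK raises zmq.Again when the queue is empty
        return None

# Wiring it up (sketch; SUB port normally obtained via Pupil Remote):
#   ctx = zmq.Context.instance()
#   sub = ctx.socket(zmq.SUB)
#   sub.connect("tcp://127.0.0.1:50021")  # assumed SUB port
#   sub.subscribe("surfaces.")
#   while True:
#       msg = poll_message(sub)
#       if msg is None:
#           time.sleep(0.01)  # nothing queued - avoid busy-waiting
#       else:
#           handle(msg)
```

Note that zmq.Again is a subclass of zmq.ZMQError, so catching zmq.ZMQError as in the snippet above works too; catching zmq.Again is just more specific.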

user-78b456 14 December, 2023, 07:04:46

worked like a charm! thanks as always 🙂

user-9973bb 15 December, 2023, 18:21:48

Hello! Does anyone know if there's a way to get the old Pupil Mobile APK? We need to use a phone for running tracking in our study, but are using a version of the pupil labs for HoloLens system, not Neon, and the app is no longer on the app store.

nmt 18 December, 2023, 10:53:23

Hi @user-9973bb. Pupil Mobile is no longer available, unfortunately. What's preventing you from tethering to a laptop/desktop and running Pupil Capture?

user-06b8c5 18 December, 2023, 11:33:05

I'm not sure if this is the right channel, but I may have hit the right one. We are trying to use an image mapper via an implementation in Python. Unfortunately, there is no good documentation anywhere on the Pupil Labs site on how to do this. There is this page: https://docs.pupil-labs.com/alpha-lab/gaze-metrics-in-aois/#define-aois-and-calculate-gaze-metrics but it only shows the implementation for determining the AOIs.
What we need for the project is an implementation from scratch, i.e. how to use code to combine a reference image with a movie in order to map fixations from the movie onto a static image. If that works, then, if I understand correctly, we could use the link above to determine the AOIs on the reference image.

I would greatly appreciate your help.

user-cdcab0 18 December, 2023, 17:51:34

When you say a "movie", do you mean some type of dynamic, screen-based content? If so, have you seen this article? https://docs.pupil-labs.com/alpha-lab/map-your-gaze-to-a-2d-screen/
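[Editor's note: conceptually, once a per-frame homography between the scene-camera frame and the reference image has been estimated (e.g. with OpenCV feature matching), mapping a fixation point onto the static image is a projective transform. A hypothetical sketch, assuming the 3x3 matrix H is already known:]

```python
import numpy as np

def map_point(H, point):
    """Map an (x, y) point through a 3x3 homography matrix H.

    H is assumed to map scene-frame pixel coordinates to
    reference-image pixel coordinates.
    """
    x, y = point
    v = H @ np.array([x, y, 1.0])
    return (v[0] / v[2], v[1] / v[2])
```

Estimating H per frame is the hard part; cv2.findHomography on features matched between the frame and the reference image is the usual approach.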

user-46e202 18 December, 2023, 12:19:11

Hi everyone, I wanted to create a simple player that streams pupil invisible or neon cameras via rtsp. I wanted a way to test the RTSP streaming. Can I use usual video players (like VLC player) for testing?

user-cdcab0 18 December, 2023, 17:47:47

Yes! You can view the cameras as a network stream using RTSP on port 8086. For example, rtsp://192.168.0.116:8086/?camera=world. You can also change world to eyes.

On my computer/network, VLC won't open the stream using neon.local - only the IP address works. It might work on yours, but if it doesn't, try using your companion device's IP address instead.

Also note that VLC's default buffering is a bit aggressive, which leads to a delay. If that's a problem for you, you might try using ffmpeg directly instead. I've had good luck with this combination of flags: ffplay -fflags nobuffer -flags low_delay -framedrop "rtsp://192.168.0.103:8086?camera=world"

user-9973bb 18 December, 2023, 16:35:58

The study requires mobility that even a PC backpack may restrict. I know it is no longer available, but is there any way you can provide the old APK for us to sideload? Even if it is no longer supported, we just need to be able to run this off of a pocketable device. I have found links online to the archived APK but the websites are extremely suspicious and I would really prefer not to have to take that route.

nmt 19 December, 2023, 11:51:16

Unfortunately, the APK is no longer available. But you could in theory use a smaller device, like a small-form factor tablet style PC, to make Core more portable.

user-e637bd 19 December, 2023, 13:59:31

Now that there is a local Neon environment, does that get recorded on a phone?

user-d407c1 19 December, 2023, 14:06:16

Hi @user-e637bd! Could you please elaborate? Neon has always recorded locally on the phone; you can then choose whether or not to upload the data to Pupil Cloud. Have a look at the first steps here: https://docs.pupil-labs.com/neon/data-collection/first-recording/

user-e637bd 19 December, 2023, 14:07:20

I was just asking based on the above comment: "Unfortunately, the APK is no longer available. But you could in theory use a smaller device, like a small-form factor tablet style PC, to make Core more portable."

I just wanted to make sure that the same process that exists for Core also exists with Neon, just with a different app?

user-d407c1 19 December, 2023, 15:04:20

Pupil Core and Neon are different eye trackers and they utilise different technologies.

  • Pupil Core was designed to operate in a lab and does need to be tethered to a PC/laptop or an SBC (single-board computer).
  • Neon, on the other hand, uses deep learning to estimate gaze, can be used almost anywhere, and it runs on a phone.

user-e637bd 19 December, 2023, 18:15:55

[Pupil Core](https://pupil-labs.com/

user-9973bb 19 December, 2023, 19:13:16

Gotcha! My concern is that the site officially lists an Intel i7 as the minimum requirement, but portable PCs will run on something like an Atom processor. Have you ever seen Pupil Core run successfully on weaker hardware?

nmt 20 December, 2023, 08:25:49

When running on lower-powered hardware, one way to speed things up is to disable real-time pupil detection and gaze estimation. These can then be run in a post-hoc context. Just be sure to record the entire calibration choreography – highly recommend running pilot trials with this approach prior to study data collection.

user-f03094 25 December, 2023, 13:35:05

I'm currently conducting an experiment at a commercial facility using the pupil invisible. After capturing data and adding Events on pupil cloud for both the entry and exit times, is there a way to download the Timeseries Data for the duration between these events?

nmt 27 December, 2023, 10:04:50

Hi @user-f03094! That's currently not possible. However, there's an events file (containing all of the entry/exit times you manually added) included in the export. So it's possible to match those events with the timestamps in your other exported timeseries data, thus enabling analysis of sections of data.
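[Editor's note: a minimal sketch of that matching with pandas. The file names and the `timestamp [ns]` column name follow the Cloud timeseries export convention, but treat them as assumptions and check against your own export.]

```python
import pandas as pd

def slice_between_events(gaze, events, start_name, end_name,
                         ts="timestamp [ns]"):
    """Return the gaze rows recorded between two named events.

    `gaze` and `events` are DataFrames loaded from the exported
    gaze.csv and events.csv; `ts` is the shared timestamp column.
    """
    start = events.loc[events["name"] == start_name, ts].iloc[0]
    end = events.loc[events["name"] == end_name, ts].iloc[0]
    return gaze[(gaze[ts] >= start) & (gaze[ts] < end)]

# Usage (assumed file names from the Cloud export):
#   gaze = pd.read_csv("gaze.csv")
#   events = pd.read_csv("events.csv")
#   segment = slice_between_events(gaze, events, "entry", "exit")
```

The same slicing works for any of the exported timeseries files (fixations, blinks, IMU), since they share the nanosecond timestamp convention.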

user-06bb78 27 December, 2023, 16:20:33

Hi, I'm using Pupil Invisible and I'm trying to define AOIs in the reference image. I've copied the code from this page https://docs.pupil-labs.com/alpha-lab/gaze-metrics-in-aois/#dependencies-of-this-guide, but when I try to run the program, this error appears (the code in the image is the part that raises the error).

Chat image

nmt 28 December, 2023, 08:57:02

Alpha Lab - Define AOIs and Calculate Ga...

End of December archive