πŸ’» software-dev



user-f0b6e1 07 November, 2025, 13:02:08

I downloaded native data and I noticed that, compared to other data, the manifest.json and manifest.json.crc files are missing. Does this affect my data in any way?

user-d407c1 07 November, 2025, 13:06:25

Hi @user-f0b6e1 ! Those are internal files used only to guarantee data integrity when uploading to Cloud. You can completely ignore them.

user-f0b6e1 07 November, 2025, 13:02:15

(for a Neon Recording)

user-85f075 14 November, 2025, 13:21:55

Hi everyone. I am currently using Neon Player and have a few questions regarding its usage.

Surface definition: I want to define a screen as a surface. While the player does recognize the AprilTags, it is very laggy when I try to move the position of the markers, or it does not work at all. Do I have to change something with regard to height and width? I have already downloaded the newest version, so it cannot be that.

Events: I have communicated certain timestamps to the eye tracker. However, I can only see a timestamp when I am in the exact time frame. Is there another way to jump between events other than the annotation timeline?

user-f43a29 14 November, 2025, 17:36:30

Hi @user-85f075 , thanks, we received your email.

That is not exactly lag. When you edit a surface and move a corner, the system needs to re-process the Surface Definition and save it to disk. You might notice a brief message displayed in the viewport that says "Surfaces changed. Saving to file." This process just takes a moment whenever you move a corner.

user-f43a29 14 November, 2025, 13:49:38

Hi @user-85f075 , could you share a video demonstrating the lagginess that you experience? You can share it via DM or with [email removed].

With respect to Events, when you activate the Annotation Player plugin, you will see a high-level overview of the Events in the Annotations timeline. Do you mean that you only see the Event name when you are in the exact frame?

user-85f075 14 November, 2025, 15:43:44

Thank you for your quick reply. Yes, I do not have an overview of the events (or rather, I can't find one) πŸ™‚

user-f43a29 14 November, 2025, 15:47:43

You may need to pull up the timeline. You can drag the divider at the bottom of the viewport.

user-f43a29 14 November, 2025, 15:48:08

There is a little tab that allows this.

Chat image

user-3ff3c1 14 November, 2025, 16:54:09

Is it possible to programmatically pull in the eye videos using the v2 API?

user-f43a29 14 November, 2025, 17:59:07

Hi @user-3ff3c1 , the Pupil Cloud API does not currently provide that level of granularity. It is necessary to pull the whole recording file over the API and then extract the eye video from the resulting ZIP file.

If you'd like to see this added, please make a post in πŸ’‘ features-requests.
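
For reference, the extract-from-ZIP step could look like the minimal sketch below. This is only an illustration: the download URL, the token handling, and the eye.mp4 filename inside the archive are assumptions here, so adapt them to the endpoint and archive layout you actually see.

import io
import zipfile

import requests

def extract_eye_videos(zip_url, api_token, out_dir="."):
    # Download the full recording archive into memory
    # (fine for typical recordings; stream to a temp file for very large ones)
    response = requests.get(zip_url, headers={"api-key": api_token})
    response.raise_for_status()

    # Pull out only the eye video(s) from the archive
    with zipfile.ZipFile(io.BytesIO(response.content)) as archive:
        for name in archive.namelist():
            if name.endswith("eye.mp4"):
                archive.extract(name, path=out_dir)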

user-3ff3c1 15 November, 2025, 16:33:55

Thanks for the response. I'm currently pulling the whole recording ZIP, but the only video in it is the scene recording; I don't see videos of the eyes themselves.

user-d34f33 14 November, 2025, 18:41:35

Is there a way to use an API to automatically export recordings from pupil_player? We have the 000 folder and can open pupil_player to export, but we have hundreds of files and want to automate that process instead of opening the GUI. I can't find any documentation on the website. Thanks for any help!

user-4c21e5 15 November, 2025, 03:56:07

Hi @user-d34f33 πŸ‘‹. Please see this message for reference on batch exporting data: https://discord.com/channels/285728493612957698/285728493612957698/1254146039515189278 πŸ™‚
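
As a rough illustration of the automation pattern only (this is not Pupil Player's actual CLI; EXPORT_COMMAND below is a placeholder for whatever invocation the linked message recommends), the loop itself just iterates over the recording folders:

import subprocess
from pathlib import Path

RECORDINGS_ROOT = Path("/path/to/recordings")  # assumed: parent of the 000, 001, ... folders
EXPORT_COMMAND = ["your-export-command"]       # placeholder: substitute the real invocation

for rec_dir in sorted(p for p in RECORDINGS_ROOT.iterdir() if p.is_dir()):
    # Run the exporter once per recording folder instead of opening the GUI
    subprocess.run(EXPORT_COMMAND + [str(rec_dir)], check=True)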

user-f43a29 17 November, 2025, 11:20:27

Hi @user-3ff3c1 , it sounds like you are using the Timeseries CSV endpoint. You want to use the Native Recording Data endpoint instead.

Chat image

user-3ff3c1 17 November, 2025, 17:44:28

That is the endpoint we're using. We only receive one MP4 file for each recording: the video that the glasses are recording. We're looking for the video of the eyes themselves, which appears as a smaller video when watching the recordings on https://cloud.pupil-labs.com/

user-f43a29 19 November, 2025, 10:53:41

Hi @user-3ff3c1 , apologies for the delay.

If you just want the eye videos, then you can use this endpoint:

https://api.cloud.pupil-labs.com/v2/workspaces/{WORKSPACE_ID}/recordings/{recording_id}/files/{filename}

where filename can be replaced with eye.mp4.

You could wrap it up in a Python function like this:

import requests

API_TOKEN = "YOUR_API_TOKEN"        # your Pupil Cloud API token
WORKSPACE_ID = "YOUR_WORKSPACE_ID"  # the workspace that contains the recording

def get_individual_file(recording_id, filename):
    # Per-file endpoint for a single recording
    url = (
        f"https://api.cloud.pupil-labs.com/v2/workspaces/{WORKSPACE_ID}"
        f"/recordings/{recording_id}/files/{filename}"
    )
    response = requests.get(url, stream=True, headers={"api-key": API_TOKEN})

    # Stream the download to disk in chunks so the whole file
    # is never held in memory at once
    with open(filename, "wb") as fd:
        for chunk in response.iter_content(chunk_size=8192):
            fd.write(chunk)

    return response.status_code

And then call it as follows:

def get_eye_video(recording_id):
    return get_individual_file(recording_id, "eye.mp4")

user-3ff3c1 19 November, 2025, 15:53:25

Thank you very much!! Tried that out and it works

user-937ec6 24 November, 2025, 16:13:05

Hello, I am using the latest Pupil Labs Realtime API for Python. I am enjoying the recently added audio support. Upon integrating audio support into my code base, I couldn't find a reliable method of detecting when the microphone is muted in the Companion mobile application, so I implemented a timeout on the audio queue to detect this case.

β€’ Am I correct that there isn't a way to reliably detect that the microphone is muted in the Companion mobile app?
β€’ Looking at models.py, it appears as though if the world sensor is connected, the audio sensor's connected status is unconditionally set to True.
β€’ If my assumptions are true, can a check be added to the API that reports the microphone status, or can the connected status of the audio sensor be updated accordingly?
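
A minimal sketch of the timeout approach described above, using only the standard library and assuming your code already pushes incoming audio frames onto a queue.Queue (the queue and the timeout value are assumptions, not part of the Realtime API):

import queue

audio_queue = queue.Queue()  # assumed: filled by your audio receive loop
MUTE_TIMEOUT_S = 2.0         # assumed: no frames for this long => treat as muted

def wait_for_audio_frame():
    try:
        return audio_queue.get(timeout=MUTE_TIMEOUT_S)
    except queue.Empty:
        # No audio arrived within the timeout; the microphone is likely
        # muted in the Companion app (or the stream has stalled)
        return None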

user-f43a29 25 November, 2025, 08:51:38

Hi @user-937ec6 , there is no way to know if the microphone is muted via the standard API routines. Rather, if you try to request audio when the microphone is disabled, then it should return an error. If you would like a change to the API, feel free to post in πŸ’‘ features-requests.

user-d407c1 25 November, 2025, 08:57:01

To add a bit to my colleague's response: the audio is multiplexed into the video RTSP stream, but on our client we expose it as a separate stream, since streaming only audio is a valid scenario.

The microphone status is not unconditionally set to True. Since audio does not have its own RTSP stream and the status message does not report the microphone's state, we check the SDP of the RTSP stream to see whether it contains the audio signal.
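
That SDP check is conceptually simple. As a hedged sketch (not the client's actual code), given the SDP text from the RTSP DESCRIBE response, audio presence shows up as an m=audio media section:

def sdp_has_audio(sdp_text: str) -> bool:
    # An SDP body lists each media stream on an "m=" line,
    # e.g. "m=video ..." and, if audio is present, "m=audio ..."
    return any(line.startswith("m=audio") for line in sdp_text.splitlines())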

End of November archive