💻 software-dev


user-3be70d 02 December, 2024, 02:59:21

Hello everyone, I encountered this problem when using Pupil Labs to develop plug-ins:

ERROR eye0 - launchables.eye: Process Eye0 crashed with trace: eye.py:42
Traceback (most recent call last):
  File "Pupil-lab/pupil/pupil_src/launchables/eye.py", line 707, in eye
    plugin.recent_events(event)
  File "Pupil-lab/pupil/capture_settings/plugins/pupil_gaze_estimation_plugin.py", line 375, in recent_events
    frame = cv2.imdecode(np.frombuffer(frame.jpeg_buffer, np.uint8), cv2.IMREAD_COLOR)
AttributeError: 'NoneType' object has no attribute 'jpeg_buffer'

How can I solve this problem?
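
A common cause of this crash is that recent_events fires before the camera has delivered any image, so the frame in the event dict is still None. A minimal defensive sketch of the decode step (safe_decode is a hypothetical helper name, not part of the Pupil API):

```python
def safe_decode(frame):
    """Return a decoded BGR image, or None while no frame data exists yet.
    (safe_decode is a hypothetical helper, not part of the Pupil API.)"""
    if frame is None or getattr(frame, "jpeg_buffer", None) is None:
        return None
    # Imported lazily so the guard itself has no dependencies
    import cv2
    import numpy as np
    return cv2.imdecode(np.frombuffer(frame.jpeg_buffer, np.uint8), cv2.IMREAD_COLOR)
```

With this guard, the plugin simply skips decoding on ticks where no frame is available instead of crashing the eye process.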

nmt 03 December, 2024, 02:30:26

Hi @user-fb8431. I've moved your message here since it's about software dev. Are you running Pupil Core from source or from a pre-compiled bundle?

user-fb8431 03 December, 2024, 07:02:15

Thank you very much for your reply. I have solved that problem, but now I have a new one: in the plugin, I want to get gaze_distance as well as the camera's intrinsic and extrinsic parameter matrices.

user-fb8431 03 December, 2024, 07:37:38

And this error occurred, which I can't solve at the moment:

  File "/home/fanruilong/workspace/Gaze/wearable_eye_tracking/Pupil-lab/pupil/capture_settings/plugins/pupil_gaze_estimation_plugin.py", line 396, in recent_events
    logger.info(f"Detection result: {gt} gaze distence: {self.binous_model.gaze_distance}")
  File "/home/fanruilong/workspace/Gaze/wearable_eye_tracking/Pupil-lab/pupil/pupil_src/shared_modules/gaze_mapping/gazer_3d/gazer_headset.py", line 102, in gaze_distance
    return self._gaze_distance
AttributeError: 'Model3D_Monocular' object has no attribute '_gaze_distance'
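
Per the traceback, the gaze_distance property raises AttributeError whenever its private _gaze_distance attribute was never populated (which seems to be the case for the monocular model here). One way to read it without crashing is getattr with a default, which also swallows an AttributeError raised inside a property getter. A hedged sketch (safe_gaze_distance is a hypothetical helper, not part of Pupil's API):

```python
def safe_gaze_distance(gazer_model, default=None):
    """Read gaze_distance without crashing when the underlying
    _gaze_distance attribute was never populated.
    getattr() with a default also catches an AttributeError
    raised inside the property getter itself."""
    return getattr(gazer_model, "gaze_distance", default)
```

The plugin can then log the default (e.g. None) until the model actually produces a distance estimate.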

nmt 03 December, 2024, 12:18:23

Before we dive further into this, can I ask what's your end goal with this custom plugin?

user-91a92d 03 December, 2024, 11:23:51

Hello, I am trying to run the plimu software, but nothing is showing up. I can see the video and gaze in the browser, but plimu shows nothing. If I Ctrl+C quickly after the plimu command, I get the following output. I am surprised to see port 8086. LSL is activated, and I tried with and without the preview. Thank you for the help.

Chat image

user-d407c1 03 December, 2024, 11:36:07

Hi @user-91a92d! Would you mind sharing which versions of the Real-Time API and Companion App you are running? Could you try disabling the audio on the Companion device?

user-91a92d 03 December, 2024, 11:40:34

The mic is muted in the app => no change. I have pupil-labs-realtime-api 1.3.6.dev1+g8c0a790, and the Companion app was updated yesterday.

user-d407c1 03 December, 2024, 11:43:29

Thanks for following up! Are you on Linux by any chance? If so, could you try running our Realtime API examples and let us know if you experience the same issue?

user-91a92d 03 December, 2024, 11:58:34

No problem reading IMU data with the real-time API. We get some odd results with the yaw, and I would like to visualize the raw data.
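
For visualizing the raw orientation, yaw can be derived from the IMU's rotation quaternion. A minimal, hedged sketch, assuming a unit quaternion in (w, x, y, z) order and the usual ZYX (yaw-pitch-roll) convention; do verify the component order and frame convention your stream actually uses:

```python
import math

def quat_to_yaw_deg(w, x, y, z):
    """Yaw (rotation about the vertical axis) in degrees from a unit
    quaternion (w, x, y, z), standard ZYX Euler convention."""
    return math.degrees(math.atan2(2.0 * (w * z + x * y),
                                   1.0 - 2.0 * (y * y + z * z)))

# Example: a 90-degree rotation about the vertical axis
print(quat_to_yaw_deg(0.7071068, 0.0, 0.0, 0.7071068))  # ~90.0
```

Plotting this alongside the raw quaternion components can help spot sign flips or axis mix-ups.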

user-fb8431 03 December, 2024, 12:23:34

I want to use deep learning models for line-of-sight (gaze) estimation tasks.

nmt 04 December, 2024, 07:56:44

Interesting! Well, let's start with the gaze depth. It looks like you're trying to measure this from a monocular gaze estimate. This won't be possible unless the gaze is intersecting with something known in the physical space. There are two ways around this:

  1. Use the variable gaze_point_3d, which is the nearest intersection of the left and right eyes' visual axes. This is a proxy for viewing distance. Note that it generally gets less accurate at further viewing distances. You don't need a custom plugin to access this in real time; I'd recommend instead using our real-time API. This example script shows how, and all of Core's variables are listed in this section of the docs.
  2. Use the Head Pose Tracker. Then you can calculate the actual viewing distance.

Let me know if those options sound workable!
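
To make option 1 concrete: Pupil Core reports gaze_point_3d in scene-camera coordinates in millimetres, so the viewing-distance proxy is just the Euclidean norm of that vector. A minimal sketch:

```python
import math

def viewing_distance_mm(gaze_point_3d):
    """Euclidean norm of gaze_point_3d, i.e. the distance from the scene
    camera origin to the estimated 3D gaze point, in millimetres."""
    x, y, z = gaze_point_3d
    return math.sqrt(x * x + y * y + z * z)

# Example: a gaze point half a metre straight ahead of the scene camera
print(viewing_distance_mm((0.0, 0.0, 500.0)))  # 500.0
```

The gaze_point_3d value itself would come from the gaze datums delivered over the Network API subscription mentioned above.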

user-f3a98e 04 December, 2024, 15:11:36

Hi! I want to write a Python script to read gaze coordinates in real time from Pupil Neon, and I’d like to map these coordinates to a defined surface/screen marked with 4 AprilTags. What would be the best practice for achieving this mapping? Any tips or suggestions would be greatly appreciated! Thank you!

user-d407c1 04 December, 2024, 15:19:57

Hi @user-f3a98e 👋! Have you seen this package (real-time-screen-gaze) and the accompanying tutorial? They do pretty much exactly what you are looking for.
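
For background, the core of this mapping is a plane homography from four known marker corner positions in the scene image to screen coordinates. A hand-rolled, illustrative sketch of just the math (the package itself handles AprilTag detection, distortion, and the rest for you; all names and coordinates below are made up for illustration):

```python
import numpy as np

def homography(src, dst):
    """3x3 homography H mapping each src point to its dst point
    (four correspondences), via the direct linear transform (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of this 8x9 system
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def map_point(H, pt):
    """Apply H to a 2D point using homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

# Hypothetical example: outer tag corners in the scene image -> a 1920x1080 screen
tag_corners = [(100, 80), (620, 90), (610, 400), (105, 390)]
screen = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
H = homography(tag_corners, screen)
print(map_point(H, (360, 240)))  # a gaze pixel mapped into screen coordinates
```

Corner ordering matters: the src and dst lists must pair up corner-for-corner, or the mapping will be mirrored or twisted.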

user-f3a98e 04 December, 2024, 15:23:04

Thanks, I will have a look !

user-f3a98e 05 December, 2024, 16:16:18

Hey, does anyone know if the real-time-screen-gaze package works with Asynchronous API data? I'm manually synchronizing frame data and gaze data. However, when I try to process the frame with the asynchronous API, I get an error. When I do this with Simple API everything works. This leads me to suspect that the issue could be with the frame format. I'm using the receive_video_frames function. Any advice or insights would be greatly appreciated!

user-cdcab0 05 December, 2024, 16:21:06

The data are the same between the simple and async APIs. The real-time-screen-gaze package can work with either - we are currently using the async api with real-time-screen-gaze in our PsychoPy plugin. If you can share your code, I may be able to help you troubleshoot it

user-0e6279 09 December, 2024, 09:16:32

Hi, we want to connect our Neon directly through an Ethernet cable and hub to an Ubuntu 22.04 PC. It was working before; it was implemented by someone who no longer works here, and we needed to reset the PC. Is there a tutorial on what to do other than the short one in the docs? Also, we're not using a DHCP server on our local network.

user-f3a98e 09 December, 2024, 09:21:04

Thanks for the reply! I am currently trying to make the basic call to gaze_mapper.process_frame work. I suppose I need to format the video_frame but I am not sure to what exactly.

Chat image

user-cdcab0 09 December, 2024, 09:46:10

You don't need to do anything special to the video frame - you can pass it in exactly as you have.

You didn't mention any specifics about your error, but there is a significant bug in your code. You have nested loops - this means you'll acquire one, singular gaze data packet and then all of the scene data frames. That inner loop keeps going, so you'll never actually receive updated gaze data.

These tasks need to be done concurrently, not nested like that. You'll want something more like this:

import asyncio

# receive_video_frames / receive_gaze_data are the streaming helpers from
# pupil_labs.realtime_api; the local coroutines are named stream_* so they
# don't shadow the helpers they call
from pupil_labs.realtime_api import receive_gaze_data, receive_video_frames

async def stream_scene_data(status):
    sensor_world = status.direct_world_sensor()
    async for frame in receive_video_frames(sensor_world.url, run_loop=True):
        gaze_mapper.process_scene(frame)

async def stream_gaze_data(status):
    sensor_gaze = status.direct_gaze_sensor()
    async for gaze_data in receive_gaze_data(sensor_gaze.url, run_loop=True):
        surface_map = gaze_mapper.process_gaze(gaze_data)
        for surface_gaze in surface_map.mapped_gaze[screen_surface.uid]:
            print(f"Gaze based on surface at {surface_gaze.x}, {surface_gaze.y}\n")

async def do_tasks():
    status = await device.get_status()
    await asyncio.gather(
        stream_scene_data(status),
        stream_gaze_data(status)
    )
user-f43a29 09 December, 2024, 09:24:36

Hi @user-0e6279 , to connect directly via Ethernet-over-USB with Ubuntu 22 and up, try the following (no need for a DHCP server, in general, as the device is found via mDNS):

Default Gnome environment:

  1. Open Settings and go to Network
  2. Go to Wired connections, click the + sign to add a new Ethernet connection
    • Give it a memorable name like "Neon"
  3. Under the IPv4 tab, choose "Shared to other computers", click Apply/Done (see first image)

Next:

  1. Make sure your Neon and USB hub are properly connected. The order in which you plug the cables is important:
    1. Unplug all cables from the USB hub. Also, unplug the hub from the phone. Close the Neon Companion app
    2. Plug Neon into the port marked with 5Gbps on the hub
    3. Start the Neon Companion App on the phone and wait for the "Plug in and go!" message
    4. Plug the USB cable of the Anker USB hub into the phone
    5. Wait for Neon to be recognized
    6. Now, you can connect the Ethernet cable to the USB hub and to a free Ethernet port on your computer

At this point, consider turning off the WiFi connection of the Neon Companion phone to be sure that it only uses the Ethernet connection. It could also be worth it to turn off the phone's Hotspot and Ethernet Tethering. All of these options are found in Settings -> "Network & internet"

Next, you will need to activate the appropriate Ethernet connection/interface for your Neon to be accessible via the Real-Time API. This again depends on the operating system:

  • In Settings → Network, click on the "Neon" connection that was made in the previous steps. It will show a checkmark when it has been successfully activated (see image 2)

Wait about 30 seconds and test if you can connect to the device with the "discover_one_device" function.

If you also want to charge the device during longer recording sessions, then at this point, you could connect a power source, such as the provided Quick Charger, to the PD[in] port on the USB hub

Chat image Chat image

user-f3a98e 09 December, 2024, 10:36:56

Sorry for not providing the error. When I just use the video_frame, I get the error in the picture. However, when I use video_frame.to_ndarray(format="bgr24") it seems to work, so I guess that problem is solved. I am just curious why the normal frame throws the error and what it could mean. Also, do you know if using video_frame.to_ndarray(format="bgr24") would yield the same results as using the video_frame as-is? I am sorry for bombarding you with questions, and thanks a lot for the help.

Chat image

user-cdcab0 09 December, 2024, 10:51:09

Ah, my mistake, actually. There is a difference between video frames when using the simple API versus the async API.

To access the pixel data with a frame from the simple API, it's done through frame.bgr_pixels. When using the async API it's frame.bgr_buffer(). The screen gaze package will accept either the pixel data directly or a video frame from the simple API as-is, but it doesn't accept a frame from the async API as-is (a bug that will soon be corrected).

By the way, frame.bgr_buffer() is just a convenience wrapper for video_frame.to_ndarray(format="bgr24")
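
For context, "bgr24" just means the three colour channels are stored in reverse order relative to RGB (8 bits each). A tiny illustration of that layout using plain numpy:

```python
import numpy as np

def rgb_to_bgr(rgb):
    """Reverse the channel axis of an RGB image array - the same memory
    layout that to_ndarray(format="bgr24") / bgr_buffer() produce."""
    return np.ascontiguousarray(rgb[..., ::-1])

# A single red RGB pixel becomes (0, 0, 255) in BGR order
print(rgb_to_bgr(np.array([[[255, 0, 0]]], dtype=np.uint8))[0, 0].tolist())  # [0, 0, 255]
```

So both access paths yield the same pixel values, just exposed through different attribute names.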

user-f3a98e 09 December, 2024, 10:55:18

Thank you

user-cdcab0 09 December, 2024, 11:47:06

You're welcome - thank you for letting me know about this oversight. If you update your real-time-screen-gaze package, you'll find that you can send it frames as-is from either the simple API or the async API - no need to use bgr_pixels or bgr_buffer() or to_ndarray(...)

Please do keep in mind what I said before about your nested async loops. I'd also suggest you review the "Scene Camera Video With Overlayed Gaze" async sample from the realtime-api documentation

user-05ba05 19 December, 2024, 12:18:10

Hi! Would it be possible to know when the next release is planned? Some features have been added, and it would be great to have a timeframe for the release so we know when these will become available.

user-d407c1 19 December, 2024, 15:39:36

Hi @user-05ba05 👋! Would you mind clarifying which software update you are looking for?

user-05ba05 19 December, 2024, 22:55:06

Of course. I've been told about a feature adding keyboard shortcuts to the web UI to put labels on the live video from the glasses. I'm not sure which software is concerned, but I guess it's the Companion mobile app!

nmt 20 December, 2024, 11:48:16

Hi @user-05ba05! We are close, but I can't provide a concrete timeline just yet. Keep an eye out in 📯 announcements 🙂

user-05ba05 20 December, 2024, 15:10:50

Okay, great, thanks!

user-83d076 23 December, 2024, 17:02:54

Hi community

I'm having issues regarding this: https://docs.pupil-labs.com/alpha-lab/map-your-gaze-to-a-2d-screen/ (Alpha Lab - Map Gaze Onto Dynamic Screen Content)

I'm using the Neon Companion version: 2.8.34-prod

I'm following the instructions, but at the end I can't save the output. The error I end up getting is in the attached image.

Has anyone figured out this issue before? I'm not an experienced coder, so this is a bit difficult. I did, however, pull the updated code regarding the audio issue, which does work.

---update--- One thing I figured out: if you delete the line where the error occurs, it's fixed... but I'm not sure if this is the best solution.

Chat image

user-d407c1 23 December, 2024, 17:28:20

Hi @user-83d076! The audio codec employed on Neon changed back in April, and I hadn't revisited that tutorial's code since then. I will try to allocate some time to fix it soon, but meanwhile, if you do not need audio and simply removing that reference solves it, feel free to use it like that.

user-83d076 23 December, 2024, 18:12:18

Perfect. Thank you for the update. I don't plan on using the audio anyways

user-d7eed4 24 December, 2024, 16:16:04

Hey 🙂 I'm trying to use the batch_exporter plugin, but I've encountered an issue with missing imports. Specifically:

The plugin attempts to:

from exporter import export as export_function

However, the exporter module doesn't seem to exist in the current setup. Has this module's name changed, or am I missing something in the configuration?

It also tries:

from player_methods import is_pupil_rec_dir

While is_pupil_rec_dir is missing, I can probably implement this myself if needed.

Also, if this module is in shared_modules, I shouldn't need to add it manually to the plugins directory, correct?

Any guidance on resolving this would be greatly appreciated.

Thanks in advance!

nmt 27 December, 2024, 14:02:06

Hi @user-d7eed4! I'm afraid that batch exporter plugin is long since deprecated (circa 2019) due to technical reasons. Depending on which data you're looking to export, you might want to check out this community-contributed repository: https://github.com/tombullock/batchExportPupilLabs

user-d7eed4 28 December, 2024, 12:27:04

Thanks!

user-84387e 28 December, 2024, 21:11:05

Hey, is anyone at 38c3 and down for a chat about dev stuff? I'm especially interested in pye3d.

user-84387e 29 December, 2024, 15:29:52

Hey 🙂 @nmt, who is currently maintaining pye3d? pfaion told me to ask here on Discord. I'm interested in the licensing situation. Currently it just says "All Right reversed". On PyPI it says MIT. The related issue: https://github.com/pupil-labs/pye3d-detector/issues/70

End of December archive