👓 neon


user-d407c1 01 April, 2025, 06:50:26

Note that you can also find examples for Psychopy here.

user-8825ab 01 April, 2025, 10:17:21

Hi, can I ask which unzip software you recommend for unzipping the downloaded recording file from Pupil Labs?

user-f43a29 01 April, 2025, 14:33:12

Hi @user-8825ab , have you tried 7-Zip? Please see this message: https://discord.com/channels/285728493612957698/633564003846717444/1343484891517681736 But, may I also ask what exact error message you are receiving and what Operating System you are on?

user-8825ab 01 April, 2025, 10:18:16

I have downloaded the recording file, but it won't let me open it. Can I ask for some help, please?

user-3e88a5 01 April, 2025, 15:35:35

Hello, I would like to change the parameters used to evaluate saccades and fixations in your algorithm. Is there a way to do it in the Cloud? Or is there maybe some Python source code that I can run locally? Thank you

user-f43a29 02 April, 2025, 07:30:41

Hi @user-3e88a5 , the parameters on Pupil Cloud cannot be changed, but you can modify the parameters in the pl-rec-export implementation. It is written in Python and can be run locally.
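
For reference, running the export locally is a two-step affair (a sketch; the CLI name and usage are taken from the pl-rec-export README, and the recording path is a placeholder):

```python
# A minimal sketch of invoking pl-rec-export from Python, assuming it was
# installed with `pip install pl-rec-export`. To change the fixation/saccade
# detector parameters, clone the repository instead and edit them in the
# source before running the export.
import subprocess

subprocess.run(["pl-rec-export", "/path/to/raw/recording"], check=True)
```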

user-4440c8 02 April, 2025, 01:01:33

Good afternoon, I would like to know how I can access the accelerometer data in the Neon raw data. I want to know the strength of gravity (hypergravity, normogravity, microgravity) at the time of recording. Thank you!

user-f43a29 02 April, 2025, 06:47:28

Hi @user-4440c8 , if you're looking to extract it from the saved raw data, then our pl-neon-recording library can do that. It is open-source, so it shows how to work with the open binary file formats. Check out this IMU example. You can also stream it during the recording with the Real-time API.
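
As a small illustration of the streaming route, something like the following should work (a sketch; `receive_imu_datum()` and the `accel_data` field names follow the simple Realtime API docs, but verify them against your installed version):

```python
# Sketch: estimate the gravity level from one live IMU sample. Assumes the
# accelerometer reports in units of g along the IMU's x/y/z axes.
import math
from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device()
imu = device.receive_imu_datum()
a = imu.accel_data  # 3-axis accelerometer reading (assumed attribute name)
magnitude = math.sqrt(a.x**2 + a.y**2 + a.z**2)
print(f"|a| = {magnitude:.3f} g")  # ~1 g at rest under normogravity
device.close()
```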

user-904c9a 02 April, 2025, 14:10:52

eye_state

user-e13d09 02 April, 2025, 16:20:03

Hello, I was wondering if it was possible to get access to my previously closed tickets as I can no longer see the solutions provided by the mod team.

user-f43a29 03 April, 2025, 08:11:10

Hi @user-e13d09 , yes, this is possible. We will follow up with you in the Support Ticket.

user-d407c1 03 April, 2025, 06:49:56

My colleague @user-f43a29 ran some tests, and the latency until results are available in MATLAB or Python was within ~14-15 ms (over Ethernet).

For screen-based settings, you would additionally need to map gaze to the screen, which can take an extra ~1 to ~1.5 ms with the real-time-screen-gaze package, although that also depends on your computer's resources.

I hope this addresses your concerns about latency.

user-2109ae 03 April, 2025, 09:24:04

Hi! I am using RTSP to stream video from the Neon glasses and I would like to undistort the video stream to do further processing. This article (https://docs.pupil-labs.com/alpha-lab/undistort/) was helpful, although there is a dead link. From what I can tell, you have some software that undistorts the video; are the camera parameters that the software uses available somewhere?

user-d407c1 03 April, 2025, 09:44:53

Hi @user-2109ae 👋 ! If you are using the realtime api, you can directly access the camera parameters from it. Check out this example.
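
For anyone reading along, the flow looks roughly like this (a sketch; the `get_calibration()` call and the `scene_camera_matrix` / `scene_distortion_coefficients` field names are taken from the Realtime API examples and may differ between versions):

```python
# Sketch: fetch the scene camera intrinsics over the Realtime API and
# undistort one frame with OpenCV.
import cv2
import numpy as np
from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device()
calib = device.get_calibration()
K = np.asarray(calib.scene_camera_matrix).reshape(3, 3)
D = np.asarray(calib.scene_distortion_coefficients)

frame = device.receive_scene_video_frame()
undistorted = cv2.undistort(frame.bgr_pixels, K, D)
device.close()
```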

user-688acf 03 April, 2025, 10:12:27

Help please: I just recorded several studies with Pupil Neon. Recording and saving on the phone worked fine, without issues or errors, but now all recordings show 0 s and, upon export, no data.

user-688acf 03 April, 2025, 10:15:03

Found them, in some strange way: all of the recordings are combined into the very first one, which shows 0 s but is 1:02:40 long, even including periods when recording was off. It looks like I pressed "record" once and then let it run for an hour... which definitely didn't happen.

user-f43a29 03 April, 2025, 10:16:28

Hi @user-688acf , can you open a Support Ticket in the 🛟 troubleshooting channel?

user-c881f1 03 April, 2025, 20:13:51

Hello, I'm trying to install Neon Player but keep getting an error saying "This installation package could not be opened."

user-cdcab0 04 April, 2025, 04:49:58

Hi, @user-c881f1 - ~~do you mind opening a ticket in 🛟 troubleshooting?~~

I was able to replicate the issue on Windows. It seems to be some problem with our automated build pipeline. For now I have manually re-built the installer package locally and updated the download. Please re-download the installer and try again.

Note that the working installer is ~310 MB. If your fresh download is <100 MB, then it's the old version and you'll need to clear your browser cache and download again.

user-ccf2f6 03 April, 2025, 20:17:26

Hi Pupil Labs! We were planning to get an Android device to go with one of our Neon eye trackers. Is there a minimum RAM requirement for the Companion app? Would a Moto 40 Pro with 8 GB RAM (+256 GB storage) be sufficient for complete 200 Hz recordings?

user-f43a29 04 April, 2025, 13:09:01

Hi @user-ccf2f6 , are you potentially referring to the 8GB RAM Moto Edge+ (2023)?

user-c881f1 04 April, 2025, 11:58:07

@user-cdcab0 Thank you!

user-cdcab0 04 April, 2025, 12:05:37

No problem - thanks for reporting it 🙂

user-f4b730 04 April, 2025, 18:22:13

Hello, I am using Neon with PsychoPy. The two are connected using a router and the Anker hub suggested by Pupil Labs. All was working well until, during testing, the Companion App stopped starting recordings.

I have tried many things, but at the moment, if the devices are connected:
- I can see the camera stream in the browser (so the laptop can connect)
- the function discover_one_device does not discover anything (None output)
- the attached PsychoPy script is not able to make the recording start.

BUT, if I detach the Anker hub from the phone and then reattach it (with glasses and Ethernet always connected to the hub), only then does the PsychoPy-Pupil connection work.

Any advice on why the Companion App is giving these issues? (I have already tried uninstall/reinstall, clearing the cache, clearing all data, and turning it off and on.)

Pupil_v0.3.psyexp

user-cdcab0 04 April, 2025, 20:22:10

Hi, @user-f4b730 - just to make sure I understand your problem correctly - you're saying that PsychoPy can only interact with your Neon if you unplug it from the hub and then re-attach it?

user-11dbde 05 April, 2025, 14:37:30

Since the end of last week, our Pupil Labs Neon has been stuck on an FPGA update. Every time we start the Neon Companion app and connect the Neon eye tracker, an FPGA update starts and successfully finishes. After a restart, the app always asks to install the FPGA update again. What can we do?

user-d407c1 07 April, 2025, 06:43:24

Hi @user-11dbde 👋 ! Sorry to hear that. Could you open a ticket on 🛟 troubleshooting or send an email to info@pupil-labs.com so we can follow up with next steps?

user-bd5142 08 April, 2025, 06:31:33

Hi, I would like to inquire about the coordinate accuracy (error) of the gaze point data measured by Neon, and within what bounds it can be kept. Thanks

user-f43a29 08 April, 2025, 08:17:23

Hi @user-bd5142 , Neon's gaze estimation accuracy is ~1.3-1.8 degrees, as assessed & validated in our Neon Accuracy Test Report.

If you have observers that significantly deviate from the population average, such that their gaze estimates exhibit a constant offset, then you can apply a one-time Gaze Offset Correction in their Wearer Profile. It is saved for future recordings, so if they were to come back, let's say 4 months later, you simply load their Wearer Profile, the Offset Correction is automatically applied, and you continue recording.

user-0001be 08 April, 2025, 14:40:39

Hi Pupil Labs! I’ve created a custom program using the API with PsychoPy, where it counts down to the start after the recording begins when I hit Enter. It worked yesterday but not today. I’ve tried different companion phones, modules, and glasses, but the issue seems to persist. By any chance, was there an automatic update last night?

user-d407c1 08 April, 2025, 15:06:53

Hi @user-0001be 👋 ! Just a heads-up — there was a major PsychoPy update last Friday. Could it be that it was updated on your end?

Please note also that we don’t automatically update any Python libraries on your system, so if something changed, it likely came from a local update.

user-eebf39 08 April, 2025, 18:30:19

Hello. We want to stream gaze data from the Motorola phone to a laptop when we don't have internet. We purchased the recommended Anker USB hub, and I am able to connect the laptop and the Motorola together, but the Motorola can't find the Neon glasses. USB Camera is also unable to find the glasses. The Neon glasses are on (LED is on). Any tips on how to connect the two? Thank you. Update: it seems I can connect either the laptop and the Motorola, or the Motorola and the glasses, so I may instead need a USB switch. Has anyone successfully connected a laptop to the phone to the glasses with a USB switch, without internet?

user-f43a29 08 April, 2025, 22:19:59

Hi @user-eebf39 , have you plugged Neon into the port marked "5 Gbps"? The USB cable on the Anker hub also should be plugged into the Companion phone for Neon to be properly recognized. A USB switch is not necessary with that hub.

With respect to connecting to the laptop, you do this via Ethernet cable. Do you have a router? That is the easiest method: you use the Ethernet port of the Anker hub to connect Neon to the router. Then, you also connect the laptop via Ethernet cable to the router and it handles the automatic device discovery. The router does not need an internet connection or WiFi functionality.

user-7d4a32 08 April, 2025, 18:47:56

Hello, where can I find a detailed guide for using the new Pupil Labs device? I want to understand as many of the features as possible. I appreciate the help!

user-f43a29 08 April, 2025, 22:21:12

Hi @user-7d4a32 , you can reference Neon's Documentation. Feel free to ask any questions you have here!

user-a83fa3 09 April, 2025, 10:38:00

Hello! I am using PsychoPy to run a perceptual experiment. The eye tracker (ET) and the companion device are running on the same network. When I run the experiment (or in pilot mode), it does not trigger the ET to start recording. I'm also not getting any timestamps for events; I assume because it isn't recording anything. What am I missing? I have the "start recording" component at the start of the first trial, and the stop component at the end of the experiment. The PLevents components are at the start of each trial. Still, nothing is recording. If I have Chrome open with Neon Monitor while I run the pilot/experiment, it shows it is connected to the device. But still, nothing is recording, unless I do it manually.
Any help is greatly appreciated!

user-cdcab0 09 April, 2025, 12:23:12

Hi, @user-a83fa3 - could you provide some more information? The following would be really helpful:
* PsychoPy log from running your experiment
* Your .psyexp file
* The version number of PsychoPy you're running
* Your operating system

user-cdcab0 09 April, 2025, 12:04:42

Don't use that one yet. I think they are still considering it beta, as they bundled it last Friday but haven't posted it on their website yet

Some explanation - 2025.1.0 requires a minor change to plugins. Without it, plugins don't work. Unfortunately, that same change makes it so that the plugins don't work in older versions of PsychoPy.

I have the 2025.1.0 version of the plugin ready to go, but I don't want to publish it until they make 2025.1.0 available on their website (otherwise everyone who uses the download from the website will not be able to use the plugin)

If you need to use 2025.1.0, I can send you a .whl of the plugin, but the installation process is a little different

user-f4b730 09 April, 2025, 12:05:42

I'll try reinstalling 2024.2.4 and see what happens.

user-cdcab0 09 April, 2025, 12:54:33

I have also noticed that sometimes on Windows, the PsychoPy uninstaller doesn't completely remove the PsychoPy folder, and I have to manually delete it.

user-cdcab0 09 April, 2025, 14:11:37

I would ask that you:
1. Move that to ~~the end of the routine or~~ each frame
2. Restart your Companion Device
3. Share an image of the scene camera's view of your screen while it's displaying AprilTag markers. You can either share the recording's video or just take a screenshot of the scene preview in the Companion app
4. Share your PsychoPy log

user-f4b730 09 April, 2025, 14:14:22

Sure, I will try it tomorrow. Shall I send the video to [email removed]?

user-bda2e6 09 April, 2025, 16:56:07

Hi! @user-cdcab0 I have a quick question. In the hdf5 data obtained through psychopy, is there a way to tell when there is a blink?

user-cdcab0 09 April, 2025, 21:02:43

Not yet. Blink detection will soon be possible on-device, and at that point I believe it will be possible to send blinks to PsychoPy, although I haven't looked into it too deeply yet. Would you consider submitting a 💡 features-requests?

user-a83fa3 10 April, 2025, 08:32:50

I do usually have it in Run mode. I'm trying it on a different computer that isn't set up on the university network. It doesn't actually trigger the eye tracker to record, though; I have to manually hit record on the Companion Device. Would it help if I connect it directly to the computer?

user-cdcab0 10 April, 2025, 08:40:49

> It doesn't actually trigger the eye tracker to record, though
Yes, the recording triggers and your events are saved in the recording for me.

> Would it help if I connect it directly to the computer?
By "it" do you mean the Companion Device? That might solve your problem, but you would need a USB hub, and there is some configuration involved.

Are you able to share the PsychoPy log?

user-a83fa3 10 April, 2025, 08:55:23

So the events are saved on the Companion Device for you? Like, the Companion Device is recording what the eye tracker sees? By "it" I was referring to the actual eye tracker.

user-a83fa3 10 April, 2025, 08:55:40

Pilot2_lastrun.py

user-a83fa3 10 April, 2025, 08:56:01

372414_Pilot2_2025-04-10_11h23.35.027.log

user-a83fa3 10 April, 2025, 09:00:55

I see the events in the .csv file with a time stamp, but there is no video recording saved on the companion device

user-a83fa3 10 April, 2025, 09:03:51

I got it to work!!! Thank you!!

user-cdcab0 10 April, 2025, 09:09:20

Glad to hear it! What ended up being the problem?

by "it" i was referring to the actual eye tracker By the way, the Companion Device is a required component of the Neon eyetracking system. Connecting your Neon frame to a PC will not give you any gaze data. This is produced by the Companion Device

user-e6ae95 10 April, 2025, 13:16:59

Guys, the Neon app has started to crash on OnePlus phones. Any ideas how to solve this quickly?

user-f43a29 10 April, 2025, 13:21:30

Hi @user-e6ae95 , could you open a Support Ticket in 🛟 troubleshooting? We will follow up with you there.

user-12efb7 11 April, 2025, 06:31:17

Hi, I would like to ask:

1) Is there an easier way to do what I’m trying to achieve?
2) What can be a reason that the gaze.csv (with columns 'gaze position transf x [px]' and 'gaze position transf y [px]', output by pl-dynamic-rim) does not match the (x, y) positions in the screen video?

  1. The output from pl-dynamic-rim includes a video composed of three views (1st screenshot) and a gaze.csv. I only need the screen recording with the gaze overlaid. To do this, I extracted the main function from pl-dynamic-rim and modified a few lines (code.txt). Is there a simpler way to obtain just the screen video with the gaze overlay?

  2. I’ve attached screenshots showing that the gaze data doesn’t align with the position shown in the screen recording. This issue hasn't occurred before: previous videos matched correctly with the gaze data, but in this case they don't.

Chat image Chat image code.txt

user-d407c1 11 April, 2025, 08:27:27

Hi @user-12efb7 👋

Great to hear the package is working for you! It's definitely long overdue for a cleanup and refactor 😅 — hopefully I’ll find some time to update it soon.

Just a quick note: we generally don’t provide code reviews unless you're on a consultancy package. If that’s something you’re interested in, feel free to check out our support packages.


1) Is there an easier way to do what I’m trying to achieve?

Generating the three videos is a good approach for cross-checking — especially to catch cases where gaze was incorrectly placed (e.g. due to a blink or misalignment in the reference image mapper).

If you're looking to get a final video with just the end result, you can make the following minimal changes to the library:

2) What could cause gaze.csv values to not match the (x, y) positions in the screen video?

The values in gaze position transf x [px] and gaze position transf y [px] are exactly what gets plotted in the screen video. If there’s a mismatch, it’s likely due to a modification in your code, perhaps an unintended change to how frames, scaling, or mappings are handled; without diving into the changes you made, it's hard to know.

user-12efb7 11 April, 2025, 08:44:28

Thanks so much! I didn’t know about that — I’ll definitely check it out 🙂 And thanks for answering my question as well 🙂

user-12efb7 11 April, 2025, 09:23:58

One more question — what’s the best practice for calibration if I'm planning to use Neon only with on-screen data? At the moment, I’ve only found this: https://docs.pupil-labs.com/neon/data-collection/offset-correction/

user-d407c1 11 April, 2025, 09:47:41

Neon uses NeonNet, a deep learning-based approach to infer gaze directly from eye images — so no calibration is required.

That said, performing a Gaze Offset correction is definitely the optimal approach if you’re aiming to fine-tune accuracy. Note that this only needs to be done once per wearer, since the offset is stored in their profile.

You can also apply the offset correction post-hoc to individual recordings if needed.

user-937ec6 11 April, 2025, 16:22:35

Hello, thanks for the help provided earlier. I have custom Python scripts, based somewhat on the samples, that:
1) stream the scene and eye cameras from the Neon and save them to MP4 files;
2) overlay the gaze data dynamically onto the scene camera video, which is both saved and previewed on-screen;
3) receive and write out the IMU and gaze data to flat files;
4) overlay information from other Lab Streaming Layer feeds onto the scene camera video dynamically as well;
5) write out video heartbeat information (duration and frame info) to LSL.

My customer would like to have audio muxed into the scene video. Is it possible to get the audio information? From what I can tell, RTSP is being used underneath. I can even see that a URL parameter of "audioenable=on" is being supplied. Thank you for any help that can be provided.

user-937ec6 11 April, 2025, 17:42:36

Audio is definitely present in the RTSP stream. If I enable the microphone, I can hear audio if I point VLC at the following: rtsp://<ip address>:8086/?camera=world&audioenable=on

user-d407c1 14 April, 2025, 11:27:21

Hi @user-937ec6 ! Currently the Realtime Python API client does not have support for audio streaming.
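
One possible workaround until then: read that RTSP endpoint with a general-purpose media library. A sketch using PyAV against the URL format noted above (the IP address is a placeholder, and whether the audio track demuxes cleanly this way is an untested assumption):

```python
# Sketch: pull audio frames from Neon's RTSP stream with PyAV, since the
# Realtime Python API client does not expose audio. Replace the IP with
# your Companion device's address.
import av

url = "rtsp://192.168.1.42:8086/?camera=world&audioenable=on"
with av.open(url, options={"rtsp_transport": "tcp"}) as container:
    audio = container.streams.audio[0]
    for frame in container.decode(audio):
        # frame.to_ndarray() yields PCM samples you could mux into the MP4
        print(frame.pts, frame.sample_rate, frame.samples)
```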

user-a83fa3 14 April, 2025, 12:48:46

I've now run out of space on Pupil Cloud, even though I've deleted a bunch of old recordings. How do I "empty the trash", so to speak? I am doing it in the Trash section, but I keep getting an internal server error. It isn't updating how much storage is actually left when there are deleted recordings, and I haven't got any enrichments for these files.

user-cdcab0 14 April, 2025, 16:29:15

Could you open a ticket in 🛟 troubleshooting , please?

user-f43a29 14 April, 2025, 22:21:40

Hi @user-a83fa3 , you can find the Trash by clicking the three-dot menu at the top left of the Workspace view and choosing Show trashed, as shown in the attached image.

Chat image

user-05ba05 14 April, 2025, 12:56:22

Hi! I was wondering if there was a way to merge two separate recordings from the Monitor app?

user-d407c1 14 April, 2025, 14:04:07

Recordings, whether triggered from the Monitor app or the phone itself, cannot be merged. If you would find this a useful addition, please feel free to request it in the 💡 features-requests channel for evaluation.

user-2b5d07 14 April, 2025, 13:52:36

Hi, I used the Neon eye tracker on a surgical robot to record gaze data. Unfortunately, the scene camera was facing the robot and didn’t capture the surgical field. To visualize the gaze, I used the robot’s internal video and projected the gaze coordinates onto it after synchronization. However, the projection isn’t accurate. Could this be due to a calibration issue or the strong surgical lighting affecting the eye tracker? I’d appreciate your thoughts. Thanks !

user-d407c1 14 April, 2025, 14:23:28

Hi @user-2b5d07 👋 ! There are quite a few unknowns here — would you mind sharing how you performed the re-projection onto the surgical robot’s video?

From what you've described, that step seems like the most likely source for an issue.

user-9a1aed 14 April, 2025, 14:42:41

Hi, I am using PsychoPy to write a program that shows image stimuli, following this tutorial: https://docs.pupil-labs.com/neon/data-collection/psychopy/. I wonder if the AprilTag needs to be added to each image stimulus on the computer display, as shown in the gaze_contigent_demo.psyexp? Thanks in advance!

user-f43a29 14 April, 2025, 22:22:38

Hi @user-9a1aed , if you want to know where the person looked on each image, then yes, it will be necessary to display the AprilTags for each stimulus.

user-bda2e6 14 April, 2025, 16:04:14

We’ve been successfully using Neon with PsychoPy for data collection over the past three months without any issues. However, today it suddenly stopped working — PsychoPy is no longer communicating with Neon. The start and stop recording triggers aren’t functioning, gaze data is not being received, and the red gaze circle is no longer visible on the companion phone.

I’ve already tried restarting all devices, clearing the cache, and resetting any active Python backends on the laptop, but the issue persists. The browser live stream still works, so it seems that the connection between the phone/eye tracker and the computer is intact.

Do you have any idea what might be causing this? I’d appreciate any insight you could provide. Thank you!

Chat image

user-bda2e6 14 April, 2025, 17:13:38

Update: the neon.local:8080 browser live stream also stopped working

user-cdcab0 14 April, 2025, 17:33:42
  • the red gaze circle is no longer visible on the companion phone
  • neon.local:8080 browser live stream also stopped working

Can you open a ticket in 🛟 troubleshooting? You'll want to resolve these problems before trying anything with PsychoPy

user-ccf2f6 14 April, 2025, 23:06:05

Hi Pupil Labs, I was looking to get more information about how the timestamping on world/scene and eye cameras is achieved in Neon eye-trackers. Is there documentation around it already? I'm particularly interested in accessing the scene camera frames directly as a usb video device and get as accurate timestamps as possible. This'll help me do some image processing tests directly on the scene camera output instead of having to record it with the companion app and then exporting it for processing. Thanks in advance!

user-f43a29 16 April, 2025, 12:35:25

Hi @user-ccf2f6 , all data from Neon are timestamped with the same high-precision clock. If you are looking to use Neon's Scene Camera that way, then you can similarly timestamp the incoming video frame with the high-precision clock that is present on the receiving device. You will just want to account for USB transmission + processing delay, which could depend on the receiving system.
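
A minimal sketch of that receive-side timestamping (the device index is a placeholder, and the stamp necessarily includes USB transfer and decode delay):

```python
# Sketch: read the scene camera as a plain USB video device with OpenCV and
# stamp each frame with the host's monotonic clock on arrival.
import time
import cv2

cap = cv2.VideoCapture(0)  # device index for Neon's scene camera (assumption)
while cap.isOpened():
    ok, frame = cap.read()
    t_ns = time.monotonic_ns()  # host receive time, after transfer + decode
    if not ok:
        break
    print(f"frame at {t_ns} ns, shape {frame.shape}")
cap.release()
```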

user-f4b730 15 April, 2025, 08:28:30

Hello, I am using Neon Player v5.0.1 and the latest Companion App 2.9.0. After exporting the recording and trying to load it in Neon Player, Neon Player crashes. If I try to load it again, the screen shows "file format outdated, delete the Neon Player subfolder and start from fresh"... Even after doing this, Neon Player crashes again.

user-cdcab0 15 April, 2025, 08:40:16

Hi, @user-f4b730 - could you send me your recording to troubleshoot? I'm unable to replicate this with any of my recordings

user-f4b730 15 April, 2025, 08:35:01

the error in the log is the one at the bottom:

Chat image

user-0e6279 15 April, 2025, 16:39:48

Hello, is there an easy way to downgrade the Companion app? We rely on API version 1.3, and the phone was updated to the newest app version.

user-f43a29 16 April, 2025, 12:31:48

Hi @user-0e6279 , while rolling back the app is not possible, it is useful to clarify some points:

  • The Python Real-time package does no processing of data. That happens in the Neon Companion app. The Python package simply makes it easy to receive the raw data, so updating the Python package will not alter your data quality.
  • The Python Real-time package is backwards compatible. That means function names, signatures, and field names are unchanged. Functionality has only been added, not taken away. For example, I still run the same scripts while having updated the Python package several times.
  • Releases of both the Python package and the Neon Companion app contain important fixes & stability improvements.

user-13d297 15 April, 2025, 19:51:42

Hello! I am testing my preprocessing pipeline, which will use the Face Mapper enrichment. I only have 2 recordings so far, each about 1.5 minutes, yet the enrichment has been running for over 30 minutes. How long should I expect this processing to take? Is there any way to speed it up, as I will need to process about 2000 of these in the near future?

user-f43a29 16 April, 2025, 12:32:43

Hi @user-13d297 , would you be able to open a Support Ticket in 🛟 troubleshooting with the associated Enrichment IDs?

user-9a1aed 16 April, 2025, 07:34:31

Hello, I am running a program on psychopy with neon connected to the phone's Neon Monitor app, but the fixation does not seem to be synced with the eye tracker. I need the participant to fixate on the cross before the trial starts. How can I achieve that? I can only view the fixation data in the app, but the program does not capture the participant's fixation. Could anyone please help?

user-cdcab0 16 April, 2025, 08:00:10

Hi, @user-9a1aed - Are you wanting to visualize the participant's gaze in your PsychoPy app? PsychoPy doesn't automatically visualize gaze data for you, but it's pretty simple to implement yourself. We put together a very simple gaze-contingent demo in PsychoPy which does this. Check it out!

user-9a1aed 16 April, 2025, 08:10:02

I cannot find the stream option at the top left of my Companion Device, so perhaps I did not configure it correctly.

Chat image

user-cdcab0 16 April, 2025, 08:21:59

That video is from an older version of the app. Streaming information is now available by tapping the icon in the top right corner that looks like a phone with a wireless signal coming out of it (see screenshot).

If PsychoPy cannot connect to the device ("cannot connect to host neon.local:8080"), then nothing else will work correctly. First, please make sure that the Companion Device and the PC running PsychoPy are on the same network. Then, I'd go into your experiment settings and change neon.local to your Companion Device's IP address.

Chat image

user-9a1aed 16 April, 2025, 10:44:24

Chat image

user-cdcab0 16 April, 2025, 10:49:21

For exposure, your left image is good, but the right one is pushing it. Can you try making the markers larger though?

> Is it possible that the gaze indicator may have blocked the AprilTag?
There are multiple AprilTag markers for redundancy and accuracy. You can occlude several of those markers and still achieve good tracking.

> I saw that some people had issues with the local network and thus switched to the device's hotspot for connection. May I know if there is a way that I can check if the eye tracker is properly connected to the program?
You could try installing the Realtime API and running the examples. This would remove PsychoPy from the equation and help identify whether the problem is a networking issue.
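
A minimal connectivity check along those lines (a sketch using the pupil-labs-realtime-api package; the timeout is arbitrary):

```python
# Sketch: verify that the Companion device is reachable and streaming.
# If discovery hangs, construct Device() with the phone's IP address shown
# in the Companion app's streaming panel instead.
from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device(max_search_duration_seconds=10)
print(f"Found {device.phone_name} at {device.phone_ip}")
print(device.receive_gaze_datum())  # one gaze sample proves data flows
device.close()
```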

user-9a1aed 16 April, 2025, 10:45:54

I saw that some people had issues with the local network and thus switched to the device's hotspot for connection. May I know if there is a way that I can check if the eye tracker is properly connected to the program?

user-9a1aed 16 April, 2025, 11:15:52

Thank you. I connected my Mac to the device's hotspot and ran the code from https://docs.pupil-labs.com/neon/real-time-api/tutorials/ to connect the device using the IP address. I am not familiar with Python. After following the steps, I received the following error:

File "/Users/yyang/Library/CloudStorage/OneDrive-HKUSTConnect/ustProject/aqRec_project/3results/scripts/from pupil_labs.realtime_api.py", line 1, in <module> from pupil_labs.realtime_api.simple import Device File "/Users/yyang/opt/anaconda3/lib/python3.9/site-packages/pupil_labs/realtime_api/init.py", line 4, in <module> from .device import APIPath, Device, DeviceError, StatusUpdateNotifier File "/Users/yyang/opt/anaconda3/lib/python3.9/site-packages/pupil_labs/realtime_api/device.py", line 11, in <module> from pupil_labs.neon_recording.calib import Calibration File "/Users/yyang/opt/anaconda3/lib/python3.9/site-packages/pupil_labs/neon_recording/init.py", line 6, in <module> from .neon_recording import load File "/Users/yyang/opt/anaconda3/lib/python3.9/site-packages/pupil_labs/neon_recording/neon_recording.py", line 9, in <module> from .stream.imu import IMUStream File "/Users/yyang/opt/anaconda3/lib/python3.9/site-packages/pupil_labs/neon_recording/stream/imu/init.py", line 1, in <module> from .imu_stream import IMUStream File "/Users/yyang/opt/anaconda3/lib/python3.9/site-packages/pupil_labs/neon_recording/stream/imu/imu_stream.py", line 7, in <module> from . import imu_pb2 File "/Users/yyang/opt/anaconda3/lib/python3.9/site-packages/pupil_labs/neon_recording/stream/imu/imu_pb2.py", line 11, in <module> from google.protobuf.internal import builder as _builder ImportE

user-9a1aed 16 April, 2025, 11:34:36

It looks like the API is for more advanced users. I used the Monitor app to view the real-time tracker on my Mac, like below. Does this mean the connection works successfully?

Chat image

user-cdcab0 16 April, 2025, 11:37:04

> It looks like the API is for more advanced users.
It is, but it can help us determine the problem better.

> I used the Monitor app to view the real-time tracker on my Mac. Does this mean the connection works successfully?
Yes, probably, but the error you shared above could be the cause of your PsychoPy not receiving data. Can you share your PsychoPy log?

user-9a1aed 16 April, 2025, 12:54:49

Okkk. Thank you! I got it working, but the accuracy seems very off. In my program, the participant has to look at the cross before viewing each image (in the demo, I fixated on the cross the whole time, but the gaze indicator did not seem to show that?). When I test the program with the eye tracker, even when I am fixating on the cross, it does not initiate the stimulus presentation. The log from PsychoPy is below. Thank you!!

position: [-0.3 0. ]
2.5955 ERROR Failed to import the ioLabs library. If you're using your own copy of python (not the Standalone distribution of PsychoPy) then try installing it with:
  pip install ioLabs
2.5956 ERROR Plugin psychopy-iolabs entry point requires module psychopy_iolabs, but an error occurred while loading it.
Stopping run loop for rtsp://10.79.43.134:8086/?camera=gaze&audioenable=on
Stopping run loop for rtsp://10.79.43.134:8086/?camera=world&audioenable=on
1.7204 WARNING Monitor specification not found. Creating a temporary one...
ioHub Server Process Completed With Code: 0

########### Experiment ended with exit code 0 [pid:9436]

user-cdcab0 16 April, 2025, 13:00:45

On a laptop with Windows, you definitely should double check that UI scaling in your display settings is set to 100%, but yours looks like it just needs a gaze offset

user-b55ba6 16 April, 2025, 18:29:50

Hi. I have one question about the IMU data. Looking at the raw data I have (full data line below):

euler = [-24.877704997985777, -9.23895125875146, 109.65766517017222]
quaternion = [0.5464919805526733, 0.13019901514053345, -0.1879594624042511, 0.8056462407112122]

If I run:

from scipy.spatial.transform import Rotation as R
R1 = R.from_quat(quaternion)
print(R1.as_euler('XZY', degrees=True))

the result does not match the euler values above. I was looking for the axis combination that could explain the transformation but can't find it. What am I missing?

This is the whole line from the imu.csv file:

c6785c19-b102-43ad-842a-43547b1779c7, de0ee700-8a17-4fae-b93b-dda8b1d9fca8,1719939445325777116, -62.200546, -2.672195, 140.501022,0.490723, -0.298828,0.9375, -24.877704997985777, -9.23895125875146, 109.65766517017222, 0.5464919805526733, 0.13019901514053345, -0.1879594624042511, 0.8056462407112122
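
One way to hunt for the convention is to brute-force every axis sequence (a sketch; it assumes scipy's scalar-last [x, y, z, w] quaternion order and uses the values from the message above):

```python
# Sketch: print the euler angles for all 6 axis orders, both extrinsic
# (lowercase) and intrinsic (uppercase), and eyeball which one matches.
from itertools import permutations
from scipy.spatial.transform import Rotation as R

quaternion = [0.5464919805526733, 0.13019901514053345,
              -0.1879594624042511, 0.8056462407112122]
rot = R.from_quat(quaternion)

for p in permutations("xyz"):
    axes = "".join(p)
    for seq in (axes, axes.upper()):
        print(seq, rot.as_euler(seq, degrees=True))
# If nothing matches, also try reordering the quaternion components, e.g.
# a scalar-first file layout accidentally read as scalar-last.
```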

user-cdcab0 16 April, 2025, 18:35:46

I recently made this exact same mistake 🤦🏽

edit: see below

user-639b3a 17 April, 2025, 07:55:08

Hello, I have a problem with the CSVs exported from Pupil Cloud, specifically with the saccade data. I can see the saccades in the videos, but the saccades.csv files I downloaded from the Cloud are empty. Could you help me with this, please? (The app version I use is 2.9.0-prod.)

user-f43a29 17 April, 2025, 08:08:44

Hi @user-639b3a , I see you opened a Support Ticket in 🛟 troubleshooting . Thanks. We will follow up with you there.

user-083b4f 18 April, 2025, 19:42:04

Hi team,

I’m still seeing this error when calling device.receive_gaze_datum() from the Real-time API 1.5.0 on Ubuntu 22.04/ROS 2 Iron, even after disabling both “Compute eye state” and “Compute fixations” in the Neon Companion app:

Raw gaze data has unexpected length: [email removed] \x00?\x19\xf9y', timestamp_unix_seconds=1745004922.9094741)
Traceback (most recent call last):
  File "/home/callahan/.local/lib/python3.10/site-packages/pupil_labs/realtime_api/streaming/gaze.py", line 150, in receive
    cls = data_class_by_raw_len[len(data.raw)]
KeyError: 89
Stopping run loop for rtsp://10.76.85.60:8086/?camera=gaze&audioenable=on

What I’ve tried so far:
- Confirmed pupil_labs.realtime_api is v1.5.0
- Disabled “Compute eye state” & “Compute fixations” in the Companion app
- Restarted both the Companion app and my ROS node

Despite that, I still got the error. Can you advise on any additional debug steps you’d recommend?

Thanks for your help!

user-083b4f 18 April, 2025, 19:57:52

Sorry, never mind, it got fixed. Thanks!

user-083b4f 18 April, 2025, 22:40:58

Hi team,

I’m now encountering this when calling gaze_mapper.process_frame(frame, gaze):

[eye_gaze-10] Traceback (most recent call last):
[eye_gaze-10]   File "/home/garrison/ros2_ws/install/pupil_labs_ros2/lib/pupil_labs_ros2/eye_gaze", line 33, in <module>
[eye_gaze-10]     sys.exit(load_entry_point('pupil-labs-ros2==0.0.0', 'console_scripts', 'eye_gaze')())
[eye_gaze-10]   File "/home/garrison/ros2_ws/install/pupil_labs_ros2/lib/python3.10/site-packages/pupil_labs_ros2/eye_gaze.py", line 157, in main
[eye_gaze-10]     result = gaze_mapper.process_frame(frame, gaze)
[eye_gaze-10]   File "/home/garrison/.local/lib/python3.10/site-packages/pupil_labs/real_time_screen_gaze/gaze_mapper.py", line 64, in process_frame
[eye_gaze-10]     gaze_undistorted = self._camera.undistort_points_on_image_plane([[gaze[0], gaze[1]]])
[eye_gaze-10]   File "/home/garrison/.local/lib/python3.10/site-packages/pupil_labs/real_time_screen_gaze/camera_models.py", line 35, in undistort_points_on_image_plane
[eye_gaze-10]     points = self.unprojectPoints(points, use_distortion=True)
[eye_gaze-10]   File "/home/garrison/.local/lib/python3.10/site-packages/pupil_labs/real_time_screen_gaze/camera_models.py", line 60, in unprojectPoints
[eye_gaze-10]     pts_2d_undist = cv2.undistortPoints(pts_2d, self.K, _D)
[eye_gaze-10] cv2.error: OpenCV(4.10.0) /io/opencv/modules/calib3d/src/undistort.dispatch.cpp:403: error: (-215:Assertion failed) CV_IS_MAT(_cameraMatrix) && _cameraMatrix->rows == 3 && _cameraMatrix->cols == 3 in function 'cvUndistortPointsInternal'

It looks like something’s off with the camera matrix or distortion parameters. I’ve tried resetting the calibration, but the error persists. Could you suggest on how to troubleshoot this?

Thanks!

user-cdcab0 18 April, 2025, 23:30:32

Can you make sure your real-time-screen-gaze package is up to date? If it is, please share your code and I can help troubleshoot
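
For reference, the expected setup looks roughly like this (a sketch following the real-time-screen-gaze README; the marker ID, corner coordinates, and screen size are placeholders):

```python
# Sketch: map gaze onto a screen surface defined by AprilTag markers.
# If GazeMapper is fed a malformed calibration, you get camera-matrix
# errors like the one above, so printing the calibration is a sensible
# first check.
from pupil_labs.realtime_api.simple import discover_one_device
from pupil_labs.real_time_screen_gaze.gaze_mapper import GazeMapper

device = discover_one_device()
calibration = device.get_calibration()
gaze_mapper = GazeMapper(calibration)

marker_verts = {  # AprilTag ID -> its four corner positions on screen, px
    0: [(32, 32), (96, 32), (96, 96), (32, 96)],
}
screen_surface = gaze_mapper.add_surface(marker_verts, (1920, 1080))

frame, gaze = device.receive_matched_scene_video_frame_and_gaze()
result = gaze_mapper.process_frame(frame, gaze)
for surface_gaze in result.mapped_gaze[screen_surface.uid]:
    print(surface_gaze.x, surface_gaze.y)
```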

user-083b4f 18 April, 2025, 23:46:57

also, i have another issue with

Stopping run loop for rtsp://10.76.126.146:8086/?camera=eyes&audioenable=on
Stopping run loop for rtsp://10.76.126.146:8086/?camera=world&audioenable=on

This is with the Real-time API 1.5.0 on Ubuntu 22.04/ROS 2 Iron. I've tried restarting the app and the node, but they don't seem to be working. Can you advise on any additional debug steps you'd recommend?

Thanks for your help!

user-083b4f 19 April, 2025, 00:08:22

Sorry thank you! I restarted the phone and it’s working. Never mind! Thanks

user-cdcab0 19 April, 2025, 00:51:38

Great to hear! Just for the record though, the real-time-screen-gaze package is separate from the realtime-python-api package

user-bda2e6 20 April, 2025, 18:23:08

Maybe from the pupillary measurements?

user-cdcab0 21 April, 2025, 16:09:14

Probably not easily. I can tell you subjectively that, during a blink, the gaze and pupillometry data that Neon produces are mostly noise. It's feasible that one might be able to reliably classify signal versus noise with one or both of those streams to identify blinks, but I don't have any examples or previous work to point you towards. It is feasible for us to send this data to PsychoPy, though - please submit a post in 💡 features-requests 🙂

> in our specific study settings. We are not able to store/use anything from the raw recordings
Maybe you and I have discussed this before, but could you remind me why this is?

user-2b5d07 21 April, 2025, 17:23:51

Hi, does anyone have an idea of how the projection of the gaze x and y coordinates onto the scene video is done? Also, is there any time offset between the recorded gaze coordinates and the scene video? Thanks

user-bda2e6 21 April, 2025, 17:34:30

Thank you, I will submit a request. Just to be clear, what data are you going to send to Psychopy? The eye camera footage? The blink info?

And the reason we can't do that is due to the IRB regulation of this particular study, not due to technical reasons

user-cdcab0 21 April, 2025, 21:37:33

Blink events. Raw eye camera data might be feasible, but I'm not sure. Do you have a use case for this?

> we can't do that due to the IRB regulation
Ah, thanks. Hopefully in future work you can make recordings. IMO, it's much easier to process/analyze these data post-hoc than in real time, especially with PsychoPy

End of April archive