👓 neon



user-51c732 01 October, 2024, 08:53:17

Hello Neon team. Are you working with any sports teams or individuals using the Neon from Pupil Labs for screening? Do you have any ideas on where it could help with screening or performance improvement in different sports?

user-480f4c 01 October, 2024, 08:57:41

Hi @user-51c732 - I understand you're planning to use Neon in sports applications, is that right? Could you please explain a bit more your research needs and setup? I'm not sure I fully understand your question.

user-51c732 01 October, 2024, 09:01:37

Dear Nadia, you are right. For now it is just an idea, and I am thinking of starting to use Neon for these purposes. I would be happy if you could share any experience and ideas you already have.

user-480f4c 01 October, 2024, 09:07:36

@user-51c732 Neon is a great option if you want to collect eye tracking data in sports applications. We have some videos on our website showing Neon when playing basketball/riding a bike etc.

For sports applications, we recommend the "Ready Set Go" frame. This frame has an adjustable headband and fits well under helmets and other headwear. It is suitable for sports and other activities involving fast movements, ensuring the module stays firmly on the subject's head.

If you could share more details about your planned research, that will help me provide better feedback. We can also schedule a Demo call if you prefer and we can show you Neon in action 😉

user-51c732 01 October, 2024, 09:09:20

Thank you very much. It would be great to have a call and go through it.

user-480f4c 01 October, 2024, 09:14:43

Great! Can you send us an email to [email removed] so we can coordinate via email? 🙂

user-51c732 01 October, 2024, 09:35:29

Ok

user-2e0181 02 October, 2024, 23:34:43

Hello, why does Neon stop recording a couple minutes into my experiment?

nmt 03 October, 2024, 04:25:13

Hi @user-2e0181! Please open a ticket in 🛟 troubleshooting and we can coordinate there!

user-97c46e 03 October, 2024, 02:10:41

Hi everyone, we're interested in ordering a frame (a Ready Set Go) as an extra for our recently purchased Neon device. Before we do so, I would like to ask whether there is any possibility we could request some customization of its headband?

user-97c46e 03 October, 2024, 02:14:30

Sorry, I just noticed on the website that headbands are also available. Is it possible to get some images of the head straps (beyond those on the web page; I'm specifically hoping to get a closer look at the earpiece sleeve)? Would it be possible to vary their length if needed? Lastly, is there any room for customization of the strap material (an elastic band instead of a sheathed steel cable)?

nmt 03 October, 2024, 04:27:16

Hey @user-97c46e! Can you please contact [email removed]? Someone there will be able to assist you with these questions 🙂

user-fd453d 03 October, 2024, 15:01:57

Hi! Unfortunately, the phone that came with my Neon kit was stolen. Which device would you recommend buying to use my glasses?

user-fd453d 03 October, 2024, 15:05:57

Is the Moto Edge 50 Fusion an option?

user-d407c1 03 October, 2024, 15:06:35

Hi @user-fd453d , I'm sorry to hear your phone was stolen. You can see the compatible devices at this link: https://docs.pupil-labs.com/neon/hardware/compatible-devices/. Personally, I'd recommend the Motorola Edge 40 in one of its variants, since it's the most powerful device.

user-fd453d 03 October, 2024, 15:07:13

Is the Moto Edge 50 an option?

user-d407c1 03 October, 2024, 15:09:17

We can only guarantee correct operation with the devices listed above, as those are the ones that have been tested.

user-fd453d 03 October, 2024, 15:10:24

OK, thanks.

user-97c46e 03 October, 2024, 15:54:13

Hi folks, we're using the Ready Set Go frame with our Neon device, and within 10 minutes of starting to record, a good bit of heat can be felt on the skin. While I expect electronics to generate heat, it's a lot more than I anticipated, and it's getting quite hot 15 minutes in. Is it expected to get this hot? If so, does anyone have any suggestions for how to deal with wearer discomfort if we're recording over hours?

user-d407c1 04 October, 2024, 11:49:56

Hi @user-97c46e! Which frame are you using? Does it have a heatsink? Our current frames for sale come with an aluminum heatsink.

The heatsink helps dissipate heat from the module to the front of the frames, ensuring the module gets warm but not hot. Typically, the module reaches around 10°C above ambient temperature for most frames, with slightly lower temperatures for ICSCN. These measurements were taken using a FLIR thermal camera and are based on the module's core temperature beneath the silicone.

user-97c46e 04 October, 2024, 15:34:48

Hi @user-d407c1. Thank you. We have the Ready Set Go frame. It has a plastic housing on top, which seems to have a grating kind of construction near a coil on the electronics package (I'm assuming for the purpose of heat dissipation).

Chat image

user-4df54e 07 October, 2024, 05:25:37

I want to know the amplitude of an eye movement in degrees (i.e., as an angle from the center of the glasses, essentially). The simplest way seems to be to get the coordinates of a square in the scene camera exported by the Pupil Neon (which is the info I have) and work out the ratio with the estimated angular dimensions of the same square. Is this reasonable, or is there a better way?

nmt 07 October, 2024, 05:35:55

Hi @user-4df54e 👋. Neon already provides the elevation and azimuth of a gaze ray in degrees. You can use this to compute the amplitude of gaze shifts. More in the docs: https://docs.pupil-labs.com/neon/data-collection/data-format/#gaze-csv. Do you think that would suffice for your needs?
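
For reference, computing a shift amplitude from those azimuth/elevation columns amounts to taking the angle between two gaze unit vectors. Below is a minimal sketch in plain Python; the function name and the exact spherical convention are my assumptions (Neon's gaze.csv gives azimuth and elevation in degrees):

```python
import math

def gaze_shift_amplitude(az1, el1, az2, el2):
    """Angular distance in degrees between two gaze rays,
    each given as (azimuth, elevation) in degrees."""
    a1, e1, a2, e2 = map(math.radians, (az1, el1, az2, el2))
    # Spherical -> Cartesian unit vectors (convention assumed here:
    # azimuth rotates about the vertical axis, elevation above the horizon).
    v1 = (math.cos(e1) * math.sin(a1), math.sin(e1), math.cos(e1) * math.cos(a1))
    v2 = (math.cos(e2) * math.sin(a2), math.sin(e2), math.cos(e2) * math.cos(a2))
    # Clamp the dot product to guard against floating-point drift outside [-1, 1].
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(v1, v2))))
    return math.degrees(math.acos(dot))
```

For small shifts this is close to the Euclidean distance in (azimuth, elevation), but the vector form stays correct at large eccentricities.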

wrp 07 October, 2024, 09:41:54

Pupil Cloud Add-ons Q&A

user-bce0dc 07 October, 2024, 09:48:15

I apologize if this is a silly question, but is there a way/option to use Neon without wifi (on the phone) and upload the recordings locally instead of through Cloud? Thank you!

user-d407c1 07 October, 2024, 09:51:26

Hi @user-bce0dc ! You can transfer them via USB.

user-bef103 08 October, 2024, 12:56:00

What OS version is required to run the Neon glasses?

user-c2d375 08 October, 2024, 12:59:30

Hi @user-bef103 👋 It depends on the model of the Companion device (phone) you're using with Neon. Which model do you have?

user-bef103 08 October, 2024, 12:56:04

of the phone

user-bef103 08 October, 2024, 13:00:44

OnePlus 10 Pro

user-bef103 08 October, 2024, 13:01:01

It's running 14.0 right now.

user-c2d375 08 October, 2024, 13:12:44

Unfortunately, the supported Android versions for the OnePlus 10 Pro are Android 12 and 13, so you'll need to downgrade to a compatible version. Please open a ticket in https://discord.com/channels/285728493612957698/1203979608563650650 and we can coordinate from there!

user-bef103 08 October, 2024, 13:18:00

Okay thanks

user-b6f43d 09 October, 2024, 17:34:23

Hey Pupil Labs, I wrote code that reads gaze data and world-timestamp data from two CSV files, calculates relative time based on the first timestamp in the world-timestamp file, and then converts the relative time in nanoseconds to a minutes:seconds.milliseconds format.

I happened to see that the converted file contains some gaze data beyond the duration of the recording. For example, in my case the recording ends at 17 min 31 s, but the last value in the converted gaze file is 17 min 59 s 339 ms.

What might be the problem?

user-b6f43d 10 October, 2024, 06:01:11

Found what the problem was. Thanks, Pupil Labs!

user-f43a29 10 October, 2024, 08:08:38

Hi @user-b6f43d , I see you found the problem, but just in case:

  • The first timestamp of the world_timestamps.csv is not the start of the recording.
  • It is also not the time at which gaze starts.
  • If you want all timestamps relative to the start of the recording (i.e., when you push the white record button, so relative to "0 seconds"), then you want to subtract the value in start_time of the info.json file from the timestamps in each data stream of interest.

This is because the different sensors (scene camera, eye cameras, gaze, etc.) run in parallel and do not all start at exactly the same time. Also, they must briefly initialize, which takes roughly 1.5 seconds after you start a recording.

Also, they do not all stop at the same time when you end/stop a recording. The eye/gaze sensors might still run for a bit after the scene camera has stopped.
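
A small sketch of that subtraction (the helper names are mine; it assumes `start_time` in info.json and the CSV timestamps are both UTC nanoseconds, which is Neon's native format):

```python
def relative_seconds(timestamps_ns, start_time_ns):
    """Seconds since recording start for each absolute UTC-nanosecond timestamp.

    start_time_ns would come from the `start_time` field of info.json."""
    return [(t - start_time_ns) / 1e9 for t in timestamps_ns]

def as_min_sec_ms(seconds):
    """Render a duration in seconds as MM:SS.mmm (truncating below 1 ms)."""
    total_ms = int(seconds * 1000)
    minutes, rem = divmod(total_ms, 60_000)
    secs, ms = divmod(rem, 1000)
    return f"{minutes:02d}:{secs:02d}.{ms:03d}"
```

Subtracting `start_time` rather than the first row of world_timestamps.csv avoids exactly the mismatch described above, since the scene camera does not define the recording's zero point.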

user-4da0be 10 October, 2024, 08:35:42

Hi everyone,

I’m using the Neon eye-tracking glasses for a study where participants look at their own bodies in a mirror. I need to define AOIs for each participant (e.g., hands, legs, stomach), and I want the software to automatically extract the eye-tracking data from the video of each participant looking at the mirror, based on these AOIs.

I understand that I need to have a reference image, but I’ve also seen mention of a scanning video. Is the scanning video absolutely necessary for this process? I’ve come across studies that don’t seem to use a scanning video, so I’m unsure of its role here.

Could someone provide clear instructions on how to do this correctly? I’ve managed to get it working in a way, but I’m not confident it’s the most systematic approach for research purposes. A detailed, step-by-step guide would be greatly appreciated!

Thanks in advance!

user-f43a29 10 October, 2024, 08:56:53

Hi @user-4da0be 👋 !

  • May I ask if you already had your free 30-minute Onboarding call? We can cover some of the details in real-time that way.
  • If you are using the Reference Image Mapper Enrichment, then a scanning recording is required. The Enrichment process cannot be started without one.
  • The main documentation for that Enrichment is here. There, you will find a step-by-step video and examples of good scanning videos are provided further down, at that same link.

Since you are working with a mirror, that can potentially cause some difficulty for Reference Image Mapper, but it is not a dealbreaker; it depends on your exact setup. May I ask, how large is the mirror and how much context, contrast/color, & variation is there in the surrounding environment?

user-f43a29 10 October, 2024, 09:01:06

@user-4da0be Also, if you do not strictly need the features of Reference Image Mapper, then you may find the Map Gaze Onto Body Parts Alpha Lab to be helpful. It does not require a scanning recording and automatically determines AOIs for body parts.

This can be especially helpful if the participants are moving their arms, for example, while looking in the mirror, but it does require more work to extract fixation metrics.

May I ask what data exactly you are looking to compute for the AOIs? Is it fixation metrics? If so and Reference Image Mapper does not work satisfactorily for you, then my colleague @user-480f4c has pointed out that you can also use the Manual Mapper Enrichment.

user-688acf 10 October, 2024, 12:26:32

hello, is saccade detection (+ export) only available in pupil cloud or also neon player? can't seem to find it in player 😦

user-cdcab0 10 October, 2024, 13:33:28

Hi, @user-688acf - you're in luck! We just published Neon Player v4.5, which includes saccade exports!

user-688acf 10 October, 2024, 13:35:03

lucky me 🙂 great thx!!

user-688acf 11 October, 2024, 07:17:42

(y) thx

user-688acf 11 October, 2024, 09:06:10

awesome! thx

user-a09f5d 11 October, 2024, 16:59:22

I have a question about best practices when using tag-aligner (https://github.com/pupil-labs/tag-aligner). For the first step in tag-aligner it says you have to make a RIM enrichment. For this step you need a scanning video of the scene, a still photograph of the scene, and the actual eye-tracking recording that is of interest. The GitHub README for tag-aligner mentions that at least one of these two recordings needs to include the AprilTag. Currently I am including the AprilTag in the scanning video (since it cannot be included in the actual eye-tracking recording because of the experiment). However, does the still reference image that is used for the RIM step also need to contain the AprilTag? Also, will the RIM enrichment that is used by tag-aligner be less accurate because the scene changes between the scanning and eye-tracking recordings, since the AprilTag is not present in both?

user-f43a29 14 October, 2024, 09:51:28

Hi @user-a09f5d , a Reference Image Mapper Enrichment by default does batch processing of all the recordings in a Project. So, when the instructions say, "This can be the scanning recording, but does not need to be.", it means that it can even be a third or fourth recording, which we could call the "aligning" recording.

This "aligning" recording does not even need to be an eyetracking recording. It just needs the AprilTag present, within the scanned region.

Then, you do not have to worry about any error being introduced into the mapping by the presence/lack of AprilTags in the scanning/eyetracking recordings.

user-613324 11 October, 2024, 20:53:49

Hi Neon team, I noticed there's a new feature in the Neon Companion app after a recent update: Companion Settings --> NeonNet --> Compute eye state. I don't remember seeing the "Compute eye state" option in older versions. Is it related to the real-time eye state feature introduced this April, i.e., is this option only for generating the 3D eye state in real time (like real-time gaze)? I tried with this feature both turned on and off in the app and found no difference in the timeseries data downloaded from Pupil Cloud. Even with the feature turned off, Pupil Cloud still generates 3d_eye_states.csv. Can you give me a use case where we need to turn on this feature? Thanks!

nmt 12 October, 2024, 15:38:06

Hi @user-613324! Yes, that's correct. Enabling the "Compute eye state" option allows you to capture 3D eye state measurements in real time. Even if this option is off, you can still download 3D eye states from Pupil Cloud, as all data streams are reprocessed there from your recording. Potential use cases for having it enabled include streaming eye states via the real-time API, or working locally without Cloud access, such as when exporting recordings directly from your phone to computer.

user-b6f43d 13 October, 2024, 06:03:11

Hey Pupil Labs, one of my recordings shows an error like this; it says it will get fixed. Approximately how long will that take?

Chat image

nmt 14 October, 2024, 02:38:40

Hi @user-b6f43d! Can you try refreshing your browser tab and seeing if it's been processed? If it hasn't, please open a ticket in 🛟 troubleshooting

user-dac41e 14 October, 2024, 12:18:12

Hi Pupil Labs! We are using the Neon glasses indoors in our lab, for a car simulator that we built. We have two problems and one question:

  • At the start of the recording (usually for ~3 s, but in extreme cases 10-15 s), the whole screen is grey and only the red circle (gaze data) is shown. I will include a screenshot of this.
  • The second problem is overheating. After a few recordings, the module in the middle gets really hot, and it affects the performance of the glasses.
  • Question: is there any way to connect the glasses directly to a Windows machine and manage the functions from there (recording, downloading videos directly)?

Thank you for your help!

Chat image

nmt 14 October, 2024, 14:10:07

Hi @user-dac41e! It's normal for the sensors to start at slightly different times, but 15 seconds seems unreasonable. First, ensure you're using the latest version of the Companion App. If not, please update it via the Google Play Store. Then, try clearing the app's storage and cache: disconnect Neon, long-press the Neon app icon, select 'App info', go to 'Storage & cache' and clear both. (This won't delete your recordings, but you'll need to log in again.) Reconnect Neon and make a test recording.

Secondly, could you clarify what you mean by heat affecting the performance of the glasses? Do you receive error messages or notice something else?

In terms of direct tether to a computer, this isn't supported if you want to use Neon's native gaze pipeline, as this runs on the Phone. May I ask why you want to do so?

user-f43a29 15 October, 2024, 08:40:08

blinks.csv timestamps alignment

user-87d763 15 October, 2024, 13:23:22

Hello! I am working on achieving precise time sync for my devices. The documentation was very clear and the code worked beautifully (thank you for all the work put into this!). However, I have a question about the output. The first time I ran the code, the offset estimate was about 37 ms, and we are trying to achieve an offset of <20 ms max. Thus, I restarted my computer and the mobile phone devices and toggled the phones to update their time. After this, the new output was -950 ms (even worse!). Any ideas why this might be occurring and what I can do to achieve an offset of less than 20 ms?

user-f43a29 15 October, 2024, 15:15:23

Hi @user-87d763 , may I first ask what your use case is?

user-87d763 15 October, 2024, 15:17:51

Our aim is to have two Neon glasses recording at exactly the same time. Having precise time is essential for ensuring that both glasses record at exactly the same time.

user-f43a29 15 October, 2024, 15:24:11

Do you need this synchronization during the experiment (real-time) or only during post-hoc analysis?

user-87d763 15 October, 2024, 15:24:44

both would be ideal

user-f43a29 15 October, 2024, 15:36:06

Ok, so, you want to compare data from two different Neons that were recording at the same time? Will other devices be involved and do you need to trigger those other devices based on data from Neon?

user-f43a29 15 October, 2024, 15:53:54

@user-87d763 Ok, I see. And just one last question: do you mean that you want both Neons to collect a gaze datum, for example, at exactly or close to exactly the same timestamp?

user-87d763 15 October, 2024, 15:54:55

Yes, we want both Neons to collect the gaze data at the exact same timestamp.
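
For context on where such offset estimates come from: echo-based clock synchronization is typically computed NTP-style from four timestamps per round trip. The sketch below shows the standard computation only; it is not the Pupil Labs implementation, and the function and variable names are mine:

```python
def clock_offset(t0, t1, t2, t3):
    """NTP-style estimate of the remote clock's offset relative to the local one.

    t0: local send time, t1: remote receive time,
    t2: remote send time, t3: local receive time (all in the same unit).
    Assumes the network delay is roughly symmetric in both directions.
    """
    return ((t1 - t0) + (t2 - t3)) / 2

def mean_offset(samples):
    """Average several round-trip estimates to damp network jitter."""
    return sum(clock_offset(*s) for s in samples) / len(samples)
```

Repeating the exchange many times and preferring rounds with the smallest round-trip time tends to tighten the estimate; a sudden large jump (like the -950 ms above) can simply mean one of the clocks was re-synced between measurements.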

user-e11dd1 15 October, 2024, 18:06:33

Hey everyone, We're encountering an issue with the synchronization of the timestamps between the frames and IMU data. Specifically, we've noticed a delay of approximately 1:30 minutes between the two datasets in some sessions. After performing calibration movements during the session, we tried to synchronize the data by looking at the frame numbers marking the start and end of the movements, retrieving the corresponding timestamps, and then extracting the IMU data within that time frame, but found that the frame timestamps didn't align with the corresponding IMU data. After investigating, we found that the start times of the IMU and the frames were off by about 1:30 minutes. When we shifted the IMU data by this amount the data matched again.

However, this issue has only occurred in certain sessions—while the sessions before and after these had no such issues. Additionally, both sensors were shut down with the usual delay of 1-2 seconds at the end of each session, as expected.

In the plot below, the first panel shows the IMU data with the original timestamps, with the red lines marking the frame timestamps. Furthermore, approximately 1:30 minutes after the frame timestamps, we can also see the expected pattern in the IMU data. In the bottom panel, the IMU timestamps are shifted, and you can see the expected pattern in the IMU data between the frame timestamps.

Has anyone encountered something similar or have ideas on what might be causing this? Any help or suggestions would be greatly appreciated! Thanks in advance!

Chat image

user-f43a29 16 October, 2024, 07:56:00

Hi @user-e11dd1 , to be sure I understand:

  • When you say that you "used the frame numbers ... to retrieve the corresponding timestamps", does "frame numbers" mean the corresponding array indices of the IMU data array?
  • How did you determine the "frame numbers"?
user-5a2e2e 15 October, 2024, 19:18:04

Hi, we use Neon Player to process data after recording. We record each trial separately, so when we go to use Neon Player we have to drag the export file into the interface and then wait for processing to finish. With many trials, manually loading, waiting, then downloading results is time-consuming. Is there any way to load multiple recordings for processing at once, or to process them in succession automatically?

user-f43a29 16 October, 2024, 08:07:14

Hi @user-5a2e2e , I'd first like to let you know about a best practice for Neon:

  • Rather than make many short recordings, one per trial, make one long recording for the whole experiment, and mark the begin & end of separate trials with Events.

When following this method, you will have a smoother experience using Pupil Cloud and Neon Player to analyze your data.

If you have already collected all your data, then you could use the pl-rec-export tool in a command-line loop to "batch process" each of those recordings. If you prefer working in Python, there is also the pl-neon-recording library; then you can write a Python loop to "batch process" the recordings.
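
As an illustration of the batch idea, here is a sketch only: it assumes each recording sits in its own subfolder and that `pl-rec-export` accepts a recording folder as its argument (check `pl-rec-export --help` for the exact arguments your installed version accepts):

```python
import subprocess
from pathlib import Path

def export_commands(root):
    """Build one `pl-rec-export <folder>` command per recording subfolder of root."""
    return [["pl-rec-export", str(p)]
            for p in sorted(Path(root).iterdir()) if p.is_dir()]

if __name__ == "__main__" and Path("recordings").is_dir():
    for cmd in export_commands("recordings"):
        # check=True stops the loop as soon as any single export fails
        subprocess.run(cmd, check=True)
```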

user-fd453d 16 October, 2024, 14:20:49

Hi, I'm clear on the devices recommended by Pupil. Our phone was stolen, and we have a Samsung A55 with Android 14, but it doesn't recognise the glasses. Any chance of using this device? In Chile the OnePlus brand isn't sold, and the Moto 40 isn't on the market. What do you recommend?

user-480f4c 16 October, 2024, 14:40:00

I'm adding some context in English for other members of this channel that might find this helpful:

Martin was asking whether other devices, such as the Samsung A55 or the Moto Edge 50 Fusion or Pro, could be used with Neon.

The only supported devices that have been thoroughly tested are the ones listed in our documentation:

  • OnePlus 8/8T
  • OnePlus 10
  • Moto Edge 40 Pro, Moto Edge + (2023), Moto X40 Pro
user-480f4c 16 October, 2024, 14:31:27

Hi @user-fd453d - I'm very sorry your phone was stolen. Unfortunately, the only compatible devices are the ones recommended on our website (OnePlus 8T, OnePlus 10, and Moto Edge 40 Pro). If you can't find a compatible phone from a local distributor, we can offer you one; please get in touch via [email removed] and the sales team will provide more details.

user-fd453d 16 October, 2024, 14:36:50

Nadia, a question: what about the Motorola Edge 50 Fusion or Edge 50 Pro?

user-480f4c 16 October, 2024, 14:37:52

Unfortunately, no. It would have to be one of those listed in our documentation: https://docs.pupil-labs.com/neon/hardware/compatible-devices/

user-fd453d 16 October, 2024, 15:33:20

Hi Nadia, just to confirm: is this the compatible device? It's a Motorola Edge 40, not the Edge 40 Pro.

Chat image

user-480f4c 17 October, 2024, 07:34:11

Hi @user-fd453d ! As I mentioned in my previous message, the only compatible devices are those we recommend on our website. Therefore, the Moto Edge 40 is not compatible, whereas the Moto Edge 40 Pro is.

user-594678 16 October, 2024, 23:16:56

Hi Pupil team, regarding the recent decision to limit cloud storage: if I download the data from the cloud and remove it, would it be possible to upload the data back to the cloud later? It would be useful to be able to re-upload data after removing it, in case I want to do some post-processing using the cloud system that I didn't anticipate. Thanks for your response in advance!

user-f43a29 17 October, 2024, 07:46:17

Hi @user-594678 , it is not possible to re-upload recordings to Pupil Cloud after you have deleted them.

user-37a2bd 17 October, 2024, 06:20:01

Team I need some urgent help. My device is not getting recognised on the phone

user-f43a29 17 October, 2024, 08:45:07

Hi @user-37a2bd , we have continued communication in the ticket. Hoping to get you up and running again swiftly!

user-37a2bd 17 October, 2024, 06:20:53

I have an open ticket. Can someone please join and help me with it

user-37a2bd 17 October, 2024, 06:21:17

The device is getting recognised as a charging device instead of the glasses.

user-8d2ab5 17 October, 2024, 09:25:33

Hi everybody, I'm trying to transfer recordings in Pupil Cloud from one workspace to another person's workspace. I want them to be displayed just like their own recordings in their workspace; I don't just want to invite the person to my workspace. Is there a way to make that possible? Kind regards, Nicola

user-f43a29 17 October, 2024, 09:26:23

@user-da6260 , I've moved our messages to the 👓 neon channel. We can continue our conversation here.

user-65aa8b 17 October, 2024, 09:25:45

Hi @user-da6260 , it is not possible to transfer recordings between Workspaces. Are you experiencing errors with the Workspace invite process?

user-da6260 17 October, 2024, 09:29:55

Yes. The email of the workspace I want the recordings transferred to doesn't have its own Google account; it's just a forwarding email address. If I click on the link, two things happen: 1. I can't open the link in the account I actually want to transfer to, because it always switches to the one connected to my personal Google account. 2. There's an error saying the link is either expired or has already been used, even though I just created the link.

user-ae2b7f 17 October, 2024, 13:25:05

Hello. I connected a USB hub with an HDMI output to the smartphone, and I can display Neon on the TV, but I cannot display the app in landscape. How can I do this?

user-c2d375 17 October, 2024, 13:48:58

Hi @user-ae2b7f, may I ask you why you would like to stream the Neon companion app via HDMI? If you're looking to monitor what your participant is viewing during the recording, you can use the Neon Monitor app on a device connected to the same network as the Neon Companion device.

user-cdcab0 17 October, 2024, 18:25:34

The solid colors of the walls aren't going to be helpful for RIM at all, so anything you can do there to add more detail / (visual) texture would help... but I'd bet that what you have there is enough if the floor is constantly in view. With things like this though, the most honest answer is often "try it and see"

nmt 17 October, 2024, 19:03:56

I agree with @user-cdcab0. What you have there should be sufficient. But we'd always highly recommend trying it out, especially prior to any experiment.

user-d00286 17 October, 2024, 22:05:56

Hi, is this an appropriate place to ask a question about AOIs and heat maps?

user-480f4c 18 October, 2024, 07:01:06

Hi @user-d00286, yes! Feel free to ask your question here!

user-a09f5d 18 October, 2024, 13:37:10

Thank you @user-cdcab0 and @nmt . I think I might print some visual texture for the walls to be on the safe side. Is there a way of validating the accuracy of a RIM enrichment for the 'try it and see' method? Or is it just a case of generating a heat map of an enriched eye-tracking recording and checking that it looks 'right'?

user-cdcab0 18 October, 2024, 18:01:58

You can validate the mapping with the side-by-side view without generating a heatmap, but this is pretty much the same as what you're thinking.

When RIM processes each frame of a recording, it has to localize the camera and the reference image in order to map gaze. I believe that, in your use case specifically, it's only important for the camera to be localized, and this is indicated by the white dots (see the first video at the top of that linked page).

user-c541c9 18 October, 2024, 14:33:37

I have never updated my Motorola Edge 40 Pro that came with the bundle, although it keeps prompting me to update. I believe the instructions are not to update, as the Companion app is designed to run optimally on the phone OS as it came in the package. I was wondering if the newly announced Companion app requires an OS update, or whether we should keep the OS as it is?

user-f43a29 18 October, 2024, 15:02:00

Hi @user-c541c9 , newer versions of the Neon Companion App do not require you to update the Android version.

Aside from that, if your Moto Edge 40 Pro still has Android 13, then it is okay to update to Android 14. We now support that on Moto Edge 40 Pro.

user-d086cf 18 October, 2024, 20:19:04

Hey guys. Has anyone noticed any network interference when streaming from Neon via the Unity package or through the web neon.local:8080 page? I have an image-generator program running for our simulator, and I lose connection to the Neon when it's running. I checked with Wireshark and they are using completely different ports for communication, and there's a fair bit of traffic (60 pps for the image generator, and whatever I set the streaming to for the Neon), but I feel like that isn't the issue. Any ideas?

user-f43a29 18 October, 2024, 23:07:33

Hi @user-d086cf , may I first ask some questions:

  • What kind of network are you using to connect to Neon? A work/university WiFi or a local, dedicated WiFi router/Ethernet?
  • What is the image generator program? Is it running in parallel on the Neon Companion phone or on a separate computer?
user-162faa 20 October, 2024, 18:56:45

Hello! I have a very simple question; Is there a way to toggle off the fixation lines? I only want a video with the numbered Fixation points, no lines connecting them. Thanks!

user-d407c1 21 October, 2024, 06:26:13

Hi @user-162faa 👋 ! Currently, there isn't an option to disable just the "scanpath" line in the fixation visualisation in Pupil Cloud. The fixation visualisation is either fully enabled or disabled.

As you mentioned, you can suggest this feature in the 💡 features-requests channel. But, in the meantime, you can try:

A) Using Neon Player with the fixation plugin. Please note that the data in the Cloud is recomputed to ensure 200Hz, so you may find slight variations.

B) Alternatively, you could build your own visualisation, for example leveraging the scanpath.json endpoint of the Pupil Cloud API; see more at https://discord.com/channels/285728493612957698/446977689690177536/1281276487827460157

If you add the feature request, could you share more about your use case? It would help us understand why you prefer no scanpath. Wouldn't having multiple fixation points without a connection cause confusion?

user-162faa 20 October, 2024, 18:59:05

(apologies if this should go in feature requests plz let me know)

user-d6285a 21 October, 2024, 01:11:14

Hello @user-f43a29 and others at Pupil Labs. We have Vuzix Z100 smart glasses that we're using for an AR study at the MIT Media Lab. We'd like to integrate the Bare Metal Neon (https://pupil-labs.com/products/neon/shop#bare-metal) with it - do you think that's possible?

wrp 21 October, 2024, 06:00:55

Hi @user-d6285a we have not tried integrating the Neon Module into Vuzix Z100. The first thing to check is if there is sufficient space to physically integrate the module. If you have a rough model of the Vuzix Z100 or want to physically prototype and do not yet have Neon at your lab you can try prototyping with the files in this repo: https://github.com/pupil-labs/neon-geometry

user-162faa 21 October, 2024, 12:54:18

Thank you! I did try the Reference Image Mapper last year, but I look at a lot of paintings in one visit, and I found that taking a scan video each time interfered too much with my process. The video is good enough to illustrate for my purposes at the moment, especially since the points are transformed further in my plugin for C4D 🙂 Thank you for the recommendation though!

user-37a2bd 23 October, 2024, 06:25:23

Hi Team, if we have audio recording enabled, is it normal for the audio not to be audible during playback on the phone itself?

user-f43a29 23 October, 2024, 08:13:01

Hi @user-37a2bd , if the sound is from nearby the wearer and not too quiet, then it is generally audible during playback, provided the speaker volume of the phone is appropriate. For instance, if I wear Neon and talk at my normal volume, then I can clearly hear myself during playback. I can also hear sounds in phone playback from several meters away, such as tapping a pen against a table.

How far away is your sound source and what is the sound source?

user-37a2bd 23 October, 2024, 17:21:43

Is the microphone on the glasses or does it use the mic from the recording device?

user-f43a29 23 October, 2024, 21:40:38

Neon Microphones & Audio playback

user-b6f43d 24 October, 2024, 06:28:33

Hey Pupil Labs, one of my recordings is showing like this. How long should I wait for it to be processed?

Chat image

user-480f4c 24 October, 2024, 06:44:16

Hi @user-b6f43d - could you open a ticket in our 🛟 troubleshooting channel, sharing the ID of this recording? We can assist you there in a private chat 🙂

user-d5a41b 24 October, 2024, 14:44:01

Hi! When I download gaze data from Neon Player, the csv file contains timestamps and gaze coordinates. How do I get the corresponding frame numbers?

user-cdcab0 25 October, 2024, 05:19:23

Hi, @user-d5a41b - if you have the World Video or Eye Video Exporter plugins enabled, they will each save a timestamps.csv file that contains the timestamp for each frame in their respective videos. You can cross-reference these timestamps with your gaze data to determine the frame index.
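
A minimal sketch of that cross-referencing, assuming the timestamps have been loaded into NumPy arrays from the exported CSVs (the function name and demo values here are illustrative, not part of the Neon Player export):

```python
import numpy as np

def frame_index_for_gaze(frame_ts, gaze_ts):
    """For each gaze timestamp, return the index of the last video frame
    whose timestamp is <= the gaze timestamp, i.e. the frame that was
    current when the gaze sample was taken."""
    idx = np.searchsorted(frame_ts, gaze_ts, side="right") - 1
    # Gaze samples recorded before the first frame map to frame 0.
    return np.clip(idx, 0, len(frame_ts) - 1)

# Tiny demo with made-up timestamps (seconds); in practice these would be
# read from the timestamps.csv files and the gaze CSV.
frame_ts = np.array([0.000, 0.033, 0.066, 0.100])
gaze_ts = np.array([0.005, 0.034, 0.099, 0.150])
print(frame_index_for_gaze(frame_ts, gaze_ts))  # -> [0 1 2 3]
```

Because both exports share the same clock, a sorted search like this is enough; no interpolation is needed to assign frame indices.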

user-3e88a5 25 October, 2024, 09:59:54

Hello, I'm having trouble with the Pupil Neon. I'm connecting it to the Companion device, and I can see a white LED turn on on the glasses, but the Companion app doesn't recognise the device and still says "Plug in and go". What does that mean?

user-480f4c 25 October, 2024, 10:03:56

Hi @user-3e88a5! Sorry to hear you're experiencing issues with your Neon. Could you please open a ticket in our 🛟 troubleshooting channel and we will assist you there in a private chat?

user-c1bd23 25 October, 2024, 17:52:59

Have a question re: lighting. I've done a scene recording for manual mapping (it's too complex and dynamic for RIM), but unfortunately the lighting was very poor due to the nature of the environment, so it's difficult to make out specific features for MM to map accurately.

user-d407c1 28 October, 2024, 07:34:51

Hi @user-c1bd23 👋! Unfortunately, we currently don't have any post-processing video editing tools in Cloud that would let you adjust exposure or contrast. If this is something you would like to see, you can create a feature request in the 💡 features-requests channel; in this case, feel free to upvote this existing request and describe the specific functionality you'd like and how it would enhance your workflow. Meanwhile, you can set exposure at recording time; check out how here.

Also, could you share a bit more about where you’re encountering issues with manual mapping, is it that you can't see where the fixation lands? Understanding these challenges will help us support you better and may also strengthen the feature request.

Regarding the internal error message that you are encountering, would you mind creating a ticket at 🛟 troubleshooting and sharing the enrichment and recording ID?

user-c1bd23 25 October, 2024, 17:53:25

Is there any way to boost the brightness within the mapper video?

user-c1bd23 26 October, 2024, 16:57:04

I'm also getting an "internal server error" + "fixation not found" when trying to perform manual mapping on a couple of videos, they have approx 12000 and 9000 fixations. Another video with about 2000 fixations isn't giving this error.

user-45c05f 28 October, 2024, 11:06:48

One quick question. I am attempting to connect to the IP address and stream the view… but unfortunately the streaming does not work regardless of which browser or computer I use. Is there anything else to try? Or a way to reset the IP address or URL?

user-57b70f 28 October, 2024, 11:07:00

Hi @user-3bcaa4 , which eyetracker do you have? It sounds like you might have Pupil Invisible or Neon?

user-e5faf1 28 October, 2024, 11:07:08

Neon

user-f43a29 28 October, 2024, 11:07:55

Hi @user-3bcaa4 I've moved our discussion to the 👓 neon channel. - Are your computer and Neon connected to the same WiFi network? - Is it a university or work WiFi connection?

user-3bcaa4 28 October, 2024, 11:09:10

Yes, they are connected to the same network, and it's a work connection. Usually with a mobile hotspot, so that could also be impacting the streaming, I'm sure. It did work previously using a hotspot for a brief time, though.

user-f43a29 28 October, 2024, 11:20:08

@user-3bcaa4 thanks for the clarification.

  • Work & university WiFi networks can be problematic, as they might either outright block the ports needed for streaming, or are congested with traffic from other users, which then interferes with streaming
  • You should have better success with a mobile hotspot, as you noticed. Did you use your personal cellphone or a laptop as the mobile hotspot? (You can also use a local, dedicated WiFi router for streaming. It does not need to be connected to the Internet.)
  • Are you on the latest version of the Neon Companion App, v2.8.31-prod? If not, we recommend updating and trying again.
  • Also, note that you will have the best success using Neon Monitor via Google Chrome.

Let us know how it works out.

user-2dfdbc 28 October, 2024, 13:01:59

Good day, a quick question: is it possible to get the eye tracking videos (as in the pupil eye videos) off https://cloud.pupil-labs.com/?

user-d407c1 28 October, 2024, 13:10:36

Hi @user-2dfdbc 👋! Are you referring to eye camera videos? Currently, Pupil Cloud doesn’t display eye images. If this is something you’d like, please consider upvoting this request: https://discord.com/channels/285728493612957698/1247514112565575732 and sharing any details on how it would benefit your workflow.

In the meantime, you can download the Native Recording Format by right-clicking on a recording. This format includes eye camera videos, which you can view alongside the world camera footage in Neon Player. For more on this, check out the documentation here

user-5a2e2e 29 October, 2024, 20:28:30

Hi, we have been using Neon to record for a few months now with no issues. The last couple of times that we tried, the glasses were not recognized by the Companion device. The light on the front of the glasses turns on while plugged in, but the app doesn't show that it's connected and we can't record. We are using the supplied USB-C cable. Is there something else we can try or check?

nmt 29 October, 2024, 20:30:55

Hi @user-5a2e2e 👋. Please open a ticket in 🛟 troubleshooting and someone will help run you through some debugging steps.

user-c1bd23 30 October, 2024, 09:02:47

Hi all. I'm having issues logging in to Pupil Cloud

user-c1bd23 30 October, 2024, 09:03:18

Every time I try to sign in this morning using either the Google login or the direct login, it seems to initially work but I then get redirected to the login screen

user-c1bd23 30 October, 2024, 09:03:42

I've cleared cache, tried a different browser, and restarted my PC. Not the first time this has happened either, seems to happen every few days

user-c1bd23 30 October, 2024, 09:08:43

I've just tried a different computer as well, same issue

user-480f4c 30 October, 2024, 09:15:30

Hi @user-c1bd23 - thanks for reaching out. We're currently working on fixing this issue! Thanks for your understanding! 🙏🏽

user-c1bd23 30 October, 2024, 09:16:06

Thanks Nadia. I just managed to log in but my recordings aren't loading. All was working fine yesterday

user-d2d759 30 October, 2024, 09:53:22

Hi there, I'm encountering an issue where some recordings that I uploaded to Pupil Cloud are still showing as processing (purple loading circle) after a whole day; they usually process very quickly on Pupil Cloud. I was wondering if there's any way to fix this issue?

nmt 30 October, 2024, 10:51:17

Hi @user-c1bd23 and @user-d2d759 👋. Thanks for your report. We're aware of the issue and are working to resolve it. We'll post here once everything is sorted!

user-c1bd23 30 October, 2024, 10:51:43

Hi Miguel, my issues are sorted and I'm now able to map my recordings. Thank you to your team!

user-e11dd1 30 October, 2024, 11:07:09

Hi, can the yaw measurement from the IMU be interpreted as cardinal directions? For example, does 0 degrees correspond to north, and so on? I was a bit confused because I initially assumed this was the case, but I’ve seen some discussions on Discord referencing quaternions.

nmt 30 October, 2024, 12:05:08

The short answer is yes. But since you asked about quaternions, I will provide a more detailed overview.

The IMU does indeed output quaternions. These describe how the IMU is rotated relative to the world, i.e. magnetic north and gravity.

We convert quaternions to Euler angles (pitch, roll, and yaw) as they are somewhat easier to understand. A yaw angle of 0° indicates alignment with magnetic north.

Be sure to read this section of the docs, as it contains some important considerations you should know about when working with Neon's IMU, such as calibrating the IMU to obtain accurate yaw readings.
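
As an aside, the yaw angle can be recovered from a quaternion with the standard aerospace yaw-pitch-roll (z-y-x) decomposition. A minimal, dependency-free sketch follows; the (w, x, y, z) ordering and the sample quaternion are illustrative assumptions, so check Neon's IMU documentation for the actual conventions of its data stream:

```python
import math

def yaw_from_quaternion(w, x, y, z):
    """Yaw (rotation about the vertical axis) in degrees from a unit
    quaternion, using the standard z-y-x (yaw-pitch-roll) decomposition."""
    return math.degrees(math.atan2(2.0 * (w * z + x * y),
                                   1.0 - 2.0 * (y * y + z * z)))

# Example: a pure 90-degree rotation about the vertical axis.
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))  # (w, x, y, z)
print(round(yaw_from_quaternion(*q)))  # -> 90
```

With the 0° = magnetic north convention described above, a yaw of 90° would correspond to facing east, assuming the IMU has been calibrated.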

user-c1bd23 31 October, 2024, 11:43:57

Hi PL team. Same problem again as yesterday. Everything was working fine until about 20 minutes ago, then the fixations stopped appearing in Manual Mapper and I got logged out. Now trying to log back in I get redirected just like yesterday. I managed to log back in successfully once and do some more mapping, but it's logged me out again and I'm unable to sign in

user-c1bd23 31 October, 2024, 11:45:41

Was the underlying cause of this found? It keeps happening whilst I'm working on mapping, and I'm currently trying to meet a project deadline.

nmt 31 October, 2024, 11:52:16

Hi @user-c1bd23! Thanks for contacting us. Let me liaise with the Cloud team and I will report back to you

End of October archive