👓 neon



nmt 01 February, 2024, 01:36:24

Hi @user-e3da49! 'Raw data export' is now called 'Native Recording Data'. This is what can be loaded in Neon Player.

Re. the recording affected by a low battery, please try long-pressing on the Neon Companion App Icon, select 'App info', and 'Force stop'. Then open the app as normal. Does it prompt you to save a recording from a previous session?

user-e3da49 01 February, 2024, 07:33:06

Hi Neil, thank you so much! That worked! 🙂

user-ccf2f6 01 February, 2024, 04:59:34

okay, is this in the pipeline? We're already using the other information in our remote control applications and it's very helpful, but the ability to set recording parameters with the API would allow us to not worry about accidental toggles on the companion app during an experiment procedure.

nmt 01 February, 2024, 05:27:18

You can log this as a feature request. Further instructions in feedback 🙂

user-e3da49 01 February, 2024, 11:15:05

Hi Neil, are you sure that 'Raw data export' is now called 'Native Recording Data'? It takes me hours to download the zip - not like before the update. Also, last time I checked, the files in 'Raw data export' were equal to "Time Series Data + Scene Video" - but I am not quite sure anymore. Moreover, I have the phenomenon that I can't unzip the "Time Series Data + Scene Video" zip. It can't be an issue with my converter, since it opens the Face Mapper zip... thank you so much for your help!!

user-d407c1 01 February, 2024, 11:26:39

Hi @user-e3da49 ! That is correct! Apologies for the confusion. The "Raw data export" = "Time Series Data + Scene Video" is the CSV files with all the data and the videos.

The "Raw sensor data" = "Native Recording Format"is the data as it comes from the phone in binary format.

Hmm... downloads are now faster since the last update. Could it be that you have more videos in the project, or a worse internet connection? Could you do this test to ensure you have a proper download speed?

Regarding the zip you can't unzip: what browser are you using? I mention it because of this.

user-e3da49 01 February, 2024, 15:06:11

Hi Miguel, thanks for the clarification 🙂 I downloaded the zip files via Google Chrome and it's just the "Time Series Data + Scene Video" zip I can't open...

user-068eb9 01 February, 2024, 13:32:52

Hello, is there a limit on the number of collaborators that can access a workspace on Pupil Cloud? I was only able to add 3 so far. The rest receive the invitation email, but the link never works.

user-d407c1 01 February, 2024, 13:44:10

Hi @user-068eb9 ! There is no limit on the number of collaborators. Could you kindly check a) that the email they use to log in to Cloud matches the invited one, and b) that they are using the latest invitation link (especially if you clicked resend)?

Some email providers, like Gmail, group emails together, and the invitations might have been collapsed, leaving an old invitation on top.

If after checking these points, you still have issues, please reach us here or by email with the workspace id and email accounts affected and we will look into it.

user-e3da49 01 February, 2024, 15:07:29

Also, a recording of mine suddenly stopped after 28 minutes without any reason. The front camera light was on. What do you think happened and how can I prevent this next time (I want to record for longer than 2h straight)? Also, the uploaded 28 min recording in Pupil Cloud says "gaze pipeline failed" and that I should inform you guys. It would be great if you could restore the whole recording of over 2 hours 😦 Thank you so much in advance!

user-d407c1 01 February, 2024, 15:39:14

Hi @user-e3da49 ! Could you follow up by email at info@pupil-labs.com with the recording ID? You can obtain that by right clicking over the recording in Cloud.

user-f4e4e0 02 February, 2024, 02:03:12

Hi there. I am trying to use the Reference Image Mapper with an image of a simulator screen setup and then a recording from our Neon eyetracking glasses on the cloud. The enrichment is still in processing after an hour, is this normal?

user-d407c1 02 February, 2024, 12:29:08

Hi @user-f4e4e0 ! The time it takes to compute a Reference Image Mapper enrichment depends on the number and duration of the recordings in the enrichment, as well as the current workload in Cloud.

If it is taking too long, it might be that your browser put your tab to "sleep" and the state never updated. Can you do a hard refresh and confirm whether it is actually stuck?

user-4724d0 02 February, 2024, 12:14:31

Hi! I am trying to use enrichments, but my video is longer than 3 minutes. I put in events in order to use the enrichment function on segments that are less than 3 minutes, but I still cannot find a way to upload the video... is there a way to cut up the video in the cloud in order to only submit the segments for enrichment?

user-d407c1 02 February, 2024, 12:31:17

Hi @user-4724d0 ! Unfortunately, you can not use subsections of a recording as a scanning recording.

The rest of the recordings can be as long as you wish but the scanning recording has to be below 3 minutes.

If you would like to use subsections, feel free to suggest it in feedback

user-23cf38 02 February, 2024, 12:57:24

We are currently experiencing recording errors due to what we suspect to be a loose contact at the USB-C plug of the Neon glasses. The glasses repeatedly disconnect from the phone after about 1 min of recording, even if the cable is not moved. Could you give me advice on whom to approach with this issue? Is there an address to ship the glasses to for repair?

user-d407c1 02 February, 2024, 13:00:08

Hi @user-23cf38 ! Can you follow up by email to info@pupil-labs.com with your order ID and Neon glasses serial number? From there we will be able to organise a repair if needed.

user-ccf2f6 02 February, 2024, 23:58:54

Hi, we've encountered this issue multiple times with our Neon devices where, during a recording, the sensor light turns red and the phone starts vibrating. The recording stops and the app throws an error. We cannot perform any further actions in the app until we restart it, and the recording seems to be corrupted. Can you please help us understand what could be the cause of this issue and how we can avoid running into it?

Chat image Chat image

nmt 03 February, 2024, 01:20:06

Hey @user-ccf2f6 👋. Do you already have affected recordings in Cloud? Please send an email to [email removed] with an ID. You can right-click on a recording in Cloud, select 'View recording information'. There you will find the ID. Someone will assist you from there!

user-231fb9 03 February, 2024, 13:08:13

Hi, is there a possibility to adjust the wideness of the scene camera? Right now the camera has a super wide angle, I would like to zoom in on the environment for my study with people.

nmt 03 February, 2024, 23:36:53

Hi @user-231fb9 👋. Adjusting the wideness, or zooming, isn't possible with Neon's scene camera. What does your experiment/research setup look like? Perhaps we can suggest some alternative solutions!

user-f4e4e0 05 February, 2024, 03:27:13

Hi there, is there a way we can check an error that occurred during one of our recordings if we provide a recording ID?

user-d407c1 05 February, 2024, 07:46:09

Hi @user-f4e4e0 ! Yes! Please share with us at info@pupil-labs.com your recording ID and a description of the issue and we will look into this.

user-29f76a 05 February, 2024, 11:19:20

Hi, Pupil Labs team. I want to buy a Lens Kit for participants who need vision correction, but the explanation is not clear on whether I need to buy a specific frame for this. Currently, I only have the default Neon device. Could you give me a hint?

Chat image Chat image

user-d407c1 05 February, 2024, 11:27:11

Hi @user-29f76a ! 'I can see clearly now' is a frame that supports prescription lenses. The lenses can be changed with ease since they are attached with magnets.

This frame by default comes with a set of lenses from -3 dpt to +3 dpt, as described there. On top of that frame, we offer this extension lens kit with additional lenses.

The "default" Just Act Natural frame does not accept this set of lenses. I hope this helps.

user-29f76a 05 February, 2024, 11:41:46

So the "I can see clearly now - extended range lens kit" cannot fit with the standard frame?

user-d407c1 05 February, 2024, 11:45:46

They won't fit, that's correct. The "I can see clearly now" frame and these lenses come with embedded magnets that connect them together.

user-0055a7 05 February, 2024, 14:40:39

Hi! First time attempting to use Neon and the connection between phone and glasses does not seem to work, i.e. the phone screen still displays the "plug in and go" screen. We think this might have to do with the Neon not being connected to the same network as the phone, since it worked earlier before we brought the glasses into a new environment (new network, new workspace login). But it is an uneducated guess at best; any help would be appreciated!

user-d407c1 05 February, 2024, 15:19:34

Hi @user-0055a7 ! I noticed there might be some confusion regarding the connectivity of the Neon module. Neon is not wireless and requires a direct connection to the Companion Device (phone) via a USB Type-C cable.

Could you please confirm if you have it connected this way?

Additionally, did you know you're eligible for an onboarding call? It could be very helpful in addressing any questions or concerns you may have. I invite you to reach out directly to info@pupil-labs.com to schedule your session.

user-0055a7 05 February, 2024, 15:21:28

Thanks for the reply, sorry for the confusion. We do indeed have the Neon connected to the phone via USB-C cable.

The onboarding call sounds very helpful, will look into it if we cannot resolve the issue

user-d407c1 05 February, 2024, 15:30:07

Thanks for following up, @user-0055a7! So you have the glasses connected but they are not detected, is that right? Could you please write to info@pupil-labs.com indicating the app version and attaching a picture of the back of your Neon module (something where the QR code is clearly visible)?

user-275c4d 05 February, 2024, 20:13:12

Hi pupil-labs team, I am having an issue with the scene camera of my device that I can't figure out. Everything is working fine when I make a recording using the companion app. However, when I try to stream data (even just using neon.local:8080) the scene video does not stream. After attempting to stream, the scene camera stops working. So clicking record on the companion app leads to recordings without scene videos, and pressing the video camera icon on the bottom right does not display any video. If I unplug the device for a few minutes, scene video works as usual, but attempting to stream leads to the same problem again.

user-d407c1 06 February, 2024, 11:44:50

Hi @user-275c4d ! This sounds like a network issue. Does the same occur if you connect using the IP shown in the Stream menu? Also, have you seen these network requirements?

user-275c4d 05 February, 2024, 20:14:50

Has anyone encountered this issue before?

user-9857ce 06 February, 2024, 03:00:41

Hi, this is Younghoo from Seoul National University. We're interested in using Neon device for our research, but just want to ask if there are any software or applications that we need to purchase additionally to use the device. Thank you.

wrp 06 February, 2024, 07:58:56

Hi @user-9857ce 👋 There are no additional costs for software. If you'd like a quote feel free to fill out a request via our website (if you haven't already 😉 )

user-9857ce 06 February, 2024, 11:36:19

Thank you @wrp for your reply

user-ffb983 06 February, 2024, 13:20:13

Hello, I am a prospective customer who would be using the Neon model in a clinical setting, mainly working with patients who have suffered concussion and traumatic brain injury. I am very familiar with several VNG setups, and am curious if the Neon could be used in a similar way. Could you please provide me with a sample sheet that displays objective gaze mapping and possibly saccadic information?

user-d407c1 06 February, 2024, 14:26:26

Hi @user-ffb983 ! thanks for sharing your use case. Neon will provide you with gaze data and eye videos at 200Hz. The gaze output looks like this.

Additionally, the gaze (x, y) position can be obtained in real time. It is output in scene camera coordinates, although you don't need the scene camera content.

In general, it would work for detecting eye movements in a videonystagmography environment.

If you would like a videocall demo to show you Neon and discuss anything, feel free to inquire about it at info@pupil-labs.com

user-057596 06 February, 2024, 16:29:20

Is it only possible to use the AOI tools/Editor in conjunction with reference image or marker mapper enrichments? The old SMI analysis software allowed you to select an AOI post-recording without requiring any form of mapper system.

user-d407c1 06 February, 2024, 20:47:58

Hi @user-057596 ! We are working on some sort of manual mapping tool. For now, unfortunately, you need the reference image mapper or the marker mapper tool.

user-057596 07 February, 2024, 09:43:45

Thanks @user-d407c1, a manual mapping tool would be very welcome. Also, is it possible with the present AOI tool/editor to select a particular AOI for each single recording in a project folder, rather than the same one being applied to all the recordings in the project?

user-d407c1 07 February, 2024, 09:51:30

All Areas of Interest (AOIs) defined in the enrichment process are applied to all recordings that have been enriched. 😅

But... within the AOI heatmap tool, you have the flexibility to customize visualizations by selectively toggling AOIs and choosing specific recordings to include. Is that sufficient?

user-057596 07 February, 2024, 09:52:45

That's really helpful 😁

user-cad8c8 07 February, 2024, 16:18:17

Hello, I'm new here! Does code written using the API for Pupil Core work in Neon?

user-09f634 07 February, 2024, 16:24:21

Hey Pupil Labs peeps, where can I see the gaze offset values for a wearer in the companion app? I would like to manually offset the gaze in Neon Player, but I'm wondering: rather than adjusting it based on playback, could I just find the values for the wearer and use those?

user-09f634 07 February, 2024, 16:26:42

I have also found that neon player crashes very often while processing the exported files from the phone. Console gives this message:

Chat image

user-d407c1 07 February, 2024, 17:15:44

Hey @user-cad8c8 ! The realtime APIs for Core and Neon are completely different; they use different protocols as well. You can check them out here and here. They are not compatible with each other. Pupil Invisible and Neon are based on the same realtime API.

user-cad8c8 07 February, 2024, 17:22:43

Thank you!

user-d407c1 07 February, 2024, 17:22:01

Hi @user-09f634 ! The values are not shown directly in the app, but they are stored in the info.json file when the recording is made. Is that sufficient? I will pass on the feedback regarding the crashes. Are you using the latest version, 4.1?

user-09f634 07 February, 2024, 17:50:07

Yes, this works for me, thank you! Regarding the crashes, I am on v4.1. Restoring default settings seems to help with the crashing sometimes.

user-231fb9 07 February, 2024, 19:46:09

I am initiating an experiment to investigate the disparities in viewing behavior between real-life interactions and screen-mediated experiences. In this study, I will be testing individuals with distinctive features such as a prominent birthmark or other stigmatizing aspects. During the experiment, the test participant and research fellow/screen will be positioned facing each other, and the use of a wide lens on the camera may pose challenges. While not insurmountable, it may result in slightly less sharp images.

nmt 07 February, 2024, 23:12:47

Thanks for the overview! The lens on Neon is not configurable; you will always get the wide-angle view. Have you already done some pilot testing/are there definite features that are not visible on the Neon's scene video?

user-465a3e 07 February, 2024, 20:47:09

Hi Pupil team. I am working on a project involving connecting and processing multiple Neon glasses. I have a list of static IPs of the devices and use a loop to go through each one and find it using Device(address, port). However, if the device with the passed IP is not currently connected to the network, I notice a delay of about a minute where the terminal hangs before proceeding to the next device. If the device can be found, there is no delay. Is there a way I can get around this and/or set the timeout for finding a device by IP that is not connected? Thank you!

user-d407c1 08 February, 2024, 07:18:24

Hi @user-465a3e ! Are you using the simple or the async API? Using the async API won't block the thread and is probably more suitable for your use case.
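For illustration, here is a minimal sketch of that async approach (not an official example). It assumes the async Device(address, port) constructor and the get_status() coroutine from pupil_labs.realtime_api; asyncio.wait_for caps how long an unreachable address can block, and asyncio.gather checks all devices concurrently:

```python
# A minimal sketch, not an official example. Assumes the async
# Device(address, port) constructor and get_status() coroutine from
# pupil_labs.realtime_api; check your installed version's docs.
import asyncio

from pupil_labs.realtime_api import Device

ADDRESSES = ["192.168.1.21", "192.168.1.22", "192.168.1.23"]  # your static IPs
PORT = 8080
TIMEOUT_S = 3  # give up on an unreachable address after 3 seconds


async def check_device(address):
    try:
        async with Device(address, PORT) as device:
            # get_status() only succeeds if the Companion app is reachable
            status = await asyncio.wait_for(device.get_status(), timeout=TIMEOUT_S)
            return address, status.phone.device_name
    except (asyncio.TimeoutError, OSError):
        return address, None


async def main():
    # Check all addresses concurrently instead of one after another
    results = await asyncio.gather(*(check_device(a) for a in ADDRESSES))
    for address, name in results:
        print(address, "->", name if name else "not reachable")


asyncio.run(main())
```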

user-f4e4e0 08 February, 2024, 05:01:03

Hi there, is there a way to edit and chop up footage from the recordings? We are interested in using your enrichment feature to analyse areas of interest, but we won't be able to stop and start recordings while our participants are in mid-drive.

user-d407c1 08 February, 2024, 07:14:41

Hi @user-f4e4e0 ! Yes! that is what events are for. You can use them to select sections to enrich or sections to export in the video renderer.

user-f4e4e0 08 February, 2024, 23:12:13

Thank you

user-cdcab0 08 February, 2024, 11:04:28

Hi, @user-09f634 - I think I see the bug that causes your crash. It's Windows-specific, but simple to patch. Thanks for the report and sorry for the trouble. The fix is included in the v4.1.1 release

user-09f634 08 February, 2024, 17:22:19

thank you (: I will try the new release

user-1391e7 08 February, 2024, 13:17:12

Can I update the OnePlus 10 Pro (currently Android OS 12), or would that break something?

user-d407c1 08 February, 2024, 13:44:27

Hi @user-1391e7 ! Yes! The Neon Companion app does support Android 13. You will need to grant an additional storage permission that is new to Android 13, but no worries, you will be prompted.

user-1391e7 08 February, 2024, 13:17:25

(the companion device for our pupil neon glasses)

user-1391e7 08 February, 2024, 13:18:21

Android is annoyingly persistent in reminding me that there is one available. I'm slightly worried someone hits that update button during an ongoing study.

user-29f76a 08 February, 2024, 14:33:06

Okay I see. So does it mean that if I buy "I can see clearly now", I need to take off the bare metal from my "Just act natural" and then put it on my new frame "I can see clearly now"? @user-d407c1

user-d407c1 08 February, 2024, 14:39:59

No, the nest (PCB) is on all frames, you just need to swap the module from one frame to the other, as shown here.

user-29f76a 08 February, 2024, 15:08:35

Okay, just need to double check before buying it:

  • the "I can see clearly now" is the frame + the -3 to +3 lens kit (the cable to connect the phone companion and the bare metal is included)
  • the phone companion is not included
  • I can swap the module from one frame to the other

Am I correct?

user-d407c1 08 February, 2024, 15:13:12

That is totally correct. On our shop you have the bundles on top, which include:

  • the module
  • the companion device (phone)
  • the software, as you have seen
  • an onboarding call, and
  • the frame of your choice (always with the PCB/nest).

And at the bottom, under accessories, we have the frames alone with that PCB: no module, phone or anything else. The "I can see clearly now" frame comes with prescription lenses from -3 dpt to +3 dpt; you can order this frame here.

user-29f76a 08 February, 2024, 15:27:55

Thanks

user-7413e1 08 February, 2024, 15:33:43

Hi - I am trying to use the marker mapper enrichment in an experiment where two people interact face to face, each of them wearing one pair of Neon glasses. They discuss an object appearing in front of them (on a screen or on a piece of paper), which I have tracked with the markers (four QR codes, one in each corner). The issue is that the markers seem to be reliably detected only when appearing in front of the person; as soon as they are slightly to the side, the enrichment fails. I have tried many different things (e.g. printing the QR codes, putting them on the image shown directly on the screen, changing the size and the luminosity of the image, varying the distance between the wearer and the screen, etc.), but none of these attempts resulted in the enrichment detecting all four QR codes reliably (yes, I made sure each QR code has quite a large white border around it). Is this something that is expected? Should the marker appear exactly perpendicular to the wearer's gaze to be detected? Is there anything else I could do to make sure they are detected by multiple people looking at the target from slightly different angles (e.g. in a two-person conversation)? Thanks!

user-d407c1 08 February, 2024, 15:45:03

Hi @user-7413e1 ! Could you post a picture of the markers not being detected? That would help us understand your environment. Tilted QR codes can be trickier to detect, but it depends on how much they are tilted.

user-7413e1 08 February, 2024, 16:29:47

yes - here is one (from person A's view) when I tried to use a printed version (not preferred), and also one (from person B's view) when it was shown directly on the screen

Chat image Chat image

nmt 09 February, 2024, 01:47:37

Thanks for sharing the image. It's very helpful! Honestly, this isn't a particularly challenging perspective, and I think you'll be able to get the markers detected! Here are two things I'd suggest you try:

1. Printed markers - they appear quite small in the scene camera image. I would suggest doubling their printed size. I expect they'd then be detected, even at the viewing angle shown in your image.
2. Digital images on screen - the markers here seem to be a decent size. So, I think what's happened is that the markers are slightly over-exposed in the scene camera image. They do look blurry. This is likely due to a combination of screen brightness and sub-optimal backlight compensation. You'll want to ensure that these are appropriate. I'd first try reducing the brightness of your screen and then adjusting the backlight compensation (also in the Companion app settings) until the markers are clear with good contrast. There's further context about backlight compensation in this message: https://discord.com/channels/285728493612957698/1047111711230009405/1200039507832098816

Let me know if that works!

user-09f634 08 February, 2024, 20:02:13

How long should the "World Video Exporter" take for a 2-minute video? Not more than 30 minutes, right? It kinda looks like it's done based on the progress bar, but it still says it's running, maybe indefinitely. Though I have had the World Video Exporter run before on other exports from the phone and it did not take this long, so perhaps there is something wrong with my file but the application is not telling me?

Chat image

user-cdcab0 09 February, 2024, 00:32:48

The encoding time would depend on your PC specs, but 30 minutes for a 2-minute video would certainly be atypical. Does this happen every time you export the world video from this recording? Does it happen with every recording?

user-602698 08 February, 2024, 20:34:57

Hi Pupil Labs Team, I just wanted to ask a simple question. I recorded a video (more than 3 minutes) and tried to run the enrichment process, but it has already taken more than 90 minutes and it's not done yet. In your experience, how long does the process normally take to complete for a 4-minute video?

nmt 09 February, 2024, 01:28:40

Hi @user-602698 👋. May I ask which enrichment you are running?

user-465a3e 08 February, 2024, 23:15:18

Hi Pupil Labs. We experienced an interesting issue with the devices. We used the API to start and stop recordings on 5 devices at the same time. A few minutes after recording ended, all the phones began vibrating and displayed an error message. "We have detected an error during recording". There were no errors thrown by the API during recording itself. The vibrating did not stop after restarting the app or unplugging the glasses. One of the glasses displayed a "Sensor Failure". Vibrating only stops after force quitting the app. Here is a video of the vibrating devices. Do we know the cause of this issue and how we can prevent it from occurring in the future?

mpk 09 February, 2024, 12:54:17

One thing you can quickly check is if the apps are all on the latest version and if the issue persists then.

nmt 09 February, 2024, 01:27:48

Hi @user-465a3e! Please contact info@pupil-labs.com and someone will help from there with the debugging process!

user-ee081d 09 February, 2024, 14:51:15

Hello! I am using the marker mapper enrichment in Pupil Cloud. The tags aren't being detected. What could be the problem? Neon Player is able to detect them, so the size of the tags should be proper.

Chat image

nmt 09 February, 2024, 23:10:39

Hi @user-ee081d! Please try again but this time using the markers available from our documentation: https://docs.pupil-labs.com/neon/pupil-cloud/enrichments/marker-mapper/#setup. The markers shown in your screen capture aren't compatible with the Cloud enrichment.

user-d407c1 09 February, 2024, 15:19:41

Hi @user-ee081d ! can you navigate a few frames forward or backward and see if they are detected?

user-ee081d 09 February, 2024, 15:26:56

I tried it and it didn't help.

user-d407c1 09 February, 2024, 15:29:07

Hi! I have sent you a friend request and will follow up by DM with my email address. Can you invite us to that workspace, so we can further investigate what is happening?

user-057596 09 February, 2024, 16:03:12

Hi, I've just updated the Neon app and now it's no longer connecting to the eye cameras and there is no gaze overlay.

user-d407c1 09 February, 2024, 16:07:06

Hi @user-057596 ! I'm sorry to hear that. Does Neon get recognised at all in the app?

Regardless of that, could you please contact us by email [email removed] with the serial number of the glasses or alternatively a picture of the QR code that is in the back of the module?

Chat image

user-057596 09 February, 2024, 16:08:16

They are working when you use the adjustment feature but when you press record and play back there is no gaze overlay and the stream from the eye cameras has stopped.

user-057596 09 February, 2024, 16:10:13

Hi @user-d407c1 here is the glasses QR code

user-057596 09 February, 2024, 16:10:40

Chat image

user-d407c1 09 February, 2024, 16:12:25

Can you please try clearing the app data? But before you do so: in Device Storage > Documents > Neon, there might be an app_android.log file that can help us understand what happened.

user-057596 09 February, 2024, 16:17:28

@user-d407c1 how do you locate the Neon Folder?

user-d407c1 09 February, 2024, 16:18:14

Using the files app on the phone, it should be on the Internal Storage and then under Documents

user-057596 09 February, 2024, 16:22:09

@user-d407c1 Found it - there is an app_android.log and it is 21.2 MB.

user-d407c1 09 February, 2024, 16:33:11

Great! Please share it with us at your earliest convenience

user-057596 09 February, 2024, 16:33:42

Email it?

user-d407c1 09 February, 2024, 16:36:55

Yes, please. I think it will be too big for Discord, and email is easier to keep track of.

user-057596 09 February, 2024, 16:38:14

I will send it from our university Gmail, which will be Heriot-Watt University Psychology.

user-057596 09 February, 2024, 16:49:18

@user-d407c1 email sent.

user-d407c1 09 February, 2024, 16:52:09

Thanks! We will get back to you ASAP. Kindly note that there is a weekend in between, so our response might be delayed until Monday. In the meantime, you can try clearing the app data.

user-057596 09 February, 2024, 17:12:55

@user-d407c1 Will do and thanks. Have a great weekend

user-09f634 09 February, 2024, 18:41:55

It has worked for me with a short recording (55 s); I don't think that one took very long. I tried the same recording I was having trouble with (2 minutes) on a different laptop as well - over an hour later it is still "Running". Both laptops have good specs (i7 + dedicated GPU). The recording that I am having issues with is from Sept 2023. Could that possibly be affecting things?

user-cdcab0 10 February, 2024, 05:04:10

Probably not, but if you're able to share the recording, I could take a closer look

user-4b3b4f 10 February, 2024, 21:19:40

Apologies if this has been asked before. Is it possible to share recordings, specifically a scanning recording, between workspaces? Thanks!

user-d407c1 11 February, 2024, 19:34:36

Hi @user-4b3b4f ๐Ÿ‘‹! It is currently not possible to share or move recordings across workspaces

user-4b3b4f 12 February, 2024, 19:12:11

Thank you!

user-d407c1 12 February, 2024, 08:00:46

Hi @user-ee081d ! It seems you are using the tag16h5 marker family; this family is not supported in Cloud as it can be unreliable. Could you please try again using the recommended one (tag36h11)?

user-ee081d 12 February, 2024, 14:07:24

New problem: after applying the enrichment and defining the surface, I can't download the gaze.csv file here.

Chat image

user-ee081d 12 February, 2024, 09:31:54

I am using the tag25 family and it was detected by Neon Player, but not by Cloud. I will try with tag36h11.

user-d407c1 12 February, 2024, 14:37:51

Hi @user-ee081d ! On the Downloads tab, there are two panels: the left one corresponds to the original data, while the right one corresponds to the enrichments' data.

You should be able to download it from there.

user-ee081d 12 February, 2024, 15:23:16

After pressing download, I am only getting the enrichment_info.txt file.

Chat image

user-d407c1 12 February, 2024, 15:24:31

Are you using Safari? https://docs.pupil-labs.com/neon/pupil-cloud/troubleshooting/#my-enrichment-download-contains-only-an-info-json-file-and-nothing-else

user-ee081d 13 February, 2024, 16:53:31

Yes, thank you! It helped!

user-28ebe6 14 February, 2024, 15:18:10

Hey, we need to synchronize Pupil Labs timestamps with other software. Going forward we implemented event markers with the API, but we still need to be able to calculate a latency on recordings we have already collected. We can do so by selecting the frame of the specific event, but the video playback in the cloud shows some grey for a second or two before the first frame occurs. We need a way to associate the timestamp of the video or frame number with the recording.begin event that is included in all downloadable folders. Is there a place where this information, i.e. how much time passes between the recording.begin event and the first frame in the video "neon scene camera v1.mp4", is stored?

user-d407c1 14 February, 2024, 16:20:40

Hi @user-28ebe6 ! When you download the TimeSeries + Scene Video, you will have a world_timestamps.csv which includes timestamps for every frame of the video that gets downloaded there.

Regarding the grey frames in Cloud, due to the higher sampling rate of the eye cameras compared to that of the scene camera, coupled with a possible slight delay in the scene camera's initiation, there is a possibility of capturing gaze data before the first frame from the scene camera is actually recorded. To ensure no valuable gaze information is lost, Cloud incorporates placeholder grey frames to fill these gaps. This method allows us to retain all gaze data for comprehensive analysis.

That said, each gaze point and eye camera frame is meticulously timestamped. This ensures that the synchronisation of gaze points to the scene camera frames remains precise and unaffected despite the addition of scene frames to bridge any starting timing discrepancies.

About synchronisation moving forward, I strongly recommend having a look at our Lab Streaming Layer relay, which will make it easier to sync with other devices, the time offset estimator from our realtime API, and how to force-sync the Companion Device with an NTP server.
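For the recordings you already have, a rough sketch of that lookup follows. It assumes the Time Series export's events.csv and world_timestamps.csv both carry a "timestamp [ns]" column; verify the column names against your own files:

```python
# Rough sketch: how long after recording.begin the first real scene frame arrives.
# Assumes Time Series export files with a "timestamp [ns]" column; verify names.
import pandas as pd

events = pd.read_csv("events.csv")
world = pd.read_csv("world_timestamps.csv")

recording_begin_ns = events.loc[
    events["name"] == "recording.begin", "timestamp [ns]"
].iloc[0]
first_frame_ns = world["timestamp [ns]"].iloc[0]

offset_s = (first_frame_ns - recording_begin_ns) / 1e9
print(f"First scene frame arrives {offset_s:.3f} s after recording.begin")
```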

user-28ebe6 14 February, 2024, 16:22:51

Ah, I see, will use that download option then for the recordings we have already collected.

user-831bb5 15 February, 2024, 02:51:05

Hello, good evening! I'm running into an issue at the moment... I'm performing a Reference Image Mapper enrichment, but although the process says 'completed', the gaze over the reference image does not appear, and when trying to make a heatmap, it does not show anything over the image...

I've noticed that on the bottom, the bar is colored in gray, as opposed to other enrichments that I have, where it's purple. I assume it's something regarding the reference image? Or what could be the problem here?

the ID is: 0462120d-7b3c-4c84-a862-13bf61c31427

Thanks in advance

Chat image

user-e825d8 15 February, 2024, 07:29:41

Good morning all, we are doing an experiment in order to analyze how many words we read on a daily basis. The person recording this video claims that he was able to read the bar's menu in this picture, but in the video it is blurry. Can we adjust the recording quality? Thank you very much!

Chat image

user-d407c1 15 February, 2024, 08:40:16

Hi @user-e825d8 ! The resolution of the camera cannot be changed. The camera is designed with a fixed resolution of 1600x1200 pixels, offering a field of view (FOV) of 132° horizontally and 81° vertically.

This gives you approximately 13 px/deg. Kindly note that this value does not hold exactly true, as you also have to account for radial distortion, and the density won't be the same across the whole field of view.

For instance, an object with a height of 10 cm located 5 meters away, subtending about 1.15 degrees (like the one you propose), would be represented by roughly 15 pixels on the sensor. This comparison helps set a baseline for understanding object representation at a distance.

In the context of visual perception, if we were to analogise with a digital sensor, the human eye can perceive details at an approximate rate of 60 pixels per degree. This is a coarse analogy, as you can't compare a digital sensor with the eye, but it can serve to explain why they could read the letters even though they were not captured.

On top of that, there are other factors such as motion blur, i.e. movement of the head during the camera exposure can lead to loss of detail.

With this, I only want to help you set realistic expectations for what the camera can capture.
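As a back-of-the-envelope check of those numbers (this averages over the whole FOV and ignores lens distortion, so it lands slightly below the ~13 px/deg quoted above):

```python
# Back-of-the-envelope check; averages over the FOV and ignores lens
# distortion, so it lands slightly below the ~13 px/deg quoted above.
import math

h_res_px, h_fov_deg = 1600, 132            # scene camera horizontal resolution / FOV
px_per_deg = h_res_px / h_fov_deg          # ~12 px per degree on average
print(f"{px_per_deg:.1f} px/deg")

object_height_m, distance_m = 0.10, 5.0
angle_deg = math.degrees(math.atan2(object_height_m, distance_m))  # ~1.15 deg
print(f"object subtends ~{angle_deg:.2f} deg, "
      f"roughly {angle_deg * px_per_deg:.0f} px on the sensor")
```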

user-d407c1 15 February, 2024, 07:53:57

Hi @user-831bb5 ! Yes, the grey bar indicates parts of the video where the image was not found.

If you did not receive an error message saying "It could not be computed based on the image and scanning recording", it means that at least the scanning recording and image were paired. Perhaps you have another recording where the image was found?

user-831bb5 16 February, 2024, 02:25:28

Thank you for your fast reply Miguel!

Yeah, I don't know what is happening here, because I've done enrichments with worse lighting conditions and they were perfect. Do you think I need to update something, or maybe I'm missing some firmware update? (Android is not updated, but I don't think that could be an issue, right?)

Today I tested another enrichment and it's been 8 hours for 3 simple 15-second clips with a good lighting setup, and nothing - the enrichment hasn't even finished. (It did now, after almost 9 hours, and it gave an error of course.) Last time I did a video renderer with around 20 minutes of footage, and it was done in a matter of minutes. Nothing too crazy in terms of time.

The attached picture is the one that took 8 hours and it has given an error.

Also, I've noticed that a simple enrichment is taking too long to compute. I'm doing an image mapper. It's been almost an hour. The scanning video is 23 seconds and the user video is 37 s. Is that normal? (ID e58fcdad-16a3-464b-8b86-6d23a2446d57) I know that the time depends on the server load, but I've never experienced that much time for something on Pupil Cloud.

Now that I'm checking an old project, I'm noticing that the user video does not have the dots over the object... could that be the thing that is messing up my recordings?

Is there a way to perform the image mapper offline with Python? (I did the 'Map Gaze onto a screen' guide and it worked wonderfully and fast, and the computer is not that fast, to be honest with you.)

Chat image Chat image Chat image Chat image

user-831bb5 16 February, 2024, 02:32:11

The cloud-like overlay with the dots is not appearing on the user videos.

Chat image

user-831bb5 16 February, 2024, 02:40:56

Thanks Neil for taking the time to respond. I do see your point regarding the difference between the image and the recording. But I have another one which is exactly the same, with the same result. Take a look at the second image that I've sent, for example: the colors are different, as well as the lighting conditions... worse, if you ask me. Would making adjustments to the reference image colors produce better results?

I'll DM you, thank you very much.

user-9d91ed 16 February, 2024, 02:51:24

Hi there! I'm planning to conduct experiments involving the observation of photographs with elderly individuals. Are there any specific concerns or factors I should be mindful of, such as issues related to offset correction?

nmt 17 February, 2024, 04:38:26

Hi @user-9d91ed! Neon generally performs well with older adults. In fact, you can read more about that in our performance whitepaper. When viewing photographs, it might sometimes be useful to use an offset correction to get the most accuracy possible, but it depends on how big the viewing stimuli are. I always recommend doing some pilot testing with the equipment to get familiar with this!

user-4b3b4f 16 February, 2024, 14:26:24

Hi. I am trying to run the Reference Image Mapper and it never finishes processing. Is this something I am doing incorrectly? I was able to run the Reference Image Mapper two days ago successfully. Thanks.

user-a370d3 16 February, 2024, 17:05:47

Hi, I also have the same problem running the Reference Image Mapper since yesterday. I tried different scanning videos but it never finishes.

user-831bb5 17 February, 2024, 01:59:54

Same situation here: I left one running and after 9 hours, nothing. I tried one last night and, I'm not 100% sure, but I checked today after about 16 hours and nothing (the videos did display 100% processed when selected from the drop-down menu, but the enrichment was still processing). Now it displays an error.

nmt 17 February, 2024, 04:30:20

@user-831bb5, I've received your invite. We will take a closer look at your workspace and data on Monday and see if we can figure out how to get your reference image mapper enrichments working properly 🙂

nmt 17 February, 2024, 04:29:55

Hey @user-4b3b4f and @user-a370d3 👋. Is this a recent change? Did you do anything different in the Reference Image Mapper when compared to how you used it previously?

user-c5d00c 18 February, 2024, 21:41:58

Hello, I have a question. The Bare Metal tracker is no longer recognized by the app, I think.

user-c5d00c 18 February, 2024, 21:42:11

When I plug it into the phone, nothing happens in the app.

user-c5d00c 18 February, 2024, 21:43:11

This might sound dumb, but could this be because the phone battery ran out? I recharged it, and for some reason now when I plug the tracker in, it is not recognized.

nmt 19 February, 2024, 03:27:44

It's possible that a full battery drain has affected this! Firstly, please long-press on the Neon app icon, go to "Storage usage" and tap "Clear data". Then reconnect the bare metal and see if it shows. If not, please reach out to info@pupil-labs.com with the serial number of the module.

user-8b5404 19 February, 2024, 19:02:28

I've set up a reference image mapper enrichment that has been processing for 20 minutes. Is that normal? How do I get it to load?

user-f43a29 20 February, 2024, 10:56:59

Hi @user-8b5404 ๐Ÿ‘‹ ! The Reference Image Mapper (RIM) Enrichment is computationally intense. Depending on the current number of other RIM requests in queue, as well as other factors, it can take a significant amount of time for the Enrichment to complete. If you check back after a period of time and it is still not showing as complete, try doing a browser tab refresh or logging out and back in. This should trigger a UI update that shows the latest processing status, which could indeed be complete at that point. For future reference, if you want to reduce the computational expense of RIM, then you can use events to process shorter sections of the recording. Check this message for details.

user-471762 19 February, 2024, 20:00:15

Hi Pupil Labs team, I have a question about Neon Monitor. It is working great, and I do not have any connection issues. I am just wondering if there is any way to stream the eye camera along with the scene camera? The lab I work in works with young children and their parents as they play with toys (and we hide behind a curtain). The kiddos are not afraid to move the eye tracker if they are annoyed with it. When we used Pupil Core models in the past, Pupil Capture let us keep tabs on the scene cam and eye cam using our computer behind the curtain. That way, we would know right away if the child (or parent!) bumped the camera, and we would step out to help them fix it. Our solution for now is to keep an eye on the gaze overlay with Neon Monitor to make sure it is tracking correctly. If there is a simple way to stream the eye camera from behind the curtain too, though, that would be great!

user-f43a29 21 February, 2024, 09:45:09

Hi @user-471762 ๐Ÿ‘‹ ! One of the many benefits of Neon is that it is calibration-free and tolerant of slippage, so if the glasses are bumped a bit, then the experiment can continue without any cause for concern. The glasses could even be briefly removed and put back on and recording will continue as planned. You will still get accurate and reliable gaze data. So, you do not need to worry as much about monitoring the eye video streams or the gaze circle. Currently, the Monitor app does not provide an eye video stream, so if you want that, then a small Python script using our Real-Time API would be an option.
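If you do go the script route, a minimal sketch could look like the following. It assumes the simple API in your installed pupil-labs-realtime-api version exposes receive_eyes_video_frame() and frames with a bgr_pixels array; if it does not, check the package docs for the equivalent call:

```python
# Minimal sketch, assuming the simple API exposes receive_eyes_video_frame()
# in your installed pupil-labs-realtime-api version.
import cv2

from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device(max_search_duration_seconds=10)
if device is None:
    raise SystemExit("No Neon Companion device found on the network")

try:
    while True:
        frame = device.receive_eyes_video_frame()
        # frame.bgr_pixels holds the image as a BGR numpy array
        cv2.imshow("Neon eye cameras", frame.bgr_pixels)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
finally:
    device.close()
    cv2.destroyAllWindows()
```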

user-613324 20 February, 2024, 05:09:54

Hi Neon team, I'm using Neon together with the realtime API. I used the API to discover_one_device, stream eye video, and send_event. At first it worked perfectly; however, after many rounds of testing, it suddenly stopped working and output this error message: "File "...\aiohttp\streams.py", line 622, in read await self._waiter aiohttp.client_exceptions.ServerDisconnectedError: Server disconnected" What does this mean? How should I proceed?

user-cdcab0 21 February, 2024, 05:37:06

The error itself is a little non-specific, but if you examine the .message property of the exception, it may provide additional insight as to the cause. Generally speaking, it's best to wrap your realtime API calls in try-catch blocks somewhere along your application's call stack so that your code can detect exceptions and handle them appropriately (e.g., attempting to reconnect, alerting the user, writing to a log, etc.).
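As a sketch of that kind of guard (not a drop-in fix; it assumes the simple API and simply rediscovers the device once when a call fails):

```python
# Sketch of a guarded send_event() using the simple API; reconnects once
# if the Companion app dropped the connection.
import logging
import time

from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device(max_search_duration_seconds=10)


def safe_send_event(name):
    """Send an event, reconnecting once if the connection was dropped."""
    global device
    try:
        return device.send_event(name)
    except Exception as exc:  # e.g. aiohttp.ServerDisconnectedError surfaces here
        logging.warning("send_event failed (%s); reconnecting...", exc)
        device.close()
        time.sleep(1)
        device = discover_one_device(max_search_duration_seconds=10)
        if device is None:
            raise RuntimeError("Companion device not found after reconnect") from exc
        return device.send_event(name)


safe_send_event("trial_start")
```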

user-f43a29 22 February, 2024, 11:55:22

Hi @user-613324 ๐Ÿ‘‹ ! If you have encountered this error, then you want to check the "Stream" section of the Neon Companion app. Do you see the message, "waiting for dns service"? If so, then you need to restart the app, and if that does not fix it, then restart the phone. When you start the Neon Companion app again, it might ask you to allow permissions for Neon. Make sure to select "yes" and "always".

user-53a8e1 20 February, 2024, 11:46:57

Hi everyone. We are starting to play with our Neon systems now and working out what the data look like. Only one question so far: can I download enrichment videos from pupil cloud? For example the video showing detected faces after running face mapper?

user-480f4c 20 February, 2024, 13:56:03

Hi @user-53a8e1! It is not possible to download the enrichment with the face mapping overlay. However, you can use the face_positions.csv output to get the coordinates of the bounding boxes and overlay them on the raw video, but this would require some coding. See also this message for more information on how to render the raw scene video with csv data.
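For reference, a rough sketch of that overlay step. The file names and column names ("timestamp [ns]", "p1 x [px]" through "p2 y [px]") are assumptions based on a typical Face Mapper download; check them against your own export:

```python
# Rough sketch: draw Face Mapper bounding boxes onto the downloaded scene video.
# File and column names ("timestamp [ns]", "p1 x [px]" ... "p2 y [px]") are
# assumptions; verify them against your own export.
import cv2
import pandas as pd

faces = pd.read_csv("face_positions.csv")
frame_ts = pd.read_csv("world_timestamps.csv")["timestamp [ns]"].to_numpy()

cap = cv2.VideoCapture("scene_video.mp4")  # the scene video from the download
fps = cap.get(cv2.CAP_PROP_FPS)
size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
        int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
out = cv2.VideoWriter("scene_with_faces.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), fps, size)

for ts in frame_ts:
    ok, frame = cap.read()
    if not ok:
        break
    # Faces detected on this scene frame (timestamps should line up)
    for _, row in faces[faces["timestamp [ns]"] == ts].iterrows():
        p1 = (int(row["p1 x [px]"]), int(row["p1 y [px]"]))
        p2 = (int(row["p2 x [px]"]), int(row["p2 y [px]"]))
        cv2.rectangle(frame, p1, p2, (0, 255, 0), 2)
    out.write(frame)

cap.release()
out.release()
```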

user-275c4d 20 February, 2024, 22:16:22

Hi pupil labs team, is it possible to record and stream data at the same time? Is there anything I should keep in mind when doing this? I am planning to use the async data streamer while recording/saving data to the companion device. Thanks!

user-275c4d 22 February, 2024, 18:39:52

Hi neon team, I just wanted to bump this question. Thanks!

user-6c4345 21 February, 2024, 06:48:53

Good morning Pupil Labs team, we have discovered a tiny scratch on the lens of the Neon module.

Is it possible to change the lens or somehow repair it?

Thanks.

user-480f4c 21 February, 2024, 11:59:11

Hi @user-6c4345! Could you please contact sales@pupil-labs.com in this regard, sharing an image of the module's scratch?

user-1423fd 21 February, 2024, 11:23:25

Hi, I have been collecting data with my Neons for the past few weeks. I've had this error come up a few times and haven't been able to pinpoint why it occurs. We are operating the glasses exactly the same each time, but the error seems to occur at random. Any help would be greatly appreciated!

Chat image

user-480f4c 21 February, 2024, 11:36:31

Hi @user-1423fd! Could you please contact us by email [email removed] sharing the serial number of the module? You can find it by connecting your Neon system to the phone, accessing the App's main screen, and tapping the top right info icon. There you can view the module serial number.

user-a7636a 21 February, 2024, 13:03:56

Hi! First time attempting to use Neon and Pupil Cloud. I get an error with the gaze pipeline on all my recordings. The recordings work fine on the phone (for the most part). Thanks in advance.

user-a7636a 21 February, 2024, 13:14:44

this is the full error message: "Gaze pipeline failed for this recording. We have been notified of the error and will work on a fix. Please check back later or get in touch with [email removed]

user-480f4c 21 February, 2024, 13:49:00

Hi @user-a7636a! Thanks for reaching out. Can you please share the recording ID with us (right-click on the recording > "View recording information")?

user-a7636a 21 February, 2024, 13:50:11

Absolutely, here it is: 98412e61-ada4-4ae6-8b07-73e93f15eab9

user-a7636a 21 February, 2024, 13:59:04

Okay, with further investigation we noticed that that particular video did not have working gaze tracking on the phone. This video has working gaze tracking on the phone and does not show the same error message in the cloud application, but instead it's just buffering/processing endlessly: 52fa5b20-3137-4332-b563-2b3dd1d7a741

user-480f4c 21 February, 2024, 14:00:58

Thanks for clarifying @user-a7636a! Let me talk with the Cloud team and I'll provide some updates asap. In the meantime, it seems that you have an older version of the Neon Companion App (2.7.5-prod). Note that the latest version is 2.7.11-prod. Please update the app via Google Play Store. This might trigger some firmware and FPGA updates. During installation, grant all required permissions, selecting "Always Allow." When prompted for camera access, click the checkbox before confirming.

Please try making some new recordings with the latest app version, and let us know if the issue persists

user-480f4c 21 February, 2024, 14:02:19

@user-a7636a, I just updated my previous message 👆🏽, please have a look 🙂

user-a7636a 21 February, 2024, 14:20:14

Hi again, I did as you suggested and it's still not working; it keeps on buffering. We also tried switching accounts and workspaces, to no avail.

user-480f4c 21 February, 2024, 14:21:10

can you please send me the recording ID for the recording made with the new app version 2.7.11-prod?

user-a7636a 21 February, 2024, 14:24:37

here you go: a4c5c07e-b554-4652-b774-b3e556436ac1

user-480f4c 21 February, 2024, 14:40:48

@user-a7636a there is currently a high load in Cloud, which might cause delays in recording processing. We appreciate your patience and understanding and we will get back to you as soon as possible. 🙏🏽

user-a7636a 21 February, 2024, 14:48:49

Thank you for the help. That explanation seems reasonable, because some of the videos seem to be working now.

user-480f4c 22 February, 2024, 08:14:58

@user-a7636a please note that your recordings have been processed.

user-28ebe6 21 February, 2024, 18:08:31

With the real-time API, at what tick rate does the API listen for event markers sent as UDP messages?

user-cdcab0 21 February, 2024, 19:19:54

You can't send events as UDP messages. They are sent as HTTP POST messages, which are strictly TCP.

With regards to the timing, you can include a timestamp with the event. If you do, then it does not matter when the message is received, it will be recorded with the timestamp you provide
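For example, with the simple API (assuming its send_event() accepts the event_timestamp_unix_ns parameter, which wraps the HTTP POST described above):

```python
# Sketch: send an event that is timestamped on the sending machine, so
# transport delay does not shift where it lands in the recording timeline.
# For best accuracy, also estimate the clock offset between this computer
# and the phone (see the realtime API's time offset estimator).
import time

from pupil_labs.realtime_api.simple import discover_one_device

device = discover_one_device(max_search_duration_seconds=10)
device.send_event("stimulus_onset", event_timestamp_unix_ns=time.time_ns())
device.close()
```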

user-ee081d 22 February, 2024, 07:26:18

Hi! Is there a way to map gaze data to a surface (like using Marker Mapper) without the Cloud?

user-480f4c 22 February, 2024, 10:03:25

Hi @user-ee081d! Mapping gaze data onto surfaces is also possible using Neon's offline analysis workflow, Neon Player, and specifically its Surface Tracker plugin.

user-2b8918 22 February, 2024, 13:11:24

Hi, I am having difficulties understanding how the Reference Image Mapper works. Does it normally take a while to process? And does it use all the recordings I have in the project?

user-d407c1 22 February, 2024, 13:38:05

Hi @user-2b8918 👋 ! All enrichments use by default every recording in the project.

If you want your enrichment to only be computed on a section or some specific recording within the project, you will need to use events to define its temporal selection.

This is selected when you create the enrichment under Advanced Settings > Temporal Selection.

If you are only interested in a portion of the recordings, we strongly recommend that you do so, as smaller sections take less time to compute.

The amount of time an enrichment takes to compute depends on several factors, like the type (e.g. the reference image mapper is more computationally expensive than others), the duration (i.e. how many recordings and how long they are), and the queue (how many people are making use of the Cloud resources at the moment).

Does this clarify all your questions?

user-2b8918 22 February, 2024, 13:55:27

Yes, thank you. So it is normal for it to take a while to process?

user-d407c1 22 February, 2024, 13:57:52

https://discord.com/channels/285728493612957698/1047111711230009405/1209872360006623302 We are currently under high demand in the Cloud and you may experience longer times, but in general it can take a bit, yes.

user-2b8918 22 February, 2024, 13:58:49

OK perfect, no problem - I was just worried it wasn't working. Thanks for your help!

user-f4e4e0 23 February, 2024, 00:01:50

Hi Neon Team, I had a question regarding the Neon glasses and Companion updates. I turned on the Wi-Fi for my phone and got a request for an FPGA firmware update. However, this update ran for 30 minutes with no progress on the update bar, and I believe the app crashed. I had to restart the app; however, now it won't recognise the glasses when I plug them in. What would you recommend doing?

user-480f4c 23 February, 2024, 10:21:32

Hi @user-f4e4e0! Can you please try disconnecting Neon, force stopping the app, starting the app again, and connecting Neon?

To force stop the app, please try long-pressing on the Neon Companion app icon, select 'App info', and 'Force stop'.

Please let us know if this works.

user-231fb9 23 February, 2024, 07:34:58

Hi, is it normal that I've had enrichments processing for over 15 hours without result? In the past they took max. 1 hour. I've deleted those enrichments and started again one by one now, but is there something that can be done to speed up the process? Thanks 🙂

user-480f4c 23 February, 2024, 09:05:37

Hi @user-231fb9! It might be that your browser put your tab to "sleep", and the state never changed. Could you please try to hard-refresh your browser ("Ctrl" + "F5" (Windows) or "Command" + "Shift" + "R" (Mac))?

user-ac085e 23 February, 2024, 16:27:25

Hey Pupil Labs team, is the Neon IEC 62471 compliant? I have not been able to find the documentation on it.

user-d407c1 26 February, 2024, 07:38:53

Hi @user-ac085e 👋 ! Yes, Neon has been tested for EN 62471:2008 and is classified as exempt (no photobiological hazard). 🦺

user-ac085e 23 February, 2024, 23:20:02

I've found documentation that states the Invisible is IEC 62471 compliant but nothing regarding the Neon.

user-63bd0f 24 February, 2024, 14:55:35

Hello, I have had problems getting the "Just act natural" frame to connect with the cell phone. What can I do?

user-d407c1 26 February, 2024, 07:42:03

Hi @user-63bd0f 👋 ! Would you mind describing the issue a bit more? Are the glasses not recognised by the app? In that case, could you please reach us by email at [email removed] with the serial number of your glasses? Simply send a picture of the back of the module where the data matrix (what looks like a QR code) is visible.

user-613324 25 February, 2024, 01:31:20

Hi Rob, after I encountered the server disconnect error, I did nothing but wait overnight. Then the error was gone and everything runs fine. I checked the "Stream" section of the Neon Companion app again and this time it shows "Stream to Pupil Monitor" with a QR code below. I don't need Pupil Monitor but I do need to use realtime api to send timestamps of the events. And I believe the error of server disconnection I encountered is triggered by the device.send_event() function. So what should I do to avoid or mitigate this error?

user-8b5404 25 February, 2024, 23:11:40

Hello. I have a reference image mapper enrichment with two AOIs. Is there a way for me to download a fixations csv that will give data on whether the fixation is on AOI 1, AOI 2, or not on either AOI? Thank you!

user-480f4c 26 February, 2024, 07:46:48

Hi @user-8b5404 - You can get several fixation metrics on the AOIs you have defined in the aoi_metrics.csv file, which is included in the Reference Image Mapper download folder.

user-ac085e 26 February, 2024, 01:23:03

Any information on IEC 62471? "Hey Pupil Labs team, is the Neon IEC 62471 compliant? I have not been able to find the documentation on it."

user-ee081d 26 February, 2024, 14:41:48

Hi! I am running the gaze-controlled-cursor-demo. I can get all four tags detected after adjusting the brightness and size, but only for a few seconds; after that, the red border starts twitching on the markers and it is hard to get all 4 detected at the same time. What could be my problem?

user-cdcab0 26 February, 2024, 14:50:51

Could you make a recording and share the scene video here? Much easier to diagnose that way

user-b6f43d 27 February, 2024, 04:49:33

Hi, I am working in transportation engineering with the Neon glasses.
We went for a test run and the glasses stopped recording after 20 minutes; I couldn't start recording again (it started recording but stopped within 20-30 seconds). After a 10-minute gap I started recording again, and it worked.

Is this the maximum time it can work in one go, or is there a problem with the device?

user-480f4c 27 February, 2024, 07:16:45

Hi @user-b6f43d! Sorry to hear you're experiencing issues with your glasses. Recording time depends on the phone's battery. Using a fully charged device you get around 4 hours of continuous recording time.

To see what might have happened, could you please contact us at [email removed] sharing relevant information (module serial number, app version, recording ID in case the recording was uploaded to Cloud)?

user-c5d00c 27 February, 2024, 11:38:29

Hello, the Bare metal tracker, it is no longer connecting to the Neon Companion Application. Although we plug it in, it still displays "plug in and go".

We have attempted the following, none of which fixed the problem:

  • We force quit the app, then reconnected the bare metal
  • We cleared the data storage, then reconnected the bare metal
  • We installed the application on a different device and connected the bare metal

user-d407c1 27 February, 2024, 12:18:54

Hi @user-c5d00c ! We have contacted you by email, please let us know if it reached you.

user-5c56d0 28 February, 2024, 03:19:29

Thank you for your work.

I would like to utilize Neon and Invisible on multiple smartphones. This is because there are limitations on the battery operating time of a single smartphone. Also, there is a need to use another smartphone if one breaks.

Q1 Are Neon Companion and Invisible Companion available on Google Play? (I couldn't find Neon Companion) Can I install them from any Android device?

Neon Companion is already installed on the OnePlus 8T. I was able to install Invisible Companion from Google Play on the OnePlus 8T. However, Neon Companion could not be found on Google Play for the OnePlus 6.

Q2 Are there any requirements for smartphones to use Eye Tracker?

Q3. Which indicators in the CSV can be used to identify saccades? Which of Pupil Core, Neon or Invisible can be used to obtain saccades?

nmt 28 February, 2024, 06:25:51

Hi @user-5c56d0. The Companion apps that run with Neon and Invisible are specifically tuned to work with certain models of phones and Android OS versions. We require a lot of control over various low-level functions of the hardware, and we want to ensure optimum robustness and stability, among other things.

The apps are available on the Google Play Store, but only if you have a compatible phone and Android version for that app.

It's likely you couldn't install the Invisible App on the OnePlus 8T because of an incompatible Android version. The Neon App is also not compatible with the OnePlus 6.

The best thing to do is to visit each respective documentation page that outlines specific compatible phones and Android versions: - Neon Compatible Phones and Android OS versions - Invisible Compatible Phones and Android OS versions

Regarding saccades, technically all three systems can be used to compute saccades, although they're not explicitly reported in the exported data.

Neon/Invisible: For example, saccades are conceptually equivalent to the intervals between classified fixations. The caveat here is that this approach assumes no smooth pursuit eye movements occurred. We do plan to release explicit saccade metrics, although we don't have a concrete release date just yet.

Core: Inter-fixation intervals are also conceptually equivalent to saccades for Core. But note that some time ago, there was a community-contributed saccade detector: https://github.com/teresa-canasbajo/bdd-driveratt/tree/master/eye_tracking/preprocessing. Not sure if it will still work, but it's worth looking into!
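As an illustration of that inter-fixation-interval idea on Neon/Invisible exports, a sketch assuming the Time Series fixations.csv with "start timestamp [ns]" and "end timestamp [ns]" columns (check the names in your file):

```python
# Sketch: treat gaps between consecutive fixations as saccade candidates.
# Column names are assumed from the Time Series fixations.csv; verify them.
# Caveat from above: gaps may also contain smooth pursuit, blinks, or data loss.
import pandas as pd

fix = pd.read_csv("fixations.csv").sort_values("start timestamp [ns]")

gap_start_ns = fix["end timestamp [ns]"].to_numpy()[:-1]
gap_end_ns = fix["start timestamp [ns]"].to_numpy()[1:]

candidates = pd.DataFrame({
    "onset [ns]": gap_start_ns,
    "offset [ns]": gap_end_ns,
    "duration [ms]": (gap_end_ns - gap_start_ns) / 1e6,
})
# Very long gaps are unlikely to be single saccades
print(candidates[candidates["duration [ms]"] < 150].describe())
```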

user-ff3a0a 28 February, 2024, 14:34:22

Hello, would it be possible to transfer recordings from one workspace to another? Also, is it possible to give a workspace user exclusive access to projects?

user-d407c1 28 February, 2024, 14:40:22

Hi @user-ff3a0a ! Unfortunately, it is currently not possible to move recordings across workspaces. We did receive this feedback in the past, and we were going to transfer it to our new 💡 features-requests channel and consider it.

If this is something you would like to see sooner than later please create a request there.

The same goes for granular project access. I'd recommend separating these two requests.

user-cdcab0 28 February, 2024, 15:36:30

so with the monitor app the frame and

user-ccf2f6 28 February, 2024, 19:26:00

Hi Pupil Labs, I wanted to confirm: what are the storage limits on Pupil Cloud for Neon users?

user-d407c1 29 February, 2024, 08:19:00

Hi @user-ccf2f6 ! There are currently no storage limits in Pupil Cloud.

user-480f4c 29 February, 2024, 08:17:45

Hi @user-b6f43d - Sorry for not including this in my first message:

  • You can find the module serial number by connecting your Neon system to the phone, accessing the App's main screen, and tapping the top right info icon. There you can view the module serial number.
  • For the recording ID, did you upload the affected recording to Pupil Cloud? If yes, please right-click on the recording, select "View recording information" and share with us your recording ID.

user-edb34b 29 February, 2024, 14:14:48

Hi, I used the Marker Mapper enrichment to define some areas of interest for my analysis. I found out that the coordinates of the same fixation are different between the two CSV files exported for the two enrichments. Thus, I would like to know how the normalized coordinates are computed for the marker enrichments and why they are different. Also, are the coordinates in the marker mapper file that contains the specific fixation (fixation detected = TRUE) more reliable than the other one? I put an example in this photo with the same fixation across the different fixations CSV files. Thanks!

Chat image

user-480f4c 29 February, 2024, 14:40:32

I have a follow-up question regarding that. Were the surfaces defined in the same way, or did you change their size by moving the corners before running the enrichment?

user-edb34b 29 February, 2024, 14:23:23

Also, I had several recordings where the scene camera was grey for more than 15 s, even up to 30 s (see photo). We first thought it was a heating problem, but it seems to happen at the beginning and/or the end of a session. Do you know why this problem occurs, and what can we do to prevent it? Thanks!

Chat image

user-480f4c 29 February, 2024, 14:36:26

Hi @user-edb34b! The grey frames are expected and they are used as placeholders on Cloud because eye cameras start a bit earlier than the scene camera sensor and we don't throw away this data. See also this relevant message https://discord.com/channels/285728493612957698/1047111711230009405/1207360774176243803

user-480f4c 29 February, 2024, 15:05:38

Thanks for clarifying @user-edb34b!

Note that the fixation coordinates in the fixations.csv exported in the Marker Mapper enrichment folder are given in normalized coordinates within the surface.

I recommend having a look at the explanation of the exported data here.

user-b6f43d 29 February, 2024, 19:40:13

I have a doubt: in the marker mapper enrichment, my markers are identified in some frames, but in the next frame, maybe because there is too much sunlight on them, they are not recognized. So will it skip the data in the frames where they are not recognized, or is it okay if the markers are identified in any one of the frames?

user-b6f43d 01 March, 2024, 19:26:19

Can someone reply to this?

nmt 02 March, 2024, 09:16:39

The frames without marker detections will be skipped, yes. The markers do need to be detected for the mapping to take place, and we don't do any interpolation of frames with missing detections.

user-b6f43d 29 February, 2024, 20:20:32

Why, even after the markers are identified, is the surface not properly defined?

Chat image

wrp 01 March, 2024, 01:14:24

Can you drag the bottom left corner of the surface down so you cover the full windshield? Or was this trapezoid shape intended? It looks like the surface visualized to the right of the video is warped due to the geometry of the surface.

user-b6f43d 29 February, 2024, 20:39:18

It's showing like this

Chat image

End of February archive