🕶 invisible


user-bbf384 01 March, 2022, 07:26:07

Hello, while doing a recording using pupil invisible during an experiment recently, this error appeared. May I know how this can be rectified/prevented in the future?

Chat image

papr 01 March, 2022, 07:30:03

Hi! Could you please check if the raw recording includes an android.log file and share it with [email removed]? The log file might contain further information about what went wrong.

user-bbf384 01 March, 2022, 07:36:29

Okay sure, thank you!

user-6c1e7f 01 March, 2022, 15:28:54

Are there any recommendations for maximum humidity? Would you expect Pupil Invisible to take damage if used in a building with 75-80% humidity over a longer period of time?

wrp 02 March, 2022, 07:42:37

Hi @user-6c1e7f 👋 I do not foresee that being a problem. We use Pupil Invisible in tropical locations where humidity levels are always above 70% year-round 😸

The key is to not let water get into the glasses especially around the cameras and seams. Pupil Invisible is not water resistant.

user-ae0f24 01 March, 2022, 19:41:47

I am interested in utilizing head-mounted eye tracking technology to identify and describe gaze patterns among novice, intermediate, and expert emergency physicians who are leading trauma resuscitations. Is this something Pupil could be useful for? What would the data obtained look like? Can you provide other examples where Pupil was used to generate visual heat maps in a real-world complex environment?

user-9429ba 02 March, 2022, 10:57:18

Hi @user-ae0f24 👋 Thanks for your message! Our eye tracking glasses could indeed be useful for your medical application area in trauma rooms. There is a wealth of research on expert and novice eye movements in different medical use cases. In order to provide more tailored advice, it would be useful to know the specific tasks and situations you envisage. Please contact [email removed] with a bit more information addressing the kinds of questions raised below:

  • What questions would you ideally like eye tracking to address? For example, doctors' ability to deal with stress, follow procedure, etc.
  • Would this be for a preview of gaze data for immediate feedback, or a more carefully controlled clinical investigation?
  • Are you interested in doctors' ability to correctly interpret and interact with equipment?

We look forward to hearing from you!

user-4bc389 02 March, 2022, 07:27:29

Hi, I have some doubts about the coordinates in the following figure from the data exported by Pupil Cloud. The coordinates in the usage guide are (0,0) in the upper left corner and (1,1) in the lower right corner. What is the relationship between these two coordinate systems?

Chat image

marc 02 March, 2022, 07:30:41

Hi @user-4bc389! I am assuming the screenshot is of gaze.csv file from the Raw Data Exporter? You can find documentation on this export format here: https://docs.pupil-labs.com/invisible/reference/export-formats.html#raw-data-exporter

The fixation columns are reported in pixels of the scene video. This is different to how fixations are reported in Pupil Player, where instead the coordinate system is normalized and ranges from (0, 0) to (1, 1).

user-4bc389 02 March, 2022, 07:38:29

Thank you for your answer, what is the coordinate origin of the fixation in the picture below?

Chat image

marc 02 March, 2022, 07:40:28

The pixel location (0, 0) is the top-left corner of the image, the bottom-right corner would be (1088, 1080). It is possible that some gaze points are located slightly outside of the range covered by the scene camera.
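
To illustrate the relationship between the two conventions (just a sketch, not official tooling): converting scene-camera pixels to a normalized [0, 1] range is a simple scaling, with an optional vertical flip for conventions that place the normalized origin at the bottom-left (as Pupil Core's norm_pos does).

```python
def pixels_to_normalized(x_px, y_px, width=1088, height=1080, flip_y=False):
    """Scale scene-camera pixel coordinates (origin top-left) into [0, 1].

    Set flip_y=True when targeting a convention whose normalized origin
    is at the bottom-left. The default width/height match the scene
    camera resolution mentioned above.
    """
    x, y = x_px / width, y_px / height
    return (x, 1.0 - y) if flip_y else (x, y)
```

For example, the image center (544, 540) maps to (0.5, 0.5).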

user-4bc389 02 March, 2022, 07:50:55

Thanks! Also, is my understanding of the surface coordinate system correct in the picture below?

Chat image

marc 02 March, 2022, 07:51:51

Yes!

user-4bc389 02 March, 2022, 08:39:23

Can pupil cloud export data such as fixation duration in the surface coordinate system, or only the fixation duration of the entire viewing process?

marc 02 March, 2022, 09:10:41

Please see the export format documentation of the Marker Mapper here: https://docs.pupil-labs.com/invisible/reference/export-formats.html#marker-mapper

The export includes fixation data in surface coordinates.

user-4bc389 02 March, 2022, 09:11:32

OK, thanks!

user-8d365d 02 March, 2022, 16:34:03

Hi Pupil Labs team, when I used my Pupil Invisible there was no red circle at all, either on the OnePlus device or in Pupil Monitor. But it appeared when I viewed the video in Pupil Cloud. Is there any setting that can solve this problem? Thanks in advance.

user-8d365d 02 March, 2022, 16:35:46

And when I look at the recordings I find that the red circles are very poorly calibrated. How should I calibrate it?

user-057596 02 March, 2022, 17:24:28

Hi, when playing back recordings on the cloud I can't pick up any sounds. What am I doing wrong? Thanks, Gary

papr 03 March, 2022, 07:50:55

Hi, could you please check if you have enabled the Recording Audio option in the app settings? It is turned off by default and might be the reason why there is no recorded audio.

user-057596 03 March, 2022, 07:52:15

Will do Papr and thanks. 😁

marc 03 March, 2022, 08:28:25

Hi @user-8d365d! Being able to see the gaze signal in Cloud but not locally on the phone is odd; if the gaze signal exists, it should be visible everywhere. Just to confirm: you do have a recording for which you can see gaze in Cloud, but when you play back the recording locally on the phone there is no gaze?

FYI, the glasses recognize when they are not being worn and stop visualizing a gaze signal, so if you are looking for a gaze signal, make sure the glasses are actually being worn at that time.

There is no calibration for Pupil Invisible; it should work out of the box. Bad gaze accuracy can occur when the subject looks at things closer than 1 m: in this case there is noticeable parallax error, which is essentially an offset to the left. For a small percentage of subjects there can also be a general constant offset in their predictions, due to their eye physiology. In both cases you can apply an offset correction. To do this, navigate to the live preview in the Companion app, ask the subject to look at a specific landmark so that the offset becomes apparent, then click and hold the screen and drag the gaze circle to the correct location. This offset will be saved in the wearer profile.

user-8d365d 03 March, 2022, 20:46:38

Hi Marc, thanks for your reply. I have checked again. When I played back the recording locally on the phone, there is no gaze signal. But there is a gaze signal in Cloud.

user-057596 03 March, 2022, 17:20:59

Hi, I checked whether the recording audio option in the app settings had been enabled, and it had been, but the level of the recorded sound is low. We are planning to eye track in a clinical setting where we want to pick up the dialogue between the clinicians. Is there a way to increase the level of sound recording, or to attach an external microphone? Thanks, Gary

papr 03 March, 2022, 17:23:23

How far away are the clinicians from each other during the recording?

user-057596 03 March, 2022, 17:32:33

It’s within a morgue setting with Thiel cadavers using ultrasound and regional anaesthesia with a trainee performing the procedure and the trainer within a 1.5m radius of the location of the trainee.

papr 03 March, 2022, 17:36:07

Audio in that range should be audible. Unfortunately, there is no in-app setting to increase the audio. The audio is provided by the operating system and stored as is. Please contact info@pupil-labs.com in this regard.

user-057596 03 March, 2022, 17:37:07

Thanks Papr and will do

user-8d365d 03 March, 2022, 20:55:13

Hi Marc, I re-installed the companion. Now the problem is solved! Thank you!

user-28ac9f 04 March, 2022, 14:31:29

Just to be clear: in order to get the Pupil Labs Player (for Pupil Invisible) I should download the Pupil Labs Core software, right? I ask because it's not clearly stated anywhere, nor the most intuitive step.

papr 04 March, 2022, 14:34:35

Yes, Pupil Player only comes as part of the Pupil Core software bundle. I would be happy to clarify the documentation in this regard. Could you point me to the location where you would have expected this information to be written down?

user-28ac9f 04 March, 2022, 14:47:55

Thank you!

I would have expected a direct download link on top of the Pupil Player page at https://docs.pupil-labs.com/core/software/pupil-player/

and possibly I'd also add links to the Pupil Player page (link above) in the Invisible user guide where that software is mentioned, e.g. here in the local transfer guide https://docs.pupil-labs.com/invisible/user-guide/invisible-companion-app/#local-transfer

papr 04 March, 2022, 15:06:42

Thank you for the feedback!

user-bdf59c 07 March, 2022, 09:52:02

Hi, I'm looking for an automated face redaction/blurring software that would work for Pupil Invisible footage/mp4 footage. I can't find anything that works well. Can anyone here offer any suggestions? TIA

user-5b371f 07 March, 2022, 13:42:25

Hey everyone. We recently encountered a problem with the eye tracker (see the attached video). Does anyone know the reason for this poor quality?

papr 07 March, 2022, 14:08:08

Please contact info@pupil-labs.com in this regard.

papr 07 March, 2022, 14:07:39

Hi 🙂 Pupil Cloud will support automatic face blurring on upload soon! We will announce it in 📯 announcements when it becomes available.

user-bdf59c 07 March, 2022, 14:54:26

Oh fantastic, thanks!

user-6a91a7 07 March, 2022, 18:39:44

Is it possible to change the cloud account in the Pupil Invisible Companion application that videos are being uploaded to?

papr 08 March, 2022, 07:37:46

Yes, go to settings, scroll down to the bottom and press log out. Afterwards, you can login with a new account.

user-679744 08 March, 2022, 09:19:47

Hello! Compared to eye trackers where you have to do a calibration step to have a reference plane, how does Invisible work? How do you define the plane represented by the computer screen that participants will have to look at?

nmt 08 March, 2022, 09:46:35

Hi @user-679744 👋. Do I understand you correctly in that you wish to analyse gaze in relation to a computer screen? If so, we have a workflow that can facilitate this. You can add April Tag markers to the screen (physically or digitally) and use the Marker Mapper enrichment in Pupil Cloud. This enrichment automatically detects markers, defines surfaces, and obtains gaze positions relative to these surfaces. You can also generate heatmaps. Further details here: https://docs.pupil-labs.com/invisible/explainers/enrichments/#marker-mapper

user-679744 08 March, 2022, 10:18:41

Fantastic, thank you so much! Exactly what I needed!

user-04f0a4 08 March, 2022, 16:00:36

Hi, I am having some problems with my Invisible gaze tracker. The red light started blinking and I got an error saying that capturing could not start; the red circle no longer showed the gaze while wearing the glasses and was stuck in the left corner for a while. I can see what I am looking at, but now I do not even see the red circle (the gaze) anymore. I wonder what is going on, any ideas? I had to cancel my experiment and I also have some planned tomorrow.

nmt 08 March, 2022, 16:46:02

Hi @user-04f0a4 👋. Please send an email to info@pupil-labs.com with a screenshot of the error dialogue and we can assist!

user-04f0a4 08 March, 2022, 16:57:09

Sent an email!

user-783c3d 10 March, 2022, 08:29:34

Hi, we were doing a test with two pairs of Pupil Invisible the other day, but upon finishing only one of the POVs could upload. The other says it has uploaded and shows a length of 14 min in the companion app, but when you view it in Companion you can only see 3 minutes. On Cloud it says the file is 99% uploaded.

I suspect the file is just corrupted but my teacher recommended giving you a shout to see if there is anything we can do to fix it on the backend?

marc 10 March, 2022, 09:05:41

Hi @user-783c3d! Could you please DM me the recording ID of the potentially corrupted recording? You should be able to get it in Cloud, e.g. by right-clicking the recording -> View Details. Then we can take a look at what is wrong with it!

user-28ac9f 11 March, 2022, 13:39:59

Is it possible to open the glasses? I wanted to see if there's any spare space and a spare part on the USB line in the glasses to hack an eSIM into it.

marc 11 March, 2022, 13:41:26

No, it is not possible to open them up without damaging them. The physical space left in the plastic parts for the electronics is also pretty maxed out I think.

user-5b0955 12 March, 2022, 21:03:53

Does Pupil Invisible collect a timestamp for each frame? We need this info to sync with other cameras (two ZED cameras).

marc 14 March, 2022, 07:30:06

Yes, every sensor collects UTC timestamps for every sample! Using them to sync with external sensors should be straightforward.

user-5b0955 14 March, 2022, 21:28:56

Thanks @marc. Can we sync two ZED cameras and two Pupil Invisibles without a timestamp-based approach? That is, are there any tools that can record synchronized video streams from these four devices?

marc 14 March, 2022, 21:39:44

Using the corresponding timestamps is the only way to synchronize. In principle an arbitrary number of devices can be synced this way, but I am not sure what exactly the ZED cameras provide.
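
To sketch what timestamp-based matching could look like in practice (illustrative only; the function and variable names are my own, not part of any Pupil Labs tool), each sample of one stream can be paired with the nearest-in-time sample of another, assuming both report timestamps on the same clock (e.g. UTC in nanoseconds):

```python
import numpy as np

def match_nearest(ts_a, ts_b, max_offset=None):
    """For every timestamp in ts_a, return the index of the closest
    timestamp in ts_b (both arrays sorted ascending). Pairs farther
    apart than max_offset are marked with -1."""
    ts_a, ts_b = np.asarray(ts_a), np.asarray(ts_b)
    right = np.searchsorted(ts_b, ts_a)             # first element >= ts_a
    left = np.clip(right - 1, 0, len(ts_b) - 1)
    right = np.clip(right, 0, len(ts_b) - 1)
    # keep whichever neighbour is closer in time
    pick_right = np.abs(ts_b[right] - ts_a) < np.abs(ts_b[left] - ts_a)
    idx = np.where(pick_right, right, left)
    if max_offset is not None:
        idx = np.where(np.abs(ts_b[idx] - ts_a) <= max_offset, idx, -1)
    return idx
```

The same matching can be repeated pairwise across any number of devices, which is why an arbitrary number of synced streams is possible in principle.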

user-5b0955 15 March, 2022, 03:18:54

Thank you very much. We collected some data using Pupil Invisible. Could you please explain how to get the timestamps of each frame? I noticed some .time files when we download a recorded session. One more thing: when we download data it has the raw video without the gaze circle overlay. How can we download the video with the gaze overlay?

papr 15 March, 2022, 08:07:21

The easiest way to get access to timestamps is via the Pupil Cloud Raw Data Exporter enrichment: https://docs.pupil-labs.com/invisible/reference/export-formats.html#world-timestamps-csv Read more about enrichments here: https://docs.pupil-labs.com/invisible/explainers/enrichments/ (see also the Gaze Overlay enrichment at the bottom of that document)

user-80fbf8 15 March, 2022, 14:44:16

Hi team, I am trying to upload recordings from the companion device to Pupil Cloud.. On the device it looks like the recordings have finished uploading (tick on a cloud icon), but on Pupil Cloud there are none

papr 15 March, 2022, 14:45:25

> but on Pupil Cloud there are none

Hi, could you clarify if none of the recordings are listed in Cloud or if none of the recordings listed in Cloud have a tick mark?

user-80fbf8 15 March, 2022, 14:45:56

none of the recordings are listed in Cloud

papr 15 March, 2022, 14:46:47

Two possible issues come to mind: 1) You are not logged in with the same account as on the phone. 2) The recordings have been uploaded to a different workspace than the one selected in Pupil Cloud. Could you please check that neither is the case?

user-80fbf8 15 March, 2022, 14:48:39

I am logged in on the same account and I have only one workspace on the companion app

user-80fbf8 15 March, 2022, 14:49:39

I've just created a new workspace on Cloud to test if it will also be visible on the companion app, but it isn't and both are empty

marc 15 March, 2022, 14:49:43

@user-80fbf8 We had an issue in the past where the uploaded recordings were falsely put into the trash. Could you select the Trash filter from the saved searches in the top right next to the search bar and check if the recordings show up there?

user-80fbf8 15 March, 2022, 14:52:01

I don't have any saved searches

Chat image

marc 15 March, 2022, 14:52:46

The icon with the three lines in the top right should give you a dropdown that contains a "trash" filter

user-80fbf8 15 March, 2022, 14:54:08

ah ok, but there are still no recordings..

marc 15 March, 2022, 14:56:46

Okay, thanks for checking anyway! It would be helpful to know one of the recording IDs to check the recording's state in Cloud. Could you send us one? You will find the recording ID in the info.json file in the recording folder, which you can access directly on the phone. See also this guide: https://docs.pupil-labs.com/invisible/how-tos/data-collection-with-the-companion-app/transfer-recordings-via-usb.html

user-80fbf8 15 March, 2022, 14:58:27

8f9884f8-036a-4717-b8d9-ae17bde99da6 this is one

marc 15 March, 2022, 14:59:40

Thanks! I'll get back to you as soon as I know more!

user-80fbf8 15 March, 2022, 14:59:55

thank you

marc 15 March, 2022, 15:32:26

@user-80fbf8 Could you please DM me the email address of your Pupil Cloud account?

user-80fbf8 15 March, 2022, 15:56:44

I have

user-a9c078 16 March, 2022, 05:33:00

Hi all, I want to run Reference Image Mapper. How do I make the reference image?

marc 16 March, 2022, 11:51:04

Hi! You can take the reference image with any camera, e.g. the camera of the companion phone!

user-a9c078 16 March, 2022, 23:17:02

thank you!

user-e0a93f 16 March, 2022, 20:52:26

Hi πŸ™‚ I was able to de-distort my world video to counter the fisheye effect of the world camera using OpenCV. Now I am facing a question: is the image distortion considered in the projection of the gaze orientation? In other words, should I trust the gaze position that is projected on the output video of Pupil Player or the pixel gaze position (x, y) from the output file gaze.csv?

papr 17 March, 2022, 06:13:23

Hi, could you elaborate on your workflow? To me, it sounds like you need to undistort the gaze data, too, before being able to render it into the manually undistorted scene video.
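
To sketch the idea of undistorting gaze points alongside the video (a simplified model with only radial k1/k2 terms; for real Pupil Invisible recordings you would use the full intrinsics shipped with the recording, e.g. via cv2.undistortPoints):

```python
import numpy as np

def undistort_points(pts, K, dist, iterations=10):
    """Invert a simple radial distortion model (k1, k2 only) by
    fixed-point iteration. pts: (N, 2) pixel coordinates; K: 3x3
    camera matrix; dist: (k1, k2). This is an illustrative sketch,
    not the full model used by the actual scene-camera calibration."""
    k1, k2 = dist
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    # distorted, normalized image coordinates
    x = (pts[:, 0] - cx) / fx
    y = (pts[:, 1] - cy) / fy
    x_u, y_u = x.copy(), y.copy()
    for _ in range(iterations):
        r2 = x_u ** 2 + y_u ** 2
        scale = 1 + k1 * r2 + k2 * r2 ** 2
        x_u, y_u = x / scale, y / scale
    # back to (undistorted) pixel coordinates
    return np.stack([x_u * fx + cx, y_u * fy + cy], axis=1)
```

Applying the same intrinsics to both the video frames and the gaze points keeps the overlay consistent after undistortion.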

user-0ee84d 16 March, 2022, 20:52:34

@marc is the real-time api available?

marc 17 March, 2022, 07:32:43

Yes! We are a bit behind on the official announcement, but the new real-time API is now available. Please install the newest version of the app and see here: https://docs.pupil-labs.com/invisible/how-tos/integrate-with-the-real-time-api/introduction/

user-0ee84d 17 March, 2022, 09:04:25

Nice! Does it also have the option to replay pre-made recordings as real-time test data?

papr 17 March, 2022, 09:30:44

This option is not available on the phone. Simulating the whole host API would mean rebuilding the app from scratch. Instead, we could provide a partial simulation of the Python client API. Specifically, we could provide a Python module with the same receive_* data functions as the original, with the difference that the data would be read from a recorded file instead of from a host in real time. The control API, e.g. start/stop recording etc., would not be available. Would that be sufficient for your development?

user-0ee84d 17 March, 2022, 09:40:19

that would be the best!

papr 17 March, 2022, 09:52:44

ok, it will take me a few days to build that given my list of other todos. I will let you know once it is available!

user-0ee84d 17 March, 2022, 12:00:55

is it possible to fetch the eye camera video, audio and IMU data from the real-time API too? https://pupil-labs-realtime-api.readthedocs.io/en/stable/_modules/pupil_labs/realtime_api/streaming/gaze.html#GazeData

papr 17 March, 2022, 12:40:14

No, that is currently not possible.

papr 18 March, 2022, 11:30:06

@user-0ee84d https://gist.github.com/papr/1341a7f8badbb2da8e0e7dddda126c99#file-realtime_api_simple_simulation-py-L419

Here you go. 🙂

with SimulatedDeviceFromRecording(path_to_recording) as device:
    while True:
        frame, gaze = device.receive_matched_scene_video_frame_and_gaze()

The code includes a working example that can be called from the command line:

usage: realtime_api_simple_simulation.py [-h]
                                         [--mode {matched,scene-only,gaze-only}]
                                         [--debug] [--timeout TIMEOUT]
                                         recording

positional arguments:
  recording

optional arguments:
  -h, --help            show this help message and exit
  --mode {matched,scene-only,gaze-only}, -m {matched,scene-only,gaze-only}
  --debug
  --timeout TIMEOUT

user-0ee84d 21 March, 2022, 02:16:15

thank you!!!!!!!!!!

user-e0a93f 18 March, 2022, 15:01:16

Thanks for your answer @papr, I think I have my answer then.

1. I guess I can trust the gaze position that is projected on the world video.
2. My current workflow only undistorts the world video images, but now I will undistort the gaze positions together with the video frames.
3. If possible, could you quickly explain how, in Pupil, the gaze position projected on the video is modified to account for the fisheye effect? (Is it handled directly by the trained AI, or is it a post-processing step?)

marc 21 March, 2022, 08:57:30

The gaze estimation algorithm of Pupil Invisible initially outputs a 3D gaze direction originating in the scene camera. This value is independent of the scene camera distortion. During manufacturing, every scene camera is calibrated and those values are used to project the 3D direction to a 2D gaze point in pixel coordinates of the scene camera. Due to the accurate calibration, this takes the distortion into account.
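
As an illustration of that projection step (a simplified pinhole-plus-radial-distortion sketch; the actual factory calibration uses OpenCV's full distortion model and per-device intrinsics):

```python
import numpy as np

def project_direction(d, K, dist=(0.0, 0.0)):
    """Project a 3D gaze direction (camera coordinates, z pointing
    forward) to 2D scene-camera pixel coordinates using a pinhole model
    with optional radial distortion (k1, k2). Illustrative sketch only."""
    x, y = d[0] / d[2], d[1] / d[2]             # perspective divide
    r2 = x * x + y * y
    s = 1 + dist[0] * r2 + dist[1] * r2 * r2    # radial distortion factor
    u = K[0, 0] * x * s + K[0, 2]               # fx * x + cx
    v = K[1, 1] * y * s + K[1, 2]               # fy * y + cy
    return u, v
```

With the distortion coefficients set to zero, a direction straight along the optical axis lands on the principal point, which is why the 2D gaze point depends on the per-device calibration values.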

user-057596 18 March, 2022, 16:17:49

Hi, I'm trying out the enrichments on Pupil Cloud, at the moment the Marker Mapper one. When I downloaded a recording from the cloud last year and used Pupil Player, it had no problem detecting the markers, but on Cloud it can't detect the markers in the same recording, and I'm wondering why there is a discrepancy? Thanks, Gary

marc 21 March, 2022, 09:02:52

A potential explanation might be the "tag family" you used. There are different families of AprilTag with slightly different properties. While the Surface Tracker in Pupil Player allows you to specify which family is used, the Marker Mapper in Pupil Cloud only supports one family: tag36h11. This is used by default in Player too, though. Besides that, a difference in detection results would be odd, as both implementations use the same detection algorithm. We would need an example recording to see what is going on in detail (you could share one with [email removed])

user-6499eb 21 March, 2022, 16:05:11

Hello, I had a longer recording (I know, not best practice) that works great on the companion device. Uploading to the cloud on the device says complete (green tick) but within Pupil Cloud it shows a circle. On hover it states "uploading 100%". This has not changed for several days. Is there a way to reattempt upload, or to fix this problem?

marc 21 March, 2022, 16:07:04

Hi @user-6499eb! In principle also long recordings should work just fine. Could you send me a DM with the ID of the recording in question please? You can see the ID when you right-click a recording and select "View Details".

user-5a2110 21 March, 2022, 19:10:07

Hello, I hope it's ok to ask for help with this in this channel 🙂 I am currently completing my first ever eye-tracking research project, and I was wondering if there are any tools built into Pupil Cloud/Player that will make the coding process faster? I know there is a heat map available, which is a great help as well. I am looking at lateral eye movements specifically.

marc 22 March, 2022, 08:09:10

Hi @user-5a2110! There is no tool for manual coding available in Pupil Player or Cloud. As you noted, there are several tools for automatic mapping though: https://docs.pupil-labs.com/invisible/explainers/enrichments/#marker-mapper https://docs.pupil-labs.com/invisible/explainers/enrichments/#reference-image-mapper

Those require a bit of preparation before the recordings are made, though, to provide appropriate inputs for the algorithms. If you are looking to code recordings that have already been made, these may no longer be applicable.

user-46e93e 22 March, 2022, 00:02:05

Hi all, my lab recently acquired two pairs of Pupil Invisible glasses and we have been running into some issues with one of them. The app does not always recognize the glasses when they are connected; it works sometimes and other times it doesn't. On occasion, when the phone does connect, it vibrates seemingly unprovoked, sometimes accompanied by a "recording/sensor failure" error. Looking through the Discord, it seems like there might be a hardware malfunction? The glasses are fairly new, so I just wanted to check in and make sure there was no easy fix! Thanks in advance

wrp 22 March, 2022, 03:40:37

Hi @user-46e93e this could be software, cable, or HW issue. I see that you already emailed us as well. Let's continue the discussion there. One of my colleagues will follow up to help troubleshoot today.

user-0ee84d 22 March, 2022, 10:14:32

is it also possible to access the feed from the eye camera? 🙂

papr 22 March, 2022, 10:15:02

No, this is currently not possible. Neither in the simulated nor in the real API 🙂

user-0ee84d 22 March, 2022, 10:15:20

Maybe I can add it as a feature request?

papr 22 March, 2022, 10:16:05

It is one of many planned features but I am not able to share a release date yet.

user-0ee84d 22 March, 2022, 10:16:21

Both the eye-camera and scene-camera feeds need to be processed. It would be nice to have such a feature.

papr 22 March, 2022, 10:18:21

Out of interest, what kind of processing are you planning for the eye videos?

user-0ee84d 22 March, 2022, 10:20:35

I managed to improvise pupil detection in dark environments using a custom detector/segmentation. I need access to the eye videos to be able to test offline/online.

papr 22 March, 2022, 10:22:29

The only way to get access to the eye video is through loading it from a recording at the moment.

user-0ee84d 22 March, 2022, 10:24:31

could you please direct me in that direction? or share some snippets?

papr 22 March, 2022, 10:32:39

This is how I would load frames from a video file: https://gist.github.com/papr/1341a7f8badbb2da8e0e7dddda126c99#file-realtime_api_simple_simulation-py-L381-L386

You should be able to reuse the whole VideoStream class by only changing the name property to the corresponding video file name.

papr 22 March, 2022, 10:25:29

@user-0ee84d not sure what you mean. You should have access to the recording files already, correct?

user-0ee84d 22 March, 2022, 10:40:01

ah perfect! I will do those changes

papr 22 March, 2022, 10:42:40

I might have asked this before, but is there a hard requirement for processing the data in realtime? It is much more reliable to process the data from the recording, and you do not have to wait for the app to implement the eye video streaming. Did you know that you can download recordings programmatically from Pupil Cloud? You could automate the processing of new recordings in this way.

user-0ee84d 22 March, 2022, 10:40:48

thank you!

user-5a2110 22 March, 2022, 15:21:27

Thank you so much for your response! We actually are using marker mapper indeed! I've run into a small problem with them, however - it seems like the brightness needs to be a certain level for them to be recognized. I have a couple of videos that needed to be recorded in a low light setting (you can still see the QR codes though, see image attached). Does that mean this video is useless for marker mapping, or can I possibly run it through a brightness adjustment filter and then reupload to run the marker mapping?

Chat image

marc 22 March, 2022, 17:41:23

When recording screens it can be very difficult to achieve good exposure. In an ideal setting the environment light would be bright enough, such that the screen does not dominate the lighting conditions as much. To accommodate difficult settings you can however switch to manual exposure mode in the settings and optimize the exposure for the markers. This may lead to the screen contents being less legible though.

Recordings like the ones you already have are unfortunately not really fixable. Processing the video first and re-uploading it is not possible (and likely would not help much). For future recordings I recommend improving the lighting or the exposure settings, and placing the markers such that the screen cannot shine through them, i.e. either using more opaque material to print on, or placing them fully outside the illuminated area. Increasing their size a bit could also improve the detection rate.

user-057596 22 March, 2022, 17:28:46

Hi Marc, thanks for your response, and yes, you were right: it was the older square tag family I was using, which was detected when I specified it in Pupil Player but wasn't picked up on Pupil Cloud. Is there a link to the tag36h11 family to print them out? Also, is it possible to export the fixations for each defined surface area from Pupil Cloud when using the surface tracker, as you can in Pupil Player?

marc 22 March, 2022, 17:42:51

Please see here for markers of the tag36h11 family to print: https://docs.pupil-labs.com/invisible/explainers/enrichments/#setup

Fixations are included in the Marker Mapper export. See the fixations.csv file here: https://docs.pupil-labs.com/invisible/reference/export-formats.html#marker-mapper

user-057596 22 March, 2022, 17:43:49

Perfect Marc and thanks again for the help. Regards Gary

user-d1072e 23 March, 2022, 08:41:07

Hi, can anyone help me explain why sometimes there are missing world_index (video frame) in the gaze data? For example, in this picture, the gazes from the first 53 video frames are missing.

Chat image

papr 23 March, 2022, 08:42:30

Hi, this happens if the gaze estimation starts later than the scene video recording. This is normal and nothing to worry about πŸ™‚

user-d1072e 23 March, 2022, 08:53:43

OK. I also notice that sometimes the world.mp4 video in the exports folder is longer than the original world.mp4 video. So, I think that gaze estimation can also end later than the scene video recording, right?

papr 23 March, 2022, 09:03:54

Mmh, the only reason I can see for that is if the original recording has multiple parts. But I would need to check. If you could share an example with [email removed] then I could confirm it specifically.

user-d1072e 23 March, 2022, 23:59:33

Unfortunately, this is private data. But we don't use that data after all. Thanks for your help anyway!

user-51951e 24 March, 2022, 16:59:13

Hello, I just bought the Pupil Invisible glasses for a research project. Initially, the first recordings I took with the glasses were uploaded automatically to Pupil Cloud. However, I deleted them in the Pupil Cloud web application, and now I am unable to upload new recordings to Pupil Cloud via the mobile app. The Pupil Invisible Companion on the smartphone is running, but nothing is uploading. Do you know where the problem might come from? I tried to shut down the app, but I always get the notifications "RTSP service started" and "Upload service running" once I restart it.

user-51951e 24 March, 2022, 16:59:25

Thanks in advance for the help !

marc 25 March, 2022, 08:18:07

Hi @user-51951e! I see you have reached out via email as well. We will answer your inquiry there!

user-a9b74c 27 March, 2022, 07:44:15

I use this code to create a scan path visualization video, but I ran into a KeyError and have no idea how to fix it. Is there any suggested way to approach this problem? Thank you~

Chat image

marc 28 March, 2022, 07:30:40

Hi @user-a9b74c! Thanks for reporting the error. I have revised the script and updated the gist. Please try again with the new version! https://gist.github.com/marc-tonsen/d8e4cd7d1a322a8edcc7f1666961b2b9

user-fb5b59 28 March, 2022, 07:41:41

Hello! 🙂 Are there any plans for the future that the single eye images of Pupil Invisible will be streamed as well?

marc 28 March, 2022, 08:05:18

Hey @user-fb5b59! This is definitely on the roadmap for the new real-time API, but will take some time to get implemented. Streaming eye video is possible via NDSI though and you could also run NDSI and the new API in parallel to use the new one for everything else.

user-fb5b59 28 March, 2022, 08:10:23

Thanks for your fast response. Maybe I did not get it completely: using the current app and the NDSI-based implementation, I can already receive the eye images? I know there was a setting in the communication protocol which I could set, but I thought it did nothing.

marc 28 March, 2022, 08:14:20

Yes, using NDSI (the older, now mostly deprecated real-time API) it is possible to stream eye video. This has been available for a long time, and we will keep the Companion app compatible with NDSI for the foreseeable future.

The new real-time API, that was just released, does not yet support streaming eye video, but this is planned for the future.

user-fb5b59 28 March, 2022, 09:05:35

Just an additional question: the new real-time API is "only" for the communication between Pupil Invisible and the mobile app, am I right? There is nothing integrated related to live pupil radius or eye-opening estimation (or any other eye signal)?

papr 28 March, 2022, 10:16:22

I would also like to clarify the components and how they work together:

1. The Pupil Invisible glasses connect via USB to the Android phone.
2. The Pupil Invisible Companion mobile app
   - processes the video from the glasses
   - provides access to the Network API.
3. A client connects to and receives data from the app via a network connection.

marc 28 March, 2022, 10:11:14

The new API allows you to
- stream gaze and scene video data
- remote control Pupil Invisible devices to start/stop recordings or save events.

No other data stream is currently available. Pupil Invisible will never be able to estimate the pupil radius, as the pupil is often not visible at all in the eye video. Blink detection is about to be released (probably today) for Pupil Cloud. Blink data will only be available as a download from Pupil Cloud though, not in real time via the API.
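
Once the blink data can be downloaded from Pupil Cloud, it arrives as a CSV file. A minimal sketch of summarizing it with only the standard library, assuming hypothetical column names in the shape of a Cloud export (check the actual header of your download):

```python
import csv
import io

# Hypothetical blinks.csv snippet in the shape of a Pupil Cloud export.
# The column names here are an assumption and may differ in a real download.
BLINKS_CSV = """\
blink id,start timestamp [ns],end timestamp [ns],duration [ms]
1,1000000000,1150000000,150
2,3000000000,3220000000,220
3,5000000000,5100000000,100
"""

def summarize_blinks(csv_text):
    """Return (blink count, total blink duration in milliseconds)."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    total_ms = sum(float(row["duration [ms]"]) for row in rows)
    return len(rows), total_ms

count, total_ms = summarize_blinks(BLINKS_CSV)
print(count, total_ms)  # 3 470.0
```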

user-fd74bf 28 March, 2022, 12:52:11

Hi, due to my client's cybersecurity constraints we cannot use cloud services. Is there a way to use Pupil Invisible with iMotions without connecting to any cloud service?

marc 28 March, 2022, 13:20:45

Hi Mohamed! Yes, you can disable cloud uploads, transfer recordings directly off the phone via USB, and import them into iMotions. See here for a guide on USB transfer: https://docs.pupil-labs.com/invisible/how-tos/data-collection-with-the-companion-app/transfer-recordings-via-usb.html

Note, however, that this comes with limitations: you will not be able to access fixation, blink, and 200 Hz gaze data, which are all generated in Pupil Cloud after upload. The ~66 Hz gaze signal, which is generated on the phone in real time, is available. AFAIK iMotions currently does not import the fixation or blink data anyway, and has its own fixation detection algorithm, which may be applicable.

Does your client have a specific issue with cloud uploads that may be fixable? We already have a very restrictive privacy policy and are looking into certification options to prove security and privacy. If there are any specific requirements you can name, I would be very interested!
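
Recordings transferred via USB store gaze as raw binary files (e.g. "gaze ps1.raw"). A minimal stdlib parsing sketch, assuming the documented layout of consecutive little-endian float32 (x, y) pairs in scene-camera pixels; the byte string below is synthetic stand-in data, not a real recording:

```python
import struct

def parse_gaze_raw(data: bytes):
    """Decode raw gaze bytes into a list of (x, y) points.

    Assumption: the file holds consecutive little-endian float32 pairs,
    as in the Pupil Invisible raw recording format.
    """
    n_points = len(data) // 8  # 2 floats * 4 bytes per sample
    flat = struct.unpack("<" + "f" * (2 * n_points), data[: 8 * n_points])
    return list(zip(flat[0::2], flat[1::2]))

# Synthetic stand-in for file contents: two gaze samples.
fake_bytes = struct.pack("<4f", 512.0, 384.0, 520.5, 390.25)
points = parse_gaze_raw(fake_bytes)
print(points)  # [(512.0, 384.0), (520.5, 390.25)]
```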

user-ae76c9 29 March, 2022, 12:04:07

Hi Pupil team! We have a potential participant who is blind in one eye; would this exclude them from being able to wear the Invisible?

marc 29 March, 2022, 12:06:47

Hi @user-ae76c9! We have not had the chance to test Pupil Invisible in this condition. Since the gaze estimation algorithm utilizes the images of both eyes simultaneously, there is a good chance that gaze estimation accuracy for this subject would be reduced. It's a bit difficult to predict exactly how it will behave, though. If it is a possibility, I'd recommend simply trying out how well it works.

user-ae76c9 29 March, 2022, 12:11:12

Got it, thank you!

nmt 29 March, 2022, 13:52:35

Hi @user-1936a5 πŸ‘‹. I'll respond to your question (https://discord.com/channels/285728493612957698/285728493612957698/958360062533976186) in this channel. Pupil Cloud is free for Pupil Invisible customers!

user-1936a5 29 March, 2022, 13:59:13

@nmt Thank you for your reply. We are considering purchasing hardware products.

user-1936a5 29 March, 2022, 14:02:20

Btw, what is the frequency at which these devices record data?

nmt 29 March, 2022, 14:46:09

Pupil Invisible gaze data is available at 200 Hz once recordings are uploaded to Pupil Cloud. If Cloud uploads are disabled, gaze data is available at around 66 Hz (generated in real time on the Companion device).
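
For a sense of what those two rates mean in practice, here is some back-of-envelope arithmetic for sample counts and inter-sample intervals at the nominal 200 Hz and ~66 Hz rates:

```python
# Nominal gaze rates: ~200 Hz after Pupil Cloud processing,
# ~66 Hz generated in real time on the Companion device.

def samples_per_minute(rate_hz):
    """Number of gaze samples produced in one minute at a given rate."""
    return rate_hz * 60

def inter_sample_interval_ms(rate_hz):
    """Time between consecutive gaze samples, in milliseconds."""
    return 1000.0 / rate_hz

print(samples_per_minute(200))                   # 12000 samples per minute
print(round(inter_sample_interval_ms(200), 2))   # 5.0 ms between samples
print(round(inter_sample_interval_ms(66), 2))    # 15.15 ms between samples
```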

nmt 29 March, 2022, 14:46:47

Tech-specs available here: https://pupil-labs.com/products/invisible/tech-specs/

user-1936a5 29 March, 2022, 14:49:23

thank you!

user-d1072e 30 March, 2022, 10:35:20

Hi, I noticed that the gaze confidence of Invisible exported from Pupil Player is always either 0 or 1, never in between (unlike Core, which has values in between). Is this always the case for Invisible, i.e. just binary confidence values?

nmt 30 March, 2022, 10:39:43

Please see this message for reference: https://discord.com/channels/285728493612957698/633564003846717444/849260790954983424

user-d1072e 30 March, 2022, 11:04:07

Thank you

user-8cef73 30 March, 2022, 13:24:53

I have Pupil Invisible and recorded a few participants, the footage is now on Pupil Cloud so I can see the gaze overlay. I want to take these short clips out and put them together in one video. I can't seem to download the gaze overlay footage from Pupil Cloud, and when I try to download the raw footage and upload it to Pupil Player, that doesn't work. Do I need to get this footage on Pupil Player to cut clips, and if so, how do I do that?

nmt 30 March, 2022, 14:53:49

Welcome to the community @user-8cef73 πŸ‘‹. Check out the Gaze Overlay enrichment in Pupil Cloud: https://docs.pupil-labs.com/invisible/explainers/enrichments/#gaze-overlay. You can use that to export the gaze overlay footage, in sections or for the whole recording.

user-8cef73 30 March, 2022, 15:09:38

Thanks Neil, I did read this page but it wasn't helpful. It mentions downloading the videos, but I can't seem to do that with the gaze overlay; there's no option. I have noted some sections with timestamps in my gaze overlay footage, but can't see any way to separate them from my full-length videos.

user-50cf87 30 March, 2022, 15:21:16

Hello! I use Pupil Invisible for my research, and I have a problem with a recording: the world video is cut into two parts (PS1 and PS2). Why does this happen? Can I recover the full recording for my data? Thank you!

user-9429ba 31 March, 2022, 12:34:44

Hi Hanna! It sounds like the USB connection to the smartphone could have been inadvertently disconnected during recording. This would cause the multi-part recordings you see. See these messages for reference:

https://discord.com/channels/285728493612957698/633564003846717444/814330706045566997 https://discord.com/channels/285728493612957698/633564003846717444/814429047554703360

nmt 30 March, 2022, 15:24:35

Once you have run the Gaze Overlay enrichment, you can right-click on the enrichment and hit download. When setting up the enrichment, you can choose which events to run the enrichment between.

Have you looked at the 'Analysis in Pupil Cloud' section of the Getting Started guide? You can find it here: https://docs.pupil-labs.com/invisible/getting-started/analyse-recordings-in-pupil-cloud/. It gives step-by-step instructions on how to run enrichments. It actually uses the Face Mapper as an example, but you can follow the guide using the Gaze Overlay instead.

user-8cef73 30 March, 2022, 15:27:29

Yes, I've followed those instructions but there's no download option when I right click. Hopefully you can see from the attached screenshot

Chat image

nmt 30 March, 2022, 15:38:35

I see. You'll also need to click the arrow symbol to run the enrichment. Once that's processed, the download option should be available πŸ‘

user-8cef73 30 March, 2022, 15:41:52

Oh I see, the guide made it sound like it was ready once you reached that page and I thought it was a play button for that video clip. Thank you! And just back to my original question, can I edit these videos together in Pupil Cloud or Pupil Player, or do I need to do that on some separate video editing software?

nmt 30 March, 2022, 15:41:06

That's actually an additional step. For the face mapper, you don't need to press the arrow. We'll get the docs updated to clarify that!

nmt 30 March, 2022, 15:44:54

You'd have to do that using video editing software.
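For joining downloaded gaze overlay clips, one common route is ffmpeg's concat demuxer (an external tool, not part of the Pupil software). A sketch that builds the required list file and command; the clip filenames are hypothetical placeholders for the downloaded overlay videos, and actually running the command requires ffmpeg to be installed:

```python
from pathlib import Path

def build_concat_command(clips, list_path="clips.txt", output="combined.mp4"):
    """Write an ffmpeg concat list file and return the command to run.

    ffmpeg's concat demuxer reads a text file with one "file '<name>'"
    line per clip, then joins them without re-encoding (-c copy).
    """
    list_file = Path(list_path)
    list_file.write_text("".join(f"file '{clip}'\n" for clip in clips))
    return ["ffmpeg", "-f", "concat", "-safe", "0",
            "-i", str(list_file), "-c", "copy", output]

# Hypothetical clip names standing in for downloaded gaze overlay videos.
cmd = build_concat_command(["overlay_part1.mp4", "overlay_part2.mp4"])
print(" ".join(cmd))
```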

user-8cef73 30 March, 2022, 15:45:08

Thanks!

user-4efa6b 31 March, 2022, 18:02:07

Hi! I am wondering what happens if you use the Pupil Invisible glasses in an area without internet? Will the recordings upload to the cloud once you do get to an area with internet connection (if cloud enable is selected)? Or do you have to record data in an area with connection for this to take place?

marc 31 March, 2022, 20:02:02

Hi @user-4efa6b! If no internet is immediately available, recordings will upload later, once a connection becomes available. Recordings can be made fully offline. The only hard requirement for an internet connection is the first login to the app during the initial setup.

user-4efa6b 01 April, 2022, 13:42:28

Thank you!

End of March archive