πŸ•Ά invisible


user-e93961 01 March, 2023, 07:57:19

Hi, I wonder, is there a way to scrub through the recording video in Pupil Cloud with millisecond precision, to achieve sync with some stimuli appearing in the video? Thanks! Also, is there a way to directly show the UTC timestamp in Pupil Cloud?

marc 01 March, 2023, 08:03:37

Hi @user-e93961! The closest thing might be to use the keyboard shortcuts "," and ".", which seek backwards/forwards with a step size of 30 ms, roughly mimicking the frame interval of the 30 Hz scene video. Regarding the UTC timestamp, no, unfortunately this is currently not possible. It's an issue/request we have seen before and we are considering implementing it.

user-e93961 01 March, 2023, 08:09:53

Hi @marc, thank you for your reply! So may I assume the systematic error of this kind of sync approach is 30 ms?

marc 01 March, 2023, 08:12:37

Depends on what datastream you sync to. E.g. the gaze signal is recorded at 200 Hz so it would allow for tighter synchronization. But for the scene camera that is the limitation. Although I guess the synchronization error would be 15 ms on average rather than 30 ms πŸ€”

user-e93961 01 March, 2023, 08:18:23

Yeah you're right, thank you! About the UTC timestamp, so the first timestamp in the world_timestamps.csv corresponds to the recording.begin in the pupil cloud, right?

marc 01 March, 2023, 08:18:49

Correct
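A minimal sketch of checking this in Python, assuming the Cloud timeseries export; the column name "timestamp [ns]" follows the documented export format but is worth verifying against your own file:

import pandas as pd

# First row of world_timestamps.csv = time of recording.begin
ts = pd.read_csv("world_timestamps.csv")
first_ns = ts["timestamp [ns]"].iloc[0]
# Timestamps are UTC nanoseconds since the Unix epoch
print(pd.to_datetime(first_ns, unit="ns", utc=True))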

user-e93961 01 March, 2023, 08:19:19

I see, thank you!

user-413ab6 01 March, 2023, 10:42:01

Hi! In gaze.csv, what are azimuth and elevation?

user-413ab6 01 March, 2023, 10:42:57

For example, in this diagram, is azimuth the angle between the orange vector and the z axis, and elevation the angle between the orange vector and the y axis? (Imagine the eye is the scene camera.)

Chat image

user-80123a 02 March, 2023, 07:55:33

Hello, I would like to ask a question about post-hoc gaze calibration. These are my steps: I record my data on a Raspberry Pi, then process the data post hoc on a desktop PC. I calibrate the eye tracker on the Raspberry Pi and uncheck pupil detection during the recording. Then, on the desktop side, I select post-hoc pupil detection. My situation is: when I only select Post-Hoc Gaze Calibration, I do not get data on gaze direction. I only get gaze direction data when I press the button Detect References (see figure below). My question is: is this the right way to obtain gaze direction data, or do I need to do other steps? Thank you in advance for your help, have a nice day.

Chat image

marc 02 March, 2023, 08:53:46

Hi @user-80123a! Could you first clarify what eye tracking device you are using? Are you using Pupil Invisible? The gaze pipeline implemented in Pupil Capture/Player was designed for Pupil Core and is expected to deliver very poor results for Pupil Invisible.

user-f1fcca 06 March, 2023, 08:31:40

Good morning Pupil Labs! I have a short question: how can I download the video with the red circle from Pupil Cloud to my PC?

marc 06 March, 2023, 08:35:11

Good morning @user-f1fcca! If you want the red circle to be rendered into the scene video, you will have to create a gaze overlay enrichment. The steps are:

1) create a project
2) add the recording you want to render out to the project
3) create the gaze overlay enrichment
4) start computing the enrichment and wait until it finishes
5) download the finished video.

Let me know if you have further questions!

user-f1fcca 06 March, 2023, 08:49:57

Ahh wonderful! Can I set the starting time of the enrichment (the red circle) to be at the start of the LSL LabRecorder?

user-d407c1 06 March, 2023, 08:53:48

Yes. If you have the event already created, you can select it when generating the enrichment; it lets you choose the start and end events. By default these are recording.begin and recording.end, which, as you can guess, correspond to the recording start and end. If you want another event, you can also add it manually. https://docs.pupil-labs.com/invisible/basic-concepts/events/

user-f1fcca 06 March, 2023, 08:59:12

By default, the Events list has an lsl.time_synch.etc event. Is this the right event, or should I follow the LSL time alignment procedure to get the JSON file and then create the corresponding event?

user-d407c1 06 March, 2023, 09:48:46

Hi @user-f1fcca! The lsl.time_sync event is generated by the relay and is used for post-hoc synchronisation; it is created at the point where the LSL stream starts. If you only want to visualise the gaze overlay from the beginning of the LSL recording, that is totally fine. The correction that you link to is done to match the gaze timestamps recorded via LSL (at 60 Hz) with the ones provided by Cloud (at 200 Hz), such that you can use the 200 Hz sampled data with the LSL time sync.

user-f1fcca 06 March, 2023, 10:00:42

Ahh, so this event with the red arrow is the beginning of the LSL! Does that mean it is the beginning of the LabRecorder? Because the LabRecorder works with LSL!

Chat image

user-2ecd13 07 March, 2023, 22:16:48

Hey, I know this gets asked many times, but we are having issues reliably connecting to pi.local:8080 and pi-1.local:8080. We can usually connect to either pi.local or the IP address without issues the first time, but if we close the web browser and try to open the link back up, we just get a white screen.

We are not using an institutional router, and we've reserved the IP address so we could create a URL shortcut on our desktop to quickly open it during our study.

nmt 08 March, 2023, 12:32:21

Hi @user-2ecd13 - have you tried hard-refreshing the browser page? Out of curiosity, why do you need to close the web browser?

user-2ecd13 08 March, 2023, 16:02:35

@nmt I've tried refreshing the browser and it just stays white. It's not even just from closing the browser; it happens after refreshing the browser, closing it, etc. We work primarily with kids, so sometimes things come unplugged, and we are just trying to understand how we can reliably get the monitoring to work (should we need to troubleshoot/close something).

For example, yesterday in testing I got both monitoring apps to work, but after closing out the browsers to test it again, I couldn't pull up the browser interface. The webpage just stayed white. It only worked if I opened up Chrome (for the first time) and tried, but even that stopped working after closing that browser.

nmt 08 March, 2023, 16:57:14

Thanks for confirming. I've just tested this out with Chrome + my home network and I'm unable to replicate the behaviour you describe. I can open multiple browser instances, refresh and close/reopen. Do you have any other apps running on the phone?

nmt 09 March, 2023, 07:32:37

Monitor app streaming

user-4771db 09 March, 2023, 09:33:41

Hi Pupil Labs Team! I saw that it's possible to buy a "Pupil Invisible Companion" package from your website. I was just wondering: what does it include in comparison to when I simply buy a OnePlus 8 smartphone myself?

marc 09 March, 2023, 10:04:20

If you mean the "Pupil Invisible Companion" that is mentioned in the accessories section here, it's exactly the same, except for the added USB-C cable and the phone case. There is no functional change to the phone itself. You could absolutely buy it somewhere else! https://pupil-labs.com/products/invisible/accessories/

user-d209ef 09 March, 2023, 10:45:17

Hello. We share an Invisible between two labs. Can we both have our own workspace? Now the Companion is showing only one Workspace (xx Owner). Can I create a new workspace that is completely separate from the owner's workspace?

marc 09 March, 2023, 10:47:17

Hi @user-d209ef! Yes, you can create entirely separate workspaces within the Pupil Cloud UI. You might also want to create separate Pupil Cloud accounts for each lab (or every user) for access control.

user-d209ef 09 March, 2023, 10:49:09

Thanks @marc ! Could you direct me to a link / documentation on how to do this? So far I have only found https://docs.pupil-labs.com/invisible/glasses-and-companion/companion-device/

user-c2d375 09 March, 2023, 11:41:22

Hi @user-d209ef πŸ‘‹ here you can find more information about workspaces https://docs.pupil-labs.com/invisible/basic-concepts/projects-and-workspaces/#workspaces. To create a new workspace, go to Pupil Cloud, click on your profile icon that is at the bottom left corner of the page, and click on "Create a new workspace" in the menu.

user-d209ef 09 March, 2023, 12:56:00

@user-c2d375 I can create a new workspace, but how do I make sure that my recordings end up in my workspace?

user-c2d375 09 March, 2023, 13:00:14

You need to switch to the desired workspace in the Invisible Companion App. Tap on the current workspace name (the default one is called [name surname]'s Workspace), and then on "switch workspace" to activate it. Once you have activated the desired workspace, any new recordings will be uploaded to that workspace.

user-51951e 09 March, 2023, 16:18:45

Hi, we use the Pupil Invisible for a research project, and we cannot upload the data to Pupil Cloud because some data are sensitive. We would like to use the timestamps saved by the Companion App on the phone; we found the documentation on how they are encoded, but we are struggling to decode and read them. Could you provide us with some tips or a procedure to decode these timestamps, preferably using Python? Thank you in advance for your help!

user-d407c1 09 March, 2023, 16:28:27

Hi! You can still load Pupil Invisible recordings into Pupil Player for convenience, or here is how to read them in Python: https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/pupil_recording/update/invisible.py#L494 https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/pupil_recording/update/invisible.py#L460
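For reference, a minimal sketch of decoding a .time file directly, assuming (as the linked code does) that it holds raw uint64 nanosecond UTC timestamps; the file name follows the naming pattern mentioned later in this thread and may differ on your device:

import datetime
import numpy as np

# A .time file is a flat array of uint64 timestamps,
# nanoseconds since the Unix epoch (UTC)
ts_ns = np.fromfile("PI world v1 ps1.time", dtype="uint64")
first = datetime.datetime.fromtimestamp(int(ts_ns[0]) / 1e9, tz=datetime.timezone.utc)
print(first)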

user-f01a4e 10 March, 2023, 07:58:06

Hi, I am trying to change my workspace but the device (phone) is taking a lot of time to sync. Please help.

user-d407c1 10 March, 2023, 08:20:06

Can you try logging out of the Companion App by clicking on the hamburger menu, then Settings, scrolling down and selecting Log out, and then logging in again? If the issue persists after that, can you try https://speedtest.cloud.pupil-labs.com/ and let us know your results?

user-f1fcca 12 March, 2023, 13:26:28

Hello! I was wondering, if I have a specific timestamp, is there a way to find the exact time in a video? For example, for the timestamp in ns 1677583793620556032, what is the closest video frame it corresponds to? Thanks in advance!

user-f1fcca 13 March, 2023, 08:33:48

Good morning! How am I able to check what point of the video a specific gaze sample refers to?

marc 13 March, 2023, 08:35:35

Hi @user-f1fcca! Have a look at the following how-to guide, which discusses this problem! https://docs.pupil-labs.com/invisible/how-tos/advanced-analysis/syncing-sensors/
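The gist of that guide, as a minimal sketch: world_timestamps.csv from the Cloud export holds one UTC nanosecond timestamp per scene-video frame (column "timestamp [ns]" per the documented export format), so finding the closest frame is a nearest-neighbour lookup:

import numpy as np
import pandas as pd

query_ns = 1677583793620556032  # the timestamp from the question above
ts = pd.read_csv("world_timestamps.csv")["timestamp [ns]"].to_numpy().astype(np.int64)

# Index of the frame whose timestamp is closest to the query
i = int(np.argmin(np.abs(ts - query_ns)))
print("closest frame index:", i, "at", ts[i], "ns")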

user-e93961 13 March, 2023, 11:20:38

Hello! After putting some markers on the surface of interest, I wonder if there is a way to obtain the surface coordinates during recording. I'm assuming I'd use the real-time API to get the gaze data and frame image, and process the data on the host side, like the Marker Mapper does.

marc 13 March, 2023, 11:39:54

Hi @user-e93961! Technically this is possible, because the implementation of the surface tracker (that is what this feature is called in the Pupil Core software; it's algorithmically the same as the Marker Mapper) is open-source. https://github.com/pupil-labs/surface-tracker

Practically speaking though, the implementation currently has more or less no documentation and is thus very difficult to use. It's definitely on our roadmap to write documentation for the repo, but it's gonna take a while until we get around to it, I'm afraid.

marc 13 March, 2023, 11:41:51

If you are up for the challenge, I maybe have a couple additional code snippets that might be helpful. Let me know!
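For a rough idea of the approach, here is a minimal sketch, assuming the pupil-labs-realtime-api "simple" client and opencv-contrib's AprilTag detector (OpenCV >= 4.7 API); the marker IDs, corner handling, and surface definition are illustrative simplifications, not the actual Marker Mapper implementation:

import cv2
import numpy as np
from pupil_labs.realtime_api.simple import discover_one_device

MARKER_IDS = [0, 1, 2, 3]  # hypothetical: the four tags framing the surface
SURFACE = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=np.float32)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11))
device = discover_one_device()

while True:
    # Scene frame and gaze sample matched by timestamp
    frame, gaze = device.receive_matched_scene_video_frame_and_gaze()
    corners, ids, _ = detector.detectMarkers(frame.bgr_pixels)
    if ids is None:
        continue
    # Simplification: use each marker's first corner as one surface corner
    pts = {int(i): c[0][0] for i, c in zip(ids.flatten(), corners)}
    if not all(m in pts for m in MARKER_IDS):
        continue
    img_pts = np.array([pts[m] for m in MARKER_IDS], dtype=np.float32)
    H, _ = cv2.findHomography(img_pts, SURFACE)
    sx, sy = cv2.perspectiveTransform(
        np.array([[[gaze.x, gaze.y]]], dtype=np.float32), H)[0][0]
    print(f"gaze in surface coordinates: ({sx:.3f}, {sy:.3f})")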

user-7c714e 13 March, 2023, 11:48:00

Hi @marc! Is there a way to evaluate the gaze data (which parameters?) in different AOIs? What I mean is % gaze, number of fixations, fixation duration, frequency of gaze shifts between AOIs, etc. I only have the heatmaps for every single AOI, but I need more data.

marc 13 March, 2023, 12:59:34

Hi @user-7c714e! If you download the enrichment results, you get gaze.csv and fixations.csv files with data mapped onto the AOI. From these values you can calculate all the metrics you listed per AOI; see the sketch below. Frequency of gaze shifts between AOIs might be a little more difficult to compute, but the others can be aggregated directly from the CSV files. Let me know if this is still not clear enough!
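A minimal aggregation sketch with pandas, assuming the documented Marker Mapper export columns ("gaze detected on surface", "fixation detected on surface", "duration [ms]"); verify the names against your own files:

import pandas as pd

gaze = pd.read_csv("gaze.csv")
fixations = pd.read_csv("fixations.csv")

# Share of gaze samples that landed on this AOI
pct_gaze_on_aoi = gaze["gaze detected on surface"].mean() * 100

# Fixation count and mean duration on this AOI
on_aoi = fixations[fixations["fixation detected on surface"]]
n_fixations = len(on_aoi)
mean_duration_ms = on_aoi["duration [ms]"].mean()

print(pct_gaze_on_aoi, n_fixations, mean_duration_ms)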

user-e93961 13 March, 2023, 11:58:17

Hi @marc, thank you for the reply! I would like to have a try; could you please provide the additional code snippets?

marc 13 March, 2023, 12:48:19

Yes, I will DM you!

marc 13 March, 2023, 13:00:05

See e.g. the documentation of the Marker Mapper exports here: https://docs.pupil-labs.com/invisible/reference/export-formats/#marker-mapper

user-7c714e 14 March, 2023, 08:19:08

Thanks @marc, I'll try it.

user-e2db0a 13 March, 2023, 21:55:26

Hello developers, I'm Rohan. I've recently started using the Pupil Invisible for a test experiment, and I have an issue. The test goes as follows: I wore the glasses and looked at five individual dots presented sequentially on a screen, all separated by a good amount of distance.

Upon looking at the data/recording, it seems as if the gaze markers are off by a significant amount. Even though I'm looking at the rightmost dot on the screen, the gaze markers are stuck to the left and would only reach the second marker from the left at most.

I'm still an amateur, trying to learn how to calibrate, make things accurate, and ultimately learn. I would like to have any help from the community. Thanks

nmt 14 March, 2023, 07:18:42

Hi @user-e2db0a πŸ‘‹. Check out this message for reference: https://discord.com/channels/285728493612957698/633564003846717444/999986149378506832

nmt 14 March, 2023, 10:16:22

LSL Relay

user-057596 14 March, 2023, 16:02:23

Hi, we have used the Marker Mapper on many occasions, and I've always placed the markers with the same orientation as they are shown on the Pupil page. Do they have to be placed this way to work, or does it not matter which side is on top? We are just about to use them in a medical training research project where the clinicians will be setting up, so we want to make the process as simple as possible.

user-c2d375 14 March, 2023, 16:46:54

Hi @user-057596 πŸ‘‹ marker orientation does not matter. Please note that it's essential to use distinct markers to avoid any duplicates.

user-057596 14 March, 2023, 16:48:09

Thanks Eleonora that will make it so much easier for them to use. πŸ‘πŸ»

user-f01a4e 16 March, 2023, 06:01:24

Hi, I am loading the exported raw data into Pupil Player but I am unable to see any video; there is nothing visible. I even tried reinstalling Player, but I am getting the same issue. I am able to see the data on Cloud but not in Player. Can I get some help?

user-d407c1 16 March, 2023, 06:25:23

Hi Anuj! How are you downloading the recording? To be opened by Player, you should right-click on the recording and download it in "Pupil Player format". If you did so, what issues are you finding? What does the download folder contain?

user-f01a4e 16 March, 2023, 06:56:15

Yes, I am downloading it for Pupil Player.

user-f01a4e 16 March, 2023, 06:59:44

This is the error I am receiving.

Chat image

nmt 16 March, 2023, 07:21:22

Hi @user-f01a4e. Please share the full player.log file. Search your machine for 'pupil_player_settings' - inside that folder is the log file.

user-f01a4e 16 March, 2023, 09:00:10

@nmt This one?

player.log

nmt 16 March, 2023, 09:50:34

Hi @user-f01a4e. It seems that the world video is missing from what you tried to load into Pupil Player. Can you please try re-downloading freshly from Cloud?

user-0a5287 16 March, 2023, 09:42:48

Hello! Tell me, please: in the Pupil Invisible Monitor application, should the sound from the microphone of the glasses be broadcast?

user-d209ef 16 March, 2023, 10:24:47

Hello! I would like to stream my Invisible to my PC. Works fine at home via pi.local:8080 but not at work (academic institution). Could this be due to the fact that this connection is considered "not safe" (http instead of https)? Is there a workaround?

marc 16 March, 2023, 11:07:42

Large public networks usually block the type of traffic that is required for streaming the data. Given that you won't be able to change the network configuration, you need to use a different network. This could be another, less strict network, a dedicated router, or a hotspot hosted by the phone or computer.

user-f01a4e 16 March, 2023, 10:31:47

@nmt hey, thanks for responding. I have done that multiple times and I am getting the same results.

user-d407c1 16 March, 2023, 10:33:54

@user-f01a4e are you using Safari?

user-f01a4e 16 March, 2023, 12:05:36

@user-d407c1 I am using chrome

user-d407c1 16 March, 2023, 12:07:21

Could you please write an email to info@pupil-labs.com with the recording ID, stating the contents of the downloaded folder?

user-f01a4e 16 March, 2023, 12:09:05

Sure I'll do that. Thank you.

user-60d698 16 March, 2023, 13:24:48

Hello, I have a few questions. 1. After downloading the whole recording folder from Pupil Cloud using the API, I noticed that the timeseries data is not included, and I can't seem to find an appropriate API request to get it. 2. The recordings include files in .time format and .raw format, and I am not sure how to decode them. I would like to synchronize them and export all 3 cameras synchronized frame by frame, labeled. I have seen the "Raw Data Exporter" from Pupil Player, which may partly solve this, but I would rather not have to use a GUI. https://github.com/pupil-labs/pupil/blob/master/pupil_src/shared_modules/raw_data_exporter.py

marc 17 March, 2023, 08:40:37

Hi @user-60d698! The documentation of the cloud API is a bit lacking unfortunately. But you can use the following endpoint to download recordings in the convenient format also used by the raw data exporter. https://api.cloud.pupil-labs.com/v2/workspaces/<workspace_id>/recordings:raw-data-export?ids=<recording_id_1>&ids=<recording_id_2>
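A minimal sketch of calling that endpoint from Python; the "api-key" header name, the token workflow, and the zip output are assumptions to verify against your Cloud account's developer settings:

import requests

WORKSPACE_ID = "<workspace_id>"
API_TOKEN = "<your_api_token>"  # hypothetical: issued in Pupil Cloud settings

url = (f"https://api.cloud.pupil-labs.com/v2/workspaces/{WORKSPACE_ID}"
       "/recordings:raw-data-export")
params = [("ids", "<recording_id_1>"), ("ids", "<recording_id_2>")]

resp = requests.get(url, params=params, headers={"api-key": API_TOKEN})
resp.raise_for_status()
with open("raw-data-export.zip", "wb") as f:  # assumed: endpoint returns a zip
    f.write(resp.content)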

user-6e1fb1 16 March, 2023, 14:19:45

Hi, I forgot to change who the wearer was between my recordings, and I had calibrated for each wearer at the beginning. Is there a way to change the wearer afterwards so that the correct calibration compensation can be applied?

nmt 16 March, 2023, 17:36:58

Hi @user-6e1fb1! There's no way to retrospectively apply offset corrections to different recordings. However, if you set the offset with each new wearer and then made some recordings, the offset will have been applied to those recordings.

user-f01a4e 17 March, 2023, 07:12:07

@user-d407c1 Hi, I have sent you the mail as per your request. Kindly have a look and let me know how I can resolve the issue.

user-d407c1 17 March, 2023, 07:44:53

I've answered there; basically, you will need the Pupil Player Format download in order to use it with Pupil Player, not the raw data enrichment download.

user-f01a4e 17 March, 2023, 08:08:29

@user-d407c1 Sorry, I think I attached the wrong one then. I'll send the Pupil Player format.

user-f01a4e 17 March, 2023, 09:21:24

@user-d407c1 It's working now. Thanks πŸ˜…

marc 17 March, 2023, 17:15:24

@user-60d698 Okay, got it! The timestamps included in the gaze.csv file coincide with the timestamps of the left eye video, i.e. there should be one row per video frame. For the right eye this is more complicated, as the timestamps are not included in this download. Instead you will indeed have to decode the .time files. The raw sensor files are not really meant to be consumed directly, so this is a bit inconvenient. The timestamps of the left eye camera video are in PI left v1 ps1.time and the ones for the right eye in PI right v1 ps1.time.

To read them in Python you could use

import numpy as np
# uint64 nanosecond UTC timestamps, one per video frame
timestamps = np.fromfile(file_path, dtype="uint64")
user-60d698 17 March, 2023, 17:17:25

thank you! πŸ™‚

user-60d698 21 March, 2023, 11:51:44

Hello again. The size of the arrays in the [camera_name].time files does not match the frame count of the mp4 videos (using ffmpeg.probe); how do you deal with this? Sometimes there are fewer frames than timestamps, sometimes more frames than timestamps. I need the timestamp-to-frame matching to be precise in order to synchronize the frames from each camera with each other. Do all 3 cameras take the first frame at exactly the same time? Because if that is the case, I can just cut off the last ones.

user-35fbd7 18 March, 2023, 14:28:34

Hello! I can't calibrate the Invisible. The red circle is not visible, so I can't adjust the gaze. The errors in the recordings are significant. Could you advise me how to make the recording more precise?

nmt 18 March, 2023, 16:43:01

Can you please make sure the app is up to date in Google Play store, then try logging out and back into the app. If that doesn't work, please reach out to info@pupil-labs.com

user-35fbd7 19 March, 2023, 10:23:56

I followed your advice, logged out and logged in, and as a result the app doesn't reply at all. What the hell?! It is unreliable for work. I have a recording scheduled for today. What am I supposed to do? Abandon the project? Or is there some possibility of getting back to a working state?

user-35fbd7 19 March, 2023, 10:50:28

It doesn't work. "The recording is failed. To disconnect P1". THat's all

nmt 19 March, 2023, 10:43:32

Invisible app

user-0a5287 20 March, 2023, 12:57:34

Hello @nmt, @marc! Our glasses get hot (about 55 Β°C). Please tell me, what is a normal temperature for the glasses when working for an hour?

nmt 21 March, 2023, 08:56:57

Hi @user-0a5287. The Invisible glasses do get warm, but nothing hot should ever touch the user's skin and it should never be uncomfortable. The temperature reached after several minutes is the maximum temperature, and the glasses can be used continuously for around 150 minutes before the phone runs out of battery.

user-c414da 21 March, 2023, 17:13:06

Hello! Is it possible to calibrate the fixations in Pupil Player after making a recording?

user-d407c1 21 March, 2023, 17:19:00

Hi! Are you using Pupil Invisible? Fixations for Pupil Invisible are available through Cloud; you can download the .csv file there and apply any offset correction you want to the reported x and y coordinates.

user-c414da 21 March, 2023, 17:33:20

Yes, I am using Invisible. Can you point me to any existing code or a formula that I could use to make those corrections in the .csv file?
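A minimal sketch of such an offset correction, assuming the Cloud fixations.csv export; the offset values are illustrative and the pixel column names should be checked against the export format docs:

import pandas as pd

OFFSET_X_PX, OFFSET_Y_PX = 25.0, -10.0  # hypothetical offsets in scene-camera pixels

df = pd.read_csv("fixations.csv")
df["fixation x [px]"] += OFFSET_X_PX
df["fixation y [px]"] += OFFSET_Y_PX
df.to_csv("fixations_corrected.csv", index=False)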

user-ace7a4 23 March, 2023, 13:54:14

Hi! I conducted an experiment with the Invisible glasses and want to upload it to the cloud. The experiment room had limited WiFi, so the upload stayed at ~0%. Now, the "upload circle" is just empty. When I click on "upload" no visible changes occur, and it is not uploaded to Pupil Cloud. Is there a trick to make the upload still work? I can see the whole recording in the Companion app by clicking on it; it is just not uploading. I am also now in an environment with regular WiFi strength.

user-d407c1 23 March, 2023, 13:58:03

Hi @user-ace7a4! In the recordings view, clicking on "upload" should be sufficient to restart the upload; please try it somewhere where you have a stable connection. You can test your connection speed to Cloud here: https://speedtest.cloud.pupil-labs.com/

user-ace7a4 23 March, 2023, 14:05:17

I should be in an environment with a stable connection now. Following the link gives me a "Site can't be reached; The connection was reset" error.

user-d407c1 23 March, 2023, 14:08:46

Are you connected to an institutional network? If so, the firewall might be blocking the connection, would you mind testing on a different network?

user-ace7a4 23 March, 2023, 15:02:43

Ah yes, that was indeed the issue. Thanks!

user-9c87a0 23 March, 2023, 22:07:20

Hi, I wanna uninstall Pupil Player v3.4.0 so I can successfully install v3.5.0, but I keep getting this error prompt. Please help.

Chat image Chat image

nmt 24 March, 2023, 16:46:59

Hi @user-9c87a0 πŸ‘‹. This is a Windows permission error if I'm not mistaken. Does your user have the necessary privileges to uninstall/install software?

user-cd03b7 24 March, 2023, 01:10:00

Hey guys, I swear I saw a Pupil guide for the Invisible on running a study on mobile applications / dynamic screens. Does anyone know where I can find that?

user-480f4c 24 March, 2023, 07:50:37

Hey @user-cd03b7 , I understand that you mean the tutorial in our Alpha lab website. Here it is https://docs.pupil-labs.com/alpha-lab/map-your-gaze-to-a-2d-screen/ - with this tutorial you can map and visualise gaze onto a screen with dynamic content, e.g. a video, web browsing or any other content of your choice, using the Reference Image Mapper enrichment.

user-ace7a4 24 March, 2023, 13:38:52

Hi! I am trying to run the dense-pose Colab notebook. I uploaded the raw folder to my Drive and ran the notebook. Unfortunately, I do not get a new folder that holds the dense-pose video and CSV files. I renamed the raw folder from Pupil Cloud to something meaningful; could this be the issue? I restarted the notebook and ran it again, but I still have the same "error". Of course I adapted the path, and it seems to be correct.

user-ace7a4 27 March, 2023, 11:28:43

Chat image

user-d407c1 27 March, 2023, 11:52:38

Hi @user-ace7a4 Where do you get the error in Colab?

user-ace7a4 27 March, 2023, 11:29:48

My folder structure looks like this. Do I need to restructure it? I ran the notebook again but I still do not get a new folder in my google drive.

user-ace7a4 27 March, 2023, 11:54:59

I get the runtime error, which is, according to the comments, "normal", but other than that every cell gets a green tick. The issue I have is that I do not get the new results folder in Drive that holds the video and the two CSV files.

user-d407c1 27 March, 2023, 12:00:33

In the last cell, click on "show code"; there you can change the output path. Please let me know if you still have issues.

user-ace7a4 27 March, 2023, 17:35:48

While this indeed changed something (a % value indicating the current processing progress was printed), it took ~4-5 hours to reach 76% and then the notebook crashed. Is this due to a poor network connection on my side, or is this still "normal"? Also, an mp4 called densepose was created; I cannot open it and it is only 48 KB.

user-d407c1 27 March, 2023, 20:27:03

I think I know where the problem occurs. I will look at it tomorrow and update the notebook.

user-257877 27 March, 2023, 21:01:56

Alright, I figured it out and it worked, thank you very much! One follow-up question: In your documentation, I saw that it makes sense to cut out portions where participants are looking somewhere else (e.g., their smartphone). Do you have a rule of thumb regarding how long of a 'distracted gaze' may be acceptable? For example, if we are talking about a 60-second clip of someone looking at a supermarket shelf and one second of that is spent looking at a smartphone or a person standing next to them, would you still recommend cutting out that part or would you say it doesn't matter too much since it makes up less than 2% of the overall video length? I hope my question makes sense. Thanks!

user-d407c1 27 March, 2023, 21:21:14

It's great to hear that you were able to figure it out! In terms of cutting out portions where participants are looking somewhere else, there is no hard and fast rule for how long a 'distracted gaze' may be acceptable. And your question makes total sense.

The reason we ask you to use events to create smaller sections, covering only the parts where you expect subjects to gaze at the reference image, is mostly computational.

Not only will your enrichment take longer to finish, since it will have to localise the whole video, but if you already know that the participant is looking somewhere else (e.g. at the cashier), there is no point computing that part of the recording.

Kindly note also that the number of points/features used to build the "3d environment" and match them between the scanning recording and the experiment recordings is fixed. Thus, if you cover a large area (for example the whole supermarket), the points will be dispersed across the whole supermarket, and only a few of them will be on the shelf. When matching those points, you will have fewer points to match and the matching might be less accurate. If instead you select only a portion of the video where you expect to see the shelf, more points will be concentrated there and matching them will be easier/better.

I hope I explained myself clearly.

user-e93961 28 March, 2023, 05:28:10

Hi there! I wonder if there is an easy way to get each frame in the scene video with its corresponding timestamp.

user-c2d375 28 March, 2023, 07:37:19

Hi @user-e93961 πŸ‘‹ in the export file world_timestamps.csv it is possible to easily access the timestamp of every world video frame.
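A minimal pairing sketch, assuming OpenCV for decoding and the Cloud export layout; the scene video file name is illustrative, and the "timestamp [ns]" column follows the documented export format:

import cv2
import pandas as pd

ts = pd.read_csv("world_timestamps.csv")["timestamp [ns]"].to_numpy()
cap = cv2.VideoCapture("scene.mp4")  # hypothetical file name

i = 0
while True:
    ok, frame = cap.read()
    if not ok or i >= len(ts):
        break
    # Frame i corresponds to ts[i] nanoseconds since the Unix epoch (UTC)
    print(i, ts[i])
    i += 1
cap.release()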

user-d407c1 28 March, 2023, 15:19:48

@user-ace7a4 I've updated the repo and rolled back the notebook, although I was not able to replicate the problem.

A few things: please kindly make a copy of the notebook if you want to modify it, rather than disabling the test mode.

Second, would you mind trying it with a small section of your video using events? Please create events in Cloud and download the raw enrichment again, which should include those events in the events.csv file. Then define them in the last cell as arguments using --start "event.name" and --end "event.name".

If you see any error, please let me know what the error states. If the error is a Drive timeout, you will need to not use Google Drive: upload the folder via the sidebar and change the input path and output path in Colab accordingly.

If that still doesn't work you can try substituting !python -m pip install 'git+https://github.com/pupil-labs/densepose-module.git'

in the 3rd cell with: !python -m pip install [email removed]

which will use an older version of the pl-densepose module.

Also kindly note:

Google Colab notebooks have an idle timeout of 90 minutes and an absolute timeout of 12 hours. This means that if a user does not interact with their Google Colab notebook for more than 90 minutes, its instance is automatically terminated. Also, the maximum lifetime of a Colab instance is 12 hours.

If you want to run it locally https://github.com/pupil-labs/densepose-module

user-c9d495 28 March, 2023, 15:57:11

Hello, I am new here and not sure where to start, but this thread is immediately related to what I am trying to do. Short story: I have Pupil Invisible recordings that we downloaded from the phone, and I exported the relevant sections via the Player software. I uploaded the folder to Google Drive and tried to run the Colab notebook. It took everything, but upon clicking the button to execute it, it finished immediately with no output file, no error message, and no output in the folder. 1) Is there maybe a demo video that I could use to make sure I am doing it right? 2) I read that one should upload data to the cloud (which I'd need permission for before I could). Is that really necessary? I mean, my exported data should contain everything. Thanks a lot for your help!

user-5f1b97 28 March, 2023, 16:18:37

Hi there! We discovered a problem: after every recording we get the message "You have an unsaved recording from a previous session", and we can only start a new recording after closing and restarting the Companion app. It also occurs after deleting all recordings. Is this a known problem, or is there any solution? I hope you can help me.

user-d407c1 28 March, 2023, 16:44:30

Hi @user-5f1b97! Can you try clearing the app's cached data? Hold the app icon, press App info, then Storage and cache, and then tap the Clear cache button.

user-c9d495 28 March, 2023, 17:58:00

(also, just to confirm that that isn't the culprit: I can work with the data transferred from the phone and exported via the player, right?)

user-d407c1 28 March, 2023, 18:25:51

That’s what the override flag is meant for

user-d407c1 29 March, 2023, 11:46:34

@user-c9d495 Unfortunately, the change above means no override in Colab for now. I will try to fix it and update the notebook. But if you have Linux or a Mac, feel free to try it locally.

user-c9d495 29 March, 2023, 11:58:35

@user-d407c1 Thanks for the update! Update from my side: I was able to run it, and it is currently running locally (I noticed that while the Colab cell didn't show outputs, downloading the notebook and opening it in Jupyter showed errors, which helped me fix it). So, as said, it is now running. However, your message suggests that next time I disconnect (which happens regularly on Colab, as you know), it will no longer run with --override? That's unfortunate. Is there a way for me to change the line "!python -m pip install 'git+https://github.com/pupil-labs/densepose-module.git'" to some older version that would be the one I am currently executing (branch, flag?)?

user-d407c1 29 March, 2023, 12:01:28

You can save your notebook and use it; I did not modify the repo, only changed the notebook to use an older version. The line you mention will pull the latest one, which does have the override flag. Thanks for the update too!

user-d407c1 29 March, 2023, 14:56:41

Again me - feel free to use the latest version of the notebook, it works.
Google Colab seems to skip logging sometimes, but at least now you should see some progress bars. I can't find an easy fix for the logging output, so I will leave it as it is; the code works and you get the output, but the display of the current state may fail.

I also added a few more parameters to choose the confidence value and the start and end events, for convenience.

user-94f03a 29 March, 2023, 12:09:17

Hello! Is there a way (or a roadmap item) to fix gaze offset errors in Pupil Cloud? I think we can do this offline (Pupil Player), but that means downloading all recordings and losing the event annotations we have made in the cloud interface. Cheers!

user-d407c1 29 March, 2023, 13:40:44

You can check the roadmap here https://pupil-labs.canny.io/pupil-cloud

Feel free to suggest this feature

user-14536d 30 March, 2023, 00:12:33

Hi, I'm new to using Pupil Labs Invisible, but I ran some successful trials with it last week. I'm struggling to get the app to sync; in particular, the wearers are not syncing and it's not uploading to the cloud even though I'm connected to the internet. I'm still able to record footage, just not able to sync. Help appreciated!

user-14536d 30 March, 2023, 00:19:30

I uninstalled the app and re-installed it and that seemed to fix it πŸ™‚

End of March archive