invisible



wrp 15 October, 2019, 11:34:26

@user-df9629 your question from the core channel, moved to this channel:

thank you for clarifying the functionality. I thought the "R" was for record. My bad. One more thing, I noticed the reset button on the companion app but couldn't figure out how to add the offset. Any advice on that?

wrp 15 October, 2019, 11:40:43

Offset correction is a feature that we're currently testing (not a finalized feature). In the current iteration you can use it as follows:
1. Make a short recording with the Wearer looking at known landmarks in the scene. Save the recording.
2. Open the recording from the recording view. Pause the video when the Wearer is looking at the landmark. Touch and drag anywhere on screen to move the crosshair to the "corrected" position. Click the save offset button.
3. Any gaze data produced with this Wearer profile will now have the offset applied. The gaze circle color changes from red to green to indicate that a user-defined offset is applied.
4. You can reset the user-defined offset in the Wearer view.

user-df9629 15 October, 2019, 12:25:29

Thank you @wrp . This will help a lot!

user-367c1f 17 October, 2019, 06:54:29

We acquired Pupil Invisible a few days ago and we noticed an error in the gaze estimation

user-367c1f 17 October, 2019, 06:55:05

Is there any way to fix that by calibration?

mpk 17 October, 2019, 06:57:26

@user-367c1f , @wrp talked about this in the link above, I hope this is helpful. Let me know if you need more input.

wrp 17 October, 2019, 09:41:31

@user-367c1f if you have any questions/feedback after trying the above steps, please let us know.

user-30a16e 17 October, 2019, 12:50:26

Today the phone is not saving the data to the cloud. It is stuck at 0 percent, with the option of pausing. Any idea?

wrp 18 October, 2019, 06:23:52

@user-30a16e I believe this was already answered in another communication channel.

user-367c1f 20 October, 2019, 13:19:08

@wrp your previous link was indeed helpful; we could check it quickly

user-367c1f 20 October, 2019, 13:20:36

I would like to know if it is possible to change the gaze offset calibration after a recording and correct the gaze data post hoc

user-367c1f 20 October, 2019, 13:26:22

so far we have been doing it manually

wrp 21 October, 2019, 09:19:55

@user-367c1f post-hoc offset is a feature that we are considering adding to Pupil Cloud. Please also note that the current offset feature is very much a beta feature that is being actively iterated on/developed.
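
For those applying such a correction manually after the fact, a minimal sketch of what a post-hoc offset correction could look like, assuming gaze has been exported from Pupil Player as gaze_positions.csv with norm_pos_x / norm_pos_y columns; the column names and offset values are assumptions, not an official workflow:

    # Minimal sketch (not an official Pupil Labs workflow): apply a constant
    # gaze offset to a Pupil Player export. Column names and offset values
    # are assumptions; adjust them to your export.
    import pandas as pd

    OFFSET_X = 0.02   # offset in normalized scene-camera coordinates (assumed value)
    OFFSET_Y = -0.01

    gaze = pd.read_csv("exports/000/gaze_positions.csv")
    gaze["norm_pos_x"] = (gaze["norm_pos_x"] + OFFSET_X).clip(0.0, 1.0)
    gaze["norm_pos_y"] = (gaze["norm_pos_y"] + OFFSET_Y).clip(0.0, 1.0)
    gaze.to_csv("exports/000/gaze_positions_offset_corrected.csv", index=False)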

user-df9629 22 October, 2019, 16:05:40

Hi, I am testing the raw IMU data, and the accelerometer values only make sense when read as big-endian float32 rather than the little-endian float32 stated in the developer docs. Can somebody please look into this?
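
For anyone comparing the two interpretations, a rough sketch of reading the raw IMU stream both ways with NumPy, assuming the file is a flat sequence of float32 values; the file name and the flat layout are assumptions, so check the current recording-format docs:

    # Rough sketch: interpret the raw IMU stream as float32 in both byte orders
    # and check which one yields plausible accelerometer values (~9.81 m/s^2 at
    # rest). The file name and flat-float32 layout are assumptions.
    import numpy as np

    path = "extimu ps1.raw"  # name assumed; adjust to your recording folder
    le = np.fromfile(path, dtype="<f4")  # little-endian float32
    be = np.fromfile(path, dtype=">f4")  # big-endian float32

    print("little endian sample:", le[:6])
    print("big endian sample:   ", be[:6])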

user-367c1f 23 October, 2019, 07:02:02

I have an issue with one PI recording when exporting to pupil player

user-367c1f 23 October, 2019, 07:03:21

I'm getting an "Oops! There was an error updating the recording" message, and apparently no gaze data is generated by Pupil Player

user-367c1f 23 October, 2019, 07:04:24

Lookup tables for Eye0 and Eye1 are missing, so I'm wondering what the issue might be in Player

user-367c1f 23 October, 2019, 07:06:00

The recording data (the one with the update problem) seems similar to the other ones

user-c5fb8b 23 October, 2019, 07:58:08

Hi @user-367c1f thanks for reporting this issue! Can you send us the player.log file? If you are running Pupil from the bundle, you can find it in your user folder under pupil_player_settings. It is overwritten every time you open Pupil Player, so please make sure to reproduce the issue and then send the file.

user-367c1f 23 October, 2019, 08:17:13

This is the player Log after error

player.log

user-367c1f 23 October, 2019, 08:17:23

user_settings_player

user-fd5a69 23 October, 2019, 14:42:21

Hi, I acquired Pupil Invisible two days ago, and I noticed that the recording folder does not contain any audio files. If I want to record audio with Pupil Invisible, what do I need to do to enable audio recording?

user-c5fb8b 23 October, 2019, 15:04:26

Hi @user-367c1f thanks for sending the files. Can you also share the info.invisible.json file that you find in the recording folder?

user-87c1b3 23 October, 2019, 19:40:28

Hello! We purchased the Invisible a while ago. However, the videos are not uploading to the cloud (the upload remains at 0%). Has this been resolved? I saw that this was asked earlier in this discussion and that the issue was said to have been handled in another communication channel, but I cannot find where it was discussed.

user-367c1f 24 October, 2019, 04:00:40

@user-c5fb8b

info.invisible.json

wrp 24 October, 2019, 04:43:21

Hi @user-87c1b3 what version of the Pupil Invisible Companion app are you using? The issue had to do with the app version pointing to a staging cloud environment.

user-7df12d 24 October, 2019, 12:57:11

Hello, I want to use the Invisible glasses in a sports setting. I have a question about the analysis.

user-7df12d 24 October, 2019, 12:57:26

Sorry for the split post.

user-7df12d 24 October, 2019, 12:58:43

Is there already software which includes standard analyses like saccade time, fixation time, etc.? Or do I have to program something myself?

user-fd5a69 24 October, 2019, 14:08:37

Hi, is there a way to do the data collection with PI offline and get the calibration afterwards? We like the automatic calibration, but we don't have an internet connection in outdoor settings. Also, do I need to do something to enable audio recording with PI?

user-c5fb8b 24 October, 2019, 14:09:25

@user-367c1f Thank you, we are looking into the issue and will report back to you soon!

user-87c1b3 24 October, 2019, 15:01:15

Hi @wrp thank you for the message! I am using App version 0.6.19-beta-prod. Is this the app that had this issue?

user-df9629 24 October, 2019, 19:34:56

Thank you for updating the recording format's documentation 🙂

mpk 24 October, 2019, 20:22:29

@user-df9629 sure, sorry I did not get back to you on that, but we acted on your comment and fixed the docs.

user-df9629 24 October, 2019, 22:14:40

@mpk , no problem. Thank you so much for looking into it!

wrp 25 October, 2019, 03:42:58

@user-fd5a69 Responses to your points:
- You don't need an internet connection to use the Invisible Companion app. You only need an internet connection once for sign up/log in. If you want to use Pupil Cloud features, you will need an internet connection to sync wearers and templates with the DB. You can record offline and upload recordings to the cloud later, when your device is back online.
- What version of the app are you using?
- Audio: we will enable audio recording in the near future. All Pupil Invisible devices have a microphone, but we just have not yet enabled recording in the app.

wrp 25 October, 2019, 03:45:07

@user-87c1b3 you should be able to upload to cloud.pupil-labs.com with this app version. The issue you are experiencing is not the same as the other user's. I have a feeling this is related to your app settings. Can you please go to the app settings menu and check that the Cloud upload toggle is on?

wrp 25 October, 2019, 03:46:36

@user-7df12d Pupil Invisible would be a great fit for sports research. You can classify fixations in Pupil Player (our open source desktop app). In the future we will likely add more enrichment features to Pupil Cloud as well.

user-c5fb8b 25 October, 2019, 06:14:56

Hi @user-367c1f with the new release of Pupil v1.17 we might have fixed the issue of opening your Invisible recording. Please give it a go and report back to us if you are still having problems!

user-367c1f 25 October, 2019, 15:06:42

@user-c5fb8b thank you very much. We could get eye tracking data with the new release

user-df9629 25 October, 2019, 16:21:57

Hi, I noticed a delay in the scene video from the time I start recording on the Companion app until it actually starts recording. Is this an intentional delay? If yes, can somebody tell me the exact delay in seconds/milliseconds? The delay shows up in the playback on the app as well. It's almost 2 seconds from my observation, but I can't measure it precisely.
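
One way to estimate this delay is to compare the recording start time from info.json with the timestamp of the first scene-video frame. A minimal sketch, assuming an info.json with a start_time field in nanoseconds and a scene-camera .time file of little-endian uint64 nanosecond timestamps; these field and file names are assumptions, so verify them against the recording-format docs:

    # Minimal sketch (field and file names assumed): estimate the gap between
    # pressing record and the first scene-camera frame of a PI recording.
    import json
    import numpy as np

    rec = "/path/to/recording"
    with open(f"{rec}/info.json") as f:
        start_time_ns = json.load(f)["start_time"]  # assumed: Unix epoch in ns

    # assumed: scene timestamps stored as uint64 nanoseconds in a .time file
    world_ts_ns = np.fromfile(f"{rec}/PI world v1 ps1.time", dtype="<u8")

    delay_s = (int(world_ts_ns[0]) - start_time_ns) / 1e9
    print(f"First scene frame arrives ~{delay_s:.2f} s after recording start")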

user-87c1b3 25 October, 2019, 17:13:55

@wrp I had updated the app through Google Play, and I was able to upload the videos. Thank you again so much for your help!

wrp 26 October, 2019, 01:48:09

@user-87c1b3 thanks for the update. Pleased that this was resolved.

wrp 26 October, 2019, 01:58:23

@user-df9629 I don't have the numbers in front of me right now, but it does take a little bit of time before frames are captured from the scene camera.

user-df9629 26 October, 2019, 14:48:20

@wrp , no worries. Thank you for confirming the lag.

user-df9629 26 October, 2019, 17:15:29

Hi, I know the offset feature is not yet finalized. May I suggest adding/saving the offset as a separate file or in the info.json, so the raw gaze data stays untouched and the offset can be applied to every individual recording as opposed to just the wearer.

wrp 27 October, 2019, 01:28:29

@user-df9629 we're working on a number of changes to the situational offset - thanks for the feedback

user-df9629 28 October, 2019, 14:14:55

@wrp , thank you, and you're welcome!

user-7df12d 29 October, 2019, 14:45:33

@wrp Is there already a possibility to get information of saccades or quiet eye?

user-7df12d 29 October, 2019, 14:45:56

Thank you for the initial information above

mpk 29 October, 2019, 14:47:56

@user-7df12d you can open Pupil Invisible recordings in Pupil Player and use the fixation detector. You can change the parameters to detect very long fixations. Would that help?

user-7df12d 29 October, 2019, 14:56:09

@mpk That helps, but is there something to detect or analyse saccades or the gaps between fixations?
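
Pupil Player does not ship a dedicated saccade detector, but the gaps between consecutive detected fixations can be computed from the fixation export. A minimal sketch, assuming a fixations.csv export with a start_timestamp column in seconds and a duration column in milliseconds; the column names and units are assumptions, so verify them against your export:

    # Minimal sketch: inter-fixation gaps from a Pupil Player fixation export.
    # Assumed columns: start_timestamp in seconds, duration in milliseconds.
    import pandas as pd

    fix = pd.read_csv("exports/000/fixations.csv").sort_values("start_timestamp")
    fix["end_timestamp"] = fix["start_timestamp"] + fix["duration"] / 1000.0

    # gap between the end of one fixation and the start of the next one
    gaps_s = fix["start_timestamp"].shift(-1) - fix["end_timestamp"]
    print(gaps_s.dropna().describe())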

user-94ac2a 30 October, 2019, 13:06:57

Does Invisible have an IR LED?

mpk 30 October, 2019, 15:16:36

@user-94ac2a yes it does, but it's only needed when there is no ambient IR present.

user-94ac2a 30 October, 2019, 15:17:49

@mpk thanks. And is Invisible running the same Core algorithm?

mpk 30 October, 2019, 15:19:15

Pupil Invisible uses machine learning for gaze estimation. Pupil Core uses a 'traditional' computer vision pipeline.

user-94ac2a 30 October, 2019, 15:26:29

So Invisible is better?

mpk 30 October, 2019, 15:33:35

@user-94ac2a Invisible is better for work 'in the wild'. Pupil Core is better for the 'lab'.

user-94ac2a 30 October, 2019, 15:35:02

Ok. Will Core move to a machine-learning-based algorithm in the future?

user-90faf2 30 October, 2019, 23:03:23

Hi, when will support for working with iMotions be available?

user-90faf2 30 October, 2019, 23:05:16

I am working with Invisible, but I also need to process the data with iMotions 😦

user-90faf2 30 October, 2019, 23:06:56

thanks

user-83c9fb 31 October, 2019, 22:10:38

Hello, has anyone been able to run a setup with the Pupil Invisible for at least 4 hours? Maybe by using a battery phone case or a USB-C OTG splitter?

End of October archive